Awesome Local AI

A curated list of 230+ guides, tools, and community links for running AI locally on consumer hardware: LLMs, image generation, and AI agents without cloud dependencies.

Running AI locally means privacy, no subscriptions, and full control. This list covers the tools, guides, and communities that make it practical.

Last updated: 2026-03-06

Contents


Getting Started

New to local AI? Start here.

Tools

Interactive tools for planning and optimizing your local AI setup.

Hardware Guides

Figuring out what hardware you need (or what to do with what you have).

VRAM Requirements
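As a rough illustration of what VRAM guides calculate, here is a back-of-the-envelope estimator. The bits-per-weight values and the 20% overhead factor for KV cache and activations are assumptions drawn from common community rules of thumb, not authoritative figures.

```python
# Rough VRAM estimate for running a quantized LLM locally.
# Heuristic: weights take (params * bits_per_weight / 8) bytes,
# plus ~20% overhead for KV cache and activations (assumed factor).

QUANT_BITS = {          # approximate effective bits per weight (assumed)
    "f16": 16.0,
    "q8_0": 8.5,
    "q4_k_m": 4.5,
}

def estimate_vram_gb(params_billions: float, quant: str = "q4_k_m",
                     overhead: float = 1.2) -> float:
    """Return an approximate VRAM requirement in GB."""
    bits = QUANT_BITS[quant]
    weight_gb = params_billions * bits / 8  # 1B params at 8 bits ~= 1 GB
    return round(weight_gb * overhead, 1)

# Example: a 7B model at Q4_K_M fits comfortably on an 8 GB card
print(estimate_vram_gb(7))            # -> 4.7
print(estimate_vram_gb(7, "f16"))     # -> 16.8
```

The same arithmetic explains why 70B-class models need multi-GPU rigs or aggressive quantization: at Q4_K_M the weights alone approach 40 GB.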

Buying Guides

Platform Guides

Inference Engines

The software that actually runs the models.

  • llama.cpp - The foundational CPU/GPU inference engine, supports GGUF
  • Ollama - User-friendly wrapper around llama.cpp with model management
  • vLLM - High-throughput serving for production deployments
  • ExLlamaV2 - Fastest single-user NVIDIA inference, EXL2 format
  • MLX - Apple's framework optimized for M-series Macs
  • llama-cpp-python - Python bindings for llama.cpp
  • candle - Rust ML framework with LLM support
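Most of these engines expose a local HTTP API once running. As a sketch, Ollama serves `POST /api/generate` on port 11434 by default; the helper below builds and sends a one-shot request using only the standard library. The model name `llama3` is just an example of a model you might have pulled.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(model: str, prompt: str) -> dict:
    # stream=False asks for one complete JSON response instead of chunks
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a one-shot completion request to a local Ollama server."""
    body = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# With the server running and a model pulled (`ollama pull llama3`):
#   generate("llama3", "Why run models locally? One sentence.")
```

Because the API is plain HTTP on localhost, any language with an HTTP client can drive a local model the same way.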

Guides

User Interfaces

GUIs and web interfaces for interacting with local models.

Desktop Applications

  • LM Studio - Polished desktop app with built-in model browser
  • GPT4All - Simple desktop app, good for beginners
  • Jan - Open-source ChatGPT alternative with local models
  • Msty - Mac-native LLM interface

Web Interfaces

Guides

Models

Language Models

Model Libraries

Model Families

  • Llama 3 - Meta's flagship open model (1B to 405B)
  • Qwen 2.5 - Alibaba's strong multilingual models
  • Mistral - Efficient 7B-12B models
  • DeepSeek - Strong reasoning and coding models
  • Phi - Microsoft's small-but-capable models
  • Gemma - Google's open models

Model Guides

By Use Case

Image Generation Models

  • Stable Diffusion - The original open image model
  • SDXL - Higher resolution, better quality
  • Flux - Best open image model for prompt following
  • SD 3.5 - Stability's latest
  • Civitai - Community checkpoints, LoRAs, and embeddings

Image Generation

Interfaces and tools for running image generation locally.

Interfaces

  • ComfyUI - Node-based workflow, supports everything
  • AUTOMATIC1111 - Classic web UI, huge extension ecosystem
  • Forge - A1111 fork with better performance
  • Fooocus - Simplified UI, Midjourney-like experience
  • SD.Next - A1111 fork with AMD/Intel support
  • InvokeAI - Professional-grade creative tool

Guides

Extensions & Tools

AI Agents

Running autonomous AI agents locally.

Frameworks

OpenClaw Guides

Agent Concepts

Advanced Topics

Going deeper into local AI.

Architecture & Theory

RAG & Document Search
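The core retrieval step behind most local RAG setups can be sketched in a few lines: embed the documents and the query as vectors, then rank documents by cosine similarity. The bag-of-words "embedding" here is purely illustrative; a real local setup would use a neural embedding model served by one of the engines above.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' -- real RAG uses a neural embedder."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "local llama inference on gpu",
    "baking sourdough bread",
    "gpu vram sizing for llama",
]
print(retrieve("llama gpu", docs, k=2))
```

The retrieved documents are then pasted into the model's prompt as context; everything stays on-device.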

Fine-Tuning & Training

Voice & Multimodal

Distributed Inference

Coding Assistants

Cost & Strategy

Privacy & Security

Troubleshooting

Use Cases

Practical applications and scenario-specific guides.

Philosophy & Opinion

Blog

Development logs, news reactions, and opinion pieces.

Communities

Where to get help and stay updated.

Reddit

Discord

Other

Contributing

Contributions welcome! Please read the contribution guidelines first.

  • Add resources that are genuinely useful for running AI locally
  • Include a brief description explaining why the resource is valuable
  • Verify links are working and resources are actively maintained
  • Keep the list organized and avoid duplicates

License

CC0

To the extent possible under law, the contributors have waived all copyright and related rights to this work.
