Open-source ChatGPT alternative. Run local LLMs or connect cloud models — with full control and privacy.
Getting Started · Discord · X / Twitter · Bug Reports
| Platform | Download |
| --- | --- |
| macOS (Universal) | atomic-chat.dmg |
Download from atomic.chat or GitHub Releases.
- 🧠 Local AI Models — download and run LLMs (Llama, Gemma, Qwen, and more) from HuggingFace
- ☁️ Cloud Integration — connect to OpenAI, Anthropic, Mistral, Groq, MiniMax, and others
- 🤖 Custom Assistants — create specialized AI assistants for your tasks
- 🔌 OpenAI-Compatible API — local server at `localhost:1337` for other applications
- 🔗 Model Context Protocol — MCP integration for agentic capabilities
- 🔒 Privacy First — everything runs locally when you want it to
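Because the local server speaks the OpenAI wire format, any OpenAI-style client can talk to it. Here is a minimal sketch using only the Python standard library; the `/v1/chat/completions` route follows the OpenAI convention and the model id is a placeholder — both are assumptions, so substitute whatever model you actually have loaded:

```python
import json
import urllib.request

# Request payload in the OpenAI chat-completions format.
# "llama3.2-3b-instruct" is a hypothetical model id -- replace it with
# the name of a model you have downloaded in Atomic Chat.
payload = {
    "model": "llama3.2-3b-instruct",
    "messages": [{"role": "user", "content": "Hello from the local API!"}],
}

# Atomic Chat's OpenAI-compatible server listens on localhost:1337;
# the /v1/chat/completions path is assumed from the OpenAI convention.
req = urllib.request.Request(
    "http://localhost:1337/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
except OSError as exc:
    # Raised when the app (and its local server) is not running.
    print(f"Could not reach the local server -- is Atomic Chat running? ({exc})")
```

The same request works from any tool that can POST JSON (curl, an editor plugin, another app) by pointing it at the `localhost:1337` base URL.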
- Node.js ≥ 20.0.0
- Yarn ≥ 4.5.3
- Make ≥ 3.81
- Rust (for Tauri)
- (Apple Silicon) MetalToolchain
```shell
xcodebuild -downloadComponent MetalToolchain
```
```shell
git clone https://github.com/AtomicBot-ai/Atomic-Chat
cd Atomic-Chat
make dev
```

This handles everything: it installs dependencies, builds core components, and launches the app.
Available make targets:
- `make dev` — full development setup and launch
- `make build` — production build
- `make test` — run tests and linting
- `make clean` — delete everything and start fresh
```shell
yarn install
yarn build:tauri:plugin:api
yarn build:core
yarn build:extensions
yarn dev
```

System requirements:

- macOS: 13.6+ (8 GB RAM for 3B models, 16 GB for 7B, 32 GB for 13B)
If something isn't working:
- Copy your error logs and system specs
- Open an issue on GitHub
- Or ask for help in the `#🆘|atomic-chat-help` channel on our Discord
Contributions welcome. See CONTRIBUTING.md for details.
- Bugs: GitHub Issues
- General Discussion: Discord
- Updates: X / Twitter
Apache 2.0 — see LICENSE for details.
Built on the shoulders of giants.
© 2026 Atomic Chat · Built with ❤️ · atomic.chat

