
Atomic Chat


Open-source ChatGPT alternative. Run local LLMs or connect cloud models — with full control and privacy.


Getting Started · Discord · X / Twitter · Bug Reports

Atomic Chat Interface


Download

macOS (Universal) Atomic.Chat_1.1.66_universal.dmg
Windows (x64) Atomic.Chat_1.1.66_x64-setup.exe
iOS App Store

Download from atomic.chat or GitHub Releases.


Features

  • 🧠 Local AI Models — download and run LLMs (Llama, Gemma, Qwen, and more) from HuggingFace
  • ⚡ Fast Inference Engines — TurboQuant-optimized llama.cpp on all platforms, MLX for Apple Silicon
  • ☁️ Cloud Integration — connect to OpenAI, Anthropic, Mistral, Groq, MiniMax, and others
  • 🤖 Custom Assistants — create specialized AI assistants for your tasks
  • 🔌 OpenAI-Compatible API — local server at localhost:1337 for other applications
  • 🔗 Model Context Protocol — MCP integration for agentic capabilities
  • 🔒 Privacy First — everything runs locally when you want it to

Inference Engines

Atomic Chat ships its own optimized inference stack so models run fast on whatever hardware you have:

  • atomic-llama-cpp-turboquant — our fork of llama.cpp with TurboQuant optimizations for faster quantized inference. Works on macOS, Windows, and Linux across CPU and GPU backends.
  • MLX-VLM — Apple Silicon-native engine for vision-language models, built on MLX and running on the GPU with unified memory. Faster than llama.cpp on M-series chips for supported models.

The local API server at http://localhost:1337/v1 exposes models from both engines through a single OpenAI-compatible endpoint — tools don't need to know which backend is running underneath.
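Any OpenAI-style HTTP client can talk to this endpoint. A minimal sketch using only the Python standard library — the model name is a placeholder, so substitute one you actually have loaded (you can list them via GET /v1/models):

```python
import json
from urllib import request

# Build an OpenAI-style chat completion request for the local server.
# "llama-3.2-3b" is a placeholder model name — check GET /v1/models
# for what your Atomic Chat instance has loaded.
payload = {
    "model": "llama-3.2-3b",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = request.Request(
    "http://localhost:1337/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment once Atomic Chat is running with a model loaded:
# with request.urlopen(req) as resp:
#     body = json.load(resp)
#     print(body["choices"][0]["message"]["content"])
```

Because the endpoint is OpenAI-compatible, the same request shape works regardless of whether llama.cpp or MLX is serving the model underneath.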


Launch With

Atomic Chat runs an OpenAI-compatible server at http://localhost:1337/v1, so any agent, CLI, IDE plugin, or app that speaks the OpenAI API can run on top of your local models — no extra glue needed. Just point its base URL at Atomic Chat and you're done.

A few projects already ship first-class support with their own setup docs:

  • OpenCode — open-source TUI coding agent; add Atomic Chat as a local provider in opencode.json. Setup guide →
  • OpenClaude — open-source coding-agent CLI for cloud and local models; lists Atomic Chat as a supported provider. Providers list →
  • Hermes Workspace — local-first agent workspace built on Nous Research's Hermes; uses Atomic Chat as its inference backend. Repo →
  • nanoclaw — containerized agent runtime that calls Atomic Chat as an MCP tool. Skill guide →

Built something that runs on Atomic Chat? Open a PR and we'll add it here.


Build from Source

Prerequisites

  • Node.js ≥ 20.0.0
  • Yarn ≥ 4.5.3
  • Make ≥ 3.81
  • Rust (for Tauri)
  • (Apple Silicon) Metal Toolchain — install with xcodebuild -downloadComponent MetalToolchain

Run with Make

git clone https://github.com/AtomicBot-ai/Atomic-Chat
cd Atomic-Chat
make dev

This handles everything: installs dependencies, builds core components, and launches the app.

Available make targets:

  • make dev — full development setup and launch
  • make build — production build
  • make test — run tests and linting
  • make clean — delete everything and start fresh

Manual Commands

yarn install
yarn build:tauri:plugin:api
yarn build:core
yarn build:extensions
yarn dev

System Requirements

  • macOS: 13.6+ (8GB RAM for 3B models, 16GB for 7B, 32GB for 13B)
  • Windows: 10/11 x64 (same RAM recommendations as macOS)
  • iOS: 17+ (download from App Store)

Troubleshooting

If something isn't working:

  1. Copy your error logs and system specs
  2. Open an issue on GitHub
  3. Or ask for help in our #🆘|atomic-chat-help channel

Contributing

Contributions welcome. See CONTRIBUTING.md for details.

Discord  Report Issues  Submit PRs




License

Apache 2.0 — see LICENSE for details.

Acknowledgements

Built on the shoulders of giants:


© 2026 Atomic Chat · Built with ❤️ · atomic.chat
