Lightweight local bridge for running an mlx-lm model behind an OpenAI-compatible API and exposing it through a LiteLLM proxy.
An intelligent local AI agent powered by open-source LLMs, featuring free web search, hybrid memory, and context-aware query rewriting for real-time, grounded answers.
A local AI-powered OS assistant for Windows that organizes files, monitors system pressure, suggests cleanup actions, and stays privacy-friendly by running with local tooling such as Ollama.
Local AI assistant working with Ollama and Python
CLI-based local coding agent written in Python
The intelligent command-line interface directly powering the Toolpack SDK
Local-first AI assistant for files, reminders, and document generation
Provides privacy-focused local or cloud AI assistance in Python, supporting tool calls such as web search, file access, and system info retrieval.