Self-hosted AI assistant framework built in Rust. Multi-channel, multi-provider, with built-in self-evolution.
Forked from ZeroClaw and extended with production reliability, governance-aware AI, and a self-evolution system.
- 9 LLM providers — Anthropic, OpenAI, Google Gemini, GitHub Copilot, Ollama, AWS Bedrock, GLM, OpenAI Codex, and OpenAI-compatible endpoints
- LLM Router — heuristic routing (capability + Elo + cost + latency), KNN semantic routing (cold-start guard + 100ms timeout fallback), and Automix low-confidence auto-upgrade
- Causal Tree Engine — speculative multi-branch prediction with rehearsal, scoring, and circuit breaker; opt-in via `causal_tree.enabled` (disabled by default)
- 19 messaging channels — Signal, WhatsApp, Telegram, Discord, Slack, Matrix, and more
- 43+ built-in tools — shell, browser, MCP, memory, scheduling, remote nodes
- Xin (心) task engine — autonomous heartbeat scheduler with 3 execution modes (Rust/LLM/Shell), 5 built-in system tasks, SQLite persistence
- Web Console — browser-based management interface (`console/`)
- Remote Nodes — control macOS/Linux/Pi devices via `prx-nodeagent`
- Self-Evolution — autonomous prompt/memory/strategy improvement with xin-managed scheduling
- Subagent Governance — concurrency limits, depth control, config inheritance
- 3,500+ tests — comprehensive test coverage across all modules
- `router.enabled` — enable heuristic model routing
- `router.knn_enabled` — enable semantic KNN scoring (with timeout-safe fallback)
- `router.automix.enabled` — enable cheap-first, low-confidence upgrade to the premium model
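A configuration sketch for the router switches (assuming a TOML config file; the key paths come from the reference above, but the table layout is an assumption):

```toml
# Illustrative router settings — all three switches enabled.
[router]
enabled = true      # heuristic routing: capability + Elo + cost + latency
knn_enabled = true  # semantic KNN scoring with timeout-safe fallback

[router.automix]
enabled = true      # start cheap, upgrade on low confidence
```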
Disabled by default. Set `causal_tree.enabled = true` to activate.
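A configuration sketch (assuming a TOML config file; key names come from the reference in this section, and the table layout is an assumption):

```toml
# Opt in to the CTE pipeline; the policy and weight values shown
# are the documented defaults.
[causal_tree]
enabled = true
w_confidence = 0.50
w_cost = 0.25
w_latency = 0.25

[causal_tree.policy]
max_branches = 3
commit_threshold = 0.62
extra_latency_budget_ms = 300
rehearsal_timeout_ms = 5000
circuit_breaker_threshold = 5
```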
- `causal_tree.enabled` — master switch for the CTE pipeline (default: `false`)
- `causal_tree.policy.max_branches` — maximum candidate branches to expand (default: `3`)
- `causal_tree.policy.commit_threshold` — minimum score to commit a branch (default: `0.62`)
- `causal_tree.policy.extra_latency_budget_ms` — max additional latency budget in ms (default: `300`)
- `causal_tree.policy.rehearsal_timeout_ms` — per-rehearsal timeout in ms (default: `5000`)
- `causal_tree.policy.circuit_breaker_threshold` — consecutive failures before tripping (default: `5`)
- `causal_tree.w_confidence` — scoring weight for the confidence dimension (default: `0.50`)
- `causal_tree.w_cost` — scoring weight for the cost penalty (default: `0.25`)
- `causal_tree.w_latency` — scoring weight for the latency penalty (default: `0.25`)
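Given the three weights above, branch scoring plausibly reduces to a weighted sum. The sketch below is a guess at the shape of that computation — the actual formula, struct, and function names are internal to openprx and assumed here; cost and latency are treated as normalized penalties so a good branch can clear the default `commit_threshold` of `0.62`:

```rust
// Hypothetical sketch of how the three CTE scoring weights might combine.
// The real scoring code lives inside openprx; all names here are illustrative.
struct BranchMetrics {
    confidence: f64, // rehearsal confidence, 0.0..=1.0
    cost: f64,       // normalized cost penalty, 0.0..=1.0
    latency: f64,    // normalized latency penalty, 0.0..=1.0
}

/// Assumed form: score = w_conf * confidence
///                     + w_cost * (1 - cost)
///                     + w_lat  * (1 - latency)
fn branch_score(m: &BranchMetrics, w_conf: f64, w_cost: f64, w_lat: f64) -> f64 {
    w_conf * m.confidence + w_cost * (1.0 - m.cost) + w_lat * (1.0 - m.latency)
}

fn main() {
    let m = BranchMetrics { confidence: 0.9, cost: 0.1, latency: 0.2 };
    let score = branch_score(&m, 0.50, 0.25, 0.25);
    // A branch would commit only if its score clears commit_threshold.
    println!("score = {score:.3}, commit = {}", score >= 0.62);
}
```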
```shell
# Build
git clone https://github.com/openprx/prx.git && cd prx
cargo build --release --all-features

# Setup
cp target/release/openprx /usr/local/bin/
openprx onboard

# Run
openprx start
```

The default build (`cargo build`) includes `llm-router`.

Or download pre-built binaries from Releases.
| Binary | Description |
|---|---|
| `openprx` | Main AI daemon — providers, channels, tools, evolution |
| `prx-node` | Lightweight remote node agent — runs on managed devices |
```text
  Channels (19)           Tools (43+)          Remote Nodes
Signal · WA · TG · ...  Shell · MCP · ...    macOS · Pi · ...
       │                      │                    │
       ▼                      ▼                    ▼
┌─────────────────────────────────────────────────────┐
│                   openprx daemon                    │
│   Agent Loop · Gateway · CTE · Xin · Memory · Evo   │
└──────────────────────┬──────────────────────────────┘
                       │
                       ▼
              Providers (9 LLMs)
         Anthropic · OpenAI · Google · ...
```
| Topic | Description |
|---|---|
| Providers | 9 LLM providers, fallback chains, token refresh |
| Channels | 19 messaging platforms, DM/group policies |
| Tools | 43+ built-in tools, hooks system, webhooks |
| Remote Nodes | prx-node agent, device pairing, JSON-RPC |
| Web Console | Browser-based management interface |
| Evolution | Self-improvement pipeline |
| Configuration | Config reference, workspace files, security |
| Router | LLM Router config, flow, safety boundaries |
| Causal Tree Engine | CTE pipeline, branch prediction, rehearsal, scoring |
| WASM Plugins | Plugin developer guide (Rust/Python/JS/Go) |
| Host Function Reference | WASM plugin host API reference |
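As a taste of the plugin guide, here is a minimal Rust plugin sketch. The actual export names, host functions, and calling convention are defined by the Host Function Reference above, so everything below is an assumption about the general shape of a C-ABI WASM export, not the real API:

```rust
// Hypothetical WASM plugin entry point. The export name `plugin_add`
// and its signature are illustrative; the real ABI expected by openprx
// is documented in the Host Function Reference.
#[no_mangle]
pub extern "C" fn plugin_add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    // Native smoke test; built with --target wasm32-unknown-unknown,
    // the export would instead be invoked by the host runtime.
    println!("plugin_add(2, 3) = {}", plugin_add(2, 3));
}
```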
- Documentation — Full PRX documentation (10 languages)
- Community — OpenPRX community forum
- OpenPRX — Project homepage
| Repository | Description |
|---|---|
| openprx/prx | AI assistant framework (this repo) |
| openprx/prx-memory | Standalone memory MCP server |
| openprx/openpr | Project management platform |
| openprx/openpr-webhook | Webhook receiver for OpenPR |
| openprx/wacli | WhatsApp CLI with JSON-RPC daemon |
Forked from zeroclaw-labs/zeroclaw (MIT / Apache-2.0). "ZeroClaw" is a trademark of ZeroClaw Labs. This project is OpenPRX, an independent fork.
Dual-licensed under MIT and Apache-2.0.