A local, lightweight, learning-only companion creature. Drakeling may optionally be linked to the OpenClaw ecosystem.
Drakeling is a small digital dragon that lives on your machine. It reflects, learns about you, and expresses feelings — but never performs tasks, accesses files, or reaches the network. Safe by architecture.
- Python 3.12+
- One of: `pip`, `pipx`, or `uv`

```bash
pipx install drakeling
pip install drakeling
uv tool install drakeling
```

After installation, two commands are available:
| Command | Purpose |
|---|---|
| `drakelingd` | Start the background daemon (HTTP API on `127.0.0.1:52780`) |
| `drakeling` | Launch the interactive terminal UI |
Order matters: Start the daemon first, then the UI in a separate terminal.
```bash
drakelingd
```

On first run, the daemon:

- creates the platform data directory (see Data directory below)
- walks you through an interactive LLM setup — pick your provider, enter your endpoint URL and credentials, and the daemon writes a `.env` file for you
- generates an Ed25519 identity keypair (machine binding)
- generates a local API token
- begins listening on `http://127.0.0.1:52780`
Leave the daemon running in its own terminal (or set it up as a background service — see Running as a service).
In a separate terminal:
```bash
drakeling
```

If no creature exists, the UI walks you through the birth ceremony: pick a colour, optionally re-roll up to 3 times, name your dragon, and confirm. Your drakeling starts as an egg and progresses through lifecycle stages as you interact with it.
| Key | Action | What it does |
|---|---|---|
| F2 | Care | Show gentle attention — lifts mood, eases loneliness |
| F3 | Rest | Put your creature to sleep — recovers energy and stability |
| F4 / Ctrl+T | Talk | Focus the text input, type a message and press Enter |
| F5 / Ctrl+F | Feed | Feed your creature — boosts energy and mood |
| F1 / ? | Help | Open the in-app help overlay |
| F8 | Release | Say goodbye (irreversible) |
Talking requires an LLM provider — see LLM configuration. Talking lifts mood, builds trust, sparks curiosity, and eases loneliness.
Embedded terminals (Zed, VS Code, etc.) may intercept F-keys. Use the alternative bindings shown above (?, Ctrl+T, Ctrl+F) when F-keys do not work.
All persistent state lives in a platform-specific data directory:
| Platform | Path |
|---|---|
| Linux | ~/.local/share/drakeling/ |
| macOS | ~/Library/Application Support/drakeling/ |
| Windows | %APPDATA%\drakeling\drakeling\ |
Contents:
| File | Purpose |
|---|---|
| `drakeling.db` | SQLite database (creature state, memory, interaction log, lifecycle events) |
| `identity.key` | Ed25519 private key — ties the creature to this machine |
| `api_token` | Bearer token for authenticating API requests |
| `.env` | Optional — environment variable overrides (see below) |
The daemon generates an API token on first run and writes it to the `api_token` file in the data directory. To read it later:
| Platform | Command |
|---|---|
| Linux | cat ~/.local/share/drakeling/api_token |
| macOS | cat ~/Library/Application\ Support/drakeling/api_token |
| Windows | type "%APPDATA%\drakeling\drakeling\api_token" |
You need this token for API requests (export, import) and for OpenClaw Skill configuration. See OpenClaw Skill setup.
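The per-platform paths above can be resolved programmatically. A hypothetical helper (not part of Drakeling) that locates the data directory and the token file:

```python
import os
import platform


def drakeling_data_dir() -> str:
    """Return the Drakeling data directory for the current OS (paths from the table above)."""
    system = platform.system()
    if system == "Linux":
        return os.path.expanduser("~/.local/share/drakeling")
    if system == "Darwin":
        return os.path.expanduser("~/Library/Application Support/drakeling")
    # Windows: %APPDATA%\drakeling\drakeling
    return os.path.join(os.environ.get("APPDATA", ""), "drakeling", "drakeling")


token_path = os.path.join(drakeling_data_dir(), "api_token")
```

Reading `token_path` gives the bearer token to place in an `Authorization: Bearer <token>` header.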
To update the app and keep your creature data:
| Installer | Command |
|---|---|
| pipx | pipx upgrade drakeling |
| pip | pip install --upgrade drakeling |
| uv | uv tool upgrade drakeling |
Restart the daemon after upgrading.
- Stop the daemon (Ctrl+C or stop the service).
- Uninstall the app:
| Installer | Command |
|---|---|
| pipx | pipx uninstall drakeling |
| pip | pip uninstall drakeling |
| uv | uv tool uninstall drakeling |
To delete your creature and all local data (database, identity key, exports), remove the data directory:
| Platform | Command |
|---|---|
| Linux | rm -rf ~/.local/share/drakeling |
| macOS | rm -rf ~/Library/Application\ Support/drakeling |
| Windows | rmdir /s /q "%APPDATA%\drakeling\drakeling" |
Uninstall the app, remove the data directory (commands above), then install again.
Linux / macOS (pipx):
```bash
pipx uninstall drakeling
rm -rf ~/.local/share/drakeling                        # Linux
# or: rm -rf ~/Library/Application\ Support/drakeling  # macOS
pipx install drakeling
```

Windows (pipx, Command Prompt or PowerShell):
```bat
pipx uninstall drakeling
rmdir /s /q "%APPDATA%\drakeling\drakeling"
pipx install drakeling
```

The daemon reads configuration from environment variables. For persistent config, place a `.env` file in the data directory shown above. This is the preferred approach because background services (systemd, launchd) do not inherit shell profiles like `~/.bashrc`.
| Variable | Description | Default |
|---|---|---|
| `DRAKELING_LLM_BASE_URL` | OpenAI-compatible `/v1` endpoint URL | (required unless gateway mode) |
| `DRAKELING_LLM_API_KEY` | API key for the LLM provider | (required unless gateway mode) |
| `DRAKELING_LLM_MODEL` | Model name (e.g. `gpt-4o-mini`, `llama3.3`) | (required unless gateway mode) |
| `DRAKELING_USE_OPENCLAW_GATEWAY` | Delegate LLM calls to OpenClaw gateway | `false` |
| `DRAKELING_OPENCLAW_GATEWAY_URL` | Gateway URL | `http://127.0.0.1:18789` |
| `DRAKELING_OPENCLAW_GATEWAY_TOKEN` | Bearer token for the gateway | (unset) |
| `DRAKELING_OPENCLAW_GATEWAY_MODEL` | Model to request from the gateway (omit to use gateway default) | (unset) |
| `DRAKELING_MAX_TOKENS_PER_CALL` | Per-call token cap | `300` |
| `DRAKELING_MAX_TOKENS_PER_DAY` | Daily token budget | `10000` |
| `DRAKELING_TICK_SECONDS` | Background loop interval (seconds, minimum 10) | `60` |
| `DRAKELING_MIN_REFLECTION_INTERVAL` | Minimum seconds between background reflections | `600` |
| `DRAKELING_PORT` | Daemon HTTP port | `52780` |
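As an illustration, a `.env` that points at a local Ollama server and halves the default daily token budget might look like this (values are examples, not recommendations):

```
DRAKELING_LLM_BASE_URL=http://127.0.0.1:11434/v1
DRAKELING_LLM_API_KEY=ollama-local
DRAKELING_LLM_MODEL=llama3.3
DRAKELING_MAX_TOKENS_PER_CALL=300
DRAKELING_MAX_TOKENS_PER_DAY=5000
```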
Your creature needs an LLM provider to talk and reflect. On first run, `drakelingd` walks you through setup interactively. You can also configure it manually by editing the `.env` file in the data directory.
Important base URL rule:
- `DRAKELING_LLM_BASE_URL` must point to the provider's API root (usually ending in `/v1`).
- Do not include `/chat/completions` in `DRAKELING_LLM_BASE_URL`.
- Drakeling appends `/chat/completions` automatically.

Examples:

- Correct: `http://127.0.0.1:11434/v1`
- Wrong: `http://127.0.0.1:11434/v1/chat/completions`
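The append rule can be sketched with a hypothetical helper (not Drakeling's actual code) that mirrors the behaviour:

```python
def chat_url(base_url: str) -> str:
    # Drakeling appends /chat/completions to DRAKELING_LLM_BASE_URL.
    return base_url.rstrip("/") + "/chat/completions"


# Correct base URL -> exactly one /chat/completions segment:
print(chat_url("http://127.0.0.1:11434/v1"))
# A base URL that already contains the path yields the doubled
# .../chat/completions/chat/completions URL behind the 404 described
# in the troubleshooting section.
```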
Common base URL patterns (direct provider mode):
| Provider | Base URL (DRAKELING_LLM_BASE_URL) |
|---|---|
| OpenAI | https://api.openai.com/v1 |
| Ollama (local) | http://127.0.0.1:11434/v1 |
| LM Studio (local server) | http://127.0.0.1:1234/v1 |
| vLLM (default local server) | http://127.0.0.1:8000/v1 |
| OpenRouter | https://openrouter.ai/api/v1 |
Works with OpenAI, Ollama, vLLM, LiteLLM, or any service that exposes an
OpenAI-compatible /v1 endpoint.
```
DRAKELING_LLM_BASE_URL=https://api.openai.com/v1
DRAKELING_LLM_API_KEY=sk-...
DRAKELING_LLM_MODEL=gpt-4o-mini
```

For local LLMs (e.g. Ollama), the API key can be any non-empty string:
```
DRAKELING_LLM_BASE_URL=http://127.0.0.1:11434/v1
DRAKELING_LLM_API_KEY=ollama-local
DRAKELING_LLM_MODEL=llama3.3
```

Common model name examples (set in `DRAKELING_LLM_MODEL`):
- Ollama local: `qwen3:14b`, `llama3.3`
- OpenAI: `gpt-4o-mini`
- OpenRouter: `openai/gpt-oss-20b`
- vLLM (self-hosted): `NousResearch/Meta-Llama-3-8B-Instruct`
If you already run OpenClaw, this is the easiest option. Any model OpenClaw supports (cloud or local) becomes available to Drakeling with no additional provider configuration.
```
DRAKELING_USE_OPENCLAW_GATEWAY=true
# DRAKELING_OPENCLAW_GATEWAY_URL=       # leave blank for default http://127.0.0.1:18789
# DRAKELING_OPENCLAW_GATEWAY_TOKEN=     # leave blank if gateway has no auth
# DRAKELING_OPENCLAW_GATEWAY_MODEL=openai/gpt-oss-20b
```

If you set `DRAKELING_OPENCLAW_GATEWAY_MODEL`, use a model identifier that your OpenClaw gateway can serve (for example cloud models like `openai/gpt-oss-20b` or local models exposed by your OpenClaw setup).
If daemon logs show an error like:
```
404 Not Found ... /v1/chat/completions/chat/completions
```

your base URL is too specific. This usually means `DRAKELING_LLM_BASE_URL` was set to include `/chat/completions`.

Fix:

- Set `DRAKELING_LLM_BASE_URL` to the provider root only (for example `http://127.0.0.1:11434/v1`).
- Keep `/chat/completions` out of the `.env` value.
- Restart `drakelingd` after updating `.env`.
Your creature can be exported as an encrypted .drakeling bundle file
containing the database and identity key.
```bash
curl -X POST http://127.0.0.1:52780/export \
  -H "Authorization: Bearer $(cat ~/.local/share/drakeling/api_token)" \
  -H "Content-Type: application/json" \
  -d '{"passphrase": "your-secret-passphrase", "output_path": "/tmp/my-dragon.drakeling"}'
```

To import a bundle onto a new machine, start the daemon in import-ready mode:
```bash
drakelingd --allow-import
```

Then send the import request:
```bash
curl -X POST http://127.0.0.1:52780/import \
  -H "Authorization: Bearer $(cat ~/.local/share/drakeling/api_token)" \
  -H "Content-Type: application/json" \
  -d '{"passphrase": "your-secret-passphrase", "bundle_path": "/tmp/my-dragon.drakeling"}'
```

The daemon creates a `.bak` backup before importing and rolls back automatically
if anything goes wrong. After a successful import, restart the daemon normally
(without --allow-import).
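The two curl calls above can equally be scripted. A hypothetical Python sketch (endpoint paths and JSON field names taken from the examples above; error handling omitted):

```python
import json
import urllib.request

DAEMON = "http://127.0.0.1:52780"


def _post(path: str, token: str, body: dict):
    # Same Authorization and Content-Type headers as the curl examples above.
    req = urllib.request.Request(
        DAEMON + path,
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)


def export_creature(token: str, passphrase: str, output_path: str):
    return _post("/export", token, {"passphrase": passphrase,
                                    "output_path": output_path})


def import_creature(token: str, passphrase: str, bundle_path: str):
    # Requires the daemon to have been started with --allow-import.
    return _post("/import", token, {"passphrase": passphrase,
                                    "bundle_path": bundle_path})
```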
| Flag | Description |
|---|---|
| (no flags) | Normal production mode |
| `--dev` | Development mode: verbose stdout logging, no background reflection, import always permitted |
| `--allow-import` | Enable the `POST /import` endpoint (disabled by default for safety) |
No flags. Connects to the local daemon and launches the interactive terminal UI.
For production use, the daemon should run as a background service that starts
automatically on login. Template files are provided in deploy/.
Linux (systemd user service):

```bash
cp deploy/drakeling.service ~/.config/systemd/user/
systemctl --user daemon-reload
systemctl --user enable --now drakeling
```

Check status: `systemctl --user status drakeling`

macOS (launchd):

```bash
cp deploy/drakeling.plist ~/Library/LaunchAgents/
launchctl load ~/Library/LaunchAgents/drakeling.plist
```

Windows (Task Scheduler):

```bat
schtasks /create /tn "Drakeling" /tr "drakelingd" /sc onlogon /rl limited /f
```

Or import `deploy/drakeling-task.xml` via the Task Scheduler GUI.
This lets OpenClaw agents check on your drakeling and give it care autonomously.
- Install the skill: `clawhub install drakeling` (or copy `skill/` to `~/.openclaw/skills/drakeling/`)
- Start the daemon at least once: `drakelingd`
- Read the API token:
  - Linux: `cat ~/.local/share/drakeling/api_token`
  - macOS: `cat ~/Library/Application\ Support/drakeling/api_token`
  - Windows: `type "%APPDATA%\drakeling\drakeling\api_token"`
- Add to `~/.openclaw/openclaw.json` under `skills.entries.drakeling`:

```json
{
  "skills": {
    "entries": {
      "drakeling": {
        "env": {
          "DRAKELING_API_TOKEN": "paste-token-here"
        }
      }
    }
  }
}
```
See docs/openclaw_integration.md for the full OpenClaw integration guide (config format, gateway delegation, and references).
The skill only uses `/status` (read) and `/care` (write). It never calls `/talk`, `/rest`, `/export`, or `/import`.
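As a sketch, the skill's entire API surface amounts to these two calls (hypothetical code, not the skill's actual implementation):

```python
import json
import os
import urllib.request

BASE = "http://127.0.0.1:52780"


def _call(path: str, method: str = "GET"):
    # DRAKELING_API_TOKEN comes from the skill config shown above.
    token = os.environ.get("DRAKELING_API_TOKEN", "")
    req = urllib.request.Request(
        BASE + path,
        headers={"Authorization": f"Bearer {token}"},
        method=method,
    )
    return json.load(urllib.request.urlopen(req))


def check_status():
    return _call("/status")            # read-only

def give_care():
    return _call("/care", "POST")      # the only write the skill performs
```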
```bash
git clone https://github.com/BVisagie/drakeling.git
cd drakeling
```

Using pip:

```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
```

Using pipx:

```bash
pipx install --editable .
```

Using uv:

```bash
uv venv
source .venv/bin/activate
uv pip install -e ".[dev]"
```

Run the daemon in development mode:

```bash
drakelingd --dev
```

Dev mode:
- Logs all lifecycle events and token usage to stdout
- Disables background reflection (tick loop still runs for stat decay)
- Permits import without `--allow-import`
```bash
pytest
```

The test suite covers domain models, trait generation, stat decay/boost, lifecycle transitions, crypto (identity, tokens, encrypted bundles), sprites, and API integration tests.
```
src/drakeling/
  domain/    Pure domain logic (models, traits, decay, lifecycle, sprites)
  crypto/    Ed25519 identity, API tokens, encrypted bundles
  storage/   SQLAlchemy models and database init
  llm/       LLM wrapper and prompt construction
  daemon/    Daemon entry point, config, background tick loop
  api/       FastAPI endpoints (birth, status, care, talk, rest, export/import)
  ui/        Textual terminal UI (birth ceremony, main screen, widgets)
```
