Hederion AI Explorer Backend

Hederion AI Explorer is a next-generation block explorer for the Hedera network that enables users to consume and understand on-chain data through natural language queries.

Status

  • API endpoints
    • ✅ Chat endpoint for real-time user queries and token-by-token streaming responses over WebSockets
    • ✅ Suggested queries endpoint for a list of pre-defined queries
    • ✅ IP and global rate/cost limiting
  • AI Agent
    • ✅ LLM agentic reasoning and tool use
    • ✅ Multi-turn conversation with context retention
    • ✅ Session-based conversations (anonymous and pseudonymous sessions)
    • ✅ Contextual user data (wallet account ID)
  • Relational Database
    • ✅ Chat history with database persistence
  • Vector Database
    • ✅ Semantic search with embeddings
  • MCP tools
    • ✅ Hedera's Mirror Node REST API
    • ✅ Hgraph GraphQL API
    • ❌ Hedera's BigQuery
    • ✅ Timestamp conversion tool
    • ✅ Money value conversion tool
  • Benchmarking
    • ✅ Tracing
    • ✅ Evaluations
  • Unit & Integration Tests
  • ✅ CI/CD
  • ✅ Documentation
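The timestamp and money conversion tools above map Hedera's native representations (consensus timestamps as `seconds.nanoseconds` strings, amounts in tinybars) to human-readable values. A minimal sketch of the underlying conversions (function names are illustrative, not the actual tool implementations):

```python
from datetime import datetime, timezone

# 1 HBAR = 100,000,000 tinybars
TINYBARS_PER_HBAR = 100_000_000

def hedera_timestamp_to_datetime(ts: str) -> datetime:
    """Convert a Hedera consensus timestamp ("seconds.nanoseconds") to a UTC datetime."""
    seconds, _, nanos = ts.partition(".")
    # datetime only has microsecond resolution; keep the first 6 of 9 nano digits.
    micros = int(nanos.ljust(9, "0")[:6]) if nanos else 0
    return datetime.fromtimestamp(int(seconds), tz=timezone.utc).replace(microsecond=micros)

def tinybars_to_hbar(tinybars: int) -> float:
    """Convert a tinybar amount to HBAR."""
    return tinybars / TINYBARS_PER_HBAR
```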

Setup

Prerequisites

  • Docker, Docker Compose
  • uv package manager
  • Python 3.13+
  • PostgreSQL
  • Redis
  • An LLM API key (OpenAI, Google, etc.)

Run Locally

  1. Clone the repository:
git clone https://github.com/LimeChain/ai-explorer-backend
cd ai-explorer-backend
  2. Create a .env file and configure the necessary environment variables:
cp .env.example .env
  3. Install dependencies:
uv sync
  4. Start the database and Redis:
docker compose up postgres redis -d
  5. Run database migrations:
uv run alembic upgrade head
  6. Start the API server:
uv run uvicorn app.main:app --reload --port 8000
  7. Install the Hedera SDK as a package and start the internal tools MCP server:
uv pip install -e ./sdk
uv run python mcp_servers/main.py
  8. Send a sample query over WebSocket:
uv run python scripts/ws_send_query.py
  9. Start the external MCP server that exposes the whole service as a tool for AI agents:
uv run python mcp_external/main.py --transport http --port 8002
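The scripts/ws_send_query.py helper can be approximated with a minimal WebSocket client. The endpoint path and message shape below are assumptions; check the script itself for the real ones.

```python
import asyncio
import json

# Assumed endpoint -- check scripts/ws_send_query.py for the actual path.
WS_URL = "ws://localhost:8000/ws/chat"

def build_query(question: str, network: str = "mainnet") -> str:
    """Serialize a chat query into the (assumed) JSON message format."""
    return json.dumps({"question": question, "network": network})

async def ask(question: str) -> None:
    """Send a query and print the token-by-token streamed response."""
    import websockets  # third-party: pip install websockets
    async with websockets.connect(WS_URL) as ws:
        await ws.send(build_query(question))
        async for chunk in ws:
            print(chunk, end="", flush=True)

# Usage (with the API server running):
# asyncio.run(ask("What are the recent transactions for account 0.0.123?"))
```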

Connect via an MCP client such as Postman.

List the available tools:

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/list",
  "params": {}
}

Call ask_explorer tool:

{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "tools/call",
  "params": {
    "name": "ask_explorer",
    "arguments": {
      "question": "What are the recent transactions for account 0.0.123?",
      "network": "mainnet",
      "account_id": "0.0.123"
    }
  }
}
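The same two JSON-RPC calls can be issued from a script. A minimal sketch using only the standard library; the URL path is an assumption based on the `--port 8002` flag above, and real MCP HTTP transports may require extra headers or a session handshake:

```python
import json
import urllib.request

MCP_URL = "http://localhost:8002/mcp"  # assumed path for the HTTP transport

def jsonrpc_request(method: str, params: dict, req_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 request envelope."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

def call_tool(name: str, arguments: dict) -> dict:
    """POST a tools/call request and return the parsed JSON response."""
    payload = jsonrpc_request("tools/call", {"name": name, "arguments": arguments})
    req = urllib.request.Request(
        MCP_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Usage (with the external MCP server running):
# call_tool("ask_explorer", {"question": "What are the recent transactions for "
#                            "account 0.0.123?", "network": "mainnet",
#                            "account_id": "0.0.123"})
```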

Run Locally (with Docker)

  1. Configure the .env file to use the correct MCP endpoint.
  2. Start all services with Docker:
docker compose up
  3. Send a sample query over WebSocket:
docker compose exec api uv run python scripts/dev/query_websocket_dev.py

Development

Database Management

Create a new migration:

uv run alembic revision --autogenerate -m "Description of changes"

Apply migrations:

uv run alembic upgrade head

Enable Tracing

Configure the LangSmith tracing in the .env file.
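A typical set of variables looks like the fragment below. The names follow LangSmith's standard environment variables; check .env.example for the exact ones this project reads, and the project name here is only an example.

```
LANGSMITH_TRACING=true
LANGSMITH_API_KEY=<your-langsmith-api-key>
LANGSMITH_PROJECT=ai-explorer-backend
```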

Run Evaluations

uv run python -m evals.main

Testing

Spamming the WebSocket endpoint

Test the rate and cost limiting by sending multiple requests to the WebSocket endpoint:

uv run python scripts/spam.py
uv run python scripts/spam.py concurrent

uv run python scripts/check_limits.py list --details   
uv run python scripts/check_limits.py stats
uv run python scripts/check_limits.py clear
uv run python scripts/check_limits.py monitor

API Documentation

Once running, visit:

  • Swagger UI: http://localhost:8000/docs
  • ReDoc: http://localhost:8000/redoc

Deployment

The prod and dev environments run in the same GCP project.

The dev environment is deployed with the default Terraform workspace, while prod is deployed with the prod workspace, so Terraform keeps a separate state file for each environment.

Before deploying, check the current Terraform workspace:

terraform workspace list

If needed, switch to the correct workspace:

terraform workspace select <workspace>

Example tfvars used for prod:

project_id        = "<PROJECT_ID>"
llm_api_key       = "<API_KEY>"
langsmith_api_key = ""
environment       = "production"
domain_name       = "hederion.com"
app_name          = "ai-explorer-prod"
