A Model Context Protocol (MCP) server that provides RAG-powered semantic search over technical documentation PDFs using Ollama.
- Semantic search with natural language queries
- Multiple PDF documents with page citations
- Docker support with persistent caching
- TOML-based configuration
1. Create `askdocs-mcp.toml` in your project's docs directory:

   ```toml
   [[doc]]
   name = "my_manual"
   description = "My Product Manual"
   path = "pdf/manual.pdf"
   ```

2. Run with Docker:

   ```shell
   docker run -it --rm --network=host -v ./docs:/docs askdocs-mcp:latest
   ```

   askdocs-mcp expects an Ollama server to be running at http://localhost:11434.
3. Directory structure:

   ```
   docs/
   ├── askdocs-mcp.toml     # Configuration
   ├── .askdocs-cache/      # Vector store (auto-created)
   └── pdf/
       └── manual.pdf
   ```
Add `**/.askdocs-cache/**` to your `.gitignore` file.
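If you're starting from scratch, the layout above can be scaffolded with a few commands. This is just a sketch; the document name, description, and PDF filename are the example values from step 1, so adjust them to your project:

```shell
# Scaffold the docs/ layout expected by askdocs-mcp
mkdir -p docs/pdf

# Write a minimal askdocs-mcp.toml (values from the example above)
cat > docs/askdocs-mcp.toml <<'EOF'
[[doc]]
name = "my_manual"
description = "My Product Manual"
path = "pdf/manual.pdf"
EOF

# Copy your PDF into place (adjust the source path)
# cp /path/to/manual.pdf docs/pdf/manual.pdf
```

The `.askdocs-cache/` directory does not need to be created by hand; it is auto-created on first run.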
```toml
# Optional: Configure models
embedding_model = "snowflake-arctic-embed:latest"
llm_model = "qwen3:14b"

[[doc]]
name = "unique_identifier"
description = "Human description"
path = "pdf/document.pdf"
```

For MCP clients configured with JSON:

```json
{
  "mcpServers": {
    "askdocs-mcp": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "--network=host",
        "--volume=${workspaceFolder}/docs:/docs",
        "ghcr.io/dymk/askdocs-mcp:latest"
      ]
    }
  }
}
```

For MCP clients configured with TOML:

```toml
[mcp_servers.askdocs-mcp]
command = "docker"
args = [
  "run", "-i", "--rm",
  "--network=host",
  "--volume=/your/workspace/folder/docs:/docs",
  "ghcr.io/dymk/askdocs-mcp:latest"
]
```

Environment variable:

- `ASKDOCS_OLLAMA_URL`: Ollama server URL (default: `http://localhost:11434`)
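If Ollama is not running on localhost (for example, on a GPU machine elsewhere on your network), the URL can be overridden when starting the container. The hostname below is only a placeholder:

```shell
# Point askdocs-mcp at a non-default Ollama server
# (ollama-host is a placeholder; use your actual hostname or IP)
docker run -i --rm \
  -e ASKDOCS_OLLAMA_URL=http://ollama-host:11434 \
  -v ./docs:/docs \
  ghcr.io/dymk/askdocs-mcp:latest
```

Note that `--network=host` is only needed when the container must reach an Ollama instance on the Docker host via `localhost`.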
- List all documentation sources.
- Search documentation with natural language.
- Retrieve full text from specific pages.
Ollama must be running with the required models:

```shell
ollama pull snowflake-arctic-embed:latest
ollama pull qwen3:14b
```

```shell
# Docker
docker build -t askdocs-mcp:latest .

# Local
uv sync
uv run askdocs-mcp --docs-dir /path/to/docs
```

MIT