
Feature Request - Support Large Repo #468

@ryanzhengrz

Description


I finally set up and ran DeepWiki-Open on small repos successfully. Kudos to the author!

It seems that larger repos have difficulty fitting into memory or the LLM context window, and will need some optimization to send repo files in batches. Here are the error messages I received about the input exceeding the LLM context length. I tried both Google Gemini and Ollama Qwen3:1.7b models and received the same message.

2026-01-30 03:17:10,318 - INFO - httpx - _client.py:1025 - HTTP Request: POST http://192.168.0.88:11434/api/embeddings "HTTP/1.1 500 Internal Server Error"
2026-01-30 03:17:10,327 - INFO - backoff - _common.py:105 - Backing off call(...) for 0.5s (ollama._types.ResponseError: the input length exceeds the context length (status code: 500))
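For context, the batching idea above could look something like the following. This is only a minimal sketch, not DeepWiki-Open's actual code: `chunk_text` and `batched` are hypothetical helper names, and the character limits are illustrative stand-ins for the embedding model's real token limit. The point is that each file's text is split into bounded, overlapping chunks before being sent to the embedding endpoint, so no single request exceeds the context length.

```python
def chunk_text(text: str, max_chars: int = 2000, overlap: int = 200) -> list[str]:
    """Split text into chunks of at most max_chars, with overlapping
    boundaries so context is not lost at chunk edges."""
    if max_chars <= overlap:
        raise ValueError("max_chars must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # step back so consecutive chunks overlap
    return chunks


def batched(items: list, batch_size: int = 16):
    """Yield fixed-size batches, so chunks can be embedded a few at a
    time instead of sending the whole repo in one request."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]
```

Each chunk would then be embedded individually (or in small batches) instead of the whole file at once, which should avoid the 500 error above regardless of file size.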
