I finally set up and ran DeepWiki-Open on small repos successfully. Kudos to the author!
It seems that larger repos have difficulty loading into memory or into the LLM, and will need some optimization to send repo files in batches. Here are the error messages I received about the input exceeding the LLM context length. I tried both Google Gemini and Ollama Qwen3:1.7b models and received the same message.
2026-01-30 03:17:10,318 - INFO - httpx - _client.py:1025 - HTTP Request: POST http://192.168.0.88:11434/api/embeddings "HTTP/1.1 500 Internal Server Error"
2026-01-30 03:17:10,327 - INFO - backoff - _common.py:105 - Backing off call(...) for 0.5s (ollama._types.ResponseError: the input length exceeds the context length (status code: 500))
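
For what it's worth, the batching I have in mind looks roughly like the sketch below: split each repo file into pieces that fit the embedding context before calling Ollama's /api/embeddings endpoint (the same one in the log above), instead of sending the whole file as one oversized prompt. The model name, character budget, and helper names here are placeholders, not DeepWiki-Open's actual code.

```python
# Minimal sketch, assuming the Ollama /api/embeddings endpoint from the log above.
# EMBED_MODEL and MAX_CHARS are assumptions, not values from DeepWiki-Open.
import requests

OLLAMA_URL = "http://192.168.0.88:11434/api/embeddings"  # host taken from the log
EMBED_MODEL = "nomic-embed-text"  # assumption: any local embedding model would do
MAX_CHARS = 4000                  # assumption: rough stand-in for the model's context window

def chunk_text(text: str, max_chars: int = MAX_CHARS):
    """Split a file's contents into pieces small enough for the embedding context."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def embed_file(path: str):
    """Embed one repo file chunk by chunk instead of as a single oversized prompt."""
    with open(path, encoding="utf-8", errors="ignore") as f:
        text = f.read()
    vectors = []
    for chunk in chunk_text(text):
        resp = requests.post(OLLAMA_URL, json={"model": EMBED_MODEL, "prompt": chunk})
        resp.raise_for_status()  # a 500 here is the "input length exceeds the context length" error
        vectors.append(resp.json()["embedding"])
    return vectors
```

A smarter splitter would break on file or function boundaries rather than a fixed character count, but even this naive chunking avoids the 500 error by keeping each request under the model's context length.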