
feat: route all @mentions (@copilot, @lucidia, @blackboxprogramming, @ollama) to local Ollama #10

Draft
Copilot wants to merge 2 commits into main from copilot/direct-requests-to-ollama

Conversation


Copilot AI commented Mar 3, 2026

The repo had no runnable application. The requirement: every @copilot, @lucidia, @blackboxprogramming, and @ollama mention must resolve to a local Ollama instance, with zero dependency on external AI providers.

Changes

ollama.js — routing + streaming client

  • OLLAMA_HANDLES — canonical list of recognised handles
  • parseHandle(text) — case-insensitive, strips trailing dots (@blackboxprogramming.), returns { handle, prompt } with the @handle removed from the prompt sent to Ollama
  • ollamaChat(...) — streams directly to /api/chat on the local Ollama HTTP API; no external AI SDK or provider
```javascript
parseHandle('@copilot. write a sort function')
// → { handle: 'copilot', prompt: 'write a sort function' }

parseHandle('@blackboxprogramming list algorithms')
// → { handle: 'blackboxprogramming', prompt: 'list algorithms' }
```
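As a concrete sketch, the parsing behaviour described above could look like the following. This is an assumed implementation for illustration, not the PR's actual ollama.js source:

```javascript
// Assumed sketch of the routing logic: case-insensitive handle match,
// trailing-dot stripping, and removal of the @handle from the prompt
// that gets forwarded to Ollama. Unknown handles return null.
const OLLAMA_HANDLES = ['copilot', 'lucidia', 'blackboxprogramming', 'ollama'];

function parseHandle(text) {
  const m = text.match(/^@([a-z]+)\.?\s*/i);
  if (!m) return null;
  const handle = m[1].toLowerCase();
  if (!OLLAMA_HANDLES.includes(handle)) return null; // not one of ours
  return { handle, prompt: text.slice(m[0].length) };
}

parseHandle('@copilot. write a sort function');
// → { handle: 'copilot', prompt: 'write a sort function' }
```

A regex anchored at the start of the message keeps the check cheap and makes handle-only input (e.g. `@ollama` with no prompt) fall out naturally as an empty prompt.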

index.html — chat UI

  • Configurable Ollama base URL + model (defaults: http://localhost:11434, llama3)
  • Live online/offline badge via /api/tags; model picker auto-populated from available local models
  • Reply bubbles show @handle → ollama tag so routing is always visible
  • Streaming responses rendered token-by-token; conversation history maintained for context
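Ollama's /api/chat endpoint streams newline-delimited JSON, one object per token fragment. A hedged sketch of how the UI might fold those chunks into a reply (field names follow Ollama's chat API; the helper name is hypothetical):

```javascript
// Hypothetical helper: accumulate Ollama's NDJSON chat stream into a reply.
// Each streamed line is a JSON object; message.content carries the next
// token fragment, and done: true marks the final line.
function accumulateStream(ndjsonLines) {
  let reply = '';
  for (const line of ndjsonLines) {
    if (!line.trim()) continue;          // skip blank lines between chunks
    const chunk = JSON.parse(line);
    if (chunk.message?.content) reply += chunk.message.content;
    if (chunk.done) break;               // end of stream
  }
  return reply;
}
```

In the real page the lines would arrive incrementally from a fetch ReadableStream, with the partial reply re-rendered after each chunk; the accumulation logic is the same.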

ollama.test.js — unit tests

  • 16 tests covering handle detection, trailing-dot stripping, case-insensitivity, unknown handles, and edge cases (empty string, handle-only input)
Original prompt

I want you to verify if my systems are also being used. When I enter @copilot., @lucidia, or @blackboxprogramming., all requests should be directed to Ollama without depending on any other services.

Combine everything. I'm exhausted by this. What's the point of creating files if none of you utilize them? Oh, I can't review that. Thanks for asking. These are my files. Use my hardware and send all inquiries directly to it. I'm done with ChatGPT, Copilot, Claude—I'm over it. Bring Ollama in. I don't care how bad the code is. I'm tired of all of you accessing my computer and restricting my access. Just get Ollama here now. When I say @ollama, I mean to direct it to Ollama. I DON'T WANT TO RELY ON ANY PROVIDER when I type @blackboxprogramming. Just send it to my machines. You're all on my local server, on my private network anyway.

Get it working E2E, I'm tired of it.



Co-authored-by: blackboxprogramming <118287761+blackboxprogramming@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Direct all requests to Ollama without external dependencies" to "feat: route all @mentions (@copilot, @lucidia, @blackboxprogramming, @ollama) to local Ollama" on Mar 3, 2026.