Motivation
I'm an engineer on the OrcaRouter team. OrcaRouter (https://www.orcarouter.ai) is an OpenAI-compatible LLM gateway that routes each request to the cheapest or fastest provider across many upstream model providers.
We'd like to add OrcaRouter as a dedicated BYOK provider card in Agents → Models, alongside the other configured providers. Two notes on the proposed UX:
- Card placement. OrcaRouter sits immediately after Anthropic (above OpenRouter), so the two gateway-style providers are next to each other near the top of the BYOK list.
- Searchable model picker. Instead of a free-form text input for Model Type, OrcaRouter renders a two-column dropdown (left: upstream provider list; right: that provider's models). A refresh button next to the trigger fetches /v1/models with the saved API key. Results are filtered to chat-capable models and cached in localStorage.
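The localStorage caching mentioned above could be sketched roughly as follows. Everything here is an illustrative assumption rather than the actual implementation: the cache-key format, the TTL, and the KVStore abstraction (a minimal stand-in for localStorage) are all hypothetical.

```typescript
// Minimal stand-in for the localStorage surface the cache needs.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const CACHE_TTL_MS = 24 * 60 * 60 * 1000; // assumed TTL; the refresh button bypasses the cache

interface CachedModels {
  fetchedAt: number;
  ids: string[];
}

// Returns the cached model ids, or null on a miss (absent, stale, or corrupt entry).
function readCachedModels(store: KVStore, providerId: string): string[] | null {
  const raw = store.getItem(`byok-models:${providerId}`);
  if (!raw) return null;
  try {
    const cached = JSON.parse(raw) as CachedModels;
    if (Date.now() - cached.fetchedAt > CACHE_TTL_MS) return null; // stale entry
    return cached.ids;
  } catch {
    return null; // corrupt entry: treat as a cache miss
  }
}

// Stores the freshly fetched model ids with a timestamp for TTL checks.
function writeCachedModels(store: KVStore, providerId: string, ids: string[]): void {
  store.setItem(
    `byok-models:${providerId}`,
    JSON.stringify({ fetchedAt: Date.now(), ids })
  );
}
```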
Solution
Same alias pattern as ModelArk / grok / llama.cpp for the backend wiring, plus two new generic primitives on the Provider type — both opted into only by the OrcaRouter entry, with no behavior change to existing providers:
modelsEndpoint?: string — when set, the card renders a searchable model dropdown that fetches ${apiHost}${modelsEndpoint} with Authorization: Bearer ${apiKey}. Otherwise the card keeps its existing free-form text input.
websiteUrl?: string — when set, the card renders a small "Visit" link inline after the description.
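A rough sketch of how the two opt-in fields could sit on the Provider type. Only modelsEndpoint and websiteUrl come from this proposal; the surrounding Provider shape and the buildModelsRequest helper are hypothetical scaffolding for illustration.

```typescript
// Simplified Provider shape; only the two optional fields are from the proposal.
interface Provider {
  id: string;
  name: string;
  apiHost: string;
  modelsEndpoint?: string; // e.g. "/v1/models"; presence switches the card to a dropdown
  websiteUrl?: string;     // presence renders the inline "Visit" link
}

// Request the card would issue when modelsEndpoint is set; null means the
// card keeps its existing free-form text input.
function buildModelsRequest(
  p: Provider,
  apiKey: string
): { url: string; headers: Record<string, string> } | null {
  if (!p.modelsEndpoint) return null;
  return {
    url: `${p.apiHost}${p.modelsEndpoint}`,
    headers: { Authorization: `Bearer ${apiKey}` },
  };
}
```

Because both fields are optional, every existing Provider entry type-checks unchanged, which is what keeps this a zero-behavior-change addition for the other cards.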
Supporting pieces:
- New src/lib/providerModels.ts util: fetch + filter chat-capable (output_modalities contains text, or architecture is absent) + group by id prefix + localStorage cache.
- New ProviderModelCombobox component (Popover + Command-based) wired into Models.tsx.
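The filter-and-group step in providerModels.ts could look like the sketch below, assuming an OpenAI-compatible /v1/models payload. The architecture/output_modalities field names follow the description above; the RemoteModel shape and function names are hypothetical.

```typescript
// Assumed shape of one entry from an OpenAI-compatible /v1/models response.
interface RemoteModel {
  id: string; // e.g. "anthropic/claude-opus-4.6"
  architecture?: { output_modalities?: string[] };
}

// Chat-capable: output_modalities contains "text", or architecture is absent.
function isChatCapable(m: RemoteModel): boolean {
  if (!m.architecture) return true;
  return m.architecture.output_modalities?.includes("text") ?? false;
}

// Group chat-capable model ids by their "provider/" prefix for the
// two-column dropdown (left column: prefixes; right column: ids).
function groupByPrefix(models: RemoteModel[]): Map<string, string[]> {
  const groups = new Map<string, string[]>();
  for (const m of models.filter(isChatCapable)) {
    const prefix = m.id.includes("/") ? m.id.split("/")[0] : "other";
    const ids = groups.get(prefix) ?? [];
    ids.push(m.id);
    groups.set(prefix, ids);
  }
  return groups;
}
```

Treating a missing architecture field as chat-capable errs on the side of showing a model rather than hiding it, since not every upstream reports modalities.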
Alternatives
- Free-form text input (current state for other BYOK cards): users have to know and type exact OrcaRouter model ids (anthropic/claude-opus-4.6, openai/gpt-4o, …). Manageable for power users but rough first-run UX.
- First-class ModelPlatformType.ORCAROUTER upstream in CAMEL: not pursued — heavier, longer release loop, and the alias pattern already provides functional parity.
Additional context
Resources
- GET /v1/models (OpenAI-compatible)
- Implementation PR: #1632