fix(llm): support Azure AI Foundry deployments for openai_legacy #1199
kingdomseed wants to merge 3 commits into MoonshotAI:main
Conversation
- Forward provider custom headers and default query params to OpenAI clients
- Add `provider.default_query` to the config schema
- Auto-detect Azure base URLs and attach `api-key` + `api-version`
- Document Azure configuration and add a regression test
Prefer Azure-specific api-version env vars only to avoid confusion with OpenAI.
Pull request overview
Adds first-class support for using Azure-hosted (Foundry/Cognitive Services) OpenAI-compatible deployment endpoints with the openai_legacy / openai_responses providers by allowing default query params (e.g., api-version) and applying Azure-specific auth/query behavior when an Azure base URL is detected.
Changes:
- Add a `providers.<name>.default_query` config field for default query parameters.
- Detect Azure OpenAI-style base URLs and inject the `api-key` header + `api-version` query param (from config or env) for OpenAI-compatible providers.
- Update EN/ZH docs with an Azure-hosted Moonshot example and add a unit test covering header/query injection.
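For illustration, an Azure-hosted provider entry using the new field might look like the sketch below. This is a hedged example: the overall config shape, the `type` value, and the `<resource>`/`<deployment>` placeholders are assumptions; only `default_query` (and the `base_url`/`api_key` fields mentioned in this PR) come from the change itself.

```json
{
  "providers": {
    "azure-kimi": {
      "type": "openai_legacy",
      "base_url": "https://<resource>.cognitiveservices.azure.com/openai/deployments/<deployment>",
      "api_key": "<azure-api-key>",
      "default_query": { "api-version": "2024-05-01-preview" }
    }
  }
}
```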
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| src/kimi_cli/config.py | Adds default_query to LLMProvider so query params can be configured per-provider. |
| src/kimi_cli/llm.py | Adds Azure base URL detection and forwards default_headers / default_query into Kosong OpenAI clients. |
| tests/core/test_create_llm.py | Tests Azure behavior: api-key header and injected api-version query parameter. |
| docs/en/configuration/providers.md | Documents Azure OpenAI setup using openai_legacy + default_query. |
| docs/zh/configuration/providers.md | Same as EN, plus Azure-hosted Moonshot model notes/link. |
| docs/en/configuration/config-files.md | Documents default_query in provider config schema and examples. |
| docs/zh/configuration/config-files.md | Same as EN for default_query. |
Addressed the Copilot suggestion: an Azure Foundry/Cognitive Services base_url now fails fast with a ConfigError when api-version cannot be resolved (from provider.default_query or AZURE_COGNITIVE_SERVICES_API_VERSION / AZURE_OPENAI_API_VERSION). Also tightened Azure detection so it does not trigger on non-Azure custom proxies, and populated the applied dict in augment_provider_with_env_vars() so env-var overrides of base_url/api_key are logged.
I just realized a simpler solution: add query-parameter support to the existing system. Then you could easily use Kimi K2.5. Another option is a proxy. Closing this as overkill.
This enables using Moonshot (Kimi) deployments hosted on Azure (the `/openai/deployments/<deployment>/chat/completions` endpoint on Cognitive Services / Foundry).
Changes:
- Forward provider custom headers and default query params to OpenAI clients
- Add `provider.default_query` to the config schema
- Auto-detect Azure base URLs and attach `api-key` + `api-version`
- Document Azure configuration and add a regression test
Related work (not included here): reasoning key support is tracked in #1154 and #782.