
fix(llm): support Azure AI Foundry deployments for openai_legacy #1199

Closed

kingdomseed wants to merge 3 commits into MoonshotAI:main from kingdomseed:fix/azure-openai-openai_legacy

Conversation

kingdomseed commented Feb 20, 2026

This enables using Moonshot (Kimi) deployments hosted on Azure (the /openai/deployments//chat/completions endpoint on Cognitive Services / Foundry).

Changes:

  • Config: add providers.<name>.default_query to support required query params like api-version.
  • LLM: when base_url is an Azure deployment URL, send api-key auth and ensure api-version is present (from config or AZURE_COGNITIVE_SERVICES_API_VERSION).
  • Docs: add an Azure-hosted Moonshot setup example (EN/ZH) and reference Microsoft Learn “Moonshot AI models sold directly by Azure”.
  • Tests: cover api-key header + api-version query injection behavior.

Related work (not included here): reasoning key support is tracked in #1154 and #782.
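For reference, a provider entry using the new field might look like the following TOML sketch. The provider name, placeholder host/deployment, field names, and the api-version value are all illustrative; see docs/en/configuration/providers.md in this PR for the exact schema.

```toml
# Hypothetical provider entry -- field names follow the PR description.
[providers.azure-kimi]
type = "openai_legacy"
base_url = "https://my-resource.cognitiveservices.azure.com/openai/deployments/my-deployment"
api_key = "${AZURE_API_KEY}"

# Default query parameters appended to every request.
[providers.azure-kimi.default_query]
api-version = "2024-06-01"
```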



- Forward provider custom headers and default query params to OpenAI clients
- Add provider.default_query to config schema
- Auto-detect Azure base URLs and attach api-key + api-version
- Document Azure configuration and add regression test
- Prefer Azure-specific api-version env vars only to avoid confusion with OpenAI
Copilot AI review requested due to automatic review settings February 20, 2026 10:25
devin-ai-integration bot (Contributor) left a comment

✅ Devin Review: No Issues Found

Devin Review analyzed this PR and found no potential bugs to report.

View in Devin Review to see 4 additional findings.


Copilot AI (Contributor) left a comment

Pull request overview

Adds first-class support for using Azure-hosted (Foundry/Cognitive Services) OpenAI-compatible deployment endpoints with the openai_legacy / openai_responses providers by allowing default query params (e.g., api-version) and applying Azure-specific auth/query behavior when an Azure base URL is detected.

Changes:

  • Add providers.<name>.default_query config field for default query parameters.
  • Detect Azure OpenAI-style base URLs and inject api-key header + api-version query param (from config or env) for OpenAI-compatible providers.
  • Update EN/ZH docs with an Azure-hosted Moonshot example and add a unit test covering header/query injection.
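The Azure detection and auth behavior described above could be sketched roughly as follows. The helper names, host-suffix list, and header logic here are assumptions for illustration; the actual implementation lives in src/kimi_cli/llm.py and may differ.

```python
from urllib.parse import urlparse

# Hypothetical host suffixes for Azure Foundry / Cognitive Services endpoints.
_AZURE_HOST_SUFFIXES = (
    ".openai.azure.com",
    ".cognitiveservices.azure.com",
    ".services.ai.azure.com",
)


def is_azure_deployment_url(base_url: str) -> bool:
    """Heuristic: an Azure deployment endpoint combines an azure.com
    host with an /openai/deployments/ path segment."""
    parsed = urlparse(base_url)
    host = parsed.hostname or ""
    return host.endswith(_AZURE_HOST_SUFFIXES) and "/openai/deployments/" in parsed.path


def auth_headers(api_key: str, base_url: str) -> dict:
    # Azure expects the key in an `api-key` header rather than a Bearer token.
    if is_azure_deployment_url(base_url):
        return {"api-key": api_key}
    return {"Authorization": f"Bearer {api_key}"}
```

Keying the check on both the host suffix and the path avoids triggering on non-Azure custom proxies, which the author later tightened in a follow-up commit.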

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 1 comment.

| File | Description |
| --- | --- |
| src/kimi_cli/config.py | Adds default_query to LLMProvider so query params can be configured per-provider. |
| src/kimi_cli/llm.py | Adds Azure base URL detection and forwards default_headers / default_query into Kosong OpenAI clients. |
| tests/core/test_create_llm.py | Tests Azure behavior: api-key header and injected api-version query parameter. |
| docs/en/configuration/providers.md | Documents Azure OpenAI setup using openai_legacy + default_query. |
| docs/zh/configuration/providers.md | Same as EN, plus Azure-hosted Moonshot model notes/link. |
| docs/en/configuration/config-files.md | Documents default_query in provider config schema and examples. |
| docs/zh/configuration/config-files.md | Same as EN for default_query. |


kingdomseed (Author) commented:

Addressed Copilot suggestion: Azure Foundry/Cognitive Services base_url now fails fast with ConfigError when api-version cannot be resolved (from provider.default_query or AZURE_COGNITIVE_SERVICES_API_VERSION / AZURE_OPENAI_API_VERSION).

Also tightened Azure detection to avoid triggering on non-Azure custom proxies, and filled augment_provider_with_env_vars()'s applied dict for logging when env vars override base_url/api_key.
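The fail-fast resolution order the author describes (provider default_query first, then Azure-specific env vars, then an error) could look roughly like this. The function and constant names are hypothetical; only the env var names and the precedence come from the PR discussion.

```python
import os


class ConfigError(Exception):
    """Raised when provider configuration cannot be resolved."""


# Azure-specific env vars only, to avoid confusion with generic OpenAI settings.
_AZURE_API_VERSION_ENV_VARS = (
    "AZURE_COGNITIVE_SERVICES_API_VERSION",
    "AZURE_OPENAI_API_VERSION",
)


def resolve_azure_api_version(default_query):
    # 1. An explicit provider.default_query entry wins.
    if default_query and default_query.get("api-version"):
        return default_query["api-version"]
    # 2. Fall back to Azure-specific environment variables.
    for var in _AZURE_API_VERSION_ENV_VARS:
        value = os.environ.get(var)
        if value:
            return value
    # 3. Fail fast rather than sending a request Azure will reject.
    raise ConfigError(
        "api-version is required for Azure deployments; set it via "
        "provider default_query or one of: "
        + ", ".join(_AZURE_API_VERSION_ENV_VARS)
    )
```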

kingdomseed (Author) commented:

I just realized a simpler solution is to add query-parameter support to the existing system; then you could easily use Kimi K2.5. Another option is a proxy. Closing this as overkill.

