Add MiniMax as cloud LLM provider#596

Closed
octo-patch wants to merge 1 commit into microsoft:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a first-class cloud LLM provider, following the existing external model provider pattern alongside OpenAI, Ollama, Foundry Local, and Lemonade.

MiniMax offers OpenAI-compatible API endpoints at api.minimax.io/v1, enabling seamless integration using the existing Microsoft.Extensions.AI.OpenAI NuGet package (no additional dependencies needed).
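Because the endpoint is OpenAI-compatible, wiring it up is mostly a matter of redirecting the OpenAI SDK's base URI. A minimal sketch of what the client creation inside the provider could look like, assuming recent Microsoft.Extensions.AI.OpenAI / OpenAI .NET package versions (the method name `CreateMiniMaxClient` is illustrative, not from the PR):

```csharp
using System;
using System.ClientModel;
using Microsoft.Extensions.AI;
using OpenAI;

static class MiniMaxClientFactory
{
    // Sketch: build an IChatClient against MiniMax's OpenAI-compatible API.
    public static IChatClient CreateMiniMaxClient(string apiKey, string modelId)
    {
        var options = new OpenAIClientOptions
        {
            // Point the OpenAI SDK at MiniMax instead of api.openai.com.
            Endpoint = new Uri("https://api.minimax.io/v1")
        };

        return new OpenAIClient(new ApiKeyCredential(apiKey), options)
            .GetChatClient(modelId)
            .AsIChatClient(); // Microsoft.Extensions.AI adapter over the OpenAI chat client
    }
}
```

This mirrors how the existing OpenAI provider can be retargeted at any compatible endpoint; only the base URI and model IDs differ.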

Models included

  • MiniMax-M2.7 - Latest flagship model with 1M context window
  • MiniMax-M2.5 - 204K context window
  • MiniMax-M2.5-highspeed - High-speed variant with 204K context

Changes (18 files, 790 additions)

Core provider:

  • MiniMaxModelProvider.cs: Provider implementing IExternalModelProvider, uses OpenAI SDK with custom endpoint
  • HardwareAccelerator.MINIMAX: Enum value added in all 3 locations (main, source generator, project template)
  • ExternalModelHelper.cs: Registered MiniMax in provider list
  • ModelDetailsHelper.cs: Added to IsLanguageModel() check
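The provider itself is small because the models are a static list. The exact shape of `IExternalModelProvider` lives in the AI Dev Gallery codebase and is not shown here, so the members below are assumptions for illustration only; the singleton pattern and model IDs come from this PR's description:

```csharp
using System.Collections.Generic;

// Sketch only: member names are hypothetical stand-ins for the real
// IExternalModelProvider contract in the AI Dev Gallery repository.
internal sealed class MiniMaxModelProvider // : IExternalModelProvider
{
    // Singleton instance, matching the pattern the PR's unit tests exercise.
    public static MiniMaxModelProvider Instance { get; } = new();
    private MiniMaxModelProvider() { }

    public string Name => "MiniMax";
    public string Url => "https://api.minimax.io/v1";

    // Static model list, as described in the PR summary.
    public IReadOnlyList<string> Models { get; } =
    [
        "MiniMax-M2.7",
        "MiniMax-M2.5",
        "MiniMax-M2.5-highspeed",
    ];
}
```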

UI:

  • MiniMaxPickerView.xaml + .xaml.cs: API key entry and model selection view (mirrors OpenAI picker pattern)
  • ModelPickerDefinitions.cs: Registered MiniMax picker tab
  • Icon assets: SVG (dark/light themes) + PNG variants

Tests:

  • 10 unit tests: singleton, properties, NuGet refs, model listing, client creation, code snippet generation
  • 3 integration tests: API connectivity, chat completion, streaming (gated by MINIMAX_API_KEY env var)
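The env-var gating keeps the integration tests from failing on machines without credentials. A sketch of that gate, assuming MSTest (the framework and test names here are illustrative, not taken from the PR's actual test files):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class MiniMaxIntegrationTests
{
    private static string? ApiKey =>
        Environment.GetEnvironmentVariable("MINIMAX_API_KEY");

    [TestMethod]
    public async Task ChatCompletion_ReturnsNonEmptyText()
    {
        if (string.IsNullOrEmpty(ApiKey))
        {
            // Report the test as skipped rather than failed when no key is set.
            Assert.Inconclusive("MINIMAX_API_KEY not set; skipping integration test.");
        }

        // ... create the client with ApiKey and assert on a chat completion.
        await Task.CompletedTask;
    }
}
```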

Docs:

  • AddingSamples.md: Added MINIMAX to HardwareAccelerator values list

Test plan

  • Build solution on Windows with Visual Studio 2022
  • Run unit tests (MiniMaxModelProviderTests)
  • Verify MiniMax tab appears in model picker alongside OpenAI
  • Enter MiniMax API key and verify model list loads
  • Select a MiniMax model and run a language model sample
  • Run integration tests with MINIMAX_API_KEY set (optional)

Add MiniMax as a first-class external model provider alongside OpenAI,
Ollama, Foundry Local, and Lemonade. MiniMax uses OpenAI-compatible API
via api.minimax.io/v1 with models M2.7, M2.5, and M2.5-highspeed.

Changes:
- MiniMaxModelProvider: OpenAI-compat client with static model list
- MiniMaxPickerView: XAML + code-behind for API key entry and model selection
- HardwareAccelerator.MINIMAX enum value in all three locations
- ExternalModelHelper: registered MiniMax provider in provider list
- ModelDetailsHelper: added MINIMAX to IsLanguageModel check
- ModelPickerDefinitions: registered MiniMax picker tab
- Icon assets: SVG (dark/light) and PNG variants
- Unit tests: 10 tests covering properties, models, client creation
- Integration tests: 3 tests for API connectivity and streaming
@haoliuu (Collaborator) left a comment

Thank you for the contribution! AI Dev Gallery is primarily focused on local AI capabilities - helping developers run models directly on their Windows devices. We'd like to keep the project aligned with that direction rather than expanding to individual cloud LLM providers at this time.

@haoliuu closed this Mar 30, 2026