
feat: add MiniMax as a first-class AI provider #60

Closed
octo-patch wants to merge 17 commits into TraderAlice:dev from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a built-in provider option for the Vercel AI SDK backend, alongside Anthropic, OpenAI, and Google.

MiniMax offers an OpenAI-compatible API, so this leverages the existing @ai-sdk/openai package with a custom base URL (https://api.minimax.io/v1) — no new dependencies required.

Changes

  • Model factory (model-factory.ts): add minimax case using createOpenAI with MiniMax's default endpoint
  • Config schema (config.ts): add minimax to the apiKeys Zod object
  • Web UI (AIProviderPage.tsx): add MiniMax to the provider selector with MiniMax-M2.5 model presets
  • Channel config (ChannelConfigModal.tsx): add MiniMax to per-channel provider dropdown
  • API route (config.ts): include minimax in the API key status endpoint
  • UI types (types.ts): add minimax to AIProviderConfig.apiKeys
  • README: mention MiniMax alongside other supported providers
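The model-factory change can be sketched as a small provider-to-endpoint mapping. This is a hypothetical helper, not code from the PR: the real change wires `createOpenAI` from `@ai-sdk/openai`, while this sketch models only the base-URL selection so it stays self-contained.

```typescript
// Hypothetical sketch of the provider routing added in model-factory.ts.
// Only the MiniMax URL is stated in the PR; the other entries assume the
// SDK defaults are used when no override is configured.
type Provider = "anthropic" | "openai" | "google" | "minimax";

const DEFAULT_BASE_URLS: Record<Provider, string | null> = {
  anthropic: null, // SDK default endpoint
  openai: null,    // SDK default endpoint
  google: null,    // SDK default endpoint
  minimax: "https://api.minimax.io/v1", // OpenAI-compatible endpoint
};

// Returns the base URL override for a provider, or null for SDK defaults.
export function resolveProviderBaseUrl(provider: Provider): string | null {
  return DEFAULT_BASE_URLS[provider];
}

// In the real factory this value would feed createOpenAI({ baseURL, apiKey, ... }).
console.log(resolveProviderBaseUrl("minimax"));
```

Because MiniMax speaks the OpenAI wire protocol, routing it through the existing OpenAI adapter with a different base URL avoids adding a new dependency.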

Configuration Example
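A minimal sketch of what the configuration might look like, assuming the `apiKeys` object described in the Changes list above; the field names and nesting are illustrative, not taken from the PR:

```json
{
  "ai": {
    "provider": "minimax",
    "model": "MiniMax-M2.5",
    "apiKeys": {
      "minimax": "your-minimax-api-key"
    }
  }
}
```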

Available Models

| Model                  | Context Window | Description           |
|------------------------|----------------|-----------------------|
| MiniMax-M2.5           | 204K tokens    | Full capability model |
| MiniMax-M2.5-highspeed | 204K tokens    | Optimized for speed   |

Test Plan

  • All existing tests pass (766/766)
  • Backend builds successfully (pnpm build:backend)
  • UI builds successfully (pnpm build:ui)
  • No new dependencies added
  • Fully backward compatible — existing provider configs unaffected

Commits

  • fix: CCXT contract resolution + provider resilience
  • v0.9: Unified trading, OpenTypeBB, Agent SDK, SSE streaming
  • v0.9.0-beta.1: versioning, testing, and stability
  • v0.9.0-beta.2: package publishing, bug fixes, refactoring
  • v0.9.0-beta.3: opentypebb bug fix, npmjs publish fix
  • ci: workflow_dispatch + independent registry checks
  • fix(ci): unify ui into pnpm workspace
  • fix(ci): remove pnpm cache from release publish job
  • v0.9.0-beta.5: cherry-pick fixes & docs from Claude branches
  • fix(build): fix opentypebb exports for production + add start script
  • fix: Vercel AI SDK parallel tool results + Anthropic base URL hint
  • fix: restore preview image in README
  • refactor: AI provider architecture cleanup + bug fixes
  • feat: add @traderalice/ibkr TWS API package + refactors
@octo-patch force-pushed the feature/add-minimax-provider branch from 1052347 to 22e116b on March 16, 2026 at 00:18
Add MiniMax (https://www.minimax.io) support via the Vercel AI SDK's
OpenAI-compatible adapter. MiniMax offers MiniMax-M2.5 (204K context)
and MiniMax-M2.5-highspeed models through an OpenAI-compatible API.

Changes:
- model-factory.ts: add minimax provider case using @ai-sdk/openai
  with compatibility mode and chat completions endpoint
- config.ts: add minimax to apiKeys Zod schema
- config.ts (web routes): expose minimax API key status
- types.ts (UI): add minimax to ApiKeys type
- AIProviderPage.tsx: add MiniMax to provider list with model presets
- README.md: mention MiniMax in AI Provider and api-keys docs

Tested with both MiniMax-M2.5 and MiniMax-M2.5-highspeed models.
All 766 existing tests continue to pass.
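The config-schema change described in the commit message can be sketched as follows. The real PR extends a Zod object schema; this hand-rolled validator is a hypothetical stand-in (names like `parseApiKeys` are illustrative) so the example runs without third-party dependencies.

```typescript
// Hypothetical sketch of the apiKeys validation extended in config.ts.
// The PR adds "minimax" alongside the existing provider keys.
const PROVIDER_KEYS = ["anthropic", "openai", "google", "minimax"] as const;
type ProviderKey = (typeof PROVIDER_KEYS)[number];
type ApiKeys = Partial<Record<ProviderKey, string>>;

// Accepts unknown input and keeps only known provider keys with string values,
// mirroring an object schema of optional strings.
export function parseApiKeys(input: unknown): ApiKeys {
  if (typeof input !== "object" || input === null) {
    throw new Error("apiKeys must be an object");
  }
  const out: ApiKeys = {};
  for (const key of PROVIDER_KEYS) {
    const value = (input as Record<string, unknown>)[key];
    if (typeof value === "string") out[key] = value;
  }
  return out;
}

// With Zod, the equivalent change would be roughly:
//   apiKeys: z.object({ ...existingKeys, minimax: z.string().optional() })
console.log(parseApiKeys({ minimax: "sk-test" }));
```

Keeping every provider key optional is what makes the change backward compatible: existing configs without a `minimax` entry still parse unchanged.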
@octo-patch force-pushed the feature/add-minimax-provider branch from 22e116b to 8ad0645 on March 16, 2026 at 00:22
@luokerenx4
Contributor

Closing as duplicate of #61.

@luokerenx4 luokerenx4 closed this Mar 16, 2026
