Add MiniMax AI (https://www.minimax.io/) as a fourth LLM provider option alongside OpenAI, Azure OpenAI, and Anthropic. MiniMax uses an OpenAI-compatible API, so it integrates via `ChatOpenAI` with a custom `base_url`.

- Add `MINIMAX_API_KEY` setting to `CodeInterpreterAPISettings`
- Add MiniMax branch in `_choose_llm()` with temperature clamping to [0.01, 1.0]
- Default to the `MiniMax-M2.5` model when no MiniMax-specific model is set
- Update README with MiniMax configuration instructions
- Add 12 unit tests and 3 integration tests
Summary
Adds MiniMax AI as a fourth LLM provider option alongside OpenAI, Azure OpenAI, and Anthropic.
- `MINIMAX_API_KEY` env var in `CodeInterpreterAPISettings` for auto-detection
- New MiniMax branch in `_choose_llm()` using `ChatOpenAI` with `base_url=https://api.minimax.io/v1` (OpenAI-compatible API)
- Defaults to `MiniMax-M2.5` when no MiniMax-specific model is configured; also supports `MiniMax-M2.5-highspeed` for faster responses
- Clamps temperature to the [0.01, 1.0] range required by the MiniMax API
- Returns a `ChatOpenAI` instance, so `OpenAIFunctionsAgent` is used automatically (function calling works)

Usage
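Configuration is a single environment variable; a minimal sketch, assuming the project's usual `.env` conventions (the key value below is a placeholder, not a real credential):

```shell
# .env — MiniMax provider configuration
# Setting only MINIMAX_API_KEY lets _choose_llm() auto-detect MiniMax.
MINIMAX_API_KEY=your-minimax-api-key
```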
Changes
- `src/codeinterpreterapi/config.py`: new `MINIMAX_API_KEY` setting
- `src/codeinterpreterapi/session.py`: MiniMax branch in `_choose_llm()`
- `README.md`: MiniMax configuration instructions
- `tests/test_minimax_provider.py`: unit tests
- `tests/test_minimax_integration.py`: integration tests

Test plan
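The `session.py` change can be sketched as follows. This is a self-contained model of the branch described above, not the upstream code: the real branch constructs a `ChatOpenAI` from `langchain_openai`, which is replaced here by a plain dataclass so the default-model and temperature-clamping logic can be shown on its own.

```python
# Sketch of the MiniMax branch in _choose_llm() (illustrative, not upstream code).
from dataclasses import dataclass
from typing import Optional

MINIMAX_BASE_URL = "https://api.minimax.io/v1"  # from the PR description
DEFAULT_MINIMAX_MODEL = "MiniMax-M2.5"


def clamp_minimax_temperature(temperature: float) -> float:
    """Clamp to the [0.01, 1.0] range the MiniMax API requires."""
    return min(max(temperature, 0.01), 1.0)


@dataclass
class LLMChoice:
    """Stand-in for the kwargs the real branch passes to ChatOpenAI."""
    model: str
    base_url: str
    api_key: str
    temperature: float


def choose_minimax_llm(
    api_key: str,
    model: Optional[str] = None,
    temperature: float = 0.03,
) -> LLMChoice:
    # Default to MiniMax-M2.5 when no MiniMax-specific model is configured.
    return LLMChoice(
        model=model or DEFAULT_MINIMAX_MODEL,
        base_url=MINIMAX_BASE_URL,
        api_key=api_key,
        temperature=clamp_minimax_temperature(temperature),
    )
```

Because the real branch returns a `ChatOpenAI` instance, the existing `OpenAIFunctionsAgent` path is reused without changes.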
- Unit tests pass (`pytest tests/test_minimax_provider.py`)
- Integration tests pass against the live API (`MINIMAX_API_KEY=... pytest tests/test_minimax_integration.py`)
- Verified the branch returns a `ChatOpenAI` instance for `OpenAIFunctionsAgent` compatibility
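The temperature clamping lends itself to small, deterministic unit tests. A pytest-style sketch, using a local stand-in for the project's helper (function names here are illustrative, not the ones in `tests/test_minimax_provider.py`):

```python
# Illustrative unit tests for the [0.01, 1.0] temperature clamping.

def clamp_minimax_temperature(temperature: float) -> float:
    """Local stand-in: clamp to the range the MiniMax API requires."""
    return min(max(temperature, 0.01), 1.0)


def test_temperature_below_minimum_is_raised():
    assert clamp_minimax_temperature(0.0) == 0.01


def test_temperature_above_maximum_is_lowered():
    assert clamp_minimax_temperature(1.7) == 1.0


def test_in_range_temperature_is_unchanged():
    assert clamp_minimax_temperature(0.5) == 0.5
```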