
feat: add MiniMax as LLM provider with M2.7 default #134

Open
octo-patch wants to merge 2 commits into MiroMindAI:main from octo-patch:feat/add-minimax-provider

Conversation


@octo-patch octo-patch commented Mar 12, 2026

Summary

  • Add MiniMax as a new LLM provider via OpenAI-compatible API
  • Support MiniMax-M2.7 (default), MiniMax-M2.7-highspeed, MiniMax-M2.5, and MiniMax-M2.5-highspeed models
  • Add Hydra config files for easy model switching
  • Update README with MiniMax usage examples

Changes

  • Add minimax provider to factory.py using OpenAIClient
  • Add config files: minimax.yaml (M2.7 default), minimax-m2.7-highspeed.yaml, minimax-m2.5-highspeed.yaml
  • Update README with MiniMax provider documentation

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities. Because MiniMax exposes an OpenAI-compatible API, the integration reuses the existing OpenAIClient rather than adding a new client implementation.

Testing

  • Config files validated (YAML parsing)
  • Factory pattern tested with minimax provider
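The "YAML parsing" check above can be sketched as a small smoke test. A real run would load the actual config files with `yaml.safe_load` (PyYAML); to stay dependency-free this sketch only verifies the flat `key: value` shape, and the embedded file contents are assumptions, not the repository's actual configs.

```python
# Hypothetical smoke check mirroring "Config files validated (YAML parsing)".
# The config bodies below are illustrative assumptions.

SAMPLE_CONFIGS = {
    "minimax.yaml": "provider: minimax\nmodel: MiniMax-M2.7\n",
    "minimax-m2.7-highspeed.yaml":
        "provider: minimax\nmodel: MiniMax-M2.7-highspeed\n",
}

def parse_flat_yaml(text: str) -> dict:
    """Parse a flat `key: value` YAML document (no nesting, no lists)."""
    result = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        key, _, value = line.partition(":")
        result[key.strip()] = value.strip()
    return result

for name, body in SAMPLE_CONFIGS.items():
    cfg = parse_flat_yaml(body)
    assert cfg["provider"] == "minimax", name
    assert cfg["model"].startswith("MiniMax-"), name
```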

PR Bot and others added 2 commits March 12, 2026 19:58
Add MiniMax provider support using the OpenAI-compatible API interface.
MiniMax offers two models: MiniMax-M2.5 (default) and MiniMax-M2.5-highspeed,
both supporting 204K context window.

Changes:
- Add minimax to supported providers in factory.py (reuses OpenAIClient)
- Add Hydra config files for both MiniMax models
- Update default.yaml provider comment to include minimax
- Add MiniMax usage example in README.md
- Update default MiniMax model from M2.5 to M2.7 in minimax.yaml
- Add MiniMax-M2.7-highspeed config for low-latency scenarios
- Keep M2.5 and M2.5-highspeed configs as available alternatives
- Update README with M2.7 as default and list all available models
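Putting the commit notes together, a `minimax.yaml` Hydra config might look like the fragment below. The key names follow common Hydra/OmegaConf conventions and the endpoint URL is assumed; only the provider name, default model, and 204K context figure come from the PR itself.

```yaml
# Hypothetical sketch of minimax.yaml; schema and endpoint are assumptions.
provider: minimax
model: MiniMax-M2.7                 # default; highspeed/M2.5 variants live in sibling files
base_url: https://api.minimax.io/v1 # assumed endpoint
api_key: ${oc.env:MINIMAX_API_KEY}  # OmegaConf environment-variable interpolation
max_context_tokens: 204800          # 204K context window per the commit notes
```

A highspeed variant file would differ only in the `model` field, which is what makes the separate per-model config files cheap to maintain.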
@octo-patch octo-patch changed the title feat: add MiniMax as LLM provider feat: add MiniMax as LLM provider with M2.7 default Mar 18, 2026