Feat: Add MiniMax AI Provider #3295
Conversation
MiniMax should not use OpenAI endpoints but Anthropic endpoints to preserve thinking!
Sadly, RooCode merged this suggestion today. I can only warn against using this implementation.
@nokaka there's no stagnation; support for MiniMax as a provider will be released shortly.
The implementation, at least in Cline (I can't speak for Roo), is like this and useless. The degradation in performance is absolutely noticeable when the OpenAI-compatible API is used. So if it works well in Roo, it means they did proper Anthropic API support. Kilo is working with MiniMax's lead engineer to bring a proper Anthropic API implementation.
Hi. If Roo Code allows users to select custom models, including MiniMax's M2, within the Anthropic API framework, we could simply update the base URL to point to MiniMax, and everything would work as expected.
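For illustration, a minimal sketch of that approach using the `@anthropic-ai/sdk` client with a custom base URL. The endpoint URL, environment variable name, and model identifier below are assumptions for illustration, not values confirmed in this thread.

```typescript
// Sketch: point the Anthropic SDK at an Anthropic-compatible MiniMax endpoint.
import Anthropic from "@anthropic-ai/sdk"

const client = new Anthropic({
	apiKey: process.env.MINIMAX_API_KEY ?? "", // hypothetical env var
	baseURL: "https://api.minimax.example/anthropic", // placeholder Anthropic-compatible endpoint
})

async function main() {
	const message = await client.messages.create({
		model: "MiniMax-M2", // model name as referenced in this PR
		max_tokens: 1024,
		messages: [{ role: "user", content: "Hello from Kilo Code" }],
	})
	console.log(message.content)
}

main()
```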
Context
Add MiniMax as a new provider to enable users to leverage the kilocode plugin for coding, reasoning, and other AI-assisted development tasks using the MiniMax-M2 model.
Implementation
Implementation follows the same architectural pattern as existing providers (Doubao, Z AI, etc.).
All protocol buffer definitions, state management, and UI components have been properly integrated. MiniMax AI uses OpenAI-compatible API endpoints, ensuring reliable integration.
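As a rough sketch of the OpenAI-compatible path described above, the call shape looks like the following, using the `openai` SDK directly. The base URL and model name are assumptions for illustration; the actual provider handler in this PR wires this through the extension's existing provider abstraction.

```typescript
// Sketch: MiniMax via an OpenAI-compatible chat completions endpoint.
import OpenAI from "openai"

const minimax = new OpenAI({
	apiKey: process.env.MINIMAX_API_KEY ?? "", // hypothetical env var
	baseURL: "https://api.minimax.example/v1", // placeholder OpenAI-compatible endpoint
})

async function complete(prompt: string): Promise<string> {
	const response = await minimax.chat.completions.create({
		model: "MiniMax-M2",
		messages: [{ role: "user", content: prompt }],
	})
	return response.choices[0]?.message?.content ?? ""
}
```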
Screenshots
How to Test
I have thoroughly tested this implementation: