
Conversation

@Maosghoul

Context

Add MiniMax as a new provider to enable users to leverage the kilocode plugin for coding, reasoning, and other AI-assisted development tasks using the MiniMax-M2 model.

Implementation

Implementation follows the same architectural pattern as existing providers (Doubao, Z AI, etc.).
All protocol buffer definitions, state management, and UI components have been properly integrated. MiniMax uses OpenAI-compatible API endpoints, ensuring reliable integration.
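The wiring described above can be sketched as a provider config plus an OpenAI-style request builder. Everything below is an illustrative assumption, not the actual kilocode provider schema, and the base URL is a guess; check MiniMax's documentation for the real endpoint.

```typescript
// Hedged sketch of an OpenAI-compatible provider entry for MiniMax.
// Names, field shapes, and the base URL are assumptions for illustration.
interface ProviderConfig {
  name: string;
  baseUrl: string;
  model: string;
}

const miniMaxProvider: ProviderConfig = {
  name: "minimax",
  baseUrl: "https://api.minimax.io/v1", // assumed endpoint; verify against MiniMax docs
  model: "MiniMax-M2",
};

// Build an OpenAI-style chat completion request from the provider config.
function buildChatRequest(cfg: ProviderConfig, prompt: string) {
  return {
    url: `${cfg.baseUrl}/chat/completions`,
    body: {
      model: cfg.model,
      messages: [{ role: "user" as const, content: prompt }],
    },
  };
}
```

Because the request shape matches OpenAI's Chat Completions API, an existing OpenAI-compatible client in the codebase could be reused with only the base URL and model id swapped.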

Screenshots

(Screenshots attached: Clipboard_Screenshot_1761383785, Clipboard_Screenshot_1761383894)

How to Test

I have thoroughly tested this implementation:

  1. API Integration: Selected the MiniMax provider, added a MiniMax API key, and successfully connected to the service using OpenAI-compatible endpoints
  2. Model Selection: MiniMax-M2 is selectable and functional
  3. Chat Functionality: Tested chat interactions with both models; they respond appropriately to coding requests and general queries
  4. UI Integration: Confirmed the MiniMax provider appears correctly in the provider dropdown with proper API key validation
  5. Build Validation: Full build passes (npm run package) with no errors or warnings
  6. Local Run: Press F5 to launch the extension for manual testing

@changeset-bot

changeset-bot bot commented Oct 25, 2025

⚠️ No Changeset found

Latest commit: 4442799

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types.


@secondsky

MiniMax should not use OpenAI endpoints but Anthropic endpoints to preserve thinking!
It's also what their engineers suggest using, as the model's performance is significantly better with the Anthropic endpoint.
Therefore, if this is using OpenAI endpoints, it should not be merged and used.
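For context on the objection, the Anthropic Messages wire format is where "thinking" content is surfaced as distinct response blocks, which the OpenAI chat-completions shape has no slot for. A minimal sketch of an Anthropic-style request follows; the base URL is a hypothetical placeholder, not a confirmed MiniMax endpoint.

```typescript
// Sketch only: an Anthropic-style /v1/messages request. The URL below is a
// placeholder; only the header names and body shape follow Anthropic's API.
interface AnthropicStyleRequest {
  url: string;
  headers: Record<string, string>;
  body: {
    model: string;
    max_tokens: number;
    messages: { role: "user" | "assistant"; content: string }[];
  };
}

function buildAnthropicRequest(apiKey: string, prompt: string): AnthropicStyleRequest {
  return {
    url: "https://api.minimax.example/anthropic/v1/messages", // hypothetical path
    headers: {
      "x-api-key": apiKey,
      "anthropic-version": "2023-06-01",
      "content-type": "application/json",
    },
    body: {
      model: "MiniMax-M2",
      max_tokens: 1024,
      messages: [{ role: "user", content: prompt }],
    },
  };
}
```

Supporting this shape alongside the OpenAI-compatible one would let the provider preserve thinking blocks where the upstream service exposes them.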

@secondsky

Sadly, RooCode merged this suggestion today. I can only warn against the usage of this implementation.
MiniMax states clearly on their Twitter that the OpenAI endpoints should only be used if one has no other option. I can only second this; I tried both, and the difference between their Anthropic endpoint and the OpenAI completions endpoint is night and day.
People will have a subpar model experience with this implementation, as it uses the OpenAI Completions API.

@nokaka

nokaka commented Oct 31, 2025

@chrarnoldus @secondsky

  • The latest version of RooCode already includes this provider and related models.
  • In practice, it works very well, and its stability is comparable to DeepSeek's, far surpassing Z AI's GLM-4.6.
  • This matter shouldn't remain stagnant just because it's not perfect.

@chrarnoldus
Collaborator

@nokaka there's no stagnation; support for MiniMax as a provider will be released shortly.

@secondsky

> @chrarnoldus @secondsky
>
>   • The latest version of RooCode already includes this provider and related models.
>   • In practice, it works very well, and its stability is comparable to DeepSeek's, far surpassing Z AI's GLM-4.6.
>   • This matter shouldn't remain stagnant just because it's not perfect.

The implementation, at least in Cline (I can't speak for Roo), is like this, and useless. The degradation in performance is absolutely noticeable when the OpenAI API is used. So if it works well in Roo, it means they did a proper Anthropic API implementation.

Kilo is working with MiniMax's lead engineer to bring a proper Anthropic API implementation.
There will also be a Coding Plan, like for GLM, at some point.
So please be patient or use another option.

@kavehsfv

kavehsfv commented Oct 31, 2025

> @chrarnoldus @secondsky
>
>   • The latest version of RooCode already includes this provider and related models.
>   • In practice, it works very well, and its stability is comparable to DeepSeek's, far surpassing Z AI's GLM-4.6.
>   • This matter shouldn't remain stagnant just because it's not perfect.
>
> The implementation, at least in Cline (I can't speak for Roo), is like this, and useless. The degradation in performance is absolutely noticeable when the OpenAI API is used. So if it works well in Roo, it means they did a proper Anthropic API implementation.
>
> Kilo is working with MiniMax's lead engineer to bring a proper Anthropic API implementation. There will also be a Coding Plan, like for GLM, at some point. So please be patient or use another option.

Hi,
What about Roo Code? Are they planning to transition to an Anthropic API implementation?

If Roo Code allows users to select custom models, including MiniMax's M2, within the Anthropic API framework, we could simply update the base URL to point to MiniMax, and everything would work as expected.
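The base-URL swap suggested here amounts to a settings override on an existing Anthropic-style provider. A minimal sketch, with the caveat that every name and URL below is a hypothetical illustration, not Roo Code's actual settings schema:

```typescript
// Hypothetical settings shape for an Anthropic-style provider.
interface AnthropicProviderSettings {
  baseURL: string;
  model: string;
}

// Assumed defaults pointing at Anthropic's own API.
const anthropicDefaults: AnthropicProviderSettings = {
  baseURL: "https://api.anthropic.com",
  model: "claude-3-5-sonnet-latest",
};

// Reuse the existing Anthropic code path, overriding only endpoint and model id.
function withCustomEndpoint(
  base: AnthropicProviderSettings,
  overrides: Partial<AnthropicProviderSettings>
): AnthropicProviderSettings {
  return { ...base, ...overrides };
}

const miniMaxViaAnthropic = withCustomEndpoint(anthropicDefaults, {
  baseURL: "https://api.minimax.example/anthropic", // placeholder URL
  model: "MiniMax-M2",
});
```

The appeal of this design is that no new provider code is needed: the Anthropic request/response handling, including thinking blocks, is inherited unchanged, and only the two override fields differ per upstream service.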
