feat: add MiniMax as LLM provider#360

Open
octo-patch wants to merge 1 commit into SmythOS:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax AI as a first-class LLM provider in SmythOS SRE, using the existing OpenAI-compatible connector pattern (same approach as DeepSeek and TogetherAI).

  • Register MiniMax in the BuiltinLLMProviders enum and wire it to the OpenAIConnector
  • Add 4 MiniMax models: M2.7 (1M context), M2.7-highspeed, M2.5 (204K context), M2.5-highspeed
  • Add 10 unit tests + 5 integration tests (all passing)
  • Update README to list MiniMax as a supported LLM connector

Why MiniMax?

MiniMax provides OpenAI-compatible chat completion APIs with competitive models:

  • MiniMax-M2.7: Latest model with 1M token context window
  • MiniMax-M2.5: 204K context with highspeed variant for lower latency

Since the API is fully OpenAI-compatible, no new SDK dependency or custom connector is needed — MiniMax reuses the existing OpenAIConnector with baseURL: 'https://api.minimax.io/v1'.
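To illustrate what "OpenAI-compatible" means in practice, here is a minimal sketch of the request shape the connector ends up sending. The field names follow the standard OpenAI chat-completions schema and the base URL is the one from this PR; the `buildChatRequest` helper is purely illustrative and not part of the SmythOS codebase:

```typescript
// Illustrative only: shows the OpenAI-compatible request shape that lets
// MiniMax reuse the existing OpenAIConnector. Only the base URL changes.
const MINIMAX_BASE_URL = 'https://api.minimax.io/v1';

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  temperature = 0.7
) {
  return {
    url: `${MINIMAX_BASE_URL}/chat/completions`,
    body: { model, messages, temperature },
  };
}

const req = buildChatRequest('MiniMax-M2.7', [
  { role: 'user', content: 'Hello!' },
]);
console.log(req.url); // https://api.minimax.io/v1/chat/completions
```

Because the wire format is identical, swapping providers is a matter of pointing the same connector at a different base URL.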

Usage

// SDK factory
const llm = LLM.MiniMax('MiniMax-M2.7', { temperature: 0.7 });
const response = await llm.prompt('Hello!');

// Model factory
const model = Model.MiniMax('MiniMax-M2.7');

Files Changed (6 files, 376 additions)

  • packages/core/src/types/LLM.types.ts: Add MiniMax to BuiltinLLMProviders
  • packages/core/src/subsystems/LLMManager/LLM.service/index.ts: Register and init the MiniMax connector
  • packages/core/src/subsystems/LLMManager/models.ts: Add 4 MiniMax model definitions
  • packages/sdk/tests/unit/005-LLM/07-llm-minimax.test.ts: 10 unit tests
  • packages/sdk/tests/integration/005-LLM/07-llm-minimax.test.ts: 5 integration tests
  • README.md: Add MiniMax to the supported connectors list
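For context, the four entries added to models.ts might look roughly like this. The field names here are hypothetical (the actual schema in SmythOS may differ), and the context sizes are taken from the PR description (1M for M2.7, 204K for M2.5; the highspeed variants are assumed to share their base model's context window):

```typescript
// Hypothetical sketch of the model-definition entries; the real schema in
// packages/core/src/subsystems/LLMManager/models.ts may use different fields.
interface ModelDef {
  provider: string;
  contextWindow: number; // in tokens; sizes per the PR description
}

const minimaxModels: Record<string, ModelDef> = {
  'MiniMax-M2.7':           { provider: 'MiniMax', contextWindow: 1_000_000 },
  'MiniMax-M2.7-highspeed': { provider: 'MiniMax', contextWindow: 1_000_000 }, // assumed same as M2.7
  'MiniMax-M2.5':           { provider: 'MiniMax', contextWindow: 204_000 },
  'MiniMax-M2.5-highspeed': { provider: 'MiniMax', contextWindow: 204_000 },   // assumed same as M2.5
};
```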

Test Plan

  • 10 unit tests pass (vitest run packages/sdk/tests/unit/005-LLM/07-llm-minimax.test.ts)
  • 5 integration tests pass (vitest run packages/sdk/tests/integration/005-LLM/07-llm-minimax.test.ts)
  • All existing LLM tests continue to pass (no regressions)
  • Manual verification with MiniMax API key (optional)

Add MiniMax AI as a first-class LLM provider using the existing OpenAI-compatible
connector pattern (same approach as DeepSeek and TogetherAI).

Changes:
- Register MiniMax in BuiltinLLMProviders enum
- Wire MiniMax to OpenAIConnector in LLMService (register + init)
- Add 4 MiniMax models: M2.7, M2.7-highspeed, M2.5, M2.5-highspeed
- Add 10 unit tests and 5 integration tests
- Update README to list MiniMax as supported connector
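The registration steps above can be sketched as follows. The enum name and the DeepSeek/TogetherAI precedents come from this PR; the connector-registry shape is an assumption standing in for the real LLMService internals:

```typescript
// Hypothetical sketch of the wiring; SmythOS's actual internal API differs.
enum BuiltinLLMProviders {
  OpenAI = 'OpenAI',
  DeepSeek = 'DeepSeek',
  TogetherAI = 'TogetherAI',
  MiniMax = 'MiniMax', // added by this PR
}

// Minimal stand-in for the LLMService connector registry.
const connectors = new Map<BuiltinLLMProviders, { baseURL: string }>();

// MiniMax reuses the OpenAI-compatible connector, pointed at MiniMax's API,
// exactly as DeepSeek and TogetherAI do for their own endpoints.
connectors.set(BuiltinLLMProviders.MiniMax, {
  baseURL: 'https://api.minimax.io/v1',
});
```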
@SyedZawwarAhmed
Contributor

@octo-patch Hey, thanks so much for this contribution! Really appreciate you taking the time to add MiniMax as a provider.

We'll run through our own testing on our end to make sure everything integrates cleanly, and then we'll get this merged. Thanks again, great first contribution to the project!
