diff --git a/README.ja.md b/README.ja.md index 8a9241fca..bd04c2507 100644 --- a/README.ja.md +++ b/README.ja.md @@ -79,7 +79,7 @@ Discord コミュニティへのご参加をお待ちしています。使用中 - **その他**: さらに多くのエージェントを計画中... ## 柔軟な統合 -- **複数のLLMプロバイダー**: OpenRouter、SiliconFlow、Azure、Openai-compatible、Google、OpenAIおよびDeepSeekをサポート +- **複数のLLMプロバイダー**: OpenRouter、SiliconFlow、Azure、Openai-compatible、Google、OpenAI、DeepSeekおよび[MiniMax](https://platform.minimaxi.com/)をサポート - **人気の市場データ**: 米国市場、暗号市場、香港市場、中国市場など - **マルチエージェントフレームワーク対応**: A2AプロトコルによるLangchain、Agnoをサポート、研究開発の統合を行う - **取引所接続**: OKX と Binance へのリアルタイムルーティングに対応し、安全ガードレールを内蔵 @@ -263,7 +263,7 @@ ValueCell は外部サービスを統合し、サードパーティ製ウィジ |---------|------|-----------------| | **TradingView Advanced Chart** | 埋め込み iframe ウィジェット | [Free Advanced Charts Agreement](https://www.tradingview.com/chart-embedding/)(プロプライエタリ) | | **取引所 API**(Binance、OKX、Hyperliquid など) | REST/WebSocket エンドポイント | 各取引所の利用規約(例: [Binance API 規約](https://www.binance.com/en/terms)) | -| **LLM プロバイダー**(OpenAI、Azure、Google、DeepSeek など) | 推論 API | プロバイダー固有の利用規約(例: [OpenAI 利用規約](https://openai.com/policies/terms-of-use)) | +| **LLM プロバイダー**(OpenAI、Azure、Google、DeepSeek、MiniMax など) | 推論 API | プロバイダー固有の利用規約(例: [OpenAI 利用規約](https://openai.com/policies/terms-of-use)) | # Star History diff --git a/README.md b/README.md index 48118dade..2aef60daa 100644 --- a/README.md +++ b/README.md @@ -81,7 +81,7 @@ Welcome to join our Discord community to share feedback and issues you encounter - **Others**: More agents are in planning... 
## Flexible Integrations -- **Multiple LLM Providers**: Support OpenRouter, SiliconFlow,Azure,Openai-compatible,Google,OpenAI and DeepSeek +- **Multiple LLM Providers**: Supports OpenRouter, SiliconFlow, Azure, OpenAI-compatible, Google, OpenAI, DeepSeek, and [MiniMax](https://platform.minimaxi.com/) - **Popular Market Data**: Cover US market, Crypto market, Hong Kong market, China market and more - **Multi-Agent Framework Compatible**: Support Langchain, Agno by A2A Protocol for research and development integration - **Exchange Connectivity**: Live routing to OKX and Binance, featuring built-in guardrails @@ -263,7 +263,7 @@ ValueCell integrates external services and embeds third-party widgets. Their usa |---------|------|-----------------| | **TradingView Advanced Chart** | Embedded iframe widget | [Free Advanced Charts Agreement](https://www.tradingview.com/chart-embedding/) (proprietary) | | **Exchange APIs** (Binance, OKX, Hyperliquid, etc.) | REST/WebSocket endpoints | Each exchange’s ToS (e.g., [Binance API Terms](https://www.binance.com/en/terms)) | -| **LLM Providers** (OpenAI, Azure, Google, DeepSeek, etc.) | Inference APIs | Provider-specific ToS (e.g., [OpenAI ToS](https://openai.com/policies/terms-of-use)) | +| **LLM Providers** (OpenAI, Azure, Google, DeepSeek, MiniMax, etc.) | Inference APIs | Provider-specific ToS (e.g., [OpenAI ToS](https://openai.com/policies/terms-of-use)) | # Star History diff --git a/README.zh.md b/README.zh.md index b7e97b706..6d8f1f0e8 100644 --- a/README.zh.md +++ b/README.zh.md @@ -79,7 +79,7 @@ ValueCell 是一个社区驱动的多智能体金融应用平台,我们的使 - **其他智能体**:更多智能体正在规划中...
## 灵活集成 -- **多种大语言模型提供商**:支持 OpenRouter、SiliconFlow、Azure、Openai-compatible、Google、OpenAI和DeepSeek +- **多种大语言模型提供商**:支持 OpenRouter、SiliconFlow、Azure、Openai-compatible、Google、OpenAI、DeepSeek和[MiniMax](https://platform.minimaxi.com/) - **热门市场数据**:覆盖美国市场、加密货币市场、香港市场、中国市场等 - **多智能体框架兼容**:通过 A2A 协议,支持 Langchain、Agno 等主流Agent框架,进行研发集成 - **交易所连接**:支持实时路由至 OKX 和 Binance,并内置安全防护机制 @@ -263,7 +263,7 @@ ValueCell 集成外部服务并嵌入第三方挂件。其使用不受 Apache 2. |---------|------|-----------------| | **TradingView Advanced Chart** | 嵌入式 iframe 挂件 | [Free Advanced Charts Agreement](https://www.tradingview.com/chart-embedding/)(专有) | | **交易所 API**(Binance、OKX、Hyperliquid 等) | REST/WebSocket 接口 | 各交易所的服务条款(例如 [Binance API 条款](https://www.binance.com/en/terms)) | -| **大语言模型提供商**(OpenAI、Azure、Google、DeepSeek 等) | 推理 API | 供应商特定的使用条款(例如 [OpenAI 使用条款](https://openai.com/policies/terms-of-use)) | +| **大语言模型提供商**(OpenAI、Azure、Google、DeepSeek、MiniMax 等) | 推理 API | 供应商特定的使用条款(例如 [OpenAI 使用条款](https://openai.com/policies/terms-of-use)) | # Star diff --git a/README.zh_Hant.md b/README.zh_Hant.md index fd53fe77d..a66cb091e 100644 --- a/README.zh_Hant.md +++ b/README.zh_Hant.md @@ -79,7 +79,7 @@ ValueCell 是一個社群驅動的多智能體金融應用平台,我們的使 - **其他智能體**:更多智能體正在規劃中... ## 彈性整合 -- **多家大型語言模型供應商**:支援 OpenRouter、SiliconFlow、Openai-compatible、Azure、Google、OpenAI 與 DeepSeek +- **多家大型語言模型供應商**:支援 OpenRouter、SiliconFlow、Openai-compatible、Azure、Google、OpenAI、DeepSeek 與 [MiniMax](https://platform.minimaxi.com/) - **熱門市場資料**:涵蓋美國市場、加密貨幣、香港市場、中國市場等 - **多智能體框架相容**:透過 A2A 協議,支援 LangChain、Agno 等主流 Agent 框架,進行研發整合 - **交易所連接**:支援即時路由至 OKX 和 Binance,並內建安全防護機制 @@ -263,7 +263,7 @@ ValueCell 整合外部服務並嵌入第三方掛件。其使用不受 Apache 2. 
|---------|------|-----------------| | **TradingView Advanced Chart** | 嵌入式 iframe 掛件 | [Free Advanced Charts Agreement](https://www.tradingview.com/chart-embedding/)(專有) | | **交易所 API**(Binance、OKX、Hyperliquid 等) | REST/WebSocket 端點 | 各交易所的服務條款(例如 [Binance API 條款](https://www.binance.com/en/terms)) | -| **LLM 供應商**(OpenAI、Azure、Google、DeepSeek 等) | 推論 API | 供應商特定的使用條款(例如 [OpenAI 使用條款](https://openai.com/policies/terms-of-use)) | +| **LLM 供應商**(OpenAI、Azure、Google、DeepSeek、MiniMax 等) | 推論 API | 供應商特定的使用條款(例如 [OpenAI 使用條款](https://openai.com/policies/terms-of-use)) | # Star diff --git a/docs/CONFIGURATION_GUIDE.md b/docs/CONFIGURATION_GUIDE.md index 2605a65d1..44834b183 100644 --- a/docs/CONFIGURATION_GUIDE.md +++ b/docs/CONFIGURATION_GUIDE.md @@ -28,6 +28,7 @@ ValueCell supports multiple LLM providers. Choose at least one: | **Google** | [ai.google.dev](https://ai.google.dev/) | | **OpenAI** | [platform.openai.com](https://platform.openai.com/) | | **DashScope** | [bailian.console.aliyun.com](https://bailian.console.aliyun.com/#/home) | +| **MiniMax** | [platform.minimaxi.com](https://platform.minimaxi.com/) | ### Step 2: Configure .env File diff --git a/frontend/src/assets/png/index.ts b/frontend/src/assets/png/index.ts index 4aaea61e5..a1ea65f23 100644 --- a/frontend/src/assets/png/index.ts +++ b/frontend/src/assets/png/index.ts @@ -37,6 +37,7 @@ export { default as AzurePng } from "./model-providers/azure.png"; export { default as DashScopePng } from "./model-providers/dashscope.png"; export { default as DeepSeekPng } from "./model-providers/deepseek.png"; export { default as GooglePng } from "./model-providers/google.png"; +export { default as MiniMaxPng } from "./model-providers/minimax.png"; export { default as OllamaPng } from "./model-providers/ollama.png"; export { default as OpenAiPng } from "./model-providers/openai.png"; export { default as OpenAiCompatiblePng } from "./model-providers/openai-compatible.png"; diff --git 
a/frontend/src/assets/png/model-providers/minimax.png b/frontend/src/assets/png/model-providers/minimax.png new file mode 100644 index 000000000..48cc810c1 Binary files /dev/null and b/frontend/src/assets/png/model-providers/minimax.png differ diff --git a/frontend/src/constants/icons.ts b/frontend/src/constants/icons.ts index 25fe155bb..980e2bf2e 100644 --- a/frontend/src/constants/icons.ts +++ b/frontend/src/constants/icons.ts @@ -9,6 +9,7 @@ import { GooglePng, HyperliquidPng, MexcPng, + MiniMaxPng, OkxPng, OllamaPng, OpenAiCompatiblePng, @@ -27,6 +28,7 @@ export const MODEL_PROVIDER_ICONS = { google: GooglePng, azure: AzurePng, dashscope: DashScopePng, + minimax: MiniMaxPng, ollama: OllamaPng, }; diff --git a/frontend/src/i18n/locales/en.json b/frontend/src/i18n/locales/en.json index 3eaf87c75..3e86fd857 100644 --- a/frontend/src/i18n/locales/en.json +++ b/frontend/src/i18n/locales/en.json @@ -173,6 +173,7 @@ "dashscope": "Alibaba Cloud", "deepseek": "DeepSeek", "google": "Google Cloud", + "minimax": "MiniMax", "openai": "OpenAI", "openai-compatible": "OpenAI Compatible API", "openrouter": "OpenRouter", diff --git a/frontend/src/i18n/locales/ja.json b/frontend/src/i18n/locales/ja.json index b09f36495..6927a2e6d 100644 --- a/frontend/src/i18n/locales/ja.json +++ b/frontend/src/i18n/locales/ja.json @@ -173,6 +173,7 @@ "dashscope": "Alibaba Cloud", "deepseek": "DeepSeek", "google": "Google Cloud", + "minimax": "MiniMax", "openai": "OpenAI", "openai-compatible": "OpenAI互換API", "openrouter": "OpenRouter", diff --git a/frontend/src/i18n/locales/zh_CN.json b/frontend/src/i18n/locales/zh_CN.json index 36b478ab8..83b53bb09 100644 --- a/frontend/src/i18n/locales/zh_CN.json +++ b/frontend/src/i18n/locales/zh_CN.json @@ -173,6 +173,7 @@ "dashscope": "阿里云", "deepseek": "深度求索", "google": "谷歌云", + "minimax": "MiniMax", "openai": "OpenAI", "openai-compatible": "OpenAI兼容API", "openrouter": "OpenRouter", diff --git a/frontend/src/i18n/locales/zh_TW.json 
b/frontend/src/i18n/locales/zh_TW.json index 32247aba6..7f549318d 100644 --- a/frontend/src/i18n/locales/zh_TW.json +++ b/frontend/src/i18n/locales/zh_TW.json @@ -173,6 +173,7 @@ "dashscope": "Alibaba Cloud", "deepseek": "DeepSeek", "google": "Google Cloud", + "minimax": "MiniMax", "openai": "OpenAI", "openai-compatible": "OpenAI相容API", "openrouter": "OpenRouter", diff --git a/python/configs/config.yaml b/python/configs/config.yaml index 504961762..b9657ce11 100644 --- a/python/configs/config.yaml +++ b/python/configs/config.yaml @@ -46,7 +46,11 @@ models: deepseek: config_file: "providers/deepseek.yaml" api_key_env: "DEEPSEEK_API_KEY" - + + minimax: + config_file: "providers/minimax.yaml" + api_key_env: "MINIMAX_API_KEY" + dashscope: config_file: "providers/dashscope.yaml" api_key_env: "DASHSCOPE_API_KEY" diff --git a/python/configs/providers/minimax.yaml b/python/configs/providers/minimax.yaml new file mode 100644 index 000000000..3ad2657a3 --- /dev/null +++ b/python/configs/providers/minimax.yaml @@ -0,0 +1,49 @@ +# ============================================ +# MiniMax Provider Configuration +# ============================================ +# MiniMax provides OpenAI-compatible API endpoints for their models. +# All models support 204K context length. +# +# Usage: +# 1. Set MINIMAX_API_KEY environment variable +# 2. 
Set PRIMARY_PROVIDER=minimax (or use as fallback) +# +# Get your API key at: https://platform.minimaxi.com/ + +name: "MiniMax" +provider_type: "minimax" + +enabled: true + +# Connection Configuration +connection: + base_url: "https://api.minimax.io/v1" + api_key_env: "MINIMAX_API_KEY" + +# Default model if none specified +default_model: "MiniMax-M2.7" + +# Model Parameters Defaults +# Note: MiniMax temperature must be in (0.0, 1.0] range +defaults: + temperature: 0.7 + max_tokens: 8096 + +# Available Models +models: + - id: "MiniMax-M2.7" + name: "MiniMax M2.7" + context_length: 204000 + description: "MiniMax M2.7 - latest flagship model with 204K context" + - id: "MiniMax-M2.7-highspeed" + name: "MiniMax M2.7 Highspeed" + context_length: 204000 + description: "MiniMax M2.7 Highspeed - faster variant with 204K context" + - id: "MiniMax-M2.5" + name: "MiniMax M2.5" + context_length: 204000 + description: "MiniMax M2.5 model with 204K context" + - id: "MiniMax-M2.5-highspeed" + name: "MiniMax M2.5 Highspeed" + context_length: 204000 + description: "MiniMax M2.5 Highspeed - faster variant with 204K context" diff --git a/python/valuecell/adapters/models/__init__.py b/python/valuecell/adapters/models/__init__.py index 0bdbd6334..3b13e9cb6 100644 --- a/python/valuecell/adapters/models/__init__.py +++ b/python/valuecell/adapters/models/__init__.py @@ -24,6 +24,7 @@ DashScopeProvider, DeepSeekProvider, GoogleProvider, + MiniMaxProvider, ModelFactory, ModelProvider, OllamaProvider, @@ -49,6 +50,7 @@ "AzureProvider", "SiliconFlowProvider", "DeepSeekProvider", + "MiniMaxProvider", "DashScopeProvider", "OllamaProvider", # Convenience functions diff --git a/python/valuecell/adapters/models/factory.py b/python/valuecell/adapters/models/factory.py index 341b90525..bcace8ed2 100644 --- a/python/valuecell/adapters/models/factory.py +++ b/python/valuecell/adapters/models/factory.py @@ -503,6 +503,58 @@ def create_model(self, model_id: Optional[str] = None, **kwargs): ) +class 
MiniMaxProvider(ModelProvider): + """MiniMax model provider + + MiniMax provides OpenAI-compatible API endpoints for their models + (MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, MiniMax-M2.5-highspeed). + All models support 204K context length. + + Configuration: + - MINIMAX_API_KEY: API key from MiniMax platform + - Base URL: https://api.minimax.io/v1 (OpenAI-compatible) + - Temperature must be in (0.0, 1.0] range + """ + + def create_model(self, model_id: Optional[str] = None, **kwargs): + """Create MiniMax model via agno (OpenAI-compatible) + + Args: + model_id: Model identifier (uses default if None) + **kwargs: Additional model parameters + + Returns: + OpenAILike model instance configured for MiniMax + """ + try: + from agno.models.openai import OpenAILike + except ImportError: + raise ImportError( + "agno package not installed. Install with: pip install agno" + ) + + model_id = model_id or self.config.default_model + params = {**self.config.parameters, **kwargs} + + # MiniMax requires temperature in (0.0, 1.0] + temperature = params.get("temperature") + if temperature is not None: + temperature = max(0.01, min(float(temperature), 1.0)) + + logger.info(f"Creating MiniMax model: {model_id}") + + return OpenAILike( + id=model_id, + api_key=self.config.api_key, + base_url=self.config.base_url, + temperature=temperature, + max_tokens=params.get("max_tokens"), + top_p=params.get("top_p"), + frequency_penalty=params.get("frequency_penalty"), + presence_penalty=params.get("presence_penalty"), + ) + + class DashScopeProvider(ModelProvider): """DashScope model provider (native)""" @@ -607,6 +659,7 @@ class ModelFactory: "openai": OpenAIProvider, "openai-compatible": OpenAICompatibleProvider, "deepseek": DeepSeekProvider, + "minimax": MiniMaxProvider, "dashscope": DashScopeProvider, "ollama": OllamaProvider, } diff --git a/python/valuecell/adapters/models/tests/__init__.py b/python/valuecell/adapters/models/tests/__init__.py new file mode 100644 index 
000000000..582c91ca7 --- /dev/null +++ b/python/valuecell/adapters/models/tests/__init__.py @@ -0,0 +1,552 @@ +""" +Unit and integration tests for MiniMax provider in the model factory. +""" + +import os +from unittest.mock import MagicMock, patch + +import pytest + +from valuecell.adapters.models.factory import ( + MiniMaxProvider, + ModelFactory, + ModelProvider, +) +from valuecell.config.manager import ProviderConfig + + +# ============================================ +# Fixtures +# ============================================ + + +@pytest.fixture +def minimax_provider_config(): + """Create a MiniMax ProviderConfig for testing.""" + return ProviderConfig( + name="minimax", + enabled=True, + api_key="test-minimax-api-key", + base_url="https://api.minimax.io/v1", + default_model="MiniMax-M2.7", + models=[ + { + "id": "MiniMax-M2.7", + "name": "MiniMax M2.7", + "context_length": 204000, + "description": "MiniMax M2.7 - latest flagship model", + }, + { + "id": "MiniMax-M2.7-highspeed", + "name": "MiniMax M2.7 Highspeed", + "context_length": 204000, + "description": "MiniMax M2.7 Highspeed - faster variant", + }, + { + "id": "MiniMax-M2.5", + "name": "MiniMax M2.5", + "context_length": 204000, + "description": "MiniMax M2.5 model", + }, + { + "id": "MiniMax-M2.5-highspeed", + "name": "MiniMax M2.5 Highspeed", + "context_length": 204000, + "description": "MiniMax M2.5 Highspeed - faster variant", + }, + ], + parameters={"temperature": 0.7, "max_tokens": 8096}, + ) + + +@pytest.fixture +def minimax_provider(minimax_provider_config): + """Create a MiniMaxProvider instance for testing.""" + return MiniMaxProvider(minimax_provider_config) + + +@pytest.fixture +def disabled_minimax_config(): + """Create a disabled MiniMax ProviderConfig.""" + return ProviderConfig( + name="minimax", + enabled=False, + api_key="test-key", + base_url="https://api.minimax.io/v1", + default_model="MiniMax-M2.7", + models=[], + parameters={}, + ) + + +@pytest.fixture +def 
no_key_minimax_config(): + """Create a MiniMax ProviderConfig without API key.""" + return ProviderConfig( + name="minimax", + enabled=True, + api_key=None, + base_url="https://api.minimax.io/v1", + default_model="MiniMax-M2.7", + models=[], + parameters={}, + ) + + +# ============================================ +# Unit Tests: MiniMaxProvider +# ============================================ + + +class TestMiniMaxProviderInit: + """Test MiniMaxProvider initialization.""" + + def test_inherits_model_provider(self): + """MiniMaxProvider should be a subclass of ModelProvider.""" + assert issubclass(MiniMaxProvider, ModelProvider) + + def test_provider_stores_config(self, minimax_provider, minimax_provider_config): + """Provider should store the config.""" + assert minimax_provider.config is minimax_provider_config + + def test_is_available_with_key(self, minimax_provider): + """Provider with API key should be available.""" + assert minimax_provider.is_available() is True + + def test_is_not_available_without_key(self, no_key_minimax_config): + """Provider without API key should not be available.""" + provider = MiniMaxProvider(no_key_minimax_config) + assert provider.is_available() is False + + def test_has_no_embedding_support(self, minimax_provider): + """MiniMax provider should not have embedding support.""" + assert minimax_provider.has_embedding_support() is False + + +class TestMiniMaxProviderCreateModel: + """Test MiniMaxProvider.create_model().""" + + def test_create_model_default(self, minimax_provider): + """create_model() without model_id should use the default model.""" + # Patch OpenAILike where create_model() imports it from, so the + # real provider code path runs against the mock. + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + mock_model = MagicMock() + MockOpenAILike.return_value = mock_model + result = minimax_provider.create_model() + assert result is mock_model + assert MockOpenAILike.call_args[1]["id"] == "MiniMax-M2.7" + + def test_create_model_uses_openai_like(self, minimax_provider): + """create_model() should use agno's OpenAILike.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + mock_instance = MagicMock() + MockOpenAILike.return_value = mock_instance + + result = minimax_provider.create_model() + + MockOpenAILike.assert_called_once_with( + id="MiniMax-M2.7", + api_key="test-minimax-api-key", + base_url="https://api.minimax.io/v1", + temperature=0.7, + max_tokens=8096, + top_p=None, + frequency_penalty=None, + presence_penalty=None, + ) + assert result is mock_instance + + def test_create_model_specific_model_id(self, minimax_provider): + """create_model() with specific model_id should use it.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + mock_instance = MagicMock() + MockOpenAILike.return_value = mock_instance + + result = minimax_provider.create_model(model_id="MiniMax-M2.7-highspeed") + + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["id"] == "MiniMax-M2.7-highspeed" + assert result is mock_instance + + def test_create_model_m25(self, minimax_provider): + """create_model() should work with M2.5 model.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model(model_id="MiniMax-M2.5") + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["id"] == "MiniMax-M2.5" + + def test_create_model_m25_highspeed(self, minimax_provider): + """create_model() should work with M2.5-highspeed model.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model(model_id="MiniMax-M2.5-highspeed") + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["id"] == "MiniMax-M2.5-highspeed" + + +class TestMiniMaxTemperatureClamping: + """Test 
temperature clamping for MiniMax provider.""" + + def test_temperature_normal(self, minimax_provider): + """Normal temperature (0.7) should pass through.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model(temperature=0.7) + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["temperature"] == 0.7 + + def test_temperature_zero_clamped(self, minimax_provider): + """Temperature 0.0 should be clamped to 0.01 (MiniMax requires > 0).""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model(temperature=0.0) + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["temperature"] == 0.01 + + def test_temperature_above_one_clamped(self, minimax_provider): + """Temperature > 1.0 should be clamped to 1.0.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model(temperature=1.5) + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["temperature"] == 1.0 + + def test_temperature_exactly_one(self, minimax_provider): + """Temperature 1.0 should stay as 1.0.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model(temperature=1.0) + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["temperature"] == 1.0 + + def test_temperature_negative_clamped(self, minimax_provider): + """Negative temperature should be clamped to 0.01.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model(temperature=-0.5) + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["temperature"] == 0.01 + + def test_temperature_none_passthrough(self, minimax_provider_config): + """Temperature None should pass through as None.""" 
+ config = ProviderConfig( + name="minimax", + enabled=True, + api_key="test-key", + base_url="https://api.minimax.io/v1", + default_model="MiniMax-M2.7", + models=[], + parameters={}, + ) + provider = MiniMaxProvider(config) + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + provider.create_model() + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["temperature"] is None + + +class TestMiniMaxProviderParameters: + """Test parameter merging for MiniMax provider.""" + + def test_kwargs_override_defaults(self, minimax_provider): + """Kwargs should override default parameters.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model(max_tokens=4096, top_p=0.9) + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["max_tokens"] == 4096 + assert call_kwargs["top_p"] == 0.9 + + def test_base_url_from_config(self, minimax_provider): + """Base URL should come from config.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model() + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["base_url"] == "https://api.minimax.io/v1" + + def test_api_key_from_config(self, minimax_provider): + """API key should come from config.""" + with patch("agno.models.openai.OpenAILike") as MockOpenAILike: + MockOpenAILike.return_value = MagicMock() + minimax_provider.create_model() + call_kwargs = MockOpenAILike.call_args[1] + assert call_kwargs["api_key"] == "test-minimax-api-key" + + +class TestMiniMaxProviderEmbedding: + """Test embedding support for MiniMax provider.""" + + def test_create_embedder_raises(self, minimax_provider): + """MiniMax provider should raise NotImplementedError for embeddings.""" + with pytest.raises(NotImplementedError, match="does not support embedding"): + minimax_provider.create_embedder() + + +# 
============================================ +# Unit Tests: ModelFactory Registration +# ============================================ + + +class TestMiniMaxFactoryRegistration: + """Test MiniMax registration in ModelFactory.""" + + def test_minimax_in_providers_registry(self): + """MiniMax should be in the factory's _providers dict.""" + assert "minimax" in ModelFactory._providers + + def test_minimax_maps_to_correct_class(self): + """MiniMax should map to MiniMaxProvider class.""" + assert ModelFactory._providers["minimax"] is MiniMaxProvider + + def test_minimax_provider_count(self): + """Factory should have 10 providers (including minimax).""" + assert len(ModelFactory._providers) == 10 + + +# ============================================ +# Unit Tests: model_should_use_json_mode +# ============================================ + + +class TestMiniMaxJsonMode: + """Test model_should_use_json_mode for MiniMax models.""" + + def test_minimax_base_url_triggers_json_mode(self): + """Models with a minimax.io base_url should use JSON mode.""" + from agno.models.openai import OpenAILike + from valuecell.utils.model import model_should_use_json_mode + + # provider/name must match agno's OpenAILike for the check to apply + mock_model = MagicMock() + mock_model.provider = OpenAILike.provider + mock_model.name = OpenAILike.name + mock_model.base_url = "https://api.minimax.io/v1" + + result = model_should_use_json_mode(mock_model) + assert result is True + + +# ============================================ +# Unit Tests: Provider YAML Config +# ============================================ + + +class TestMiniMaxYamlConfig: + """Test MiniMax YAML configuration file.""" + + def test_yaml_exists(self): + """MiniMax YAML config file should exist.""" + import os + + yaml_path = os.path.join( + os.path.dirname(__file__), + "..", + "..", + "..", + "..", + "configs", + "providers", + "minimax.yaml", + ) + assert 
os.path.exists(yaml_path), f"minimax.yaml not found at {yaml_path}" + + def test_yaml_content(self): + """MiniMax YAML should have correct content.""" + import os + + import yaml + + yaml_path = os.path.join( + os.path.dirname(__file__), + "..", + "..", + "..", + "..", + "configs", + "providers", + "minimax.yaml", + ) + with open(yaml_path) as f: + config = yaml.safe_load(f) + + assert config["name"] == "MiniMax" + assert config["provider_type"] == "minimax" + assert config["enabled"] is True + assert config["connection"]["base_url"] == "https://api.minimax.io/v1" + assert config["connection"]["api_key_env"] == "MINIMAX_API_KEY" + assert config["default_model"] == "MiniMax-M2.7" + + def test_yaml_models_list(self): + """YAML should list all MiniMax models.""" + import os + + import yaml + + yaml_path = os.path.join( + os.path.dirname(__file__), + "..", + "..", + "..", + "..", + "configs", + "providers", + "minimax.yaml", + ) + with open(yaml_path) as f: + config = yaml.safe_load(f) + + model_ids = [m["id"] for m in config["models"]] + assert "MiniMax-M2.7" in model_ids + assert "MiniMax-M2.7-highspeed" in model_ids + assert "MiniMax-M2.5" in model_ids + assert "MiniMax-M2.5-highspeed" in model_ids + + def test_yaml_context_length(self): + """All MiniMax models should have 204K context length.""" + import os + + import yaml + + yaml_path = os.path.join( + os.path.dirname(__file__), + "..", + "..", + "..", + "..", + "configs", + "providers", + "minimax.yaml", + ) + with open(yaml_path) as f: + config = yaml.safe_load(f) + + for model in config["models"]: + assert model["context_length"] == 204000, ( + f"Model {model['id']} should have 204K context" + ) + + +# ============================================ +# Unit Tests: config.yaml Registration +# ============================================ + + +class TestMiniMaxConfigRegistration: + """Test MiniMax registration in config.yaml.""" + + def test_minimax_in_config_yaml(self): + """MiniMax should be registered in 
config.yaml.""" + import os + + import yaml + + config_path = os.path.join( + os.path.dirname(__file__), + "..", + "..", + "..", + "..", + "configs", + "config.yaml", + ) + with open(config_path) as f: + config = yaml.safe_load(f) + + providers = config.get("models", {}).get("providers", {}) + assert "minimax" in providers + assert providers["minimax"]["config_file"] == "providers/minimax.yaml" + assert providers["minimax"]["api_key_env"] == "MINIMAX_API_KEY" + + +# ============================================ +# Integration Tests +# ============================================ + + +class TestMiniMaxIntegrationWithConfigManager: + """Integration tests for MiniMax with ConfigManager.""" + + @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-integration-key"}) + def test_config_manager_loads_minimax(self): + """ConfigManager should load MiniMax provider config.""" + from valuecell.config.manager import ConfigManager + + manager = ConfigManager() + config = manager.get_provider_config("minimax") + + assert config is not None + assert config.name == "minimax" + assert config.api_key == "test-integration-key" + assert config.base_url == "https://api.minimax.io/v1" + assert config.default_model == "MiniMax-M2.7" + + @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-key"}) + def test_config_manager_validates_minimax(self): + """ConfigManager should validate MiniMax provider successfully.""" + from valuecell.config.manager import ConfigManager + + manager = ConfigManager() + is_valid, error = manager.validate_provider("minimax") + assert is_valid is True + assert error is None + + def test_config_manager_validates_minimax_no_key(self): + """ConfigManager should fail validation without API key.""" + env = os.environ.copy() + env.pop("MINIMAX_API_KEY", None) + with patch.dict(os.environ, env, clear=True): + from valuecell.config.manager import ConfigManager + + manager = ConfigManager() + is_valid, error = manager.validate_provider("minimax") + assert is_valid is False + 
+            assert "MINIMAX_API_KEY" in error
+
+    @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-factory-key"})
+    def test_factory_creates_minimax_model(self):
+        """ModelFactory should create MiniMax model end-to-end."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            mock_model = MagicMock()
+            MockOpenAILike.return_value = mock_model
+
+            factory = ModelFactory()
+            result = factory.create_model(provider="minimax", use_fallback=False)
+
+            assert result is mock_model
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["id"] == "MiniMax-M2.7"
+            assert call_kwargs["api_key"] == "test-factory-key"
+            assert call_kwargs["base_url"] == "https://api.minimax.io/v1"
+
+    @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-key"})
+    def test_minimax_in_enabled_providers(self):
+        """MiniMax should appear in enabled providers when API key is set."""
+        from valuecell.config.manager import ConfigManager
+
+        manager = ConfigManager()
+        enabled = manager.get_enabled_providers()
+        assert "minimax" in enabled
+
+    @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-key"})
+    def test_minimax_available_models(self):
+        """ConfigManager should list MiniMax models."""
+        from valuecell.config.manager import ConfigManager
+
+        manager = ConfigManager()
+        models = manager.get_available_models("minimax")
+        model_ids = [m["id"] for m in models]
+        assert "MiniMax-M2.7" in model_ids
+        assert "MiniMax-M2.7-highspeed" in model_ids
diff --git a/python/valuecell/adapters/models/tests/test_minimax_provider.py b/python/valuecell/adapters/models/tests/test_minimax_provider.py
new file mode 100644
index 000000000..75493812a
--- /dev/null
+++ b/python/valuecell/adapters/models/tests/test_minimax_provider.py
@@ -0,0 +1,509 @@
+"""
+Unit and integration tests for MiniMax provider in the model factory.
+"""
+
+import os
+from unittest.mock import MagicMock, patch
+
+import pytest
+
+from valuecell.adapters.models.factory import (
+    MiniMaxProvider,
+    ModelFactory,
+    ModelProvider,
+)
+from valuecell.config.manager import ProviderConfig
+
+
+# ============================================
+# Fixtures
+# ============================================
+
+
+@pytest.fixture
+def minimax_provider_config():
+    """Create a MiniMax ProviderConfig for testing."""
+    return ProviderConfig(
+        name="minimax",
+        enabled=True,
+        api_key="test-minimax-api-key",
+        base_url="https://api.minimax.io/v1",
+        default_model="MiniMax-M2.7",
+        models=[
+            {
+                "id": "MiniMax-M2.7",
+                "name": "MiniMax M2.7",
+                "context_length": 204000,
+                "description": "MiniMax M2.7 - latest flagship model",
+            },
+            {
+                "id": "MiniMax-M2.7-highspeed",
+                "name": "MiniMax M2.7 Highspeed",
+                "context_length": 204000,
+                "description": "MiniMax M2.7 Highspeed - faster variant",
+            },
+            {
+                "id": "MiniMax-M2.5",
+                "name": "MiniMax M2.5",
+                "context_length": 204000,
+                "description": "MiniMax M2.5 model",
+            },
+            {
+                "id": "MiniMax-M2.5-highspeed",
+                "name": "MiniMax M2.5 Highspeed",
+                "context_length": 204000,
+                "description": "MiniMax M2.5 Highspeed - faster variant",
+            },
+        ],
+        parameters={"temperature": 0.7, "max_tokens": 8096},
+    )
+
+
+@pytest.fixture
+def minimax_provider(minimax_provider_config):
+    """Create a MiniMaxProvider instance for testing."""
+    return MiniMaxProvider(minimax_provider_config)
+
+
+@pytest.fixture
+def no_key_minimax_config():
+    """Create a MiniMax ProviderConfig without API key."""
+    return ProviderConfig(
+        name="minimax",
+        enabled=True,
+        api_key=None,
+        base_url="https://api.minimax.io/v1",
+        default_model="MiniMax-M2.7",
+        models=[],
+        parameters={},
+    )
+
+
+# ============================================
+# Unit Tests: MiniMaxProvider
+# ============================================
+
+
+class TestMiniMaxProviderInit:
+    """Test MiniMaxProvider initialization."""
+
+    def test_inherits_model_provider(self):
+        """MiniMaxProvider should be a subclass of ModelProvider."""
+        assert issubclass(MiniMaxProvider, ModelProvider)
+
+    def test_provider_stores_config(self, minimax_provider, minimax_provider_config):
+        """Provider should store the config."""
+        assert minimax_provider.config is minimax_provider_config
+
+    def test_is_available_with_key(self, minimax_provider):
+        """Provider with API key should be available."""
+        assert minimax_provider.is_available() is True
+
+    def test_is_not_available_without_key(self, no_key_minimax_config):
+        """Provider without API key should not be available."""
+        provider = MiniMaxProvider(no_key_minimax_config)
+        assert provider.is_available() is False
+
+    def test_has_no_embedding_support(self, minimax_provider):
+        """MiniMax provider should not have embedding support."""
+        assert minimax_provider.has_embedding_support() is False
+
+
+class TestMiniMaxProviderCreateModel:
+    """Test MiniMaxProvider.create_model()."""
+
+    def test_create_model_uses_openai_like(self, minimax_provider):
+        """create_model() should use agno's OpenAILike."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            mock_instance = MagicMock()
+            MockOpenAILike.return_value = mock_instance
+
+            result = minimax_provider.create_model()
+
+            MockOpenAILike.assert_called_once_with(
+                id="MiniMax-M2.7",
+                api_key="test-minimax-api-key",
+                base_url="https://api.minimax.io/v1",
+                temperature=0.7,
+                max_tokens=8096,
+                top_p=None,
+                frequency_penalty=None,
+                presence_penalty=None,
+            )
+            assert result is mock_instance
+
+    def test_create_model_specific_model_id(self, minimax_provider):
+        """create_model() with specific model_id should use it."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            mock_instance = MagicMock()
+            MockOpenAILike.return_value = mock_instance
+
+            result = minimax_provider.create_model(model_id="MiniMax-M2.7-highspeed")
+
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["id"] == "MiniMax-M2.7-highspeed"
+            assert result is mock_instance
+
+    def test_create_model_m25(self, minimax_provider):
+        """create_model() should work with M2.5 model."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model(model_id="MiniMax-M2.5")
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["id"] == "MiniMax-M2.5"
+
+    def test_create_model_m25_highspeed(self, minimax_provider):
+        """create_model() should work with M2.5-highspeed model."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model(model_id="MiniMax-M2.5-highspeed")
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["id"] == "MiniMax-M2.5-highspeed"
+
+
+class TestMiniMaxTemperatureClamping:
+    """Test temperature clamping for MiniMax provider."""
+
+    def test_temperature_normal(self, minimax_provider):
+        """Normal temperature (0.7) should pass through."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model(temperature=0.7)
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["temperature"] == 0.7
+
+    def test_temperature_zero_clamped(self, minimax_provider):
+        """Temperature 0.0 should be clamped to 0.01 (MiniMax requires > 0)."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model(temperature=0.0)
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["temperature"] == 0.01
+
+    def test_temperature_above_one_clamped(self, minimax_provider):
+        """Temperature > 1.0 should be clamped to 1.0."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model(temperature=1.5)
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["temperature"] == 1.0
+
+    def test_temperature_exactly_one(self, minimax_provider):
+        """Temperature 1.0 should stay as 1.0."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model(temperature=1.0)
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["temperature"] == 1.0
+
+    def test_temperature_negative_clamped(self, minimax_provider):
+        """Negative temperature should be clamped to 0.01."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model(temperature=-0.5)
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["temperature"] == 0.01
+
+    def test_temperature_none_passthrough(self):
+        """Temperature None should pass through as None."""
+        config = ProviderConfig(
+            name="minimax",
+            enabled=True,
+            api_key="test-key",
+            base_url="https://api.minimax.io/v1",
+            default_model="MiniMax-M2.7",
+            models=[],
+            parameters={},
+        )
+        provider = MiniMaxProvider(config)
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            provider.create_model()
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["temperature"] is None
+
+
+class TestMiniMaxProviderParameters:
+    """Test parameter merging for MiniMax provider."""
+
+    def test_kwargs_override_defaults(self, minimax_provider):
+        """Kwargs should override default parameters."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model(max_tokens=4096, top_p=0.9)
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["max_tokens"] == 4096
+            assert call_kwargs["top_p"] == 0.9
+
+    def test_base_url_from_config(self, minimax_provider):
+        """Base URL should come from config."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model()
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["base_url"] == "https://api.minimax.io/v1"
+
+    def test_api_key_from_config(self, minimax_provider):
+        """API key should come from config."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            MockOpenAILike.return_value = MagicMock()
+            minimax_provider.create_model()
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["api_key"] == "test-minimax-api-key"
+
+
+class TestMiniMaxProviderEmbedding:
+    """Test embedding support for MiniMax provider."""
+
+    def test_create_embedder_raises(self, minimax_provider):
+        """MiniMax provider should raise NotImplementedError for embeddings."""
+        with pytest.raises(NotImplementedError, match="does not support embedding"):
+            minimax_provider.create_embedder()
+
+
+# ============================================
+# Unit Tests: ModelFactory Registration
+# ============================================
+
+
+class TestMiniMaxFactoryRegistration:
+    """Test MiniMax registration in ModelFactory."""
+
+    def test_minimax_in_providers_registry(self):
+        """MiniMax should be in the factory's _providers dict."""
+        assert "minimax" in ModelFactory._providers
+
+    def test_minimax_maps_to_correct_class(self):
+        """MiniMax should map to MiniMaxProvider class."""
+        assert ModelFactory._providers["minimax"] is MiniMaxProvider
+
+    def test_minimax_provider_count(self):
+        """Factory should have 10 providers (including minimax)."""
+        assert len(ModelFactory._providers) == 10
+
+
+# ============================================
+# Unit Tests: model_should_use_json_mode
+# ============================================
+
+
+class TestMiniMaxJsonMode:
+    """Test model_should_use_json_mode for MiniMax models."""
+
+    def test_minimax_base_url_triggers_json_mode(self):
+        """Models with minimax.io base_url should use JSON mode."""
+        from valuecell.utils.model import model_should_use_json_mode
+
+        mock_model = MagicMock()
+
+        from agno.models.openai import OpenAILike
+
+        mock_model.provider = OpenAILike.provider
+        mock_model.name = OpenAILike.name
+        mock_model.base_url = "https://api.minimax.io/v1"
+
+        result = model_should_use_json_mode(mock_model)
+        assert result is True
+
+
+# ============================================
+# Unit Tests: Provider YAML Config
+# ============================================
+
+
+class TestMiniMaxYamlConfig:
+    """Test MiniMax YAML configuration file."""
+
+    def test_yaml_exists(self):
+        """MiniMax YAML config file should exist."""
+        yaml_path = os.path.join(
+            os.path.dirname(__file__),
+            "..",
+            "..",
+            "..",
+            "..",
+            "configs",
+            "providers",
+            "minimax.yaml",
+        )
+        assert os.path.exists(yaml_path), f"minimax.yaml not found at {yaml_path}"
+
+    def test_yaml_content(self):
+        """MiniMax YAML should have correct content."""
+        import yaml
+
+        yaml_path = os.path.join(
+            os.path.dirname(__file__),
+            "..",
+            "..",
+            "..",
+            "..",
+            "configs",
+            "providers",
+            "minimax.yaml",
+        )
+        with open(yaml_path) as f:
+            config = yaml.safe_load(f)
+
+        assert config["name"] == "MiniMax"
+        assert config["provider_type"] == "minimax"
+        assert config["enabled"] is True
+        assert config["connection"]["base_url"] == "https://api.minimax.io/v1"
+        assert config["connection"]["api_key_env"] == "MINIMAX_API_KEY"
+        assert config["default_model"] == "MiniMax-M2.7"
+
+    def test_yaml_models_list(self):
+        """YAML should list all MiniMax models."""
+        import yaml
+
+        yaml_path = os.path.join(
+            os.path.dirname(__file__),
+            "..",
+            "..",
+            "..",
+            "..",
+            "configs",
+            "providers",
+            "minimax.yaml",
+        )
+        with open(yaml_path) as f:
+            config = yaml.safe_load(f)
+
+        model_ids = [m["id"] for m in config["models"]]
+        assert "MiniMax-M2.7" in model_ids
+        assert "MiniMax-M2.7-highspeed" in model_ids
+        assert "MiniMax-M2.5" in model_ids
+        assert "MiniMax-M2.5-highspeed" in model_ids
+
+    def test_yaml_context_length(self):
+        """All MiniMax models should have 204K context length."""
+        import yaml
+
+        yaml_path = os.path.join(
+            os.path.dirname(__file__),
+            "..",
+            "..",
+            "..",
+            "..",
+            "configs",
+            "providers",
+            "minimax.yaml",
+        )
+        with open(yaml_path) as f:
+            config = yaml.safe_load(f)
+
+        for model in config["models"]:
+            assert model["context_length"] == 204000, (
+                f"Model {model['id']} should have 204K context"
+            )
+
+
+# ============================================
+# Unit Tests: config.yaml Registration
+# ============================================
+
+
+class TestMiniMaxConfigRegistration:
+    """Test MiniMax registration in config.yaml."""
+
+    def test_minimax_in_config_yaml(self):
+        """MiniMax should be registered in config.yaml."""
+        import yaml
+
+        config_path = os.path.join(
+            os.path.dirname(__file__),
+            "..",
+            "..",
+            "..",
+            "..",
+            "configs",
+            "config.yaml",
+        )
+        with open(config_path) as f:
+            config = yaml.safe_load(f)
+
+        providers = config.get("models", {}).get("providers", {})
+        assert "minimax" in providers
+        assert providers["minimax"]["config_file"] == "providers/minimax.yaml"
+        assert providers["minimax"]["api_key_env"] == "MINIMAX_API_KEY"
+
+
+# ============================================
+# Integration Tests
+# ============================================
+
+
+class TestMiniMaxIntegrationWithConfigManager:
+    """Integration tests for MiniMax with ConfigManager."""
+
+    @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-integration-key"})
+    def test_config_manager_loads_minimax(self):
+        """ConfigManager should load MiniMax provider config."""
+        from valuecell.config.manager import ConfigManager
+
+        manager = ConfigManager()
+        config = manager.get_provider_config("minimax")
+
+        assert config is not None
+        assert config.name == "minimax"
+        assert config.api_key == "test-integration-key"
+        assert config.base_url == "https://api.minimax.io/v1"
+        assert config.default_model == "MiniMax-M2.7"
+
+    @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-key"})
+    def test_config_manager_validates_minimax(self):
+        """ConfigManager should validate MiniMax provider successfully."""
+        from valuecell.config.manager import ConfigManager
+
+        manager = ConfigManager()
+        is_valid, error = manager.validate_provider("minimax")
+        assert is_valid is True
+        assert error is None
+
+    def test_config_manager_validates_minimax_no_key(self):
+        """ConfigManager should fail validation without API key."""
+        env = os.environ.copy()
+        env.pop("MINIMAX_API_KEY", None)
+        with patch.dict(os.environ, env, clear=True):
+            from valuecell.config.manager import ConfigManager
+
+            manager = ConfigManager()
+            is_valid, error = manager.validate_provider("minimax")
+            assert is_valid is False
+            assert "MINIMAX_API_KEY" in error
+
+    @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-factory-key"})
+    def test_factory_creates_minimax_model(self):
+        """ModelFactory should create MiniMax model end-to-end."""
+        with patch("agno.models.openai.OpenAILike") as MockOpenAILike:
+            mock_model = MagicMock()
+            MockOpenAILike.return_value = mock_model
+
+            factory = ModelFactory()
+            result = factory.create_model(provider="minimax", use_fallback=False)
+
+            assert result is mock_model
+            call_kwargs = MockOpenAILike.call_args[1]
+            assert call_kwargs["id"] == "MiniMax-M2.7"
+            assert call_kwargs["api_key"] == "test-factory-key"
+            assert call_kwargs["base_url"] == "https://api.minimax.io/v1"
+
+    @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-key"})
+    def test_minimax_in_enabled_providers(self):
+        """MiniMax should appear in enabled providers when API key is set."""
+        from valuecell.config.manager import ConfigManager
+
+        manager = ConfigManager()
+        enabled = manager.get_enabled_providers()
+        assert "minimax" in enabled
+
+    @patch.dict(os.environ, {"MINIMAX_API_KEY": "test-key"})
+    def test_minimax_available_models(self):
+        """ConfigManager should list MiniMax models."""
+        from valuecell.config.manager import ConfigManager
+
+        manager = ConfigManager()
+        models = manager.get_available_models("minimax")
+        model_ids = [m["id"] for m in models]
+        assert "MiniMax-M2.7" in model_ids
+        assert "MiniMax-M2.7-highspeed" in model_ids
diff --git a/python/valuecell/server/api/routers/models.py b/python/valuecell/server/api/routers/models.py
index ec7797f6b..472031a38 100644
--- a/python/valuecell/server/api/routers/models.py
+++ b/python/valuecell/server/api/routers/models.py
@@ -133,6 +133,7 @@ def _api_key_url_for(provider: str) -> str | None:
         "azure": "https://azure.microsoft.com/en-us/products/ai-foundry/models/openai/",
         "siliconflow": "https://cloud.siliconflow.cn/account/ak",
         "deepseek": "https://platform.deepseek.com/api_keys",
+        "minimax": "https://platform.minimaxi.com/",
         "dashscope": "https://bailian.console.aliyun.com/#/home",
         "ollama": None,
     }
diff --git a/python/valuecell/utils/model.py b/python/valuecell/utils/model.py
index 4e9e19bd5..0de1a20ae 100644
--- a/python/valuecell/utils/model.py
+++ b/python/valuecell/utils/model.py
@@ -121,6 +121,14 @@ def model_should_use_json_mode(model: AgnoModel) -> bool:
         )
         return True
 
+    # MiniMax doesn't support structured outputs, only JSON mode
+    if "minimax.io" in base_url_str:
+        logger.debug(
+            "Detected MiniMax API - forcing JSON mode "
+            "(structured outputs not supported)"
+        )
+        return True
+
     # For other OpenAI-compatible APIs, use JSON mode as safer default
     # Most OpenAI-compatible APIs support JSON mode but not structured outputs
     logger.debug(