Commit fe94b85

Merge pull request #42 from MiniMax-AI/feat/M2.1

feat: M2.1

2 parents: b2f8000 + 33ac1de

13 files changed: 29 additions & 29 deletions
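A repository-wide default rename like this one is easy to leave half-done. One hedged way to scan for leftovers from a checkout (the pattern assumes the model name always appears double-quoted in code and config; prose mentions such as "MiniMax M2 model" are a different string and not covered):

```shell
# List files that still contain the old quoted default after the rename.
# 'MiniMax-M2"' matches the stale value but not 'MiniMax-M2.1"', since the
# character after "M2" differs. Empty output means no stale quoted defaults.
grep -rln 'MiniMax-M2"' --include='*.py' --include='*.yaml' .
```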

README.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -2,7 +2,7 @@
 
 English | [中文](./README_CN.md)
 
-**Mini Agent** is a minimal yet professional demo project that showcases the best practices for building agents with the MiniMax M2 model. Leveraging an Anthropic-compatible API, it fully supports interleaved thinking to unlock M2's powerful reasoning capabilities for long, complex tasks.
+**Mini Agent** is a minimal yet professional demo project that showcases the best practices for building agents with the MiniMax M2.1 model. Leveraging an Anthropic-compatible API, it fully supports interleaved thinking to unlock M2's powerful reasoning capabilities for long, complex tasks.
 
 This project comes packed with features designed for a robust and intelligent agent development experience:
 
@@ -115,7 +115,7 @@ Fill in your API Key and corresponding API Base:
 api_key: "YOUR_API_KEY_HERE" # API Key from step 1
 api_base: "https://api.minimax.io" # Global
 # api_base: "https://api.minimaxi.com" # China
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 ```
 
 **Start Using:**
@@ -182,7 +182,7 @@ Fill in your API Key and corresponding API Base:
 api_key: "YOUR_API_KEY_HERE" # API Key from step 1
 api_base: "https://api.minimax.io" # Global
 # api_base: "https://api.minimaxi.com" # China
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 max_steps: 100
 workspace_dir: "./workspace"
 ```
````

README_CN.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -2,7 +2,7 @@
 
 [English](./README.md) | 中文
 
-**Mini Agent** 是一个极简但专业的演示项目,旨在展示使用 MiniMax M2 模型构建 Agent 的最佳实践。项目通过兼容 Anthropic 的 API,完全支持交错思维(interleaved thinking),从而解锁 M2 模型在处理长而复杂的任务时强大的推理能力。
+**Mini Agent** 是一个极简但专业的演示项目,旨在展示使用 MiniMax M2.1 模型构建 Agent 的最佳实践。项目通过兼容 Anthropic 的 API,完全支持交错思维(interleaved thinking),从而解锁 M2 模型在处理长而复杂的任务时强大的推理能力。
 
 该项目具备一系列为稳健、智能的 Agent 开发而设计的特性:
 
@@ -115,7 +115,7 @@ nano ~/.mini-agent/config/config.yaml
 api_key: "YOUR_API_KEY_HERE" # 填入第 1 步获取的 API Key
 api_base: "https://api.minimaxi.com" # 国内版
 # api_base: "https://api.minimax.io" # 海外版(如使用海外平台,请取消本行注释)
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 ```
 
 **开始使用:**
@@ -182,7 +182,7 @@ vim mini_agent/config/config.yaml # 或使用您偏好的编辑器
 api_key: "YOUR_API_KEY_HERE" # 填入第 1 步获取的 API Key
 api_base: "https://api.minimaxi.com" # 国内版
 # api_base: "https://api.minimax.io" # 海外版(如使用海外平台,请修改此行)
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 max_steps: 100
 workspace_dir: "./workspace"
 ```
````

docs/PRODUCTION_GUIDE.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -34,7 +34,7 @@ This project is a **teaching-level demo** that demonstrates the core concepts an
 
 ### 2.2 Model Fallback Mechanism
 
-Currently using a single fixed model (MiniMax-M2), which will directly report errors on failure.
+Currently using a single fixed model (MiniMax-M2.1), which will directly report errors on failure.
 
 - Introduce a model pool by configuring multiple model accounts to improve availability
 - Introduce automatic health checks, failure removal, circuit breaker strategies for the model pool
```
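The production guide's bullet points above can be sketched as a minimal model pool with failure removal. Everything here (`ModelPool`, `flaky_send`, the exception type) is a hypothetical illustration under those suggestions, not code from this repository:

```python
class ModelPoolExhausted(Exception):
    """Raised when every model in the pool has been marked failed."""


class ModelPool:
    """Try each configured model in order; on failure, remove the model
    from rotation instead of surfacing the error to the caller."""

    def __init__(self, models):
        self.models = list(models)
        self.failed = set()

    def call(self, prompt, send):
        # `send(model, prompt)` stands in for the real API request.
        for model in self.models:
            if model in self.failed:
                continue
            try:
                return send(model, prompt)
            except Exception:
                # Failure removal; a periodic health check could re-add it.
                self.failed.add(model)
        raise ModelPoolExhausted("all models in the pool failed")


# Simulated outage on the primary model forces fallback to the secondary.
def flaky_send(model, prompt):
    if model == "primary-model":
        raise RuntimeError("simulated outage")
    return f"{model}: {prompt}"


pool = ModelPool(["primary-model", "backup-model"])
print(pool.call("ping", flaky_send))  # → backup-model: ping
```

A production version would add the guide's other suggestions (health checks and circuit breaking) on top of this loop rather than permanently dropping a model after one error.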

docs/PRODUCTION_GUIDE_CN.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -34,7 +34,7 @@
 
 ### 2.2 模型回退机制
 
-当前 Demo 固定使用单一模型(MiniMax-M2),调用失败时会直接报错。
+当前 Demo 固定使用单一模型(MiniMax-M2.1),调用失败时会直接报错。
 
 - **建立模型池**:配置多个模型账号,建立模型池以提高服务可用性。
 - **引入高可用策略**:为模型池引入自动健康检测、故障节点切换、熔断等高可用策略。
```

examples/05_provider_selection.py

Lines changed: 5 additions & 5 deletions

```diff
@@ -28,7 +28,7 @@ async def demo_anthropic_provider():
     client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.ANTHROPIC,  # Specify Anthropic provider
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     print(f"Provider: {client.provider}")
@@ -63,7 +63,7 @@ async def demo_openai_provider():
     client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.OPENAI,  # Specify OpenAI provider
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     print(f"Provider: {client.provider}")
@@ -97,7 +97,7 @@ async def demo_default_provider():
     # Initialize client without specifying provider (defaults to Anthropic)
     client = LLMClient(
         api_key=config["api_key"],
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     print(f"Provider (default): {client.provider}")
@@ -130,13 +130,13 @@ async def demo_provider_comparison():
     anthropic_client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.ANTHROPIC,
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     openai_client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.OPENAI,
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     # Same question for both
```

examples/06_tool_schema_demo.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -166,7 +166,7 @@ async def demo_tool_schemas():
     client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.ANTHROPIC,
-        model="MiniMax-M2",
+        model="MiniMax-M2.1",
     )
 
     # Test with a query that should trigger weather tool
@@ -215,7 +215,7 @@ async def demo_multiple_tools():
     client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.ANTHROPIC,
-        model="MiniMax-M2",
+        model="MiniMax-M2.1",
     )
 
     messages = [Message(role="user", content="Calculate 15 * 23 for me")]
```

mini_agent/cli.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -418,7 +418,7 @@ def on_retry(exception: Exception, attempt: int):
         system_prompt = system_prompt_path.read_text(encoding="utf-8")
         print(f"{Colors.GREEN}✅ Loaded system prompt (from: {system_prompt_path}){Colors.RESET}")
     else:
-        system_prompt = "You are Mini-Agent, an intelligent assistant powered by MiniMax M2 that can help users complete various tasks."
+        system_prompt = "You are Mini-Agent, an intelligent assistant powered by MiniMax M2.1 that can help users complete various tasks."
         print(f"{Colors.YELLOW}⚠️ System prompt not found, using default{Colors.RESET}")
 
     # 6. Inject Skills Metadata into System Prompt (Progressive Disclosure - Level 1)
```

mini_agent/config.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -24,7 +24,7 @@ class LLMConfig(BaseModel):
 
     api_key: str
     api_base: str = "https://api.minimax.io"
-    model: str = "MiniMax-M2"
+    model: str = "MiniMax-M2.1"
     provider: str = "anthropic"  # "anthropic" or "openai"
     retry: RetryConfig = Field(default_factory=RetryConfig)
 
@@ -116,7 +116,7 @@ def from_yaml(cls, config_path: str | Path) -> "Config":
         llm_config = LLMConfig(
             api_key=data["api_key"],
             api_base=data.get("api_base", "https://api.minimax.io"),
-            model=data.get("model", "MiniMax-M2"),
+            model=data.get("model", "MiniMax-M2.1"),
             provider=data.get("provider", "anthropic"),
             retry=retry_config,
         )
```
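The second hunk above only moves the fallback default: an explicit `model:` key in the parsed YAML still wins. A tiny sketch of that `dict.get` fallback shape, with a plain dict standing in for the loaded config (the `resolve_model` helper is illustrative, not part of the repo):

```python
def resolve_model(data):
    # Same fallback shape as in Config.from_yaml: a "model" key present in
    # the parsed YAML takes precedence; otherwise the new default applies.
    return data.get("model", "MiniMax-M2.1")


print(resolve_model({}))                       # → MiniMax-M2.1
print(resolve_model({"model": "MiniMax-M2"}))  # → MiniMax-M2 (pinned by user config)
```

This is why users who pinned `model: "MiniMax-M2"` in their config.yaml are unaffected by the commit.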

mini_agent/config/config-example.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -19,7 +19,7 @@
 api_key: "YOUR_API_KEY_HERE" # Replace with your MiniMax API Key
 api_base: "https://api.minimax.io" # Global users (default)
 # api_base: "https://api.minimaxi.com" # China users
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 # LLM provider: "anthropic" or "openai"
 # The LLMClient will automatically append /anthropic or /v1 to api_base based on provider
 provider: "anthropic" # Default: anthropic
```

mini_agent/llm/anthropic_client.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -25,15 +25,15 @@ def __init__(
         self,
         api_key: str,
         api_base: str = "https://api.minimaxi.com/anthropic",
-        model: str = "MiniMax-M2",
+        model: str = "MiniMax-M2.1",
         retry_config: RetryConfig | None = None,
     ):
         """Initialize Anthropic client.
 
         Args:
             api_key: API key for authentication
             api_base: Base URL for the API (default: MiniMax Anthropic endpoint)
-            model: Model name to use (default: MiniMax-M2)
+            model: Model name to use (default: MiniMax-M2.1)
             retry_config: Optional retry configuration
         """
         super().__init__(api_key, api_base, model, retry_config)
```
