
MiniMax: MiniMax M2

MiniMax-M2 is a compact, high-efficiency large language model optimized for end-to-end coding and agentic workflows. With 10 billion activated parameters (230 billion total), it delivers near-frontier intelligence across general reasoning, tool use, and multi-step task execution while maintaining low latency and deployment efficiency. The model excels in code generation, multi-file editing, compile-run-fix loops, and test-validated repair, showing strong results on SWE-Bench Verified, Multi-SWE-Bench, and Terminal-Bench. It also performs competitively in agentic evaluations such as BrowseComp and GAIA, handling long-horizon planning, retrieval, and recovery from execution errors.

Benchmarked by [Artificial Analysis](https://artificialanalysis.ai/models/minimax-m2), MiniMax-M2 ranks among the top open-source models for composite intelligence, spanning mathematics, science, and instruction-following. Its small activation footprint enables fast inference, high concurrency, and improved unit economics, making it well suited for large-scale agents, developer assistants, and reasoning-driven applications that require responsiveness and cost efficiency.

To avoid degrading the model's performance, MiniMax strongly recommends preserving reasoning between turns. Learn more about using `reasoning_details` to pass reasoning back in our [docs](https://ask-coreai.com).
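In practice, preserving reasoning between turns means echoing the `reasoning_details` field from each assistant response back into the next request's message history. A minimal sketch, assuming an OpenAI-compatible chat-completions payload where the API returns reasoning blocks under a `reasoning_details` key on the assistant message (the exact field name and block shape may differ by provider):

```python
def append_assistant_turn(messages, assistant_message):
    """Append an assistant reply to the running message list,
    keeping its reasoning_details so they can be sent back."""
    turn = {
        "role": "assistant",
        "content": assistant_message["content"],
    }
    # Assumed field name: providers that expose reasoning typically
    # return it as a list of blocks on the assistant message.
    if "reasoning_details" in assistant_message:
        turn["reasoning_details"] = assistant_message["reasoning_details"]
    messages.append(turn)
    return messages

# Example: carry reasoning from turn 1 into the turn-2 request body.
history = [{"role": "user", "content": "Refactor this function."}]
reply = {
    "role": "assistant",
    "content": "Here is the refactored version.",
    "reasoning_details": [{"type": "reasoning.text", "text": "..."}],
}
append_assistant_turn(history, reply)
history.append({"role": "user", "content": "Now add unit tests."})
request_body = {"model": "minimax/minimax-m2", "messages": history}
```

The key point is that the reasoning blocks are passed back verbatim rather than stripped, so the model can build on its earlier chain of thought in multi-turn agentic loops.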

| Spec | Value |
| --- | --- |
| Context Window | 197K tokens |
| Parameters | N/A |
| Input Price | $0.26/1M tokens |
| Output Price | $1.00/1M tokens |
| Price Tier | standard |
| Provider | MiniMax |
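At these rates, the cost of a request is simply tokens multiplied by the per-token price. A quick sketch of the arithmetic (token counts in the example are illustrative, not from the source):

```python
# Prices from the spec above, converted to USD per token.
INPUT_PRICE = 0.26 / 1_000_000   # $0.26 per 1M input tokens
OUTPUT_PRICE = 1.00 / 1_000_000  # $1.00 per 1M output tokens

def request_cost(input_tokens, output_tokens):
    """Return the USD cost of one request at MiniMax M2's listed rates."""
    return input_tokens * INPUT_PRICE + output_tokens * OUTPUT_PRICE

# A 10K-token prompt with a 2K-token completion:
cost = request_cost(10_000, 2_000)  # 0.0026 + 0.0020 = 0.0046 USD
```

This asymmetric pricing favors agentic workloads, which typically feed the model large contexts (code, tool results) but generate comparatively short outputs.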

How to Use MiniMax: MiniMax M2

With CoreAI, you can start chatting with MiniMax: MiniMax M2 instantly, with no separate subscription needed. CoreAI bundles access to MiniMax: MiniMax M2 along with 300+ other AI models from MiniMax and other providers such as OpenAI, Anthropic, Google, and Meta.

  1. Download the CoreAI app for iOS, Android, or use the Web App
  2. Select MiniMax: MiniMax M2 from the model selector
  3. Start chatting, comparing, or creating with AI

More Minimax Models

| Model | Context | Tier | Description |
| --- | --- | --- | --- |
| MiniMax: MiniMax M2.7 | 205K | standard | MiniMax-M2.7 is a next-generation large language model designed for autonomous, real-world productivity and continuous improvement. Built to actively… |
| MiniMax: MiniMax M2.5 (free) | 197K | budget | MiniMax-M2.5 is a SOTA large language model designed for real-world productivity. Trained in a diverse range of complex real-world digital working env… |
| MiniMax: MiniMax M2.5 | 197K | standard | MiniMax-M2.5 is a SOTA large language model designed for real-world productivity. Trained in a diverse range of complex real-world digital working env… |
| MiniMax: MiniMax M2-her | 66K | standard | MiniMax M2-her is a dialogue-first large language model built for immersive roleplay, character-driven chat, and expressive multi-turn conversations. |
| MiniMax: MiniMax M2.1 | 197K | budget | MiniMax-M2.1 is a lightweight, state-of-the-art large language model optimized for coding, agentic workflows, and modern application development. With… |
| MiniMax: MiniMax M1 | 1000K | standard | MiniMax-M1 is a large-scale, open-weight reasoning model designed for extended context and high-efficiency inference. It leverages a hybrid Mixture-of… |

Try MiniMax: MiniMax M2 Now

Chat with MiniMax: MiniMax M2 and 300+ other AI models — all in one app.

Download the app or try it on the Web App.