Verified working with MiniMax-M2.5 on Infercom.
Prerequisites
- macOS, Linux, or Windows
- Infercom API key
Installation
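The installation command did not survive in this copy. As a sketch, assuming the npm distribution (OpenCode is published on npm as `opencode-ai`), installation looks like:

```shell
# Install the OpenCode CLI globally via npm (requires a recent Node.js)
npm install -g opencode-ai

# Confirm the binary is on PATH (version flag assumed; check `opencode --help`)
opencode --version
```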
Configuration
OpenCode requires a configuration file to define the Infercom provider.
Step 1: Create Config File
Create ~/.config/opencode/opencode.json:
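The original config snippet was lost. The sketch below follows OpenCode's custom-provider schema; the `baseURL` value and the exact model id are assumptions to adjust for your Infercom account:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "infercom": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Infercom",
      "options": {
        "baseURL": "https://api.infercom.example/v1"
      },
      "models": {
        "MiniMax-M2.5": {
          "name": "MiniMax-M2.5"
        }
      }
    }
  }
}
```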
Step 2: Add API Key
Run the provider login command:
Step 3: Verify Setup
Check that the provider is configured:
Model
Use infercom/MiniMax-M2.5, which is optimized for agentic coding with a 160K-token context window, reasoning, tool calling, and a 75.8% SWE-bench score.
Usage
Interactive TUI
Launch the TUI, then use /models inside the session to switch models if needed.
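As a sketch, launching with the Infercom model preselected might look like the following (the `--model` flag is an assumption; check `opencode --help`):

```shell
# Start the interactive TUI (picks up ~/.config/opencode/opencode.json)
opencode

# Or start with the model preselected (flag name is an assumption)
opencode --model infercom/MiniMax-M2.5
```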
Non-Interactive Mode
For scripts and CI:
Model Configuration Options
Customize model capabilities in opencode.json:
| Option | Description | Default |
|---|---|---|
| `reasoning` | Enable reasoning/thinking | `true` for MiniMax |
| `interleaved.field` | Field carrying reasoning context | `reasoning_content` |
| `tool_call` | Enable function calling | `true` |
| `temperature` | Whether the model accepts a temperature parameter | `true` |
| `limit.context` | Max context tokens | Model-specific |
| `limit.output` | Max output tokens | Model-specific |
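Putting the table together, a per-model override block might look like this; the nesting follows OpenCode's config schema, and the `limit` values below are illustrative, not Infercom's published limits:

```json
{
  "provider": {
    "infercom": {
      "models": {
        "MiniMax-M2.5": {
          "reasoning": true,
          "interleaved": { "field": "reasoning_content" },
          "tool_call": true,
          "temperature": true,
          "limit": {
            "context": 163840,
            "output": 8192
          }
        }
      }
    }
  }
}
```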
Troubleshooting
Read-Before-Write Errors
OpenCode requires reading files before overwriting them (safety feature):
Provider Not Found
Ensure the config file is in the correct location:
Authentication Failed
Re-run the login command. Credentials are stored in ~/.local/share/opencode/auth.json:
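A quick check, assuming the paths used throughout this guide, is to confirm both files exist before retrying:

```shell
# Config file OpenCode reads at startup
ls -l ~/.config/opencode/opencode.json

# Credentials file written by the login command
ls -l ~/.local/share/opencode/auth.json
```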
Performance
- Token throughput: 400+ tokens/sec with MiniMax-M2.5
- Context window: 160K tokens (163,840)
- Tool calling: Fully supported for file operations
Next Steps
- Aider - Simpler terminal tool
- Choosing a Tool - Compare options
- API Reference - Direct API usage