OpenCode is a terminal-based AI coding assistant with a rich TUI (text user interface). It provides an interactive coding experience with file management, reasoning visualization, and tool calling capabilities.
Verified working with MiniMax-M2.5 on Infercom.

Prerequisites

  • An Infercom API key (entered during provider login in Step 2)
  • curl, used by the install script

Installation

curl -fsSL https://opencode.ai/install | bash
Verify installation:
opencode --version

Configuration

OpenCode requires a configuration file to define the Infercom provider.

Step 1: Create Config File

Create ~/.config/opencode/opencode.json:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "infercom": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Infercom (EU Sovereign)",
      "api": "https://api.infercom.ai/v1",
      "models": {
        "MiniMax-M2.5": {
          "id": "MiniMax-M2.5",
          "name": "MiniMax M2.5 (EU Sovereign)",
          "reasoning": true,
          "tool_call": true,
          "temperature": true,
          "interleaved": {
            "field": "reasoning_content"
          },
          "limit": {
            "context": 163840,
            "output": 16384
          }
        }
      }
    }
  },
  "model": "infercom/MiniMax-M2.5"
}
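If you script your environment setup, the same file can be written and sanity-checked from the shell. A minimal sketch, assuming python3 is available for JSON validation; it respects XDG_CONFIG_HOME if set:

```shell
#!/bin/sh
# Create the OpenCode config directory (falls back to ~/.config).
CONFIG_DIR="${XDG_CONFIG_HOME:-$HOME/.config}/opencode"
mkdir -p "$CONFIG_DIR"

# Write the Infercom provider configuration shown above.
cat > "$CONFIG_DIR/opencode.json" <<'EOF'
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "infercom": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Infercom (EU Sovereign)",
      "api": "https://api.infercom.ai/v1",
      "models": {
        "MiniMax-M2.5": {
          "id": "MiniMax-M2.5",
          "name": "MiniMax M2.5 (EU Sovereign)",
          "reasoning": true,
          "tool_call": true,
          "temperature": true,
          "interleaved": { "field": "reasoning_content" },
          "limit": { "context": 163840, "output": 16384 }
        }
      }
    }
  },
  "model": "infercom/MiniMax-M2.5"
}
EOF

# Sanity-check the JSON before launching opencode.
python3 -m json.tool "$CONFIG_DIR/opencode.json" > /dev/null && echo "config OK"
```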

Step 2: Add API Key

Run the provider login command:
opencode providers login -p infercom
Enter your Infercom API key when prompted.
Credentials are stored in ~/.local/share/opencode/auth.json.

Step 3: Verify Setup

Check that the provider is configured:
opencode providers list
Expected output:
+  Credentials ~/.local/share/opencode/auth.json
|
*  infercom api
|
-  1 credential
List available models:
opencode models infercom
Expected output:
infercom/MiniMax-M2.5

Model

Use infercom/MiniMax-M2.5, which is optimized for agentic coding: a 160K-token context window, reasoning, tool calling, and a 75.8% SWE-bench score.

Usage

Interactive TUI

Launch the TUI:
opencode
Use /models to switch models if needed.

Non-Interactive Mode

For scripts and CI:
opencode run --model infercom/MiniMax-M2.5 "Your task description"
Example:
opencode run --model infercom/MiniMax-M2.5 \
  "Add input validation to the login function in auth.py"
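In CI, it helps to wrap the call so a failed run fails the job. A sketch; run_task and the OPENCODE_BIN override are illustrative helpers for testing, not opencode features:

```shell
#!/bin/sh
# run_task: wrapper around `opencode run` that propagates failures to the CI job.
# OPENCODE_BIN is an illustrative override so the wrapper can be exercised with a stub.
run_task() {
  "${OPENCODE_BIN:-opencode}" run --model infercom/MiniMax-M2.5 "$1" || {
    echo "opencode task failed: $1" >&2
    return 1
  }
}
```

Usage: run_task "Add input validation to the login function in auth.py"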

Model Configuration Options

Customize model capabilities in opencode.json:
Option             Description                        Default
reasoning          Enable reasoning/thinking          true for MiniMax
interleaved.field  Field carrying reasoning content   reasoning_content
tool_call          Enable function calling            true
temperature        Enable the temperature parameter   true
limit.context      Max context tokens                 Model-specific
limit.output       Max output tokens                  Model-specific

Troubleshooting

Read-Before-Write Errors

OpenCode requires reading files before overwriting them (safety feature):
Error: You must read file /path/to/file.py before overwriting it.
This is expected behavior. OpenCode will automatically read the file and retry.

Provider Not Found

Ensure the config file is in the correct location:
ls ~/.config/opencode/opencode.json
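A quick shell check covers both a missing file and malformed JSON at once; a sketch, assuming python3 is available for JSON parsing:

```shell
#!/bin/sh
# check_config: verify the config file exists and parses as JSON.
check_config() {
  CONFIG="${XDG_CONFIG_HOME:-$HOME/.config}/opencode/opencode.json"
  if [ ! -f "$CONFIG" ]; then
    echo "missing: $CONFIG"
  elif python3 -m json.tool "$CONFIG" > /dev/null 2>&1; then
    echo "config OK: $CONFIG"
  else
    echo "invalid JSON: $CONFIG"
  fi
}
check_config
```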

Authentication Failed

Re-run the login command:
opencode providers login -p infercom
Or manually edit ~/.local/share/opencode/auth.json:
{
  "infercom": {
    "type": "api",
    "key": "your-infercom-api-key"
  }
}
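To edit it safely without clobbering credentials for other providers, merge the entry rather than overwriting the whole file. A sketch; the key value is a placeholder and python3 is assumed:

```shell
#!/bin/sh
# Merge an infercom entry into auth.json, preserving any other providers.
AUTH_DIR="$HOME/.local/share/opencode"
mkdir -p "$AUTH_DIR"
INFERCOM_API_KEY="your-infercom-api-key" python3 - "$AUTH_DIR/auth.json" <<'EOF'
import json, os, sys

path = sys.argv[1]
data = {}
if os.path.exists(path):
    with open(path) as f:
        data = json.load(f)  # keep existing provider entries

# Placeholder key; replace with your real Infercom API key.
data["infercom"] = {"type": "api", "key": os.environ["INFERCOM_API_KEY"]}

with open(path, "w") as f:
    json.dump(data, f, indent=2)
print("updated", path)
EOF
```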

Performance

  • Token throughput: 400+ tokens/sec with MiniMax-M2.5
  • Context window: 160K tokens (163,840)
  • Tool calling: Fully supported for file operations

Next Steps