Codex CLI is OpenAI’s open-source agentic coding assistant that runs in your terminal. It can read files, write code, run commands, and iterate on its work using the Responses API.

Documentation Index
Fetch the complete documentation index at: https://docs.infercom.ai/llms.txt
Use this file to discover all available pages before exploring further.
Requires Responses API - Codex CLI uses the /v1/responses endpoint, which Infercom supports with MiniMax-M2.5.

Prerequisites
- Node.js 18 or later
- npm or yarn
- Infercom API key
Installation
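Codex CLI is distributed as an npm package, so a typical global install looks like this (requires Node.js 18+):

```shell
# Install Codex CLI globally
npm install -g @openai/codex

# Confirm the binary is on your PATH
codex --version
```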
Configuration
Codex CLI supports custom providers via a TOML configuration file.

Step 1: Set Environment Variable
Step 2: Create Config File
Create ~/.codex/config.toml:
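A minimal sketch of the config file. The key names follow Codex CLI's provider-config format; the Infercom base URL is an assumption, so check the provider's docs for the exact endpoint:

```toml
# ~/.codex/config.toml
model = "MiniMax-M2.5"
model_provider = "infercom"

[model_providers.infercom]
name = "Infercom"
# Base URL is an assumption -- verify against Infercom's API docs
base_url = "https://api.infercom.ai/v1"
env_key = "INFERCOM_API_KEY"
# Codex talks to /v1/responses, so use the Responses wire protocol
wire_api = "responses"
```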
Step 3: Verify Setup
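Assuming the install and config steps above succeeded, verification is simply launching the CLI:

```shell
# From any project directory -- Codex should start its interactive
# session without authentication or connection errors
codex
```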
Run Codex in any project directory. If your configuration is correct, the interactive session starts without errors.

Model
Use MiniMax-M2.5 - optimized for agentic coding, with a 160K context window, built-in reasoning, and a 75.8% SWE-bench score.
Usage
Interactive Mode
Start Codex in your project directory. Given a task, Codex will:

- Read relevant files
- Write or edit code
- Run terminal commands
- Iterate until the task is complete
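A typical session can be started with the task passed as the first argument, since Codex accepts an initial prompt on the command line:

```shell
cd my-project

# Launch the interactive session with an initial task
codex "Write unit tests for the User class"
```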
Example Tasks
- “Add error handling to the login function”
- “Write unit tests for the User class”
- “Refactor this file to use async/await”
- “Find and fix the bug causing the test to fail”
Non-Interactive Mode
For scripted or one-shot use, use the exec subcommand:
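For example, a one-shot run in a script or CI job might look like this (the task string is illustrative):

```shell
# Non-interactive: run a single task and exit, printing output to stdout
codex exec "Fix the lint errors in src/"
```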
Configuration Options
Full ~/.codex/config.toml reference:
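A sketch of a fuller configuration; the key names follow Codex CLI's provider-config format, and the Infercom base URL is an assumption:

```toml
# ~/.codex/config.toml
model = "MiniMax-M2.5"
model_provider = "infercom"

[model_providers.infercom]
name = "Infercom"
base_url = "https://api.infercom.ai/v1"   # assumed endpoint
env_key = "INFERCOM_API_KEY"
wire_api = "responses"   # Codex uses the /v1/responses endpoint
```

Approval behavior (see the table below) can also be controlled per invocation via command-line flags.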
Approval Modes
| Mode | Behavior |
|---|---|
| suggest | Shows diffs and asks before applying (default) |
| auto-edit | Automatically applies file changes, asks before running commands |
| full-auto | Automatically applies all changes |
Troubleshooting
Connection Errors
Verify your configuration by sending a request directly to the API; a successful response reports "completed" as its status.
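A direct curl check against the Responses endpoint (the base URL is an assumption; adjust it to the base_url in your config):

```shell
curl -s https://api.infercom.ai/v1/responses \
  -H "Authorization: Bearer $INFERCOM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "MiniMax-M2.5", "input": "Say hello"}'
# A healthy setup returns JSON containing "status": "completed"
```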
Model Not Found
Ensure the model name is exact (case-sensitive): MiniMax-M2.5
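Double-check the model line in ~/.codex/config.toml (key name assumed from Codex CLI's config format):

```toml
model = "MiniMax-M2.5"   # exact, case-sensitive
```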
Slow Responses
MiniMax-M2.5 runs at 400+ tokens/sec. If responses seem slow:

- Check your network connection
- Large context (many files) increases processing time
- First request may be slower due to model loading
Why Codex CLI with Infercom?
| Feature | Benefit |
|---|---|
| EU Sovereign | Data processed in Germany, GDPR compliant |
| Responses API | Native support for Codex’s agentic architecture |
| Fast inference | 400+ tokens/sec with MiniMax-M2.5 |
| No vendor lock-in | Open-source tool, standard API |
Next Steps
- Aider - Alternative terminal-based tool
- OpenCode - Modern TUI with similar features
- Responses API - API documentation for custom integrations