Setup time: ~3 minutes
Prerequisites
- Windsurf editor installed
- Infercom API key
Configuration
Step 1: Open Settings
- Open Windsurf
- Go to Settings → Windsurf Settings
- Navigate to Cascade → Model Configuration
Step 2: Configure Custom Provider
- Look for BYOK (Bring Your Own Key) or Custom Model Provider options
- Select OpenAI Compatible as the provider type
- Enter the configuration:
  - Base URL: https://api.infercom.ai/v1
  - API Key: Your Infercom API key
  - Model: MiniMax-M2.5
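The three settings above map onto a standard OpenAI-compatible chat completions request. A minimal sketch of what a client sends with this configuration (the API key is a placeholder; the endpoint and model name come from the steps above):

```python
import json

# Values from the configuration above; the key is a placeholder.
BASE_URL = "https://api.infercom.ai/v1"
API_KEY = "YOUR_INFERCOM_API_KEY"
MODEL = "MiniMax-M2.5"

def build_chat_request(prompt: str) -> tuple[str, dict, bytes]:
    """Assemble the URL, headers, and JSON body for a chat completion call."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, headers, body

url, headers, body = build_chat_request("Write a hello-world in Python.")
print(url)  # https://api.infercom.ai/v1/chat/completions
```

Windsurf builds this request for you; the sketch is only meant to show where each field from Step 2 ends up.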
Step 3: Select the Model
- In the model selector, choose your configured Infercom model
- Start coding with Cascade
Model
Use MiniMax-M2.5 - optimized for agentic coding, with a 160K context window (163,840 tokens) and a 75.8% SWE-bench score.
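Because the context window is fixed at 163,840 tokens, the completion budget is whatever remains after the prompt. A small sketch of that arithmetic (the context size is from above; the helper is illustrative, not part of any API):

```python
CONTEXT_WINDOW = 163_840  # MiniMax-M2.5 context size in tokens

def remaining_completion_tokens(prompt_tokens: int, reserve: int = 0) -> int:
    """Tokens left for the model's reply after the prompt (and an optional reserve)."""
    left = CONTEXT_WINDOW - prompt_tokens - reserve
    if left < 0:
        raise ValueError("prompt exceeds the 163,840-token context window")
    return left

print(remaining_completion_tokens(3_840))  # 160000
```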
Usage
With Windsurf configured:
- Cascade Chat: Use the AI chat panel for questions and tasks
- Inline Edits: Select code and ask for modifications
- Multi-file Changes: Cascade can work across your entire project
Troubleshooting
BYOK Not Available
Custom model configuration may require:
- Windsurf Pro subscription
- Latest version of Windsurf
Connection Errors
Verify the endpoint format: the Base URL must be https://api.infercom.ai/v1, including the /v1 suffix and without a trailing slash.
Model Not Recognized
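A quick way to sanity-check the Base URL before pasting it into Windsurf; this sketch flags the most common mistakes (wrong scheme, trailing slash, missing /v1 suffix):

```python
from urllib.parse import urlparse

def check_base_url(url: str) -> list[str]:
    """Return a list of problems with an OpenAI-compatible base URL (empty = OK)."""
    problems = []
    parsed = urlparse(url)
    if parsed.scheme != "https":
        problems.append("use https")
    if url.endswith("/"):
        problems.append("remove the trailing slash")
    if not parsed.path.rstrip("/").endswith("/v1"):
        problems.append("append the /v1 suffix")
    return problems

print(check_base_url("https://api.infercom.ai/v1"))  # []
print(check_base_url("https://api.infercom.ai/"))    # trailing slash, missing /v1
```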
Use the exact model name: MiniMax-M2.5
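Model lookups in OpenAI-compatible APIs are typically exact string matches, so case and punctuation matter. A trivial sketch comparing a configured name against the exact string:

```python
EXACT = "MiniMax-M2.5"  # the model name from the configuration above

def name_matches(configured: str) -> bool:
    """True only for an exact, case-sensitive match (surrounding whitespace trimmed)."""
    return configured.strip() == EXACT

print(name_matches("MiniMax-M2.5"))   # True
print(name_matches("minimax-m2.5"))   # False (wrong case)
```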
Limitations
Next Steps
- Cursor - Alternative AI-native IDE
- Continue - Open-source VS Code extension
- Choosing a Tool - Compare all options