Windsurf (formerly Codeium) is an AI-native code editor. It features Cascade, an agentic AI that can work across your entire codebase.
Setup time: ~3 minutes

Prerequisites

  • Windsurf installed and updated to the latest version
  • An Infercom API key

Configuration

Step 1: Open Settings

  1. Open Windsurf
  2. Go to Settings → Windsurf Settings
  3. Navigate to Cascade → Model Configuration

Step 2: Configure Custom Provider

  1. Look for BYOK (Bring Your Own Key) or Custom Model Provider options
  2. Select OpenAI Compatible as the provider type
  3. Enter the configuration:
    • Base URL: https://api.infercom.ai/v1
    • API Key: Your Infercom API key
    • Model: MiniMax-M2.5
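To sanity-check these values outside the editor, you can assemble the same OpenAI-compatible chat request that Windsurf will send. This is a minimal Python sketch; the helper function and placeholder key are ours for illustration, not part of Windsurf or the Infercom API:

```python
import json
import urllib.request

# Values mirror the Step 2 configuration; the API key is a placeholder.
BASE_URL = "https://api.infercom.ai/v1"
API_KEY = "YOUR_INFERCOM_API_KEY"
MODEL = "MiniMax-M2.5"

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Assemble URL, headers, and JSON body for an OpenAI-compatible
    /chat/completions call, without sending anything."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request(BASE_URL, API_KEY, MODEL, "Hello")
# To actually send it (requires a valid key and network access):
# req = urllib.request.Request(url, data=body.encode(), headers=headers)
# print(urllib.request.urlopen(req).read().decode())
```

If this request succeeds with your key from a terminal or script, any remaining failure is on the Windsurf configuration side rather than the endpoint.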

Step 3: Select the Model

  1. In the model selector, choose your configured Infercom model
  2. Start coding with Cascade

Model

Use MiniMax-M2.5, which is optimized for agentic coding, with a 160K context window (163,840 tokens) and a 75.8% score on SWE-bench.

Usage

With Windsurf configured:
  • Cascade Chat: Use the AI chat panel for questions and tasks
  • Inline Edits: Select code and ask for modifications
  • Multi-file Changes: Cascade can work across your entire project

Troubleshooting

BYOK Not Available

Custom model configuration may require:
  • Windsurf Pro subscription
  • Latest version of Windsurf
Check Windsurf’s documentation for current BYOK availability.

Connection Errors

Verify that the base URL matches exactly:
https://api.infercom.ai/v1
Do not add a trailing slash.
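Stray whitespace or a trailing slash is easy to pick up when copying the URL. A small Python helper (hypothetical, for illustration only) that normalizes the value before you paste it into Windsurf:

```python
def normalize_base_url(url: str) -> str:
    """Strip surrounding whitespace and trailing slashes so the endpoint
    matches the expected https://api.infercom.ai/v1 form."""
    return url.strip().rstrip("/")

print(normalize_base_url("https://api.infercom.ai/v1/ "))
# → https://api.infercom.ai/v1
```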

Model Not Recognized

Use the exact model name: MiniMax-M2.5
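Model names are typically matched case-sensitively. OpenAI-compatible servers usually expose a GET /v1/models endpoint returning {"data": [{"id": "..."}]}; given such a payload, a small Python helper (ours, for illustration) can flag a near-miss in casing. The sample payload below is made up, not real Infercom output:

```python
def check_model_name(models_payload: dict, configured: str) -> str:
    """Check a configured model name against a /v1/models-style payload,
    distinguishing exact matches, casing mismatches, and unknown names."""
    ids = [m["id"] for m in models_payload.get("data", [])]
    if configured in ids:
        return "ok"
    near = [i for i in ids if i.lower() == configured.lower()]
    if near:
        return f"case mismatch: use {near[0]!r}"
    return f"not found; available: {ids}"

# Hypothetical sample payload for illustration only:
sample = {"data": [{"id": "MiniMax-M2.5"}]}
print(check_model_name(sample, "minimax-m2.5"))
# → case mismatch: use 'MiniMax-M2.5'
```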

Limitations

Custom model configuration in Windsurf may be limited to certain subscription tiers. If BYOK options are not available in your version, consider Cursor or Continue as alternatives.

Next Steps