The Infercom Inference Service provides access to a broad selection of AI models through the Global Model Catalog. Models are available in two categories: EU-hosted models running on Infercom’s sovereign infrastructure in Germany, and additional models available via global infrastructure.

EU-hosted models

The following models run on Infercom’s EU infrastructure in Germany. For these models, all data processing happens entirely within the EU — no data leaves EU jurisdiction, with full GDPR compliance and no US CLOUD Act exposure.
| Developer | Model ID                    | Context length | Region | View on Hugging Face |
|-----------|-----------------------------|----------------|--------|----------------------|
| DeepSeek  | DeepSeek-V3.1               | 128k tokens    | EU     | Model card           |
| Meta      | Meta-Llama-3.3-70B-Instruct | 128k tokens    | EU     | Model card           |
| OpenAI    | gpt-oss-120b                | 128k tokens    | EU     | Model card           |

Global Model Catalog

In addition to EU-hosted models, the Infercom Inference Service provides access to a broader selection of models through the Global Model Catalog. Models not hosted in our EU datacenters are served via global infrastructure. Every model is clearly labeled with its hosting region, both in the API response and in the Playground, so you always know where your data is being processed.
| Developer | Model ID                          | Context length | Region | View on Hugging Face |
|-----------|-----------------------------------|----------------|--------|----------------------|
| Alibaba   | Qwen3-32B                         | 32k tokens     | Global | Model card           |
| Alibaba   | Qwen3-235B                        | 64k tokens     | Global | Model card           |
| DeepSeek  | DeepSeek-R1-0528                  | 128k tokens    | Global | Model card           |
| DeepSeek  | DeepSeek-R1-Distill-Llama-70B     | 128k tokens    | Global | Model card           |
| DeepSeek  | DeepSeek-V3-0324                  | 128k tokens    | Global | Model card           |
| DeepSeek  | DeepSeek-V3.1-Terminus            | 128k tokens    | Global | Model card           |
| DeepSeek  | DeepSeek-V3.2                     | 8k tokens      | Global | Model card           |
| Meta      | Llama-4-Maverick-17B-128E-Instruct| 128k tokens    | Global | Model card           |
| Meta      | Meta-Llama-3.1-8B-Instruct        | 16k tokens     | Global | Model card           |
See Identifying model regions below for how to check where each model runs.

Identifying model regions

You can identify where a model is hosted through the API or the Playground.

Via the API

The /v1/models endpoint includes an sn_metadata object for each model. Use the region field to determine where a model is hosted: "EU" for sovereign models on Infercom’s EU infrastructure, or a non-EU region (e.g. "US", "JP") for models on global infrastructure. Use the ?verbose=true query parameter to retrieve detailed model metadata including sovereignty information:
curl -s "https://api.infercom.ai/v1/models?verbose=true" \
  -H "Authorization: Bearer $INFERCOM_API_KEY" | \
  jq '.data[] | {id, region: .sn_metadata.region}'
Example response for an EU-hosted model:
{
  "id": "DeepSeek-V3.1",
  "object": "model",
  "sn_metadata": {
    "is_external": false,
    "region": "EU"
  }
}
Example response for a globally-routed model:
{
  "id": "DeepSeek-R1-0528",
  "object": "model",
  "sn_metadata": {
    "is_external": true,
    "region": "US"
  }
}
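Given a parsed response in the shape shown above, you can split the catalog by hosting region in a few lines. This is a minimal Python sketch; the `partition_by_region` helper is illustrative, not part of any Infercom SDK, and the payload below simply reuses the two example responses:

```python
# Partition a parsed /v1/models?verbose=true response into EU-hosted
# and globally routed models, keyed on sn_metadata.region as shown above.

def partition_by_region(models_response: dict) -> tuple[list[str], list[str]]:
    """Return (eu_model_ids, global_model_ids) from a /v1/models payload."""
    eu, global_ = [], []
    for model in models_response.get("data", []):
        region = model.get("sn_metadata", {}).get("region")
        (eu if region == "EU" else global_).append(model["id"])
    return eu, global_

# Sample payload built from the two example responses above:
payload = {
    "data": [
        {"id": "DeepSeek-V3.1", "object": "model",
         "sn_metadata": {"is_external": False, "region": "EU"}},
        {"id": "DeepSeek-R1-0528", "object": "model",
         "sn_metadata": {"is_external": True, "region": "US"}},
    ]
}

eu_models, global_models = partition_by_region(payload)
print(eu_models)      # ['DeepSeek-V3.1']
print(global_models)  # ['DeepSeek-R1-0528']
```

In a real client you would build `payload` from the `curl` call shown earlier (or any HTTP library) rather than hard-coding it.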

Via the Playground

In the Infercom Playground, region flags are displayed next to each model name, letting you see at a glance where each model runs.

Data sovereignty

EU sovereignty applies to EU-hosted models only. When using models from the Global Model Catalog that are not hosted on EU infrastructure, requests are processed on global infrastructure outside the EU. Always check the model’s region before processing sensitive or regulated data.
For EU-hosted models, Infercom provides:
  • EU data residency — inference runs in our EU datacenters
  • GDPR compliance — full compliance with EU data protection regulations
  • No US CLOUD Act exposure — your inference data is not subject to US jurisdiction
  • AI Act readiness — designed for compliance with the EU AI Act
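The "always check the model's region" guidance above can be enforced in code rather than by convention. A minimal sketch, assuming you have already built a model-to-region mapping from `/v1/models?verbose=true`; the `require_eu_region` helper and the `model_regions` dict are illustrative, not part of the Infercom API:

```python
# Guard that refuses to route sensitive or regulated data to a model
# that is not hosted on Infercom's EU infrastructure.

def require_eu_region(model_id: str, model_regions: dict[str, str]) -> None:
    """Raise ValueError if the model is not EU-hosted."""
    region = model_regions.get(model_id)
    if region != "EU":
        raise ValueError(
            f"Model {model_id!r} runs in region {region!r}; "
            "use an EU-hosted model for sensitive or regulated data."
        )

# Mapping assembled from the /v1/models response (hard-coded here for illustration):
model_regions = {"DeepSeek-V3.1": "EU", "DeepSeek-R1-0528": "US"}

require_eu_region("DeepSeek-V3.1", model_regions)       # passes silently
# require_eu_region("DeepSeek-R1-0528", model_regions)  # raises ValueError
```

Calling such a guard before every inference request on regulated workloads turns the region check into a hard invariant instead of a manual step.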