LLM Connections let you add multiple AI provider configurations and switch between them. Each session locks to a specific connection after the first message, and workspaces can define their own default connection.

Location

LLM connections are stored in:
~/.craft-agent/config.json
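The overall shape of the file is roughly the sketch below. The defaultLlmConnection field is documented under How Connections Are Used; the llmConnections key name is an assumption here, so verify it against your own file:

```json
{
  "defaultLlmConnection": "anthropic-api",
  "llmConnections": [
    {
      "slug": "anthropic-api",
      "name": "Anthropic (API Key)",
      "providerType": "anthropic",
      "authType": "api_key",
      "defaultModel": "claude-sonnet-4-6",
      "createdAt": 1737451800000
    }
  ]
}
```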

How Connections Are Used

Connections resolve in this order:
  1. Session connection (locked after first message)
  2. Workspace default connection (defaults.defaultLlmConnection)
  3. Global default connection (defaultLlmConnection)
  4. First connection in the list (fallback)
Each session locks to a connection after the first message. To change connections, start a new session.
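The precedence above amounts to a first-match lookup. As a minimal sketch (parameter and key names are illustrative, not Craft Agents' actual internals):

```python
def resolve_connection(session, workspace, global_default, connections):
    """Walk the documented precedence and return a connection slug.

    session and workspace are dicts; the key names used here are
    illustrative guesses, not Craft Agents' real schema.
    """
    if session.get("connection"):               # 1. session lock (after first message)
        return session["connection"]
    if workspace.get("defaultLlmConnection"):   # 2. workspace default
        return workspace["defaultLlmConnection"]
    if global_default:                          # 3. global default
        return global_default
    # 4. fall back to the first connection in the list, if any
    return connections[0]["slug"] if connections else None
```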

Connection Schema

{
  "slug": "anthropic-api",
  "name": "Anthropic (API Key)",
  "providerType": "anthropic",
  "authType": "api_key",
  "baseUrl": "https://api.anthropic.com",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}

Fields

Field           Required  Description
slug            Yes       URL-safe identifier (e.g., anthropic-api, codex)
name            Yes       Display name shown in the UI
providerType    Yes       Provider backend (see list below)
authType        Yes       Auth mechanism (see list below)
baseUrl         No        Custom base URL for compatible providers
models          No        Explicit model list. Accepts strings ("gpt-5.4") or objects with optional contextWindow and supportsImages overrides (see Custom Endpoint Capabilities).
customEndpoint  No        Custom endpoint protocol config. Use api to select the wire format and optional supportsImages to opt an entire endpoint into image input.
defaultModel    No        Default model for this connection
codexPath       No        Path to Codex binary (OpenAI/Codex only)
awsRegion       No        AWS region for Bedrock
gcpProjectId    No        GCP project for Vertex
gcpRegion       No        GCP region for Vertex
createdAt       Yes       Timestamp (ms) when created
lastUsedAt      No        Timestamp (ms) when last used

providerType Values

Value             Description
anthropic         Direct Anthropic API
anthropic_compat  Anthropic‑compatible endpoints (OpenRouter, Vercel AI Gateway, custom)
openai            OpenAI via Codex app‑server
openai_compat     OpenAI‑compatible endpoints
bedrock           AWS Bedrock
vertex            Google Vertex AI

authType Values

Value                  Description
api_key                API key only
api_key_with_endpoint  API key + custom endpoint
oauth                  OAuth login (Claude Max / Codex / OpenAI)
iam_credentials        AWS IAM credentials (Bedrock)
service_account_file   GCP service account JSON (Vertex)
environment            Uses environment variables
none                   No auth required

Examples

Anthropic (API Key)

{
  "slug": "anthropic-api",
  "name": "Anthropic (API Key)",
  "providerType": "anthropic",
  "authType": "api_key",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}

Claude Max (OAuth)

{
  "slug": "claude-max",
  "name": "Claude Max",
  "providerType": "anthropic",
  "authType": "oauth",
  "defaultModel": "claude-opus-4-6",
  "createdAt": 1737451800000
}

OpenRouter (Anthropic‑compatible)

{
  "slug": "openrouter",
  "name": "OpenRouter",
  "providerType": "anthropic_compat",
  "authType": "api_key_with_endpoint",
  "baseUrl": "https://openrouter.ai/api",
  "models": [
    "anthropic/claude-opus-4.6",
    "anthropic/claude-haiku-4.5"
  ],
  "defaultModel": "anthropic/claude-opus-4.6",
  "createdAt": 1737451800000
}

OpenRouter (OpenAI‑compatible)

{
  "slug": "openrouter-openai",
  "name": "OpenRouter (OpenAI‑compat)",
  "providerType": "openai_compat",
  "authType": "api_key_with_endpoint",
  "baseUrl": "https://openrouter.ai/api/v1",
  "models": [
    "openai/gpt-5.2-codex",
    "openai/gpt-5.1-codex-mini"
  ],
  "defaultModel": "openai/gpt-5.2-codex",
  "createdAt": 1737451800000
}

AWS Bedrock (IAM Credentials)

{
  "slug": "bedrock",
  "name": "AWS Bedrock",
  "providerType": "bedrock",
  "authType": "iam_credentials",
  "awsRegion": "us-east-1",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}
With iam_credentials, your AWS Access Key ID, Secret Access Key, and optional Session Token are stored securely and injected into the subprocess environment at runtime. Use this when you want to configure credentials directly in the UI.

AWS Bedrock (Environment)

{
  "slug": "bedrock",
  "name": "AWS Bedrock",
  "providerType": "bedrock",
  "authType": "environment",
  "awsRegion": "us-east-1",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}
With environment, the subprocess inherits your shell’s AWS credential chain — ~/.aws/credentials, AWS_PROFILE, IAM roles, SSO sessions, and environment variables all work. No credentials are stored in Craft Agents.
To set up Bedrock in the UI: Settings → AI → Add Connection → I use other provider → Amazon Bedrock. You can choose between IAM Credentials and Environment (AWS CLI) authentication.
Set awsRegion to the region where you have Bedrock model access enabled (e.g., us-east-1, us-west-2, eu-west-1).
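Before saving the connection, it can help to confirm that your credential chain resolves and that the target region actually lists Anthropic models. A sketch using the AWS CLI (assumes the CLI is installed and configured):

```shell
# Confirm which identity the AWS credential chain resolves to
aws sts get-caller-identity

# List the Anthropic model IDs available to you in the target region
aws bedrock list-foundation-models \
  --region us-east-1 \
  --by-provider anthropic \
  --query 'modelSummaries[].modelId'
```

If the second command returns an empty list, enable model access for that region in the AWS Bedrock console first.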

Codex / OpenAI (OAuth)

{
  "slug": "codex",
  "name": "OpenAI (Codex)",
  "providerType": "openai",
  "authType": "oauth",
  "codexPath": "/Applications/Craft Agents.app/Contents/Resources/vendor/codex/darwin-arm64/codex",
  "defaultModel": "codex-mini-latest",
  "createdAt": 1737451800000
}

Custom Endpoint Capabilities

Custom endpoints default to a 128K context window and text-only input. If your model supports a larger context window or accepts image input, set those capabilities explicitly in the connection config. Use model objects when an endpoint hosts a mix of text-only and multimodal models:
{
  "slug": "ollama",
  "name": "Ollama",
  "providerType": "pi_compat",
  "authType": "none",
  "baseUrl": "http://localhost:11434/v1",
  "customEndpoint": { "api": "openai-completions" },
  "models": [
    { "id": "gemma4", "contextWindow": 262144, "supportsImages": true },
    { "id": "qwen3-coder", "contextWindow": 131072 }
  ],
  "defaultModel": "gemma4",
  "createdAt": 1737451800000
}

Whole-endpoint opt-in

If every model behind the endpoint is multimodal, you can opt in at the endpoint level:
{
  "customEndpoint": {
    "api": "openai-completions",
    "supportsImages": true
  }
}
Craft Agents does not auto-detect image support for arbitrary endpoints. Custom endpoints stay text-only unless you explicitly set supportsImages: true at the endpoint or model level. Plain string model entries continue to use the default 128K context window and text-only input.
For models like Gemma 4 served through Ollama, vLLM, or another OpenAI-compatible proxy, prefer the per-model form so only the vision-capable model opts into image input.

Managing Connections

Connections are managed in Settings → AI:
  • Add/edit/delete connections
  • Set a global default connection
  • Validate connection status
  • Set per‑workspace defaults
If you only need a single provider, keep one connection and set it as default.
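For scripting, the config file can also be inspected directly. A minimal sketch, assuming connections live under a top-level llmConnections array (that key name is a guess, so verify it against your own config.json):

```python
import json
from pathlib import Path

def list_connections(config_path: str) -> list[str]:
    """Return the slug of each LLM connection in a Craft Agents config file.

    Assumes connections live under a top-level "llmConnections" array;
    adjust the key name if your config.json differs.
    """
    config = json.loads(Path(config_path).read_text())
    return [conn["slug"] for conn in config.get("llmConnections", [])]
```

For example, list_connections("~/.craft-agent/config.json" expanded via Path.expanduser) would return the slugs you can pass as a workspace or global default.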