LLM Connections let you add multiple AI provider configurations and switch between them. Each session locks to a specific connection after the first message, and workspaces can define their own default connection.
Location
LLM connections are stored in:
```
~/.craft-agent/config.json
```
How Connections Are Used
Connections resolve in this order:

1. Session connection (locked after the first message)
2. Workspace default connection (`defaults.defaultLlmConnection`)
3. Global default connection (`defaultLlmConnection`)
4. First connection in the list (fallback)
Each session locks to a connection after the first message. To change connections, start a new session.
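The resolution order above can be sketched as a small function. This is a hypothetical illustration, not the actual implementation: the top-level list key (`llmConnections` here) and the session/workspace shapes are assumptions, while `defaultLlmConnection` and `defaults.defaultLlmConnection` are the config fields documented on this page.

```python
def resolve_connection(session: dict, workspace: dict, config: dict) -> dict:
    """Pick the LLM connection for a session, in documented priority order.

    Hypothetical sketch: inputs are plain dicts mirroring the config file.
    """
    connections = {c["slug"]: c for c in config.get("llmConnections", [])}

    # 1. Session connection (locked after the first message)
    # 2. Workspace default (defaults.defaultLlmConnection)
    # 3. Global default (defaultLlmConnection)
    for slug in (
        session.get("connection"),
        workspace.get("defaults", {}).get("defaultLlmConnection"),
        config.get("defaultLlmConnection"),
    ):
        if slug and slug in connections:
            return connections[slug]

    # 4. Fallback: first connection in the list
    return config["llmConnections"][0]
```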
Connection Schema
```json
{
  "slug": "anthropic-api",
  "name": "Anthropic (API Key)",
  "providerType": "anthropic",
  "authType": "api_key",
  "baseUrl": "https://api.anthropic.com",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}
```
Fields
| Field | Required | Description |
|---|---|---|
| `slug` | Yes | URL-safe identifier (e.g., `anthropic-api`, `codex`) |
| `name` | Yes | Display name shown in the UI |
| `providerType` | Yes | Provider backend (see list below) |
| `authType` | Yes | Auth mechanism (see list below) |
| `baseUrl` | No | Custom base URL for compatible providers |
| `models` | No | Explicit model list. Accepts strings (`"gpt-5.4"`) or objects with optional `contextWindow` and `supportsImages` overrides (see Custom Endpoint Capabilities). |
| `customEndpoint` | No | Custom endpoint protocol config. Use `api` to select the wire format and optional `supportsImages` to opt an entire endpoint into image input. |
| `defaultModel` | No | Default model for this connection |
| `codexPath` | No | Path to Codex binary (OpenAI/Codex only) |
| `awsRegion` | No | AWS region for Bedrock |
| `gcpProjectId` | No | GCP project for Vertex |
| `gcpRegion` | No | GCP region for Vertex |
| `createdAt` | Yes | Timestamp (ms) when created |
| `lastUsedAt` | No | Timestamp (ms) when last used |
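If you script against the config file, a minimal sanity check on the required fields might look like this (hypothetical helper; only the field names come from the table above):

```python
# Required fields per the table above
REQUIRED_FIELDS = {"slug", "name", "providerType", "authType", "createdAt"}

def validate_connection(conn: dict) -> list[str]:
    """Return a list of problems with a connection entry (empty if OK)."""
    problems = [
        f"missing required field: {f}"
        for f in sorted(REQUIRED_FIELDS - conn.keys())
    ]
    # slug must be a URL-safe identifier (letters, digits, '-', '_')
    if "slug" in conn and not conn["slug"].replace("-", "").replace("_", "").isalnum():
        problems.append("slug must be URL-safe")
    return problems
```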
providerType Values
| Value | Description |
|---|---|
| `anthropic` | Direct Anthropic API |
| `anthropic_compat` | Anthropic-compatible endpoints (OpenRouter, Vercel AI Gateway, custom) |
| `openai` | OpenAI via Codex app-server |
| `openai_compat` | OpenAI-compatible endpoints |
| `bedrock` | AWS Bedrock |
| `vertex` | Google Vertex AI |
authType Values
| Value | Description |
|---|---|
| `api_key` | API key only |
| `api_key_with_endpoint` | API key + custom endpoint |
| `oauth` | OAuth login (Claude Max / Codex / OpenAI) |
| `iam_credentials` | AWS IAM credentials (Bedrock) |
| `service_account_file` | GCP service account JSON (Vertex) |
| `environment` | Uses environment variables |
| `none` | No auth required |
Examples
Anthropic (API Key)
```json
{
  "slug": "anthropic-api",
  "name": "Anthropic (API Key)",
  "providerType": "anthropic",
  "authType": "api_key",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}
```
Claude Max (OAuth)
```json
{
  "slug": "claude-max",
  "name": "Claude Max",
  "providerType": "anthropic",
  "authType": "oauth",
  "defaultModel": "claude-opus-4-6",
  "createdAt": 1737451800000
}
```
OpenRouter (Anthropic‑compatible)
```json
{
  "slug": "openrouter",
  "name": "OpenRouter",
  "providerType": "anthropic_compat",
  "authType": "api_key_with_endpoint",
  "baseUrl": "https://openrouter.ai/api",
  "models": [
    "anthropic/claude-opus-4.6",
    "anthropic/claude-haiku-4.5"
  ],
  "defaultModel": "anthropic/claude-opus-4.6",
  "createdAt": 1737451800000
}
```
OpenRouter (OpenAI‑compatible)
```json
{
  "slug": "openrouter-openai",
  "name": "OpenRouter (OpenAI-compat)",
  "providerType": "openai_compat",
  "authType": "api_key_with_endpoint",
  "baseUrl": "https://openrouter.ai/api/v1",
  "models": [
    "openai/gpt-5.2-codex",
    "openai/gpt-5.1-codex-mini"
  ],
  "defaultModel": "openai/gpt-5.2-codex",
  "createdAt": 1737451800000
}
```
AWS Bedrock (IAM Credentials)
```json
{
  "slug": "bedrock",
  "name": "AWS Bedrock",
  "providerType": "bedrock",
  "authType": "iam_credentials",
  "awsRegion": "us-east-1",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}
```
With iam_credentials, your AWS Access Key ID, Secret Access Key, and optional Session Token are stored securely and injected into the subprocess environment at runtime. Use this when you want to configure credentials directly in the UI.
AWS Bedrock (Environment)
```json
{
  "slug": "bedrock",
  "name": "AWS Bedrock",
  "providerType": "bedrock",
  "authType": "environment",
  "awsRegion": "us-east-1",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}
```
With environment, the subprocess inherits your shell’s AWS credential chain — ~/.aws/credentials, AWS_PROFILE, IAM roles, SSO sessions, and environment variables all work. No credentials are stored in Craft Agents.
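Any of the standard AWS mechanisms will be picked up before launching the app; for example (profile name and key values below are placeholders):

```shell
# Option 1: use a named profile from ~/.aws/credentials
export AWS_PROFILE=bedrock-profile

# Option 2: export credentials directly
export AWS_ACCESS_KEY_ID=AKIA...      # placeholder
export AWS_SECRET_ACCESS_KEY=...      # placeholder

# Option 3: refresh an SSO session first
aws sso login --profile bedrock-profile
```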
To set up Bedrock in the UI: Settings → AI → Add Connection → I use other provider → Amazon Bedrock. You can choose between IAM Credentials and Environment (AWS CLI) authentication.
Set awsRegion to the region where you have Bedrock model access enabled (e.g., us-east-1, us-west-2, eu-west-1).
Codex / OpenAI (OAuth)
```json
{
  "slug": "codex",
  "name": "OpenAI (Codex)",
  "providerType": "openai",
  "authType": "oauth",
  "codexPath": "/Applications/Craft Agents.app/Contents/Resources/vendor/codex/darwin-arm64/codex",
  "defaultModel": "codex-mini-latest",
  "createdAt": 1737451800000
}
```
Custom Endpoint Capabilities
Custom endpoints default to a 128K context window and text-only input. If your model supports a larger context window or accepts image input, set those capabilities explicitly in the connection config.
Per-model overrides (recommended)
Use model objects when an endpoint hosts a mix of text-only and multimodal models:
```json
{
  "slug": "ollama",
  "name": "Ollama",
  "providerType": "openai_compat",
  "authType": "none",
  "baseUrl": "http://localhost:11434/v1",
  "customEndpoint": { "api": "openai-completions" },
  "models": [
    { "id": "gemma4", "contextWindow": 262144, "supportsImages": true },
    { "id": "qwen3-coder", "contextWindow": 131072 }
  ],
  "defaultModel": "gemma4",
  "createdAt": 1737451800000
}
```
Whole-endpoint opt-in
If every model behind the endpoint is multimodal, you can opt in at the endpoint level:
```json
{
  "customEndpoint": {
    "api": "openai-completions",
    "supportsImages": true
  }
}
```
Craft Agents does not auto-detect image support for arbitrary endpoints. Custom endpoints stay text-only unless you explicitly set supportsImages: true at the endpoint or model level. Plain string model entries continue to use the default 128K context window and text-only input.
For models like Gemma 4 served through Ollama, vLLM, or another OpenAI-compatible proxy, prefer the per-model form so only the vision-capable model opts into image input.
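The precedence described here — per-model override first, then endpoint-level opt-in, then the text-only 128K default — can be sketched as a small helper (a hypothetical illustration, not the actual Craft Agents implementation):

```python
DEFAULT_CONTEXT_WINDOW = 131_072  # documented 128K default for custom endpoints

def model_capabilities(connection: dict, model_id: str) -> dict:
    """Resolve context window and image support for one model on a custom endpoint."""
    endpoint = connection.get("customEndpoint", {})
    caps = {
        "contextWindow": DEFAULT_CONTEXT_WINDOW,
        # Whole-endpoint opt-in applies unless a model object overrides it
        "supportsImages": endpoint.get("supportsImages", False),
    }
    for m in connection.get("models", []):
        # Plain string entries keep the defaults; only objects can override
        if isinstance(m, dict) and m.get("id") == model_id:
            caps["contextWindow"] = m.get("contextWindow", caps["contextWindow"])
            caps["supportsImages"] = m.get("supportsImages", caps["supportsImages"])
    return caps
```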
Managing Connections
Connections are managed in Settings → AI:
- Add/edit/delete connections
- Set a global default connection
- Validate connection status
- Set per‑workspace defaults
If you only need a single provider, keep one connection and set it as default.