Craft Agents supports multiple API providers through a built-in preset system. You can connect to Anthropic directly, use aggregators like OpenRouter, run local models via Ollama, or point to any API endpoint compatible with the Anthropic Messages format.
This page covers Anthropic-compatible providers. For Codex/OpenAI connections and multi‑connection setup, see LLM Connections.

Supported Providers

| Provider | Base URL | API Key Required | Notes |
| --- | --- | --- | --- |
| Anthropic | https://api.anthropic.com | Yes | Default provider. No model override needed. |
| OpenRouter | https://openrouter.ai/api | Yes | Access multiple AI providers through one API. |
| Vercel AI Gateway | https://ai-gateway.vercel.sh | Yes | Unified gateway for AI model routing. |
| Ollama | http://localhost:11434 | No | Run models locally. Requires Ollama 0.14+. |
| Custom | Any URL | Depends on provider | Any Anthropic-compatible endpoint. |
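The presets in the table above can be summarized as a simple mapping. This is an illustrative sketch, not Craft Agents' internal representation:

```python
# Preset base URLs and key requirements, taken from the provider table.
# The Custom preset has no fixed URL, so it is omitted here.
PROVIDER_PRESETS = {
    "Anthropic": {"base_url": "https://api.anthropic.com", "api_key_required": True},
    "OpenRouter": {"base_url": "https://openrouter.ai/api", "api_key_required": True},
    "Vercel AI Gateway": {"base_url": "https://ai-gateway.vercel.sh", "api_key_required": True},
    "Ollama": {"base_url": "http://localhost:11434", "api_key_required": False},
}
```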

Setting Up a Provider

During First Launch

  1. In the setup wizard, select API Key
  2. Enter your API key
  3. Select a Base URL preset from the dropdown (Anthropic, OpenRouter, Vercel AI Gateway, or Custom)
  4. Optionally specify a Model name (required for non-Anthropic providers)
  5. The connection is tested automatically before saving

In Settings

  1. Open Settings (gear icon or Cmd+,)
  2. Click on the API Connection section
  3. Change your API key, base URL, or model as needed

Model Names

For Anthropic, no model override is needed — Craft Agents uses its built-in model routing (Sonnet, Opus, Haiku) automatically. For OpenRouter and Vercel AI Gateway, models use the provider/model-name format:
anthropic/claude-sonnet-4
anthropic/claude-opus-4
openai/gpt-4o
google/gemini-2.5-pro
meta-llama/llama-4-maverick
For Ollama, use the local model name directly:
llama3.2
qwen3-coder
deepseek-r1
If the Model field is left empty for a non-Anthropic provider, Craft Agents falls back to Anthropic model name formatting. This works for providers that accept Anthropic model names natively, but not all providers do.

Provider Details

OpenRouter

OpenRouter gives you access to hundreds of AI models through a single API key. It handles billing, rate limiting, and fallbacks across providers.
  1. Get your API key at openrouter.ai/keys
  2. Select the OpenRouter preset in the Base URL dropdown
  3. Set your model (e.g. anthropic/claude-sonnet-4)
Browse available models at openrouter.ai/models.

Ollama (Local Models)

Ollama runs open-source models locally on your machine. No API key is required, and data never leaves your computer.

Requirements:
  • Ollama 0.14 or newer (for Anthropic-compatible API format)
  • A model pulled locally
# Pull a model (after installing Ollama)
ollama pull llama3.2
To connect:
  1. Select the Custom preset in the Base URL dropdown
  2. Enter http://localhost:11434 as the URL
  3. Leave the API key empty
  4. Set the model name (e.g. llama3.2)
Ollama 0.14+ is required for compatibility with Craft Agents; earlier versions do not support the Anthropic Messages API format. Check your version with ollama --version and update Ollama if needed.

Vercel AI Gateway

Vercel AI Gateway provides a unified endpoint for routing requests to multiple AI providers with built-in observability and caching.
  1. Get your API key from your Vercel dashboard
  2. Select the Vercel AI Gateway preset
  3. Set your model using provider/model-name format
See supported models in the Vercel documentation.

Custom Endpoint

For any API that implements the Anthropic Messages format:
  1. Select the Custom preset
  2. Enter the full base URL of your endpoint
  3. Enter your API key (if required)
  4. Specify the model name your endpoint expects
This works with self-hosted proxies, enterprise gateways, or any service that implements the /v1/messages endpoint.

How It Works

When you configure a non-default provider, Craft Agents stores:
  • The API key in the encrypted credentials file (~/.craft-agent/credentials.enc)
  • The base URL and default model in the LLM connection configuration
At session launch, the base URL is passed via the ANTHROPIC_BASE_URL environment variable to the underlying Claude Code SDK.
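That handoff can be sketched as follows, assuming the SDK process simply reads ANTHROPIC_BASE_URL from its environment (launch_session is a hypothetical name, not Craft Agents' API):

```python
import os
import subprocess
import sys

def launch_session(base_url: str, argv: list[str]) -> subprocess.CompletedProcess:
    """Hypothetical sketch: spawn a child process with the configured base URL."""
    env = dict(os.environ)
    env["ANTHROPIC_BASE_URL"] = base_url  # e.g. "https://openrouter.ai/api"
    return subprocess.run(argv, env=env, capture_output=True, text=True)

# Demonstrate with a child process that just echoes the variable back:
result = launch_session(
    "http://localhost:11434",
    [sys.executable, "-c", "import os; print(os.environ['ANTHROPIC_BASE_URL'])"],
)
print(result.stdout.strip())  # -> http://localhost:11434
```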

Troubleshooting

Connection errors. Verify:
  • The base URL is correct and accessible from your machine
  • Your API key is valid and has sufficient permissions
  • The endpoint supports the Anthropic Messages API format (/v1/messages)
Model errors. Check that the model name matches exactly what your provider expects:
  • OpenRouter/Vercel: use provider/model-name format (e.g. anthropic/claude-sonnet-4)
  • Ollama: use the local model name (e.g. llama3.2)
  • Custom: check your provider’s documentation for valid model identifiers
Authentication errors. Check that:
  • Your API key is correct and hasn’t expired
  • No API key is set for Ollama (leave it empty)
  • Your key has available credits/quota
Ollama issues. Verify:
  • Ollama is running: ollama list
  • You’re on version 0.14+: ollama --version
  • The model is pulled: ollama pull llama3.2
  • The URL is http://localhost:11434 (note: HTTP, not HTTPS)
If you hit rate limits, check your provider’s usage limits and consider upgrading your plan or using a different provider.

Security

Your API key is stored securely in the encrypted credentials file. See Credentials for details on how credentials are protected. The base URL and model name are stored in the LLM connection configuration (not encrypted, as they are not sensitive).