
CLI Reference

craft-cli is a terminal client that connects to a running Craft Agent headless server over WebSocket. It provides commands for listing resources, managing sessions, sending messages with real-time streaming, and validating server health.

Installation

# Clone the repository
git clone https://github.com/anthropics/craft-agents.git
cd craft-agents

# Install dependencies
bun install

# Option A: Run directly
bun run apps/cli/src/index.ts <command>

# Option B: Link globally (adds craft-cli to PATH)
cd apps/cli && bun link
craft-cli <command>

Quick Start

The fastest way to try it out — no server setup needed:
# Self-contained run (spawns a server automatically)
ANTHROPIC_API_KEY=sk-... bun run apps/cli/src/index.ts run "Hello, world!"

Connection Options

| Flag | Env Variable | Default | Description |
|---|---|---|---|
| `--url <ws[s]://...>` | `CRAFT_SERVER_URL` | — | Server WebSocket URL |
| `--token <secret>` | `CRAFT_SERVER_TOKEN` | — | Authentication token |
| `--workspace <id>` | — | auto-detect | Workspace ID |
| `--timeout <ms>` | — | 10000 | Request timeout |
| `--tls-ca <path>` | `CRAFT_TLS_CA` | — | Custom CA cert (self-signed TLS) |
| `--json` | — | false | JSON output for scripting |
| `--send-timeout <ms>` | — | 300000 | Timeout for send command |
Flags override environment variables. If --workspace is omitted, the first available workspace is used automatically.
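
The flag-over-env precedence can be sketched in plain shell. This is an illustrative sketch of the documented behavior, not the CLI's actual implementation; the variable names `flag_url` is hypothetical:

```shell
# A flag value, when present, wins over the corresponding environment
# variable; the env var is only a fallback.
CRAFT_SERVER_URL="ws://env-host:9100"   # from the environment
flag_url="ws://flag-host:9100"          # from --url on the command line
url="${flag_url:-$CRAFT_SERVER_URL}"    # flag first, env as fallback
echo "$url"                             # ws://flag-host:9100
```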

Commands

Info & Health

| Command | Channel | Description |
|---|---|---|
| `ping` | handshake only | Verify connectivity — prints clientId and latency |
| `health` | `credentials:healthCheck` | Check credential store health |
| `versions` | `system:versions` | Show server runtime versions |

Resource Listing

| Command | Channel | Description |
|---|---|---|
| `workspaces` | `workspaces:get` | List all workspaces (id, name, path) |
| `sessions` | `sessions:get` | List sessions (id, name, preview, status) |
| `connections` | `LLM_Connection:list` | List LLM connections |
| `sources` | `sources:get` | List configured sources |

Session Operations

| Command | Channel | Description |
|---|---|---|
| `session create [--name <n>] [--mode <m>]` | `sessions:create` | Create a new session |
| `session messages <id>` | `sessions:getMessages` | Print message history |
| `session delete <id>` | `sessions:delete` | Delete a session |
| `cancel <id>` | `sessions:cancel` | Cancel in-progress processing |

Send Message (Streaming)

craft-cli send <session-id> <message>
echo "Summarize this" | craft-cli send <session-id>
The send command subscribes to session:event and streams the AI response to stdout in real time:
| Event Type | Output |
|---|---|
| `text_delta` | Text appended inline |
| `tool_start` | `[tool: name — intent]` marker |
| `tool_result` | Tool output (truncated to 200 chars) |
| `error` | Error to stderr, exit code 1 |
| `complete` | Newline, exit code 0 |
| `interrupted` | `[interrupted]`, exit code 130 |
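
The exit codes above make `send` straightforward to wrap in scripts. A minimal sketch; `handle_exit` is a hypothetical helper you would call with `$?` after `craft-cli send` returns:

```shell
# Map send's documented exit codes (0 complete, 1 error, 130 interrupted)
# to script actions. handle_exit is an illustrative helper, not part of the CLI.
handle_exit() {
  case "$1" in
    0)   echo "complete" ;;
    130) echo "interrupted" ;;
    *)   echo "error (exit $1)" >&2; return "$1" ;;
  esac
}
handle_exit 130   # prints "interrupted"
```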

Raw RPC

craft-cli invoke <channel> [json-args...]   # Send any RPC channel
craft-cli listen <channel>                   # Subscribe to push events
invoke sends a request to any channel and prints the response. listen subscribes to a push channel and prints events as they arrive (Ctrl+C to stop).
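
Because listen prints events as plain lines, its output composes with ordinary line tools. The sketch below assumes (this reference does not specify it) that each event is one JSON object per line with a type field; the stream here is simulated with printf:

```shell
# Count text_delta events in a captured event stream. The one-object-per-line
# event shape is an assumption for illustration, not a documented format.
deltas=$(printf '%s\n' \
  '{"type":"text_delta","text":"Hello"}' \
  '{"type":"tool_start","name":"Bash"}' \
  '{"type":"complete"}' |
  grep -c '"type":"text_delta"')
echo "$deltas"   # 1
```

In practice you would pipe `craft-cli listen session:event` into the same filter instead of the simulated stream.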

Run (Self-Contained)

craft-cli run <prompt>
craft-cli run --workspace-dir ./project --source github "List open PRs"
The run command is fully self-contained — it spawns a headless server, creates a session, sends the prompt, streams the response, and exits. No separate server setup needed. An API key is resolved from --api-key, $LLM_API_KEY, or a provider-specific env var (e.g., $ANTHROPIC_API_KEY, $OPENAI_API_KEY).
| Flag | Default | Description |
|---|---|---|
| `--workspace-dir <path>` | — | Register a workspace directory before running |
| `--source <slug>` | — | Enable a source (repeatable) |
| `--output-format <fmt>` | `text` | Output format: `text` or `stream-json` |
| `--mode <mode>` | `allow-all` | Permission mode for the session |
| `--no-cleanup` | false | Skip session deletion on exit |
| `--server-entry <path>` | — | Custom server entry point |
LLM Configuration:
| Flag | Env Fallback | Default | Description |
|---|---|---|---|
| `--provider <name>` | `LLM_PROVIDER` | `anthropic` | Provider: anthropic, openai, google, openrouter, groq, mistral, xai, etc. |
| `--model <id>` | `LLM_MODEL` | (provider default) | Model ID (e.g., claude-sonnet-4-5-20250929, gpt-4o, gemini-2.0-flash) |
| `--api-key <key>` | `LLM_API_KEY` | (provider env) | API key — also checks provider-specific vars like `$OPENAI_API_KEY` |
| `--base-url <url>` | `LLM_BASE_URL` | — | Custom endpoint for proxies, OpenRouter, or self-hosted models |
# Multi-provider examples
craft-cli run --provider openai --model gpt-4o "Summarize this repo"
GOOGLE_API_KEY=... craft-cli run --provider google --model gemini-2.0-flash "Hello"
craft-cli run --provider anthropic --base-url https://openrouter.ai/api/v1 --api-key $OR_KEY "Hello"
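
The documented key-resolution order (--api-key, then $LLM_API_KEY, then the provider-specific variable) can be sketched in shell. This is an illustrative sketch, not the CLI's implementation:

```shell
# Resolve an API key the way run is documented to: the --api-key flag value
# first, then $LLM_API_KEY, then e.g. $OPENAI_API_KEY for --provider openai.
provider="openai"
flag_key=""                      # value of --api-key, empty if omitted
LLM_API_KEY=""
OPENAI_API_KEY="sk-example"
provider_var="$(printf '%s' "$provider" | tr '[:lower:]' '[:upper:]')_API_KEY"
eval "provider_key=\"\${$provider_var:-}\""
key="${flag_key:-${LLM_API_KEY:-$provider_key}}"
echo "$key"                      # sk-example
```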

Validate Server

# Against a running server
craft-cli --validate-server --url ws://127.0.0.1:9100 --token <token>

# Self-contained (auto-spawns a server)
craft-cli --validate-server
When no --url is provided, --validate-server automatically spawns a local headless server, runs the validation, and shuts it down. The validation is a 21-step integration test covering the full server lifecycle, including source and skill creation:
  1. Connect + handshake
  2. credentials:healthCheck
  3. system:versions
  4. system:homeDir
  5. workspaces:get
  6. sessions:get
  7. LLM_Connection:list
  8. sources:get
  9. sessions:create (temporary __cli-validate-* session)
  10. sessions:getMessages
  11. Send message + stream (text response)
  12. Send message + tool use (Bash tool)
  13. sources:create (temporary Cat Facts API source)
  14. Send + source mention (uses the created source)
  15. Send + skill create (writes SKILL.md via Bash)
  16. skills:get (verify skill appears)
  17. Send + skill mention (invokes the created skill)
  18. skills:delete (cleanup)
  19. sources:delete (cleanup)
  20. sessions:delete (cleanup)
  21. Disconnect
Note: This test mutates workspace state — it creates and deletes a temporary session, source, and skill. All resources are cleaned up on completion. The run continues past individual step failures and reports a summary at the end. Use --json for machine-readable output:
craft-cli --validate-server --json | jq '.results[] | select(.status == "FAIL")'

Troubleshooting

| Symptom | Cause | Fix |
|---|---|---|
| Connection timeout | Server not running | Verify the server URL and that the server is started |
| `AUTH_FAILED` | Wrong token | Check that `CRAFT_SERVER_TOKEN` matches |
| `PROTOCOL_VERSION_UNSUPPORTED` | Version mismatch | Update CLI and server |
| WebSocket error | Network/TLS issue | Use `--tls-ca` for self-signed certs |