This skill should be used when configuring or using the OpenCode CLI for headless LLM automation via subprocess invocation.
```bash
export ANTHROPIC_API_KEY="sk-..."        # For Anthropic
# OR
export GOOGLE_CLOUD_PROJECT="project-id" # For Vertex AI

opencode --version
opencode run --model google/gemini-2.5-pro "Hello, world"
```
OpenCode is a Go-based CLI that provides access to 75+ LLM providers through a unified interface. This skill focuses on the headless run command for automation and subprocess integration.
```bash
opencode run --model <provider/model> "<prompt>"
```
Key points:

- The `run` subcommand provides headless (non-interactive) mode
- Models are specified as `provider/model`; the prompt is passed as a positional argument (no `-p` flag)

```bash
# Using Anthropic Claude
opencode run --model anthropic/claude-sonnet-4-20250514 "Explain this code"

# Using Google Gemini
opencode run --model google/gemini-2.5-pro "Review this architecture"

# Using free Grok tier
opencode run --model opencode/grok-code "Generate tests for this function"
```
Models use the pattern `provider/model-name`:

| Provider | Example Model |
|---|---|
| `anthropic` | `anthropic/claude-sonnet-4-20250514` |
| `google` | `google/gemini-2.5-pro` |
| `opencode` | `opencode/grok-code` (free tier) |
| `openai` | `openai/gpt-4o` |
| `google-vertex` | `google-vertex/gemini-2.5-pro` |
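As a small sketch, a model ID following the `provider/model-name` pattern above can be split on the first slash for validation or logging (the helper name is illustrative, not part of OpenCode):

```python
def split_model(model_id: str) -> tuple[str, str]:
    """Split 'provider/model-name' into (provider, model-name)."""
    provider, _, name = model_id.partition("/")
    if not provider or not name:
        raise ValueError(f"expected provider/model-name, got {model_id!r}")
    return provider, name
```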
Config files are resolved from:

- the path in the `OPENCODE_CONFIG` environment variable
- `opencode.json` in the project root
- `~/.config/opencode/opencode.json` (global)

Configs are merged (project overrides global).
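A minimal Python sketch of this lookup-and-merge behavior, assuming a shallow merge and that `OPENCODE_CONFIG` takes highest precedence (that ordering is my assumption, not confirmed by OpenCode's docs):

```python
import json
import os
from pathlib import Path


def config_paths() -> list[Path]:
    """Candidate config files, lowest precedence first (order assumed)."""
    paths = [
        Path.home() / ".config" / "opencode" / "opencode.json",  # global
        Path("opencode.json"),                                   # project root
    ]
    custom = os.environ.get("OPENCODE_CONFIG")
    if custom:
        paths.append(Path(custom))  # explicit override
    return paths


def load_merged_config() -> dict:
    """Shallow-merge configs; later (higher-precedence) files win."""
    merged: dict = {}
    for path in config_paths():
        if path.is_file():
            merged.update(json.loads(path.read_text()))
    return merged
```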
```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "anthropic/claude-sonnet-4-5",
  "small_model": "anthropic/claude-haiku-4-5"
}
```
Credentials are stored in `~/.local/share/opencode/auth.json` after running `/connect` in TUI mode; alternatively, configure them via environment variables.
Load the appropriate reference for detailed configuration:
| Task | Reference File |
|---|---|
| Setting up Google Vertex AI | vertex-ai-setup.md |
| Configuring providers (Anthropic, OpenAI, etc.) | provider-config.md |
| Cloud providers (Deepseek, Kimi, Mistral, etc.) | cloud-providers.md |
| Local models (Ollama, LM Studio) | local-models.md |
| MCP server configuration | mcp-servers.md |
| Subprocess integration patterns | integration-patterns.md |
See vertex-ai-setup.md for Vertex AI configuration including environment variables and service account setup.
```python
import subprocess

result = subprocess.run(
    ["opencode", "run", "--model", "google/gemini-2.5-pro", prompt],
    capture_output=True,
    text=True,
    timeout=600,
)
output = result.stdout
```
Run `opencode --version` first to verify availability. See integration-patterns.md for complete patterns.
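The subprocess pattern above can be wrapped with basic error handling; a sketch under the assumption that a nonzero exit code signals failure (the function names are illustrative, not part of OpenCode):

```python
import subprocess


def build_command(model: str, prompt: str) -> list[str]:
    """Assemble the headless invocation as an argv list (no shell quoting)."""
    return ["opencode", "run", "--model", model, prompt]


def run_opencode(model: str, prompt: str, timeout: int = 600) -> str:
    """Run OpenCode headlessly; raise on nonzero exit, return stdout."""
    result = subprocess.run(
        build_command(model, prompt),
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    if result.returncode != 0:
        raise RuntimeError(f"opencode failed: {result.stderr.strip()}")
    return result.stdout
```

Passing an argv list (rather than a shell string) avoids quoting issues when prompts contain spaces or special characters.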
| Feature | OpenCode | Claude CLI |
|---|---|---|
| Headless mode | `run` subcommand | `-p` flag with stdin |
| Hooks/settings | Not supported | `--settings` flag |
| Directory access | Not supported | `--add-dir` flag |
| Tool pre-approval | Not supported | `--allowedTools` flag |
| Prompt input | Positional argument | Stdin or `-p` |
| Variable | Purpose |
|---|---|
| `OPENCODE_CONFIG` | Custom config file path |
| `GOOGLE_CLOUD_PROJECT` | GCP project for Vertex AI |
| `GOOGLE_APPLICATION_CREDENTIALS` | Service account JSON path |
| `VERTEX_LOCATION` | Vertex AI region |
Complete this checklist to verify a working installation:

- [ ] `opencode --version` prints a version number
- [ ] `opencode run --model google/gemini-2.5-pro "Say hello"` returns a response
- [ ] `cat ~/.config/opencode/opencode.json` shows the expected configuration
Tips:

- Use `opencode.json` for project-specific settings
- Use `{env:VAR_NAME}` syntax in config for secrets
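As a hedged illustration of the `{env:VAR_NAME}` syntax, a project `opencode.json` might reference an API key from the environment instead of hardcoding it (the `provider`/`options`/`apiKey` nesting here is an assumption about the schema; check the schema URL above):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "anthropic/claude-sonnet-4-5",
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}"
      }
    }
  }
}
```

This keeps secrets out of version control while letting the config file remain committed.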