OpenRouter is a model gateway that provides access to 200+ models from dozens of providers through a single OpenAI-compatible API. It is useful for model experimentation and fallback routing.

Overview

OpenRouter acts as a proxy in front of all major AI providers. Configure one API key and access any model:
  • All Anthropic Claude models
  • All OpenAI GPT and o-series models
  • All Google Gemini models
  • Open-source models (Llama, Mistral, Qwen, DeepSeek, etc.)
  • Automatic fallback if a provider is down
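Because the gateway is OpenAI-compatible, a chat request can be built directly against its endpoint. A minimal sketch in Python using only the standard library (the helper name and structure are illustrative, not part of profclaw or any OpenRouter SDK):

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(api_key, model, prompt):
    """Build an OpenAI-compatible chat completion request for OpenRouter."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,  # e.g. "anthropic/claude-opus-4-6"
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(body).encode("utf-8"),
        headers=headers,
        method="POST",
    )

# Sending it requires a valid key and network access:
# with urllib.request.urlopen(build_chat_request(key, "openai/gpt-4o", "Hello")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Swapping the `model` string is the only change needed to target a different provider, which is the point of the gateway.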

Setup

1. Get an API key

   Sign up at openrouter.ai. Free credits on signup.

2. Set the environment variable

   export OPENROUTER_API_KEY=sk-or-...

3. Use any model

   Reference models by their OpenRouter IDs (e.g., anthropic/claude-opus-4-6).
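Tools that read the key at startup can fail fast with a clear message instead of surfacing an opaque 401 later. A small illustrative check (the helper name is an assumption, not part of profclaw):

```python
import os

def require_openrouter_key():
    """Return OPENROUTER_API_KEY, failing fast if it is missing or malformed."""
    key = os.environ.get("OPENROUTER_API_KEY", "")
    if not key:
        raise RuntimeError("OPENROUTER_API_KEY is not set")
    if not key.startswith("sk-or-"):
        raise RuntimeError("OPENROUTER_API_KEY should start with 'sk-or-'")
    return key
```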

Environment Variables

OPENROUTER_API_KEY (string, required)
Your OpenRouter API key. Format: sk-or-...

Configuration Example

OPENROUTER_API_KEY=sk-or-...

Using Models via OpenRouter

Reference models using the openrouter/ prefix or the full provider/model path:
# These all work:
profclaw chat --model openrouter/anthropic/claude-opus-4-6 "Hello"
profclaw chat --model openrouter/openai/gpt-4o "Hello"
profclaw chat --model openrouter/meta-llama/llama-3.3-70b-instruct "Hello"
profclaw chat --model openrouter/google/gemini-pro-1.5 "Hello"
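The openrouter/ prefix selects the gateway; the remainder is the OpenRouter model ID passed through to the API. That split can be sketched as follows (a hypothetical helper, shown only to illustrate the ID scheme):

```python
def to_openrouter_model_id(model):
    """Strip the gateway prefix from a model reference.

    'openrouter/anthropic/claude-opus-4-6' -> 'anthropic/claude-opus-4-6'
    """
    prefix = "openrouter/"
    if model.startswith(prefix):
        return model[len(prefix):]
    # A bare provider/model path is already a valid OpenRouter ID.
    return model
```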

Usage Examples

# Access Claude via OpenRouter (useful when direct Anthropic key not set)
profclaw chat --model openrouter/anthropic/claude-sonnet-4-5 "Review this PR"

# Model auto-routing
profclaw chat --model openrouter/auto "Use the best available model"

Notes

  • API endpoint: https://openrouter.ai/api/v1 (OpenAI-compatible)
  • Status: Stable
  • OpenRouter charges a small fee on top of provider list prices; see openrouter.ai for current rates.
  • Supports the X-Title and HTTP-Referer headers for app attribution.
  • Good for organizations that want one vendor relationship instead of managing many keys.
  • OpenRouter's "auto" model routes to the best available model for your prompt.
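The attribution headers noted above ride alongside the auth header on each request. A minimal sketch (app name and URL are placeholders; only X-Title and HTTP-Referer come from the source):

```python
def openrouter_headers(api_key, app_name=None, app_url=None):
    """Auth header plus optional OpenRouter app-attribution headers."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    if app_name:
        headers["X-Title"] = app_name       # human-readable app name
    if app_url:
        headers["HTTP-Referer"] = app_url   # URL identifying the calling app
    return headers
```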