Overview
OpenRouter acts as a proxy in front of all major AI providers. Configure one API key and access any model:
- All Anthropic Claude models
- All OpenAI GPT and o-series models
- All Google Gemini models
- Open-source models (Llama, Mistral, Qwen, DeepSeek, etc.)
- Automatic fallback if a provider is down
Setup
Get an API key
Sign up at openrouter.ai. New accounts receive free credits.
Environment Variables
Your OpenRouter API key. Format: sk-or-...
Configuration Example
- .env
- settings.yml
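A minimal .env sketch, assuming the tool reads the key from a variable named OPENROUTER_API_KEY (the variable name is an assumption; check your tool's settings reference):

```
# .env — hypothetical variable name
OPENROUTER_API_KEY=sk-or-...
```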
Using Models via OpenRouter
Reference models using the openrouter/ prefix or the full provider/model path:
Usage Examples
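Because the endpoint is OpenAI-compatible, a plain HTTP request is enough. A minimal Python sketch using only the standard library; the model slug, attribution values, and the OPENROUTER_API_KEY variable name are illustrative assumptions, not taken from this page:

```python
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for OpenRouter."""
    body = json.dumps({
        "model": model,  # e.g. "openrouter/auto" or a full provider/model path
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
        # Optional attribution headers (see Notes):
        "HTTP-Referer": "https://example.com",  # hypothetical app URL
        "X-Title": "My App",                    # hypothetical app name
    }
    return urllib.request.Request(API_URL, data=body, headers=headers)

if __name__ == "__main__" and "OPENROUTER_API_KEY" in os.environ:
    req = build_request("openrouter/auto", "Hello!", os.environ["OPENROUTER_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```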
Notes
- API endpoint: https://openrouter.ai/api/v1 (OpenAI-compatible)
- Status: Stable
- OpenRouter adds a small markup over provider prices (typically 10-15%).
- Supports the X-Title and HTTP-Referer headers for app attribution.
- Good for organizations that want one vendor relationship instead of managing many keys.
- OpenRouter’s “auto” model routes to the best available model for your prompt.
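To make the last note concrete: auto-routing is requested by naming openrouter/auto as the model in an otherwise ordinary request body (the prompt text here is a placeholder):

```
{
  "model": "openrouter/auto",
  "messages": [{"role": "user", "content": "Summarize this paragraph..."}]
}
```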
Related
- AI Providers Overview - Compare all 37 supported providers
- Anthropic - Direct Anthropic API for lower latency
- Together AI - Direct open-source model hosting alternative
- profclaw models - Browse and alias available models