## Supported Models
| Model | Bedrock ID | Provider | Notes |
|---|---|---|---|
| Claude Sonnet 3.5 v2 | anthropic.claude-3-5-sonnet-20241022-v2:0 | Anthropic | Default |
| Claude Haiku 3.5 | anthropic.claude-3-5-haiku-20241022-v1:0 | Anthropic | Fast/cheap |
| Llama 3.1 70B | meta.llama3-1-70b-instruct-v1:0 | Meta | Open-source |
| Titan Text G1 | amazon.titan-text-express-v1 | Amazon | Native AWS |
## Setup

1. **Enable models in the AWS Console.** Go to AWS Console > Bedrock > Model access and request access to the models you need.
2. **Create IAM credentials.** Create an IAM user or role with the `AmazonBedrockFullAccess` policy, or a scoped policy that allows only specific models.
3. **Set environment variables.** Export the variables listed in the next section.
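A scoped alternative to `AmazonBedrockFullAccess` might look like the sketch below. The region and model ARN are examples only; substitute the models you actually enabled.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-5-sonnet-20241022-v2:0"
    }
  ]
}
```

Foundation-model ARNs have an empty account field because the models are AWS-managed resources.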
## Environment Variables

| Variable | Description |
|---|---|
| `AWS_ACCESS_KEY_ID` | AWS access key ID. Not needed when using IAM roles. |
| `AWS_SECRET_ACCESS_KEY` | AWS secret access key. Not needed when using IAM roles. |
| `AWS_REGION` | AWS region where your Bedrock models are available (e.g., `us-east-1`, `eu-west-1`). Defaults to `us-east-1`. |
| `AWS_SESSION_TOKEN` | Temporary session token for STS-based credentials. |
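With temporary STS credentials, all four variables are typically exported together. The values below are placeholders, not working credentials:

```shell
# Placeholder values -- substitute the credentials returned by STS.
export AWS_ACCESS_KEY_ID="ASIAEXAMPLEKEYID"
export AWS_SECRET_ACCESS_KEY="examplesecretkey"
export AWS_SESSION_TOKEN="exampletoken"
export AWS_REGION="us-east-1"
```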
## Configuration Example

- `.env` (IAM user)
- IAM Role (EC2/ECS)
- `settings.yml`
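For the IAM-user case, the `.env` file could look like this (placeholder values; long-lived IAM user keys start with `AKIA`):

```ini
AWS_ACCESS_KEY_ID=AKIAEXAMPLEKEYID
AWS_SECRET_ACCESS_KEY=examplesecretkey
AWS_REGION=us-east-1
```

When running on EC2/ECS with an IAM role, omit the key variables entirely and set only `AWS_REGION`.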
## Model Aliases

| Alias | Bedrock Model ID |
|---|---|
| `bedrock` | anthropic.claude-3-5-sonnet-20241022-v2:0 |
| `bedrock-haiku` | anthropic.claude-3-5-haiku-20241022-v1:0 |
| `bedrock-llama` | meta.llama3-1-70b-instruct-v1:0 |
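The alias table above amounts to a small lookup. A sketch of how such aliases could be resolved in code (the helper name is illustrative, not part of any API):

```python
# Alias -> Bedrock model ID, mirroring the table above.
BEDROCK_ALIASES = {
    "bedrock": "anthropic.claude-3-5-sonnet-20241022-v2:0",
    "bedrock-haiku": "anthropic.claude-3-5-haiku-20241022-v1:0",
    "bedrock-llama": "meta.llama3-1-70b-instruct-v1:0",
}


def resolve_model_id(name: str) -> str:
    """Return the Bedrock model ID for an alias; pass full IDs through unchanged."""
    return BEDROCK_ALIASES.get(name, name)
```

Passing unknown names through unchanged lets callers use either an alias or a full Bedrock model ID interchangeably.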
## Usage Examples
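As a sketch, any model from the table above can be invoked directly through `boto3`'s Converse API. The helper names here are illustrative, and the final call requires valid AWS credentials and model access:

```python
def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Assemble keyword arguments for the Bedrock runtime Converse API."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }


def ask_bedrock(prompt: str,
                model_id: str = "anthropic.claude-3-5-sonnet-20241022-v2:0",
                region: str = "us-east-1") -> str:
    """Send a single-turn prompt to Bedrock and return the text reply."""
    import boto3  # imported here so the request builder above stays dependency-free

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(**build_converse_request(model_id, prompt))
    return response["output"]["message"]["content"][0]["text"]
```

The `region` parameter is also how per-request cross-region inference works in practice: create the client against whichever region hosts the model you want.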
## Notes
- Bedrock is stable and recommended for enterprise AWS-native deployments.
- Bedrock pricing is per-token, similar to direct API pricing, plus AWS fees.
- Supports VPC endpoints for fully private traffic (no internet required).
- Use AWS Service Control Policies (SCPs) to restrict which models are accessible.
- Cross-region inference is supported: specify a different region per request if needed.
## Related
- AI Providers Overview - Compare all 37 supported providers
- Azure OpenAI - Microsoft Azure alternative for enterprise
- Anthropic - Direct Anthropic API for lower complexity
- profclaw provider - Add and test providers from the CLI