Fireworks AI specializes in fast inference for open-source models with serverless and dedicated deployment options.

Supported Models

| Model | ID | Context | Notes |
| --- | --- | --- | --- |
| Llama 3.1 70B | accounts/fireworks/models/llama-v3p1-70b-instruct | 128K | Default |
| Llama 3.1 8B | accounts/fireworks/models/llama-v3p1-8b-instruct | 128K | Fast/cheap |
| Mixtral 8x22B | accounts/fireworks/models/mixtral-8x22b-instruct | 65K | Large MoE |
| DeepSeek R1 | accounts/fireworks/models/deepseek-r1 | 64K | Reasoning |

Setup

1. Get an API key

   Sign up at fireworks.ai. Free trial credits available.

2. Set the environment variable

   export FIREWORKS_API_KEY=fw_...

3. Verify

   profclaw doctor --provider fireworks
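Before running the verify step, it can help to confirm the key is set and has the expected `fw_` prefix. A minimal shell sketch (a format check only; it does not prove the key is valid with Fireworks):

```shell
# Sanity-check FIREWORKS_API_KEY before running `profclaw doctor`.
check_fireworks_key() {
  if [ -z "$FIREWORKS_API_KEY" ]; then
    echo "FIREWORKS_API_KEY is not set"
  elif [ "${FIREWORKS_API_KEY#fw_}" = "$FIREWORKS_API_KEY" ]; then
    # Stripping the fw_ prefix changed nothing, so the prefix is missing.
    echo "warning: key does not start with fw_"
  else
    echo "key format looks OK"
  fi
}

check_fireworks_key
```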

Environment Variables

FIREWORKS_API_KEY (string, required)
Your Fireworks AI API key. Format: fw_...

Configuration Example

FIREWORKS_API_KEY=fw_...

Model Aliases

| Alias | Model |
| --- | --- |
| fireworks | accounts/fireworks/models/llama-v3p1-70b-instruct |
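Conceptually, the alias is expanded to its full model ID before any request is made, and full model IDs pass through unchanged. A minimal Python sketch of that resolution (illustrative only, not profclaw's actual implementation):

```python
# Hypothetical alias table mirroring the Model Aliases section above.
ALIASES = {
    "fireworks": "accounts/fireworks/models/llama-v3p1-70b-instruct",
}

def resolve_model(name: str) -> str:
    """Expand a short alias to its full Fireworks model ID; pass full IDs through."""
    return ALIASES.get(name, name)

print(resolve_model("fireworks"))
```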

Usage Examples

# Default alias
profclaw chat --model fireworks "Analyze this log file"

# Full model ID
profclaw chat --model fireworks/accounts/fireworks/models/llama-v3p1-8b-instruct "Quick summary"

Notes

  • API endpoint: https://api.fireworks.ai/inference/v1 (OpenAI-compatible)
  • Status: Beta
  • Fireworks supports fine-tuned model deployment and dedicated instances.
  • Model IDs use the accounts/fireworks/models/ prefix format.
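Because the endpoint is OpenAI-compatible, you can also call it directly over HTTP. A minimal standard-library sketch that builds a chat-completion request (the model ID comes from the table above; the prompt is illustrative, and sending is left commented out so the sketch runs without a real key or network access):

```python
import json
import os
import urllib.request

# Fireworks' OpenAI-compatible endpoint, from the Notes above.
API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"

def build_request(prompt: str,
                  model: str = "accounts/fireworks/models/llama-v3p1-8b-instruct"):
    """Build an OpenAI-style chat completion request for Fireworks."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {os.environ.get('FIREWORKS_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )

req = build_request("Hello")
# urllib.request.urlopen(req) would actually send the request.
print(req.full_url)
```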