Together AI hosts hundreds of open-source models with fast inference. It’s a good choice when you need specific open-source models without running them locally.

Supported Models

| Model | ID | Context | Tools | Notes |
|---|---|---|---|---|
| Llama 3.3 70B Turbo | meta-llama/Llama-3.3-70B-Instruct-Turbo | 128K | Yes | Best general-purpose |
| Qwen 2.5 72B Turbo | Qwen/Qwen2.5-72B-Instruct-Turbo | 32K | Yes | Multilingual |
| DeepSeek R1 | deepseek-ai/DeepSeek-R1 | 128K | No | Reasoning model |
| Mixtral 8x22B | mistralai/Mixtral-8x22B-Instruct-v0.1 | 65K | Yes | Large MoE model |
Any model on the Together AI catalog can be referenced by its full ID.

Setup

1. Get an API key

   Sign up at api.together.xyz. Free credits are included on signup.

2. Set the environment variable

   export TOGETHER_API_KEY=...

3. Verify

   profclaw doctor --provider together
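Before running the doctor command, you can confirm that the variable is actually visible to child processes. A minimal sketch (the `key_status` helper is illustrative, not part of profclaw) that masks the key so it never lands in logs:

```python
import os

def key_status(env: dict) -> str:
    """Report whether TOGETHER_API_KEY is present, masking the value."""
    key = env.get("TOGETHER_API_KEY")
    if key:
        # Show only a short prefix so the secret is never printed in full
        return f"TOGETHER_API_KEY is set ({key[:4]}...)"
    return "TOGETHER_API_KEY is not set"

print(key_status(os.environ))
```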

Environment Variables

TOGETHER_API_KEY (string, required): Your Together AI API key.

Configuration Example

TOGETHER_API_KEY=...

Model Aliases

| Alias | Model |
|---|---|
| together | meta-llama/Llama-3.3-70B-Instruct-Turbo |
| together-qwen | Qwen/Qwen2.5-72B-Instruct-Turbo |

Usage Examples

profclaw chat --model together "Explain this algorithm"
profclaw chat --model together-qwen "Translate this to Chinese"

# Use any Together AI model directly
profclaw chat --model together/deepseek-ai/DeepSeek-R1 "Solve this math problem"

Notes

  • API endpoint: https://api.together.xyz/v1 (OpenAI-compatible)
  • Status: Beta
  • Together AI offers serverless and dedicated inference options.
  • Great for accessing models not available on other providers (e.g., specific fine-tuned variants).
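Because the endpoint is OpenAI-compatible, any HTTP client can call it directly without an SDK. A minimal sketch using only the Python standard library; the model ID comes from the table above, and the request is only sent when TOGETHER_API_KEY is set (otherwise the payload that would be sent is printed):

```python
import json
import os
import urllib.request

BASE_URL = "https://api.together.xyz/v1"  # OpenAI-compatible endpoint

def build_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("meta-llama/Llama-3.3-70B-Instruct-Turbo", "Say hello.")
api_key = os.environ.get("TOGETHER_API_KEY")

if api_key:
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
else:
    # No key set: just show the request that would be sent
    print(json.dumps(payload, indent=2))
```

The same base URL works with any OpenAI-style client library by overriding its base URL and passing the Together API key.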