SambaNova provides enterprise-grade inference on its custom RDU (Reconfigurable Dataflow Unit) hardware, hosting Llama and other open-source models with high throughput and competitive pricing.

Supported Models

SambaNova hosts various open-source models. Common options:
| Model | Notes |
| --- | --- |
| Meta-Llama-3.1-70B-Instruct | General purpose |
| Meta-Llama-3.1-405B-Instruct | Largest open model |
| Llama-3.2-90B-Vision-Instruct | Multimodal |
Check the SambaNova model catalog for the current list.

Setup

1. Get an API key

   Sign up at cloud.sambanova.ai.

2. Set the environment variable

   ```shell
   export SAMBANOVA_API_KEY=...
   ```

3. Configure in profClaw

   SambaNova uses an OpenAI-compatible endpoint. Configure it as a custom OpenAI provider:

   ```shell
   OPENAI_API_KEY=your-sambanova-key
   OPENAI_BASE_URL=https://api.sambanova.ai/v1
   ```
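Because the endpoint follows the OpenAI convention, any OpenAI-style client can target it by swapping the base URL. The sketch below builds (but does not send) a chat-completions request against SambaNova's endpoint using only the Python standard library; `build_chat_request` is a hypothetical helper written for illustration, and the `/chat/completions` route is assumed from the standard OpenAI-compatible layout.

```python
import json
import os
import urllib.request

# Hypothetical helper: assemble an OpenAI-style chat-completions
# request against SambaNova's endpoint. The request object is only
# constructed here, not sent, so no network access or real key is
# needed to run this sketch.
def build_chat_request(model, prompt,
                       base_url="https://api.sambanova.ai/v1"):
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    api_key = os.environ.get("SAMBANOVA_API_KEY", "sk-placeholder")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("Meta-Llama-3.1-70B-Instruct", "Hello")
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) requires a valid `SAMBANOVA_API_KEY`; profClaw handles this plumbing for you once `OPENAI_BASE_URL` is set.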

Environment Variables

SAMBANOVA_API_KEY (string, required)

Your SambaNova Cloud API key. Configure via the OPENAI_API_KEY + OPENAI_BASE_URL pattern.

Configuration Example

```shell
# Use the openai provider with SambaNova's endpoint
OPENAI_API_KEY=your-sambanova-api-key
OPENAI_BASE_URL=https://api.sambanova.ai/v1
```

Usage Examples

```shell
# After configuring via OPENAI_BASE_URL
profclaw chat --model Meta-Llama-3.1-70B-Instruct "Explain this architecture"
profclaw chat --model Meta-Llama-3.1-405B-Instruct "Complex analysis task"
```
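To sanity-check the configuration outside profClaw, you can call the endpoint directly. This is a usage sketch, not output from the source: the `/chat/completions` route is assumed from the OpenAI-compatible convention, and it requires a valid key in `SAMBANOVA_API_KEY`.

```shell
# Direct request to SambaNova's OpenAI-compatible endpoint
curl https://api.sambanova.ai/v1/chat/completions \
  -H "Authorization: Bearer $SAMBANOVA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "Meta-Llama-3.1-70B-Instruct",
       "messages": [{"role": "user", "content": "Hello"}]}'
```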

Notes

  • SambaNova’s API is OpenAI-compatible. Configure it using the openai provider with a custom base URL.
  • Best suited for enterprise batch inference and high-throughput workloads.
  • Contact SambaNova for dedicated instance pricing.