Perplexity’s online models have built-in internet access and return grounded answers with citations. Unlike standard LLMs, they can answer questions about current events.

Supported Models

Model        ID                                  Context  Web Search  Notes
Sonar Huge   llama-3.1-sonar-huge-128k-online    128K     Yes         Best quality
Sonar Large  llama-3.1-sonar-large-128k-online   128K     Yes         Balanced
Sonar Small  llama-3.1-sonar-small-128k-online   128K     Yes         Fastest/cheapest

Setup

1. Get an API key

   Sign up at perplexity.ai and go to API settings.

2. Set the environment variable

   export PERPLEXITY_API_KEY=pplx-...

3. Verify

   profclaw doctor --provider perplexity

Environment Variables

PERPLEXITY_API_KEY (string, required)
  Your Perplexity API key. Format: pplx-...

Configuration Example

PERPLEXITY_API_KEY=pplx-...

Model Aliases

Alias       Model
perplexity  llama-3.1-sonar-huge-128k-online
pplx-fast   llama-3.1-sonar-small-128k-online

Usage Examples

# Current events / web search
profclaw chat --model perplexity "What are the latest changes to the TypeScript spec?"

# Fast web lookup
profclaw chat --model pplx-fast "Current Node.js LTS version?"

Notes

  • Status: Beta. Perplexity models have unique behavior (web search, citations) that differs from standard chat models.
  • API endpoint: https://api.perplexity.ai (OpenAI-compatible)
  • All Sonar Online models have real-time internet access built in.
  • Responses include source citations automatically.
  • Not recommended for tasks requiring deterministic, non-web outputs.
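Because the endpoint is OpenAI-compatible, you can also call it directly, outside of profclaw. The sketch below builds a standard chat-completions request against the documented endpoint using only the standard library; the request body follows the OpenAI chat-completions shape, and the response parsing (and any citation fields beyond the standard shape) is left out as it depends on your needs:

```python
import json
import os
import urllib.request

API_URL = "https://api.perplexity.ai/chat/completions"

def build_request(prompt: str,
                  model: str = "llama-3.1-sonar-small-128k-online") -> urllib.request.Request:
    """Build a chat-completions POST request for Perplexity's OpenAI-compatible API."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ.get('PERPLEXITY_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# With a valid key, send via urllib.request.urlopen(req); the JSON response
# follows the OpenAI chat-completions shape (choices[0].message.content).
req = build_request("Current Node.js LTS version?")
```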