# Supported Providers

orxhestra supports all major LLM providers via LangChain integrations.
| Provider | Name | Package | Install |
| --- | --- | --- | --- |
| OpenAI | `openai` | `langchain-openai` | `pip install orxhestra[openai]` |
| Azure OpenAI | `azure_openai` | `langchain-openai` | `pip install orxhestra[openai]` |
| Azure AI | `azure_ai` | `langchain-azure-ai` | `pip install langchain-azure-ai` |
| Anthropic | `anthropic` | `langchain-anthropic` | `pip install orxhestra[anthropic]` |
| Anthropic Bedrock | `anthropic_bedrock` | `langchain-aws` | `pip install langchain-aws` |
| Google GenAI | `google` | `langchain-google-genai` | `pip install orxhestra[google]` |
| Google Vertex AI | `google_vertexai` | `langchain-google-vertexai` | `pip install langchain-google-vertexai` |
| AWS Bedrock | `bedrock_converse` | `langchain-aws` | `pip install langchain-aws` |
| Mistral | `mistralai` | `langchain-mistralai` | `pip install langchain-mistralai` |
| Cohere | `cohere` | `langchain-cohere` | `pip install langchain-cohere` |
| Fireworks | `fireworks` | `langchain-fireworks` | `pip install langchain-fireworks` |
| Together | `together` | `langchain-together` | `pip install langchain-together` |
| Groq | `groq` | `langchain-groq` | `pip install langchain-groq` |
| NVIDIA NIM | `nvidia` | `langchain-nvidia-ai-endpoints` | `pip install langchain-nvidia-ai-endpoints` |
| HuggingFace | `huggingface` | `langchain-huggingface` | `pip install langchain-huggingface` |
| DeepSeek | `deepseek` | `langchain-deepseek` | `pip install langchain-deepseek` |
| Ollama | `ollama` | `langchain-ollama` | `pip install langchain-ollama` |
| xAI (Grok) | `xai` | `langchain-xai` | `pip install langchain-xai` |
| IBM watsonx | `ibm` | `langchain-ibm` | `pip install langchain-ibm` |
| Perplexity | `perplexity` | `langchain-perplexity` | `pip install langchain-perplexity` |
| OpenRouter | `openrouter` | `langchain-openrouter` | `pip install langchain-openrouter` |
| Upstage | `upstage` | `langchain-upstage` | `pip install langchain-upstage` |

## CLI Auto-Detection

The CLI automatically detects the provider from the model name:
```shell
orx --model gpt-4o            # → openai
orx --model claude-sonnet-4-6 # → anthropic
orx --model gemini-2.0-flash  # → google
orx --model mistral-large     # → mistralai
orx --model deepseek-chat     # → deepseek
orx --model grok-2            # → xai
orx --model command-r-plus    # → cohere
```

## YAML Configuration

```yaml
defaults:
  model:
    provider: anthropic
    name: claude-sonnet-4-6
    temperature: 0.7

# Or use named models
models:
  fast:
    provider: groq
    name: llama-3.3-70b-versatile
  smart:
    provider: anthropic
    name: claude-opus-4-6

agents:
  researcher:
    model: fast
  writer:
    model: smart
```
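Conceptually, a named model is just an alias: each agent's `model` field is resolved against the `models` map before the agent is built. A minimal sketch of that resolution, using the values from the YAML above (illustrative only, not orxhestra's internals):

```python
# Illustrative alias resolution, mirroring the YAML configuration above.
models = {
    "fast": {"provider": "groq", "name": "llama-3.3-70b-versatile"},
    "smart": {"provider": "anthropic", "name": "claude-opus-4-6"},
}
agents = {"researcher": {"model": "fast"}, "writer": {"model": "smart"}}

def resolve_model(agent: str) -> dict:
    """Look up the agent's model alias in the models map."""
    alias = agents[agent]["model"]
    return models[alias]
```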

## Azure OpenAI

Azure OpenAI requires an explicit provider, since its model names overlap with OpenAI's:
```yaml
defaults:
  model:
    provider: azure_openai
    name: gpt-4o
    azure_endpoint: https://my-resource.openai.azure.com/
    api_version: "2024-12-01-preview"
    azure_deployment: gpt-4o
```

## Custom Providers

Register a custom provider or use a dotted import path:
```python
from orxhestra.composer.builders.models import register

register("my_provider", MyCustomChatModel)
```
Or in YAML:
```yaml
defaults:
  model:
    provider: my_module.MyCustomChatModel
    name: my-model
```
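A registry with a dotted-path fallback can be sketched in a few lines. The following is an illustrative model of the pattern, not orxhestra's actual code: known provider names hit the registry, and anything containing a dot is treated as an import path.

```python
import importlib

# Illustrative provider registry with a dotted-import-path fallback.
_REGISTRY: dict[str, type] = {}

def register(name: str, cls: type) -> None:
    """Map a provider name to a chat-model class."""
    _REGISTRY[name] = cls

def resolve(provider: str) -> type:
    """Return a registered class, or import one from a dotted path."""
    if provider in _REGISTRY:
        return _REGISTRY[provider]
    module_path, _, class_name = provider.rpartition(".")
    return getattr(importlib.import_module(module_path), class_name)
```

Either way, the resolved class is instantiated with the remaining model settings (`name`, `temperature`, and so on).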

## Environment Variables

| Provider | Env Var |
| --- | --- |
| OpenAI | `OPENAI_API_KEY` |
| Azure OpenAI | `AZURE_OPENAI_API_KEY` |
| Anthropic | `ANTHROPIC_API_KEY` |
| Google | `GOOGLE_API_KEY` |
| Mistral | `MISTRAL_API_KEY` |
| Groq | `GROQ_API_KEY` |
| DeepSeek | `DEEPSEEK_API_KEY` |
| xAI | `XAI_API_KEY` |
| Cohere | `COHERE_API_KEY` |
| Together | `TOGETHER_API_KEY` |
| Fireworks | `FIREWORKS_API_KEY` |
| NVIDIA | `NVIDIA_API_KEY` |
| Perplexity | `PPLX_API_KEY` |
| OpenRouter | `OPENROUTER_API_KEY` |
| Upstage | `UPSTAGE_API_KEY` |
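Set the variable for your chosen provider before invoking `orx`, for example (with a placeholder value):

```shell
# Export the Anthropic key for the current shell session (placeholder value).
export ANTHROPIC_API_KEY="your-key-here"
```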