Supported AI Providers

OpenCrabs supports 11+ AI providers out of the box. Switch between them at any time via /models in the TUI or any channel.

| Provider | Auth | Models | Streaming | Tools | Notes |
|----------|------|--------|-----------|-------|-------|
| Anthropic Claude | API key | Claude Opus 4.6, Sonnet 4.5, Haiku 4.5 | Yes | Yes | Extended thinking, 200K context |
| OpenAI | API key | GPT-5 Turbo, GPT-5, o3/o4-mini | Yes | Yes | Models fetched live |
| GitHub Copilot | OAuth | GPT-4o, Claude Sonnet 4+ | Yes | Yes | Uses your Copilot subscription — no API charges |
| OpenRouter | API key | 400+ models | Yes | Yes | Free models available. Reasoning output support (Qwen 3.6 Plus, etc.) |
| Google Gemini | API key | Gemini 2.5 Flash, 2.0, 1.5 Pro | Yes | Yes | 1M+ context, vision, image generation |
| MiniMax | API key | M2.7, M2.5, M2.1, Text-01 | Yes | Yes | Competitive pricing, auto-configured vision |
| z.ai GLM | API key | GLM-4.5 through GLM-5 Turbo | Yes | Yes | General API + Coding API endpoints |
| Claude CLI | CLI auth | Via `claude` binary | Yes | Yes | Uses your Claude Code subscription |
| Qwen/DashScope | API key | qwen3.6-plus (default) | Yes | Yes | DashScope API-key provider (replaced OAuth rotation). Tool-call extraction from local-model text output (bare JSON, Claude-style XML, Qwen formats). Prompt caching via `cache_control`, rate-limit retry with exponential backoff |
| Ollama | Optional | Any Ollama model | Yes | Yes | Native local provider — run any model via the Ollama API |
| OpenCode CLI | None | Free models (Mimo, etc.) | Yes | Yes | Free — no API key or subscription needed |
| Custom | Optional | Any | Yes | Yes | LM Studio, Groq, NVIDIA, any OpenAI-compatible API |
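The Custom row works with any server that speaks the OpenAI chat-completions wire format. A minimal sketch of that request body follows; the base URL and model name are illustrative placeholders, not OpenCrabs defaults:

```python
# Any OpenAI-compatible server accepts this body at POST {BASE_URL}/chat/completions.
BASE_URL = "http://localhost:1234/v1"  # placeholder, e.g. a local LM Studio server


def chat_request(model: str, prompt: str) -> dict:
    """Build a chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # every provider in the table supports streaming
    }


body = chat_request("my-local-model", "Hello")
```

Because all of these providers share the same shape, pointing a custom provider at Groq, NVIDIA, or a local server is just a matter of changing the base URL and API key.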

How It Works

  • One provider active at a time per session — switch with /models
  • Per-session isolation — each session remembers its own provider and model. Changing provider in the TUI does not affect other active sessions (Telegram, Discord, Slack)
  • Fallback chain — configure automatic failover when the primary provider goes down
  • Models fetched live — no binary update needed when providers add new models
  • Function calling detection — OpenCrabs detects when a model doesn’t support tool use and warns you with a model switch suggestion, rather than silently failing
  • tool_choice: "auto" — sent automatically for OpenAI-compatible providers when tools are active, enabling function calling on models that require explicit opt-in
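The last point can be sketched as follows. This is an assumed shape, not OpenCrabs' internals: the tool schema (`read_file`) is a made-up example, and only the `tool_choice: "auto"` behavior is taken from the text above.

```python
def with_tools(body: dict, tools: list) -> dict:
    """Attach tools to an OpenAI-compatible request body.

    tool_choice "auto" is only sent when tools are active, matching the
    opt-in behavior described above.
    """
    if tools:
        return {**body, "tools": tools, "tool_choice": "auto"}
    return body


tools = [{
    "type": "function",
    "function": {
        "name": "read_file",  # hypothetical tool, for illustration only
        "description": "Read a file from disk",
        "parameters": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    },
}]

body = with_tools({"model": "gpt-5", "messages": []}, tools)
```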

OpenRouter Reasoning

For models that support extended reasoning (e.g. Qwen 3.6 Plus), OpenCrabs sends include_reasoning: true automatically when using OpenRouter. Thinking/reasoning output is displayed in collapsible sections:

▶ Thinking... (click to expand)
  The user wants to refactor...

Reasoning text wraps to screen width instead of truncating.
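A rough sketch of how a client can request and separate reasoning output over OpenRouter, assuming the standard OpenRouter response shape where reasoning arrives in a field alongside regular content (the model list here is illustrative, not OpenCrabs' actual capability table):

```python
REASONING_MODELS = {"qwen/qwen3.6-plus"}  # illustrative set only


def openrouter_body(model: str, messages: list) -> dict:
    """Build a request body, opting into reasoning output when supported."""
    body = {"model": model, "messages": messages, "stream": True}
    if model in REASONING_MODELS:
        body["include_reasoning"] = True
    return body


def split_delta(delta: dict) -> tuple[str, str]:
    """Split a streamed delta into (reasoning, content) strings."""
    return delta.get("reasoning") or "", delta.get("content") or ""


body = openrouter_body("qwen/qwen3.6-plus", [])
```

Collecting the two streams separately is what allows the reasoning text to be rendered in its own collapsible section while the answer renders normally.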

See Provider Setup for configuration details and API keys.