# Supported AI Providers
OpenCrabs supports 11+ AI providers out of the box. Switch between them at any time via `/models` in the TUI or any channel.
| Provider | Auth | Models | Streaming | Tools | Notes |
|---|---|---|---|---|---|
| Anthropic Claude | API key | Claude Opus 4.6, Sonnet 4.5, Haiku 4.5 | Yes | Yes | Extended thinking, 200K context |
| OpenAI | API key | GPT-5 Turbo, GPT-5, o3/o4-mini | Yes | Yes | Models fetched live |
| GitHub Copilot | OAuth | GPT-4o, Claude Sonnet 4+ | Yes | Yes | Uses your Copilot subscription — no API charges |
| OpenRouter | API key | 400+ models | Yes | Yes | Free models available. Reasoning output support (Qwen 3.6 Plus, etc.) |
| Google Gemini | API key | Gemini 2.5 Flash, 2.0, 1.5 Pro | Yes | Yes | 1M+ context, vision, image generation |
| MiniMax | API key | M2.7, M2.5, M2.1, Text-01 | Yes | Yes | Competitive pricing, auto-configured vision |
| z.ai GLM | API key | GLM-4.5 through GLM-5 Turbo | Yes | Yes | General API + Coding API endpoints |
| Claude CLI | CLI auth | Via claude binary | Yes | Yes | Uses your Claude Code subscription |
| Qwen/DashScope | API key | qwen3.6-plus (default) | Yes | Yes | DashScope API-key provider (replaced OAuth rotation). Local model tool-call extraction from text (bare JSON, Claude-style XML, Qwen formats). Prompt caching via cache_control, rate limit retry with exponential backoff |
| Ollama | Optional | Any Ollama model | Yes | Yes | Native local provider — run any model via Ollama API |
| OpenCode CLI | None | Free models (Mimo, etc.) | Yes | Yes | Free — no API key or subscription needed |
| Custom | Optional | Any | Yes | Yes | LM Studio, Groq, NVIDIA, any OpenAI-compatible API |
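The local-model tool-call extraction noted in the Qwen/DashScope row can be sketched as follows. This is an illustrative reconstruction, not OpenCrabs internals: the function name and the exact tag/JSON shapes are assumptions, covering two of the formats mentioned (bare JSON and Claude-style XML).

```python
import json
import re

def extract_tool_call(text):
    """Extract a tool call from raw model text output (illustrative).

    Local models often emit tool calls as plain text rather than as
    structured API fields, so the text must be parsed heuristically.
    """
    # Claude-style XML: <tool_call>{...}</tool_call>
    m = re.search(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", text, re.DOTALL)
    if m:
        return json.loads(m.group(1))

    # Bare JSON: the whole message is a single JSON object with a "name" key
    stripped = text.strip()
    if stripped.startswith("{"):
        try:
            obj = json.loads(stripped)
            if "name" in obj:
                return obj
        except json.JSONDecodeError:
            pass
    return None  # no tool call found; treat as plain text
```

A real implementation would also handle Qwen-specific formats and streamed partial output, but the fallback order (structured tags first, bare JSON second) is the core idea.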
## How It Works
- One provider active at a time per session — switch with `/models`
- Per-session isolation — each session remembers its own provider and model. Changing the provider in the TUI does not affect other active sessions (Telegram, Discord, Slack)
- Fallback chain — configure automatic failover when the primary provider goes down
- Models fetched live — no binary update needed when providers add new models
- Function calling detection — OpenCrabs detects when a model doesn't support tool use and warns you with a suggested model switch, rather than silently failing
- `tool_choice: "auto"` — sent automatically for OpenAI-compatible providers when tools are active, enabling function calling on models that require explicit opt-in
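The `tool_choice` behavior above can be sketched as a small request builder. This is a minimal sketch under assumed names (`build_chat_request` is not an OpenCrabs function), showing only the opt-in logic:

```python
def build_chat_request(model, messages, tools=None):
    """Build an OpenAI-compatible chat completion request body.

    Illustrative sketch: when tools are active, tool_choice "auto"
    is added so models that require explicit opt-in still perform
    function calling; when no tools are active, neither key is sent.
    """
    body = {"model": model, "messages": messages, "stream": True}
    if tools:
        body["tools"] = tools
        body["tool_choice"] = "auto"
    return body
```

Omitting both keys when no tools are active matters: some OpenAI-compatible servers reject `tool_choice` without an accompanying `tools` array.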
## OpenRouter Reasoning
For models that support extended reasoning (e.g. Qwen 3.6 Plus), OpenCrabs sends `include_reasoning: true` automatically when using OpenRouter. Thinking/reasoning output is displayed in collapsible sections:
▶ Thinking... (click to expand)
The user wants to refactor...
Reasoning text wraps to screen width instead of truncating.
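The two behaviors above — opting in to reasoning output and wrapping it to screen width — can be sketched like this. Function names and the `supports_reasoning` flag are illustrative assumptions, not OpenCrabs APIs:

```python
import textwrap

def build_openrouter_request(model, messages, supports_reasoning=False):
    """Sketch of an OpenRouter request body (illustrative names).

    `supports_reasoning` would be looked up from model metadata;
    when set, include_reasoning asks OpenRouter to return the
    model's thinking output alongside the final answer.
    """
    body = {"model": model, "messages": messages}
    if supports_reasoning:
        body["include_reasoning"] = True
    return body

def render_reasoning(text, screen_width):
    """Wrap reasoning text to the screen width instead of truncating."""
    return textwrap.fill(text, width=screen_width)
```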
See Provider Setup for configuration details and API keys.