Supported AI Providers
OpenCrabs supports more than ten AI providers out of the box. Switch between them at any time with the /models command, from the TUI or any channel.
| Provider | Auth | Models | Streaming | Tools | Notes |
|---|---|---|---|---|---|
| Anthropic Claude | API key | Claude Opus 4.6, Sonnet 4.5, Haiku 4.5 | Yes | Yes | Extended thinking, 200K context |
| OpenAI | API key | GPT-5 Turbo, GPT-5, o3/o4-mini | Yes | Yes | Models fetched live |
| GitHub Copilot | OAuth | GPT-4o, Claude Sonnet 4+ | Yes | Yes | Uses your Copilot subscription — no API charges |
| OpenRouter | API key | 400+ models | Yes | Yes | Free models available (DeepSeek-R1, Llama 3.3, etc.) |
| Google Gemini | API key | Gemini 2.5 Flash, 2.0, 1.5 Pro | Yes | Yes | 1M+ context, vision, image generation |
| MiniMax | API key | M2.7, M2.5, M2.1, Text-01 | Yes | Yes | Competitive pricing, auto-configured vision |
| z.ai GLM | API key | GLM-4.5 through GLM-5 Turbo | Yes | Yes | General API + Coding API endpoints |
| Claude CLI | CLI auth | Via claude binary | Yes | Yes | Uses your Claude Code subscription |
| OpenCode CLI | None | Free models (Mimo, etc.) | Yes | Yes | Free — no API key or subscription needed |
| Custom | Optional | Any | Yes | Yes | Ollama, LM Studio, Groq, NVIDIA, any OpenAI-compatible API |
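The Custom provider speaks the standard OpenAI-compatible API, so any server that implements `/v1/chat/completions` works. As a minimal sketch, the snippet below builds such a request against a local Ollama install; the base URL, port, and model name are examples for that setup, not OpenCrabs defaults.

```python
import json
from urllib.request import Request

# Any OpenAI-compatible server works here: Ollama (default port 11434),
# LM Studio (default port 1234), Groq, NVIDIA, etc. URL and model name
# below are illustrative, assuming a local Ollama install.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> Request:
    """Construct a standard OpenAI-style /v1/chat/completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama3.3", "Hello!")
# urllib.request.urlopen(req) would send it; omitted since it needs a
# running server. Add an Authorization header for providers that want a key.
```

Because the request shape is identical across these backends, switching between Ollama, LM Studio, or a hosted provider is just a matter of changing the base URL and, optionally, the API key.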
How It Works
- One provider active at a time per session; switch with /models
- Per-session memory: each session remembers its provider and model
- Fallback chain: configure automatic failover when the primary provider goes down
- Models fetched live: no binary update needed when providers add new models
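The fallback-chain idea above can be sketched as a loop that tries each provider in order until one answers. This is an illustrative sketch only; the function names and call signature are hypothetical, not the OpenCrabs API.

```python
def complete_with_fallback(providers, prompt):
    """Try each (name, call) pair in order; return the first success."""
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # e.g. rate limit, timeout, network error
            errors.append((name, exc))
    raise RuntimeError(f"all providers failed: {errors}")

# Toy providers: the primary is down, so the chain falls through.
def flaky_primary(prompt):
    raise TimeoutError("provider down")

def backup(prompt):
    return f"echo: {prompt}"

chain = [("primary", flaky_primary), ("backup", backup)]
name, reply = complete_with_fallback(chain, "hi")
# name == "backup", reply == "echo: hi"
```

A real implementation would also distinguish retryable errors (timeouts, 429s) from hard failures (bad API key) before moving down the chain.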
See Provider Setup for configuration details and API key setup.