OpenClaw Custom Model Configuration: Connect Any AI Model You Want
How-To · DevOps


via Dev.to DevOps (xujfcn)

One of OpenClaw's biggest advantages as an open-source AI coding assistant is that it doesn't lock you into any single model vendor. Through its flexible Provider system, you can freely connect to OpenAI, Anthropic, Google, DeepSeek, and other cloud models, run local models via Ollama or llama.cpp, or use an API gateway to access 627+ models with one key. This is the complete guide to configuring custom models in OpenClaw.

Understanding the Provider System

Every model in OpenClaw is referenced in provider/model format, such as openai/gpt-5.1-codex or anthropic/claude-opus-4-20250918. The Provider determines the API endpoint, the auth method, and the request format.

Built-in Providers

Provider        Description               Typical Models
openai          OpenAI official API       gpt-5.1, gpt-5.1-mini
anthropic       Anthropic official API    claude-opus-4, claude-sonnet-4
google          Google AI Studio          gemini-2.5-pro, gemini-2.5-flash
google-vertex   Google Cloud Vertex AI    gemini-2.5-pro (enterprise)
openai-codex    OpenAI Codex              codex-mini-latest
opencode
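To make the provider/model convention concrete, here is a minimal sketch of how such a reference can be split into its two parts. The function name and error handling are illustrative, not OpenClaw's actual API; the only assumption taken from the text is that the segment before the first slash names the Provider and the remainder names the model:

```python
def parse_model_ref(ref: str) -> tuple[str, str]:
    """Split a provider/model reference like 'openai/gpt-5.1-codex'.

    The provider prefix selects the endpoint, auth method, and request
    format; everything after the first '/' is the model identifier.
    """
    provider, sep, model = ref.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected provider/model, got {ref!r}")
    return provider, model

print(parse_model_ref("openai/gpt-5.1-codex"))
# → ('openai', 'gpt-5.1-codex')
print(parse_model_ref("anthropic/claude-opus-4-20250918"))
# → ('anthropic', 'claude-opus-4-20250918')
```

Splitting on only the first slash matters for gateway-style providers, where the model identifier itself may contain slashes.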

Continue reading on Dev.to DevOps

