
Get OpenClaw working with Ollama Cloud (no server management)
The core issue was that OpenClaw’s built-in “Ollama” provider is designed specifically for local servers. When you try to point it at a cloud URL, it attempts to “discover” models using local-only commands, which crashes the Gateway service.

The Solution: Use the OpenAI Provider

Instead of using the ollama provider type, we used the openai-completions type. This tells OpenClaw to treat Ollama Cloud as a standard cloud API, bypassing the local discovery logic.

Correct Configuration (~/.openclaw/openclaw.json)

Ensure your models.providers section looks like this. Two things to note:

- The /v1 path is what makes the endpoint OpenAI-compatible, as opposed to /api.
- The provider is named ollama-cloud instead of ollama because ollama is reserved. Does it have to say ollama-cloud? No, write butts-cloud for all I care.

```json
"models": {
  "providers": {
    "ollama-cloud": {
      "baseUrl": "https://ollama.com/v1",
      "apiKey": "YOUR_OLLAMA_API_KEY",
      "api": "openai-completions",
      "models": [
        {
          "id": "kimi-k2.5:cloud",
          "name": "Kimi K2.5 Cloud",
          "contextW
```
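Before restarting the Gateway, it can help to sanity-check the edited file so a typo doesn’t send you back into crash-loop debugging. Below is a minimal Python sketch of that check; `check_provider` is a hypothetical helper (not part of OpenClaw), and the embedded config mirrors the snippet above with placeholder values:

```python
import json

# A trimmed copy of the provider entry described above (placeholder values).
config_text = """
{
  "models": {
    "providers": {
      "ollama-cloud": {
        "baseUrl": "https://ollama.com/v1",
        "apiKey": "YOUR_OLLAMA_API_KEY",
        "api": "openai-completions",
        "models": [
          {"id": "kimi-k2.5:cloud", "name": "Kimi K2.5 Cloud"}
        ]
      }
    }
  }
}
"""

def check_provider(cfg: dict, name: str) -> list:
    """Return a list of problems with the named provider entry."""
    problems = []
    provider = cfg.get("models", {}).get("providers", {}).get(name)
    if provider is None:
        return [f"provider {name!r} not found"]
    if provider.get("api") != "openai-completions":
        problems.append("api should be 'openai-completions', not the local 'ollama' type")
    if not provider.get("baseUrl", "").endswith("/v1"):
        problems.append("baseUrl should end in /v1 (the OpenAI-compatible path)")
    if "YOUR_" in provider.get("apiKey", ""):
        problems.append("apiKey still contains the placeholder value")
    return problems

cfg = json.loads(config_text)
print(check_provider(cfg, "ollama-cloud"))
# -> ['apiKey still contains the placeholder value']
```

In a real setup you would `json.load` your actual ~/.openclaw/openclaw.json instead of the inline string; an empty result list means the entry matches the shape this post recommends.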
Continue reading on Dev.to

