
# Run Claude Code with Ollama (Local, Cloud, or Any Model)

This guide shows how to run Claude Code using Ollama, allowing you to use local models, cloud models, or any Ollama-supported model directly from your terminal.

## Prerequisites

Make sure the following tools are installed:

- Ollama
- Claude Code

## Install Ollama

If Ollama is not installed, you can install it using the commands below. You can also follow this guide: https://dev.to/sushan/how-to-connect-a-local-ai-model-to-vs-code-1g8d

**Windows (PowerShell)**

```powershell
irm https://ollama.com/install.ps1 | iex
```

**macOS / Linux**

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

Verify the installation:

```shell
ollama --version
```

## Install Claude Code

**Windows (PowerShell)**

```powershell
irm https://claude.ai/install.ps1 | iex
```

**macOS / Linux**

```shell
curl -fsSL https://claude.ai/install.sh | bash
```

Verify the installation:

```shell
claude --version
```

## Running Claude Code with Ollama

Once both tools are installed, you can start Claude Code through Ollama. The commands work the same on Windows, macOS, and Linux.

Option 1:
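Before going further, it can help to confirm both prerequisites in one pass. This is a minimal sketch: the `check_tool` helper is a hypothetical convenience (not part of either installer) that assumes the CLIs are exposed on `PATH` as `ollama` and `claude`, as the install scripts above normally arrange.

```shell
#!/bin/sh
# check_tool: report whether a given CLI is available on PATH.
# (Helper name is hypothetical; it simply wraps the POSIX `command -v` lookup.)
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: installed"
  else
    echo "$1: missing"
  fi
}

# After running both installers, each of these should report "installed":
check_tool ollama
check_tool claude
```

If either line prints `missing`, re-run the corresponding install command and make sure your shell session picks up the updated `PATH` (opening a new terminal is usually enough).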



