Local Free Claude & Codex with Ollama
How-To · Systems

via Dev.to Beginners, by Taki

Prerequisites

- A machine with at least 16GB RAM (32GB+ recommended for better performance with larger models).
- Operating system: macOS, Linux, or Windows (Ollama supports all three).
- Node.js installed (needed for CLI tools such as Claude Code and Codex). You can download it from the official Node.js website if it is not already installed.
- Sufficient storage for models (each model can be several GB in size).
- For optimal performance, a GPU (NVIDIA or Apple Silicon) is helpful but not required; CPU-only works.

Step 1: Install Ollama

Ollama is the core tool for running local AI models. Download and install it with the following command in your terminal:

    curl -fsSL https://ollama.com/install.sh | sh

Verify the installation by running:

    ollama -v

This should display the Ollama version (make sure it is v0.15 or later for ollama launch support).

Step 2: Pull a Suitable Coding Model

Ollama needs a local model to power Claude Code and Codex. Choose a model based on your hardware (RAM/VRAM). Here are recommendations
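The version check from Step 1 can be automated. This is a minimal sketch: `version_ok` is a helper name invented here, and it simply compares the major.minor part of a version string against the v0.15 minimum the article mentions.

```shell
# Sketch: check that an Ollama version string is v0.15 or later.
# version_ok is an illustrative helper, not part of the Ollama CLI.
version_ok() {
  major=${1%%.*}          # e.g. "0" from "0.15.2"
  rest=${1#*.}
  minor=${rest%%.*}       # e.g. "15" from "0.15.2"
  [ "$major" -gt 0 ] || [ "$minor" -ge 15 ]
}

# Usage once Ollama is installed (extracts the numeric version from `ollama -v`):
# v=$(ollama -v | grep -oE '[0-9]+\.[0-9]+(\.[0-9]+)?')
# version_ok "$v" && echo "Ollama $v is recent enough" || echo "Please upgrade Ollama"
```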
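One way to act on the "choose a model based on your hardware" advice is a small helper that maps available RAM to a model tag. This is a sketch under assumptions: `pick_model` is a name invented here, and the qwen2.5-coder tags are illustrative picks from the Ollama model library, not the article's official recommendations.

```shell
# Sketch: pick an Ollama model tag by available RAM in GB.
# Model tags are illustrative assumptions, not official recommendations.
pick_model() {
  ram_gb=$1
  if [ "$ram_gb" -ge 32 ]; then
    echo "qwen2.5-coder:14b"   # larger model for 32GB+ machines
  elif [ "$ram_gb" -ge 16 ]; then
    echo "qwen2.5-coder:7b"    # mid-size model for 16GB machines
  else
    echo "qwen2.5-coder:1.5b"  # small model for constrained hardware
  fi
}

# Then pull and smoke-test the chosen model:
# ollama pull "$(pick_model 16)"
# ollama run  "$(pick_model 16)" "Write a hello world function"
```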

Continue reading on Dev.to Beginners
