
Run Ollama Models on Google Colab (Free, No Local GPU)
If you don’t have a local GPU but still want to experiment with LLMs, this project might help. I built a minimal setup to run Ollama models directly on Google Colab with almost zero friction.

What this repo does
• Installs Ollama inside Colab
• Runs models like Llama, Qwen, DeepSeek, and CodeLlama
• Exposes the API so you can connect external tools
• Keeps the setup simple and reproducible

Why this exists
Most tutorials for running Ollama in Colab are either:
• Overcomplicated
• Broken or outdated
• Missing key steps (like tunneling or API access)

This repo removes that friction and gives you a working setup in minutes.

Use cases
• Testing coding models
• Building quick AI tools
• Running agents
• Prompt engineering experiments
• Connecting Ollama to external apps via a tunnel

How to use
Open the notebook and run the cells step by step. That’s it.

Repo
https://github.com/0x1881/collama

If you have suggestions or improvements, feel free to contribute.
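Once the notebook has Ollama running and the API exposed, connecting an external tool comes down to POSTing JSON to Ollama's `/api/generate` endpoint. Here is a minimal sketch of that call using only the Python standard library; the host URL (Ollama's default `http://localhost:11434`, or your tunnel URL) and the model name are placeholders you would swap for your own setup, not values taken from the repo.

```python
import json
import urllib.request


def build_generate_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body expected by Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}


def generate(host: str, model: str, prompt: str) -> str:
    """POST a prompt to a running Ollama server and return the generated text.

    `host` is e.g. "http://localhost:11434" inside Colab, or the public
    tunnel URL when calling from outside (placeholder values, adjust to yours).
    """
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns one JSON object whose
        # "response" field holds the full completion.
        return json.loads(resp.read())["response"]
```

With a model already pulled in the notebook, a call would look like `generate("http://localhost:11434", "llama3", "Write a haiku about GPUs.")`.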


