
# I built a desktop app that orchestrates Claude, GPT, Gemini, and local Ollama in a 3-phase pipeline

I've been building desktop AI tools for a while, and one frustration kept coming up: every AI model has different strengths, but using them together was always manual work — copy-paste between apps, switch tabs, lose context. So I built Helix AI Studio — an open-source desktop app that lets Claude, GPT, Gemini, and local Ollama models work together in a coordinated pipeline.

GitHub: https://github.com/tsunamayo7/helix-ai-studio

## The Core Idea: Multi-Phase AI Pipelines

Instead of sending one prompt to one model, Helix routes your request through multiple AI models in sequence. Each model handles what it's best at:

```
Your prompt
    ↓
Phase 1: Claude (analysis & reasoning)
    ↓
Phase 2: GPT / Gemini (alternative perspective)
    ↓
Phase 3: Local Ollama model (offline processing / privacy)
    ↓
Final synthesized response
```

You configure which models run in which phases, and the output of each phase feeds into the next.

## What's Inside

- Desktop GUI (PyQt6)
- Three chat tabs: cloudAI (Claude/GPT/Gemini), localAI
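To make the phase-chaining idea concrete, here is a minimal Python sketch of a sequential pipeline where each phase's output is fed into the next phase's prompt. This is an illustration of the general pattern, not code from the Helix repository: the `Phase` class, `run_pipeline` function, and the stub lambdas standing in for real API clients (Anthropic, OpenAI, Ollama) are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Phase:
    """One step in the pipeline (names here are illustrative)."""
    name: str                  # e.g. "Claude (analysis & reasoning)"
    run: Callable[[str], str]  # model call: prompt in, response out


def run_pipeline(prompt: str, phases: List[Phase]) -> str:
    """Feed the user's request through each phase in order.

    Each phase sees the original task plus the previous phase's output,
    so later models can build on (or critique) earlier ones.
    """
    context = prompt
    for phase in phases:
        context = phase.run(f"Task: {prompt}\n\nPrevious output:\n{context}")
    return context


# Stub "models" so the sketch runs offline; in a real app these would be
# API calls to Claude, GPT/Gemini, and a local Ollama model.
phases = [
    Phase("analysis", lambda p: f"[analysis of] {p.splitlines()[0]}"),
    Phase("critique", lambda p: f"[critique of] {p.splitlines()[0]}"),
    Phase("synthesis", lambda p: f"[final answer for] {p.splitlines()[0]}"),
]

result = run_pipeline("Summarize this design doc", phases)
print(result)
```

The key design choice this sketch captures is that the pipeline is just sequential composition with shared context, so swapping a phase's backing model only changes the `run` callable, not the orchestration logic.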


