
I Built a macOS App in a Weekend with an AI Agent — Here's What 'Human on the Loop' Actually Looks Like
Last weekend I built Duckmouth, a macOS speech-to-text app with LLM post-processing, global hotkeys, Accessibility API integration, and Homebrew distribution. From first commit to shipping DMG: 26 hours.

```shell
brew tap nesquikm/duckmouth
brew install duckmouth
```

The interesting part isn't the app. It's how the process worked, and specifically how much I was not hands-off.

The Numbers

| Metric | Value |
| --- | --- |
| Milestones completed | 31 |
| Dart files | 96 |
| Lines of code | ~12,700 |
| Native Swift files | 2 (platform channels) |
| Tests | 409 (unit, widget, integration, e2e) |
| Distribution | DMG + Homebrew cask |

What Duckmouth Does

Record speech → transcribe via an OpenAI-compatible API (OpenAI, Groq, or custom) → optionally post-process with an LLM (fix grammar, translate, summarize) → paste at the cursor or copy to the clipboard. The app lives in the menu bar, responds to global hotkeys, and keeps a history.

It's standard Flutter/Dart on macOS, with Swift platform channels for the Accessibility API and system sounds. Nothing exotic. But it touches enough surface
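The record → transcribe → post-process → output flow can be sketched as a small composition of steps. This is a minimal illustration in plain Swift, not Duckmouth's actual code: every type and function name here is an assumption, and the transcription and LLM calls are stubbed where the real app would hit an OpenAI-compatible HTTP API.

```swift
import Foundation

// Hypothetical sketch of the Duckmouth-style pipeline: transcribe audio,
// optionally post-process the text with an LLM, then deliver the result.

enum OutputMode {
    case pasteAtCursor    // real app: via the Accessibility API
    case copyToClipboard  // real app: via NSPasteboard
}

struct Pipeline {
    // Stand-in for the HTTP call to an OpenAI-compatible transcription endpoint.
    var transcribe: (Data) -> String
    // Optional LLM post-processing step (fix grammar, translate, summarize).
    var postProcess: ((String) -> String)?

    func run(audio: Data) -> String {
        let raw = transcribe(audio)
        return postProcess?(raw) ?? raw
    }
}

// Usage with stubbed steps:
let pipeline = Pipeline(
    transcribe: { _ in "helo world" },
    postProcess: { $0.replacingOccurrences(of: "helo", with: "hello") }
)
let text = pipeline.run(audio: Data())
print(text)  // "hello world"
```

Modeling the steps as injected closures is also what makes a pipeline like this easy to cover with the kind of unit tests the article's numbers mention: each stage can be swapped for a stub.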
Continue reading on Dev.to



