
Build Your Own AI Ops Assistant — Part 3: The AI Brain
This is Part 3 of a 6-part series. Part 2 covers the foundation setup.

Claude Orchestrator & Multi-Source Search

This is where Harper Eye becomes genuinely powerful. By the end of this part, you'll have an AI assistant that searches all your internal data sources in parallel, synthesizes the results with Claude, and returns structured, cited responses. About 400 lines of code, and every one of them earns its keep.

The Architecture of a Single Query

When someone asks Harper Eye a question, here's exactly what happens. The key insight: everything that can run in parallel, does. The embedding is generated once at the start, then reused for the KB search, the negative-feedback search, the expertise lookup, and the code-knowledge search. The six data sources all fire simultaneously. Total wall-clock time is dominated by the slowest source (usually 2-4 seconds) plus Claude's synthesis time, not the sum of all sources.
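The fan-out pattern described above can be sketched with asyncio. This is a minimal illustration, not the actual Harper Eye code: the source names and the `embed`/`search_source` helpers are hypothetical stand-ins for real embedding and search calls.

```python
import asyncio

async def embed(query: str) -> list[float]:
    # Stand-in for a real embedding call; runs exactly once per query.
    await asyncio.sleep(0.01)
    return [0.1, 0.2, 0.3]

async def search_source(name: str, embedding: list[float]) -> dict:
    # Stand-in for one data-source search (KB, feedback, expertise, code, ...).
    await asyncio.sleep(0.01)
    return {"source": name, "hits": []}

async def run_query(query: str) -> list[dict]:
    # Embed once, then reuse the vector across every source.
    embedding = await embed(query)
    sources = ["kb", "negative_feedback", "expertise", "code", "tickets", "docs"]
    # All six sources fire simultaneously; wall-clock time is bounded by
    # the slowest source, not the sum of all of them.
    results = await asyncio.gather(
        *(search_source(s, embedding) for s in sources)
    )
    return list(results)

results = asyncio.run(run_query("why is the deploy failing?"))
print([r["source"] for r in results])
```

The design choice worth copying is that `asyncio.gather` shares the single embedding across every concurrent search, so adding a seventh source adds almost no latency as long as it isn't the slowest one.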

