
I tested 4 codebase-to-AI tools on FastAPI (108k lines). Here are the token costs.
Stacklit on GitHub -- the tool I built after running these tests.

I have been using AI agents on real projects for the past year: Claude Code, Cursor, Aider. The one problem that never goes away: every session starts with the agent reading files to understand the codebase. Same files. Same tokens. Every time.

So I tested four tools that claim to solve this. I ran them on FastAPI (108,075 lines of Python, 1,131 files) and measured what actually came out.

## The four tools

- **Repomix** (23k stars) -- packs your entire repo into one XML or Markdown file. Every line of source code in a single output.
- **Aider repo-map** (part of Aider, 43k stars) -- generates an ephemeral text map of functions and classes ranked by relevance. Built into Aider, not available separately.
- **Codebase Memory MCP** (1.4k stars) -- builds a SQLite knowledge graph from tree-sitter ASTs. 66 languages. Queryable through 14 MCP tools.
- **Stacklit** (new, my project) -- generates a committed JSON index with module graph, exports, types, and
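To make the "pack everything" approach concrete: Repomix's real implementation is far more sophisticated, but the core idea -- concatenate every source file into one document and pay tokens for all of it -- can be sketched in a few lines. Everything here (function names, the chars/4 token heuristic) is my illustration, not Repomix's actual API.

```python
from pathlib import Path

def pack_repo(root: str, extensions=(".py",)) -> str:
    """Concatenate every matching source file into one Markdown-style
    string, each file preceded by its relative path as a heading.
    A toy stand-in for a Repomix-style whole-repo pack."""
    root_path = Path(root)
    parts = []
    for path in sorted(root_path.rglob("*")):
        if path.is_file() and path.suffix in extensions:
            rel = path.relative_to(root_path)
            parts.append(f"## {rel}\n\n{path.read_text()}\n")
    return "\n".join(parts)

def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text and code.
    # A real measurement would use the model's own tokenizer.
    return len(text) // 4

if __name__ == "__main__":
    packed = pack_repo(".")
    print(f"packed {len(packed):,} chars ≈ {estimate_tokens(packed):,} tokens")
```

Run against a 108k-line repo, a pack like this is exactly why the whole-file approach is expensive: the agent pays for every line whether or not it is relevant to the task.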
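The knowledge-graph approach is the opposite trade: parse once, store symbols, and let the agent query only what it needs. Codebase Memory MCP does this with tree-sitter across 66 languages; as a rough Python-only illustration of the same idea, here is a toy symbol index built with the stdlib `ast` module and SQLite. The schema and names are mine, not the tool's.

```python
import ast
import sqlite3
from pathlib import Path

def index_symbols(root: str, db_path: str = ":memory:") -> sqlite3.Connection:
    """Parse every .py file under root and record its functions and
    classes in a queryable SQLite table -- a toy stand-in for an
    AST-backed knowledge graph."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE symbols (file TEXT, name TEXT, kind TEXT, line INTEGER)"
    )
    for path in Path(root).rglob("*.py"):
        tree = ast.parse(path.read_text())
        for node in ast.walk(tree):
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
                kind = "class" if isinstance(node, ast.ClassDef) else "function"
                conn.execute(
                    "INSERT INTO symbols VALUES (?, ?, ?, ?)",
                    (str(path), node.name, kind, node.lineno),
                )
    conn.commit()
    return conn
```

With an index like this, a query such as `SELECT file, line FROM symbols WHERE name = 'get_route_handler'` costs a handful of tokens instead of re-reading the files that define it.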


