
AI Coding Tools Produce 281% More Code in Month 1. By Month 3, the Advantage Is Gone.
A Carnegie Mellon study tracked 807 GitHub projects that adopted Cursor (an AI-native code editor) and compared them against 1,380 control repos over 20 months. The result is the most detailed picture we have of what AI coding tools actually do to a codebase over time.

The headline: developers wrote 281% more lines of code in the first month after adopting Cursor. By month two, the boost dropped to 48%. By month three, it was effectively zero. But code complexity increased 41% and static analysis warnings rose 30%, and those numbers never came back down.

This is the first large-scale longitudinal study of AI coding tool adoption, and the pattern it reveals matters for every team evaluating these tools.

The velocity spike is real but temporary

The CMU researchers (He, Miller, Agarwal, Kastner, Vasilescu) measured lines added per month as a proxy for velocity. The data is unambiguous for the first month: AI-assisted developers produce significantly more code.

[Figure: velocity over time after adoption]
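To make the metric concrete, here is a minimal sketch of how a lines-added-per-month velocity proxy could be computed from per-commit data and compared against a pre-adoption baseline. The function names, data shape, and the baseline comparison are illustrative assumptions, not the CMU authors' actual pipeline.

```python
from collections import defaultdict
from datetime import date

def monthly_lines_added(commits):
    """Aggregate (commit_date, lines_added) records into per-month totals.

    Returns a dict keyed by (year, month). A real pipeline would pull these
    records from the repo history (e.g. git log --numstat); here we assume
    they are already extracted.
    """
    totals = defaultdict(int)
    for commit_date, added in commits:
        totals[(commit_date.year, commit_date.month)] += added
    return dict(totals)

def velocity_change_pct(baseline, observed):
    """Percent change in lines added relative to a pre-adoption baseline."""
    return 100.0 * (observed - baseline) / baseline

# Synthetic example: 500 lines in the pre-adoption month,
# 1,905 lines in the first month after adoption.
commits = [
    (date(2024, 1, 10), 200), (date(2024, 1, 20), 300),   # pre-adoption
    (date(2024, 2, 5), 900), (date(2024, 2, 25), 1005),   # month 1 after
]
totals = monthly_lines_added(commits)
print(velocity_change_pct(totals[(2024, 1)], totals[(2024, 2)]))  # 281.0
```

With these made-up numbers the first post-adoption month shows the study's headline +281%; the study's finding is that this figure falls to +48% in month two and roughly zero by month three.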

