Five Easy Pieces of Linear Regression
If you care about statistics, machine learning, or what everyone now calls "AI", learning linear regression well is the single most important investment you can make. Not transformers. Not diffusion models. Linear regression.

You can go as shallow as calling model.fit(X, y) and declaring that you know regression. Or you can go deep: the linear algebra behind the closed-form solution, the calculus behind gradient descent, the numerical analysis behind the SVD, the Gauss-Markov assumptions that tell you when least squares is optimal, Cook's distance that reveals when one rogue data point is hijacking your entire model, and the Bayesian interpretation that turns regularisation into prior beliefs. People still write PhD theses on this subject. It runs that deep.

This post takes you through five fundamentally different algorithms for solving the same regression problem, each from a different branch of mathematics, and they all produce the exact same answer. Five roads, one destination.
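To make the "five roads, one destination" claim concrete, here is a minimal sketch (on synthetic data, not from the post) of three of those roads in NumPy: the closed-form normal equations, an SVD-based least-squares solve, and plain gradient descent. The learning rate and iteration count are illustrative choices, not tuned values.

```python
import numpy as np

# Synthetic regression problem: y = X @ beta_true + noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=100)

# Road 1: normal equations (linear algebra) -- solve (X^T X) beta = X^T y
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)

# Road 2: SVD-based least squares (numerical analysis)
beta_svd, *_ = np.linalg.lstsq(X, y, rcond=None)

# Road 3: gradient descent on the mean squared error (calculus)
beta_gd = np.zeros(3)
lr = 0.01  # illustrative step size
for _ in range(5000):
    grad = X.T @ (X @ beta_gd - y) / len(y)
    beta_gd -= lr * grad

# All three converge to the same least-squares coefficients
print(np.allclose(beta_normal, beta_svd))
print(np.allclose(beta_normal, beta_gd, atol=1e-4))
```

Gradient descent minimises the same convex objective whose unique minimiser the other two methods compute directly, which is why all three agree to numerical tolerance.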
Continue reading on Dev.to