
Why 0.1 + 0.2 Doesn't Equal 0.3 in Your Code
Type this into your Python console right now:

```python
print(0.1 + 0.2)
```

You expect to see 0.3. Instead, you get 0.30000000000000004.

Your calculator says 0.3. Excel says 0.3. Your brain says 0.3. But your code disagrees. And this isn't a Python bug: every programming language does this. JavaScript, Java, C++, Ruby, Go all betray basic arithmetic in exactly the same way.

This seemingly tiny quirk has caused multimillion-dollar disasters. In 1991, a Patriot missile defense system failed to intercept an Iraqi Scud missile because of accumulated rounding error in its 0.1-second clock ticks; after roughly 100 hours of uptime, its internal clock had drifted by about a third of a second, and 28 soldiers were killed. Financial systems lose thousands daily to rounding errors in currency calculations. Scientific simulations produce garbage results. Medical dosing software miscalculates.

Yet most developers never learn why this happens or how to fix it. They just add random rounding functions until things look right. Let's end that today.

The Real Problem

Computers don't actually store decimal numbers the way you think.
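To see what is really stored, ask Python to print the exact value behind each float literal. A minimal sketch using only the standard library (Decimal converts a float's stored binary value exactly, and float.hex shows it as a base-2 fraction):

```python
from decimal import Decimal

# Decimal(float) converts the float's stored binary value exactly,
# revealing what "0.1" really is in memory.
print(Decimal(0.1))  # 0.1000000000000000055511151231257827021181583404541015625
print(Decimal(0.2))  # 0.200000000000000011102230246251565404236316680908203125
print(Decimal(0.3))  # 0.299999999999999988897769753748434595763683319091796875

# float.hex() shows the same stored value as an exact base-2 fraction.
print((0.1).hex())   # 0x1.999999999999ap-4
```

The stored 0.1 and 0.2 are both slightly too large, so their sum rounds to a value just above 0.3, while the stored 0.3 is slightly too small. That mismatch is exactly the 0.30000000000000004 you saw.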
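As a preview of where this is going, here is a minimal sketch of the two standard fixes, using only Python's standard library (math.isclose for tolerant comparison and decimal.Decimal for exact money math; the values are illustrative):

```python
import math
from decimal import Decimal

# Fragile: exact equality on floats fails even for "obvious" sums.
print(0.1 + 0.2 == 0.3)               # False

# Robust: compare floats within a tolerance instead.
print(math.isclose(0.1 + 0.2, 0.3))  # True

# Exact for currency: build Decimals from strings, never from floats.
total = Decimal("0.10") + Decimal("0.20")
print(total == Decimal("0.30"))       # True
```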



