
Slopsquatting: AI Hallucinations as Supply Chain Attacks
One in five AI-generated code samples recommends a package that does not exist. Attackers are registering those phantom names on npm and PyPI with malware inside. The term for this is slopsquatting, and it is already happening.

What Slopsquatting Actually Is

Typosquatting bets on human misspellings. Slopsquatting bets on AI hallucinations. The term was coined by Seth Larson, Security Developer-in-Residence at the Python Software Foundation, to describe a specific attack: register the package names that LLMs consistently fabricate, then wait for developers to install them on an AI's recommendation.

A USENIX Security 2025 study analyzed 576,000 code samples across 16 language models and found that roughly 20% recommended at least one non-existent package. The hallucinations fall into three categories: 51% are pure fabrications with no basis in reality, 38% are conflations of real packages mashed together (like express-mongoose), and 13% are typo variants of legitimate names.

The part that
Continue reading on Dev.to
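The attack described above suggests an obvious countermeasure: verify every AI-suggested dependency before it ever reaches `pip install`. Below is a minimal, hypothetical sketch of that check. The function name `flag_suspect_packages` and the allowlist are assumptions for illustration; a real implementation would query the registry itself (for example, the PyPI JSON API) rather than a local set.

```python
# Hypothetical sketch: flag AI-suggested dependencies that do not appear
# in a trusted set of known package names, before installing anything.
# The `known` set stands in for a live registry lookup, which this
# offline example deliberately skips.

def flag_suspect_packages(suggested, known):
    """Return the suggested package names not present in `known`."""
    known_lower = {name.lower() for name in known}
    return [name for name in suggested if name.lower() not in known_lower]

# An AI assistant "recommends" these dependencies. One is a conflation
# of two real package names, exactly the kind slopsquatters register.
suggested = ["requests", "flask", "express-mongoose"]
known = {"requests", "flask", "django"}

print(flag_suspect_packages(suggested, known))  # ['express-mongoose']
```

Any name the check flags deserves a manual look before installation; a package that exists but was registered days ago with near-zero downloads is just as suspect as one that does not exist at all.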



