
Why Your AI Cites Real Sources That Never Said That (And the 3-Layer Fix)
100+ hallucinated citations passed peer review at NeurIPS 2025. Expert reviewers. The world's most competitive AI conference. Three or more sign-offs per paper. Still missed. Because they weren't fake sources. The papers were real. The authors were real. The claims they were being used to support? Never appeared in them. That's citation misattribution — and it's the hardest hallucination type to catch in production RAG pipelines.

What Is Citation Misattribution?

Most devs know about ghost citations — the model invents a paper, generates a plausible DOI, and a quick search returns nothing. Caught. Done. Citation misattribution is different. The model cites a real source but attributes a claim or finding to it that the source never actually made. The paper exists. The DOI resolves. The author is real. What the AI says the paper proves? Not in there.

GPTZero coined a term for it: vibe citing. Like vibe coding — generating code that feels correct without being correct — vibe citing produces citations that feel correct without being correct.
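
To make the difference concrete, here's a minimal sketch of the two checks in Python. The Crossref lookup is a real public API, but everything else is illustrative: the function names are mine, and the keyword heuristic is only a stand-in for the claim-support check, which in practice you'd back with an NLI/entailment model or an LLM judge over the retrieved source text.

```python
import requests


def citation_exists(doi: str) -> bool:
    """Ghost-citation check: does the cited work exist at all?

    Crossref returns 404 for DOIs it has never registered, which catches
    fully invented papers but says nothing about what the paper claims.
    """
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    return resp.status_code == 200


def claim_is_supported(claim: str, source_text: str) -> bool:
    """Misattribution check: does the source actually state the claim?

    Placeholder heuristic only -- real misattributions rarely fail a keyword
    test, so in practice you'd score (source_text, claim) with an entailment
    model or an LLM judge and threshold the result.
    """
    return all(token.lower() in source_text.lower() for token in claim.split())


# A misattributed citation passes the first check (the paper is real) but
# should fail the second once you fetch the source and test the claim:
# citation_exists("10.1038/nature14539")            # True: the paper exists
# claim_is_supported(claim, fetched_source_text)    # the harder question
```

That's the gap: the first check is cheap and already common in RAG pipelines; the second requires having the source text and verifying the claim against it, which is exactly the step the NeurIPS reviewers effectively skipped.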