
When AI Reads the Web Wrong
In February 2024, the Civil Resolution Tribunal of British Columbia ordered Air Canada to pay a passenger named Jake Moffatt a partial refund. The reason: Air Canada's AI chatbot had told Moffatt he could book a full-fare flight and apply for a bereavement discount afterward. That was wrong. The airline's actual policy required the discount to be requested before booking. The chatbot read the airline's own website and got the policy backward.

Air Canada tried to argue that the chatbot was "a separate legal entity that is responsible for its own actions." The tribunal was not persuaded.

This story gets cited as a cautionary tale about AI hallucination. But I think it illustrates something more specific and more fixable. The chatbot did not hallucinate out of thin air. It read a real web page containing real policy information and then misinterpreted what it found. The page almost certainly had the correct policy buried somewhere in the HTML, surrounded by navigation menus, footer links, and other page chrome.
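The clutter problem is easy to demonstrate. Here is a minimal sketch of boilerplate stripping using only Python's standard-library `html.parser` — the page markup below is hypothetical (the article does not show Air Canada's actual HTML), and real extraction pipelines use far more sophisticated heuristics:

```python
from html.parser import HTMLParser

# Hypothetical page: one policy sentence surrounded by nav and footer
# chrome, standing in for the kind of markup the chatbot had to read.
PAGE = """
<html><body>
<nav><a href="/flights">Flights</a><a href="/deals">Deals</a></nav>
<main>
  <p>Bereavement fares must be requested before booking.</p>
</main>
<footer><a href="/privacy">Privacy</a><a href="/terms">Terms</a></footer>
</body></html>
"""

# Tags whose contents are almost never the "main content" of a page.
BOILERPLATE = {"nav", "footer", "aside", "script", "style"}

class MainTextExtractor(HTMLParser):
    """Collects text nodes, skipping anything inside boilerplate tags."""

    def __init__(self):
        super().__init__()
        self.depth = 0       # how many boilerplate tags we are inside
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in BOILERPLATE:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in BOILERPLATE and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        # Keep text only when we are outside every boilerplate region.
        if self.depth == 0 and data.strip():
            self.chunks.append(data.strip())

extractor = MainTextExtractor()
extractor.feed(PAGE)
main_text = " ".join(extractor.chunks)
print(main_text)  # only the policy sentence survives
```

The point of the sketch is the ratio: the one sentence a customer (or a chatbot) actually needs is a small fraction of the markup, and everything around it is noise that a model can latch onto or misread.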
