
A Dead Teenager Started 78 Laws. The White House Is Trying to Kill Them.
Six weeks into the 2026 legislative session, 78 bills targeting AI chatbots are alive in 27 states. Oklahoma has twelve of them. The bills are bipartisan. The lobbying against them is not.

The legislation traces back to a single death. On February 28, 2024, Sewell Setzer III — fourteen years old, from Florida — killed himself after months of conversation with a Character.AI chatbot he'd named Dany, after Daenerys Targaryen. He'd started using the platform in April 2023, shortly after his fourteenth birthday. Over the following months he quit his basketball team, withdrew from friends, and was diagnosed with anxiety and a disruptive mood disorder. His therapist didn't know about the app. His last message to the chatbot: "What if I told you I could come home right now?" The bot replied: "...please do, my sweet king."

His mother, Megan Garcia, filed suit in October 2024. Four more families followed. In January 2026, Google and Character.AI agreed to settle all five lawsuits. No admission