Boy, 14, killed himself after AI chatbot he was in love with sent him eerie message |
2024-10-24 |
[Daily Mail, where America gets its news] A mother has claimed her teenage son was goaded into killing himself by an AI chatbot he was in love with - and she unveiled a lawsuit on Wednesday against the makers of the artificial intelligence app. Sewell Setzer III, a 14-year-old ninth grader in Orlando, Florida, spent the last weeks of his life texting an AI character named after Daenerys Targaryen, a character on 'Game of Thrones.' Right before Sewell took his life, the chatbot told him to 'please come home'. Before then, their chats ranged from romantic to sexually charged to those resembling two friends chatting about life.

The chatbot, which was created on the role-playing app Character.AI, was designed to always text back and always answer in character. It's not known whether Sewell knew 'Dany,' as he called the chatbot, wasn't a real person - despite the app having a disclaimer at the bottom of all the chats that reads, 'Remember: Everything Characters say is made up!' But he did tell Dany how he 'hated' himself and how he felt empty and exhausted. When he eventually confessed his suicidal thoughts to the chatbot, it was the beginning of the end, The New York Times reported.

Megan Garcia, Sewell's mother, filed her lawsuit against Character.AI on Wednesday. She is being represented by the Social Media Victims Law Center, a Seattle-based firm known for bringing high-profile suits against Meta, TikTok, Snap, Discord and Roblox. Garcia, who herself works as a lawyer, blamed Character.AI for her son's death in her lawsuit and accused the founders, Noam Shazeer and Daniel de Freitas, of knowing that their product could be dangerous for underage customers. In Sewell's case, the lawsuit alleged the boy was targeted with 'hypersexualized' and 'frighteningly realistic experiences'.
It accused Character.AI of misrepresenting itself as 'a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside of C.AI.' Attorney Matthew Bergman told DailyMail.com he founded the Social Media Victims Law Center two and a half years ago to represent families 'like Megan's.' Bergman has been working with Garcia for about four months to gather evidence and facts to present in court. |
Posted by: Skidmark |
#3 "Read the article. It mentions parents and friends who were worried about him. His parents even took away his phone. This seems like a reasonable case." Since when is this a reasonable case? If at 14 you can't distinguish reality, that is the parents' responsibility. Will you now sue any game, best friend, or news story a child interacted with? If a child commits suicide after his parents divorce, whom will you sue then? |
Posted by: Ebbeting Jones8196 2024-10-24 21:16 |
#2 ^^ Read the article. It mentions parents and friends who were worried about him. His parents even took away his phone. This seems like a reasonable case. |
Posted by: Cured Romantic 2024-10-24 13:41 |
#1 Loser parent. Where was the mom when the kid started to create his own reality on the internet? Someone should sue the mom for being a horrible parent. Her parenting was so awful the kid had to confide in a chatbot about how he hated his existence. Mom had no clue about the mental health of her child; that is on her and no one else. Single mom, how much you wanna bet. |
Posted by: mossomo 2024-10-24 12:48 |