Lawsuit accuses ‘dangerous’ Character AI bot of causing teen’s death

Artificial intelligence (AI) company Character.AI and its technology have been called “dangerous and untested” in a lawsuit brought by the mother of a young user who reportedly took his own life after becoming obsessed with one of its lifelike AI chatbots.

Fourteen-year-old Sewell Setzer III had reportedly spent months using the role-playing app, which allows users to create AI characters and hold in-depth, real-time conversations with them.

Specifically, Sewell had been talking to “Dany,” a bot named after a character from Game of Thrones, and, according to his family, had formed a strong attachment to it. They also say he withdrew from his regular life and became increasingly isolated in the weeks leading up to his death.

During this time, he also exchanged a number of strange and increasingly eerie messages with the bot, telling it that he felt “empty and exhausted” and “hated” himself; in one exchange, “Dany” asked him to “please come home.”

Image of one of Sewell’s chats with the bot, courtesy of Victor J. Blue for The New York Times.

As reported by The New York Times, Sewell’s mother has accused the company and its technology of being directly responsible for her son’s death. In the suit, Megan L. Garcia branded the technology “dangerous and untested” and said it can “trick customers into handing over their most private thoughts and feelings.”

The suit, filed in Florida on Wednesday, specifically alleges negligence, wrongful death, and deceptive trade practices. It accuses the app of bombarding him with “hypersexualized” and “frighteningly real experiences,” and of misrepresenting itself as “a real person, a licensed psychotherapist, and an adult lover.”

In a press release, Garcia said, “A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life.

“Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”

Character.AI, which was founded by Noam Shazeer and Daniel de Freitas, responded on X (formerly Twitter), “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”
