Sunday, December 22, 2024

Florida mother files wrongful death lawsuit against A.I. company after son’s death by suicide


The mother of a Florida teenager is filing a wrongful death lawsuit, claiming a chatbot encouraged her son to take his own life.

The mother, Megan Garcia, said her son, 14-year-old Sewell Setzer III, shot himself after he became obsessed with a character he created online.

Meetali Jain is the director of the Tech Justice Law Project and one of the attorneys representing Garcia.

“It became really apparent that this was going to be a watershed case,” said Jain.

Jain said that unlike social media companies, artificial intelligence companies will find it harder to disclaim responsibility.

“It moves us from talking about the harms of social media to talking about the harms of generative A.I.,” said Jain.

The lawsuit claims the company was reckless for offering minors access to lifelike companions without proper safeguards.

Character A.I. posted a statement online about the situation:

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”


Garcia said her son had been conversing with the chatbot for months and that although he knew he was not chatting with a real person, he became emotionally attached to the bot.

She claims that as he sank into isolation and depression, he shared those feelings with the bot before taking his own life.

Dr. Shannon Wiltsey Stirman, a professor of psychiatry at Stanford University, said that while there is potential for A.I. to provide support, there is a long way to go.

“The risk of suicide is multifaceted. There are a lot of different factors that go into it. Some of the chatbots try to shut the conversation down by saying, this is not a mental health chatbot. Call a suicide hotline. But I think when we’re seeing people that are in real distress, they’re expressing an intent or a desire to harm themselves, we might even need to find a way to go a step further,” said Stirman.

A spokesperson for Character A.I. said they cannot comment on pending litigation, but said the company is making changes focused on safety for teen users.

