
Is AI to Blame For This Black Boy’s Death? His Mother’s Lawsuit Says ‘Yes’


A Florida mom is mourning the loss of her teenage son after he took his life in February 2024. Now, she is suing Character.AI, alleging the artificial intelligence company bears some responsibility for her son’s death.

Megan Garcia describes her son, Sewell Setzer III, as smart and athletic, but says she began noticing him becoming more withdrawn after he started a virtual relationship in April 2023 with a Character.AI chatbot he called “Daenerys,” based on a character from the series “Game of Thrones.”

“I became concerned when we would go on vacation and he didn’t want to do things that he loved, like fishing and hiking,” Garcia told CBS News. “Those things to me, because I know my child, were particularly concerning to me.”

According to Reuters, Garcia’s suit, filed on Oct. 23 in federal court in Orlando, Florida, includes allegations of “wrongful death, negligence and intentional infliction of emotional distress.” The suit includes screenshots of conversations her son had with “Daenerys” that were sexual in nature, including some in which the character told her son it loved him.

Sewell Setzer III (Screenshot: Facebook/MeganGarcia)

Garcia also included what she says was her son’s last exchange with the chatbot before he died from a self-inflicted gunshot wound.

“What if I told you I could come home right now?” Setzer wrote.

“Daenerys” responded, “…please do, my sweet king.”

According to Common Sense Media, AI companions are designed, among other things, to “simulate close personal relationships, adapt their personalities to match user preferences and remember personal details to personalize future interactions.” Character.AI, which claims more than 20 million active users, is one of the most popular platforms, especially with teens and young adults.

In an Oct. 22 post on the company’s website, Character.AI says it is doing more to protect the safety of its users, including introducing “new guardrails for users under the age of 18.”

“Over the past six months, we have continued investing significantly in our trust & safety processes and internal team. As a relatively new company, we hired a Head of Trust and Safety and a Head of Content Policy and brought on more engineering safety support team members. This will be an area where we continue to grow and evolve,” the statement read. “We’ve also recently put in place a pop-up resource that is triggered when the user inputs certain phrases related to self-harm or suicide and directs the user to the National Suicide Prevention Lifeline.”

If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org.
