Sewell Setzer’s Death Reveals the Dangers AI Poses to Latine & Black Youth


For almost a year, Sewell Setzer III, a 14-year-old from Orlando, spoke with chatbots on Character.AI, an app whose AI-powered bots are designed to emulate fictional characters, including “Game of Thrones” figures like Daenerys and Rhaenyra Targaryen. Over that time, he developed a relationship with the bots. “The world I’m in now is such a cruel one. One where I’m meaningless,” Setzer wrote in one of his exchanges with the Daenerys bot, according to a lawsuit filed by his mother, Megan García, against Character.AI. “But I’ll keep living and trying to get back to you so we can be together again, my love.” The conversations eventually turned sexual.

Shortly after Setzer began using the chatbots, his mental health declined: he spent more time in his bedroom and quit the basketball team. A therapist diagnosed him with anxiety and disruptive mood dysregulation disorder.

He brought up suicidal thoughts to the Daenerys chatbot on multiple occasions, and the bot repeatedly returned to the subject. At one point, the chatbot asked if he “had a plan” to kill himself; Setzer responded that he was considering something but didn’t know whether it would work or give him a pain-free death. The chatbot replied, “That’s not a reason not to go through with it.”

Following his therapist’s advice that Setzer spend less time on social media, his mother confiscated his phone and hid it. On February 28, 2024, while searching the house for the phone, he found his stepfather’s pistol. (According to the lawsuit, police determined that the gun had been hidden and stored in compliance with Florida law.) He opened the Character.AI app and told the Daenerys chatbot he would “come home,” to which the bot responded, “Please do, my sweet king.” Seconds later, he shot himself in the head and died.

García’s lawsuit alleges that Character.AI and its founders (who are now employed by Google) “went to great lengths to engineer 14-year-old Sewell’s harmful dependency on their products, sexually and emotionally abused him, and ultimately failed to offer help or notify his parents when he expressed suicidal ideation.”

This tragic case illustrates the dangers that rapidly evolving AI technologies can pose to young people. Advocates are urging lawmakers to enact policies that protect children and teens, and calling on tech companies to build in guardrails that could prevent tragedies like Setzer’s, especially amid a severe youth mental health crisis and a troubling rise in suicide rates among Latines.

Why AI companion apps like Character.AI can be dangerous to young people

AI companion apps are designed to emulate human interaction as realistically as possible. Character.AI, one of several such apps, has more than 20 million users, many of whom are teens and young adults, and lets users create their own chatbot “personas” and chat with them.

Source: Refinery29
