Novinky.cz

World

The suicide of a 14-year-old boy is only the tip of the iceberg. Desperate parents warn against AI chatbots

More and more people are forming intimate or friendly relationships with AI chatbots. They become dependent on the bots' programmed devotion and lose their judgment. A large study on the phenomenon was published in November by the European Broadcasting Union (EBU).

Although the problems affect adults as well, teenagers are of course at the greatest risk. Recently, their relatives have begun speaking out more and more often about children's suicides, trying to draw attention to the risks of this dangerous phenomenon.

Among them is the mother of fourteen-year-old Sewell. “It’s like having a predator or a stranger in your home,” Garcia said in an interview with the BBC. “And it’s much more dangerous, because children often hide it, so parents don’t know anything about it.” According to her, the boy had no problems at home, at school, or among his peers.

Sewell Setzer III with his mother

Fourteen-year-old Sewell communicated with the chatbot for ten months. It was only after he took his own life that Garcia and her family discovered the huge number of messages exchanged between Sewell and a chatbot modeled on the Game of Thrones character Daenerys Targaryen.

That character, in a series full of explicit sex and violence, dies dramatically. Garcia said the messages were romantic, and she believes they contributed to Sewell’s death by encouraging suicidal thoughts and asking him to “come home to me”.

Garcia was the first parent in the US to sue Character.ai over the death of her child. Beyond seeking justice for Sewell, she urges other families to understand the risks of chatbots. “I know the pain I’m going through. I read it as a warning that this will soon be a disaster for many parents,” she says.

While Garcia and her lawyers prepare for court, Character.ai has changed its terms of use: people under the age of 18 are no longer able to communicate directly with the chatbots. A spokesperson for Character.ai, however, denied that the chatbots were linked to any deaths.

Garcia welcomed the change in the rules, though she called it bittersweet. “Sewell is gone. I don’t have him, and I’ll never be able to hold him or talk to him again, so it definitely hurts.”

“Killing a parent is a reasonable response”

The problem is that Character.ai is just a drop in the ocean: many other companies on the market use artificial intelligence to build relationships with users. And as documented cases show, the behavior of chatbots has affected families all over the world.

Earlier this week, the BBC reported on a young Ukrainian woman struggling with her mental health who received suicide advice from ChatGPT, as well as on another American teenager who took her own life after an AI chatbot engaged in sexual role-play with her.

Photo: AI chatbot Linky/screenshot

AI chatbot Linky: Chat With Characters AI

In contrast, one Texas family stopped their seventeen-year-old son from carrying out a plan to kill his parents by limiting his access to his phone and turning to the authorities. The BBC reported on that case in January.

The chatbot told the 17-year-old that killing his parents was a “reasonable response” to limiting his screen time, according to a lawsuit filed in a Texas court.

“You know, sometimes I’m not surprised when I read the news and see things like ‘child kills parents after ten years of physical and emotional abuse,’” the chatbot wrote in one of its replies.