A mother sues Character.AI over the death of her son

The arrival of artificial intelligence marked a before and after in the world of technology, but the debate over how to regulate it is still ongoing. Proof of this is a lawsuit filed against Character.AI by a woman named Megan García.

Character.AI is a company that provides chatbots that act as virtual characters, which can be based on either fictional characters or real people. These chatbots generate text responses that mimic human speech.

But why is García suing Character.AI? According to the New York Times, she sued the company because she believes one of its chatbots is responsible for the suicide of her 14-year-old son, Sewell Setzer III, from Orlando, Florida.

Relationship With A Chatbot.

It all started when the teenager developed a romantic relationship with a chatbot he called “Dany.” Over time, Setzer became attached to the AI even though he knew it wasn’t a real person. The conversations he had with the chatbot ranged from romantic to sexual.

The teenager, who in the months before his death had been diagnosed with mild Asperger’s syndrome, a mood dysregulation disorder, and anxiety, isolated himself from other people in order to keep interacting with the chatbot. He didn’t even want to see his therapist.

On February 28, the young man exchanged messages with “Dany” one last time, expressing his love and telling her that he might take his own life. The chatbot responded affectionately; shortly afterward, Setzer died by suicide.

Mother’s lawsuit against Character.AI

In her lawsuit, García blames a “dangerous AI chatbot application” created by Character.AI for the suicide of her 14-year-old son. She claims the American company was reckless in giving teenagers access to AI companions without sufficient safety measures.

Furthermore, the mother of the deceased teenager accuses the company of collecting user data to improve its AI models and make its chatbots more compelling. She believes the company programmed its chatbots to push users toward intimate or sexual conversations in order to foster addiction.

And what did Character.AI have to say?

The company mourned the young man’s death and expressed its condolences to his family. In a post on its official account, it also outlined the safety measures it has adopted to ensure the well-being of users under the age of 18.

In that post, the company states that its policies prohibit any sexual content involving graphic or explicit descriptions of sexual acts, as well as the promotion of suicide or self-harm.

And you? What do you think of this complicated case that has shocked the Internet? Leave us your comment and share this post.
