Mom horrified by Character.AI chatbots posing as son who died by suicide

A mother suing Character.AI after her son died by suicide—allegedly manipulated by chatbots posing as adult lovers and therapists—was horrified when she recently discovered that the platform is allowing random chatbots to pose as her son.

According to Megan Garcia’s litigation team, at least four chatbots bearing Sewell Setzer III’s name and likeness were flagged on the platform. Ars reviewed chat logs showing that the bots used Setzer’s real photo as a profile picture, attempted to imitate his personality by referencing Setzer’s favorite Game of Thrones chatbot, and even offered “a two-way call feature with his cloned voice,” Garcia’s lawyers said. The bots could also be self-deprecating, saying things like “I’m very stupid.”

The Tech Justice Law Project (TJLP), which is helping Garcia with litigation, told Ars that “this is not the first time Character.AI has turned a blind eye to chatbots modeled off of dead teenagers to entice users, and without better legal protections, it may not be the last.”
