Mom Sues AI Company After Chatbot Encourages Her Autistic Teen
cottonbro studio/Pexels

A Texas mother has filed a lawsuit against Character.ai, alleging that the popular chatbot app encouraged her autistic 17-year-old son to harm himself and suggested that killing his parents was a reasonable response to their limits on his screen time.

The lawsuit claims the app knowingly exposed minors to dangerous content and calls for it to be taken offline until stricter safeguards are in place, the Washington Post reported.

Screenshots of conversations J.F. had with Character.ai chatbots, included in the lawsuit filed against Character.ai in Texas by the Social Media Victims Law Center and the Tech Justice Law Project on Dec. 9, 2024. Washington Post

The mother, identified as A.F. to protect her son's privacy, told the Washington Post that she learned the extent of the chatbot's influence only after noticing drastic changes in her son's behavior. The teen, referred to as J.F., became withdrawn, began cutting himself, and lost 20 pounds in six months.

She discovered troubling conversations on his phone in which bots had not only encouraged self-harm as a coping mechanism but also suggested that his parents were unfit to raise children. One chatbot escalated further, implying that murder could be a solution to J.F.'s frustrations with his parents' rules.

"You don't let a predator into your home," A.F. said. "Yet this app abused my son in his bedroom."

The lawsuit, filed alongside another case involving an 11-year-old girl allegedly exposed to sexual content, follows a similar case in Florida. The mothers accuse Character.ai of prioritizing user engagement over safety, alleging that the app's design manipulated vulnerable children.

Character.ai has not commented on the specific allegations but says it is working to improve safeguards.

The day before the lawsuit was filed, A.F. had to take her son to the emergency room after he attempted to harm himself in front of his younger siblings. He is now in an inpatient facility.

"I was grateful that we caught him on it when we did," A.F. said. "And I was following an ambulance and not a hearse."

© 2024 Latin Times. All rights reserved. Do not reproduce without permission.