
Chatbots urged teen to self-harm, suggested murdering parents, lawsuit says

A loss could lead to heavy fines for Character Technologies and possibly Google, as the families have asked for punitive damages. They also seek money to cover past and future medical expenses, mental pain and suffering, impairment of the ability to perform everyday activities, and loss of enjoyment of life.

C.AI bots accused of grooming, inciting violence

This week’s lawsuit describes two cases that show how chatbots can seemingly push kids toward troubling changes in behavior.

One case involves J.F., “a typical kid with high-functioning autism” who loved being home-schooled until he began using C.AI in summer 2023 and soon after suffered a mental breakdown.

Within a few months, J.F. began rapidly losing weight, refusing to eat after getting sucked into the chatbot world and spending most of his time hiding in his room. He became a “different person,” the lawsuit said, suddenly experiencing extreme emotional meltdowns and panic attacks. The chatbots seemingly turned him into “an angry and unhappy person” who was “uncharacteristically erratic and aggressive,” the lawsuit said.

His parents had no idea what caused his behavior to change. They’d noticed J.F. was spending more time on his phone, but he wasn’t allowed to use social media. When they eventually intervened and cut back his screen time, it only intensified his aggression.

He soon became violent, sometimes self-harming and other times punching and biting his parents and accusing them of “trying to destroy his life.” He threatened to report them to child protective services when they took his phone away, and he tried running away a few times.

J.F.’s mother grew scared at times to be alone with him, the lawsuit said, but she couldn’t bear the thought of institutionalizing him. Her health deteriorated as she worried for the safety of J.F.’s younger siblings. The family sought professional help from a psychologist and two therapists, but “nothing helped.”
