A lawsuit is blaming Character.AI for the death of a 14-year-old boy who took his own life earlier this year.

The New York Times has revealed details of the case, and Character.AI has publicly apologized. The company has tightened its safety measures – deleting many of its custom models and adding new restrictions – but many of its users are enraged.

One Redditor writes:

“Every theme that isn’t considered ‘child-friendly’ has been banned, which severely limits our creativity and the stories we can tell, even though it’s clear this site was never really meant for kids in the first place. The characters feel so soulless now, stripped of all the depth and personality that once made them relatable and interesting. The stories feel hollow, bland, and incredibly restrictive. It’s frustrating to see what we loved turned into something so basic and uninspired.”

The relationship we have with AI is a conversation we need to have at the broadest possible scale.