By Lisa Eadicicco
Chatbot platform Character.AI will no longer allow teens to engage in back-and-forth conversations with its AI-generated characters, its parent company Character Technologies said on Wednesday. The move comes after a string of lawsuits alleged the app played a role in suicide and mental health issues among teens.
The company will make the change by November 25, and teens will have a two-hour chat limit in the meantime. Instead of open-ended conversations, teens under 18 will be able to create videos, stories and streams with characters.
“We do not take this step of removing open-ended Character chat lightly – but we do think that it’s the right thing to do given the questions that have been raised about how teens do, and should, interact with this new technology,” the company said in its statement.
Character.AI has been at the center of controversy over how teens and children should be permitted to interact with AI, prompting calls from online safety advocates and lawmakers for tech companies to bolster their parental controls. A Florida mother filed a lawsuit against the company last year alleging the app was responsible for the suicide of her 14-year-old son. Three more families sued the company in September, alleging that their children died by suicide, attempted suicide or were otherwise harmed after interacting with the company’s chatbots…
READ FULL ARTICLE HERE… (lite.cnn.com)