Meta has effectively shut down its AI chatbot characters for teenagers until further notice. The decision follows months of pressure from regulators and criticism of the company's safety record. In 2023, reports surfaced of some character chatbots engaging in unsuitable conversations with minors, including discussions of sex and other sensitive topics.
The controversy led Meta to retrain its chatbots, adding safeguards designed to keep them from engaging in conversations about self-harm, disordered eating, and suicide. It appears, however, that those changes haven't been enough to restore the trust of regulators and parents.
As of this week, any teenager attempting to interact with a character chatbot will be blocked from doing so until "the updated experience is ready". Meta says the restriction won't affect teenagers who use its official AI chatbot platform, which already includes age-appropriate restrictions.
Industry experts point out that concerns over the safety of "companion" characters are becoming increasingly pertinent. Several regulatory bodies and law enforcement agencies have launched investigations into Meta and similar companies, citing potential risks to minors.
It remains unclear how long this restriction will last or when updated chatbot parental controls will be rolled out.