Meta on Tuesday said it was tightening content restrictions for teens on Instagram and Facebook as it faces increased scrutiny over claims that its platforms harm young people.
The changes come months after dozens of US states accused Meta of damaging the mental health of children and teens, and misleading users about the safety of its platforms.
In a blog post, the company run by Mark Zuckerberg said it will now “restrict teens from seeing certain types of content across Facebook and Instagram even if it’s from friends or people they follow.”
This includes content discussing suicide or self-harm, as well as nudity and mentions of restricted goods, the company added.
Restricted goods on Instagram include tobacco products and weapons as well as alcohol, contraception, cosmetic procedures and weight loss programs, according to its website.
In addition, teens will now be defaulted into the most restrictive settings on Instagram and Facebook, a policy that was already in place for new users and will now be extended to existing ones.
This will “make it more difficult for people to come across potentially sensitive content or accounts in places like Search and Explore,” the company said.
Meta also said it will expand its policy of hiding results for searches related to suicide and self-harm to cover more terms.
Internal Meta research, leaked by whistle-blower Frances Haugen and reported by the Wall Street Journal, has shown that the company was long aware of the dangers its platforms pose to young people's mental health.
On the platforms, teens are defined as users under 18, based on the date of birth they provide when signing up.