Meta, Snap, TikTok to remove self-harm content

Sept. 12 (UPI) — Three of the biggest social media platforms are teaming up to address online content that features suicide and self-harm, Meta announced Thursday.

Meta, the owner of Facebook, Instagram and WhatsApp, has teamed up with Snap and TikTok to form Thrive, an initiative designed to destigmatize mental health issues and slow the viral spread of online content featuring suicide or self-harm, Meta said in a blog post.

“Suicide and self-harm are complex mental health issues that can have devastating consequences,” Meta said in its release.

“We’re prioritizing this content because of its propensity to spread across different platforms quickly,” Antigone Davis, Meta’s global head of safety, wrote in the post. “These initial signals represent content only, and will not include identifiable information about any accounts or individuals.”

The initiative was formed in conjunction with The Mental Health Coalition, a group of mental health organizations working to destigmatize these issues.

Meta, Snap and TikTok will share "signals" with one another, allowing them to compare notes, investigate and take action when similar content appears on their apps. Thrive will serve as a database that all the participating social media companies can access.
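Meta's post does not describe how Thrive stores or exposes these signals, but the arrangement can be pictured as a small shared store that each platform writes flagged-content signals into and reads back from. The sketch below is a hypothetical Python illustration; the class and method names are invented for the example.

```python
# Hypothetical sketch of a Thrive-style shared signal store.
# The real service's API, storage, and access controls are not described in Meta's post.
from dataclasses import dataclass, field


@dataclass
class SignalDatabase:
    """Shared store of content signals that participating platforms read and write."""
    # Maps a content hash to the set of platforms that have reported it.
    signals: dict[str, set[str]] = field(default_factory=dict)

    def submit(self, content_hash: str, platform: str) -> None:
        """A platform reports the hash of violating suicide/self-harm content."""
        self.signals.setdefault(content_hash, set()).add(platform)

    def seen_elsewhere(self, content_hash: str, platform: str) -> bool:
        """Check whether any other platform has already flagged this content."""
        return bool(self.signals.get(content_hash, set()) - {platform})


# Example: one platform flags a piece of content; another checks the database later.
db = SignalDatabase()
db.submit("d2f1c3a9", "meta")                     # illustrative hash value
print(db.seen_elsewhere("d2f1c3a9", "tiktok"))    # True -> review and remove if it matches
print(db.seen_elsewhere("d2f1c3a9", "meta"))      # False -> only the reporting platform has seen it
```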

Thrive is built on technology developed for Lantern, a program designed to make technology safe for minors. Amazon, Apple, Google, Discord, OpenAI and others are part of that coalition. Meta made clear in its release that it is targeting content, not users.


The social media companies will be responsible for reviewing and taking any necessary action through Thrive, and for writing a yearly report to measure the program’s impact.

Meta said that when content featuring self-harm or suicide is identified, it will be assigned a number, or "hash," which the other social media companies can then cross-check to find and remove the same content from their own platforms.
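The post does not say which hashing scheme Thrive uses, so the sketch below stands in a plain SHA-256 digest to illustrate the idea: a platform shares only a fingerprint of the violating content, and another platform can compare fingerprints without the content itself ever being exchanged. Production systems typically also use perceptual hashes so re-encoded copies of an image or video still match; everything here is illustrative.

```python
# Illustrative hashing and cross-check step. SHA-256 is an assumption for the example;
# Meta's post does not specify the algorithm Thrive actually uses.
import hashlib


def content_hash(content: bytes) -> str:
    """Return a hex digest that can be shared as a signal without sharing the content."""
    return hashlib.sha256(content).hexdigest()


def should_review(content: bytes, shared_hashes: set[str]) -> bool:
    """True if this content matches a hash another platform has already flagged."""
    return content_hash(content) in shared_hashes


# Example cross-check against hashes received through the shared database.
flagged = {content_hash(b"example flagged media bytes")}
print(should_review(b"example flagged media bytes", flagged))  # True -> queue for review/removal
print(should_review(b"unrelated media bytes", flagged))        # False -> no match
```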

Increased social media use by minors has caused a spike in depression and suicidal behavior, the Mental Health Coalition said. Research also suggests that young people who self-harm are more active on social media.

Earlier this year, Meta announced it would begin removing and limiting sensitive content deemed "age-inappropriate" from teenagers' feeds on its apps. The company said it also planned to hide search results and terms relating to suicide, self-harm and eating disorders for all users.

Meta, TikTok, Snapchat and other social media platforms have long been criticized for failing to remove content deemed harmful to teens, including videos and images of self-harm.
