Research shared with the Wall Street Journal shows that Mark Zuckerberg’s Instagram floods the accounts of children as young as 13 with sexually suggestive videos within minutes of those accounts being created, contradicting the company’s claims that it prioritizes age-appropriate content.
The Wall Street Journal reports that recent tests conducted by the publication and Northeastern University computer-science professor Laura Edelson reveal that Instagram continues to push adult content to underage users, despite assertions from its parent company, Mark Zuckerberg’s Meta, that it aims to give teenagers a safer experience. Over a seven-month period ending in June, the researchers created new accounts with ages listed as 13 and observed the video recommendations served by Instagram’s Reels feed.
From the outset, Reels served a mixture of videos that included moderately racy content, such as women dancing seductively or posing suggestively. When these test accounts skipped other types of clips but fully watched the suggestive ones, the algorithm began recommending increasingly explicit content. The study found that within as little as three minutes, adult-oriented creators appeared in the feeds, and within 20 minutes, the test accounts were dominated by promotions from these creators.
Meta has dismissed these findings, with spokesman Andy Stone calling the tests “an artificial experiment that doesn’t match the reality of how teens use Instagram.” Stone said the company’s ongoing efforts have significantly reduced teenagers’ exposure to sensitive content. “As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months,” he said.
However, internal and external analyses suggest otherwise. A 2022 internal document reviewed by the Journal found that Instagram shows more pornography, gore, and hate speech to young users compared to adults. The document revealed that teenagers reported exposure to bullying, violence, and unwanted nudity at significantly higher rates than older users. Teens encountered three times as many prohibited posts containing nudity, 1.7 times as much violent content, and 4.1 times as much bullying content compared to users over 30, the analysis showed.
These patterns were echoed in the manual tests run by the Journal and Edelson, which took care not to follow or search for specific content so as not to influence the algorithm. The new accounts started with a mix of conventional videos, such as comedy, cars, and stunts. After a few short sessions, however, that initially wholesome content gave way to a steady stream of sexually themed videos showing women performing provocatively or graphically describing their anatomy.
Notably, teen and adult test accounts on Instagram received sexually suggestive content at similar rates, highlighting a lack of age-based differentiation in content recommendations. In some cases, Instagram even recommended videos marked as “disturbing” to teen accounts, which Meta attributed to an error.
Meta officials have debated how to tackle the problem of age-inappropriate content. In a meeting last year, top safety staffers, including Instagram head Adam Mosseri, discussed reducing the frequency with which such content is shown to minors. A proposal to build an entirely separate recommendation system for teens was raised, but Meta has not pursued it.
Read more at the Wall Street Journal here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.