Instagram Reels served ‘risqué footage of children’ next to ads for major companies: report

Instagram’s Reels video feed reportedly recommends “risqué footage of children as well as overtly sexual adult videos” to adult users who follow children – with some of the disturbing content placed next to ads from major companies.

In one instance, an ad promoting the dating app Bumble was sandwiched between a video of a person caressing a “life-size latex doll” and another clip of an underage girl exposing her midriff, according to the Wall Street Journal, which set up test accounts to probe Instagram’s algorithm.

In other cases, Mark Zuckerberg’s Meta-owned app showed a Pizza Hut commercial next to a video of a man lying in bed with a purported 10-year-old girl, while a Walmart ad was displayed next to a video of a woman exposing her crotch.

The shocking results were revealed as Meta faces a sweeping legal challenge from dozens of states alleging the company has failed to prevent underage users from joining Instagram or to shield them from harmful content.

It also comes on the heels of dozens of blue-chip firms pulling their advertising from Elon Musk’s X platform after their promos appeared next to posts touting Adolf Hitler and the Nazi party. The exodus is reportedly expected to cost the site formerly known as Twitter as much as $75 million in revenue this year.

Meta claimed that just a fraction of its video views contain content that violates its policies. REUTERS

Meta now faces its own advertiser revolt after some companies cited in the study suspended ads on all its platforms, which include Facebook, following Monday’s report by the Journal.

The Journal’s test accounts followed “only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform.”

The outlet found that the followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts that followed those children had also demonstrated interest in sex content related to both children and adults.

The Reels feed presented to test accounts became even more disturbing after the Journal’s reporters followed adult users who were already following children-related content.

The algorithm purportedly displayed “a mix of adult pornography and child-sexualizing material, such as a video of a clothed girl caressing her torso and another of a child pantomiming a sex act.”

When reached for comment, a Meta spokesperson argued the tests were “a manufactured experience” that does not reflect the experience of most users.

“We don’t want this kind of content on our platforms and brands don’t want their ads to appear next to it,” a Meta spokesperson said in a statement. “We continue to invest aggressively to stop it – and report every quarter on the prevalence of such content, which remains very low.”

“Our systems are effective at reducing harmful content and we’ve invested billions in safety, security and brand suitability solutions,” the spokesperson added. “We tested Reels for nearly a year before releasing it widely – with a robust set of safety controls and measures.”

Meta noted that it has approximately 40,000 employees globally dedicated to ensuring the safety and integrity of its platforms.

Instagram is already under fire for allegedly failing to protect teen users from harmful content. fizkes – stock.adobe.com

The company asserted that the spread of such content is relatively small, with just three to four views of posts that violate its policies per every 10,000 views on Instagram.

However, current and former Meta employees reportedly told the Journal that the tendency of the company’s algorithms to present child sex content to users was “known internally to be a problem” even before Reels was released in 2020 to compete with popular video app TikTok.

The Journal’s findings followed a June report by the publication that revealed Instagram’s recommendation algorithms fueled what it described as a “vast pedophile network” that advertised the sale of “child-sex material” on the platform.

That report prompted Meta to block access to thousands of additional search terms on Instagram and to set up an internal task force to crack down on the illegal content.

Nonetheless, several major companies expressed outrage or disappointment over the company’s handling of their ads – including Match Group, the parent company of Tinder, which has reportedly pulled advertising for all of its major brands from Meta-owned apps.

Most companies sign deals stipulating that their ads should not appear next to sexually charged or explicit content.

“We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content,” Match spokeswoman Justine Sacco said in a statement.

The Wall Street Journal set up test accounts to examine the Instagram Reels algorithm. AP

Bumble spokesman Robbie McKay said the dating app “would never intentionally advertise adjacent to inappropriate content” and has since suspended advertising on Meta platforms.

A Disney representative said the company had brought the problem to the “highest levels at Meta” to be addressed, while Hinge said it will push Meta to take more action.

The Canadian Center for Child Protection, a nonprofit dedicated to child safety, purportedly got similar results after conducting its own tests. The Post has reached out to the group for comment.

“Time and time again, we’ve seen recommendation algorithms drive users to discover and then spiral inside of these online child exploitation communities,” the center’s executive director Lianna McDonald told the Journal.
