Big players opt out of advertising on Instagram after the platform fails a child predator test
Two major dating app companies – Bumble and Match – have paused their Instagram advertising. The reason: Instagram failed tests that mimicked the behavior of child predators, serving ads alongside sexually explicit material (via 9to5Mac).
Others affected include Disney, Pizza Hut, and Walmart. As you know, these giants demand that social media platforms keep their ads away from inappropriate content such as hate speech and sexually explicit material.
The Wall Street Journal conducted a substantial experiment, which it summarizes like this: “Instagram’s Algorithm Delivers Toxic Video Mix to Adults Who Follow Children”.
WSJ “sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform. Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos—and ads for some of the biggest U.S. brands”, the report says.
The report goes on:

In a stream of videos recommended by Instagram, an ad for the dating app Bumble appeared between a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting up her shirt to expose her midriff. In another, a Pizza Hut commercial followed a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl.

The Canadian Centre for Child Protection separately ran similar tests of its own, with similar results.
Meta said these tests produced “a manufactured experience that doesn’t represent what billions of users see” and declined to comment on why the algorithms compiled streams of separate videos showing children, sex and advertisements.
Match began canceling Meta advertising for some of its apps, such as Tinder, as early as October 2023. Also, Match has halted all Reels advertising and stopped promoting its major brands on any of Meta’s platforms. “We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content,” said Match spokeswoman Justine Sacco.
Robbie McKay, a spokesman for Bumble, said it “would never intentionally advertise adjacent to inappropriate content,” and that the company is suspending its ads across Meta’s platforms.