Instagram Serves Up ‘Risqué Footage of Kids’ to Adults Who Follow Children


An investigation has found that the Instagram algorithm pushes suggestive footage of children and overtly sexual videos to adults on the platform who only follow children.

The report by The Wall Street Journal suggests that Instagram is pushing perverted content to people who may well be perverts, interspersed with advertisements for major brands, some of which have since paused ad spending with Meta in response.

Methodology

The Journal set up Instagram accounts on newly purchased smartphones and followed only teen and preteen influencers, as well as young gymnasts and cheerleaders.

The paper claims that the Meta-owned platform served up “jarring doses of salacious content” to those accounts, and that the stream grew even worse once the test accounts also began following users who followed the young influencers.

The inquiry was apparently sparked after The Journal noticed that the thousands of followers of these young people’s accounts included “large numbers of adult men,” and that “many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults.”

Results

The Journal highlights its findings and how they relate to big advertisers; in one case, a video of a man lying on a bed with his arms around a “10-year-old girl” preceded a Pizza Hut commercial.

The Canadian Centre for Child Protection independently ran its own tests and got similar results.

An expert on algorithms tells The Journal that Meta’s behavioral tracking has learned that some Instagram users who follow young girls want to see videos that sexualize children, and it therefore serves them that content.

“Niche content provides a much stronger signal than general interest content,” says Jonathan Stray, senior scientist for the Center for Human-Compatible Artificial Intelligence at the University of California, Berkeley.
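Stray’s point, that rare interests dominate what a recommender learns, can be illustrated with a toy sketch. The Python below is a hypothetical illustration of inverse-popularity weighting (similar to TF-IDF), not Meta’s actual system; all categories and data are invented. It shows how a single niche follow can outweigh an equally frequent general-interest follow in a learned interest profile.

```python
# Toy sketch of inverse-popularity weighting in an interest profile.
# Hypothetical illustration only -- NOT Meta's actual ranking system.
import math
from collections import Counter

# Invented interaction log: (user, category_of_followed_account)
interactions = [
    ("u1", "sports"), ("u1", "sports"), ("u1", "music"),
    ("u1", "niche_x"),  # a single niche follow
    ("u2", "sports"), ("u2", "music"), ("u2", "travel"),
]

TOTAL_USERS = 2

# Popularity = number of distinct users who follow each category
popularity = Counter()
for user, category in {(u, c) for u, c in interactions}:
    popularity[category] += 1

def niche_weight(category: str) -> float:
    """Inverse-popularity (IDF-style) weight: rarer categories score higher."""
    return math.log((1 + TOTAL_USERS) / (1 + popularity[category])) + 1.0

# Build u1's interest profile: interaction count scaled by niche weight
profile = Counter()
for user, category in interactions:
    if user == "u1":
        profile[category] += niche_weight(category)

# The lone niche follow outranks the equally frequent "music" follow
for category, score in profile.most_common():
    print(f"{category}: {score:.2f}")
# sports: 2.00, niche_x: 1.41, music: 1.00
```

Because the weight grows as popularity shrinks, a system tuned this way treats a user’s rarest interactions as its strongest signal about what to recommend next, which is Stray’s point in a nutshell.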

Fallout

Match, the company behind dating apps like Tinder, has paused all of its Reels advertising, saying, “We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content.”

Disney ads also appeared among the dubious content, and the company says it is addressing the issue at the “highest levels at Meta.”

Meta says that the test does not represent what billions of its users see, but it declined to comment on why its algorithms compiled videos showing children, sex, and advertisements.


Image credits: Header photo licensed via Depositphotos.
