Report: Meta Struggling to Stop Child Sex Trafficking on Facebook and Instagram


A new report by The Guardian reveals that Facebook and Instagram have become marketplaces for child sex trafficking, and that parent company Meta is struggling to take sufficient action to prevent criminals from using its platforms.

In a recently published investigation conducted over two years, The Guardian interviewed more than 70 sources, including survivors and prosecutors across the U.S., to understand how sex traffickers use Facebook and Instagram.

The publication attempts to determine why Meta is able to deny legal responsibility for the trafficking that takes place on its platforms.

Facebook is the Social Media Platform Most Used by Child Sex Traffickers

Under international law, children cannot legally consent to any kind of sex act. Anyone who profits from or pays for a sex act from a child, including profiting from or paying for photographs depicting sexual exploitation, is therefore considered a human trafficker.

In the two decades since the birth of social media, child sexual exploitation has become one of the biggest challenges facing tech companies. Human traffickers use these platforms to reach customers and find child victims.

According to a report by the U.S.-based nonprofit the Human Trafficking Institute, Facebook was the platform most used by sex traffickers to groom and recruit children (65%), based on an analysis of 105 federal child sex trafficking cases from a single year. The HTI analysis ranked Instagram as the second most-used platform, with Snapchat coming in third.

Traffickers can use social media platforms like Facebook and Instagram to post publicly about the girls they have available, then switch to private direct messages to discuss prices and locations with buyers.

Meta has numerous policies in place to try to prevent sex trafficking on its platforms. However, while Meta says it is doing all it can, The Guardian says it has seen evidence suggesting the company is failing to report, or even detect, the full extent of what is happening.

Tina Frundt, the founder of Courtney’s House, is one of Washington DC’s most prominent specialists in countering child trafficking and has worked with hundreds of young people who have suffered terrible exploitation at the hands of adults.

Frundt tells the publication that she is constantly asking Instagram to close down accounts and remove exploitative content of the children in her care.

A Meta spokesperson tells The Guardian that it has diligently responded to all requests from Courtney’s House. However, the company’s ability “to remove content or delete accounts requires sufficient information to determine that the content or user violates our policies.”

Frundt also had discussions with Instagram in 2020 and 2021 about conducting staff training to help prevent child trafficking on its platforms. She says the training did not proceed after Instagram executives refused to pay Frundt her standard fee of $3,000, allegedly offering $300 instead.

According to The Guardian, Meta did not deny Frundt’s claims.

No Legal Requirements

Another problematic issue is that, unlike child sexual abuse imagery, there is no legal requirement in the U.S. to report child sex trafficking to the National Center for Missing & Exploited Children (NCMEC).

This legal inconsistency is a major problem, and the NCMEC must rely on social media companies to be proactive in searching for and reporting sex trafficking.

An NCMEC spokesperson told The Guardian that if social media companies are not reporting child sex trafficking, it allows the crime to thrive online.

According to records disclosed in a subpoena request seen by The Guardian, Meta reported only three cases of suspected child sex trafficking in the U.S. to the NCMEC between 2009 and 2019.

Not Enough Action

In addition to automated detection software, Meta uses teams of human moderators to identify cases of child grooming and sex trafficking.

However, an anonymous Meta subcontractor tells The Guardian that months often pass before any action is taken against a child groomer or sex trafficker after a moderator passes a case on to the company. In other cases, no action is taken at all.

The Guardian spoke to six moderators who worked for companies subcontracted by Meta between 2016 and 2022, all of whom made similar claims.

“On one post I reviewed, there was a picture of this girl that looked about 12, wearing the smallest lingerie you could imagine,” one former moderator recalls.

“It listed prices for different things explicitly, like, a blowjob is this much. It was obvious that it was trafficking.”

However, she claims that her supervisor later told her no further action had been taken in this case.

When The Guardian put these claims to Meta, a spokesperson replied that moderators do not typically get feedback on whether their flagged content has been escalated. Meta emphasized that if a moderator does not hear back about a flagged case, that does not mean no action has been taken.

Five of the moderators interviewed claimed that it was harder to get cases escalated or content taken down if it was posted on closed Facebook groups or Facebook Messenger.

While the law requires Meta to report any child exploitation imagery detected on its platforms, the company is not legally responsible for crimes that occur on its platforms because of Section 230 of the Communications Decency Act, a law created almost thirty years ago.

Meta’s Response

“The exploitation of children is a horrific crime — we don’t allow it and we work aggressively to fight it on and off our platforms,” a Meta spokesperson tells The Guardian in response to the allegations in the extensive report.

“We proactively aid law enforcement in arresting and prosecuting the criminals who perpetrate these grotesque offenses. When we are made aware that a victim is in harm’s way, and we have data that could help save a life, we process an emergency request immediately.”

The full report is available to read on The Guardian’s website.


Image credits: Header photo licensed via Depositphotos.
