Meta’s Trusted Partner Program is Failing Miserably: Report

Meta's Trusted Partner program is failing

A new study by Internews shows that Meta’s Trusted Partner program is coming up woefully short in its mission to protect people.

As reported by The Verge, the new report claims that, according to some members of Meta's Trusted Partner program, the program is understaffed, lacks the resources to live up to its promises, and may even be costing people their lives.

What is Meta’s Trusted Partner Program?

That is a lofty claim, so some background is in order on what Meta's Trusted Partner program is, or at least what it is designed to be.

Per Internews, Meta's network of Trusted Partners comprises more than 460 non-governmental organizations (NGOs), humanitarian agencies, human rights defenders, and researchers from 122 countries. Partners range from local to global organizations, Internews among them.

“We partner with expert organizations that represent the voices and experiences of marginalized users around the globe and are equipped to raise questions and concerns about content on Facebook and Instagram to: Address problematic content trends and prevent harm, foster online safety and security, and inform the development of effective and transparent policies,” Meta promises.


Meta Fails to Deliver

According to Internews and many sources in the Trusted Partners program, Meta is not achieving its goals, costing marginalized people dearly.

The program's shortcomings do not appear to stem from a lack of effort on the part of Meta employees so much as a lack of employees themselves.

“The new research found that many of the most severe operational failures of the Trusted Partner program appear to relate directly to a lack of resourcing and are further exacerbated by recent layoffs at Meta,” Internews explains.

These issues may get worse as Meta works to expand its sphere of influence with its new Twitter (or X) competitor, Threads.

“Trusted flagger programs are vital to user safety, but Meta’s partners are deeply frustrated with how the program has been run. As Meta launches Threads to be a ‘public square for communities,’ can we trust the company to operate it responsibly? This research suggests more investment is needed to ensure Meta’s platforms are safe for users,” says Rafiq Copeland, Platform Accountability Advisor at Internews and author of the new report.


Key Findings in the Report

In the study Safety at Stake: How to Save Meta’s Trusted Partner Program, Copeland outlines that Meta advertises the Trusted Partner program as an emergency channel but fails to provide the necessary resources for the network to respond appropriately in emergencies.

According to many program members, in some cases, Meta takes weeks or months to respond to reports, even in situations where partners outline the risk of significant and imminent harm.

Copeland says that roughly 33 Trusted Partner reports are submitted per day, but Meta has been unable to say how many employees work full-time on the program or on processing those reports.

“We and other Trusted Partners stand ready to work productively with Meta to address the issues raised in this review,” Copeland says. “It is our hope that this program and others like it can be reinvigorated. People’s lives depend on it.”

Internews was one of the first members of Meta’s program when it joined in 2018. Since then, the Trusted Partner program has experienced “consistent issues with responsiveness, accessibility, transparency, and accountability to its partners.”

Disproportionate Responses Based on Region

The full report is lengthy and worth reading, but some highlights are essential to consider.

“The Trusted Partner Channel saves lives. It is literally a lifeline. And right now it is broken,” explains one Meta Trusted Partner.

Another adds, “I wish we’d never joined to be honest. It is worse than before. Often, we never receive any response, and when we do it can be five months later. It is worse than nothing.”

“Trust? What Trust? They don’t trust us, and so we don’t trust them. There is no trust,” says another partner.

These partners were presumably not reporting on the war in Ukraine, because Internews explains that reports concerning that conflict seem to receive relatively prompt replies.

“Whilst Ukrainian partners can expect a response within 72 hours, in Ethiopia equivalent reports relating to the Tigray War can go unanswered for several months,” Copeland says.

Of the 24 Trusted Partners who contributed to Internews’ report, the only one who says they have never failed to receive a response to a report from Meta is a Ukrainian partner. It would be nice if everyone could expect this response rate, no matter where they are located.


Lives on the Line

“Given the potentially life-and-death nature of the issues reported through the Trusted Partner program, it is notable that Meta has failed to meaningfully address these issues or consult widely and formally with the Trusted Partner organizations participating in the program. Internews has raised these issues with Meta regularly since 2019. In 2021 Internews proposed to conduct a joint review of the program in active collaboration with Meta, a proposal which Meta initially agreed to and then eventually (in 2022) declined,” Copeland explains.

Members of the Trusted Partner program put themselves at great risk to participate. They assume an organizational and operational burden to work with Meta and its large influence to save people’s lives in emergencies.

The time that members spend gathering information, writing reports, and chasing down assistance from Meta is time that they could spend on other ways to protect people. Further, working with Meta, or at least trying to, can put a target on people’s backs. When someone is reporting on extremist activity in war-ravaged regions, they are themselves at risk.

Consider partners in Ethiopia, where the Tigray War resulted in the deaths of an estimated 600,000 civilians in 2021 and 2022, making it one of the deadliest conflicts of the 21st century.

Partners in Ethiopia say it takes weeks or months to receive any sort of response to the reports they submit. The reports concern topics such as direct threats, disinformation, incitement, poor policy enforcement on social media, compromised accounts that disseminate important information, and much more.

These are vital issues, and it is unclear why they are not receiving the same treatment that equivalent reports from Ukraine receive.

Lack of Transparency

The poor response times to nearly all reports are problematic, but perhaps as concerning is that Meta is not being transparent about its program or why it is allegedly failing.

“We acknowledge the variety of Partner experiences documented in the Report, and we are committed to continue improving training resources and ingestion systems to address these outliers and strengthen the program. However, the reporting issues of the small sample of Trusted Partners who contributed to the Report do not, in our view, represent a full or accurate picture of the program,” Meta tells Internews.

Meta has failed to provide data that supports its claim that Internews’ report represents outliers.

To its credit, Meta has also said that the company is working to “develop new methods of sharing information about the overall impact and performance of the Trusted Partner program.” However, Meta adds the caveat that data protection laws may affect the information it can share, which Internews believes may be deliberate misdirection.

Meta blamed poor response times within its Trusted Partner program on the COVID pandemic. The fact that response times remain poor now lends little credence to that defense.

Meta has taken a closed-door approach with its Trusted Partner program.

‘Most Partners Feel That Meta’s Communication is Often Perfunctory and Dismissive’

When Meta does respond, some partners interviewed by Internews suggest that replies are dismissive.

“They treat it like a privilege to have this communication with them. They explicitly said that it is a ‘privilege’ that we have this connection with them. They said that to us. It is because they view it as a privilege they feel they don’t have to respond,” one partner tells Internews.

Some partners receive correspondence indicating that the person responding did not understand the report in the first place. Some Meta replies, even ones that take months to arrive, come across as impersonal and formulaic.

“Waiting for someone to respond a month or so, and then they ask for more details on the report that we sent a month earlier. Which means we have to go through every link that we sent a month ago. We have to face every kind of emotional feeling again. Meanwhile people have been killed or harmed in multiple ways while the links are there and being spread,” says a Meta Trusted Partner.

The Harm of Misinformation

The COVID-19 pandemic demonstrated the harm of misinformation. As the first pandemic of such magnitude to occur during the social media age, it saw misinformation on platforms such as Facebook and Instagram affect people's mental health, emotional well-being, and the actions they were willing to take in the face of uncertainty. In some cases, people were scared off vaccines and other medical treatments. It cost people their lives.

In other situations, people use Meta's social media platforms to spread hate speech, encourage violence, and reveal critical information that puts individuals at risk of immediate harm. Part of the idea behind the Trusted Partner program is to give partners a fast, direct line to Meta staff who can remove dangerous content.

If it takes weeks or even months for someone to read a report that a partner files, let alone meaningfully act on the information, it may be far too late to prevent harm. Something must change, and it must change quickly.


A Damning Failure of the Trusted Partner Program

Before discussing Copeland and Internews’ recommendations for Meta’s Trusted Partner program, a brief digression is in order, as it directly relates to Meta’s woeful response times to partners in Ethiopia. The tragedy demonstrates the harm of misinformation and the potential benefit that the Trusted Partner program could offer, if it worked well.

During the bloody Tigray War in Ethiopia, 60-year-old Professor Meareg Amare was targeted by users on Facebook. On October 9, 2021, a post about Amare appeared, alleging that the professor, who had written chemistry textbooks for students and advocated for young people to learn more about science, had carried out abuses while hiding at Bahir Dar University in Bahir Dar, Ethiopia.

There is no evidence that Amare committed any such abuses. However, he was Tigrayan, and that meant that he was a potential victim of the ethnic cleansing that occurred during the war.

The day after that first short post appeared on Facebook, a longer one was published claiming that Professor Amare had been involved in massacring civilians in Ethiopia's Amhara region, of which Bahir Dar is the capital, and that he had embezzled money from the university to benefit Tigrayan rebel forces.

This inflammatory post also included information about where Amare and his family lived.

Responses urged action to “clean” Amare and his family.

Amare’s son, Abrham, reported the posts to no avail, and they remained up.

A Trusted Partner had also seen the posts, flagged them, and submitted reports.

Per reporting by Insider, the partner claims that Facebook told them the company was not an "arbiter of truth." Insider was unable to corroborate this troubling claim.

Professor Amare remained steadfast in his innocence and in the belief that this, along with his reputation for service within the community, would protect him from harm.

On November 3, 2021, Professor Amare was murdered outside of his home by men wearing Amhara Special Forces uniforms. He was shot in the shoulder and leg. The uniformed men surrounded Amare, reportedly chanting the slander that had been posted on Facebook, and prevented anyone from providing aid or taking Amare to the hospital.

He bled to death over seven hours, according to a lawsuit that Abrham Amare has brought against Meta.

"Eight days later, Facebook replied at last to Abrham Amare's complaint. Meta declined to discuss the case, saying it does not discuss pending litigation. But according to Abrham Amare's lawsuit, Facebook informed him that [the] post about his father was in violation of its community standards and had been removed," explains Insider.

If Copeland’s report is accurate, what happened with Professor Amare, as awful as it is, is not an outlier.

This failure of the Trusted Partner program prompted many human rights and journalism institutions to sign an open letter to Meta.


The Report’s Recommendations are Reasonable but Serious

Copeland offers Meta many recommendations to improve the Trusted Partner program and ensure it can live up to its goals. Among the suggestions are a complete program overhaul in line with requests from Trusted Partners, improved response times, more resources and staff, performance targets and transparency improvements, better emergency response, and more.

Significantly more information from Trusted Partners and Meta’s various responses to Internews’ requests are available in the full report.

Perhaps Meta is correct, and some of the report’s claims are misleading or outright wrong. However, The Wall Street Journal’s excellent investigation, “The Facebook Files,” suggests otherwise.

The reporting details many ways that people inside Meta raised concerns about how its platforms were being used to commit and encourage all kinds of atrocities, including human trafficking, inciting violence against ethnic minorities, selling organs, and encouraging action against political dissidents.

However, employees found that, in most cases, their bosses did little to nothing in response.

The Wall Street Journal also reports that internal Facebook documents show that the company has prioritized growing and maintaining its user base over preventing harmful and inflammatory content from spreading.

Despite what Meta might say, there are countless other examples of ways that Facebook has been used to harm specific groups of people. Further, Meta could have worked alongside Internews for its report on the Trusted Partner program. After all, that was the plan in 2021 before Meta pulled out late last year.

Image credits: Header photo licensed via Depositphotos.