Meta’s independent Oversight Board says the company can’t start rolling back its efforts to remove misinformation, especially when it comes to Covid misinformation.
The recommendation comes after Meta asked the Oversight Board, which operates separately from the social media company that owns Facebook, Instagram, and WhatsApp, to review “whether it should continue to remove certain categories of COVID-19 misinformation, or whether a less restrictive approach would better align with its values and human rights responsibilities,” according to the Board’s release.
The organization was clear in its response: “…as long as the World Health Organization (WHO) continues to declare COVID-19 an international public health emergency, Meta should maintain its current policy.”
This means continuing to remove any Covid misinformation “likely to directly contribute to the risk of imminent and significant physical harm,” while also preparing for what to do if and when the WHO lifts the international public health emergency designation. But that doesn’t mean the Oversight Board isn’t recommending any changes. It did say that Meta should reassess the 80 Covid-related claims that the company removes. In its response, the Board noted that Meta has not consulted with public health professionals to reevaluate either the overall policy or the claims it removes.
“It should therefore begin a transparent process to regularly review the 80 claims subject to removal, consulting with a broad range of stakeholders,” the recommendation clarified. “Only when stakeholders provide clear evidence of the potential of a claim to cause imminent physical harm is it justifiable to include it on the list of claims subject to removal. Meta should share with the public the outcomes of these periodic reviews.”
Within this, the Oversight Board stated that Meta should provide transparency on removals. Because the reality of the Covid pandemic has changed since early 2020, the Board noted that Meta should continue to consider whether the 80 claims it removes still align with its reasoning for removal, and whether each still poses a risk of imminent and significant physical harm. Claims that do not meet that threshold should not be included on the list of claims to remove, and, as noted above, the results of periodic reviews of the list should be shared with the public.
“To meet its human rights responsibilities, Meta must also make sure its rules are clear to users. To this end, the company should explain how each category of the COVID-19 claims it removes directly contributes to the risk of imminent physical harm. It should also explain the basis of its assessment that a claim is false and create a record of any changes made to the list of claims it removes,” the Oversight Board said in its statement.
This doesn’t mean letting misinformation that does not meet such a specification run rampant, either. It noted that such content can be sent to third-party fact-checkers, reviewed, and labeled with context. These posts can also be “demoted,” appearing lower in people’s feeds.
In the same vein, the independent organization advocated for another, less direct way of combating Covid misinformation. It stated that Meta should support independent research into and understanding of Covid misinformation, namely by making its data on the subject more accessible.
Image credits: Header photo licensed via Depositphotos.