The nipple may soon be freed, at least on Facebook and Instagram. The oversight board for Meta, the parent company of both social media networks, advised the company to change its policy regarding bare chests.
Currently, only men (or those who the algorithm assumes are men) are allowed to show their nipples in photos, but women (again, those the algorithm assumes are women) are not.
The oversight board, which is made up of individuals from around the world with backgrounds in law, technology, human rights, politics, and other fields, announced the recommendation Tuesday after revealing that it had overturned Meta’s decision to remove two posts under the policy. The posts, from 2021 and 2022, came from the same account and featured the same couple, who are transgender and nonbinary.
“Both posts feature images of the couple bare-chested with the nipples covered,” a release from the oversight board read. “The image captions discuss transgender healthcare and say that one member of the couple will soon undergo top surgery (gender-affirming surgery to create a flatter chest), which the couple are fundraising to pay for.”
The posts were removed for featuring breasts and a fundraising link, but after the couple appealed the decision, the oversight board looked at the case and said the removal was made in error.
The case led the oversight board to the same conclusion that “Free the Nipple” advocates have argued for years: the policy doesn’t make sense and doesn’t work in practice.
“The restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and non-binary people. Exceptions to the policy range from protests, to scenes of childbirth, and medical and health contexts, including top surgery and breast cancer awareness,” the release reads, adding that the “convoluted and poorly defined” exceptions make the policy “unworkable in practice.”
“Here, the Board finds that Meta’s policies on adult nudity result in greater barriers to expression for women, trans, and gender non-binary people on its platforms. For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show,” the release states.
The full recommendation asks Meta to perform a “comprehensive human rights impact assessment” and create a plan to address harms identified before defining a “clear, objective, rights-respecting criteria to govern its Adult Nudity and Sexual Activity Community Standard, so that all people are treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender.”
Further, the board said, the criteria for removal under the community standard should offer more detail, and moderator guidance should be updated to reflect the new standards.
It’s Meta’s move now.