Meta’s New Open-Source Content Moderation Tool Scans for Terrorism and Photo-Based Abuse
Meta has announced a new open-source content moderation tool that allows platforms to identify and remove a range of “violating content” at scale, including terrorist content, child exploitation material, and other types of abuse.
The software is being made available for free to help platforms identify copies of known violating images or videos and take action against them en masse, Meta announced on its website today.
“We hope the tool — called Hasher-Matcher-Actioner (HMA) — will be adopted by a range of companies to help them stop the spread of terrorist content on their platforms, and will be especially useful for smaller companies who don’t have the same resources as bigger ones,” Meta says. “HMA builds on Meta’s previous open source image and video matching software, and can be used for any type of violating content.”
As explained by Engadget, HMA creates a “hash,” or unique identifier, for each piece of content; that hash can then be saved to a shared database.
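To make the pattern concrete, here is a minimal sketch of the hasher-matcher-actioner flow in Python. Everything in it is illustrative: the database is a hypothetical in-memory set, and a cryptographic SHA-256 hash stands in for the perceptual hashes (such as Meta’s PDQ) that the real system builds on, which match near-duplicates rather than only exact copies.

```python
import hashlib

# Hypothetical stand-in for a shared database of hashes of known
# violating content. In a real deployment this would be an
# industry-shared hash list, not an in-memory set.
KNOWN_VIOLATING_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_content(data: bytes) -> str:
    """Hasher: compute a fingerprint of the uploaded content.

    SHA-256 is used here only for illustration; it matches exact
    copies only, whereas perceptual hashes stay similar under
    resizing or re-encoding.
    """
    return hashlib.sha256(data).hexdigest()

def matches_known_content(fingerprint: str) -> bool:
    """Matcher: check the fingerprint against the shared database."""
    return fingerprint in KNOWN_VIOLATING_HASHES

def moderate(data: bytes) -> str:
    """Actioner: decide what to do with an upload based on the match."""
    if matches_known_content(hash_content(data)):
        return "remove"  # or queue for human review
    return "allow"

print(moderate(b"example upload bytes"))  # -> "allow"
```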
“Instead of storing harmful or exploitative content like videos from violent attacks or terrorist propaganda, GIFCT stores a hash, or unique digital fingerprint for each image and video,” Meta explains. The database becomes more comprehensive as more companies contribute to it.
“The more companies participate in the hash sharing database the better and more comprehensive it is — and the better we all are at keeping terrorist content off the internet, especially since people will often move from one platform to another to share this content,” Meta continues. “But many companies do not have the in-house technology capabilities to find and moderate violating content in high volumes, which is why HMA is a potentially valuable tool.”
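Copies that hop between platforms are rarely byte-identical, since they get resized, cropped, or recompressed along the way, so perceptual hashes are compared by similarity rather than equality. Below is a minimal sketch of that comparison, assuming fixed-length bit-string hashes like PDQ’s 256-bit output; the threshold value is an assumption based on commonly cited starting points for PDQ, not a setting taken from HMA itself.

```python
def hamming_distance(hash_a: int, hash_b: int) -> int:
    """Count the bits that differ between two fixed-length hashes."""
    return bin(hash_a ^ hash_b).count("1")

# Tunable trade-off: a lower threshold catches fewer modified copies
# but produces fewer false positives. 31 differing bits (out of 256)
# is a commonly cited starting point for PDQ, assumed here.
MATCH_THRESHOLD = 31

def is_match(candidate: int, known_hashes: list[int]) -> bool:
    """Treat the candidate as a copy if it falls within the
    threshold of any hash in the shared database."""
    return any(
        hamming_distance(candidate, known) <= MATCH_THRESHOLD
        for known in known_hashes
    )

# Hypothetical example: a re-encoded copy differs in only a few bits.
original = 0b1011_0010_1110_0001
recompressed = original ^ 0b0000_0100_0000_0001  # two bits flipped
print(is_match(recompressed, [original]))  # -> True
```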
Meta says that it spent about $5 billion globally on safety and security measures last year and that over 40,000 people work in the area, hundreds of whom are dedicated specifically to counterterrorism.
“Of course, we’re not perfect, and fair-minded people will disagree as to whether the rules and processes we have are the right ones,” Meta says. “But we take these issues seriously, try to act responsibly and transparently, and invest huge amounts in keeping our platform safe. Many of these issues go way beyond any one company or institution. No one can solve them on their own.”
Image credits: Header photo licensed via Depositphotos.