PhotoDNA Lets Google, FB and Others Hunt Down Child Pornography Without Looking at Your Photos


Earlier this week, news broke that Google had turned over to authorities a man whose emails contained an unstated amount of child pornography. And while the world at large seemed glad to have the perpetrator caught, there was some concern as to whether Google had dug through his emails to find these images, effectively killing the privacy of email.

However, it was dedicated software relying on unique hashes of sorts, not manual inspection, that led Google to out this individual. It’s called PhotoDNA, and it was developed by none other than Microsoft.


Working closely with the National Center for Missing and Exploited Children’s CyberTipline Child Victim Identification Program, Google, Facebook, Twitter, Bing, OneDrive and a number of other high-profile sites use PhotoDNA to track down illicit photos. Using a database of known images, PhotoDNA compares a hash (a compact digital fingerprint) of each image against the hashes in that database, flagging only matches to already-identified images, without a human ever actually looking through someone’s inbox.

As described in a blog post from Google:

Since 2008, we’ve used “hashing” technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. Each offending image in effect gets a unique ID that our computers can recognize without humans having to view them again. Recently, we’ve started working to incorporate encrypted “fingerprints” of child sexual abuse images into a cross-industry database. This will enable companies, law enforcement and charities to better collaborate on detecting and removing these images, and to take action against the criminals.
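PhotoDNA’s actual algorithm is proprietary (it computes a “robust hash” designed to survive resizing and recompression), but the basic matching idea can be sketched with an ordinary cryptographic hash. This is a simplified stand-in, not PhotoDNA itself — a cryptographic hash only catches exact byte-for-byte duplicates, and the database entries here are purely illustrative:

```python
import hashlib

# Hypothetical database of hashes of known, already-flagged images.
# In practice this would be populated from a shared cross-industry list.
KNOWN_IMAGE_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if this image's hash matches a database entry.

    Note that the service only computes a fingerprint and checks it
    against the list -- no human views the image, and images that are
    not in the database simply never match.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES
```

A copy of a flagged image matches (`is_known_image(b"example-known-image-bytes")` is `True`), while any ordinary photo does not — which is why this approach can scan every upload without “reading” anyone’s private pictures.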

As mentioned above, Google isn’t the only service utilizing PhotoDNA. After reports of Google’s use of the tool came out, Facebook also confirmed that it keeps a lookout for sexually exploitative photos of children. Speaking with SlashGear, a Facebook spokesperson said, “There is no place for child exploitative content on Facebook. We use PhotoDNA to check that each image which is uploaded to our site is not a known child abuse image.”

This sort of technology extends far beyond the scope of child exploitation; it’s almost identical to the method Dropbox uses to detect copyrighted content being shared across its servers. However, Google has stated that it only uses this technology to track cases of child sexual abuse.

So, while we were most definitely relieved to see the offender tracked down and turned over, we can now also rest assured that Google isn’t systematically going through our inboxes, searching through our private images.
