
Google to Build a Photo Database In Effort to Rid the Web of Child Porn


Search giant Google has revealed that it’s working on a global database that it hopes will help rid the Internet of child porn.

The system will allow Google to swap information with other search providers and Web services about images that have been flagged by child abuse watchdogs such as the Internet Watch Foundation. That way, once an offending image is identified, it can be scrubbed from the entire Web at once, rather than through the piecemeal approach used now.
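Neither Google nor the Telegraph has published technical details, but the usual way to swap this kind of “information” is to exchange fingerprints of flagged files rather than the files themselves. Here is a minimal sketch in Python, assuming the shared data is a set of cryptographic hashes; all names here are hypothetical:

    import hashlib

    # Hypothetical blocklist of digests contributed by watchdog groups
    # such as the Internet Watch Foundation (populated from their feeds).
    FLAGGED_HASHES: set[str] = set()

    def is_flagged(image_bytes: bytes) -> bool:
        """Check an image's SHA-256 digest against the shared blocklist."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in FLAGGED_HASHES

An exact checksum like this stops matching if an image is so much as re-saved or resized, which is presumably why systems of this kind reportedly favor robust perceptual fingerprints (Microsoft’s PhotoDNA is the best-known example) over plain hashes.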


David Drummond, chief legal officer at Google, said the company has been flagging and blocking such images since 2008. But with an industry standard for marking such content, images will no longer be able to hide in the nooks and crannies of the Web.

“This will enable companies, law enforcement, and charities to better collaborate on detecting and removing child abuse images,” Drummond told the Telegraph.

While it’s hard to imagine anyone short of NAMBLA objecting to this particular use of the technology, it does make one wonder: what could happen if a similar system were used by watchdog agencies with less universally respected goals?

Could China use the system to wipe the Web of images of Tiananmen Square and Falun Gong? Could Beyoncé scrub the Internet of images showing the diva’s mysterious pelvic abilities?

(via The Telegraph)


Image credits: Google Logo in Building43 by Robert Scoble, Stop ignoring child abuse by quinn.anya

  • Josh Zytkiewicz

    I’d be interested in how they deal with false positives, how they classify the images, and how they treat images where the age of the subject is indeterminate.

    I’ve photographed a number of women who, because of lucky genetics and lifestyle choices, look younger than they actually are. Are those types of images going to be scrubbed from the Internet by an overzealous watchdog?

  • Fuzztographer

    So, in other words, Google is building a database *of* child porn. What could possibly go wrong?

  • dave

    We are beginning to look and sound like Nazi Germany before the war. All I can say is be afraid, be very afraid.

  • geeves

    Google already does this with YouTube, and I imagine you could build it via “fingerprints” of images and/or videos, much like music identification apps like Shazam do. Shazam doesn’t host all of the music it’s capable of identifying; it’s provided fingerprints and probably other metadata. That way it avoids legal copyright issues with the labels.
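    A toy illustration of that fingerprint idea for still images is a perceptual “difference hash,” which encodes brightness gradients and therefore survives resizing and re-compression far better than an exact checksum. This is only a rough sketch of the concept, not what Google or Shazam actually use; it assumes Python with the Pillow library:

        from PIL import Image  # Pillow

        def dhash(path: str, size: int = 8) -> int:
            """Perceptual hash: 1 bit per adjacent-pixel brightness gradient."""
            # Grayscale, then shrink to (size+1) x size so each row yields
            # `size` left-vs-right brightness comparisons.
            img = Image.open(path).convert("L").resize((size + 1, size))
            pixels = list(img.getdata())
            bits = 0
            for row in range(size):
                for col in range(size):
                    left = pixels[row * (size + 1) + col]
                    right = pixels[row * (size + 1) + col + 1]
                    bits = (bits << 1) | (left > right)
            return bits

        def hamming(a: int, b: int) -> int:
            """Bits that differ; a small distance suggests the same image."""
            return bin(a ^ b).count("1")

    Two copies of the same photo, one merely re-saved or scaled down, will typically differ by only a few bits, so a service could treat anything within a small Hamming distance of a flagged fingerprint as a match.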

  • Andrew Iverson

    I’ve read too many stories of overzealous photo development places that have ruined lives, so I can’t see this ending well. I also don’t see them stopping here, as “we’re doing it for the children” always gets a pass, and then you find out it’s really much deeper than that. With the whole NSA thing, how do you keep it from doing the same to anyone who speaks ill of war or the government?

  • SiriusPhotog

    Google is already doing this with YouTube. If you post any video with even the slightest hint of a nipple showing, it gets deleted within hours. I don’t really see any difference.

  • gochugogi

    I don’t think they post their smut in public galleries for web crawlers to catalog. Don’t these pervs use closed networks, email, SMS, IM and FedEx to do their dirty deeds?

  • rankin

    except for red band movie trailers…

  • Burnin Biomass

    Well, it hasn’t got much to do with the photo development companies, as they don’t prosecute anyone. When it comes to child pornography, I’d rather have a lab report things on the edge than let them pass. The police and DA decide if anything comes of it, not the lab.

  • Spongebob Nopants

    I found 2 pedos who favorited images of mine from a skateboarding event.

    The images weren’t remotely erotic, but both the creeps who favorited them had gazillions of borderline favorites of young men and kids. No R- or X-rated nudity, but a lot were very close. Gave me the creepy crawly hibbity jibbities.

    Called the flickr popo on them toot sweet.
    The images are just completely dressed teens in various skateboarding modes: no close-ups, no stuck-out underpants that teens seem to think are cool for some unfathomable reason. So instead of taking them down, I decided to leave them up as a pedo trap. Heh heh.

  • Spongebob Nopants

    It’s a horrible, horrible, horrible idea. And not just for the political reasons mentioned in the article.

    Governments will use it to prevent the spread of images of protests as soon as they’re put on social media.
    In the hands of all those random activists it will create havoc. Terrorists or criminals could use it to remove images of themselves.
    You don’t like X? Then you’ll be able to remove all images relating to X from the Internet.
    Holocaust deniers would love to have that ability.

    Terrorists use steganography to hide messages in child porn; that’s why it’s often found on their computers. They use pedo sites because they think cops aren’t aware of them and the operators won’t look too closely. This method could be used to remove evidence.

  • Bart

    “That way, once an offending image is identified, it’s scrubbed from the entire Web*”

    *except from the Tor network, amongst others

  • Genkakuzai

    Yeah, this isn’t going to end well… the whole “won’t someone think of the children?!” argument can’t be used to give everything a free pass in the long run…

  • Froggy

    Deep web.
    .onion

  • Mantis

    Um, maybe they just liked the skateboarding action in the photos?

  • Ken

    Big supporter of child porn, huh?

  • Mantis

    Look up the phrase “Mission Creep”.

  • Mark Dub

    Holy crap!! I guess Coppertone is gonna have to come up with a new logo. :P