We Don’t Understand Privacy


Over 1.5 billion people worldwide use the Facebook app on a monthly basis, and all of those people have opted in to Facebook’s privacy policy by the act of creating an account.

Facebook, like many large companies, has expended its fair share of effort to clarify and simplify the privacy settings for the massive amounts of data that fall under its custody. But let’s be honest: the majority of users don’t read the privacy policy, nor do they even understand what it is. We simply assume a “reasonable expectation of privacy.”

When I launched the Facebook app a few months ago, I was perturbed. At the top of the screen, the app displayed photos from my camera roll for posting.

I had previously granted the Facebook app access to my camera roll for what I thought was a single use case: when I chose to share photos on Facebook, I could initiate that upload. But now the app was making it more “convenient” for me by monitoring new photos added to my camera roll and suggesting that I upload them to Facebook.


This change in app behavior is no doubt spurred in part by a material decrease in personal sharing on the service. As the service has embraced more professionally created/shared content, users have been sharing fewer personal moments—making Facebook less of a social network and more of a curated search engine.

It’s unclear to me whether the photos appearing in the Facebook app had already been uploaded and analyzed by Facebook’s servers. And there’s no way to determine this from the Privacy Policy, which ambiguously states, “We collect the content and other information you provide when you use our Services … This can include information in or about the content you provide, such as the location of a photo or the date a file was created.”
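To make “the location of a photo” concrete, here’s a rough sketch of my own (not anything from Facebook’s code) showing how trivially the GPS coordinates embedded in a typical camera-roll JPEG can be read with Python and Pillow. The filename is hypothetical; any service holding the original file can do this in a millisecond, no machine learning required.

```python
# Sketch: reading the GPS coordinates embedded in a photo's EXIF metadata.
# "camera_roll.jpg" is a hypothetical filename; most JPEGs straight off a
# phone carry these tags unless the camera app strips them.
from PIL import Image
from PIL.ExifTags import GPSTAGS

def photo_location(path):
    exif = Image.open(path).getexif()
    gps_ifd = exif.get_ifd(0x8825)  # 0x8825 is the standard GPSInfo tag
    if not gps_ifd:
        return None
    gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}

    def to_degrees(dms, ref):
        d, m, s = (float(x) for x in dms)
        sign = -1 if ref in ("S", "W") else 1
        return sign * (d + m / 60 + s / 3600)

    return (
        to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
        to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]),
    )

print(photo_location("camera_roll.jpg"))  # e.g. (40.7128, -74.0060)
```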

Does merely launching the app give Facebook the ability to scan and analyze my camera roll?

We don’t understand privacy.

This isn’t a theoretical issue. As we have recently seen, significant strides in machine learning have made it possible to identify things, locations and faces—and surprisingly it doesn’t take a full resolution photo to do so.

One service, Clarifai, needs only a 256px image to identify its contents with human-level accuracy. (And since all machine learning algorithms need a ton of data, why not mine the billions of photos taken every day?) Now imagine what happens when photos that have legitimate reasons not to be seen by anyone else (e.g. embargoed news, behind-the-scenes shots, otherwise personal and private photos) are uploaded and analyzed by companies claiming that your privacy is preserved because the photos aren’t being shared.
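To give a sense of how off-the-shelf this capability has become, here’s a minimal sketch using a stock pretrained classifier from torchvision. To be clear, this is not Clarifai’s actual API (which this article doesn’t detail); it’s a generic ImageNet model, and like most such models it operates on inputs around 224–256px. The filename is hypothetical.

```python
# Sketch: off-the-shelf image recognition on a small (~224px) photo.
# Stands in for commercial services; it is NOT their API, just a stock
# pretrained torchvision model to show how little resolution is needed.
import torch
from torchvision import models
from torchvision.models import ResNet50_Weights
from PIL import Image

weights = ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()            # resizes/crops to 224px, normalizes

img = Image.open("camera_roll.jpg").convert("RGB")  # hypothetical camera-roll photo
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]

top = probs.argmax().item()
print(weights.meta["categories"][top], float(probs[top]))
```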

Facebook’s virtual reality platform, Oculus Rift, has also raised a number of eyebrows with a terms of service that covers the collection of your IP address, “precise location,” and “physical movements and dimensions.” The potential privacy issues are significant enough that Senator Al Franken sent a letter asking for clarification on data privacy.

We don’t understand privacy.

  • We don’t understand, in part, because the pace of technological change continues to outstrip our ability to comprehend its implications.
  • We don’t understand because companies care more about improving engagement than about grappling with the subtleties of privacy.
  • We don’t understand because short-term convenience muddies our ability to consider potential abuse.
  • We don’t understand because machine learning and networking algorithms can transform a seemingly innocuous piece of data (e.g. a photo) into something potentially powerful or destructive.
  • We don’t understand because companies don’t tell us the myriad ways they might be using our data in supposedly “unidentifiable” forms.

In 2012, billionaire entrepreneur Michael Dell forced his daughter to shut down her Twitter and Tumblr accounts because she was manually posting her location and future schedule, undermining the $2.7m-per-year security detail Dell employed. But nowadays, we don’t need a user to reveal their location to know where they are. We don’t even need a face to recognize them.

In some ways the problem seems intractable. Even if you’ve never participated in social media, your photos might be stored in the cloud with Apple or Google, who have more than enough metadata to use your data in ways you would probably consider a violation. I suspect the solution is a combination of vigilance on the part of consumers, who must demand clarification of privacy policies, and of the larger companies installing ombudsmen to augment their own privacy advocates.

Of course, the solution will remain murky when we can’t even understand the scope of the problem in the first place.


About the author: Allen Murabayashi is the Chairman and co-founder of PhotoShelter, which regularly publishes resources for photographers. Allen is a graduate of Yale University, and flosses daily.
