Remember the hoopla last year after artist/programmer Kyle McDonald installed an app on Apple Store computers that secretly snapped portraits of customers? Cries of “invasion of privacy” sprang up everywhere, and Apple brought in the Secret Service to put an end to it. Well, photographer Irby Pace has done something similar, but instead of secretly capturing images, Pace simply visits Apple Stores and harvests the self-portraits “abandoned” on the display devices. Pace collected over 1,000 images in 2010 by emailing and texting them to himself, and is currently displaying them in a gallery exhibition titled “Unintended Consequences”.
A recently discovered flaw in Facebook’s abuse reporting tool allowed anyone to access the private photographs of other users, including Facebook founder Mark Zuckerberg. Until it was fixed today, the tool let anyone who reported a public photograph also peruse the rest of that user’s images, both public and private. After members of a bodybuilding forum discovered the security hole, they targeted Zuckerberg’s account and published a number of his private photographs online. This comes just a week after the FTC slapped Facebook’s wrist over deceptive privacy practices.
There was a minor hoopla yesterday after Boing Boing shared that mugshot photos of arrested Occupy Portland protesters were being uploaded by the Portland Police Department to Facebook. The police department quickly explained that it’s their standard practice to publish mugshots that are of media interest. However, many people are still uncomfortable with the idea of Facebook being used as a way to share mugshots. Stan Horaczek at PopPhoto writes,
While it doesn’t seem that there’s anything legally wrong with the photos ending up where they are, it is a little…creepy. Facial recognition software is getting scary accurate and with something as simple and straight forward as a mugshot, any program looking for a person on the web would almost certainly be able to find them without any trouble.
Flickr introduced a novel privacy feature yesterday called “geofences”, which lets you hide the location data of photos taken in certain locations from the general public. It seems like a great idea, but blogger Thomas Hawk points out that there’s a pretty big loophole in the system:
Although the geotag information is indeed pulled from the flickr photo page, ANYONE can potentially still get your geolocational data simply by downloading the original sized file and looking into the EXIF data.
This means the geofence feature doesn’t actually strip the geotag information from the photos you upload; it simply prevents the data from being displayed in an easy-to-view format on the Flickr site. If you make the original versions of your photos available for download, anyone can still read the location data embedded in their EXIF metadata. To close the loophole, simply disable downloads of your original files.
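To see just how little the on-page hiding protects you, here is a minimal sketch of pulling GPS coordinates out of a downloaded original’s EXIF data using the Pillow library. The filename and coordinates are made up for illustration; the script first writes a tiny tagged JPEG so the example is self-contained, then reads the tags back the same way anyone could from your originals.

```python
from PIL import Image
from PIL.ExifTags import GPSTAGS
from PIL.TiffImagePlugin import IFDRational

GPS_IFD = 0x8825  # standard EXIF pointer tag for the GPS sub-IFD


def gps_from_image(path):
    """Return (lat, lon) in decimal degrees from a photo's EXIF, or None."""
    gps = Image.open(path).getexif().get_ifd(GPS_IFD)
    if not gps:
        return None
    named = {GPSTAGS.get(tag, tag): value for tag, value in gps.items()}

    def to_degrees(dms, ref):
        # EXIF stores coordinates as (degrees, minutes, seconds) rationals.
        deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -deg if ref in ("S", "W") else deg

    return (
        to_degrees(named["GPSLatitude"], named["GPSLatitudeRef"]),
        to_degrees(named["GPSLongitude"], named["GPSLongitudeRef"]),
    )


# Build a tiny JPEG carrying made-up GPS tags, standing in for a downloaded
# "original" whose geotag Flickr hides on the photo page but not in the file.
img = Image.new("RGB", (8, 8))
exif = Image.Exif()
exif[GPS_IFD] = {
    1: "N", 2: (IFDRational(37), IFDRational(46), IFDRational(30)),   # 37°46'30" N
    3: "W", 4: (IFDRational(122), IFDRational(25), IFDRational(6)),   # 122°25'6" W
}
img.save("tagged.jpg", exif=exif)

lat, lon = gps_from_image("tagged.jpg")
print(round(lat, 4), round(lon, 4))
```

The point of the sketch: hiding the map on the photo page does nothing to the file itself, so disabling original downloads is the only effective fix.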
Flickr introduced an innovative location-based privacy feature today called “geofences”. It’s a way of assigning default privacy settings to certain locations for geotagged photographs. For example, you can assign a geofence with a certain radius around your home, and automatically set those photos’ location data to only be visible to your friends and family. Each user can have up to 10 geofences, and existing photographs are automatically updated to new geofence privacy settings.
Facial recognition features are appearing in everything from cameras to photo-sharing sites, but have you thought about the security and privacy concerns they introduce? Fast Company has published a piece on how mobile apps may soon be able to quickly look up your identity, your personal information, and perhaps even your social security number!
[CMU researchers] used three relatively simple technologies to create their face recognition system: An off-the-shelf face recognizer, cloud computing processing, and personal data available through the public feed at social networking sites such as Facebook [...] Combining the data gathered from the face recognizer hardware with clever search algorithms that were processed on a cloud-computing platform, the team has performed three powerful experiments: They were able to “unmask” people on a popular dating site where it’s common to protect real identities using pseudonyms, and they ID’d students walking in public on campus by grabbing their profile photos from Facebook.
Most impressively the research algorithm tried to predict personal interests and even to deduce the social security number of CMU students based solely on an image of their face–by interrogating deeper into information that’s freely available online.
If ordinary citizens have the right to photograph police in public places, what about the other way around? That’s a question that’s sure to be asked often in the coming days, as 40 law enforcement agencies across the US are planning to use iPhones to photograph civilians for the purpose of identifying wanted perps. The system, called Mobile Offender Recognition and Information System (MORIS), costs $3,000 apiece and will be able to do facial recognition searches on a database of known criminals. Photographers’ rights will apply to cops too — police won’t be required to ask permission before snapping a photograph of your face!
In the United States, anyone can be photographed in most public places without their consent… as long as they don’t have a reasonable expectation of privacy. Case in point: a traffic magistrate named Rhonda Hollander was arrested last week after following a man into a courthouse bathroom and photographing him as he used the urinal.
When Broward Sheriff’s Deputy Darlene Harden confronted Hollander a short time later, the magistrate admitted taking a picture but refused to turn over her phone, arguing that it was a public restroom and she was not violating any laws, according to the report. [#]
Other places where people have an expectation of privacy include homes, dressing rooms, medical facilities, and phone booths. Basically, it helps to have some common sense.
On August 4, 2006, AOL published a text file containing 20 million searches made by 650,000 users over a 3-month period, releasing it for research purposes. Although the company anonymized the data by replacing usernames with numerical IDs, people soon realized that many users had searched for personally identifiable information (e.g. their own names), allowing real names to be matched to unique IDs and exposing those individuals’ entire search histories. After the media caught wind of this, the incident became known as the AOL search data scandal.
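The re-identification mechanism is simple enough to sketch in a few lines of Python. The log below is made up, except that user No. 4417749 really was identified by the New York Times as Thelma Arnold of Lilburn, Georgia; the point is that numeric IDs hide nothing once a user’s own queries contain their name.

```python
# Toy version of the AOL log: (numeric user ID, query). The IDs "anonymize"
# the users, but vanity searches leak names right back into the data.
search_log = [
    (4417749, "landscapers in lilburn ga"),
    (4417749, "thelma arnold"),  # a user searching for her own name
    (4417749, "homes sold in shadow lake subdivision gwinnett county"),
    (1234567, "best pizza in brooklyn"),  # made-up filler user
]

# Any public list of names will do (phone book, voter rolls, ...).
known_names = {"thelma arnold", "john smith"}

# Step 1: link a numeric ID to a name wherever a query is itself a known name.
id_to_name = {uid: query for uid, query in search_log if query in known_names}

# Step 2: once an ID is named, that person's entire search history is exposed.
exposed = {
    id_to_name[uid]: [q for u, q in search_log if u == uid]
    for uid in id_to_name
}
print(exposed["thelma arnold"])
```

This is exactly how reporters unmasked real AOL users in 2006: one self-identifying query is enough to attach a name to every other search the ID ever made.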