Fernanda Viégas and Martin Wattenberg are scientists by trade and artists at heart. They lead a Google research group in Cambridge, Massachusetts, and are constantly on the lookout for interesting (and artistic) ways to visualize data.
Back in 2011, they came up with an interesting project titled “The Art Of Reproduction,” which shows how digital reproductions of photographs (and paintings) found on the Internet are far from “truthful.”
Since NASA’s first mission to the International Space Station back in 2000, astronauts on board the artificial satellite have snapped over 1.1 million photographs. What’s neat is that every one of those photographs is available to the general public through a giant online database.
Open source rocket scientist Nathan Bergey decided to use his coding skills to do a little digging through the image archive, and ended up creating some beautiful visualizations showing where the images were shot in relation to our planet.
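The core step behind a map like this can be sketched roughly: take each photo's latitude and longitude and bin it into grid cells, counting photos per cell. This is a hedged illustration of the general technique, not Bergey's actual pipeline; the archive access and plotting are omitted.

```python
# Rough sketch: bin geotagged photos into a density grid.
# (Illustrative only -- not Bergey's actual code; input is a plain
# list of (lat, lon) pairs rather than the real NASA archive.)
from collections import Counter

def density_grid(coords, cell=1.0):
    """Count photos per cell-degree bin, given (lat, lon) pairs."""
    grid = Counter()
    for lat, lon in coords:
        key = (round(lat / cell) * cell, round(lon / cell) * cell)
        grid[key] += 1
    return grid
```

Each non-empty cell would then be drawn as a point (or brightness value) on a world map, which is essentially what the visualizations above show.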
Want to see what it looks like for a photo to go viral on Facebook? Check out these visualizations by San Francisco-based studio Stamen Design, which took three of the most shared images on the social networking service (Marvin the Martian, visualized above; Famous Failures; and Ab Fab London, all shared by George Takei) and mapped the data from the hundreds of thousands of shares each one received.
The video installation “The Wizard of Oz Experiment” shows the movie “The Wizard of Oz” 5,829 times side by side. The copies are arranged in rows from left to right, each one time-shifted by exactly one second. The video starts at the top left with the first second of the film and finishes at the bottom right with the last second of the film. The projection is a continuous loop that repeats every 98 minutes.
A computer voice reads the complete subtitles of the film “The Wizard of Oz” in a 68-minute loop.
The 2-minute video above gives an idea of what the installation is like. It’s interesting seeing how the colors change throughout the film.
Not too long ago I finally got around to picking up a decent manual flash for exploring lighting and speedlight techniques. I picked up a Yongnuo YN-560 Speedlight Flash for Canon and Nikon, and my friend Sean was kind enough to send me his old radio triggers to play with. I was mostly all set to start exploring the world of off-camera lighting…
The folks over at Triposo wanted to know when people around the world take pictures, so they harvested the timestamps and geolocation data from photos shared on the Internet and created this beautiful visualization showing one year of photos taken around the world (be sure to watch it full screen and in HD). It’s neat seeing certain parts of the world light up with photo activity on special days.
Photo Stats is a new iPhone app that can help you visualize your iPhoneography habits by automatically generating interesting infographics showing things such as where you snapped photos and the time of day you shoot the most. You can buy it for $1 in the App Store.
Does anyone know of a program that does the same thing for the photos on your computer? That would certainly be neat, and much more useful to photo enthusiasts.
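The heart of such a tool is straightforward: pull the capture timestamps out of your photos' EXIF data and bin them by hour of day. Here's a minimal sketch of the binning step, assuming you've already extracted EXIF `DateTimeOriginal` strings (e.g. with a library such as Pillow or exifread); the charting itself is omitted.

```python
# Hedged sketch of the stats step a desktop "Photo Stats" might use:
# given EXIF-style timestamps ("YYYY:MM:DD HH:MM:SS"), count how many
# photos were shot in each hour of the day.
from collections import Counter

def hour_histogram(timestamps):
    """Map hour of day (0-23) to number of photos taken in that hour."""
    return Counter(int(t.split()[1].split(":")[0]) for t in timestamps)
```

Feeding the resulting counts into any charting library would give you the same kind of time-of-day infographic the app generates.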
Two weeks ago we posted on the Geotaggers’ World Atlas, a project by Eric Fischer that shows heat maps of where photographs are taken in big cities, created using geolocation data from Flickr and Picasa photos.
Fischer now has a new set of maps called Locals and Tourists that distinguish between photos taken by inhabitants of the city and others who are simply passing through.
Some people interpreted the Geotaggers’ World Atlas maps to be maps of tourism. This set is an attempt to figure out if that is really true. Some cities (for example Las Vegas and Venice) do seem to be photographed almost entirely by tourists. Others seem to have many pictures taken in places that tourists don’t visit.
Blue points are locals (determined by whether the person has a history of photographing in that city), red points are tourists, and yellow points mark photos whose photographers couldn’t be classified either way.
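The classification rule described above can be sketched in a few lines. This is an illustrative guess at the logic, not Fischer's actual code, and the one-month threshold is an assumption:

```python
# Hedged sketch of a locals/tourists rule: a photographer with a long
# shooting history in a city is probably a local; a short burst of
# activity suggests a tourist; no history means we can't tell.
from datetime import timedelta

def classify_photographer(dates_in_city):
    """dates_in_city: dates on which this person photographed the city.
    Returns the map color: 'blue' (local), 'red' (tourist), 'yellow'."""
    if not dates_in_city:
        return "yellow"  # cannot be determined
    span = max(dates_in_city) - min(dates_in_city)
    # Assumed cutoff: more than a month of history => local.
    return "blue" if span > timedelta(days=30) else "red"
```

Run over every geotagged photo in a city, a rule like this is all it takes to split one point cloud into the blue/red/yellow layers seen on the maps.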
In the future, we might be able to roam around a 3D virtual representation of our world, where everything you see is automatically generated from photographs taken at the real locations.
Vision researchers at the University of Washington and Cornell University have been working on turning photographs of things in the real world into 3-dimensional representations. This research could eventually turn snapshots into virtual buildings, neighborhoods, and possibly cities.
PhotoCity is a new online game created by researchers that aims to harness the power of crowdsourcing in order to obtain the photographs needed for reconstructing these locations. Here’s a short one-minute introduction to how the game works:
In addition to supporting play with a 2- or 3-megapixel camera, they’ve also released an iPhone app:
We’re pretty darn excited to see how photography will play a role in the technologies we’ll be using in the future.