Google I/O brought with it a lot of exciting updates for Google+, not the least of which were a slew of automatic improvements to Google+ Photos including Auto Highlight, Auto Enhance and Auto Awesome. But the updates didn’t stop when I/O ended last Friday.
Today, Google’s Search blog announced that the company has started implementing some impressive technology that will allow you to search for your photos based on what they contain visually, even if there’s not a tag in sight.
This new ability is apparently built on two types of tech: computer vision and machine learning. Together, they recognize not people but things in your and your friends’ photos, letting you search for those photos by what’s actually in them.
For example, you could type in “my photos,” “my photos of cars,” “sunset photos” (above) or even “Aaron Feinberg photos” and get tailored results (assuming you’re friends with photographer Aaron Feinberg on Google+):
Basically, Google can now recognize concepts like “sunsets,” “flowers,” and “the beach,” and embed that info into image metadata automatically. That way, you can find the right photo easily, even if it’s buried somewhere deep within your archives and hasn’t been tagged.
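To make the idea concrete, here’s a minimal sketch of how concept-based photo search could work in principle. Everything here is hypothetical: the `classify` function is a stand-in for a real computer-vision model (Google hasn’t published its implementation), and the filenames and tags are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Photo:
    filename: str
    tags: set = field(default_factory=set)  # auto-generated concept tags

def classify(photo: Photo) -> set:
    # Placeholder: a real system would run an image-recognition
    # model over the pixels here. We fake its output for the demo.
    fake_model_output = {
        "beach.jpg": {"beach", "sunset"},
        "garden.jpg": {"flowers"},
        "car.jpg": {"car"},
    }
    return fake_model_output.get(photo.filename, set())

def index_photos(photos):
    # Embed recognized concepts into each photo's metadata
    # automatically -- no manual tagging by the user.
    for p in photos:
        p.tags |= classify(p)
    return photos

def search(photos, concept: str):
    # Return photos whose auto-generated tags match the query.
    return [p.filename for p in photos if concept in p.tags]

library = index_photos([Photo("beach.jpg"), Photo("garden.jpg"), Photo("car.jpg")])
print(search(library, "sunset"))   # ['beach.jpg']
print(search(library, "flowers"))  # ['garden.jpg']
```

The key design point the announcement hints at is the indexing step: the tags are computed once at upload time and stored as metadata, so a later search is just a fast metadata lookup rather than a scan of every image.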
With the volume of photo uploads growing daily, and some Google Glass-toting photographers claiming they take many more photos each day simply because the headgear is so convenient, the ability to dig through photos by their content arrives not a moment too soon.
To learn more, head over to Google’s Search blog or visit Google and/or Google+ to give the new search functionality a try yourself.