Google Lens Turns Your Phone Camera Into an AI-Powered Visual Search Tool

In addition to stepping on Photoshop’s toes, Google also spent some time at yesterday’s keynote showing off something called “Google Lens”: a technology that basically turns your phone’s camera into an artificially intelligent visual search tool.

Google Lens is going to work hand-in-hand with Google Assistant, using AI to identify objects in the world around you and project actionable information about those objects onto your screen.
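Google hasn’t shared what’s running under Lens’s hood, but you can get a feel for the “what am I looking at?” step with Google’s publicly available Cloud Vision API. The short Python sketch below is only an illustration of that kind of label lookup, not Lens itself, and the image filename is a placeholder:

    # A minimal sketch of photo labeling using the public google-cloud-vision
    # client library (pip install google-cloud-vision). This illustrates the
    # general idea only, not how Lens actually works internally.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # requires Google Cloud credentials

    # "mystery_flower.jpg" is a placeholder for any photo you've just taken
    with open("mystery_flower.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # Ask the API what it sees; each label comes back with a confidence score
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")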

From identifying what kind of flower (or exotic bug that just bit you…) you’re looking at, to copying relevant text automatically, to scanning your environment to pull up restaurant reviews in real time, you can see how Google Lens will work in this demo posted to YouTube by Phandroid. The demo starts with that object removal tech, but quickly moves on to Google Lens from there:

This technology essentially turns your smartphone’s camera into a tool to analyze and act on the world around you in an instant. Buy concert tickets by pointing your phone at a poster for the show, or find out how much that $5 thrift store camera is REALLY worth simply by pointing your phone at it.
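For the poster example, the first step would be reading the text off the image so a follow-up action (a ticket search, a calendar entry) has something to work with. Again, this is just a sketch using the Cloud Vision API’s text detection, with a placeholder filename, since Google hasn’t detailed Lens’s actual pipeline:

    # Sketch of the "read the poster" step via Cloud Vision OCR.
    # The filename and any follow-up action are hypothetical.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("concert_poster.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    if response.text_annotations:
        # The first annotation contains the full block of detected text;
        # band name, venue, and date would be parsed out of this string.
        print(response.text_annotations[0].description)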

This tech is reportedly being integrated into both Google Assistant and Google Photos, but just like the object removal feature, we have no idea exactly when it’s going to be added or how it’ll work when it does. We’re excited to find out, though.
