Google Live Search Uses AI to Answer Questions About What Your Camera Sees

A person’s hand points toward city buildings as seen through a smartphone screen interface, with video and upload icons overlaid and yellow taxis visible in the background.

As part of its myriad announcements at I/O 2025, Google unveiled significant changes to Search, including the ability to use AI to search for what a smartphone camera sees in real time.

This new “Live Search” technology builds upon Google’s Project Astra, which it began rolling out a couple of months ago after initially unveiling it last year. Essentially, Google has developed an AI that can “see” the world through a user’s camera.

Project Astra also builds upon another of Google’s popular tools, Google Lens, which more than 1.5 billion people use to search for what they see every month. While Google Lens has traditionally worked with a single still image, Project Astra can work live, continuously, via a camera feed. By combining this with Google’s broader Search enhancements, including conversational search, users can learn about what their camera is pointing at in real time.

Google offers an example in a blog post of a person building a Popsicle stick bridge, a common school physics project. When pointing their camera at the bridge, the user asks, “What else should I do to make it stronger?” Google’s real-time AI replies with advice, including utilizing triangular structures to improve the bridge’s strength.

“If you’re feeling stumped on a project and need some help, simply tap the ‘Live’ icon in AI mode or in Lens, point your camera, and ask your question,” Google explains. “Just like that, Search becomes a learning partner that can see what you see — explaining tricky concepts and offering suggestions along the way, as well as links to different resources that you can explore — like websites, videos, forums and more.”

As PetaPixel wrote in March, something like Project Astra could be useful for photographers while out in the field. The AI can evaluate a scene and offer advice on which colors would pair best with certain backgrounds, suggest lenses, and even walk a photographer through operating a specific camera and dialing in the proper settings.

The Gemini app on Android and iOS will also soon allow users to share their screen with Google, so it can analyze that content in real time, in addition to what the user’s camera is pointed at.

These new features fall under the umbrella of Google “Live Search,” which will arrive “later this summer” following a beta test available to Google Labs members.


Image credits: Google
