Google Lens is a visual search feature designed to bring up relevant information using visual analysis.
Google’s AI assistant is smart, but until now it could only answer questions it could hear, not ones it could see. Well, that’s about to change. Remember Google Lens, the visual search feature that uses your phone’s camera to look things up on the internet? That’s coming to Google Assistant on Pixel phones, including the Pixel 2 and Pixel 2 XL.
Google has reportedly started rolling out the visual search feature Google Lens in Google Assistant for the first batch of Pixel and Pixel 2 smartphones.
According to 9to5Google, "The first users have spotted the visual search feature up and running on their Pixel and Pixel 2 phones."
To recall, Google Lens was announced at Google I/O 2017 and was slated for trial when the Pixel 2 and Pixel 2 XL launched last month.
The feature is built into the Google Photos app and can visually recognise things such as addresses and book covers.
The feature can be activated when viewing an image or screenshot in the Photos app. Beyond that, even with just the camera open, the AI helper can build searches around objects and text within the frame: pointing the camera at a book cover or a restaurant, for example, will bring up reviews and ratings for it.
In Google Assistant, however, the feature is integrated right into the sheet that pops up when you hold down the home button.
"Lens was always intended for both Pixel 1 and 2 phones," Google had earlier said in a statement.
Currently, Google Lens is available only on Google Pixel phones (both generations), but it may not be long before this innovative search feature makes its way to other Android phones as well.