A few weeks ago, Google revealed that work on the Google Lens app was nearly complete. Integrated into Google Assistant, Lens can go even further, providing quick help based on its analysis of what the camera sees. The feature was previously available only in Google Photos, which required taking a photo, opening the Photos app, and then getting additional information about the image using Google Lens.
In recent days, we have seen the Google Lens icon pop up at random in Google Assistant on a handful of Pixel phones, leading us to wonder whether the rollout of its full feature set had begun. Lens can identify notable landmarks and pull up information, websites, and media for art, books, and movies when you point the camera at film posters, book covers, and museum installations.
You can also do things like scan a painting to learn more about the artist.
When you bring up Google Assistant, you'll now see a Lens icon near the bottom right of your screen.
Landmarks: Recognize landmarks and learn about their history.
Use Google Lens to scan a barcode, and it will instantly give you additional information about the product.
This means Pixel owners will be able to use Google Lens with their smartphone camera in real-time, rather than simply using the feature to analyze photos they've previously taken.
If you have "Web & App Activity" enabled, all of your Google Lens activity, together with the photo, will be saved to your Google Account.