Google just rolled out a revolutionary feature to its Pixel devices: Google Lens. Previously, Lens had only been available from the Photos app, but now, whenever you want to learn about something in the real world, you can just bring up Google Assistant, turn on the camera, and let Google's famous AI analyze the scene.
For the time being, Google Lens is only available on the Pixel, Pixel XL, Pixel 2, and Pixel 2 XL. So if you own one of Google's recent flagship phones, make sure to update the Google app, then check out this guide to see how Lens works. There are a handful of quirks and a few things you'll need to know about, but we've got you covered below.
Step 1: Find the Google Lens Button
To try Google Lens, start by triggering Google Assistant, which is as easy as long-pressing your home button from any screen. From there, you'll see the Google Lens icon in the bottom-right corner. Familiarize yourself with this button, as it's how you'll access Lens in the future.
The first time you access Google Lens, you'll get a quick explainer. Beneath it, you'll see a button that says "Tap to continue," so go ahead and press it. At this point, you'll be asked to give Google Lens permission to access your camera, which it needs in order to "see" the world around you. Tap "Allow" on the popup, and you'll be ready to get started.
The tutorial that appears on first run mentions that the camera will continuously scan for items that Google Lens can identify, but I haven't had success with this method yet.
At least for now, the only way to get Lens to give you information about something it sees is to line your camera up with the item, then tap the item on your screen.
Depending on the item, you'll receive varying results. Some objects are natively recognized, in which case you'll see little info cards as shown in the second screenshot above. Other results represent Lens' best guess, in which case you'll see buttons to perform a Google search for that term.
If Lens has an info card for an item it recognized, you can tap it to learn more about the item without even leaving the Google Assistant app.
Google Lens is far from perfect. It struggles with random items from around the house, and it's not very good at recognizing subtle differences between certain types of objects. But it definitely has its strengths.
Where Lens really shines is its OCR (Optical Character Recognition). This has several great uses: it will translate text from another language (other than your system default), let you easily Google a word without typing it, or even offer to import business cards as contacts.
Google Lens is also great at scanning barcodes. Just line the barcode up on the screen, tap it, and you'll get information about the product. You can also tap the Google button to do a search for the item, which makes it incredibly easy to compare prices.
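Part of why barcode scanning is so reliable is that standard product barcodes carry their own error check. As a quick illustration (this is how EAN-13 barcodes work in general, not a peek at Lens' internals), here's a minimal Python sketch of the EAN-13 check-digit calculation:

```python
def ean13_check_digit(first12: str) -> int:
    """Compute the EAN-13 check digit from the first 12 digits.

    Digits in odd positions (1st, 3rd, ...) are weighted 1,
    digits in even positions are weighted 3; the check digit
    brings the weighted sum up to a multiple of 10.
    """
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(first12))
    return (10 - total % 10) % 10

# A real EAN-13 code ends with its check digit:
# "4006381333931" -> first 12 digits "400638133393", check digit 1
assert ean13_check_digit("400638133393") == 1
```

A scanner (or Lens) can use this to reject a misread before ever looking the number up, which is why a successful scan almost always returns the right product.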
Lens can also recognize logos and book covers with precision. It's good at recognizing plant and animal species, but not perfect. It should recognize any famous landmarks and artwork, which is a great way to learn more about a place you're visiting.
That said, you should still experiment with Google Lens on your own to get a feel for what it can and cannot do.
This next step is very important: Google Lens uses machine learning to recognize the world around you, and the keyword there is "learning." It will get better as more people use the service and Google's AI continues to train itself. But there's a way you can help speed this process up.
When Google Lens recognizes something properly, make sure to give the result a thumbs up. When it's wrong, do the opposite and give it a thumbs down. This can be done by tapping the thumbs up/down icon after receiving a result, and it will greatly improve the service if everyone does the same.
Finally, there's a pair of features that can help Google Lens see the world a bit better. When an object is too small or too far away, you can pinch your screen to zoom in, which helps Lens focus on the specific item you want it to recognize. And if it's too dark for Lens to see anything, you can tap the flash icon in the top-right corner of the screen to light things up with your camera flash.
Lens is still in its early days, but I can already tell it's a lot more sophisticated than the old Google Goggles app. How has your Google Lens experience been so far? Let us know in the comment section below.