A few months ago, Google announced a new app called Google Lens at its I/O developer conference. With Google Lens, you can point your phone at objects around you and instantly view useful information about them.
Google showed off a handful of striking examples at the event:
- In one demo, Google Lens recognized the name of a flower, which could come in handy with summer right around the corner.
- In another, jaw-dropping demo, a user pointed Google Lens at a Wi-Fi router's label, and it automatically grabbed the username and password using optical character recognition.
- It can also use extra information, such as GPS location data, to work out where you are and pull up details for nearby cafés and shops, including reviews and opening hours.
Then, at its October 4th launch event for the Pixel 2 and Pixel 2 XL, Google revealed more specifics about Google Lens, including how it would initially launch in Google Photos on Pixel devices. Following that launch, Google Lens is now starting to appear for owners of last year's devices as part of a "Pixel preview."
Since then, some Pixel users (via Reddit) have been greeted with a new splash screen when opening Google Photos. The screen for the "Google Lens Pixel preview" highlights features like:
- Copy text like phone numbers, dates, and addresses
- Learn more about landmarks
- Look up books, music albums, movies, and artwork
From there, users can tap "Get started" or dismiss the prompt and be reminded again later. Once enabled, the "Info" button on an image moves to the overflow menu and is replaced by one for Google Lens.
Tapping it has Google analyze the image and surface related results as cards, which can open a relevant app, copy text, or bring up the system Share menu.
So far, only a few users on the latest version 3.7 of Photos have chimed in to say they received the preview notice. The screenshot indicates that Google Lens is just a preview for Pixel owners and is only active in the Photos app at the moment, meaning it can only recognize objects in your existing pictures. Google Lens does not appear to be live in Google Assistant yet.
A quick check on the 2016 Pixel does not yet turn up Google Lens, so this is likely a server-side update that will roll out over the coming days and weeks. For now, Lens in Assistant remains exclusive to the Pixel 2 for the first few weeks, and it should also become available on the first-generation Pixel. A broader rollout of Google Lens to all Android smartphones is coming later.