In March of this year, Google Lens arrived on iOS through Google Photos. Now, after updating the Google app on iOS, users no longer need to take a photo first and then open it from the album — Lens works directly through the camera's viewfinder. A Lens icon sits in the Google app's search bar, next to the voice search button.
On first launch, the user needs to grant the app camera permission and agree to the related terms, and is then shown tutorials for the "scan text" and "smart shopping" features. The former lets Google Lens extract text from images for analysis — looking up words, saving email addresses, or placing calls. The latter helps find similar products, such as shoes or other clothing.
Google Lens has a very clean interface: the user taps an object or text in the viewfinder to start the analysis, and results for the point of interest slide up from the bottom of the screen. The remaining controls on the page include a flash toggle for brighter lighting and an album button that lets Lens analyze existing images. A menu in the upper-right corner lets the user send feedback.