Multisearch, a Google Lens feature that can search images and text simultaneously, will soon be more widely available after launching in beta in the US earlier this year. According to Google, Multisearch will be expanded to more than 70 languages in the coming months. The company made the announcement at an event focused on search.
Additionally, the Near Me feature, which Google unveiled at I/O back in May, will launch in English in the US sometime this fall. It ties into multisearch, with the idea of making it easier for people to find out more details about local businesses.
Multisearch is all about allowing people to point their camera at something and ask about it while using the Google app. For example, you could point your phone at a store and request details about it, or ask about a screenshot of an unfamiliar item, like a piece of clothing. You can also look up what a particular food is called, like soup dumplings or laksa, and see which nearby restaurants serve it.
There will also be some changes on the Lens front when it comes to augmented reality translations. Google is now using the same artificial intelligence technology behind the Pixel 6's Magic Eraser feature to make it appear as if the translation replaces the original text rather than being overlaid on top of it. The idea is to make translations look more seamless and natural.
Google is also adding shortcuts at the bottom of the search bar in its iOS app, making it easier to find features like translating text with your camera, humming to search, and translating text in screenshots.