Among other AI-focused announcements, Google today said its “multisearch” feature is now available to mobile users globally, wherever Google Lens is already offered. The search feature, which lets users search with both text and images at the same time, was first introduced last April as a way to modernize Google Search to take better advantage of smartphone capabilities. A variation, “multisearch near me,” which targets local business searches, will also roll out globally over the next few months, as will multisearch for the web and a new Lens feature for Android users.
As Google previously explained, multisearch is powered by an AI technology called Multitask Unified Model, or MUM, which can understand information in a variety of formats, including text, photos and video, and then draw insights and connections between topics, concepts and ideas. Google put MUM to work within its Google Lens visual search features, where it allows users to add text to a visual search query.
“We redefined what we mean by search by introducing Lens. We’ve since integrated Lens directly into the search bar and continue to bring new features like shopping and step-by-step homework help,” said Prabhakar Raghavan, Google’s SVP in charge of Search, Assistant, Geo, Ads, Commerce and Payments products, at a press conference in Paris.
For example, a user could pull up a photo of a shirt they liked in Google Search, then ask Lens where they could find the same pattern, but on a different type of clothing, such as a skirt or socks. Or they could point their phone at a broken part on their bike and type into Google Search a query like “how to fix it.” This combination of words and images could help Google process and understand search queries it previously couldn’t handle, or that would have been harder to enter using text alone.
The technique is especially useful for shopping searches, where you can find clothing you like, but in different colors or styles. Or you could take a photo of a piece of furniture, like a dining set, to find matching items, like a coffee table. With multisearch, users can also narrow and refine their results by brand, color and visual attributes, Google said.
The feature was made available to U.S. users last October, then expanded to India in December. Starting today, Google says multisearch is available to all users worldwide on mobile, in all languages and countries where Lens is available.
The “multisearch near me” variant will also soon expand, Google announced today.
Google announced last May that it would be able to direct multisearch queries to local businesses (aka “multisearch near me”), surfacing search results for the items users were looking for that matched inventory at local retailers or other businesses. In the case of the bike with the broken part, for example, you could add the text “near me” to a search query with a photo to find a local bike shop or hardware store that had the replacement part you needed.
This capability will roll out to all languages and countries where Lens is available over the next few months, Google said. Multisearch will also expand beyond mobile devices, with support for multisearch on the web arriving in the coming months.
In terms of newer search products, the search giant teased an upcoming Google Lens feature, noting that Android users will soon be able to search what they see in photos and videos across apps and websites on their phone, while staying in the app or on the website. Google calls this “search your screen,” and said it will also be available wherever Lens is offered.
Google also shared a new milestone for Google Lens, noting that people now use the technology more than 10 billion times per month.