
This article was published on September 30, 2021

Google’s new AI wants to supercharge contextualized search results

MUM's the word



Google announced a new AI model for multimodal search called MUM (Multitask Unified Model) at its developer conference, Google I/O, in May. Last night, the firm announced a bunch of consumer-facing features, including visual search, that’ll be coming to your screen in the coming months.

The Big G currently serves contextual information such as Wikipedia snippets, lyrics, or recipe videos based on your search phrase. Now, as the next step for Search, it aims to return results by understanding context beyond just the phrase you’ve typed.

The first feature announced last night is visual search. When you’re looking at a picture of a shirt on your phone, you can tap the Lens icon and ask Google to search for the same pattern, but on a pair of socks or a bag. If this works well, I’m adding a bunch of new pop culture t-shirts to my collection.

You can also use Lens to take a picture of a broken bike part or a piece of plumbing in your house and simply search ‘how to fix’ to get video guides on the topic. That’s helpful when you don’t know the name of the part, which would make a text search very difficult.


Google is also trying to show more context for the phrases you search for with a new “Things to know” section. For example, when you search for “acrylic painting,” the results will include entries like “how to get started with acrylic painting” and “what household items you can use” for the activity. This is Google’s bet on reading beyond just the search phrase.

This section won’t show up for certain sensitive search terms, which Google hasn’t specified, but the company says it’s not placing limits on any particular topics.

Google’s new AI will also help you zoom in and out of a topic, with options to refine or broaden your search results arriving in the coming months. Plus, if you’re looking for inspiration with a phrase like “pour painting ideas,” Google will show a visual results page you can scroll through.

Apart from contextually supercharging search results, Google is using MUM to surface topics related to a video even when they aren’t mentioned in the clip’s title.

A couple of themes stand out in these new features: extended use of Google Lens, machine learning that finds correlations between words, and help finding more contextual information about a topic.

However, as The Verge’s Dieter Bohn noted, Google still has to answer questions about bias in machine learning data and its treatment of researchers who might not share the company’s vision in all areas.

While Google might tactfully avoid some of those questions, there’s little doubt we’ll see more AI integrated into Search and other products in the near future.

You can read more about the company’s announcement here.
