
Google's next-level AI looks to improve search with focus on images and videos

MUM's the word for Google as this new tech is multimodal and understands not just text but also images and ultimately video and audio.


By CNBCTV18.com Sept 30, 2021 8:11:48 PM IST (Published)

Google recently said it will use artificial intelligence (AI) to improve its search functionality. Google Search handles nearly 100,000 queries every second from users across the world, and the company will now improve the experience with a focus on visuals.

The tech company demonstrated several of its new search functionalities during the Search On event’s livestream. One of the key changes is the use of a new technology called Multitask Unified Model (MUM) to improve search. The technology was first announced in May and was built to improve search results for complex questions and tasks.
“MUM has the potential to transform how Google helps you with complex tasks. MUM uses the T5 text-to-text framework and is 1,000 times more powerful than BERT. MUM not only understands language, but also generates it. It’s trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models,” Pandu Nayak, Google Fellow and Vice President, Search, explained in a blog post.
Bidirectional Encoder Representations from Transformers (BERT) is a tool that allows Google to better understand the context of words used in a search. “And MUM is multimodal, so it understands information across text and images and, in the future, can expand to more modalities like video and audio,” Nayak added.
It is this capability of understanding images and videos that makes MUM invaluable for the next leap in Google’s search capabilities. For example, Google highlighted how users will now be able to simply take a picture of an object using Google Lens and find relevant search results for it.
“Your bike has a broken thingamajig, and you need some guidance on how to fix it. Instead of poring over catalogs of parts and then looking for a tutorial, the point-and-ask mode of searching will make it easier to find the exact moment in a video that can help,” the company explained in a new blog post.
MUM is also going to be instrumental in the new design for Google Search. A new section users will see is ‘Things to Know’, where Google will surface additional results that help you learn more about an unfamiliar topic.
Google will also now be able to suggest topics related to a video you are watching; MUM will help surface these related topics even if they aren’t explicitly mentioned in the video itself.
With Google, and particularly YouTube, being a trove of knowledge for millennials and Gen Z, the advanced AI will only improve the search functionality of the ubiquitous search engine.
A ‘Think with Google’ study found that teenagers regularly use YouTube to teach themselves new subjects, and while the behaviour is less prevalent among older age demographics, the use of YouTube as a platform for both entertainment and knowledge is well documented.
