At Google I/O today, Google CEO Sundar Pichai detailed a future that featured a whole lot of talking to your devices.
Twenty percent of queries on Google are performed via voice search. With voice assistants such as Google Now making up a progressively larger share of our searches, Google wants to ensure its “industry leading” natural language processing platform is up to the task.
Google Assistant, Google’s own natural language processing platform, has aspirations that extend beyond mobile devices. It’s designed to work with Google Home, an Alexa-like, always-on listening device that helps you manage everything from search to playing music to having a conversation with Google, or so the company says.
The digital assistant is designed to provide an ongoing two-way dialogue with Google, and the company wants it to handle your most mundane tasks and searches without any real input from you beyond your voice.
“We want to do [your tasks] for you, understanding your context. We think of this as building each user, their own individual Google,” Pichai said.
The demo was rather impressive: Google Assistant used contextual clues from conversational language in a search about ‘The Revenant,’ surfacing information, trailers and reviews, and ultimately a QR code to buy tickets.
Google hopes its digital assistant can become an integrated part of your life through mobile, Google Home, Nest, Google Cast and other planned projects, but it didn’t drop any clues as to when we might see more of it in these devices. If the timeline matches the other products announced today, expect fall to be a busy season at Google.