This article was published on July 25, 2017

Microsoft’s AI chip teaches HoloLens how to understand you


Microsoft presented HoloLens 2.0 on Sunday at an event in Hawaii. The company, known mostly for its software, revealed the newest iteration of its AR headset would contain a Microsoft-made AI co-processor. This marked the company’s entry into the design and manufacture of AI chipsets. Microsoft designed its new AI chip specifically to handle the complex speech and gesture interactions of HoloLens 2.0.

Google has its own AI chips, designed to train and run neural networks. The AI race is that serious: companies that make billions on software are now cranking out silicon in-house. You can bet that Apple, Amazon, Facebook, and hundreds of lesser-known startups have serious intentions of making hardware purpose-built for machine learning. The bonanza is in full swing.

AI has almost exclusively been hyped as a money-saving tool for business or a herald of the end of humankind. The truth is that AI is already here, there, and everywhere. It’s mostly back-end stuff, which isn’t very interesting. The average consumer doesn’t have a deep understanding of how predictive analytics can save money, and we’ve had self-driving-car hyperbole shoved down our throats for years now. AI is already old news.

What Microsoft is doing could change the way consumers think about AI. It’ll have to actually release a great peripheral that becomes a killer app for AI and AR – a loftier endeavor than designing and making an advanced AI chip – but HoloLens 2.0 shows that kind of promise. The intersection of AI and AR is where consumers will find value beyond the battlefield or the workplace.

The combination of AI and AR creates fertile ground for developers, even if we’re not ready to be a society of people wearing techy headgear at the club or a PTA meeting just yet. Imagine HoloLens 2.0 for police, though, paired with new body-camera technology that can identify suspects or missing persons. US Army soldiers are taking advantage of AR headsets right now. It’s easy to picture a paramedic getting real-time vital signs fed to them through a visor: anything that saves emergency responders time will save lives. We only start to lose our nerve when we imagine a big attachment on our own heads in public spaces.

It’s gotta look good

You might end up wearing something on your face made by the company responsible for Kinect 1.0, but at least it’ll be smart. In theory, the AI could also be taught to identify objects and make predictions. The potential applications are hard to quantify; we’re not sure exactly how powerful the chip or its algorithms will be. The headset is self-contained and battery-powered, which is nice, but until it’s unobtrusive and fashionable it won’t achieve mass appeal.

The idea of walking around with a big ol’ headset on has always seemed off-base. We aren’t all wearing smartwatches, and those look normal by comparison. For a nascent technology to gain massive consumer appeal, it has to make us feel included, not isolated. It’s a tough sell to convince people to walk around with a giant distraction on their face.

Then again, there were plenty of people who thought a touch-screen phone was a stupid idea. Apple got it right, and the rest is history.

No, I won’t tell you that HoloLens 2.0 is the heads-up display (HUD) everyone on the planet will be wearing when it arrives – it almost certainly won’t be. Instead, I’ll say it’s going to be part of the inspiration for that device.

Tomorrow we might think sunglasses that refuse to provide real-time information about everything in our field of view are old-fashioned and useless.

Actually – put me on the “sick-and-tired of glasses without a HUD” list now.
