This article was published on October 17, 2017

Why should you care about the AI chips in Apple and Huawei’s new phones?

Earlier this year, both Apple and Huawei made a lot of noise about the new processors in their latest phones, touting the inclusion of a game-changing neural engine or neural processing unit specifically designed to handle artificial intelligence computation. But why should you care, and do you really need an AI-capable phone?

What neither manufacturer managed to do was include game-changing apps to go with these powerful new chips. Apple’s A11 Bionic enables its Face ID tech and Animoji, while Huawei’s NPU-equipped Kirin 970 powers automatic scene-based camera settings and offline support for Microsoft’s Translator app on the new Mate 10 series. But these features are just nice-to-haves, and hardly what you’d call revolutionary.

Huawei’s new Mate 10 series packs a Kirin 970 chip with a Neural Processing Unit

Our own Tristan Greene noted that all flagships will likely come with an AI chip next year, and that the challenge for phone makers is in proving that they’re worth the additional expense and hype.

Ideally, developers will take advantage of the new hardware and retool their apps to do more with the additional processing capability. But what can we expect from these powerful handsets in the future?

According to Baofeng Zhang, Huawei’s vice president of software engineering, most apps that currently rely on cloud-based AI can be retooled fairly easily to use the on-board neural chip, by working with the Kirin API or open-source frameworks like TensorFlow and Caffe2.

Doing so can mean faster performance, less dependence on a strong connection, and better privacy, since users’ data never leaves their devices for these tasks, he adds.
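To make that concrete, here’s a minimal sketch of what on-device inference with one of those frameworks could look like, using TensorFlow Lite’s Python interpreter. The model file and input are placeholders; on an actual handset the same model would typically run through the chip vendor’s SDK or the platform’s mobile runtime rather than desktop Python.

```python
import numpy as np
import tensorflow as tf

# Load a converted on-device model (hypothetical file name).
interpreter = tf.lite.Interpreter(model_path="image_classifier.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input standing in for a camera frame; the shape comes from the model.
frame = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

# Run inference locally -- no network round-trip, and the image never
# leaves the device.
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()
scores = interpreter.get_tensor(output_details[0]["index"])

print("Top class:", int(np.argmax(scores)))
```

The point is simply that the round-trip to a server disappears: the frame is classified on the same device that captured it.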

Zhang explained that the first wave of third-party apps will likely focus on voice and image recognition, followed by services that use natural language processing to parse requests, such as searching for content stored on your phone.

That could lead to better tools for things like video editing. Imagine being able to shoot an hour’s worth of footage at your child’s soccer match and then have an app bring up highlights when you issue a voice command like, ‘Show me segments of this video with Hannah playing soccer.’

That could replace the current generation of apps offering ‘Smart Cuts’-style features, which rely on cues from the footage’s audio and gyroscope data to find ‘interesting’ scenes. An AI chip could help make smarter decisions about which portions of the video to surface.
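As a purely hypothetical illustration, the sketch below assumes an on-device model has already scored each second of footage for how likely it is to show the subject you asked about (say, Hannah playing soccer), and then picks the highest-scoring contiguous stretches to surface as highlights. The scoring model, threshold, and segment logic are all assumptions, not anything shipping today.

```python
from typing import List, Tuple

def highlight_segments(scores: List[float], threshold: float = 0.8,
                       min_length: int = 3) -> List[Tuple[int, int]]:
    """Return (start, end) second ranges where the per-second confidence
    from a hypothetical on-device recognizer stays above the threshold."""
    segments = []
    start = None
    for t, score in enumerate(scores):
        if score >= threshold and start is None:
            start = t                      # a candidate segment begins
        elif score < threshold and start is not None:
            if t - start >= min_length:    # keep only segments long enough to watch
                segments.append((start, t))
            start = None
    if start is not None and len(scores) - start >= min_length:
        segments.append((start, len(scores)))
    return segments

# Toy per-second scores for a 12-second clip.
scores = [0.1, 0.2, 0.9, 0.95, 0.9, 0.3, 0.85, 0.9, 0.92, 0.88, 0.2, 0.1]
print(highlight_segments(scores))  # [(2, 5), (6, 10)]
```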

So what are major tech companies doing in the field of AI-powered apps? Microsoft is presently working with Huawei to build on its Translator app’s offline capabilities with the NPU chip; it’s already available on the device but doesn’t yet support voice translation without an internet connection.

Meanwhile, Google is slated to make some services and APIs available by the end of the year. Hopefully, those will enable developers to do more with Huawei’s NPU and similar processors once they’re out.

And what of the next generation of AI chips? Zhang noted that they’ll likely pack more processing power, but the evolution of frameworks like TensorFlow Lite, which developers will use to build their apps, may shape how these processors develop.

So, do you need an AI-powered phone right now? Honestly, there isn’t yet a killer app that takes advantage of on-device AI, so it’s mostly an early-adopter novelty at present. If Huawei is right, more phones will soon ship with neural processors on board, and it’ll make sense for developers to tap the additional firepower. What we need right now are clever ideas for game-changing apps, and developers willing to build them.
