This article was published on November 15, 2017

Google brings on-device machine learning to mobile with TensorFlow Lite

Google’s recent preview release of its open-source TensorFlow Lite software for machine learning developers signals an exciting shift in the field of AI. The company’s dedication to developing AI that can run algorithms on a mobile device, without connecting to the cloud, is laying the groundwork for the artificial intelligence of things (AIoT) of the future.

As far as consumer products go, Google Assistant, Alexa, and Siri are among the most popular mainstream uses of AI. For as little as $30 or $40, a person can get their own interactive artificial intelligence, as long as they also have Wi-Fi and somewhere to plug in a charger.

TensorFlow Lite represents the nascent steps on the path toward making AI-powered devices not just accessible, but disposable. It’s the death of buttons.

Developers now have preview access to TensorFlow Lite for Android and iOS. Rather than providing new functionality for AI applications, it is designed to leverage existing hardware, like the Snapdragon processors inside many smartphones, to run models that mobile devices typically could not handle without connecting to the cloud.
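
In practice, a developer takes a model trained with full TensorFlow and converts it into TensorFlow Lite’s compact flatbuffer format before bundling it into an Android or iOS app. Below is a minimal sketch using the current Python converter API; the 2017 developer preview shipped an earlier command-line converter, so exact names differ, and the file paths here are placeholders.

```python
import tensorflow as tf

# Convert a trained TensorFlow SavedModel (placeholder path) into the
# compact .tflite flatbuffer that the on-device interpreter understands.
converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")
tflite_model = converter.convert()

# The resulting file ships inside the Android or iOS app bundle.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

Once converted, the model lives inside the app itself, so inference can run entirely on the handset’s own processor.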


With Google’s new Lite platform, you can run AI models directly on a smartphone and, as new data comes in, run them again to produce new results. It’s machine learning on the go, without the need for connectivity.
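
As a rough illustration of that flow, here is a sketch of on-device inference with the TensorFlow Lite interpreter. Python is shown for readability; on a phone the equivalent Android or iOS interpreter API does the same job, and the model file and input data below are stand-ins.

```python
import numpy as np
import tensorflow as tf

# Load the converted model and allocate its tensors once at startup.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed freshly gathered data (random values stand in for a real sensor or camera frame).
new_data = np.random.rand(*input_details[0]["shape"]).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], new_data)

# Run the model locally; no network round-trip to the cloud is involved.
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
print(result)
```

The model is loaded once and invoked locally each time new input arrives, so the data never has to leave the device.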

If you’re one of the people terrified at the prospect of hundreds of devices in your home spying on you through your internet connection, you’ll be happy to know that Google’s researchers are designing TensorFlow Lite to eventually address exactly those kinds of concerns.

According to the TensorFlow Lite website, the software is designed with the following criteria in mind:

  • Widely available smart appliances that create new possibilities for on-device intelligence.
  • Interest in stronger user-data privacy paradigms, where user data does not need to leave the mobile device.
  • The ability to serve ‘offline’ use cases, where the device does not need to be connected to a network.

It’ll be interesting to see where Google’s ‘AI platform miniaturization’ project goes next. It’s paving the way for voice-controlled disposables built on cheap chips and AI-powered appliances that don’t expose your entire network to hackers.

If Google can continue to squeeze more usefulness into less powerful devices, we’ll eventually live in a world where AI can be injected cheaply into any gadget, even disposable ones.

Google engineer and TensorFlow technical lead Pete Warden told MIT Technology Review, “What I want is a 50-cent chip that can do simple voice recognition and run for a year on a coin battery.”

TensorFlow Lite brings the company one step closer to realizing Warden’s vision.
