
This article was published on May 19, 2016

Google’s new Awareness API will make your Android phone more like an iPhone

Google has created a single API that will let Android apps be more contextual and location-aware.

The Google Awareness API uses seven location and context signals (time, location, places, beacons, headphones, activity and weather) from your phone to help apps create custom behaviors. It also offloads much of the behind-the-scenes work your phone would otherwise do to gather those signals, which helps battery life, and lets developers create custom geofenced locations.

It’s a bit like Apple’s predictive app feature, which does things like suggest you open the Starbucks app when you’re near a Starbucks — but a bit more robust.

One API is really two

Within the Awareness API are two separate APIs that can be accessed independently: Fence and Snapshot.

The Fence API offers simple geofencing so developers can create fences that make app suggestions contextual. For instance, a developer who has created a dedicated workout music app could parse JSON location data to suggest you open your favorite cardio playlist when you’re near a gym, as in the rough sketch below.
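
To give a rough idea of how that could look, here is a minimal sketch against the Play Services Awareness client. The method name, gym coordinates, radius, dwell time and fence key are all placeholders, and the app would still need location permission, a connected GoogleApiClient and a PendingIntent to receive fence callbacks.

```java
import android.app.PendingIntent;

import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.fence.AwarenessFence;
import com.google.android.gms.awareness.fence.FenceUpdateRequest;
import com.google.android.gms.awareness.fence.HeadphoneFence;
import com.google.android.gms.awareness.fence.LocationFence;
import com.google.android.gms.awareness.state.HeadphoneState;
import com.google.android.gms.common.api.GoogleApiClient;

// Sketch: register a fence that fires when the user lingers near a gym
// (placeholder coordinates) with headphones plugged in.
void registerCardioFence(GoogleApiClient client, PendingIntent callbackIntent) {
    AwarenessFence nearGym = LocationFence.in(
            40.7484, -73.9857,  // placeholder gym coordinates
            100,                // radius in meters
            60_000L);           // dwell time in milliseconds
    AwarenessFence headphonesIn = HeadphoneFence.during(HeadphoneState.PLUGGED_IN);

    Awareness.FenceApi.updateFences(
            client,
            new FenceUpdateRequest.Builder()
                    .addFence("cardio_playlist_fence",
                            AwarenessFence.and(nearGym, headphonesIn),
                            callbackIntent)
                    .build());
}
```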

Separately, the Snapshot API does the heavy lifting described above to help gather signal information from your device.
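
For a one-off read rather than an ongoing fence, an app could ask the Snapshot API for the current state of a signal. This is only a sketch against the Play Services Awareness client; the method name is a placeholder, and it assumes a connected GoogleApiClient and the relevant location permission.

```java
import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.snapshot.WeatherResult;
import com.google.android.gms.awareness.state.Weather;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.common.api.ResultCallback;

// Sketch: query the current weather signal once, on demand.
void checkWeather(GoogleApiClient client) {
    Awareness.SnapshotApi.getWeather(client)
            .setResultCallback(new ResultCallback<WeatherResult>() {
                @Override
                public void onResult(WeatherResult result) {
                    if (!result.getStatus().isSuccess()) {
                        return; // signal unavailable right now
                    }
                    Weather weather = result.getWeather();
                    float tempCelsius = weather.getTemperature(Weather.CELSIUS);
                    // e.g. suggest an indoor playlist on a cold, rainy day
                }
            });
}
```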

The Awareness API isn’t quite ready for the masses, but Google is taking applications for early access.
