
This article was published on May 19, 2016

Google’s new Awareness API will make your Android phone more like an iPhone

Story by Nate Swanner, Former Reporter, TNW

TNW's former West Coast writer in the PNW (Portland, Oregon). Nate loves amplifying developers, and codes in Swift when he's not writing. If you need to get in touch, Twitter is your best bet.

Google has created a single API that will let Android apps be more contextual and location aware.

The Google Awareness API uses seven location and context signals (time, location, places, beacons, headphones, activity and weather) from your phone to help apps create custom behaviors. It also offloads much of the behind-the-scenes resources your phone would use for those signals to help with battery life, and lets developers create custom geofenced locations.

It’s a bit like Apple’s predictive app feature, which does things like suggest you open the Starbucks app when you’re near a Starbucks — but a bit more robust.


One API is really two

Within the Awareness API are two separate APIs that can be accessed independently: Fence and Snapshot.

The Fence API uses simple geofencing so developers can make app suggestions contextual. For instance, a developer who has created a dedicated workout music app could parse JSON location data to suggest you open your favorite cardio playlist when you’re near a gym.
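Based on Google's early-access documentation, registering a fence looks roughly like the sketch below. It combines a location fence with a headphone-state fence for the gym scenario above; the coordinates, radius, fence key, and `PendingIntent` wiring are all illustrative placeholders, and it assumes a connected `GoogleApiClient` with the Awareness API enabled:

```java
// Sketch: fire only when the user is near the gym AND headphones are in.
// GYM_LAT, GYM_LNG, and "gym_fence" are placeholder values.
AwarenessFence nearGym = LocationFence.in(
    GYM_LAT, GYM_LNG, 100.0 /* radius, meters */, 60_000L /* dwell, ms */);
AwarenessFence headphonesIn = HeadphoneFence.during(HeadphoneState.PLUGGED_IN);
AwarenessFence gymWorkout = AwarenessFence.and(nearGym, headphonesIn);

// fencePendingIntent delivers fence-state broadcasts to the app's receiver.
Awareness.FenceApi.updateFences(
    googleApiClient,
    new FenceUpdateRequest.Builder()
        .addFence("gym_fence", gymWorkout, fencePendingIntent)
        .build());
```

When the combined fence's state changes, the app's `BroadcastReceiver` is invoked and can surface the playlist suggestion.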

Separately, the Snapshot API does the heavy lifting described above to help gather signal information from your device.
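A Snapshot query is a single call per signal. The sketch below, again based on the early-access documentation, reads the current weather; it assumes a connected `GoogleApiClient` and the required location permission:

```java
// Sketch: ask the Snapshot API for the device's current weather context.
Awareness.SnapshotApi.getWeather(googleApiClient)
    .setResultCallback(new ResultCallback<WeatherResult>() {
        @Override
        public void onResult(WeatherResult result) {
            if (result.getStatus().isSuccess()) {
                Weather weather = result.getWeather();
                Log.i("Awareness", "Temp: "
                    + weather.getTemperature(Weather.CELSIUS));
            }
        }
    });
```

Equivalent one-shot calls exist for the other signals (for example, detected activity and headphone state), so apps don't have to run their own sensors to gather that context.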

The Awareness API isn’t quite ready for the masses, but Google is taking applications for early access.
