This article was published on June 14, 2016

We’re completely overlooking the most important announcement Apple made at WWDC


Lost in the glitz and glamour of bigger emoji yesterday, Apple’s biggest announcement went almost entirely overlooked: differential privacy.

Differential privacy isn’t an Apple creation; it’s been around for years as a well-known statistical method for masking data so that individual records can’t be extracted from it. What you need to know, however, is that Apple is integrating the technique in an effort to keep all of us safer, and to avoid holding usable data it would have to turn over to law enforcement or government agencies should they request it.

Differential privacy

It’s a fairly complex process, but think of it like opening a phonebook (remember those?) to find a friend in New York City whose name you can’t remember beyond the fact that it starts with an “A.” You’ll quickly find there are far too many “A” names in the phonebook to pinpoint a single person. Differential privacy works the same way: it adds a lot of noise to hide the signal, and the more noise added, the harder it is to find what you’re actually looking for.
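To make the noise idea concrete, here’s a minimal sketch of the standard Laplace mechanism in Python (the function name and the epsilon value are illustrative, not anything Apple has published): a true count gets blurred just enough that one person’s presence or absence barely changes what an analyst sees.

```python
import numpy as np

def noisy_count(true_count: int, epsilon: float) -> float:
    """Return the true count plus Laplace noise with scale 1/epsilon."""
    # Smaller epsilon -> larger noise -> stronger privacy, fuzzier answer.
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Say 12,480 phonebook entries start with "A". Whether or not your friend
# is among them, the published figure looks essentially the same.
print(noisy_count(12_480, epsilon=0.5))  # e.g. 12481.7
```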

What Apple is doing is a departure from the typical data-collection mindset in that it’s not building user profiles to power things like locations in Apple Maps, popular tracks on Apple Music, or new words for auto-correct. Instead, and stick with me on this one, Apple hashes each of these events and adds noise before anything leaves your device. From there, it can determine what’s popular by removing a fragment and analyzing how the pattern changes without it.
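Apple didn’t spell out the on-device mechanics on stage, but a classic way to do the hash-and-noise step it describes is randomized response: hash the event into a fixed-size array of buckets, then flip each bit with some probability before the report ever leaves the phone. The bucket count, flip probability, and function names below are all assumptions for illustration, not Apple’s actual parameters.

```python
import hashlib
import random

NUM_BUCKETS = 256  # illustrative sketch width, not an Apple parameter

def hash_event(event: str) -> int:
    """Map an event (say, a newly typed word) to a stable bucket."""
    digest = hashlib.sha256(event.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % NUM_BUCKETS

def randomized_report(event: str, p_flip: float = 0.25) -> list[int]:
    """One-hot encode the hashed event, then flip each bit with
    probability p_flip so any single report is plausibly deniable."""
    bits = [0] * NUM_BUCKETS
    bits[hash_event(event)] = 1
    return [bit ^ 1 if random.random() < p_flip else bit for bit in bits]
```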

For example, if you wanted to find Joe’s income and you knew that he lived in San Jose before moving to Oakland, you’d take San Jose’s total income while Joe lived there and immediately after he left, and then find the difference. The difference is Joe’s salary.
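In code, the Joe example is nothing more than a subtraction; the dollar figures below are invented purely for illustration:

```python
# Two "harmless" aggregates, published a year apart (made-up numbers).
san_jose_total_with_joe    = 41_275_095_000  # while Joe lived there
san_jose_total_without_joe = 41_275_000_000  # right after he moved

joes_income = san_jose_total_with_joe - san_jose_total_without_joe
print(joes_income)  # 95000 -- one person's salary, recovered by subtraction
```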

It’s a simplified picture, but this is how Apple can provide recommendations without building a user profile from your data. In aggregate, these fragments can be combined to reveal trends and preferences without identifying any particular user.
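Continuing the randomized-response sketch from above, the aggregation side is straightforward algebra: because the flip probability is known, a server could subtract the expected noise and recover an unbiased estimate of each bucket’s true popularity. Again, this is an illustrative sketch under the assumptions above, not Apple’s published pipeline.

```python
def estimate_true_counts(reports: list[list[int]], p_flip: float = 0.25) -> list[float]:
    """Undo the expected bit flips across many randomized_report() outputs."""
    n = len(reports)
    estimates = []
    for bucket in range(NUM_BUCKETS):
        observed = sum(report[bucket] for report in reports)
        # E[observed] = true*(1 - p_flip) + (n - true)*p_flip, solved for true:
        estimates.append((observed - n * p_flip) / (1.0 - 2.0 * p_flip))
    return estimates
```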

The problem with differential privacy

With enough fragments, you can start extrapolating individual data (as in the Joe example above).

To prevent this, Apple is assigning what’s known as a “privacy budget,” which effectively limits the number of fragment submissions a single user can make over a set period of time. Fragments submitted during this window are anonymized, and Apple deletes each one after a period of time in order to collect fresh ones from new users.
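Apple hasn’t said how large the budget is or how often it resets, so the cap and window below are placeholders, but the on-device logic would look something like this:

```python
import time

MAX_REPORTS = 10          # hypothetical daily allowance, not Apple's figure
WINDOW_SECONDS = 86_400   # hypothetical reset window: one day

class PrivacyBudget:
    """Drop fragments on-device once the submission allowance is spent."""

    def __init__(self) -> None:
        self.window_start = time.time()
        self.spent = 0

    def try_submit(self) -> bool:
        now = time.time()
        if now - self.window_start >= WINDOW_SECONDS:
            self.window_start, self.spent = now, 0  # new window, fresh budget
        if self.spent >= MAX_REPORTS:
            return False  # budget exhausted: the fragment never leaves the phone
        self.spent += 1
        return True
```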

Hopefully you’re not thoroughly confused at this point, but if you are, just think of differential privacy as a mathematical technique that keeps each of us safe by letting Apple learn from the crowd without tracking or monitoring any individual.

If Apple can’t pinpoint an individual user from this data, the data can’t be used against us, which matters more than ever after the company’s recent standoff with the FBI.
