Apple today published a paper in its Machine Learning Journal addressing differential privacy and how it can be used to protect user privacy at a time when every business needs to gather increasing amounts of data. The method addresses the fundamental quandary Apple and companies like it face: how to improve user experience, which requires collecting data, without sacrificing privacy.
The company proposes the use of local differential privacy rather than central — in other words, the individual user's device adds noise to scramble any data before it reaches a central server. According to the paper, when enough people send in their data, the noise averages out and leaves usable information behind.
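To make the idea concrete, here is a minimal sketch of the local model using randomized response — a classic illustrative technique, not necessarily the algorithm Apple's paper describes. Each device flips its true answer with some probability before reporting, and the server statistically removes that noise from the aggregate; the probabilities and the emoji scenario below are assumptions for the example.

```python
import random

def randomize(bit, p=0.75):
    # On the device: report the true bit with probability p,
    # otherwise flip it. The server never sees raw data.
    return bit if random.random() < p else 1 - bit

def estimate_rate(reports, p=0.75):
    # On the server: the observed rate satisfies
    #   observed = (1 - p) + true_rate * (2p - 1),
    # so invert that to de-bias the noisy aggregate.
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Simulate 100,000 users, 30% of whom truly use a given emoji.
random.seed(0)
truth = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
reports = [randomize(b) for b in truth]
print(round(estimate_rate(reports), 2))
```

No single report reveals much about its sender — any individual answer is plausibly noise — yet across a large population the estimate lands close to the true 30% rate, which is the averaging-out effect the paper relies on.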
Some of the use cases for the algorithm include identifying new words, figuring out which emoji people are using the most, and finding out what websites put the most strain on Safari.
Differential privacy isn’t without its critics, however. According to Wired, studies suggest even users who opt into differential privacy are still not protected enough, and Apple is obfuscating just how much it mines from individual users.
You can read Apple’s full paper, with all the nitty-gritty details, here.