This article was published on August 23, 2019

Google follows Apple with its own anti-tracking policy for Chromium-based browsers


Google has announced a new initiative that aims to “fundamentally enhance privacy on the web.”

The proposal — dubbed “Privacy Sandbox” — is a stab at preventing extensive tracking of users on the web through cookies and other covert techniques like tracking pixels, link decoration, and device fingerprinting.

In creating a new standard that puts users in control of their data, Google hopes to strike a balance between personalization and privacy. The changes, if implemented, are bound to have major implications for the entire ad tech ecosystem.

“Technology that publishers and advertisers use to make advertising even more relevant to people is now being used far beyond its original design intent — to a point where some data practices don’t match up to user expectations for privacy,” Chrome’s engineering director Justin Schuh said.

Schuh stressed that the lack of a common anti-tracking standard among browser makers (namely Safari and Firefox) is having unintended consequences, hurting publisher revenues and prompting advertisers to circumvent tracking protections through sneaky workarounds.

But if all of this sounds familiar, it’s because the development comes exactly a week after Apple outlined a similar anti-tracking policy that strikes at the heart of how digital advertising functions today.

Google wants a middle ground

The search giant doesn’t want to unilaterally block all the cookies that are used to keep tabs on your every move as you hop from one site to another.

Google maintains that advertising is still vital to keeping the web open, citing a study showing publishers lose an average of 52 percent of their advertising revenue when readers block tracking cookies.

But it doesn’t want advertisers and marketers to embrace practices like fingerprinting either. Fingerprinting is what happens when information such as a device hardware configuration and browser settings are used to identify and track a user.

While you can opt out of third-party cookie tracking through features built into browsers like Chrome, Safari, and Firefox, you can’t prevent companies from fingerprinting you — unless you keep changing the configuration of your device on a regular basis, which isn’t feasible.

Instead, it’s proposing the Privacy Sandbox as a solution that protects your privacy while offering advertisers a way to show you targeted ads without resorting to privacy-violating practices like fingerprinting.

Differential privacy to the rescue — again

To minimize the data leakage associated with device fingerprinting, Google is borrowing from differential privacy (DP) — a statistical technique that strategically adds random noise to personal user information stored in databases, so that businesses can still analyze the data without being able to single people out.

The result of adding random noise to a dataset isn’t exact, but it’s accurate enough to glean insights. There’s a catch, though: the noise reduces information leakage, but it doesn’t eliminate it.

This is because the total leakage increases every time data is queried from a database — the more you ask, the more you know — necessitating that additional noise be injected in order to minimize the exposure.
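To make the idea concrete, here’s a minimal sketch of the classic Laplace mechanism that underlies differential privacy. This is an illustration of the general technique, not Google’s actual implementation: a counting query gets random noise drawn from a Laplace distribution, with the noise scale tied to the privacy parameter epsilon. A smaller epsilon means more noise, stronger privacy, and less accuracy.

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse transform sampling."""
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def noisy_count(true_count, epsilon):
    """Return a differentially private count. A counting query
    has sensitivity 1, so the Laplace scale is 1 / epsilon."""
    return true_count + laplace_noise(1.0 / epsilon)

# Each answer is wrong by a little, but averages out near the truth.
print(noisy_count(1000, epsilon=0.1))
```

Each individual answer is perturbed, which is what lets an analyst learn aggregate trends while any single person’s contribution stays hidden in the noise.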

The tradeoff between accuracy and privacy — which manifests as a “privacy budget” — underpins the very idea of differential privacy. In the words of Johns Hopkins University professor Matthew Green:

The total allowed leakage is often referred to as a ‘privacy budget,’ and it determines how many queries will be allowed (and how accurate the results will be). The basic lesson of DP is that the devil is in the budget. Set it too high, and you leak your sensitive data. Set it too low, and the answers you get might not be particularly useful.

To prevent fingerprinting, Google intends to use the privacy budget to cap the API calls websites can make, so that sites reveal only “enough information to narrow a user down to a group sufficiently large enough to maintain anonymity.”

Once a website exceeds that hard cap, the browser will block further attempts to obtain information, or return deliberately inaccurate or generic data.

Eventually, this will be available as an open-source browser extension that lets you see three things: the kinds of data being collected about you (and by whom and why), the advertiser responsible for the ad you’re seeing, and what caused it to appear.

That’s not all. Google also appears to be following in Apple’s footsteps with a privacy-preserving ad measurement method called Conversion Measurement with Aggregation. It seeks to limit ad tech vendors’ cross-site tracking while still letting them measure the effectiveness of their ad campaigns on the web without compromising your privacy.

Putting users in control of data

What Google, ultimately, is outlining is a privacy-focused initiative that puts users front and center — a tool that gives them the ability to see what data is collected and control how it is used.

For a company whose business model is built on tracking people’s activities online and then sharing that (anonymized) data with advertisers — which then use the information for targeted advertising — the move is bound to attract a fair bit of skepticism.

At this stage, Privacy Sandbox remains just a concept. But Google is seeking extensive feedback from browser developers, privacy advocates, publishers, and advertisers to take it forward.

“While Chrome can take action quickly in some areas (for instance, restrictions on fingerprinting) developing web standards is a complex process, and we know from experience that ecosystem changes of this scope take time,” Schuh said. “They require significant thought, debate, and input from many stakeholders, and generally take multiple years.”

Why now?

Google already has a long list of initiatives — federated learning, private join and compute, private set intersection, and confidential computing — all geared around improving privacy and security at different levels of the internet machinery.

But Google, at its heart, is still an advertising company. If anything, the rash of proposals is emblematic of the wider public and regulatory scrutiny pressing big tech to be more transparent about its data practices.

What’s more, Apple’s WebKit anti-tracking policy — which treats online tracking as a security vulnerability — has raised the stakes, forcing Google to respond with similar privacy-first solutions or risk losing customer trust.

While one cannot deny it’s a ploy on the part of the search giant to retain users on Google Chrome (and its larger ecosystem), the fact that it’s joining the tracking debate can only be a good thing.

However, just as Facebook is struggling to convince users that its pivot to privacy after a string of data scandals is real, Google will have to do everything it can to bridge that trust gap and encode privacy into its design in a manner that instills transparency, choice, and control.
