This article was published on October 29, 2019

Mozilla and Element AI want to build ‘data trusts’ in the artificial intelligence age

Mozilla, the nonprofit behind the free and open-source Firefox web browser, is partnering with Montreal-based artificial intelligence startup Element AI to push for ethical use of AI.

To that end, the two organizations are exploring the idea of data trusts, a proposed approach to data collection that aims to give individuals greater control over their personal information.

The aim, the organizations said, is to offer an alternative to the current, broken consent-based model of data collection embodied in frameworks such as the EU’s GDPR.

It’s easy to see why. As artificial intelligence and machine learning (ML) continue to infiltrate different aspects of our day-to-day lives, the technology is doing more than ever, for both good and ill.

This necessitates the ethical use of such systems to prevent misuse and to ensure there are adequate controls over the massive amounts of data these algorithms access.

The data trust, therefore, acts as a “steward” that approves and controls the collection of data and manages access to it, with an eye on privacy but without sacrificing the benefits of AI and ML.

In other words, a data trust — say, an independent watchdog agency — sets the terms of data collection, usage, and sharing, in addition to deciding who gets to access said information in a way that balances privacy and responsible use of technology.

The idea of a data trust is not new. Google’s sister company Sidewalk Labs, which released blueprints for its controversial Quayside smart city in Toronto back in June, has proposed a data governance model that would place “urban data” under the control of an independent Civic Data Trust.

Despite the project’s promise to take a privacy-by-design approach to minimizing data collection, and its assurances that the gathered data will not be sold, used for advertising, or shared without people’s permission, the proposals have drawn concerns about data monetization and surveillance.

Privacy in the time of surveillance capitalism

With tech giants like Google, Facebook, Amazon, Apple, and Microsoft becoming the de facto monarchs of personal information, the idea that they could use your data without actually holding it is certainly an appealing one.

It’s a widely accepted fact that most smart technologies today, be they data-driven, internet-connected, or automated, are rife with privacy issues.

Even as the battle to keep personal data private rages on, netizens are expected to give up some level of privacy as the price of admission to the conveniences of the digital world, so much so that existing frameworks begin to feel like mere band-aid solutions.

The tacit agreement between individuals and the powerful digital institutions that profit from profiling their users has led to a privacy paradox, leaving people with little choice but to hit “accept” and move on.

The GDPR was meant to usher in an era of better data protection: it obliges companies to inform people that their data is being collected (though not necessarily to prevent that collection) and to give them a way to find out what data is held and to request its deletion.

Instead, users have to navigate a complicated menu of privacy settings to fine-tune their cookie consent, and privacy turns out to be an extra that must be opted into, never the default.

With consumers coerced into participating in their own privacy violations in the name of “consent,” it’s essential that new digital governance methods be engineered to tackle this disconnect and decouple personal data from the companies that need it to offer their services, ad-supported or otherwise.

Whether it’s by making individuals “data shareholders,” offering privacy as a paid service, or entrusting data to an independent “data trust,” it’s time to address the question of personal data ownership so that privacy can be what it should be: a fundamental human right that cannot be taken or given away.
