We’ve had several decades of platforms and apps collecting data about us – and in a heavyweight panel on Sunday at SXSW, the debate turned to how that data is being used to both make assumptions about us and alter the products and services we’re offered.
In the introduction, Ashkan Soltani referenced IBM’s development of software designed to ascertain whether individuals arriving in Europe from Syria were terrorists or refugees. Using multiple data sources, the software creates a ‘terrorist score’ that estimates the likelihood that someone is involved in terrorist activity.
And while the motivation behind this kind of tool is understandable, the potential for misuse or mistakes is clear.
Nicole Wong, former U.S. Deputy Chief Technology Officer for the Obama administration, introduced the concept of negative selection algorithms, citing the case of a major university in the U.S. that changed its procedures for recruitment to its highly regarded computer science degree when it became clear the initial screening process had inadvertently discriminated against women.
Other examples of how data is (deliberately and unknowingly) changing both the content we see online and the products we’re offered include Facebook’s patent on judging financial worthiness based on your social graph, and a study showing that women were less likely than men to be shown Google ads for the highest-paying jobs.
All the panelists acknowledged the huge challenges of addressing these kinds of issues and the lack of a definitive answer to them. Legislation would likely prove ineffectual because the landscape is changing so rapidly; we don’t yet know the full picture of what we would need to legislate against.
Companies could work harder to examine their systems and processes but sometimes seemingly benign processes can be causing problems without an organization even being aware of it. The panelists agreed that including data ethics as a mandatory subject within computer science courses was vital.
Even these measures are unlikely to completely solve the problems, but we have to make a more concerted effort to address them. Oftentimes, the impact of data-driven decisions on our lives is hidden, and the resulting danger is that we don’t see it as a pressing problem.
As journalist and author Julia Angwin said:
“You might not know why you didn’t get that job, you may never know that it was data that discriminated against you.”
It is clear that we have reached a pivotal point in society – one where we have to collectively consider what we want the future of data collection and use to look like. We must take the time to understand the enormous potential impact of data on our lives.