Artificial intelligence is increasingly affecting our everyday lives. The field has the potential to make the world a healthier, wealthier, and more efficient place. But it also poses vast safety and security risks.
Mariarosaria Taddeo, the Deputy Director of the Oxford Internet Institute’s Digital Ethics Lab, believes we can mitigate these risks through regulation built on strong ethical principles.
“These technologies are transformative,” Taddeo told TNW. “They are reshaping our societies and the reality in which we live. So we need to make sure that this transformation is leading to the societies we want: a post-AI society, which is democratic, open, and diverse. To achieve these ends it’s essential that ethical considerations are leading us down the right route. We cannot leave it too late.”
In her work at the Oxford Internet Institute, Taddeo develops guidance on the ethical design, development, and deployment of digital technologies. She believes that ethics can not only help make AI a force for good, but also benefit businesses and innovation:
When we think of digital technologies, we cannot disregard their social impact with respect to the ethical values and principles that underpin our societies. If there is friction between those values and principles and technological innovation, the latter will not be adopted, and this friction is likely to lead to strict policies and regulation.
In turn, this can hinder innovation. Ethics, when embraced at the beginning of any design process, can help us avoid this path, limit risks, and make sure that we foster the ‘right’ innovation.
The impact of ethical guidelines
In recent years, a growing number of organizations have published ethical guidelines on the use of AI. But these need to be constantly reassessed as technology and society evolve.
“This is because both technologies and societies change, and these changes may pose new ethical risks or new ethical questions that need to be addressed,” said Taddeo. “Ethics — especially digital ethics — should be seen more as a practice than as a thing.”
One of the principles most frequently included in ethical guidelines for artificial intelligence is “trustworthy AI.” But Taddeo says the term is a misnomer:
Trust is a way of delegating a task and no longer controlling or supervising the way the task is being performed. That’s not what we want from AI. Technically speaking, AI is not a robust technology; it can be attacked and manipulated without the user ever knowing about it.
A ‘trust and forget’ approach to AI is dangerous. The use of AI should be coupled with forms of monitoring and control of the system. More than trustworthy AI, we should develop processes to make AI reliable.
Taddeo believes the OECD Principles on Artificial Intelligence are very good in this respect, as they refer to the need for continuous monitoring of the systems. This can ensure that AI systems continue to behave as expected throughout their deployment.
But ethically good uses of AI will not only be determined by written guidelines. They will also depend on our vision for the society we wish to build.
“So when considering how to use AI as a force for good, we should also ask what kind of societies we want to develop using this technology,” explained Taddeo.
Taddeo draws a parallel to a famous Winston Churchill quote: “We shape our buildings; thereafter they shape us.”
“This is even more true when considering digital technologies,” she said. “We shape AI and then AI returns to give shape to us. The question is what kind of shape we want to take and how we can use AI to take us there.”