This article was published on April 5, 2019

We’re losing control of tech’s future – here’s how we can reclaim it


We seem to be at a crossroads when it comes to the future of tech. Advancements in fields like artificial intelligence raise dangers we need to collectively address as a society, before it’s too late. Will we embrace tech for better or for worse?

James Bridle, a multimedia artist and best-selling author, will discuss the tech hurdles we face at TNW Conference in Amsterdam on May 9 and 10. His new book, New Dark Age, is a collection of gripping examples of how we misidentify the dangers technology presents.

Bridle calls upon real-world illustrations, ones we’re all familiar with, and presents them to us in a new, shocking light. Take, for example, how Google Translate has massively improved over the past few years. Without anyone quite realizing it, the advanced AI behind its success has taken on new forms of computational power, ones which even its creators don’t fully understand. Bridle enlightens readers to the sharp realities of our digital world, and forces us to reassess how we future-proof technology.

We sat down with him to discuss his striking views.

You have a very clear passion for AI, machine learning, and technology in general. When and how did it all start?

I studied computer science and artificial intelligence at the very end of the last AI hype cycle, almost 20 years ago. This was when people were starting to realize that approaches to AI based on modelling the human brain weren’t going to work.

The latest wave we’re in right now is so interesting because it’s completely different to that idea: while things like neural networks are loosely based on simplified models of the brain, machine learning as we’re currently developing it – and deploying in every area of life, from stock markets to insurance assessment, medical diagnoses to energy management – is something very different, something alien to human intelligence and largely opaque to it. As someone who’s driven to understand such technologies and their impact on society, this presents an interesting challenge.

Why did you decide to work across multiple mediums? How has the experience of writing books differed from your art installations?

I wouldn’t even say it was a clear decision. Rather, work that doesn’t fit within established modes of technological production and critique – either within the tech industry or academia – finds a place within art, journalism, and public writing.

Art has always been where modes of inquiry that don’t fit logics of production and consumption end up, and for me it’s about finding other uses and opportunities for our technologies – how can we bend and rewrite them, how can we tell different stories about them, and thus produce different outcomes?

My writing really isn’t that different: it’s about finding alternative ways to tell these stories, and thus to come to very different understandings of them. Sometimes this takes the form of artworks, sometimes of journalism or lectures, and sometimes, as in the case of New Dark Age, a longer-form book. Each is a process of enquiry, debate, comprehension, and critique.

You use a lot of examples, from politics to chess, in the book. Did these events lead you to write New Dark Age, or did you research them for the book?

They’re all things that I’ve found interesting, over many years! I can’t really put it better than that. When I find a subject interesting, whether it’s a political or social event, or machine chess, or drones, or self-driving cars, or machine learning, I dig in to see what can be said, what can be done about it.


I find these things fascinating – and I think that by connecting them, by seeing how the language we use around them and the patterns we make of them shape and produce society in different ways, we can understand something about ourselves, and the world we find ourselves living in.

Has the experience of writing New Dark Age impacted how you interact with apps, the internet, and technology in general?

Yes. I was already fairly critical in my use of technology: I’ve never used Facebook, for example, because I simply don’t understand the utility of a social operating system based on the behavior of a small number of awful people at Harvard, and why the whole world should have to conform to it.

But I’ve also learned a huge amount more about how centralized and surveilled networks fundamentally shape power relationships and thus our everyday lives, and the ways in which really horrific physical and psychological – even neurological – hacks are built into many of the things we interact with all the time, from the manipulations of recommendation engines to the dopamine rewards of pop-up alerts and endlessly scrolling timelines.

As a result, I’m much more interested in decentralized and distributed networks, open source applications and operating systems, and technology that’s designed with thought and care to educate and support users, rather than enchant, disempower, and even radicalize them.

At the end of each chapter, you round off by saying that we have the agency to ensure that we work positively with machines. How can we go about this, and are there examples of how we are already doing so?

There are various ways of thinking about this, whether it’s the cooperative model of Kasparov’s Advanced Chess – where humans and machines play together rather than against each other – or the restructuring of the underlying networks of ownership made possible by distributed web technologies and peer-to-peer applications like DAT, IPFS, Mastodon, Secure Scuttlebutt, Appear.in, Jit.si, and so on.

There’s also flat-out opposition: refusing to participate in systems of dominance and control, and deliberately obfuscating oneself through encryption and withdrawal – use PGP, use VPNs, and so on. But the only long-term strategy that’s ever worked is mutual support and education: enabling everyone to participate meaningfully in the design and construction of these systems, and turning machines for dominance and extraction into tools for understanding and questioning.

This can take the form of simply better tech education – learning to code, critically, is one very direct way of gaining agency within these systems – but it also requires commitment from those building our social and life-support systems to guide users through these processes, so they are not disempowered but elevated by the technologies they use.

Want to learn more about James Bridle’s views on technology? Don’t miss his talk on the future of tech at TNW2019. If you haven’t already, make sure to order his book, New Dark Age.
