I’m not sure he meant it this way, but when tech entrepreneur and venture capitalist Marc Andreessen declared that software is going to eat the world, he may have been projecting both the future of technology and the impending risk of a zombie apocalypse. I hate to spoil your Sunday night fix of The Walking Dead, but the “virus” that causes the zombie outbreak is likely going to be software related.
When it comes to viruses, there are striking similarities between technology and biology in how something “goes viral”: both spread at an exponential rate that outpaces our own ability to keep up. As that rate continues to escalate, consider the technology impact over the coming years:
- Recently, a single commodity server’s intellectual capability surpassed that of a field mouse. That may not seem like much, but remember the famous hundred-million-dollar IBM “Deep Blue” supercomputer that beat chess master Garry Kasparov in the late 1990s? This single server is over 30x more powerful at 100,000x lower cost.
- By 2025, your laptop’s emulated human intelligence capabilities will surpass your own.
- By 2045, your smartphone will far exceed the collective intellect of all people on the planet.
- By that same time, there’s expected to be over 50 billion machines working together.
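The projections above are, at bottom, doubling arithmetic. Here is a minimal illustrative sketch; the 18-month doubling period is a Moore’s-law-style assumption for illustration, not a figure from this post:

```python
# Illustrative sketch only: "exponential" growth in machine capability,
# assuming capability doubles on a fixed schedule.

def growth_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Return how many times capability multiplies after `years`,
    assuming it doubles every `doubling_period_years` (assumed value)."""
    return 2 ** (years / doubling_period_years)

# Under this assumed doubling period, capability multiplies roughly
# 100,000-fold over about 25 years -- the same order of magnitude as the
# Deep-Blue-to-commodity-server comparison above.
print(f"{growth_factor(25):,.0f}x")
```

The point of the sketch is only that modest, steady doubling compounds into the seemingly outlandish numbers in the timeline above.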
Currently, software-driven technology infrastructure powers the world. Not only is every aspect of modern societal life critically dependent on it, but within the timeframes above our own core intellectual capacities will be supplemented by it.
Software is indeed eating the world, but what happens when it gets major indigestion in the process? Given the right catastrophe, the answer could well be the zombie apocalypse. With that in mind, let’s explore a predictive root-cause analysis of two potential apocalyptic scenarios: one in which people turn into zombies, and one in which machines do:
Scenario One: The human zombie outbreak
In scenario one, an extended technology infrastructure failure, whether from a natural catastrophe or a man-made one (EMP weaponry, cyber terrorism), serves as the virus of the apocalypse, and human zombies are the result:
Within weeks of broad loss of core infrastructure, widespread human pandemonium ensues.
Our bank accounts are gone. Informational and educational content is lost. Supply chain capabilities (food, water, fuel) evaporate. Manufacturing and agricultural production capabilities disappear (no more machines). We are left in the dark, in both energy and communications terms. Nuclear reactor backup power fuel supplies dwindle, threatening environmental catastrophe.
Modern societal structure and governmental capabilities quickly fail. Without the machines we have continuously learned to depend on, chaos ensues.
Hungry people in chaotic environments quickly become desperate “zombies.” If people can turn into deranged lunatics over $99 Black Friday electronics specials at Walmart, imagine what happens in times of zero order, zero food and zero water.
Two weeks of broad societal technology infrastructure failure is enough to drive scenario one of the zombie apocalypse. The biggest current risk of a taste of this scenario is a “cyber 9/11”-type event that exploits the security gaps that exist today in our infrastructure, systems and software.
In scenario one, data encryption and diligence in security and privacy are the two biggest weapons for avoiding and defeating this zombie apocalypse.
Scenario Two: The machine zombie outbreak
In scenario two, the risks of scenario one are eventually addressed through effective machine learning and AI. In essence, the machines learn to protect themselves from bad guys and other machines.
In this scenario, however, what helps avoid the zombie apocalypse of scenario one is what causes the zombie apocalypse of scenario two:
As machine capabilities continue to rapidly outpace human capabilities, humans begin to relinquish ever more control to highly capable AI-driven machines that can maintain and improve themselves with less and less human intervention.
With human error being the primary risk to machine survivability, AI logic deems human intervention itself a potential “virus” and begins to take preventive measures against that risk.
If you teach a machine to protect itself, and it deems you as the risk, what can it do when you try to turn it off? By the time this challenge presents itself, our military will have already expanded beyond drone weaponry to more advanced robotic weaponry.
Highly powerful, logic-based machines with no emotion become zombies that target humans as risks. Having control over all aspects of critical societal infrastructure (those named in scenario one) gives the machines great leverage against us.
In scenario two, diligence in maintaining strict controls over distributed machine infrastructure is the key weapon to avoid and defeat this zombie apocalypse.
This write-up may be half kidding, but perhaps more than half not-kidding.
Exponential technology advancement is a great power. Greater than even nuclear power. That said, “with great power comes great responsibility.” If we don’t properly manage this great responsibility, then none of us will need a TV to realize The Walking Dead. We’ll be living it. Not kidding.
Litbit is fully committed to this great responsibility, and effective, secure human-machine relationship management is a key part of it. Not kidding here either.
This post is part of our contributor series. It is written and published independently of TNW.