Stephen Hawking once suggested Albert Einstein’s assertion that “God does not play dice” with the universe was wrong. In Hawking’s view, the discovery of black hole physics confirmed that not only did God play dice, “but that he sometimes confuses us by throwing them where they can’t be seen.”
Are we here by chance or design?
A more pragmatic approach to the question, considering the subject matter, would be to assume that all answers are correct. In fact, that’s the basis of quantum physics.
Here’s the simplest explanation of how it all works that you’ll ever read: imagine flipping a coin and then walking away, secure in the knowledge that it landed on either heads or tails. You just don’t know which.
If you look at the entire universe and zoom in until you get down to the tiniest particles, you’ll see the exact same effect in their interactions. They’re either going to do one thing or another. And, until you observe them, that potential remains.
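The coin-flip picture above can be sketched in a few lines of code. This is a deliberately simplified toy, not how real quantum toolkits model qubits (they use complex amplitudes and full statevectors); it only illustrates the idea that a qubit holds two possibilities until a measurement collapses it to one:

```python
import random

def make_superposition():
    # Equal superposition of |0> and |1>, like a coin mid-flip.
    # Real amplitudes only, for simplicity; each is 1/sqrt(2).
    amp = 2 ** -0.5
    return [amp, amp]

def measure(state):
    # Observing the qubit forces an outcome: 0 or 1, each with
    # probability equal to its squared amplitude.
    p0 = state[0] ** 2
    return 0 if random.random() < p0 else 1

qubit = make_superposition()
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(qubit)] += 1
print(counts)  # roughly a 50/50 split, like 10,000 coin flips
```

Until `measure` is called, the state is genuinely both options at once; the act of looking is what picks one.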
With all that potential out there in the universe just waiting to be observed, we’re able to build quantum computers.
However, like all things quantum, there’s a duality involved in harnessing God’s dice for our own human needs. For every mind-blowing feat of quantum engineering we come up with — just wait until you read about laser tweezers and time crystals — we need some grounded technology to control it.
In reality, there’s no such thing as a “purely quantum computer” and there probably never will be. They’re all hybrid quantum-classical systems in one way or another.
Let’s start off with why we need quantum computers. Classical (or binary, as they’re often called) computers — the kind you’re reading this on — accomplish goals by working through tasks one instruction at a time.
We program computers to do what we want by giving them a series of commands. If I press the “A” key on my keyboard, then the computer displays the letter “A” on my screen.
Somewhere inside the machine, there’s code telling it how to interpret the key press and how to display the results.
It took our species approximately 200,000 years to get that far.
In the past century or so, we’ve come to understand that Newtonian physics doesn’t apply to things at very small scales, such as particles, or to objects at particularly massive scales, such as black holes.
The most useful lesson we’ve learned in our relatively recent study of quantum physics is that particles can become entangled.
Quantum computers allow us to harness the power of entanglement. Instead of working through one command at a time, as binary computers do, quantum computers can evaluate many possibilities at once. In essence, they’re able to come up with (nearly) all the possible answers at the same time.
The main benefit to this is time. A simulation or optimization task that might take a supercomputer a month to process could be completed in mere seconds on a quantum computer.
The most commonly cited example of this is drug discovery. In order to create new drugs, scientists have to study their chemical interactions. It’s a lot like looking for a needle in a never-ending field of haystacks.
There are near-infinite possible chemical combinations in the universe, and sorting out their combined chemical reactions is a task no supercomputer can complete within a useful amount of time.
Quantum computing promises to accelerate these kinds of tasks and make previously impossible computations commonplace.
But it takes more than just expensive, cutting-edge hardware to produce these ultra-fast outputs.
Hybrid quantum computing has entered the chat
Hybrid quantum computing systems integrate classical computing platforms and software with quantum algorithms and simulations.
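That division of labor is easiest to see in the loop that variational hybrid algorithms run: a classical optimizer proposes parameters, the quantum side evaluates how good they are, and the two trade off until the answer converges. Here’s a minimal sketch of that loop; the `quantum_energy` function is a classical stand-in for what would really be a parameterized circuit run on quantum hardware, and all names are illustrative, not a real vendor API:

```python
import math

def quantum_energy(theta):
    # Stand-in for the quantum half of the loop: running a
    # parameterized circuit and estimating an energy from
    # measurement statistics. Here it's just a toy landscape
    # with its minimum (energy 0) at theta = 0.
    return 1 - math.cos(theta)

def hybrid_minimize(theta=2.0, lr=0.2, steps=100, eps=1e-4):
    # Classical half of the loop: finite-difference gradient
    # descent, nudging the parameter toward lower energy.
    for _ in range(steps):
        grad = (quantum_energy(theta + eps)
                - quantum_energy(theta - eps)) / (2 * eps)
        theta -= lr * grad
    return theta, quantum_energy(theta)

theta, energy = hybrid_minimize()
print(round(energy, 4))  # converges close to the true minimum of 0
```

The quantum processor never runs alone: every iteration bounces between classical control software and quantum evaluation, which is why these machines are hybrids by construction.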
And, because they’re ridiculously expensive and mostly experimental, they’re almost exclusively accessed via cloud connectivity.
In fact, there’s a whole suite of quantum technologies out there aside from hybrid quantum computers, though they’re the technology that gets the most attention.
In a recent interview with Neural, the CEO of SandboxAQ (a Google sibling company under the Alphabet umbrella), Jack Hidary, lamented:
For whatever reason, the mainstream media seems to only focus on quantum computing.
There are also quantum sensing, quantum communications, quantum imaging, and quantum simulations — although, some of those overlap with quantum hybrid computing as well.
The point is, as Hidary also told Neural, “we’re at an inflection point.” Quantum tech is no longer a far-future technology. It’s here in many forms today.
But the scope of this article is limited to hybrid quantum computing technologies. And, for that, we’re focused on two things:
- Quantum annealing systems
- Gate-based quantum computers
Is this for here or to go?
There are two kinds of problems in the quantum computing world: optimization problems and… the kind that aren’t optimization problems.
For the former, you need a quantum annealing system. And, for everything else, you need a gate-based quantum computer… probably. Those are still very much in the early stages of development.
But companies such as D-Wave have been building quantum annealing systems for decades.
Here’s how D-Wave describes the annealing process:
The system starts with a set of qubits, each in a superposition state of 0 and 1. They are not yet coupled. When they undergo quantum annealing, the couplers and biases are introduced and the qubits become entangled. At this point, the system is in an entangled state of many possible answers. By the end of the anneal, each qubit is in a classical state that represents the minimum energy state of the problem, or one very close to it.
Here’s how we describe it here at Neural: have you ever seen one of those 3-D pin art sculpture things?
That’s pretty much what the annealing process is. The pin art sculpture thing is the computer and your hand is the annealing process. What’s left behind is the “minimum energy state of the problem.”
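To make the “minimum energy state” concrete, here’s a toy version of the kind of problem an annealer solves. The biases and couplers below are made-up numbers for illustration, and instead of physically relaxing into the answer the way D-Wave’s hardware does, we just brute-force all the configurations classically and keep the lowest-energy one:

```python
from itertools import product

# A tiny binary optimization problem in the form annealers use:
# E(x) = sum_i a_i * x_i + sum_(i<j) b_ij * x_i * x_j, x_i in {0, 1}.
linear = {0: -1.0, 1: -1.0, 2: 2.0}      # per-qubit biases (made up)
quadratic = {(0, 1): 2.5, (0, 2): -1.0}  # coupler strengths (made up)

def energy(bits):
    e = sum(linear[i] * bits[i] for i in linear)
    e += sum(b * bits[i] * bits[j] for (i, j), b in quadratic.items())
    return e

# The "pin art" moment: of all 2^3 configurations, the one left
# standing is the minimum energy state of the problem.
best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))
```

Brute force works here because there are only eight configurations; the point of annealing hardware is that the count of configurations doubles with every qubit, and the physics finds the low-energy state without checking them one by one.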
Gate-based quantum computers, on the other hand, function entirely differently. They’re incredibly complex and there are a number of different ways to implement them but, essentially, they run algorithms by applying sequences of quantum logic gates to qubits.
These include Microsoft’s new cutting-edge experimental system which, according to a recent blog post, is almost ready for prime time:
Microsoft’s approach has been to pursue a topological qubit that has built-in protection from environmental noise, which means it should take far fewer qubits to perform useful computation and correct errors. Topological qubits should also be able to process information quickly, and one can fit more than a million on a wafer that’s smaller than the security chip on a credit card.
And even the most casual of science readers have probably heard about Google’s amazing time crystal breakthrough.
Last year, here on Neural, I wrote:
Google’s ‘time crystals’ could be the greatest scientific achievement of our lifetimes.
A time crystal is a new phase of matter that, simplified, would be like having a snowflake that constantly cycled back and forth between two different configurations. It’s a seven-pointed lattice one moment and a ten-pointed lattice the next, or whatever.
What’s amazing about time crystals is that when they cycle back and forth between two different configurations, they don’t lose or use any energy.
Heck, even D-Wave, the company that put quantum annealing on the map, has plans to introduce cross-platform hybrid quantum computing to the masses with an upcoming gate-based model of its own.
What’s next for the quantum computing industry
The quantum computing industry is already thriving. As far as we’re concerned here at Neural, the mainstream is just now starting to catch a whiff of what the 2030s are going to look like.
As Bob Wisnieff, CTO of IBM Quantum, told Neural back in 2019 when IBM unveiled its first commercial quantum system:
We get to be in the right place at the right time for quantum computing, this is a joy project… This design represents a pivotal moment in tech.
According to Wisnieff and others building the hybrid quantum computer systems of tomorrow, the timeline from experimental to fully implemented is very short.
While annealing and similar quantum optimization systems have been around for years, we’re now seeing the first generation of gate-based systems capable of quantum advantage come to market.
You might remember reading about “quantum supremacy” a few years back. Quantum advantage is the same thing but, semantically speaking, it’s a bit more accurate. Both terms represent the point at which a quantum computer can perform, in a reasonable amount of time, a given function that would take a classical computer impractically long to do.
The reason “supremacy” quickly went out of favor is because quantum computers rely on classical computers to perform these functions, so it makes more sense to say they give an advantage when used in tandem. That’s the very definition of hybrid quantum computing.
As for what’s next? It’s unlikely you’ll see a ticker-tape parade for quantum computing any time soon. There won’t be an iPhone of quantum computers, or a cultural zeitgeist surrounding the launch of a particular processor.
Instead, like all great things in science, over the course of the next five, 10, 100, and 1,000 years, scientists and engineers will continue to pass the baton from one generation to the next as they stand upon the shoulders of giants to see into the future.
Thanks to their continuing work, in our lifetimes we’re likely to see vast improvements to power grids, a resolution to mass scheduling conflicts, dynamic shipping optimizations, pitch-perfect quantum chemistry simulations, and even the first inklings of far-future tech such as warp engines.
These technological advances will improve our quality of life, extend our lives, and help us to reverse human-caused climate change.
Hybrid quantum computing is, in our humble opinion here at Neural, the single most important technology humankind has ever endeavored to develop. We hope you’ll stick with us as we continue to blaze a trail of coverage at the frontier of this new and exciting realm of engineering.