The promise of quantum computing supremacy is bunk

Google recently announced that the first demonstration of quantum supremacy could be just a few months away. I think they are entirely wrong. It is not merely more than a few months away; it is forever away.

I explain why below, but first I would like to take exception to the word “supremacy.” What they are promising should not be called “quantum supremacy”; it is one very narrow, specially constructed case in which quantum computing is better than traditional computing.

To call this supremacy makes as much sense as claiming that strips of magnesium enjoy supremacy over electric bulbs for the generation of light. Yes, there is a specific use case for which they are superior, but to say that they enjoy “supremacy” is a grotesque misuse of language; or clever marketing, if you prefer.

A more accurate term than “supremacy” would be “not entirely useless”. In case you were wondering, the magnesium use case is the temporary illumination of Austrian ice caves for the edification of tourists.

Now, back to quantum computing. 

One reason for doubting the latest cheery announcements is simply the field’s long and unbroken series of failed promises. Researchers have been predicting such a breakthrough “within the next decade” for almost forty years now, and despite the billions of dollars poured into development, the target date keeps retreating.

Further, doubts are now being expressed by mathematicians, computer scientists, and physicists, grounded in technical analyses that have gone a long way towards proving that the task is not just extremely difficult but theoretically impossible.

For an overview, here is an excellent article from Quanta Magazine. Among other insights, it points out that the computational power of quantum systems is limited if they are noisy. While the quantum boosters seem to believe that noise is just an engineering problem, I believe that it is absolutely fundamental, in part because computers are entropy engines.

For example, when a large number is decomposed into its prime factors, the entropy of these factors is lower than that of the original number. Since the entropy of the universe tends to increase, the net result of the computation must always be an increase in overall entropy, i.e. noise. 

The promise of quantum computers is that as the number of qubits rises, their computational power to perform such calculations rises exponentially – if only the quantum state could be kept from decaying. But as the computational power of the system rises exponentially with the number of qubits, the amount of energy that must be converted into noise also rises exponentially.  
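The exponential growth mentioned above is easy to make concrete on the classical side: an n-qubit state is described by 2^n complex amplitudes, so even storing it doubles in cost with every added qubit. A minimal sketch (the function name and the 16-bytes-per-amplitude figure, two 64-bit floats, are my own illustrative assumptions):

```python
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Classical memory needed to store the 2**n complex amplitudes of an
    n-qubit state, assuming 16 bytes per amplitude (two 64-bit floats)."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    # 10 qubits: ~16 KB; 30 qubits: ~17 GB; 50 qubits: ~18 petabytes.
    print(n, state_vector_bytes(n))
```

The same doubling applies to whatever physical resource scales with the size of the state space, which is the crux of the argument about noise.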

Noise is not a crippling problem for traditional computers – their circuits can be made robust enough to withstand some noise and whisk it away in the form of heat.

But the quantum state is intrinsically fragile and destroyed by noise. There are attempts to address this challenge with error correcting quantum circuits, but these mechanisms would also have to scale exponentially with the qubit count. They can’t. And so quantum computers will never be usable. 

This is not to say that quantum techniques are without value. For example, they can strengthen cryptography: quantum key distribution makes eavesdropping on a key exchange detectable in principle. But for general computational tasks, such as factoring large numbers, they are, and will remain, worthless.
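The cryptographic value comes from physics, not computation: in the BB84 protocol, an eavesdropper who intercepts and resends photons unavoidably disturbs about 25% of the sifted key bits, which the legitimate parties can detect by comparing a sample. A toy simulation of that effect (the function name and simulation structure are my own; this models only the simplest intercept-resend attack):

```python
import random

def bb84_error_rate(n_photons: int, eavesdropper: bool, seed: int = 0) -> float:
    """Simulate BB84 sifting; return the error rate Alice and Bob observe.

    An intercept-resend eavesdropper disturbs roughly 25% of the sifted
    bits, which is what makes her presence detectable.
    """
    rng = random.Random(seed)
    errors = matched = 0
    for _ in range(n_photons):
        bit = rng.randint(0, 1)          # Alice's raw key bit
        a_basis = rng.randint(0, 1)      # 0 = rectilinear, 1 = diagonal
        if eavesdropper:
            e_basis = rng.randint(0, 1)  # Eve guesses a measurement basis...
            # ...and gets a random outcome whenever she guesses wrong.
            m = bit if e_basis == a_basis else rng.randint(0, 1)
            send_basis, send_bit = e_basis, m  # she resends what she saw
        else:
            send_basis, send_bit = a_basis, bit
        b_basis = rng.randint(0, 1)      # Bob measures in a random basis
        b_bit = send_bit if b_basis == send_basis else rng.randint(0, 1)
        if b_basis == a_basis:           # sifting: keep matching bases only
            matched += 1
            errors += b_bit != bit
    return errors / matched

print(bb84_error_rate(20000, eavesdropper=False))  # 0.0
print(bb84_error_rate(20000, eavesdropper=True))   # ~0.25
```

Note that this is key distribution, not general computation: it needs only single-photon tricks, not a large, coherent register of qubits.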

To be specific, I do not believe that a quantum computer will ever do a better job of factoring a large number than the eight-year-old Intel Core i7-950 running the PC on which I typed this article. And at just $300, it is quite a bargain in comparison to the $15M that a quantum computer can cost.
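For a sense of the classical baseline being compared against: factoring on an ordinary CPU is served by well-understood algorithms such as Pollard's rho, which cracks moderate composites in milliseconds. A minimal sketch (a textbook version of the algorithm, not tied to any particular hardware; the function name is my own):

```python
import math
import random

def pollard_rho(n: int, c: int = 1) -> int:
    """Return a nontrivial factor of a composite n using Pollard's rho
    with Floyd cycle detection. Expected time is roughly n**(1/4)."""
    if n % 2 == 0:
        return 2
    x = y = 2
    d = 1
    while d == 1:
        x = (x * x + c) % n          # tortoise: one step
        y = (y * y + c) % n          # hare: two steps
        y = (y * y + c) % n
        d = math.gcd(abs(x - y), n)
    if d == n:                       # unlucky cycle: retry with a new constant
        return pollard_rho(n, c + 1)
    return d

print(pollard_rho(10403))  # 10403 = 101 * 103, so prints 101 or 103
```

For cryptographically sized numbers classical algorithms do eventually run out of steam, of course; the point of comparison is only that the classical baseline is cheap, mature, and actually works today.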
