This article was published on November 14, 2017

IBM claims 'quantum supremacy' over Google with 50-qubit processor


Image: IBM Research

IBM researcher Edwin Pednault was doing the dishes one evening when he came to the realization that qubits are a lot like the bristles of a scrubbing brush. What he dubbed a "seemingly inconsequential moment" became the basis of a fault-tolerance theory that makes the 50-qubit quantum computer possible.

Early last month, Google's quantum computer research team announced it had made strides towards what it dubbed "quantum supremacy." The big idea was that a 50-qubit quantum computer would surpass the computational capabilities of our most advanced supercomputers, at least for certain tasks.

Early this month, IBM successfully built and measured an operational prototype 50-qubit processor.

The jury is still out on whether 50 qubits actually represents 'quantum supremacy,' thanks to some new ideas – from IBM of course – on how we can use classical computers to simulate quantum processes.

Pednault's insight, however, was at least in part responsible for a new fault-tolerance capability that helped scale simulations of quantum processors as high as 56 qubits.
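
To get a feel for why simulating 50-plus qubits on classical hardware is such a big deal, consider the brute-force approach of storing every amplitude of the quantum state. The sketch below is a back-of-the-envelope estimate that assumes one double-precision complex number (16 bytes) per amplitude; it is an illustration of the scale of the problem, not a description of IBM's actual simulation technique.

```python
# Rough arithmetic only: memory a brute-force state-vector simulation
# would need, at 16 bytes (one double-precision complex number) per amplitude.
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50, 56):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n} qubits -> {gib:,.0f} GiB of amplitudes")

# 30 qubits fits on a workstation (16 GiB); 50 qubits needs roughly 16 PiB
# and 56 qubits about 1 EiB, which is why cleverer simulation methods matter.
```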

IBM's 50-qubit processor is a phenomenal feat that happened far quicker than any expert predicted. The company also unveiled a 20-qubit quantum processor accessible to developers and programmers via the IBM Q cloud-based platform.

Prior to Pednault's 'eureka' moment, 50 qubits was considered beyond our immediate grasp due to the problem of 'noisy' qubits.

Basically, the more qubits you have in play, the more susceptible their computations become to errors. The problem is compounded by the fact that the number of values a qubit register can represent grows exponentially with each qubit added.

In a company blog post, Pednault described it:

Two qubits can represent four values simultaneously: 00, 01, 10, and 11, again in weighted combinations. Similarly, three qubits can represent 2^3, or eight values simultaneously: 000, 001, 010, 011, 100, 101, 110, 111. Fifty qubits can represent over one quadrillion values simultaneously, and 100 qubits over one quadrillion squared.
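
The doubling Pednault describes is easy to check by listing the basis states directly. The short Python sketch below simply enumerates bit strings and counts them; nothing here is specific to IBM's hardware.

```python
from itertools import product

# Every extra qubit doubles the number of basis states a register can
# hold in (weighted) superposition: 2, 4, 8, ..., 2**n.
for n in (1, 2, 3):
    states = ["".join(bits) for bits in product("01", repeat=n)]
    print(f"{n} qubit(s): {len(states)} states -> {states}")

print(f"50 qubits:  {2 ** 50:,} basis states")   # just over one quadrillion
print(f"100 qubits: {2 ** 100:,} basis states")  # roughly one quadrillion squared
```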

Quantum computing is meaningless if a bunch of noisy qubits throw every calculation out of whack. Adding more processing power, to date, has been accompanied by an increase in errors.
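
A toy model makes the point concrete: if each gate in a circuit fails independently with some small probability, the chance that the whole computation runs error-free shrinks geometrically with circuit size. The 0.1 percent per-gate error rate below is an illustrative assumption, not a measured figure for IBM's processors.

```python
# Toy model: probability that a circuit sees no gate errors at all,
# assuming each gate fails independently with probability p.
def error_free_probability(num_gates: int, p: float = 1e-3) -> float:
    return (1 - p) ** num_gates

for gates in (100, 1_000, 10_000):
    print(f"{gates:>6} gates -> {error_free_probability(gates):.4%} error-free")

# Roughly 90% at 100 gates, 37% at 1,000 and 0.005% at 10,000: without
# error mitigation or fault tolerance, larger computations drown in noise.
```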

The previous state of quantum computing could have been described as a 'mo' qubits, mo' problems' situation. IBM's fancy new method for fault tolerance, inspired by Pednault's scrubbing brush, tackles that problem by keeping the noise from drowning out the computation.

Realistically, quantum computers might not prove truly useful until we reach processors with thousand-qubit capabilities, and the really exciting science-fiction stuff probably won't come until we've developed quantum computers with million-qubit processors – assuming we overcome the fragility of the hardware.

But we won't get there until someone makes a 100-qubit processor, and then a 200-qubit one, and so forth.

Once we've surpassed the capabilities of classical computers, we'll be able to simulate and understand molecular compounds in far greater detail. With this new technology comes the potential to eradicate diseases, eliminate hunger, and repair our environment.

In the race for "quantum supremacy," there aren't any losers. But IBM might just be winning.
