This article was published on March 14, 2019

IBM’s latest trick: Turning noisy quantum bits into machine learning magic



IBM’s figured out how to work with noisy qubits and run machine learning algorithms in quantum feature spaces. Eureka-cadabra! The age of quantum algorithms is upon us.

A team of IBM researchers, alongside scientists from MIT and Oxford, created a pair of quantum classification algorithms and then experimentally implemented them on a hybrid setup pairing a two-qubit superconducting quantum processor with a classical computer. Basically, they demonstrated a path by which quantum computers could provide machine learning advantages that classical computers alone cannot.

According to the researchers’ paper:

Here we propose and experimentally implement two quantum algorithms on a superconducting processor. A key component in both methods is the use of the quantum state space as feature space. The use of a quantum-enhanced feature space that is only efficiently accessible on a quantum computer provides a possible path to quantum advantage. The algorithms solve a problem of supervised learning: the construction of a classifier.

Another way of putting it: we now have a road map for quantum advantage in machine learning. This is the point at which a quantum system can run or optimize algorithms better than any classical computer. We’re not quite there yet, as IBM’s research blog points out:

Our research doesn’t yet demonstrate Quantum Advantage because we minimized the scope of the problem based on our current hardware capabilities, using only two qubits of quantum computing capacity, which can be simulated on a classical computer … What we’ve shown is a promising path forward.
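
To make the core idea concrete (data points get encoded into quantum states, and the classifier works off the overlaps between those states), here’s a minimal NumPy sketch. The single-rotation encoding below is our own toy stand-in, not the paper’s feature map, which is an entangling circuit believed to be hard to simulate classically; that hardness is exactly where the hoped-for advantage comes from.

```python
import numpy as np

def feature_map(x):
    """Encode a two-feature data point as a two-qubit state.
    Toy angle encoding: each feature sets one single-qubit rotation.
    (The paper's real feature map is an entangling circuit tailored
    to the hardware; this is just the simplest possible stand-in.)"""
    def qubit(theta):
        return np.array([np.cos(theta / 2), np.sin(theta / 2)])
    # Tensor product of the two single-qubit states: a 4-dim state vector
    return np.kron(qubit(x[0]), qubit(x[1]))

def quantum_kernel(x1, x2):
    """Kernel entry: squared overlap between the two encoded states."""
    return abs(np.vdot(feature_map(x1), feature_map(x2))) ** 2

# The kernel matrix over a dataset can be fed to any classical kernel
# method, e.g. scikit-learn's SVC(kernel="precomputed")
X = np.array([[0.1, 0.9], [0.8, 0.2], [0.2, 0.8]])
K = np.array([[quantum_kernel(a, b) for b in X] for a in X])
print(np.round(K, 3))
```

On a real device the overlap isn’t read off a state vector; it’s estimated from repeated circuit runs and measurements, which is what makes hard-to-simulate feature maps usable at all.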


TNW spoke to Dr. Kristan Temme, IBM Research physicist and co-author on the team’s white paper, and Dr. Bob Sutor, VP for IBM Q Strategy, about this Reese’s-peanut-butter-cup-esque mash-up of machine learning and quantum computing. Temme told us the team designed the experiment to work with today’s noisy systems: “these are basically algorithms that should run on a device that doesn’t have fault tolerance,” he said.

This is important because, as it stands, one of the major hurdles to quantum computers becoming useful outside of laboratories is decoherence, which is basically a manifestation of quantum noise.
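
For a rough feel of what that noise does, here’s a toy depolarizing-noise model in NumPy (our illustration, not IBM’s actual error model): every pass through the channel nudges the qubit toward a featureless mixed state, eroding the information the computation depends on.

```python
import numpy as np

def depolarize(rho, p):
    """Depolarizing channel: with probability p, replace the qubit's
    state with the maximally mixed state I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

# Start in the pure superposition state |+>
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)

# Fidelity with the ideal state decays as noise accumulates
for step in range(5):
    fidelity = plus @ rho @ plus
    print(f"step {step}: fidelity = {fidelity:.3f}")
    rho = depolarize(rho, p=0.2)
```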

The idea here is not to wait a decade or two for perfect quantum hardware before figuring out how to develop and program for these systems. IBM’s work showcases how classical and quantum computers will work together to solve problems.
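
That division of labor already has a standard shape: a quantum processor evaluates a parameterized circuit, and a classical machine tweaks the parameters between runs. Below is a generic sketch of that loop (a simulated one-qubit stand-in of our own, not the paper’s variational classifier):

```python
import numpy as np

def quantum_expectation(theta):
    """Stand-in for the quantum half: on real hardware a parameterized
    circuit runs and measurements are averaged. Here we simulate a
    single rotated qubit and return its <Z> expectation value."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    z = np.diag([1.0, -1.0])
    return state @ z @ state

# Classical half: a crude gradient-free search over the parameter
theta, step = 0.0, 0.4
for _ in range(25):
    candidates = [theta - step, theta, theta + step]
    theta = min(candidates, key=quantum_expectation)  # keep the best move

print(f"optimized theta = {theta:.2f}, <Z> = {quantum_expectation(theta):.3f}")
```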

And speaking of working together: IBM open-sourced the algorithms. If you’re wondering why on Earth a big tech company would do that, you’re not alone. We asked Sutor, who told us:

We’re doing everything that we can to get quantum into the hands of the people … We learned a lot about open source over the years. Open source is an essential way for people to develop software.

Temme added: “We’re hoping that many people will engage with the algorithms.”

To that end, you can try a cool demo of the algorithms for yourself here – no quantum physics or computer skills required. For those who want to go a little deeper, IBM has released them through Qiskit Aqua, an open-source library of quantum algorithms for developers and researchers to use with IBM’s cloud-accessed quantum computers.
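
For a taste of what that looks like in code, here’s a sketch of the quantum-kernel classifier workflow in Qiskit. A heads-up: the module and class names below follow recent qiskit-machine-learning releases rather than the Aqua API of the time, and exact names can shift between versions, so treat the specifics as illustrative.

```python
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC

# Entangling feature map of the kind proposed in the paper
feature_map = ZZFeatureMap(feature_dimension=2, reps=2)

# Kernel entries are state overlaps, estimated from circuit executions
kernel = FidelityQuantumKernel(feature_map=feature_map)

# Toy training data: two labeled clusters in two dimensions
X_train = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y_train = np.array([0, 0, 1, 1])

# A classical support vector machine trained on the quantum kernel
qsvc = QSVC(quantum_kernel=kernel)
qsvc.fit(X_train, y_train)
print(qsvc.predict(np.array([[0.15, 0.15], [0.85, 0.85]])))
```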

We’re still in the early days of quantum computers, as IBM’s CTO of Quantum Computing, Bob Wisnieff, recently told TNW:

Imagine if everyone in the 60s had five to ten years to explore the mainframe’s hardware and programming when it was essentially still a prototype. That’s where we are with quantum computing.

IBM’s latest research blazes a trail forward for both quantum computing and machine learning. We can’t wait to see what’s on the other side of quantum machine learning advantage.


Want to learn more about AI than just the headlines? Check out our Machine Learners track at TNW2019.
