
Classical Computers Shiver Under New Error-Prone Quantum Computing Method

Despite constant advancements, quantum computers are still noisy and prone to errors, which produce ambiguous or incorrect responses. They won’t actually surpass today’s “classical” supercomputers for at least five or ten years, according to scientists, because of the mistakes that plague entangled quantum bits, or qubits.

However, a recent study demonstrates that, even in the absence of effective error correction, there are ways to mitigate errors that could make quantum computers usable right now.

Researchers at IBM Quantum in New York and their collaborators at the University of California, Berkeley, and Lawrence Berkeley National Laboratory report today (June 14) in the journal Nature that they pitted a 127-qubit quantum computer against a state-of-the-art supercomputer and that, for at least one type of calculation, the quantum machine came out ahead.

The researchers note that the computation was chosen not because it was challenging for traditional computers, but because it is akin to calculations that physicists frequently undertake and, importantly, because it could be made progressively harder, allowing the team to assess how accurately today’s noisy, error-prone quantum computers handle such common calculations.

As the calculation became more complex, the quantum computer produced the verifiably correct solution, whereas the supercomputer algorithm produced an incorrect result. This gives hope that quantum computing algorithms with error mitigation, rather than the more challenging error correction, could address cutting-edge physics problems, like understanding the quantum properties of superconductors and novel electronic materials.

“We’re entering the regime where the quantum computer might be able to do things that current algorithms on classical computers cannot do,” said UC Berkeley graduate student and study co-author Sajant Anand.

“We can start to think of quantum computers as a tool for studying problems that we wouldn’t be able to study otherwise,” added Sarah Sheldon, senior manager for Quantum Theory and Capabilities at IBM Quantum.

Conversely, the quantum computer’s trouncing of the classical computer could also spark new ideas for improving the algorithms now used to simulate quantum systems on classical computers, according to co-author Michael Zaletel, UC Berkeley associate professor of physics and holder of the Thomas and Alison Schneider Chair in Physics.

“Going into it, I was pretty sure that the classical method would do better than the quantum one,” he said. “So, I had mixed emotions when IBM’s zero-noise extrapolated version did better than the classical method. But thinking about how the quantum system is working might actually help us figure out the right classical way to approach the problem. While the quantum computer did something that the standard classical algorithm couldn’t, we think it’s an inspiration for making the classical algorithm better so that the classical computer performs just as well as the quantum computer in the future.”

Boost the noise to suppress the noise

Quantum error mitigation, a cutting-edge method for coping with the noise that comes along with a quantum processor, is one of the keys to the apparent advantage of IBM’s quantum computer.

In a seemingly paradoxical move, IBM researchers deliberately increased the noise level in their quantum circuit to produce noisier, less accurate results and then extrapolated backward to estimate the result the computer would have produced if there had been no noise. This requires a thorough understanding of the noise that affects quantum circuits and the ability to predict how it alters the output.
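
The extrapolation step can be illustrated with a minimal sketch: run the same circuit at several deliberately amplified noise levels, fit a simple curve to the measured values, and read the curve off at zero noise. The noise model, amplification factors, and quadratic fit below are illustrative assumptions, not IBM’s actual protocol.

```python
import numpy as np

# Illustrative sketch of zero-noise extrapolation (ZNE); not IBM's actual code.
# Assume the same circuit can be run with its noise deliberately amplified by
# known factors, yielding one measured expectation value per noise level.
noise_factors = np.array([1.0, 1.5, 2.0, 2.5])

def run_circuit(noise_factor):
    """Hypothetical stand-in for executing the circuit on noisy hardware.
    The fake measurement decays toward zero as the noise level grows."""
    ideal_value = 0.80                      # the (unknown) noise-free answer
    return ideal_value * np.exp(-0.3 * noise_factor)

# Measure the observable at each amplified-noise setting.
measured = np.array([run_circuit(f) for f in noise_factors])

# Fit a simple model (a quadratic in the noise factor) and evaluate it at
# zero noise to estimate what a noiseless machine would have returned.
coeffs = np.polyfit(noise_factors, measured, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)

print(f"Measured values:                  {measured.round(3)}")
print(f"Extrapolated zero-noise estimate: {zero_noise_estimate:.3f}")
```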

Noise is an issue because IBM’s qubits are delicate superconducting circuits that encode the zeros and ones of binary operations. Unavoidable disturbances like heat and vibration can alter the entanglement of the qubits during a calculation, introducing errors. The greater the entanglement, the worse the effects of noise.

Additionally, operations on one set of qubits can introduce random errors in other qubits that are not involved, and these errors compound as the computation proceeds.

Scientists would like to use additional qubits to monitor such errors so they can be repaired, an approach known as fault-tolerant error correction. But achieving scalable fault tolerance is a huge engineering challenge, and whether it will work in practice for ever greater numbers of qubits remains to be shown, Zaletel said.
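
As a loose illustration of why extra (qu)bits help, here is a purely classical sketch of a three-bit repetition code: the redundant bits play the monitoring role described above, and a majority vote undoes isolated flips. This is an analogy only; fault-tolerant quantum error correction is considerably more involved, since it must also protect fragile entangled states.

```python
import random

# Purely classical analogy to error correction: encode one logical bit into
# three physical bits, let noise flip each bit with some probability, then use
# the redundancy to repair the damage by majority vote. Real fault-tolerant
# quantum error correction is far more involved, but the extra bits play the
# same monitoring role.
def encode(bit):
    return [bit, bit, bit]                 # redundancy via extra bits

def apply_noise(bits, flip_probability=0.05):
    return [b ^ 1 if random.random() < flip_probability else b for b in bits]

def decode(bits):
    return 1 if sum(bits) >= 2 else 0      # majority vote fixes single flips

random.seed(1)
trials = 100_000
raw_errors = sum(apply_noise([0])[0] for _ in range(trials))
corrected_errors = sum(decode(apply_noise(encode(0))) for _ in range(trials))

print(f"Error rate without redundancy: {raw_errors / trials:.4f}")
print(f"Error rate with 3-bit code:    {corrected_errors / trials:.4f}")
```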

Instead, IBM engineers developed a zero-noise extrapolation (ZNE) error mitigation technique that uses probabilistic methods to controllably increase the noise on the quantum device. Based on a recommendation from a former intern, IBM researchers approached Anand, postdoctoral researcher Yantao Wu, and Zaletel to ask for their help in assessing the accuracy of the results obtained with this error mitigation strategy.

Zaletel develops supercomputer algorithms to solve difficult calculations involving quantum systems, such as the electronic interactions in new materials. These algorithms, which employ tensor network simulations, can be directly applied to simulate interacting qubits in a quantum computer.

Over a period of several weeks, Youngseok Kim and Andrew Eddins at IBM Quantum ran increasingly complex quantum calculations on the advanced IBM Quantum Eagle processor, and then Anand attempted the same calculations using state-of-the-art classical methods on the Cori supercomputer and Lawrencium cluster at Berkeley Lab and the Anvil supercomputer at Purdue University.

When Quantum Eagle was rolled out in 2021, it had the highest number of high-quality qubits of any quantum computer, a number seemingly beyond the ability of classical computers to simulate.

In fact, it would take an enormous amount of memory for a conventional computer to exactly simulate all 127 entangled qubits: the quantum state would have to be represented by 2 to the power of 127 separate numbers, roughly a 1 followed by 38 zeros. Typical computers can store around 100 billion numbers, about 27 orders of magnitude too few.
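
For a sense of scale, here is a rough back-of-the-envelope version of that arithmetic, assuming (purely for illustration) one 16-byte complex amplitude per basis state of a 127-qubit register.

```python
import math

# Rough arithmetic behind the memory estimate in the text. Illustrative
# assumption: one complex double-precision amplitude (16 bytes) per basis state.
n_qubits = 127
amplitudes = 2 ** n_qubits                  # number of complex amplitudes, ~1.7e38
typical_capacity = 100e9                    # ~100 billion numbers on a typical machine

print(f"Amplitudes needed:   {amplitudes:.1e}")
print(f"Memory at 16 B each: {amplitudes * 16:.1e} bytes")
print(f"Shortfall: roughly {math.log10(amplitudes / typical_capacity):.0f} orders of magnitude")
```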

To simplify the problem, Anand, Wu, and Zaletel used approximation techniques that allowed them to solve the problem on a classical computer in a reasonable amount of time, and at a reasonable cost. These techniques are similar to JPEG image compression in that they eliminate unnecessary data and retain only what is necessary to produce correct results within the constraints of the memory at hand.
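
The compression idea at the heart of such tensor-network methods can be sketched with a truncated singular value decomposition, which keeps only the largest singular values of a matrix and discards the rest. The matrix and cutoff below are made up for illustration; this is not the specific algorithm the team ran.

```python
import numpy as np

# Generic sketch of the lossy-compression idea behind tensor-network methods:
# keep only the largest singular values of a matrix and drop the rest, much as
# JPEG discards detail the eye will not miss. Illustration only; not the
# specific algorithm used in the study.
x = np.linspace(0.0, 1.0, 64)
state_block = np.exp(-8.0 * np.abs(x[:, None] - x[None, :]))   # smooth stand-in data

U, s, Vh = np.linalg.svd(state_block, full_matrices=False)

chi = 8                                     # "bond dimension": how many values to keep
compressed = U[:, :chi] @ np.diag(s[:chi]) @ Vh[:chi, :]

kept = chi * (U.shape[0] + Vh.shape[1] + 1)            # numbers actually stored
error = np.linalg.norm(state_block - compressed) / np.linalg.norm(state_block)

print(f"Stored {kept} numbers instead of {state_block.size}")
print(f"Relative error from truncation: {error:.2e}")
```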

For the simpler computations, Anand was able to verify the accuracy of the quantum computer’s findings, but as the complexity of the calculations increased, the results of the two machines began to diverge.

For certain specific parameters, Anand was able to simplify the problem and calculate exact solutions, which confirmed the quantum computer’s answers over the classical computer’s. At the largest circuit depths considered, exact solutions were not available, and the quantum and classical results still disagreed.

Eagle’s results on the prior runs gave the researchers confidence that the final answers provided by the quantum computer for the most difficult computations were accurate, but they caution that they cannot guarantee it.

“The success of the quantum computer wasn’t like a fine-tuned accident. It actually worked for a whole family of circuits it was being applied to,” Zaletel said.

Friendly competition

While Zaletel is cautious about predicting whether this error mitigation technique will work for more qubits or calculations of greater depth, the results were nonetheless inspiring, he said.

“It sort of spurred a feeling of friendly competition,” he said. “I have a sense that we should be able to simulate on a classical computer what they’re doing. But we need to think about it in a clever and better way; the quantum device is in a regime where it suggests we need a different approach.”

One approach is to mimic the ZNE technique developed by IBM.

“Now, we’re asking if we can take the same error mitigation concept and apply it to classical tensor network simulations to see if we can get better classical results,” Anand said. “This work gives us the ability to maybe use a quantum computer as a verification tool for the classical computer, which is flipping the script on what’s usually done.”

Anand and Zaletel’s work was supported by the U.S. Department of Energy under an Early Career Award (DE-SC0022716). Wu’s work was supported by a RIKEN iTHEMS fellowship. Cori is part of the National Energy Research Scientific Computing Center (NERSC), the primary scientific computing facility for the Office of Science in the U.S. Department of Energy.
