A collaboration between the Applied Mathematics and Computational Research Division (AMCRD) and the Physics Division at Lawrence Berkeley National Laboratory (Berkeley Lab) has resulted in a new approach to error mitigation that could help make quantum computing’s theoretical potential a reality.
The research introduces a new quantum error mitigation technique, “noise estimation circuits,” described in “Mitigating Depolarizing Noise on Quantum Computers using Noise-Estimation Circuits,” published in Physical Review Letters.
“Quantum computers have the ability to answer more difficult problems far faster than traditional computers,” said Bert de Jong, one of the study’s principal authors and director of the AIDE-QC and QAT4Chem quantum computing projects. De Jong also serves as the director of the AMCRD’s Applied Computing for Scientific Discovery Group. “The actual issue, however, is that quantum computers are still in their infancy. And there is still much work to be done to make them reliable.”
One issue for the time being is that quantum computers are still too error-prone to be consistently useful, largely because of a phenomenon known as “noise”: unwanted disturbances that introduce errors.
Readout noise and gate noise are two examples. The former affects reading out the results of a run on a quantum computer: the noisier the readout, the more likely a qubit (the quantum counterpart of a bit on a classical computer) will be measured in the incorrect state. The latter affects the operations actually carried out: gate noise is the chance that the wrong operation is effectively performed. Furthermore, noise accumulates substantially as more operations are attempted, making it harder to extract the correct result and severely limiting quantum computers’ usefulness as they scale up.
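Readout noise can be pictured as a small probability of each measured bit being flipped. The following is a minimal illustrative sketch, not anything from the study itself; the 5% flip rate is an assumed number chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def simulate_readout(true_bits, p_flip):
    """Model readout noise: each measured bit is flipped
    with probability p_flip."""
    flips = rng.random(true_bits.shape) < p_flip
    return np.where(flips, 1 - true_bits, true_bits)

# 10,000 shots of a qubit prepared (and ideally measured) as 0
true_bits = np.zeros(10_000, dtype=int)
noisy_bits = simulate_readout(true_bits, p_flip=0.05)

# Roughly 5% of shots are now read out in the wrong state
print(f"fraction misread as 1: {noisy_bits.mean():.3f}")
```

Even this tiny per-shot error rate becomes a serious problem once many qubits are measured over many operations, which is why readout error mitigation matters.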
“So noise here simply implies stuff you don’t want, and it obscures the outcome you do want,” explained Ben Nachman, a Berkeley Lab physicist and co-author on the work who also leads the cross-cutting Machine Learning for Fundamental Physics group.
Full error correction, as practiced on classical computers, would be ideal, but it is not yet practicable on existing quantum computers because of the number of qubits it requires. The next best thing is error mitigation: methods and tools for reducing noise and minimizing errors in the outputs of quantum simulations. “On average,” Nachman added, “we want to be able to say what the appropriate answer should be.”
To get there, the Berkeley Lab researchers devised a new method known as noise estimation circuits. A circuit is a set of operations, effectively a program, that is executed on a quantum computer to compute the solution to a scientific problem. The scientists constructed a companion circuit designed to produce a known answer (0 or 1), then used the difference between the measured and expected answers to correct the real circuit’s output.
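Under a simple global depolarizing noise model, the core idea can be sketched in a few lines: a companion circuit whose ideal answer is known reveals how much the noise shrinks the signal, and the physics circuit’s measured result is rescaled accordingly. This is an idealized illustration under an assumed noise model, not the paper’s full procedure; all numbers below are made up for the example.

```python
def depolarize(ideal_expval, fidelity):
    """Assumed global depolarizing noise model: the measured
    expectation value shrinks toward 0 by the circuit fidelity,
    noisy = fidelity * ideal."""
    return fidelity * ideal_expval

f_true = 0.7          # the device's true fidelity (unknown in practice)

# Noise-estimation circuit: same depth and gate structure as the
# physics circuit, but built so its ideal answer is known exactly.
est_ideal = 1.0
est_measured = depolarize(est_ideal, f_true)

# Physics circuit: its ideal answer is what we want to compute.
phys_ideal = 0.42     # stand-in for the unknown true answer
phys_measured = depolarize(phys_ideal, f_true)

# Estimate the noise from the estimation circuit, then rescale.
f_est = est_measured / est_ideal
corrected = phys_measured / f_est
print(corrected)      # recovers 0.42 under this idealized model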
The noise estimation circuit approach corrects some inaccuracies, but not all. The Berkeley Lab scientists therefore combined their new strategy with three other error mitigation techniques: readout error mitigation using “iterative Bayesian unfolding,” a technique often employed in high-energy physics; randomized compiling; and error extrapolation. By bringing all of these elements together, they were able to obtain trustworthy results from an IBM quantum computer.
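Iterative Bayesian unfolding, borrowed from high-energy physics, repeatedly applies Bayes’ rule to undo a known readout-error “response matrix.” Below is a minimal sketch; the flip rates in the matrix are illustrative assumptions, not calibrated values from any real device.

```python
import numpy as np

def iterative_bayesian_unfold(measured, response, n_iter=20):
    """Iterative Bayesian unfolding (D'Agostini-style iteration).
    response[i, j] = P(measure outcome i | true outcome j).
    Returns an estimate of the true outcome distribution."""
    # Start from a flat prior over true outcomes
    t = np.full(response.shape[1], measured.sum() / response.shape[1])
    for _ in range(n_iter):
        folded = response @ t                       # expected measured counts
        t = t * (response.T @ (measured / folded))  # Bayes update
    return t

# Toy single-qubit readout: assume 5% chance of misreading 0 as 1,
# and 4% chance of misreading 1 as 0 (columns sum to 1).
R = np.array([[0.95, 0.04],
              [0.05, 0.96]])
true_counts = np.array([800.0, 200.0])
measured = R @ true_counts      # what the noisy device would report
unfolded = iterative_bayesian_unfold(measured, R)
print(np.round(unfolded))       # close to [800, 200]
```

Unlike naively inverting the response matrix, the iterative update keeps the estimated counts non-negative, which is one reason the technique is popular for unfolding detector effects in high-energy physics.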
Making bigger simulations possible
This research could have far-reaching implications for the field of quantum computing. The new error mitigation method enables researchers to extract the correct answer from simulations that require a large number of operations, “far more than what individuals have traditionally been able to perform,” according to de Jong.
Instead of performing tens of entangling (controlled-NOT) operations, the new technique allows researchers to perform hundreds of such operations while still obtaining trustworthy results, he noted. “As a result, we can now perform larger simulations that were previously impossible.”
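One ingredient that helps deep circuits like these still yield usable answers is the error extrapolation mentioned earlier: run the circuit at deliberately amplified noise levels, measure the result at each level, and extrapolate back to zero noise. A minimal linear-fit sketch follows; the decay model and all numbers are assumptions made for illustration.

```python
import numpy as np

def zero_noise_extrapolate(scales, expvals):
    """Fit a line to expectation values measured at amplified
    noise levels and extrapolate to zero noise (scale = 0)."""
    slope, intercept = np.polyfit(scales, expvals, deg=1)
    return intercept

# Pretend the measured value decays linearly with the noise scale
# (assumed model): expval(s) = 0.42 - 0.1 * s
scales = np.array([1.0, 1.5, 2.0])
expvals = 0.42 - 0.1 * scales

print(zero_noise_extrapolate(scales, expvals))  # extrapolates to ~0.42
```

In practice the decay is rarely exactly linear, so higher-order or exponential fits are often used instead; the principle of extrapolating to the zero-noise limit is the same.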
Furthermore, according to de Jong, the Berkeley Lab group was able to apply these strategies effectively on a quantum computer that was not specifically tuned to suppress gate noise, which broadens the appeal of the new error mitigation strategy.
“It’s a positive thing because if you can do it on those platforms, we can definitely do it even better on less noisy ones,” he said. “As a result, it’s a pretty universal method that we can utilize on a variety of platforms.”
For researchers, the new error mitigation strategy means that quantum computers may be able to tackle larger, more difficult problems. For example, scientists will be able to run chemistry simulations with far more operations than previously possible, according to de Jong, a computational chemist by trade.
“My focus is on trying to tackle challenges related to carbon capture, battery research, and catalysis,” he explained. “As a result, my portfolio has always been: I perform science, but I also design the instruments that allow me to do science.”
Quantum computing advances could lead to breakthroughs in a variety of fields, including energy generation, decarbonization, and greener industrial processes, as well as drug development and artificial intelligence. Quantum computing could also help uncover hidden patterns in data from the Large Hadron Collider (LHC) at CERN, where researchers send particles crashing into each other at extraordinarily high energies to understand how the cosmos works and what it is made of.
Error mitigation will be critical in moving quantum computing ahead in the near term.
“The better the error mitigation, the more operations we can apply to our quantum computers, which means that someday, hopefully soon, we’ll be able to make calculations on a quantum computer that we couldn’t make now,” said Nachman, who is particularly interested in the potential for quantum computing in high-energy physics, such as further investigating the strong force that holds nuclei together.
A cross-division team effort
The work, which began in late 2020, is the latest in a long line of collaborations between Berkeley Lab’s Physics and Computational Research divisions. According to Nachman, cross-divisional work is vital to the study and development of quantum computing. Nachman and his colleague Christian Bauer, a Berkeley Lab theoretical physicist, approached de Jong after the US Department of Energy (DOE) issued a funding call a few years ago as part of a pilot initiative to explore whether researchers could find ways to use quantum computing for high-energy physics.
“We said, ‘We have this idea. We’re doing these calculations. What do you think?’” Nachman explained. “We wrote a proposal. It got funded. And it’s now a significant portion of what we do.” According to Nachman, there is a lot of interest in this technology across the board. “We have significantly benefited from collaborating with (de Jong’s) group, and I believe it works both ways,” he said.
De Jong concurred. “It’s been interesting understanding each other’s physics languages and discovering that, at the core, we have similar requirements and algorithmic demands when it comes to quantum computing,” he said.