Despite the current limitations of noisy, error-prone quantum computers, a recent study suggests that errors can be mitigated well enough to make these machines useful even without robust error correction. Scientists from IBM Quantum, the University of California, Berkeley, and Lawrence Berkeley National Laboratory published their findings in the journal Nature. In the study, they compared a 127-qubit quantum computer to a state-of-the-art supercomputer and found that the quantum computer outperformed the supercomputer in at least one type of calculation.
The chosen calculation was not selected to be especially difficult for classical computers; rather, it resembled the calculations physicists commonly perform. The researchers designed the experiment to test whether today’s quantum computers, despite their noise and errors, could produce accurate results for certain types of calculations as those calculations are made progressively more complex.
The significant finding was that as the calculation became more intricate, the quantum computer consistently produced the correct solution, whereas the supercomputer algorithm yielded an incorrect answer. This discovery provides hope that quantum computing algorithms incorporating error mitigation, rather than error correction, could tackle cutting-edge physics problems. For instance, it could aid in understanding the quantum properties of superconductors and novel electronic materials.
The researchers believe we are approaching a phase where quantum computers might accomplish tasks beyond current classical algorithms. As Sajant Anand, a co-author of the study and graduate student at UC Berkeley, put it, “We’re entering the regime where the quantum computer might be able to do things that current algorithms on classical computers cannot do.”
Sarah Sheldon, Senior Manager for Quantum Theory and Capabilities at IBM Quantum, said that quantum computers can now be seen as a tool for studying problems that were previously infeasible to explore. On the other hand, the quantum computer’s superior performance could also serve as a catalyst for improving the classical algorithms it was compared against. Michael Zaletel, co-author of the study and Associate Professor of Physics at UC Berkeley, suggested that the unexpected success of the quantum approach could inspire advances in classical algorithms. While he initially expected the classical method to outperform the quantum one, Zaletel sees the quantum system’s success as an opportunity to develop more effective classical approaches. By studying how the quantum computer achieves its results, researchers aim to improve classical algorithms enough to match the future performance of quantum computers.
Boost the noise to suppress the noise
One of the key factors contributing to the apparent advantage of IBM’s quantum computer is the implementation of quantum error mitigation, a novel technique aimed at addressing the inherent noise in quantum computations. Interestingly, researchers at IBM deliberately increased the noise in their quantum circuit, resulting in even noisier and less accurate answers. They then extrapolated backward to estimate the answer the computer would have produced if no noise had been present. This approach relies on a thorough understanding of the noise affecting quantum circuits and the ability to predict its impact on the output.
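The core of this approach can be sketched in a few lines. The snippet below is a minimal illustration of the extrapolation idea, not IBM's actual procedure: expectation values are measured at several artificially amplified noise levels, a curve is fit through them, and the fit is evaluated at zero noise. The noise factors and measured values here are invented for illustration.

```python
import numpy as np

def zero_noise_extrapolate(noise_factors, expectation_values, degree=1):
    """Fit expectation values measured at amplified noise levels,
    then evaluate the fit at noise = 0 to estimate the noiseless answer."""
    coeffs = np.polyfit(noise_factors, expectation_values, degree)
    return np.polyval(coeffs, 0.0)

# Illustrative (made-up) data: an observable measured at noise
# amplification factors of 1x, 2x, and 3x the device's native noise.
factors = [1.0, 2.0, 3.0]
measured = [0.80, 0.62, 0.44]  # signal decays as noise grows

estimate = zero_noise_extrapolate(factors, measured)
print(round(estimate, 2))  # -> 0.98, the extrapolated zero-noise value
```

In practice the fit model matters: a linear fit suffices only when the signal decays roughly linearly over the sampled noise range, and real implementations often use exponential or higher-order extrapolation.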
The noise arises from the sensitivity of IBM’s qubits, superconducting circuits that represent the zeroes and ones of binary computation. When these qubits are entangled for a calculation, external disturbances such as heat and vibration can disrupt the entanglement and introduce errors, and the noise becomes more detrimental as the entanglement grows.
Moreover, computations that affect a specific set of qubits can introduce random errors in other unrelated qubits. These errors can then accumulate and amplify as additional computations are performed. Scientists are aiming to implement fault-tolerant error correction, using extra qubits to monitor and rectify these errors. However, achieving scalable fault-tolerance presents significant engineering challenges, and it remains to be seen whether it can effectively handle larger numbers of qubits, as noted by Zaletel.
Instead of relying on fault-tolerant error correction, IBM engineers devised a strategy called zero noise extrapolation (ZNE). This approach employs probabilistic methods to deliberately increase the noise within the quantum device. Based on a recommendation from a former intern, IBM researchers sought assistance from Anand, Wu, and Zaletel to assess the accuracy of results obtained using this error mitigation strategy. Zaletel specializes in developing supercomputer algorithms for complex calculations involving quantum systems, such as electronic interactions in novel materials. These algorithms, which utilize tensor network simulations, can be directly applied to simulate interacting qubits within a quantum computer.
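The compression at the heart of such tensor network methods is low-rank truncation: small singular values are discarded so the state can be stored with far fewer numbers. The toy example below is a generic sketch of that idea, not the study's actual simulation code, and the matrix sizes are arbitrary.

```python
import numpy as np

def truncate(matrix, chi):
    """Keep only the chi largest singular values -- the same kind of
    truncation tensor networks use to cap the 'bond dimension' and
    keep memory requirements manageable."""
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    return u[:, :chi] @ np.diag(s[:chi]) @ vt[:chi, :]

rng = np.random.default_rng(0)
# A 64x64 matrix that is almost exactly rank 4, plus faint noise.
low_rank = rng.normal(size=(64, 4)) @ rng.normal(size=(4, 64))
matrix = low_rank + 1e-6 * rng.normal(size=(64, 64))

approx = truncate(matrix, chi=4)  # stores ~2*64*4 numbers, not 64*64
error = np.linalg.norm(matrix - approx) / np.linalg.norm(matrix)
print(error < 1e-4)  # -> True: tiny relative error despite compression
```

The catch, as in the study, is that highly entangled quantum states have no small set of dominant singular values, so the truncation error grows as entanglement spreads through the circuit.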
Over several weeks, IBM Quantum’s Youngseok Kim and Andrew Eddins conducted increasingly complex quantum calculations on the IBM Quantum Eagle processor. Anand then attempted the same calculations using state-of-the-art classical methods on supercomputers such as Cori and Lawrencium at Berkeley Lab, as well as the Anvil supercomputer at Purdue University. When Quantum Eagle was introduced in 2021, it had the highest number of high-quality qubits of any quantum computer, seemingly putting it beyond the reach of classical simulation.
Simulating all 127 entangled qubits exactly on a classical computer would require an astronomical amount of memory: representing the quantum state would take 2^127 separate numbers. Conventional computers typically have the capacity to store around 100 billion numbers, falling short by roughly 27 orders of magnitude. To work around this, Anand, Wu, and Zaletel employed approximation techniques that let them solve the problem within reasonable time and cost constraints. These techniques resemble image compression: they discard less crucial information and retain only what is needed to achieve accurate answers within the memory limitations.
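The arithmetic behind that memory gap is quick to verify (the ~100-billion-number classical capacity is the article's round figure):

```python
import math

amplitudes = 2 ** 127        # numbers needed to represent a 127-qubit state
classical_capacity = 100e9   # ~100 billion numbers, per the article

shortfall = math.log10(amplitudes / classical_capacity)
print(amplitudes)        # 170141183460469231731687303715884105728 (~1.7e38)
print(round(shortfall))  # 27 -- the "27 orders of magnitude" gap
```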
For less complex calculations, Anand confirmed the accuracy of the quantum computer’s results, but as the complexity increased, the quantum and classical outcomes began to disagree. By simplifying certain parameters so that exact classical calculations were possible, Anand could still validate the quantum results. At the greatest circuit depths considered, where no exact solutions were available, the quantum and classical results diverged.
The researchers acknowledge that they cannot definitively prove the correctness of the quantum computer’s final answers for the most challenging calculations. Nonetheless, the consistent successes of the Quantum Eagle across previous runs instilled confidence in the researchers regarding the accuracy of its results.
“The success of the quantum computer wasn’t like a fine-tuned accident. It actually worked for a whole family of circuits it was being applied to,” Zaletel emphasized.
While Zaletel remains cautious about whether the error mitigation technique will extend to larger numbers of qubits or more complex calculations, he finds the results highly encouraging. The findings sparked a sense of friendly competition and a belief that it should be possible to simulate on a classical computer what the quantum computer is accomplishing. Doing so, he acknowledges, will require a cleverer approach, since the quantum device operates in a regime where current classical methods struggle.
One potential avenue is to simulate IBM’s zero noise extrapolation (ZNE) technique using classical tensor network simulations. This approach aims to explore whether applying the error mitigation concept to classical computations can yield improved classical results. Anand highlights that this work opens up the possibility of using a quantum computer as a verification tool for classical computations, which reverses the traditional script of using classical computers to verify quantum computations.