Inching toward usefulness
IBM boosts the amount of computation you can get done on quantum hardware
Incremental improvements across the hardware and software stacks add up.
There’s a general consensus that we won’t be able to consistently perform sophisticated quantum calculations without the development of error-corrected quantum computing, which is unlikely to arrive until the end of the decade. It’s still an open question, however, whether we could perform limited but useful calculations at an earlier point. IBM is one of the companies that’s betting the answer is yes, and on Wednesday, it announced a series of developments aimed at making that possible.
On their own, none of the changes being announced are revolutionary. But collectively, improvements across the hardware and software stacks have produced much more efficient and less error-prone operations. The net result is a system that supports the most complicated calculations yet on IBM’s hardware, leaving the company optimistic that its users will find some calculations where quantum hardware provides an advantage.
Better hardware and software
IBM’s early efforts in the quantum computing space saw it ramp up its qubit counts rapidly, making it one of the first companies to reach 1,000 qubits. However, each of those qubits had an error rate high enough that any algorithm attempting to use all of them in a single calculation would inevitably trigger an error. Since then, the company’s focus has been on improving the performance of smaller processors. Wednesday’s announcement was based on the introduction of the second version of its Heron processor, which has 133 qubits. That’s still beyond what can be simulated on classical computers, provided the processor operates with sufficiently low error rates.
IBM VP Jay Gambetta told Ars that Revision 2 of Heron focused on getting rid of what are called TLS (two-level system) errors. “If you see this sort of defect, which can be a dipole or just some electronic structure that is caught on the surface, that is what we believe is limiting the coherence of our devices,” Gambetta said. This happens because the defects can resonate at a frequency that interacts with a nearby qubit, causing the qubit to drop out of the quantum state needed to participate in calculations (called a loss of coherence).
By making small adjustments to the frequencies at which the qubits operate, it’s possible to avoid these problems. This can be done when the Heron chip is being calibrated, before it’s opened up for general use.
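Conceptually, that calibration step amounts to picking an operating frequency that keeps each qubit as far as possible from the defects it sits near. The sketch below is a hypothetical illustration of the idea; the tuning band, defect frequencies, and the `pick_qubit_frequency` helper are invented for the example and aren’t drawn from IBM’s actual calibration software.

```python
# Hypothetical sketch of the idea behind the calibration step: pick a qubit
# operating frequency that stays as far as possible from measured TLS defect
# frequencies. All names and numbers here are invented for illustration.
import numpy as np

def pick_qubit_frequency(tls_freqs_ghz, band=(4.8, 5.2), step=1e-4):
    """Scan an allowed tuning band and return the frequency (in GHz) with the
    largest detuning from the nearest known defect."""
    candidates = np.arange(band[0], band[1], step)
    # Distance from each candidate frequency to its nearest defect.
    detuning = np.min(
        np.abs(candidates[:, None] - np.asarray(tls_freqs_ghz)[None, :]), axis=1
    )
    return candidates[np.argmax(detuning)]

# Two defects sit inside the band; the picker lands midway between them.
print(pick_qubit_frequency([4.85, 5.15]))  # ~5.0 GHz
```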
Separately, the company has rewritten the software that controls the system during operations. “After learning from the community, seeing how to run larger circuits, [we were able to] almost better define what it should be and rewrite the whole stack towards that,” Gambetta said. The result is a dramatic speed-up. “Something that took 122 hours now is down to a couple of hours,” he told Ars.
Since people are paying for time on this hardware, that speed-up is good for customers now. It could also pay off in the longer run: some errors occur at random over time, so less time spent on a calculation means fewer chances for them to creep in.
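As a rough illustration of why shorter runtimes help, consider a toy model in which decoherence events arrive at a constant rate; the odds of finishing a calculation without one then fall off exponentially with how long the calculation runs. The numbers below are assumptions chosen for the example, not IBM’s figures.

```python
# Toy model (assumed numbers, not IBM's): if random decoherence events arrive at
# a roughly constant rate, the chance of finishing a calculation without one
# falls off exponentially with runtime.
import math

def survival_probability(runtime_us, coherence_time_us):
    """P(no decoherence event) under a simple exponential-decay model."""
    return math.exp(-runtime_us / coherence_time_us)

# With an assumed 200-microsecond coherence time, halving the runtime from
# 100 us to 50 us lifts the survival probability from about 0.61 to 0.78.
print(survival_probability(100, 200), survival_probability(50, 200))
```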
Deeper computations
Despite all those improvements, errors are still likely during any significant calculations. While it continues to work toward developing error-corrected qubits, IBM is focusing on what it calls error mitigation, which it first detailed last year. As we described it then:
“The researchers turned to a method where they intentionally amplified and then measured the processor’s noise at different levels. These measurements are used to estimate a function that produces similar output to the actual measurements. That function can then have its noise set to zero to produce an estimate of what the processor would do without any noise at all.”
The problem here is that evaluating that function is computationally difficult, and the difficulty increases with the qubit count. So, while it’s still easier to do the error-mitigation calculations than to fully simulate the quantum computer’s behavior on classical hardware, there’s still the risk of the mitigation becoming computationally intractable. But IBM has taken the time to optimize this as well. “They’ve got algorithmic improvements, and the method that uses tensor methods [now] uses the GPU,” Gambetta told Ars. “So I think it’s a combination of both.”
That doesn’t mean the computational challenge of error mitigation goes away, but it does allow the method to be used with somewhat larger quantum circuits before things become unworkable.
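To make the extrapolation step concrete, here’s a minimal sketch of the general approach, often called zero-noise extrapolation: measure an observable at several artificially amplified noise levels, fit a simple function to the results, and evaluate that function at zero noise. The measurement values and the polynomial fit below are illustrative assumptions, not IBM’s production method.

```python
# Minimal sketch of the error-mitigation idea described above: measure an
# observable at several amplified noise levels, fit a curve, and read off the
# zero-noise value. The data points and the polynomial fit are illustrative
# assumptions, not IBM's production method.
import numpy as np

# Noise amplification factors (1.0 = the hardware's native noise level) and the
# expectation values hypothetically measured at each of them.
noise_factors = np.array([1.0, 1.5, 2.0, 3.0])
measured_values = np.array([0.72, 0.64, 0.57, 0.45])

# Fit a low-order polynomial to the noisy measurements...
coeffs = np.polyfit(noise_factors, measured_values, deg=2)

# ...and evaluate it at zero noise to estimate the noiseless result.
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"Estimated noise-free expectation value: {zero_noise_estimate:.3f}")
```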
Combining all these techniques, IBM used the setup to model a simple quantum system called an Ising model. And it produced reasonable results after performing 5,000 individual quantum operations, called gates. “I think the official metric is something like if you want to estimate an observable with 10 percent accuracy, we’ve shown that we can get all the techniques working to 5,000 gates now,” Gambetta told Ars.
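For a sense of how a simulation like this reaches thousands of gates, the sketch below counts the operations in a generic Trotterized Ising-model circuit, where each time step applies a two-qubit interaction to every neighboring pair of qubits plus a rotation on each qubit. The qubit count, step count, and layout are assumptions for illustration, not the parameters of IBM’s experiment.

```python
# Rough illustration of how a Trotterized Ising-model simulation racks up gates.
# Qubit count, step count, and coupling layout are assumptions chosen to show
# the arithmetic, not the parameters of IBM's experiment.
def ising_gate_count(n_qubits, trotter_steps):
    """Count gates for one common layout: each Trotter step applies a two-qubit
    ZZ interaction on every neighboring pair in a line of qubits, plus a
    single-qubit X rotation on every qubit."""
    two_qubit_gates = (n_qubits - 1) * trotter_steps
    single_qubit_gates = n_qubits * trotter_steps
    return two_qubit_gates + single_qubit_gates

# On the order of 5,000 gates is reached with, e.g., 100 qubits and 25 steps.
print(ising_gate_count(100, 25))  # 4975
```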
That’s good enough that researchers are starting to use the hardware to simulate the electronic structure of some simple chemicals, such as iron-sulfur compounds. And Gambetta viewed that as an indication that quantum computing is becoming a viable scientific tool.
But he was quick to say that this doesn’t mean we’re at the point where quantum computers can clearly and consistently outperform classical hardware. “The question of advantage—which is when is the method of running it with quantum circuits is the best method, over all possible classical methods—is a very hard scientific question that we need to get algorithmic researchers and domain experts to answer,” Gambetta said. “When quantum’s going to replace classical, you’ve got to beat the best possible classical method with the quantum method, and that [needs] an iteration in science. You try a different quantum method, [then] you advance the classical method. And we’re not there yet. I think that will happen in the next couple of years, but that’s an iterative process.”