Experts predict quantum utility by 2033
Quantum computing is no longer science fiction. Experts at Nvidia GTC 2026 think quantum utility is possible by 2033, but not without challenges.
SAN JOSE, Calif. -- Savvy enterprise leaders have had quantum computing in their sights, if not actively prepared for quantum technology and post-quantum cryptography. Soon, they might have to put their strategies into action.
Experts at Nvidia GTC 2026 agreed that quantum utility could be possible as soon as 2033. The technology could disrupt entire industries and prove catastrophic for organizations unprepared to defend against quantum algorithms capable of breaking traditional encryption. But quantum computing isn't all doom and gloom; it also stands to be one of the century's biggest breakthroughs, enabling businesses to further optimize AI, bolster cybersecurity, and accelerate drug and chemical research.
Experts are excited about this latest development in computing and the progress of scientists and engineers worldwide. But progress doesn't come without challenges, and major hurdles remain before the full potential of quantum utility can be realized.
Achieving quantum utility
The current goal for quantum computing is to reach quantum utility. The Defense Advanced Research Projects Agency (DARPA), an agency within the U.S. Department of Defense, launched its Quantum Benchmarking Initiative (QBI) in 2024. According to the project's documentation, the goal of this initiative is to "validate whether any quantum computing approach can achieve utility-scale operation."
"The aim is to work with companies to determine if it's possible to build industrially useful quantum computers much faster than had previously been predicted," said Andrew Dzurak, CEO and founder of Diraq, which is building quantum computers based on modified silicon transistors. "They define [utility-scale operation] as the point when the computational value exceeds the cost" of developing the computers, he added.
Dzurak, speaking at his Nvidia GTC session, "Scaling toward quantum-GPU supercomputing," said some recent technologies have demonstrated practical applications of quantum computing. These technologies are already generating revenue today, albeit on a small scale, he explained.
"They're a long way from the tipping point where the actual revenue generated clearly outpaces the cost of building those systems," Dzurak said.
Dr. William Oliver, the Henry Ellis Warren (1894) Professor of Electrical Engineering and Computer Science and professor of physics at MIT, also said we aren't quite at the quantum endgame. People often assume that meeting the 2033 benchmark means quantum technology will fulfill its biggest promises, and that might not be the case, Oliver said. In fact, it could take another 20 years before quantum can do something as extraordinary as simulate new drugs for the pharmaceutical industry, he said.
"I think that when people are asked the question 'When are we going to have a quantum computer?' they're thinking of the endgame quantum computer, which can do heavy scientific computing," Oliver said in an interview with TechTarget Editorial. "Whether that's 10 years or 30 years, nobody knows. But it's not next year."
While we won't achieve the quantum endgame in the next few years, Oliver and Dzurak agreed that the utility of quantum technology is still achievable by the 2033 benchmark.
"I'm certainly hopeful," Oliver said. "You could say that we have commercialization already because, for example, companies will pay IBM for education, to use their quantum computers and have a white-glove service that helps them with their algorithms. There is an educational aspect to it, which, I would say, is generating revenue."
So, what's preventing us from achieving the quantum endgame?
Quantum error correction and other challenges
According to several experts at Nvidia GTC, one of the major challenges that quantum development faces is quantum error correction (QEC).
During his session on "Realizing the promise of quantum computation," Oliver described QEC as "fixing the errors that happen due to the delicate nature of quantum mechanics and fixing them faster than they occur. If you can do that, you can kind of keep all the balls up in the air and do a computation for a long time," he said.
Quantum bits, or qubits, are the basic units of information that quantum computers use to complete their tasks. Think of them as the quantum counterpart to the bits that classical computers use. Oliver explained that today's physical qubits have error rates in the range of one in 100 to one in 10,000. To achieve practical applications of quantum computers, however, those error rates need to fall to between one in a billion and one in a trillion, because commercial quantum algorithms will require many steps, Oliver said. This is where QEC comes in.
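The gap Oliver describes can be made concrete with a back-of-the-envelope calculation (the step counts below are illustrative assumptions, not figures from the session): if each operation fails with probability p, a circuit of N steps succeeds with probability of roughly (1 - p)^N, so errors compound fast.

```python
# Rough illustration (illustrative numbers, not from the session):
# with per-operation error rate p, an N-step algorithm succeeds
# with probability about (1 - p) ** N.
def success_probability(p, steps):
    return (1 - p) ** steps

# A physical error rate of 1 in 1,000 is fine for short circuits
# but collapses once an algorithm runs millions of operations.
print(success_probability(1e-3, 1_000))       # short circuit: still plausible
print(success_probability(1e-3, 1_000_000))   # long circuit: effectively zero
print(success_probability(1e-12, 1_000_000))  # target-range error rate: near 1
```

This is why a billion-fold improvement in effective error rates, rather than incremental hardware gains alone, is the prerequisite for commercially useful algorithms.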
"Basically, the point of error correction is that, rather than encoding information in a single physical qubit, we're going to encode one qubit's worth of information in a team of qubits," he said.
This team of qubits is "refereed" by other qubits, which then flag when an error occurs with the team and what it is. Interpreting this data during QEC is known as quantum decoding, and it's a subject that Dr. Joschka Roffe, a senior researcher at the University of Edinburgh School of Informatics, is focused on cracking.
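As a toy sketch of the "team of qubits" idea, the classical three-bit repetition code shows how redundancy plus a decoder suppresses errors. This is a simplified classical analogue for intuition only, not the actual quantum codes or the CUDA-Q decoders discussed in the session:

```python
import random

def encode(bit):
    # One logical bit encoded in a "team" of three physical bits.
    return [bit, bit, bit]

def apply_noise(bits, p):
    # Each physical bit flips independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote plays the role of the decoder: it infers which
    # error (if any) occurred and recovers the logical bit.
    return int(sum(bits) >= 2)

def logical_error_rate(p, trials=100_000):
    # Empirical rate at which decoding fails to recover the logical 0.
    errors = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
    return errors / trials

# With physical error rate p, the logical error rate scales roughly as 3p^2:
# the decoder only fails when two or more bits in the team flip at once.
print(logical_error_rate(0.1))
```

Real QEC decoders face a much harder version of this problem, since measuring qubits directly would destroy their state and the decoding must keep pace with the hardware, which is why the speed of decoding is an active research front.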
"We've been really looking at pushing the limits of how fast we can make decoding," he said during his session on "Vibe decoding quantum error correction with CUDA-Q." "We were able to make our existing implementations 10,000 times faster."
While this is substantial progress, Roffe said in an interview with TechTarget Editorial that they still need to drive down the physical errors of the hardware before quantum commercial utility is fully realized. "There have to be some pretty significant developments on [the error correction] front," he said. "There are also a lot of theoretical challenges to be overcome in terms of the engineering of the wider consumer software stack."
Despite these challenges, Roffe said that it's not inconceivable that the 2033 QBI goal will be met.
Dzurak said another challenge for some companies creating quantum technology is the physical size and energy consumption of their hardware. "If they're going to get up to [quantum] duties, they're going to be very large and incredibly power hungry," he said during his session.
What's next for quantum?
While experts continue to strive toward commercial quantum utility, enterprise leaders need to prepare for the emergence of quantum computing now.
"Once quantum computers are powerful enough to do commercially interesting things, like pharma design, they will also be cryptographically relevant," Dzurak said in an interview with TechTarget Editorial. "They'll be able to crack public key encryption codes. At that point, effectively all data is insecure."
The finance sector could potentially be disrupted by commercial quantum utility, Dzurak said. "Financial trading at the moment is already using AI and very fast turnaround cycles, but they need increasingly accurate data in the models to drive those trading algorithms," he explained. "That's an area where quantum computing can really change the game and provide much more accurate models."
Dzurak also noted that the pharmaceutical industry could shorten drug trials and bring pharmaceuticals to market faster by simulating, all at once, candidate drugs that would traditionally take years to test. "If you can find the right drug first off, you can then do that trial much more quickly and reduce the overall cost to develop pharma," he said.
These developments are underway, and experts agreed that business leaders must take this new technology seriously. However, Oliver said in an interview with TechTarget Editorial that it's still "too early to bet the farm" on quantum's commercial utility.
"It's a very exciting time as a researcher in the field," he said. "Things are accelerating, and when technology starts to accelerate, it's very hard to predict exactly when it's going to happen."
Everett Bishop is the assistant site editor for AI & Emerging Tech and the previous editor for searchCloudComputing. He graduated from the University of New Haven in 2019.