The future of quantum data centers: Resilience and risk
The quantum outlook calls for close ties with classical computing, even as quantum machines promise to outperform standard IT in select use cases. Businesses stand to benefit but must also address post-quantum security.
Data centers have progressed from the glasshouse computing spaces of yesteryear to today's gigantic hyperscaler facilities. Quantum data centers are another evolutionary step that builds upon that history while branching out in new directions.
A quantum data center is a facility built to house quantum computers and meet their power, cooling and workflow management requirements. Some centers are purpose-built for quantum systems. In other cases, quantum processing units (QPUs) are colocated with classical computers in high-performance computing (HPC) centers.
Quantum technology vendors and research institutions are the typical operators of quantum data centers. A few large enterprises also run quantum facilities. Most CIOs and CTOs, however, will likely reach quantum computing through a cloud interface that provides access to a back-end quantum processor.
Whatever the access method, quantum computing promises to take on problems beyond the reach of conventional IT. Quantum processes, particularly superposition and entanglement, provide the computational power and parallelism necessary to address sticky problems in areas such as simulation and optimization. Specifically, quantum bits -- qubits -- serve as the basic unit of quantum information, and their ability to occupy multiple states at once lets quantum computers explore many possible solutions in parallel.
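As a rough illustration of those ideas, the short sketch below simulates a single qubit placed into superposition and a two-qubit entangled (Bell) state using plain linear algebra. It is a simplified, illustrative state-vector calculation, not any vendor's software stack.

```python
import numpy as np

# Simplified state-vector illustration of superposition and entanglement.

ket0 = np.array([1.0, 0.0])                   # |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

# Superposition: a qubit that is a 50/50 blend of |0> and |1>.
superposed = H @ ket0
print("Single-qubit measurement probabilities:", np.round(superposed ** 2, 3))  # [0.5 0.5]

# Entanglement: a Bell state whose two qubits always agree when measured.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
bell = CNOT @ np.kron(superposed, ket0)
print("Bell-state probabilities (00, 01, 10, 11):", np.round(bell ** 2, 3))
```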
Quantum computing's perceived business value varies by industry. Financial services firms see opportunities in areas such as portfolio management. Life sciences companies look for breakthroughs in molecular modeling and drug discovery. Other businesses believe quantum data centers will address logistics and supply chain challenges.
Key capabilities and design features of future quantum data centers
Much of this perceived value will derive from the capabilities and features quantum data centers promise to provide. Those centers require some exotic fixtures, such as specialized refrigeration units. However, in many cases, quantum machines take advantage of classical computing components. Here are a few characteristics of still-evolving quantum data centers.
Hybrid computing
Hybridization is a core feature of many quantum data centers that's likely to persist over time. Industry executives widely expect classical computers and quantum hardware to coexist and integrate. That's because quantum computing is geared toward specialized tasks, leaving ample space for classical machines to handle workloads unsuited for QPUs.
"We see quantum computing as a hybrid technology that works hand in hand with currently existing compute technologies, such as CPUs and GPUs," said Jan Goetz, CEO and co-founder at IQM Quantum Computers, which builds superconducting full-stack quantum computers.
While IQM operates quantum gear in its own data centers, offering compute time as cloud services, the company gives significant weight to its hybrid deployments. "A big part of our business model," Goetz explained, "is actually selling quantum computers into high-performance computing centers, where they are physically located right next to the racks full of CPUs and GPUs."
IBM also operates its own quantum data centers while collaborating with partners on quantum-classical facilities. As for the latter, the company's IBM Quantum System One machine is colocated on the Rensselaer Polytechnic Institute (RPI) campus with the university's Artificial Intelligence Multiprocessing Optimized System, a classical supercomputer commonly referred to as AiMOS. The two systems connect so workloads can move from one machine to the other.
Similarly, Riken, a national scientific research institute in Japan, plans to integrate an IBM quantum computer with its Fugaku supercomputer.
"There are scopes of problems that classical computers are better at and scopes of problems that quantum computers are better at," said Oliver Dial, CTO at IBM Quantum, pointing to the simple premise behind such linkups. "If you can find a problem you can actually segregate into those scopes, you end up with something that is really a lot better than either of them [alone] would be today."
Hybridized algorithms
Several quantum computing algorithms are designed to break up problems and assign processing tasks to the appropriate platform, whether quantum or classical. Pairing quantum machines and supercomputers, Dial said, "enables the exploration of joint quantum-classical algorithms."
He cited the sample-based quantum diagonalization (SQD) algorithm as one example. Using SQD, quantum computers identify the electronic states of molecules, defining the scope of the chemistry simulation that classical machines then carry out. This approach lets researchers tinker with the balance of quantum vs. classical computing power for tackling specific problems. In this context, the conventional machines build a model from the quantum computer's noisy results, Dial noted. Devoting more runtime to the quantum computer, he added, can yield enough data to keep only the answers that are most confidently correct, easing the burden on the conventional machine.
"You can kind of trade off the accuracy of your quantum computer for the amount of classical computing you do afterwards," Dial explained. "It's a really great problem to play with this idea of colocating classical and quantum computation and seeing how to get these to work together the best."
Goetz, meanwhile, cited the variational quantum eigensolver (VQE) as another prominent hybrid algorithm used in chemistry. VQE uses the quantum processor to evaluate the quantum part of the problem -- a molecule's electronic structure -- while a classical computer runs an optimization algorithm to refine the result.
Another hybrid example is the quantum approximate optimization algorithm -- or QAOA -- which focuses on optimization problems, such as financial portfolio management. Goetz said such algorithms are typical of the current era of quantum computing, which industry executives have termed noisy intermediate-scale quantum, or NISQ. "In the NISQ era," he said, "there are quite a few [algorithms] that require a quantum computer and a normal computer."
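The generic loop shared by VQE and QAOA can be sketched in a few lines: a quantum processor evaluates a cost function for a given set of circuit parameters, and a classical optimizer iteratively adjusts those parameters. In the illustrative sketch below, the quantum step is mocked with an ordinary function; the cost function and optimizer are placeholders, not any vendor's actual stack.

```python
import numpy as np
from scipy.optimize import minimize

# Generic hybrid variational loop (the pattern behind VQE and QAOA).
# The quantum step is mocked with a classical function for illustration.

def quantum_cost(params: np.ndarray) -> float:
    """Stand-in for running a parameterized circuit on a QPU and
    measuring an expectation value (e.g., molecular energy in VQE
    or a portfolio cost in QAOA)."""
    return float(np.sum(np.sin(params) ** 2) + 0.1 * np.sum(params ** 2))

# Classical optimizer refines the circuit parameters between quantum runs.
initial_params = np.random.default_rng(0).uniform(-np.pi, np.pi, size=4)
result = minimize(quantum_cost, initial_params, method="COBYLA")

print(f"Optimized parameters: {result.x}")
print(f"Estimated minimum cost: {result.fun:.4f}")
```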
Power, cooling and space accommodations
Power, cooling and space rank highly among the design considerations for quantum data centers. How organizations meet those requirements differs depending on the type of quantum computing hardware and whether it's an expanded HPC center or a pure-play quantum facility. IBM's quantum data center in Poughkeepsie, N.Y., falls into the latter category.
"The Poughkeepsie site is basically pure quantum," Dial said. That facility includes several cryogenic refrigeration units and quantum machines equipped with IBM's Heron and Eagle quantum processors. IBM's quantum computers are built on superconducting qubits, which must be cooled to near absolute zero to minimize error-causing environmental interference. Superconducting qubits technology is one of a handful of quantum computing types.
"The refrigeration does bring in some facilities requirements that wouldn't normally be there," Dial said, noting that refrigeration units need occasional routine maintenance. The teams managing quantum systems, for example, change a couple of parts that pump Helium-3, a helium isotope used in dilution refrigerators. "In some sense, that part of it looks a little bit more like a semiconductor fab than a data center," Dial added.
Quantum data centers also require cooling water to remove heat. Goetz said HPC centers already use water-based cooling, so that demand doesn't call for new infrastructure. As for power, IQM machines need tens of kilowatts, which Goetz said is considerably less than the power draw of large supercomputers. Quantum vendors also aim to minimize space requirements in hybrid data centers. IQM's quantum machines, Goetz said, come in the 19-inch rack form factor common in data centers.
Challenges for quantum data centers
Error rates, platform integration and data storage are among the challenges facing quantum data centers. The industry's ability to address those issues will play a critical role in making quantum computing useful for enterprise customers. Here's what's on the agenda.
Dealing with quantum error rates
The "noisy" aspect of NISQ refers to the tendency of quantum calculations to generate errors as fragile quantum states break down. Quantum machines, however, are making strides in error suppression and mitigation. The industry's longer-term objective is error correction and fault tolerance. That more advanced stage, however, will also call for quantum-classical cooperation.
Detecting and correcting errors in quantum computations, especially over a large number of qubits, Goetz said, will require "a lot of compute power, independent of which algorithm you run." Classical computers will handle that demand. "Whenever you have error correction in the game," Goetz added, "you will need conventional compute power in the back end."
IBM is developing an error-correcting decoder that draws upon classical computing. The decoding technique, dubbed Relay-BP, can be implemented on a field-programmable gate array or an application-specific integrated circuit, which IBM described as "classical components that are ubiquitous today." The upshot: Huge amounts of HPC power won't be necessary for fault-tolerant quantum computing, according to the company.
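To see why classical hardware sits in the error-correction loop, consider the deliberately simple sketch below. It is not Relay-BP; it is a majority-vote decoder for a basic bit-flip repetition code, but it shows the kind of classical post-processing -- reading redundant measurements and inferring the most likely value -- that real decoders run continuously alongside the quantum hardware.

```python
from collections import Counter

# Toy decoder for a bit-flip repetition code (not IBM's Relay-BP).
# Real decoders process error syndromes from many qubits in real time,
# which is why FPGAs, ASICs or other classical compute stay in the loop.

def decode_repetition(measured_bits: list[int]) -> int:
    """Majority vote over redundant physical measurements to recover
    the most likely logical bit despite individual bit-flip errors."""
    counts = Counter(measured_bits)
    return counts.most_common(1)[0][0]

# Example: one physical bit flipped, but the logical value is still recovered.
noisy_readout = [1, 1, 0, 1, 1]
print(f"Decoded logical bit: {decode_repetition(noisy_readout)}")
```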
Integrating computing platforms
Getting the quantum and classical building blocks to work together requires integration. Currently, integration of the respective technology stacks focuses on software, Goetz said, noting, "Right now, it is mainly through APIs."
Scheduling software also plays a role in harmonizing hybrid data centers. Schedulers such as Slurm -- originally the Simple Linux Utility for Resource Management -- manage workloads moving between quantum and conventional platforms. Slurm is the standard in HPC centers, Goetz said.
Dial cited using a classical supercomputer's scheduling and workflow management tools as a key takeaway from IBM's work at sites such as RPI and Riken. "It's important to be able to tie into the existing workflow management as opposed to trying to come in with your own special quantum version off to the side," he said. "One of the things we are doing is pivoting in how we handle our workflow orchestration, tying into industry-standard tools like Slurm."
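What "tying into Slurm" can look like is sketched below: a hypothetical batch script in which Slurm's #SBATCH directives request classical resources as usual, and the script then hands part of the work to a quantum backend. The job name, partition and submit_to_qpu helper are invented for illustration, and the sketch assumes the site's Slurm setup accepts Python batch scripts via a shebang; real hybrid plugins and queue names vary by facility.

```python
#!/usr/bin/env python3
#SBATCH --job-name=hybrid-quantum-demo   # hypothetical job name
#SBATCH --partition=qpu                  # hypothetical partition with QPU access
#SBATCH --ntasks=1
#SBATCH --time=00:10:00

# Hypothetical hybrid workload submitted through Slurm: classical
# pre- and post-processing run on the HPC node, while a made-up helper
# sends the quantum portion of the job to the colocated QPU.

def classical_preprocess() -> dict:
    # Prepare the problem instance on the classical side.
    return {"num_qubits": 5, "shots": 1000}

def submit_to_qpu(job_spec: dict) -> dict:
    # Placeholder for a vendor API call; invented for illustration.
    print(f"Submitting to QPU: {job_spec}")
    return {"counts": {"00000": 512, "11111": 488}}

def classical_postprocess(result: dict) -> None:
    # Back on the classical side: analyze measurement counts.
    total = sum(result["counts"].values())
    print(f"Processed {total} shots from the quantum backend.")

if __name__ == "__main__":
    classical_postprocess(submit_to_qpu(classical_preprocess()))
```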
Pursuing higher-level integration
Goetz predicted much tighter quantum-classical integration going forward, first on the software side.
As a step in this direction, IQM in July adopted Qrisp, an open source software development kit, as the default SDK for its Resonance quantum computing platform. This move contributes to software-level integration since Qrisp eases the task of moving data between environments. Qrisp lets IQM build an open and modular platform that developers can use to work on integration, Goetz added.
Hardware integration is also in the offing. Goetz pointed to the possibility of a direct, low-latency connection between the graphics cards in a GPU cluster and a quantum processor. He said R&D projects are already pursuing that objective.
Providing data storage and management resources
Quantum computers lack their own storage subsystems. A quantum machine functions as a real-time processing unit, with data flowing through it and then to a conventional computer, Goetz explained. But at that point, the HPC center's data-handling resources can take over, providing primary storage, backup and data management. "On the data management and storage side," Goetz said, "everything just needs to be connected to the quantum computer."
How quantum data centers could disrupt backup and recovery
Quantum data centers remain fairly few in number and are largely the province of technology vendors, universities, research institutions and government agencies. In the near term, most businesses will encounter quantum data centers primarily through on-demand services for specialized use cases. While such enterprises stand to benefit from an emerging source of computing power, they also face a cybersecurity threat that could target their backup data and impede recovery.
Threat actors, especially those with nation-state backing, could use quantum computing to crack classical encryption algorithms that have been protecting data for decades. Quantum computers capable of doing so have yet to materialize, but their arrival seems likely within the next few years. In the meantime, businesses are subject to so-called harvest-now-decrypt-later attacks in which data is stolen and stored until Q-Day -- the time when quantum computers become capable of defeating encryption.
"Stored backups encrypted with degraded legacy algorithms would become immediate targets for attackers employing a harvest-now-decrypt-later strategy," said John Young, COO of Quantum eMotion America, the U.S. subsidiary of Montreal-based Quantum eMotion Corp. The company specializes in quantum-secure hardware and software.
Young noted that systems incorporating multifactor authentication and identity and access management processes could also be compromised if they depend on cryptography vulnerable to quantum threats. An attacker impersonating a backup administrator could disrupt recovery operations, access sensitive data or manipulate records, he added.
How to build a post-quantum backup strategy today
Businesses can take several steps to prepare for post-quantum security from a data backup and recovery perspective.
1. Focus more attention on data backup strategies
Young said many enterprises aren't paying enough attention to conventional backups, much less a post-quantum backup plan. So, the first step should be to create a strong backup and recovery strategy, he noted. Businesses should also ensure backup policies are well defined and employees responsible for backups know those policies, Young said. Recovery of backed-up data must be regularly tested, he added.
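Recovery testing can start as simply as verifying that restored data matches what was backed up. The sketch below is a minimal, hypothetical illustration of that check using file hashes; the paths are placeholders, and real backup products handle verification at far larger scale.

```python
import hashlib
from pathlib import Path

# Minimal illustration of a restore test: confirm a restored copy is
# bit-for-bit identical to the original by comparing SHA-256 digests.
# File paths are hypothetical placeholders.

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def restore_matches(original: Path, restored: Path) -> bool:
    return sha256_of(original) == sha256_of(restored)

if __name__ == "__main__":
    ok = restore_matches(Path("data/customer_records.db"),
                         Path("restore_test/customer_records.db"))
    print("Restore verified" if ok else "Restore FAILED -- investigate")
```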
2. Evaluate the IT environment and assess risk
This phase involves discovering data, taking inventory and ensuring it has the proper security classification. Confidential data, such as intellectual property, customer records and employee information, should be assigned a security classification that "acknowledges its importance, relative to an old sales training demo which is already available on the internet," Young said.
Taking inventory also means documenting how data is currently encrypted and assessing whether those methods are post-quantum ready -- or vulnerable to attack. The resulting inventory of cryptographic systems is called a cryptographic bill of materials, or CBOM. Overall, the assessment workflow can demand time, technical knowledge and transparency, Young said.
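What a single inventory entry might capture can be sketched simply. The record below is a hypothetical, simplified CBOM-style entry; real CBOM formats are considerably richer, and the systems, fields and values here are illustrative only.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical, simplified record for a cryptographic bill of materials
# (CBOM). Fields and values are illustrative only.

@dataclass
class CryptoAsset:
    system: str               # where the algorithm is used
    algorithm: str            # cryptographic algorithm in use
    key_size_bits: int
    data_classification: str
    quantum_vulnerable: bool  # e.g., RSA and ECC are vulnerable to Shor's algorithm

inventory = [
    CryptoAsset("backup-archive-01", "RSA", 2048, "confidential", True),
    CryptoAsset("backup-archive-02", "AES", 256, "internal", False),
]

# Flag the quantum-vulnerable entries for remediation planning.
at_risk = [asdict(asset) for asset in inventory if asset.quantum_vulnerable]
print(json.dumps(at_risk, indent=2))
```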
3. Prioritize the top vulnerabilities for remediation
No one can say with certainty when Q-Day will arrive. The current estimates hover around the 2029-2030 time frame, although a technical breakthrough could accelerate the onset. Cybersecurity experts expect the migration to post-quantum cryptography will span several years, citing the SHA-1 to SHA-2 algorithm transition that took seven or more years to complete.
Businesses that have yet to embark on a post-quantum backup strategy will be hard-pressed to finish their work before the quantum threat strikes. That makes prioritizing the most sensitive data and top vulnerabilities a critical step.
A Mosca score, based on the theorem developed by cryptography expert Michele Mosca, can help with prioritization. The score weighs how long data must remain secure and how long it will take to remediate vulnerable algorithms against the likely arrival date of sufficiently powerful quantum computers. Young said organizations should also consider air gapping their most sensitive data. Keeping that data offline, he added, protects against ransomware and harvest-now-decrypt-later tactics.
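In code terms, Mosca's inequality is simple arithmetic: if the years data must stay confidential plus the years a migration will take exceed the years until cryptographically relevant quantum computers arrive, the organization is already at risk. The numbers in the sketch below are illustrative, not predictions.

```python
# Mosca's inequality: risk exists when x + y > z, where
#   x = years the data must remain confidential (shelf life)
#   y = years the post-quantum migration will take
#   z = years until quantum computers can break today's encryption
# The example values are illustrative only.

def mosca_at_risk(shelf_life_years: float,
                  migration_years: float,
                  years_to_quantum_threat: float) -> bool:
    return shelf_life_years + migration_years > years_to_quantum_threat

# Example: records kept 10 years, a 5-year migration, Q-Day assumed in ~6 years.
print(mosca_at_risk(shelf_life_years=10,
                    migration_years=5,
                    years_to_quantum_threat=6))  # True -> prioritize now
```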
4. Identify and assess backup vendors
Organizations should identify the vendors in their backup and recovery ecosystem, including hardware, software and off-site storage, Young advised. He said the major considerations include whether backups are handled in-house or automated and managed remotely by a managed service provider (MSP). "Peeling back the onion" of backup providers' post-quantum readiness can prove difficult, he added.
5. Update business continuity and disaster recovery plans
BCDR plans should be revisited to reflect post-quantum business threats. "This is another area where many companies are flying by the seat of their pants," Young said, noting such entities might only review their BCDR plans every few years. "It can't be just a checkbox on a manager's task list," he said. "This takes some serious effort not only to pull off the planning correctly, but to follow up with testing, maintenance and updating when needed."
6. Consider the long haul
IT managers should view quantum security as an immediate and enduring challenge. "Much of our sensitive data must be kept for years, while some is held onto for decades," Young said. "This is a tremendous organizational responsibility that forces architects to plan for long-term post-quantum threats."
John Moore is a writer for Informa TechTarget covering the CIO role, economic trends and the IT services industry.