Quantum computing
What is quantum computing?
Quantum computing is an area of computer science focused on the development of technologies based on the principles of quantum theory. Quantum computing uses the unique behaviors of quantum physics to solve problems that are too complex for classical computing.
Development of quantum computers marks a leap forward in computing capability, with the potential for massive performance gains in specific use cases. For example, quantum computing is expected to excel at tasks such as integer factorization and simulations and shows potential for use in industries such as pharmaceuticals, healthcare, manufacturing, cybersecurity and finance.
According to industry trade publication The Quantum Insider, more than 600 companies and more than 30 national labs and government agencies worldwide are developing quantum computing technology. They include tech giants such as Amazon, Google, Hewlett Packard Enterprise, Hitachi, IBM, Intel and Microsoft, as well as research institutions such as the Massachusetts Institute of Technology, Oxford University and Los Alamos National Laboratory. Countries including the U.K., Australia, Canada, China, Germany, Israel, Japan and Russia have made significant investments in quantum computing technologies. The U.K. recently launched a government-funded quantum computing program, and in 2020, the Indian government introduced its National Mission on Quantum Technologies & Applications.
The global quantum computing market was valued at $395 million in 2021, according to the "Quantum Computing Market" report from Markets N Research. The report predicts that the market will grow to approximately $532 million by 2028.
Quantum computing is a rapidly emerging technology with the potential to be disruptive once it reaches maturity. Quantum computing companies are launching all over the world, but experts estimate that it could take years before the technology delivers practical benefits.
The first commercially available quantum computer was released in 2011 by D-Wave Systems. In 2019, IBM released the Quantum System One, and in November 2022, it unveiled the largest quantum computer yet, Osprey.
Although the idea of using a quantum computer can be exciting, it's unlikely that most organizations will build or buy one. Instead, they might opt to use cloud-based services that enable remote access. For example, Amazon Braket, Microsoft Azure Quantum and Rigetti Quantum Cloud Services all provide quantum computing as a service.
Commercial quantum computers range from $5,000 to $15 million, depending on the processing power. For example, a quantum computer with 50 qubits can cost up to $10 million.
How does quantum computing work?
Quantum theory explains the nature and behavior of energy and matter at the quantum -- that is, atomic and subatomic -- level. Quantum computing takes advantage of this behavior: Where classical computing uses binary bits that are either 1 or 0, quantum computing uses bits that can be 1, 0 or both simultaneously. Much of a quantum computer's processing power comes from this ability of its bits to occupy multiple states at the same time.
Quantum computers are composed of an area that houses the qubits, a mechanism for transferring signals to those qubits, and a classical computer that runs a program and sends instructions.
A qubit, or quantum bit, is the quantum counterpart of the bit in classical computing. Just as a bit is the basic unit of information in a classical computer, a qubit is the basic unit of information in a quantum computer. Quantum computers encode qubits in particles such as electrons or photons, using a property such as spin or polarization to represent a 0, a 1 or both at once. The two most relevant principles of quantum physics here are superposition and entanglement.
Superposition refers to placing a qubit into a combination of all its possible configurations at once, while entanglement links two qubits so that measuring one instantly determines the state of the other.
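These ideas can be made concrete with a small classical simulation. The sketch below (illustrative only -- real quantum hardware does not work this way internally) represents a single qubit as a two-component vector of amplitudes, the standard mathematical model:

```python
import numpy as np

# A qubit's state is a 2-component complex vector (alpha, beta):
# the state alpha*|0> + beta*|1>, with |alpha|^2 + |beta|^2 = 1.
zero = np.array([1, 0], dtype=complex)   # definite 0, like a classical bit
one = np.array([0, 1], dtype=complex)    # definite 1, like a classical bit

# An equal superposition of 0 and 1:
plus = (zero + one) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(plus) ** 2
print(probs)  # [0.5 0.5] -- 50% chance of reading 0, 50% chance of 1
```

A classical bit corresponds to the `zero` or `one` vectors; the `plus` state has no classical analog, which is where a qubit's extra expressive power comes from.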
Quantum computers tend to be resource-intensive, requiring significant energy and cooling to run properly. Quantum computing hardware is mostly composed of cooling systems that keep a superconducting processor at a specific super-cooled temperature. A dilution refrigerator, for example, can keep the processor in the millikelvin (mK) range. IBM has used dilution refrigeration to keep its quantum systems at about 25 mK, which is comparable to -459 degrees Fahrenheit. At this super-low temperature, electrons can pair up and flow through superconductors without resistance.
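To check that figure, the standard Kelvin-to-Fahrenheit conversion can be applied to 25 mK:

```python
# Convert a ~25 millikelvin operating temperature to Fahrenheit.
# F = K * 9/5 - 459.67 (since 0 K = -459.67 degrees F)
kelvin = 0.025
fahrenheit = kelvin * 9 / 5 - 459.67
print(f"{fahrenheit:.2f} degrees F")  # about -459.63 degrees F
```

Millikelvin temperatures are so close to absolute zero that the result is essentially -459.67 degrees Fahrenheit regardless of the exact mK value.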
Features of quantum computing
Quantum computers are designed to perform complex calculations with huge amounts of data using the following features:
Superposition. Superposition refers to qubits that are in all configurations at once. Think of a qubit as an electron in a magnetic field. The electron's spin might be either in alignment with the field, known as a spin-up state, or opposite to the field, known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser. If only half a unit of laser energy is used, and the particle is isolated from all external influences, it enters a superposition of states. The particle behaves as if it were in both states simultaneously.
Because each qubit can be in a superposition of 0 and 1, a quantum computer with n qubits can represent 2^n states simultaneously. A quantum computer composed of 500 qubits could, in principle, operate on 2^500 states in a single step.
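This exponential growth is also why classical computers struggle to simulate quantum ones. A rough sketch, extending the state-vector model to n qubits:

```python
import numpy as np

def uniform_superposition(n):
    """State vector for n qubits in equal superposition:
    2**n amplitudes, one per classical bit string."""
    dim = 2 ** n
    return np.full(dim, 1 / np.sqrt(dim))

state = uniform_superposition(10)
print(len(state))  # 1024 amplitudes for just 10 qubits
```

The memory needed to store the state doubles with every added qubit, so a classical simulation of even 50 qubits would need petabytes of RAM -- the quantum computer holds the same state in 50 physical qubits.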
Entanglement. Entangled particles are pairs of qubits that exist in a shared state, where measuring one qubit instantly determines the state of the other. Knowing the spin state of one entangled particle -- up or down -- reveals that the spin of its partner points in the opposite direction. Moreover, because of superposition, neither particle has a definite spin direction before being measured. The spin state of the particle being measured is determined at the time of measurement, and its entangled partner simultaneously assumes the opposite spin direction.
Quantum entanglement persists over large distances: No matter how far apart the correlated particles are, their measurement outcomes remain correlated as long as the particles stay isolated.
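The anticorrelation described above can be demonstrated in the same state-vector model. The sketch below simulates repeated measurements of a two-qubit entangled (Bell) state; the outcomes 00 and 11 never occur:

```python
import numpy as np

rng = np.random.default_rng(0)

# A two-qubit Bell state, (|01> - |10>)/sqrt(2): four amplitudes,
# one each for the outcomes 00, 01, 10, 11.
bell = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2  # [0, 0.5, 0.5, 0]

# Simulate measuring both qubits many times.
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
print(sorted(set(outcomes)))  # ['01', '10'] -- the qubits always disagree
```

Each individual result is random, but the two qubits' results are perfectly anticorrelated, which is the "opposite spin" behavior the text describes.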
Together, quantum superposition and entanglement create enormously enhanced computing power: Each added qubit expands a quantum computer's capacity exponentially.
What is quantum theory?
Development of quantum theory began in 1900 with a presentation by German physicist Max Planck to the German Physical Society. Planck introduced the idea that energy and matter exist in individual units. Further developments by a number of scientists over the following 30 years led to the modern understanding of quantum theory.
The elements of quantum theory include the following:
- Energy, like matter, consists of discrete units -- as opposed to a continuous wave.
- Elementary particles of energy and matter, depending on the conditions, might behave like particles or waves.
- The movement of elementary particles is inherently random and, thus, unpredictable.
- The simultaneous measurement of two complementary values -- such as the position and momentum of a particle -- is inherently imprecise. The more precisely one value is measured, the less precisely the other can be known.
Uses and benefits of quantum computing
Quantum computing has the potential to offer the following benefits:
- Speed. Quantum computers can be dramatically faster than classical computers for certain problems. For example, quantum computing has the potential to speed up financial portfolio management models, such as the Monte Carlo model for gauging the probability of outcomes and their associated risks.
- Ability to solve complex problems. Quantum computers are designed to perform multiple complex calculations simultaneously. This can be particularly useful for integer factorization, which could help develop decryption technologies.
- Simulations. Quantum computers can run complex simulations and can model more intricate systems than classical computers can. For example, this could be helpful for molecular simulations, which are important in prescription drug development.
- Optimization. With quantum computing's ability to process huge amounts of complex data, it has the potential to transform artificial intelligence and machine learning.
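To make the Monte Carlo example concrete, the sketch below shows the classical version of such a risk calculation, with purely hypothetical portfolio numbers. It is this kind of repeated-sampling workload that quantum algorithms (notably amplitude estimation) are expected to accelerate:

```python
import numpy as np

rng = np.random.default_rng(42)

# Classical Monte Carlo: estimate the probability that a portfolio's
# one-year return falls below -10%, assuming (hypothetically) returns
# are normally distributed with a 6% mean and 15% volatility.
returns = rng.normal(loc=0.06, scale=0.15, size=100_000)
p_loss = np.mean(returns < -0.10)
print(f"Estimated P(return < -10%): {p_loss:.3f}")
```

A classical Monte Carlo estimate's error shrinks with the square root of the sample count; quantum amplitude estimation promises a quadratic speedup, reaching the same accuracy with far fewer samples.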
Limitations of quantum computing
Although the benefits of quantum computing are promising, there are still huge obstacles to overcome:
- Interference. The slightest disturbance in a quantum system can cause a quantum computation to collapse -- a process known as decoherence. A quantum computer must be totally isolated from all external interference during the computation phase. Some success has been achieved with the use of qubits in intense magnetic fields.
- Error correction. Qubits aren't digital bits of data and can't use conventional error correction. Error correction is critical in quantum computing, where even a single error in a calculation can cause the validity of the entire computation to collapse. There has been considerable progress in this area, however, with an error correction algorithm developed that uses 9 qubits -- 1 computational and 8 correctional. A system from IBM can make do with a total of 5 qubits -- 1 computational and 4 correctional.
- Output observance. Retrieving output data after a quantum calculation is complete risks corrupting the data. Developments such as database search algorithms that rely on the special wave shape of the probability curve in quantum computers can avoid this issue. This ensures that once all calculations are performed, the act of measurement sees the quantum state decohere into the correct answer.
There are other problems to overcome as well, such as how to handle security and quantum cryptography. Long-term quantum information storage has also been a problem in the past. However, recent breakthroughs have made some forms of quantum computing practical.
A comparison of classical and quantum computing
Classical computing relies on principles expressed by Boolean algebra, usually operating on a logic gate principle. Data must be processed in an exclusive binary state at any point in time -- either 0 for off or 1 for on. These values are bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. There's also still a limit as to how quickly these devices can be made to switch states.
By comparison, quantum computers operate with quantum logic gates, which can, for example, place a 0 into a superposition of 0 and 1. In a quantum computer, particles such as electrons or photons can be used, with a property such as spin or polarization representing 0, 1 or both at once. Each of these units is referred to as a quantum bit, or qubit. The nature and behavior of these particles form the basis of quantum computing and quantum supremacy.
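A simple example of such a gate, in the same state-vector model used above, is the Hadamard gate, which maps a definite 0 to an equal superposition -- something no classical logic gate can do:

```python
import numpy as np

# The Hadamard gate: a 2x2 matrix that maps |0> to an equal
# superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1, 0])

print(H @ zero)  # [0.7071 0.7071] -- equal amplitudes for 0 and 1
```

Applying the gate a second time returns the qubit to its original state, illustrating that quantum gates, unlike classical XOR on measured bits, are reversible.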
Like any emerging technology, quantum computing offers opportunities and risks. Learn how quantum computing compares to classical computing.