Quantum computing continues to garner interest as the next big computing innovation after artificial intelligence, despite prohibitive costs and limited accessibility.
Researchers, academics and companies are developing resources -- and processing units -- that use quantum physics to perform calculations much faster than traditional computing hardware. Quantum computers support data units that can exist in more than one state at once, something binary systems cannot do.
Potential uses for quantum computing in the enterprise include statistical analysis, optimization, cipher breaking and factoring large numbers. As of 2019, IBM and D-Wave Systems are the top vendors actively pursuing quantum computer hardware and chip development.
Getting started with quantum concepts
These quantum computing terms can help IT administrators build a foundational understanding.
Quantum theory
This field is the study of the behavior of matter and energy at the atomic and subatomic levels. There are two major interpretations of quantum theory: the Copenhagen interpretation and the many-worlds theory.
In the Copenhagen interpretation, Niels Bohr asserted that a particle can be measured as a wave or a particle, but scientists can't assume the particle has specific properties until they observe it.
Another way to explain the Copenhagen interpretation is the Schrödinger's cat thought experiment, which illustrates the gap between quantum theory and what people observe at the everyday level. A living cat is placed into a sealed box with radioactive material and a vial of hydrocyanic acid. If any of the radioactive material decays, a mechanism breaks open the vial of acid, which kills the cat. But researchers can't know whether the cat is dead or alive, because they can't see whether the vial breaks while the box is closed.
The many-worlds, or multiverse, theory's answer to Schrödinger's cat is that, as soon as an object exists in a certain state, a series of parallel universes is created containing every possible state of that object. The idea is that the cat is both dead and alive -- dead in one universe and alive in another.
Quantum theory provides the foundation for quantum computing. IT professionals will see more development of quantum systems over the next 10 years, as well as enterprise use cases in the finance, healthcare, defense and technology sectors.
Qubit
This is quantum computing's version of a bit; it serves as the basic unit of quantum information. Within a quantum system, a qubit's state can represent 0, 1 or a superposition of both at once.
Although an ordinary computer's two-bit register can only store one of four binary configurations -- 00, 01, 10 or 11 -- at a time, a two-qubit register can hold a superposition of all four values, because each qubit can represent both values at once.
Since quantum computers are not quite mainstream yet, the qubit's benefits are largely speculative, but the idea is that it enables end users to calculate more possible outcomes in a shorter amount of time.
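The two-qubit register described above can be sketched numerically. Here is a minimal sketch in plain Python; the amplitude values and the `register` layout are illustrative assumptions, not a real quantum API:

```python
import itertools

# A single qubit is described by two complex amplitudes (a, b), one per basis
# state, with |a|^2 + |b|^2 = 1. Equal superposition weights 0 and 1 equally.
s = 1 / 2 ** 0.5
qubit = (s, s)

# A two-qubit register is the tensor product of its qubits: it carries one
# amplitude for each of the four basis states 00, 01, 10 and 11 at once.
register = {
    f"{i}{j}": qubit[i] * qubit[j]
    for i, j in itertools.product((0, 1), repeat=2)
}

# Measuring the register yields each basis state with probability |amplitude|^2.
probabilities = {state: abs(amp) ** 2 for state, amp in register.items()}
```

In this sketch each of the four configurations ends up with probability 0.25, which is the sense in which the register "stores" all four values at once: they coexist as weights until a measurement picks one.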
Quantum interference
This quantum computing term is one of the most vexing principles of quantum theory. It specifies that elementary particles can be in multiple places at once, and that any single particle can interfere with its own trajectory.
There are two concepts to help understand quantum interference: superposition and the double-slit experiment. Superposition is a quantum system's ability to be in multiple states at the same time before researchers take any measurements.
Thomas Young conducted the double-slit experiment in 1801. Young aimed a beam of light at a barrier with two vertical slits, with a screen behind it. When one slit was covered, he saw a single band of light. But when both slits were open, he didn't see just two bands of light; he saw multiple bands of varying brightness instead.
This experiment's results demonstrate interference and establish the idea that photons of light, when projected, explore every possible trajectory before hitting a target, instead of passing through just one slit.
The other component of quantum interference is wave-particle duality. Classical mechanics treats waves and particles as distinct: waves continually move and are spatially extended, while particles are discrete and localized. In quantum mechanics, this duality means that waves can behave as particles and vice versa.
Quantum interference, combined with wave-particle duality, makes it difficult for users to predict experiment or processing job outcomes. These concepts help researchers understand the inner workings of quantum systems and could give insight into how to develop processing technology for quantum computers. Research on quantum interference has discovered use cases for quantum cryptography, quantum computing and the superconducting quantum interference device.
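The amplitude cancellation behind quantum interference can be shown with a one-qubit sketch. Applying a Hadamard-style transform (a standard single-qubit gate, used here as an assumed illustration rather than anything named in the article) twice returns the qubit to its starting state, because the two paths leading to state 1 cancel each other out:

```python
def hadamard(qubit):
    """Map amplitudes (a, b) to ((a + b)/sqrt(2), (a - b)/sqrt(2))."""
    a, b = qubit
    s = 1 / 2 ** 0.5
    return (s * (a + b), s * (a - b))

start = (1.0, 0.0)       # qubit definitely in state 0
mixed = hadamard(start)  # equal superposition: both paths are open
final = hadamard(mixed)  # the two paths to state 1 arrive with opposite
                         # signs and cancel -- destructive interference
```

The intermediate state `mixed` gives 0 and 1 equal weight, yet `final` is back to pure state 0: the 1-amplitudes from the two paths are equal and opposite, so they cancel rather than add, which is exactly the kind of behavior the double-slit fringes display.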
Quantum use cases beyond a computer
Quantum cryptography
This is a method that cryptologists can use to build a cryptosystem and encode messages.
The main difference from traditional cryptography is that quantum cryptography uses physics -- as opposed to mathematics -- as part of its security model. It uses the properties of individual photons to build a cryptosystem that is theoretically impenetrable.
Potential use cases for this cryptosystem are communications between the White House and the Pentagon, as well as other key military sites and defense contractors.
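As a concrete example of using photon properties for secure key exchange, here is a classical simulation of BB84, a well-known quantum key distribution protocol (BB84 is not named in the article; this sketch only mimics its bookkeeping, not real photons):

```python
import random

random.seed(7)  # reproducible demo run

n = 32
# Alice encodes random bits on photons, each in a randomly chosen basis
# (0 = rectilinear, 1 = diagonal).
alice_bits = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]

# Bob measures each photon in his own random basis. A matching basis recovers
# Alice's bit; a mismatched basis yields a random result.
bob_bases = [random.randint(0, 1) for _ in range(n)]
bob_bits = [
    bit if a_basis == b_basis else random.randint(0, 1)
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)
]

# Alice and Bob publicly compare bases (never bits) and keep only the
# positions where the bases matched -- the "sifted" shared secret key.
sifted_key = [
    a_bit
    for a_bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)
    if a_basis == b_basis
]
```

The physics-based security comes from the measurement step this sketch only simulates: an eavesdropper who measures photons in the wrong basis disturbs them, which Alice and Bob can detect by comparing a sample of the sifted key.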
Quantum internet
This hypothetical system of interconnected quantum computers uses quantum signals to communicate, instead of radio waves. To make the quantum internet possible, the infrastructure must keep quantum computers in environments near absolute zero to ensure functionality.
There is speculation that this technology will be a specialized sector of the traditional internet, and that scientists will use it to conduct research between labs.