https://www.techtarget.com/searchcio/definition/NISQ-computing

What is NISQ computing? Pros and cons

By Kinza Yasar

Noisy intermediate-scale quantum (NISQ) computing describes the current stage of quantum computing, in which quantum devices have a moderate qubit count, typically ranging from 50 to 1,000. Physicist John Preskill coined the term in 2018 to characterize this pre-fault-tolerant era of quantum computation.

NISQ computing systems are advanced enough to begin tackling real-world applications across diverse fields. However, these systems remain susceptible to errors caused by environmental noise, face scalability challenges and possess only limited quantum error correction capabilities.

Characteristics of NISQ computers

NISQ computing is characterized by the following features:

Benefits of NISQ computers

NISQ computers are hampered by noise and a moderate qubit count. Even so, they offer advantages over classical computers, including the following:

Limitations and drawbacks of NISQ computers

While NISQ computers represent a step forward in quantum computing, limitations in technologies and investments are keeping quantum computers from reaching their full potential. For example, quantum hardware requires highly specialized infrastructure, including cryogenic systems and precise control mechanisms, which are both complex and costly to develop and maintain.

Also, the absence of standardized quantum software frameworks slows the creation of applications that can operate across different quantum platforms, resulting in a fragmented market. There's also a shortage of skilled professionals with quantum computing expertise, which poses a challenge for advancing the field.

The following are some common challenges of NISQ computers:

Use cases of NISQ computers

NISQ computers are being explored for practical applications across various industries. The following are some notable use cases and applications of NISQ computers:

Promising algorithms for NISQ computing

NISQ devices are defined by their limited number of qubits and their vulnerability to noise. As a result, they need specialized algorithms that can function effectively within these constraints.

The following are some of the most prominent examples:
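Variational algorithms such as the variational quantum eigensolver (VQE) are among the best-known examples: a classical optimizer tunes the parameters of a short quantum circuit while the quantum device only evaluates expectation values, keeping circuits shallow enough for noisy hardware. The toy sketch below simulates that loop entirely classically on a single qubit; the Hamiltonian, gate choice and optimizer settings are illustrative assumptions, not anything prescribed by NISQ hardware.

```python
import numpy as np

# Toy VQE: minimize <psi(theta)| Z |psi(theta)>, where
# |psi(theta)> = RY(theta)|0>. The "quantum" expectation value is
# simulated classically here; on real hardware it would come from
# repeated measurements of a parameterized circuit.

Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Hamiltonian (Pauli-Z)

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def energy(theta):
    """Expectation value <Z> for the state RY(theta)|0> (equals cos(theta))."""
    psi = ry(theta) @ np.array([1, 0], dtype=complex)
    return float(np.real(psi.conj() @ Z @ psi))

# Classical outer loop: gradient descent on the circuit parameter.
theta, lr = 0.5, 0.4
for _ in range(100):
    grad = (energy(theta + 1e-4) - energy(theta - 1e-4)) / 2e-4
    theta -= lr * grad

print(round(energy(theta), 4))  # converges toward -1, the ground state of Z
```

The division of labor is the point: the quantum side runs only a short circuit per step, while the noise-tolerant optimization happens classically.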

History of NISQ computing technology

The NISQ era of quantum computing emerged when it was realized that initial quantum machines were constrained by their limited qubit count and significant noise due to the absence of full error correction.

The NISQ phase followed the theoretical groundwork laid in the 1980s and early experimental demonstrations of quantum algorithms on small systems in the late 1990s and early 2000s. A significant milestone came in 2019, when Google demonstrated quantum supremacy with its 53-qubit Sycamore processor. The achievement highlighted the potential of NISQ hardware, even as researchers continued to search for practical applications.

By 2024, the largest quantum processors exceeded 1,000 qubits, with Atom Computing's 1,180-qubit processor and IBM's 1,121-qubit Condor leading the way. Despite these advancements, today's NISQ systems struggle with high error rates and short coherence times, hindering their ability to execute complex tasks reliably.

Current research focuses on developing hybrid quantum-classical algorithms and error mitigation techniques to extract useful computations from these noisy devices for applications in various fields.
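One widely studied error mitigation technique is zero-noise extrapolation: the same circuit is run at several deliberately amplified noise levels, and the results are extrapolated back to the zero-noise limit. The sketch below mocks a noisy device with an assumed exponential-decay noise model and invented constants; real implementations amplify noise physically (for example, by gate folding) rather than simulating it.

```python
import numpy as np

# Zero-noise extrapolation (ZNE) sketch. A pretend device returns an
# expectation value whose signal decays with a noise-scale factor;
# fitting across several scales and evaluating the fit at scale 0
# recovers an estimate of the noiseless value.

rng = np.random.default_rng(0)
E_IDEAL = 0.8  # true noiseless expectation value (assumed known for this demo)

def noisy_expectation(scale, shots=200_000, decay=0.15):
    """Mock hardware: exponential signal decay plus finite-shot noise."""
    mean = E_IDEAL * np.exp(-decay * scale)
    return mean + rng.normal(0, 1 / np.sqrt(shots))

scales = np.array([1.0, 2.0, 3.0])  # noise amplification factors
values = np.array([noisy_expectation(s) for s in scales])

# Fit a quadratic in the noise scale and evaluate it at scale = 0.
coeffs = np.polyfit(scales, values, deg=2)
mitigated = np.polyval(coeffs, 0.0)

print(f"raw (scale=1): {values[0]:.3f}, mitigated: {mitigated:.3f}")
```

The mitigated estimate lands much closer to the ideal value than the raw scale-1 reading, at the cost of extra circuit executions; no extra qubits are needed, which is what makes techniques like this attractive on NISQ hardware.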

Future of quantum computing technologies

According to a McKinsey & Company study, four sectors are expected to be the earliest beneficiaries of quantum computing: chemicals, life sciences, finance and mobility. Those sectors could gain up to $2 trillion by 2035.

The future of quantum computing extends beyond the NISQ era, with the ultimate goal being the realization of fault-tolerant quantum computers. This transition will be marked by the ability to perform arbitrarily long and complex computations with low error rates.

The evolution beyond NISQ will involve the following stages and advancements:

The timeline for achieving fully fault-tolerant quantum computing remains uncertain, with estimates ranging from the late 2020s to the 2030s or beyond. Some roadmaps anticipate demonstrating scientific quantum advantage with early fault-tolerant quantum computers (FTQCs) within the next few years, potentially leading to a commercial tipping point by 2030.


14 May 2025
