NISQ Is Dead, a Dying Dead End, With No Prospects for a Brighter Future or Practical Quantum Computing

Jack Krupansky
19 min read · Feb 15, 2023

--

This is just a final note on my decision to abandon NISQ approaches to quantum computing, as well as full quantum error correction (QEC), in favor of advancing to low-noise or what I call near-perfect qubits and post-NISQ quantum computing, with full any-to-any qubit connectivity and quantum processors capable of fine granularity of phase and probability amplitude sufficient to support non-trivial quantum Fourier transforms and quantum phase estimation, such as for quantum computational chemistry. This is the only viable path to practical quantum computing. This informal paper justifies my decision and explores its implications and consequences.

Full quantum error correction was the great hope for escaping from noisy qubits, but it is just too complex and impractical, and fails to address all of the errors and noisiness. Near-perfect qubits won’t be perfect, but far better than noisy NISQ qubits, and likely good enough for many applications.

Whether full quantum error correction can ever be fixed and made practical is debatable, but it is not a surefire bet worthy of consideration at this juncture. Doubling down on steady advances in qubit fidelity, as was done with classical transistors, is a much more practical and reliable bet. Besides, higher-fidelity qubits reduce the overhead needed for full quantum error correction anyway.

This informal paper is actually an abbreviated version of my rationale for abandoning NISQ — the meat and details of my rationale can be found in my immediately preceding informal paper on post-NISQ quantum computing:

That informal paper includes coverage of the limitations of NISQ, the rationale for moving beyond NISQ, and what life will be like with post-NISQ computing.

The bottom line is that there is no significant and interesting future for NISQ and noisy qubits, but there is indeed a bright future for non-noisy, near-perfect qubits in a post-NISQ world which finally enables practical quantum computing.

Caveat: My comments here are all focused only on general-purpose quantum computers. Some may also apply to special-purpose quantum computing devices, but that would be beyond the scope of this informal paper. For more on this caveat, see my informal paper:

Topics discussed in this informal paper:

  1. In a nutshell
  2. Quick summary of what’s wrong with current (NISQ) quantum computers
  3. Is there hope for NISQ? Nope
  4. NISQ is plagued by huge, insurmountable problems
  5. Wasn’t quantum error correction supposed to come to the rescue of NISQ and enable fault-tolerant quantum computing? Yes, but…
  6. Tell the truth, is quantum error correction really even feasible, at all? Nope
  7. In short, without quantum error correction, NISQ is useless
  8. Technically, NISQ plus quantum error correction would no longer be NISQ since it would then be fault-tolerant quantum computing
  9. Error mitigation and error suppression can help a little, but not save NISQ
  10. Qubit count is not the issue
  11. Is there a bright future for NISQ? Nope
  12. Is NISQ a dead end, a dying dead end? Yes
  13. Is NISQ dead? Yes
  14. But aren’t there applications running on NISQ hardware today? Not really, just prototypes and stripped-down functionality, not production scale
  15. Any significant quantum advantage is not possible using NISQ
  16. Is there life after the death of NISQ? Yes, post-NISQ quantum computing
  17. Is there something beyond NISQ? Yes, post-NISQ quantum computing
  18. Target 48 fully-connected near-perfect qubits as the sweet spot goal for a basic practical quantum computer
  19. Can we achieve The ENIAC Moment of quantum computing using NISQ? Nope
  20. But post-NISQ quantum computers might not be here for a few more years
  21. Use simulation for 32 to 50 qubits until post-NISQ quantum computers arrive
  22. And focus on scalable algorithms if you expect to need more qubits than can be simulated
  23. So place your bets! Just not on NISQ! Plan for post-NISQ and simulate now
  24. The NISQ era is dead
  25. So, sayonara NISQ and sayonara NISQ era
  26. A lot of talk about NISQ and the NISQ era is really about post-NISQ since the presumption is less noisy qubits, which by definition are no longer NISQ (noisy)
  27. Will the 1,121-qubit IBM Condor still be a NISQ quantum computer?
  28. Conclusion

In a nutshell

  1. NISQ qubits and gates are too noisy. Limits size of quantum algorithms and circuits before errors accumulate to render the results unusable.
  2. Below three nines of qubit fidelity is just too noisy to support practical quantum applications. Wait for three, 3.25, or even 3.5 nines of qubit fidelity. Waiting for four to five nines might be too long a wait, but 3.75 or even 3.5 nines is likely to be sufficient for many applications.
  3. NISQ coherence time is too short. Limiting the size of quantum algorithms and quantum circuits.
  4. The size of quantum algorithms and quantum circuits under NISQ is too limited.
  5. Very limited qubit connectivity under NISQ also severely limits the size of quantum algorithms and quantum circuits.
  6. Lack of fine granularity of phase and probability amplitude under NISQ limits the complexity of quantum algorithms and circuits. Especially quantum Fourier transforms and quantum phase estimation, such as needed for quantum computational chemistry.
  7. The effort required to cope with and work around these technical limitations dramatically limits productivity and ability to produce working quantum applications.
  8. The limitations of the technology limit the size, complexity, and usefulness of quantum algorithms and quantum applications.
  9. Any significant quantum advantage is not possible using NISQ. NISQ can’t support the circuit sizes needed for the quantum parallelism that yields a significant quantum advantage. There’s no point to using NISQ if no significant quantum advantage can be achieved.
  10. Production-scale practical quantum algorithms and quantum applications are not possible using NISQ.
  11. But aren’t there applications running on NISQ hardware today? Not really, just prototypes and stripped-down functionality, not production scale.
  12. The future of NISQ is not bright. No path to a real future with real value.
  13. NISQ is a dead end, a dying dead end.
  14. NISQ is dying.
  15. NISQ is dead.
  16. NISQ’s problems are insurmountable. Can’t be fixed.
  17. By definition, non-noisy qubits are no longer NISQ (noisy).
  18. Quantum error correction is too impractical to be a solution.
  19. NISQ is unfixable.
  20. Time to move on! To… post-NISQ quantum computing.
  21. Focus on near-perfect qubits — four to five nines of qubit fidelity, maybe 3.5 nines.
  22. Focus on full any-to-any qubit connectivity.
  23. Focus on greater coherence time, shorter gate execution time, and greater circuit size. Support non-trivial quantum algorithms and quantum circuits.
  24. Focus on fine granularity of phase and probability amplitude. Sufficient to support non-trivial quantum Fourier transforms and quantum phase estimation, such as needed for quantum computational chemistry.
  25. But even post-NISQ quantum computing may not be here for another couple of years. Nominally two to seven years, depending on what your quantum algorithm and quantum application need.
  26. Best to focus on simulators for now. For 32 to 50 qubits.
  27. Best to focus on scalable algorithms. That can scale to more qubits than a simulator can handle when more qubits become available in real quantum hardware. And we need analysis tools to detect gate usage that is not scalable.
  28. So, sayonara NISQ, sayonara NISQ era. Move on to post-NISQ quantum computing, based on near-perfect qubits!
  29. A lot of talk about NISQ and the NISQ era is really about post-NISQ since the presumption is less noisy qubits, which by definition are no longer NISQ (noisy).
  30. The 1,121-qubit IBM Condor won’t be a NISQ quantum computer. Since NISQ means intermediate scale, which is 50 to a few hundred.
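To make the fidelity numbers above a little more concrete, here is a rough, back-of-the-envelope sketch of how many gate operations a circuit can contain before accumulated errors push the overall success probability below 50%. This assumes independent gate errors, which is a simplification; the `max_gates` helper and the 50% threshold are illustrative choices of mine, not a standard benchmark.

```python
import math

def max_gates(nines: float, min_success: float = 0.5) -> int:
    """Rough upper bound on total gate operations before accumulated gate
    errors drop the overall success probability below min_success,
    assuming independent errors and gate fidelity f = 1 - 10**-nines.
    """
    fidelity = 1.0 - 10.0 ** (-nines)
    # success ~= fidelity ** n  =>  n ~= log(min_success) / log(fidelity)
    return int(math.log(min_success) / math.log(fidelity))

for nines in (2, 3, 3.5, 4, 5):
    print(f"{nines} nines -> roughly {max_gates(nines)} gates")
```

Two nines (99%) supports only a few dozen gates; three nines supports several hundred; four to five nines supports thousands to tens of thousands, which begins to look like enough for non-trivial quantum algorithms.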

Quick summary of what’s wrong with current (NISQ) quantum computers

Despite the tremendous advances in quantum computing over the past decade, current quantum computers, so-called NISQ devices, suffer from a number of fatal limitations. In short, these include:

  1. Qubits and gates are too noisy. Limits size of quantum algorithms and quantum circuits before errors accumulate to render the results unusable.
  2. Coherence time is too short. Limiting the size of quantum algorithms and quantum circuits.
  3. The size of quantum algorithms and quantum circuits is too limited.
  4. Very limited qubit connectivity also severely limits the size of quantum algorithms and quantum circuits.
  5. Lack of fine granularity of phase and probability amplitude limits the complexity of quantum algorithms and quantum circuits. Especially quantum Fourier transforms and quantum phase estimation, such as needed for quantum computational chemistry.

The net effect of these technical limitations is that:

  1. The effort required to cope with and work around these technical limitations dramatically limits productivity and ability to produce working production-scale quantum applications.
  2. The limitations of the technology limit the size, complexity, and usefulness of quantum algorithms and quantum applications.
  3. Any significant quantum advantage is not possible using NISQ.
  4. Production-scale practical quantum algorithms and quantum applications are not possible using NISQ.
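The coherence-time limit can be made concrete with simple arithmetic: the maximum number of sequential gates is roughly the coherence time divided by the gate execution time. The device numbers below are hypothetical, purely for illustration; they are in the general ballpark of current superconducting hardware but not tied to any specific machine.

```python
# Back-of-the-envelope: maximum sequential gates within the coherence time.
# The device numbers below are hypothetical, purely for illustration.
coherence_time_us = 100.0    # e.g., roughly 100 microseconds of coherence
two_qubit_gate_ns = 300.0    # e.g., roughly 300 nanoseconds per two-qubit gate

max_depth = int(coherence_time_us * 1000.0 / two_qubit_gate_ns)
print(f"Maximum circuit depth before decoherence: ~{max_depth} gates")
```

In practice, accumulated gate errors usually cut the usable depth well below this decoherence ceiling.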

For more detail, both of the issues and proposed solutions, see my immediately preceding informal paper on post-NISQ quantum computing:

Is there hope for NISQ? Nope

The notion that noisy qubits could actually be useful for anything other than relatively trivial quantum algorithms and quantum circuits, using more than a trivial number of qubits, was never a viable proposition. Why did anyone ever believe it?!

NISQ is plagued by huge, insurmountable problems

There’s no way to sugarcoat it: NISQ is a broken technology and, by definition, can’t be fixed.

Sure, you can (and I did) propose changes to NISQ to fix it, but then it is no longer NISQ — it no longer fits the technical definition of NISQ if the qubits are no longer noisy.

Wasn’t quantum error correction supposed to come to the rescue of NISQ and enable fault-tolerant quantum computing? Yes, but…

Yeah, that was the theory, or at least the narrative: that all of the downsides of noisy qubits could magically get corrected by quantum error correction (QEC). And that quantum error correction was coming soon, like within a few years. That was the story I heard five years ago. But here we are, five years later, and even optimists are unwilling to commit that we will have this magical quantum error correction within another five years.

So, no, full quantum error correction will not come riding to the rescue of NISQ and NISQ-based quantum algorithms and quantum applications.

Tell the truth, is quantum error correction really even feasible, at all? Nope

I was a true believer in quantum error correction and expected it within a few years, but as I dug deeper into it, my enthusiasm waned and… completely evaporated.

For more detail on the evolution of my thinking about quantum error correction — and near-perfect qubits, see my informal paper:

In short, without quantum error correction, NISQ is useless

There is no other way of putting it, the bottom line is that without full quantum error correction (QEC), NISQ is useless for non-trivial quantum algorithms and applications.

Sure, you can play around, even develop some stripped-down prototype algorithms, and experiment with the technology, but nothing that would in any way aid you in the development of practical production-scale quantum algorithms and quantum applications.

Technically, NISQ plus quantum error correction would no longer be NISQ since it would then be fault-tolerant quantum computing

And even if quantum error correction (QEC) were available on NISQ devices, that combination would by definition no longer be considered NISQ since by definition it would constitute fault-tolerant quantum computing and we would no longer talk about it as NISQ.

Error mitigation and error suppression can help a little, but not save NISQ

Various techniques can be used to accomplish quantum error mitigation and even quantum error suppression, which fall well short of full quantum error correction (QEC). These can help to a modest or moderate degree, but not enough to eliminate noise and errors sufficiently to enable larger and more complex quantum algorithms and quantum circuits to execute properly. They can help with smaller or trivial quantum circuits, but they aren’t scalable to the much larger quantum algorithms and quantum circuits needed for production-scale practical quantum computing.

Besides, limited qubit connectivity and limited overall quantum circuit size will eventually become the critical limiting technical factors no matter how effective error mitigation and error suppression techniques manage to be.

To be sure, error mitigation and error suppression will likely still have a useful and even valuable role in post-NISQ quantum computing, but alone they cannot save NISQ.
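As one concrete illustration of what error mitigation looks like, here is a minimal sketch of zero-noise extrapolation (ZNE), a commonly discussed mitigation technique: measure an expectation value at deliberately amplified noise levels, then fit a curve and extrapolate back to the zero-noise limit. The measured values below are synthetic, purely for illustration, and a simple linear fit stands in for the fancier extrapolations used in practice.

```python
# Minimal sketch of zero-noise extrapolation (ZNE). We pretend we measured
# an expectation value at artificially scaled noise levels, then fit a line
# and extrapolate to the zero-noise limit. The values below are synthetic.

def linear_extrapolate_to_zero(scales, values):
    """Least-squares linear fit, evaluated at noise scale 0 (the intercept)."""
    n = len(scales)
    mean_x = sum(scales) / n
    mean_y = sum(values) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(scales, values)) \
            / sum((x - mean_x) ** 2 for x in scales)
    return mean_y - slope * mean_x  # intercept = estimate at zero noise

# Hypothetical expectation values measured at noise scale factors 1, 2, 3.
scales = [1.0, 2.0, 3.0]
values = [0.80, 0.65, 0.50]   # decays as noise is amplified

print(f"Zero-noise estimate: {linear_extrapolate_to_zero(scales, values):.2f}")
```

Note the catch: ZNE requires many extra circuit executions per estimated value, and the extrapolation error grows with circuit size, which is exactly why mitigation alone can't rescue large circuits.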

Qubit count is not the issue

Adding more qubits — hundreds, thousands, even millions — won’t save NISQ. We aren’t even able to fully exploit 27 qubits today, due to low qubit fidelity and the lack of full any-to-any qubit connectivity, and we are unable to exploit larger quantum Fourier transforms and quantum phase estimation.

Is there a bright future for NISQ? Nope

NISQ is what it is. We got what we got. Sure, there will be incremental improvements in the months and years ahead, but nothing which will enable NISQ to achieve anything resembling practical quantum computing.

NISQ has no future. NISQ has no path to a real future with real value.

Time to move on! To… post-NISQ quantum computing, based on near-perfect qubits, full any-to-any qubit connectivity, and larger circuit size.

Is NISQ a dead end, a dying dead end? Yes

Sure, you can try to milk every drop of capability you can from NISQ quantum computers, but it will basically be all to no avail.

There is nothing of real value ahead for NISQ, just death throes and a dead end.

Time to move on! To… post-NISQ quantum computing, based on near-perfect qubits, full any-to-any qubit connectivity, and larger circuit size.

Is NISQ dead? Yes

Yes, NISQ really is dead, with no ability to provide production-scale practical quantum computing and no path to ever get there.

Yes, you can go through the motions and pretend that you are doing real, practical work on current and near-term quantum computers, but going through the motions, such as with prototypes and stripped-down functionality, will fall far short of delivering real and substantial value to your organization.

Sure, spend some time grieving, as much time as you need, but then…

Time to move on! To… post-NISQ quantum computing, based on near-perfect qubits, full any-to-any qubit connectivity, and larger circuit size.

But aren’t there applications running on NISQ hardware today? Not really, just prototypes and stripped-down functionality, not production scale

There has been quite a flurry of academic papers and press releases about implementing a wide variety of practical application features on NISQ hardware. But if you dive deep and look at the details, the fine print, these inevitably turn out to be simply prototypes rather than full, production-scale applications. And even when features are implemented, they are inevitably stripped down with limited functionality rather than providing the full functionality that production-scale applications would require.

In short, a lot of people are presenting the illusion that quantum applications are a reality, when actually they are not, and not even close at that.

Most critically, all that has been implemented to date is far from being scalable or readily evolvable to production-scale quantum algorithms and quantum applications.

Any significant quantum advantage is not possible using NISQ

Any significant quantum advantage cannot be achieved using NISQ. You can’t reach the circuit size needed for the quantum parallelism that yields a quantum advantage.

There’s no point to using NISQ if no significant quantum advantage can be achieved.

Is there life after the death of NISQ? Yes, post-NISQ quantum computing

Even if NISQ isn’t absolutely dead, or is dead but the body has not yet been properly buried, there is life after NISQ, namely post-NISQ quantum computing.

Is there something beyond NISQ? Yes, post-NISQ quantum computing

For my proposal for quantum computing after NISQ, see my informal paper on post-NISQ quantum computing:

Target 48 fully-connected near-perfect qubits as the sweet spot goal for a basic practical quantum computer

Every application and application category and user will have their own hardware requirements, but I surmise that 48 fully-connected near-perfect qubits should be enough for many applications and users to at least get started with something that shows preliminary signs of a significant quantum advantage.

For more details, see my informal paper:

Can we achieve The ENIAC Moment of quantum computing using NISQ? Nope

Achieving The ENIAC Moment of quantum computing, the first moment when a quantum computer is able to demonstrate a production-scale quantum application achieving some significant level of quantum advantage, won’t require the perfect logical qubits of full quantum error correction. But it won’t be possible with qubits as noisy as NISQ covers, and it will require greater qubit connectivity and larger maximum circuit size than are generally available using NISQ.

But The ENIAC Moment will require near-perfect qubits, which, since they are no longer noisy, will by definition never be available using NISQ.

Post-NISQ quantum computers will almost definitely enable The ENIAC Moment to be reached, but NISQ alone has no chance of achieving that milestone.

For more on The ENIAC Moment of quantum computing, see my paper:

But post-NISQ quantum computers might not be here for a few more years

It could be two to seven years before post-NISQ quantum computers are available.

Five to seven years is a long time to wait, but that would be for the more extreme applications, those requiring well above four nines or even five nines of qubit fidelity, and maybe requiring 75 to 100 fully-connected near-perfect qubits.

It might be three to four years before we have 50 to 75 fully-connected qubits with four nines of qubit fidelity.

But it may only be two to three years before we have 32 to 48 fully-connected qubits with 3.5 to four nines of qubit fidelity.

And we could get qubits with 3.25 nines of qubit fidelity in one to two years, maybe even with full any-to-any connectivity later in that timeframe.

Use simulation for 32 to 50 qubits until post-NISQ quantum computers arrive

The best option is to use a simulator until the first versions of post-NISQ quantum computers with 32 to 48 fully-connected qubits with 3.5 to four nines of qubit fidelity arrive in two to three years.

The simulator should be configured with a noise model to match the expected real quantum computer qubit fidelity.
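It is worth being clear-eyed about what classical simulation costs as the qubit count grows. A full statevector simulation needs 2^n complex amplitudes, roughly 16 bytes each, so memory doubles with every added qubit. The helper below is a simple illustration of that scaling; actual simulators vary in overhead, and the upper end of the 32-to-50-qubit range requires supercomputer-class resources rather than a workstation.

```python
def statevector_memory_bytes(num_qubits: int) -> int:
    """Memory for a full statevector: 2**n amplitudes, 16 bytes each
    (complex128: two 8-byte floats per amplitude)."""
    return (2 ** num_qubits) * 16

for n in (32, 40, 48, 50):
    gib = statevector_memory_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

At 32 qubits the statevector fits in 64 GiB, within reach of a beefy workstation; by 48 to 50 qubits it balloons into the petabyte range, which is why the largest simulations rely on clusters or on non-statevector techniques such as tensor networks.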

And focus on scalable algorithms if you expect to need more qubits than can be simulated

Some quantum algorithms and quantum circuits just won’t fit on smaller quantum computers or simulators, typically under 50 qubits, or maybe only 32 qubits. You must also watch total circuit size, which might not fit within the coherence time of a particular quantum computer, or a circuit depth which exceeds what the error rate of that quantum computer permits. The best approach to this issue is to design a scalable quantum algorithm following these essential practices:

  1. Gates are generated dynamically based on input data size and parameters.
  2. Test the algorithm with smaller data sizes on a simulator.
  3. Test the algorithm with incrementally larger data sizes until the limit of the simulator (or your patience!) is reached.
  4. Analyze the algorithm for gate patterns and parameter values which might not function for larger data sizes. For example, requiring a very fine granularity of phase or probability amplitude. Also, calculate qubit count and total circuit size for the larger data sizes to assure that they will run on real quantum computers that are expected in the coming years or that are the target for your application. Ditto for circuit depth and error rate.
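As a toy sketch in the spirit of the steps above, here is a textbook quantum Fourier transform whose gate list is generated dynamically from the input size (step 1), plus a crude analysis of whether its smallest phase angle exceeds an assumed hardware granularity limit (step 4). The `min_phase` threshold is hypothetical, standing in for whatever phase resolution a given machine can actually deliver.

```python
import math

def qft_gates(num_qubits: int):
    """Generate the gate list for a textbook n-qubit QFT (ignoring the
    final qubit-reversal swaps), sized dynamically from num_qubits."""
    gates = []
    for target in range(num_qubits):
        gates.append(("H", target))
        # Controlled-phase angle shrinks as pi/2**k for more distant qubits.
        for k, control in enumerate(range(target + 1, num_qubits), start=1):
            gates.append(("CPHASE", control, target, math.pi / 2 ** k))
    return gates

def scalability_report(num_qubits: int, min_phase: float):
    """Check whether the smallest phase angle the circuit needs is finer
    than the granularity the hardware can resolve (min_phase, an assumed
    hardware limit, not a published spec)."""
    gates = qft_gates(num_qubits)
    smallest = min(g[3] for g in gates if g[0] == "CPHASE")
    return {
        "qubits": num_qubits,
        "total_gates": len(gates),
        "smallest_phase": smallest,
        "scalable_on_hw": smallest >= min_phase,
    }

# Hypothetical hardware resolving phase down to pi/2**20.
for n in (8, 16, 32):
    print(scalability_report(n, min_phase=math.pi / 2 ** 20))
```

The gate count grows quadratically with the qubit count, and the smallest required phase angle halves with each added qubit, so a QFT that runs fine at 8 or 16 qubits can silently demand finer phase granularity than the hardware supports at 32, which is exactly the kind of non-scalable gate usage that analysis tools should flag.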

For more on designing scalable quantum algorithms, see my informal paper:

So place your bets! Just not on NISQ! Plan for post-NISQ and simulate now

With so many reasonable simulators available, there is no credible reason for anyone to be using current NISQ quantum computers.

Other than maybe as an expensive method to generate true random numbers.

Granted, post-NISQ quantum computers are not here yet, but they’ll be here within a couple of years, so now is a great time to plan for them.

Until they get here, a simulator is your best bet.

And now is a great time to be providing hardware vendors and quantum service providers with your input about what you want and need and expect in post-NISQ quantum computers. And when you expect it. They need some incentive. You can provide that.

The NISQ era is dead

It should go without saying that if NISQ is dead, then the NISQ era is dead as well, with no bright future ahead of it.

The post-NISQ era will eliminate any need for NISQ.

So, sayonara NISQ and sayonara NISQ era

With that, I intend to personally put NISQ quantum computing in my rearview mirror. It will no longer be a focus of mine. Sure, I may occasionally check up on it to see if progress is being made towards post-NISQ quantum computing, but nothing to focus any significant fraction of my energy on.

Anything less than three nines of qubit fidelity definitely won’t get my attention.

Anything with three nines of qubit fidelity might get a passing glance from me.

It will only be when something has 3.25 nines of qubit fidelity that I will start paying attention. And, it will have to have full any-to-any qubit connectivity.

And, people will have to be talking about non-trivial quantum Fourier transforms and using quantum phase estimation.

A lot of talk about NISQ and the NISQ era is really about post-NISQ since the presumption is less noisy qubits, which by definition are no longer NISQ (noisy)

Nobody in their right mind imagines that large, complex quantum algorithms and quantum circuits will execute properly within very short coherence times. So if people are talking about larger and more complex circuits, they are implicitly talking about far less noisy qubits and significantly longer coherence time. That implies they are no longer talking about noisy qubits, which means, by definition, their qubits will no longer be NISQ, since NISQ implies noisy qubits.

To me, it seems silly, pointless, even outright mindless for people to talk about NISQ and the NISQ era when the qubits are no longer NISQ — noisy.

Will the 1,121-qubit IBM Condor still be a NISQ quantum computer?

As proposed by Prof. John Preskill, a NISQ device is a noisy intermediate-scale quantum device (quantum computer). It has noisy qubits and 50 to a few hundred qubits:

So, since the 1,121-qubit IBM Condor quantum computer has more than “a few hundred” qubits, it would, by definition, no longer be intermediate scale, and hence no longer a NISQ quantum computer.

Assuming that the qubits are still at least somewhat noisy, in my own proposed terminology Condor would be a noisy large-scale quantum device, or a NLSQ device.

Whether the qubit fidelity of Condor will be sufficient to qualify for what I call near-perfect qubit fidelity remains to be seen. It is possible that the qubit fidelity of Condor might hit three nines or even a little better, in which case it could be considered a near-perfect large-scale quantum device, or a NPLSQ device. But, personally, I’d prefer to hold off calling Condor or some successor device NPLSQ until it hits a qubit fidelity of at least 3.5 nines. And my nominal definition of near-perfect qubit fidelity is really four to five nines.

For more details on my proposed terminology, see my informal paper:

All of that said, some people might still consider Condor to be NISQ since it will still be somewhat noisy and is still short of thousands of qubits.

Conclusion

  1. NISQ qubits and gates are too noisy. Limits size of quantum algorithms and circuits before errors accumulate to render the results unusable.
  2. Below three nines of qubit fidelity is just too noisy to support practical quantum applications. Wait for three, 3.25, or even 3.5 nines of qubit fidelity. Waiting for four to five nines might be too long a wait, but 3.75 or even 3.5 nines is likely to be sufficient for many applications.
  3. NISQ coherence time is too short. Limiting the size of quantum algorithms and quantum circuits.
  4. The size of quantum algorithms and quantum circuits under NISQ is too limited.
  5. Very limited qubit connectivity under NISQ also severely limits the size of quantum algorithms and quantum circuits.
  6. Lack of fine granularity of phase and probability amplitude under NISQ limits the complexity of quantum algorithms and circuits. Especially quantum Fourier transforms and quantum phase estimation, such as needed for quantum computational chemistry.
  7. The effort required to cope with and work around these technical limitations dramatically limits productivity and ability to produce working quantum applications.
  8. The limitations of the technology limit the size, complexity, and usefulness of quantum algorithms and quantum applications.
  9. Any significant quantum advantage is not possible using NISQ. NISQ can’t support the circuit sizes needed for the quantum parallelism that yields a significant quantum advantage. There’s no point to using NISQ if no significant quantum advantage can be achieved.
  10. Production-scale practical quantum algorithms and quantum applications are not possible using NISQ.
  11. But aren’t there applications running on NISQ hardware today? Not really, just prototypes and stripped-down functionality, not production scale.
  12. The future of NISQ is not bright. No path to a real future with real value.
  13. NISQ is a dead end, a dying dead end.
  14. NISQ is dying.
  15. NISQ is dead.
  16. NISQ’s problems are insurmountable. Can’t be fixed.
  17. By definition, non-noisy qubits are no longer NISQ (noisy).
  18. Quantum error correction is too impractical to be a solution.
  19. NISQ is unfixable.
  20. Time to move on! To… post-NISQ quantum computing.
  21. Focus on near-perfect qubits — four to five nines of qubit fidelity, maybe 3.5 nines.
  22. Focus on full any-to-any qubit connectivity.
  23. Focus on greater coherence time, shorter gate execution time, and greater circuit size. Support non-trivial quantum algorithms and quantum circuits.
  24. Focus on fine granularity of phase and probability amplitude. Sufficient to support non-trivial quantum Fourier transforms and quantum phase estimation, such as needed for quantum computational chemistry.
  25. But even post-NISQ quantum computing may not be here for another couple of years. Nominally two to seven years, depending on what your quantum algorithm and quantum application need.
  26. Best to focus on simulators for now. For 32 to 50 qubits.
  27. Best to focus on scalable algorithms. That can scale to more qubits than a simulator can handle when more qubits become available in real quantum hardware. And we need analysis tools to detect gate usage that is not scalable.
  28. So, sayonara NISQ, sayonara NISQ era. Move on to post-NISQ quantum computing, based on near-perfect qubits!
  29. A lot of talk about NISQ and the NISQ era is really about post-NISQ since the presumption is less noisy qubits, which by definition are no longer NISQ (noisy).
  30. The 1,121-qubit IBM Condor won’t be a NISQ quantum computer. Since NISQ means intermediate scale, which is 50 to a few hundred.
