Why I Continue to Lack Confidence That Quantum Computing Will Ever Be Able to Fulfill Most of the Grand Promises Made for It

Jack Krupansky
May 17, 2023

There are still a number of aspects of quantum computing which seem unlikely to ever come to fruition.

Many of the technical capabilities of the quantum computers themselves, as well as the quantum algorithms and quantum applications, will be technically difficult or will take a very long time to implement successfully, but will eventually come to fruition. Some capabilities, algorithms, or applications, however, are outright infeasible to ever implement, in both theory and practice — the deal killers for the grandiose promises.

This informal paper enumerates and briefly describes both the deal killers — those features, capabilities, or applications which can likely never be implemented — and the mere impediments, which can likely be overcome eventually, even if with some degree of difficulty or an excessive expenditure of time, talent, money, and other valuable resources.

Whether this paper might constitute my swan song for my quantum computing efforts, my quantum journey, remains to be seen, although I do already have one more informal paper in the works.

Caveat: My comments here are all focused only on general-purpose quantum computers. Some may also apply to special-purpose quantum computing devices, but that would be beyond the scope of this informal paper. For more on this caveat, see my informal paper:

Caveat #2: In fact, since the focus of this informal paper is attempts to fulfill grand promises made for quantum computing, my comments are also generally restricted to practical quantum computers — those with sufficient capabilities to support production-scale practical real-world quantum applications and achieve significant quantum advantage and deliver significant business value which is well beyond the reach of classical computing. For more on this caveat, see my informal paper:

Topics discussed in this paper:

  1. In a nutshell
  2. Background
  3. Practical quantum computing is the ultimate goal
  4. Expectations for significant quantum advantage
  5. Deal killers vs. mere impediments
  6. Categories of obstacles
  7. Profound reservations and likely or potential deal killers
  8. Never say never = 1 in 1,000 chance
  9. Analog nature of probability amplitude and phase, dubious granularity
  10. The analog nature of probability amplitude and phase, with dubious granularity, is likely to be the fatal Achilles heel which dooms quantum computing to severely limited utility
  11. Shor’s factoring algorithm will never be practical for large encryption key sizes, but people act as if it will be
  12. Expectation and reliance on quantum error correction (QEC) is likely misplaced
  13. Near-perfect qubits are our only true hope
  14. Scalability of circuit repetitions (shot count or shots)
  15. No prospect for accessing or processing Big Data from a quantum algorithm
  16. Exploiting quantum parallelism for little data with a big solution space
  17. Especially tough obstacles
  18. Need to achieve The ENIAC Moment, Configurable Packaged Quantum Solutions, and The FORTRAN Moment
  19. Reliance on NISQ devices is counterproductive, post-NISQ quantum computing is needed
  20. Where are all of the 40-qubit quantum algorithms? Or even 32, 28, 24, or 20-qubit algorithms?!
  21. Need a focus on automatically scalable quantum algorithms
  22. Simulating more than 40 qubits for non-trivial quantum circuits
  23. Major impediments and obstacles
  24. Low qubit fidelity is an ongoing, disturbing, and problematic issue
  25. Lack of a practical quantum computer is a major problem
  26. Difficult to achieve even a quantum computer with 48 fully-connected near-perfect qubits
  27. No real attention being given to a universal quantum computer which fully merges and fully integrates quantum computing and classical computing
  28. Moderate impediments
  29. No coherent model for quantum information
  30. Minor impediments and annoyances
  31. Special-purpose quantum computing devices are a significant distraction
  32. Odd terminology, inconsistent with classical computing for no good reason
  33. Too much hype with unwarranted excitement and enthusiasm
  34. Support software and development tools are needed, but are not a significant obstacle to progress at present
  35. Plenty of additional work is needed, but the focus here is on the deal killer issues that are an absolute barrier to the success of quantum computing
  36. Much research is needed, but it ultimately won’t eliminate the deal killers
  37. Sure, a Quantum Winter is possible but unlikely, although the risk is rising
  38. My quantum journey
  39. My writing, in general
  40. My introductory writing about quantum computing
  41. Lingering obstacles to my full and deep understanding of quantum computing
  42. Might this be my swan song for quantum computing?
  43. Conclusions

In a nutshell

  1. Practical quantum computing is the ultimate goal. But we are not even close.
  2. Expectations for significant quantum advantage. And delivery of dramatic business value. Again, we are not even close.
  3. Distinguish issues which are deal killers from mere impediments. The latter merely slow down progress and reduce productivity, while the former can never be overcome, forever limiting quantum computing, either outright precluding some applications or at least severely curtailing some applications — or their quantum advantage or their business value.
  4. Categories of impediments. Separate from the deal killers. Especially tough obstacles, major impediments, moderate impediments, minor impediments, and mere annoyances. These issues do not absolutely preclude any applications or even severely curtail them, but simply delay them or make them much more difficult.
  5. Deal killer: Analog nature of probability amplitude and phase, dubious granularity.
  6. The analog nature of probability amplitude and phase, with dubious granularity, is likely to be the fatal Achilles heel which dooms quantum computing to severely limited utility.
  7. Deal killer: Reliance on fine granularity of the analog probability amplitude and phase.
  8. Deal killer: No sense of the phenomenological basis for product states of entangled qubits, and how it could support a huge number of product states.
  9. Deal killer: Shor’s factoring algorithm will never be practical for large encryption key sizes, but people act as if it will be. This is a double-edged sword. It’s a deal killer for actually factoring very large encryption keys. Definitely a deal killer for hackers and government intelligence folks seeking to spy on people. But it’s great news for quantum computer vendors and organizations and individuals with tons of encrypted data files and network protocols. All encrypted data will be safe. No need to migrate to quantum-safe cryptography.
  10. Deal killer: Grover’s search algorithm will never be practical for data of any significant size, but people talk as if it will be.
  11. Deal killer: Expectation of quantum circuits with millions or billions of gates.
  12. Deal killer: Reliance on variational methods won’t achieve any significant quantum advantage. If the only viable quantum solution is variational, the application is not a viable candidate for a quantum solution.
  13. Deal killer: Even if execution of a quantum circuit is feasible, is the required shot count (circuit repetitions or shots) practical based on the probabilistic nature of quantum computation? Is it scalable?
  14. Deal killer: Expectation and reliance on quantum error correction (QEC) is likely misplaced. Focus on near-perfect qubits with four to five nines of qubit fidelity.
  15. Deal killer: No prospect for accessing or processing Big Data from a quantum algorithm. Quantum alternative is little data with a big solution space, exploiting quantum parallelism.
  16. Especially tough obstacle: Great difficulty of translating application problems into solution approaches and into successful quantum implementations of those solution approaches.
  17. Especially tough obstacle: We need to achieve The ENIAC Moment, Configurable Packaged Quantum Solutions, and The FORTRAN Moment. Until we do, everything is in question. Little meaningful progress has been made towards these levels of capability.
  18. Especially tough obstacle: Achieving four to five nines of qubit fidelity with near-perfect qubits. Many quantum algorithms and quantum applications should be feasible with near-perfect qubits with 3.5 to five nines of qubit fidelity, but achieving that qubit fidelity is no easy task. Achieving even a mere three nines of qubit fidelity has so far proven to be out of reach.
  19. Especially tough obstacle: Achieving five to six or more nines of qubit fidelity. Larger and more complex quantum algorithms may not be feasible with only near-perfect qubits with 3.5 to five nines of qubit fidelity. Five to six or more nines of qubit fidelity might be required.
  20. Especially tough obstacle: Getting to quantum algorithms with tens of thousands of gates operating on even just a few thousand qubits will be a very daunting challenge. Getting to even ten thousand gates operating on a few hundred qubits will still be a fairly daunting challenge.
  21. Especially tough obstacle: Discovery or invention of alternative qubit technologies. I’m not convinced that any of the existing qubit technologies will be capable of fulfilling even an interesting fraction of the grand promises made for quantum computing. Much more research is needed, to enable the discovery or invention of better qubit technologies, such as for much higher qubit fidelity, longer coherence time, finer granularity of probability amplitude and phase, and easier and higher fidelity for connectivity between non-adjacent qubits.
  22. Especially tough obstacle: Quantum Fourier transform and quantum phase estimation are way too heavy and cumbersome an approach to quantum computing, especially simply to read out the probability amplitude or phase of qubits.
  23. Especially tough obstacle: Reliance on variational methods won’t achieve any significant quantum advantage. The main challenge here is to shift to alternative approaches such as using quantum phase estimation.
  24. Especially tough obstacle: Expectation and reliance on quantum error correction (QEC) is likely misplaced. Lack of full quantum error correction, even if near-perfect qubits are available, will still likely fully preclude some applications or at least substantially curtail some applications. Some applications may indeed require more than six nines of qubit fidelity, which near-perfect qubits are unlikely to ever be able to provide.
  25. Especially tough obstacle: Reliance on NISQ devices is counterproductive. Really need to move on from NISQ, to post-NISQ quantum computing.
  26. Especially tough obstacle: Where are all of the 40-qubit quantum algorithms? Even though there isn’t any hardware yet, we do have the capability of simulating 40-qubit quantum circuits, but quantum algorithm designers still aren’t producing any 40-qubit algorithms. Something is seriously wrong here. We have an unexplained gap that needs to be resolved.
  27. Especially tough obstacle: Where are all of the 32-qubit quantum algorithms? Ditto as per 40-qubit quantum algorithms and should be even easier, but still no 32-qubit quantum algorithms to be found.
  28. Especially tough obstacle: Where are all of the 28-qubit quantum algorithms? Ditto as per 32 and 40-qubit quantum algorithms and should be even easier, but still no 28-qubit quantum algorithms to be found.
  29. Especially tough obstacle: Where are all of the 24-qubit quantum algorithms? Ditto as per 28, 32, and 40-qubit quantum algorithms and should be even easier, but still no 24-qubit quantum algorithms to be found.
  30. Especially tough obstacle: And why so few 20-qubit algorithms? This should be a no-brainer, but as with 24, 28, 32, and 40-qubit algorithms, there seems to be some sort of invisible barrier, so that there are very few 20-qubit quantum algorithms to be found. Why is it so especially tough to achieve 20, 24, 28, 32, and 40-qubit quantum algorithms, even when capable classical quantum simulators are readily available?
  31. Especially tough obstacle: Need a focus on automatically scalable quantum algorithms. People are presuming that quantum algorithms can be easily, even trivially, scaled, but that is not always or even usually the case. Virtually no attention is being given to this issue.
  32. Especially tough obstacle: Simulating more than 40 qubits for non-trivial quantum circuits. Like up to 45, 50, or even 55 qubits, and the capacity for an exponential increase in qubit quantum states, as well as greater performance to handle deeper and more complex quantum circuits, such as hundreds or even a thousand gates. Every qubit you add doubles the resource requirements, both time and memory — the 2^n exponential advantage of quantum computing. But we need to get as high as possible since classical quantum simulators are needed to facilitate debugging of quantum algorithms, and quantum computers with more than 40 fully functional and connected high fidelity qubits are not on the near-term horizon.
  33. Major impediment: Qubit fidelity. Including measurement fidelity. Still too low and not increasing at a palpable pace. But seems doable. Really need near-perfect qubits with four to five nines of qubit fidelity, or at least 3.5 nines or maybe three nines for some applications. At present, even three nines is currently beyond reach.
  34. Major impediment: Lack of full any-to-any qubit connectivity. Trapped-ion qubits have it, but superconducting transmon qubits don’t. No word or signs that silicon spin qubits will have it. But, it does seem technically feasible.
  35. Major impediment: Lack of commitment to some reasonable level of granularity for phase and probability amplitude. Not likely enough for the full grandiose promises, but at least enough for some palpable level of quantum advantage and delivery of at least some interesting level of business value.
  36. Major impediment: Lack of sufficient coherence time. To support non-trivial quantum algorithms.
  37. Major impediment: Lack of sufficient maximum circuit size. To support non-trivial quantum algorithms. Driven by coherence time, and also gate execution time.
  38. Major impediment: Inability to debug larger quantum algorithms. Classical quantum simulators can support debugging capabilities for smaller quantum algorithms, up to 32 or maybe 40 qubits, but not for quantum algorithms much larger than about 50 qubits. Ironically, it’s the larger circuits which are in the greatest need for debugging.
  39. Major impediment: Slow progress on quantum error correction (QEC). Many people are pinning all of their hopes on quantum error correction to compensate for qubit errors, but I am not. I assess that it is likely to never happen, but maybe this is more of an impediment than an absolute deal killer, provided that near-perfect qubits become available on a timely basis. But even if near-perfect qubits are the ultimate solution, all of the time, effort, resources, and talent consumed by ongoing work on quantum error correction will have impeded other areas where those resources could have been more productively applied.
  40. Major impediment: Lack of a practical quantum computer. No machines available that address all of these issues, impediments, obstacles, shortfalls, and problems.
  41. Major impediment: Lack of a coherent high level programming model. Compared to the Turing machine model and algebraic expressions and support for very large and very small and very granular real numbers and arithmetic. Current programming model is too low level, comparable to classical assembly or machine language, too difficult to use, and results in very low productivity, and even prevents significant progress on the quantum algorithm and quantum application front.
  42. Major impediment: How to exploit quantum-inspired classical computing. Not a simple mechanical or automatic — or guaranteed — process. Requires significant cleverness and deft analytical skill. First requires a great quantum algorithm to be designed. Then requires the analysis and cleverness to discern how the quantum logic can be translated into a classical algorithm that is extremely efficient. Identifying opportunities and techniques for classical efficiency is a significant challenge. Sometimes it may be as simple as applying a large number of classical servers to a Monte Carlo simulation using heuristics to boost efficiency compared to a simple brute force Monte Carlo simulation. Much more research is needed.
  43. Major impediment: Nobody seems very interested or willing to prioritize many of the major impediments and obstacles that I see for achieving practical quantum computing.
  44. Major impediment: Difficult to achieve even a quantum computer with 48 fully-connected near-perfect qubits. This might be the lower end of what could be considered a practical quantum computer. Many quantum applications will need substantially more than this, but even this low end is well beyond our reach, now and for the near and maybe even medium term.
  45. Major impediment: No real attention being given to a universal quantum computer which fully merges and fully integrates quantum computing and classical computing. Current focus on hybrid operation of classical and quantum computers, including dynamic circuits, is a weak and poor substitute for a full universal quantum computer. That’s a longer-term effort, but without any ongoing and sustained effort, it will never happen.
  46. Plenty of moderate impediments. Generally won’t limit quantum computing in the long run, but delay getting to practical quantum computing.
  47. Plenty of minor impediments. Generally won’t limit quantum computing in the long run, but may delay getting to practical quantum computing, or at least reduce productivity.
  48. Plenty of annoyances. Generally won’t limit quantum computing in the long run, but interfere with learning, understanding, adopting, and using quantum computing.
  49. Support software and development tools are needed, but are not a significant obstacle to progress at present. My only worry about this area at present is that there is too much attention and resources being given to it, with too little attention and resources being given to much more troublesome hardware and algorithm issues. The priorities are backwards.
  50. Plenty of additional work is needed, but the focus here is on the deal killer issues that are an absolute barrier to the success of quantum computing. Some of this additional work has been mentioned or even summarized here in this informal paper, but mostly to highlight important issues simply to point out that although they are important, they are not deal killers.
  51. Much research is needed, but it ultimately won’t eliminate the deal killers.
  52. Sure, a Quantum Winter is possible but unlikely, although the risk is rising.
  53. Whether this paper might constitute my swan song for my quantum computing efforts, my quantum journey, remains to be seen, although I do already have one more informal paper in the works. I may simply take a summer break and then resume in the fall, or not. Stay tuned.
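To make the simulation limits in item 32 above concrete: a full statevector simulation must store 2^n complex amplitudes, so every added qubit doubles the memory requirement. A minimal sketch, assuming 16 bytes per amplitude (two 64-bit floats):

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold a full statevector of 2^n complex amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

# Each additional qubit doubles the requirement -- the same 2^n growth that
# gives quantum computing its advantage makes classical simulation intractable.
for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")
# 30 qubits: 16 GiB
# 40 qubits: 16,384 GiB
# 50 qubits: 16,777,216 GiB
```

This is why roughly 50 qubits is a hard ceiling even for large classical clusters: a 50-qubit statevector is about 16 PiB of memory, before accounting for the time to apply each gate across it.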

Background

Quantum computing has always seemed rather odd to me, as an outsider.

I thought my view might change as I dug deeper into both theory and practice, but… no such luck.

Okay, sure, much of the superficial oddness has been addressed, but there are a number of deeper, more profound reservations which I have not been able to dispel no matter how deeply I have dug.

Even after more than five years of fairly intensive exposure, there are still a number of aspects of quantum computing which just don’t feel right given all of the grand promises made for quantum computing.

Actually, I feel reasonably confident that a number of the grandiose promises for quantum computing are simply not practical, not even in theory, not today, not in five or ten or twenty years, not ever.

I can confidently say that many of these grandiose promises are certain, likely, or potential deal killers — they simply aren’t going to happen, no matter what.

Not to mention quite a few significant or minor impediments or obstacles that impede progress for what is more likely to be practical.

And not to mention quite a few more minor annoyances that further stymie progress.

This informal paper enumerates and briefly describes these issues, both mere impediments as well as outright deal killers.

Maybe (hopefully) some of them can be addressed, but likely at least some of them cannot, which will be a big problem for the future of quantum computing.

That’s not to say that quantum computing will never happen, but simply that it won’t be able to fulfill all of the many grand promises made on its behalf.

Exactly what fraction of the grand promises can be fulfilled remains to be seen. We’ll just have to wait and watch for the technology advances to unfold.

Practical quantum computing is the ultimate goal

The focus of this informal paper is not simply doing anything with a quantum computer, but enabling practical quantum computing, defined as:

  • The fruition of quantum computing, when practical quantum computers themselves come to fruition, coupled with all of the software, algorithms, applications, and other components of quantum computing.

And practical quantum computer is defined as:

  • A practical quantum computer supports production-scale practical real-world quantum applications and achieves significant quantum advantage over classical computing and delivers significant business value which is well beyond the reach of classical computing.
  • Preferably dramatic quantum advantage and extraordinary business value. But significant quantum advantage and significant business value are a good start.

For more on practical quantum computers, see my informal paper:

The point here is not to define the ultimate end state of quantum computing, but to establish the criteria for the starting point for practical quantum computing, when the grand promises can begin to be fulfilled.

Similarly, the point of this informal paper is not to define the criteria to achieve all of the grand promises made for quantum computing, but to focus on what is needed to begin fulfilling at least some of the grand promises for quantum computing.

Expectations for significant quantum advantage

Quantum computing is all for naught if we can’t achieve a dramatic quantum advantage or at least a significant quantum advantage relative to classical solutions. So, the expectation for post-NISQ quantum computing — practical quantum computing — is to achieve at least significant quantum advantage.

I personally define three levels of quantum advantage:

  1. Minimal quantum advantage. 1,000X a classical solution.
  2. Substantial or significant quantum advantage. 1,000,000X a classical solution.
  3. Dramatic quantum advantage. One quadrillion X a classical solution. Presumes 50 entangled qubits operating in parallel, evaluating 2⁵⁰ = one quadrillion alternatives in a solution space of 2⁵⁰ values (quantum states).
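As a quick check on the arithmetic behind these levels, using the simple model stated above in which 50 entangled qubits evaluate 2^50 alternatives in parallel:

```python
# Advantage factors for the three levels above.
minimal = 1_000          # 1,000X a classical solution
significant = 1_000_000  # 1,000,000X a classical solution
dramatic = 2 ** 50       # 50 entangled qubits evaluating 2^50 alternatives

print(f"2^50 = {dramatic:,}")
# 2^50 = 1,125,899,906,842,624 -- roughly one quadrillion
```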

Minimal quantum advantage is not terribly interesting or exciting, and although it is better than nothing, it’s not the goal of the vision of this informal paper. Still, it could be acceptable as an early, preliminary milestone or stepping stone to greater quantum advantage.

Dramatic quantum advantage is the ultimate goal for quantum computing, but is less likely for the limited qubit fidelity envisioned by this informal paper. It may take a few more nines of qubit fidelity and 10X circuit size and coherence time to get there, so don’t expect it, at least not in the early versions of post-NISQ quantum computers.

Substantial or significant quantum advantage is a more reachable and more satisfying goal over the coming years, even if not the ultimate goal for practical quantum computing. Achieving a millionfold computational advantage over classical solutions is quite impressive, even if not as impressive as dramatic quantum advantage.

In any case, a core goal for the proposal of this informal paper is to achieve quantum advantage, and it needs to be an advantage that delivers real value to the organization.

For more on dramatic quantum advantage, see my informal paper:

For more on minimal quantum advantage and substantial or significant quantum advantage — what I call fractional quantum advantage, see my informal paper:

For general discussion of quantum advantage — and quantum supremacy, see my informal paper:

Deal killers vs. mere impediments

Just to be clear, I carefully distinguish issues and problems which are deal killers from mere impediments:

  1. Deal killers are absolute obstacles which can never be overcome. Not ever. A given deal killer might not kill all aspects of quantum computing for all applications, but may kill some applications or application categories entirely, or may simply severely limit the benefits for some applications or application categories without having the same impact on other applications or application categories.
  2. Impediments may slow or stymie progress, but eventually can be overcome. Some are easier to overcome than others. Some could deter practical quantum computing for years, some are really more of an inconvenience, and others seriously impact productivity, but determined quantum algorithm designers and quantum application developers can manage to cope with and move past these mere impediments.

Although both categories of issues and problems are covered by this paper, the primary focus is on the deal killers, those issues which have the potential of preventing quantum computers from ever achieving their full potential and never delivering on many of the grand promises that have been made for quantum computers.

The mere impediments are covered primarily to highlight that they are not deal killers. But also to highlight areas where improvements can accelerate the development and adoption of quantum computing.

Categories of obstacles

The range of obstacles considered by this informal paper include:

  1. Deal killers.
  2. Especially tough obstacles.
  3. Major impediments.
  4. Moderate impediments.
  5. Minor impediments.
  6. Annoyances.

Profound reservations and likely or potential deal killers

Subsequent sections will enumerate major and minor impediments, obstacles, and even annoyances that impede progress but are not potential deal killers. Here in this section I list my profound reservations which do have the very real potential if not certainty of being deal killers which could prevent quantum computing from achieving some or all of the grandiose promises for quantum computing.

These are capabilities which are simply not practical, even in theory, not today, not in five or ten or twenty years, not ever.

The potential (likely) deal killers are:

  1. Analog nature of probability amplitude and phase, dubious granularity. The continuous-value (analog) aspects of probability amplitude and phase are unlikely to have the precision or range to support critical capabilities such as quantum Fourier transform, quantum phase estimation, amplitude amplification, and quantum amplitude estimation. Control relies on classical analog signals and digital-to-analog conversion (DAC), which likely won’t have the precision, magnitude, and stability needed for the more extreme promises made for quantum computing. People are expecting way too much. Ultimately, this is likely to be the fatal Achilles heel which dooms quantum computing to severely limited utility.
  2. Reliance on fine granularity of the analog probability amplitude and phase. A number of preferred techniques for quantum algorithms rely on fine granularity of the analog nature of probability amplitude and phase, even finer granularity than is likely to be physically feasible. A relatively coarse granularity is feasible, but expecting an analog signal to have billions, trillions, or more of gradations is simply not realistic. These techniques include quantum Fourier transform, quantum phase estimation, amplitude amplification, and quantum amplitude estimation. They may work fine for coarser granularity, but are likely to break down at finer granularities.
  3. No sense of the phenomenological basis for product states of entangled qubits, and how it could support a huge number of product states. Are product states — 2^n of them for n entangled qubits — energy, energy levels, or what? Is there any theoretical or practical upper limit to the number of product states for entangled qubits? And how small can that energy level be — what is its quantum, and is it limited at the Planck level? Essentially, is there a limit to the number of qubits which can be used to achieve quantum parallelism, the whole point of quantum computing?
  4. Shor’s factoring algorithm will never be practical for large encryption key sizes, but people act as if it will be. The big casualty of reliance on fine granularity of the analog probability amplitude and phase.
  5. Grover’s search algorithm will never be practical for data of any significant size, but people talk as if it will be. At best, it can achieve only a quadratic speedup. For larger data sizes it will run into the same analog granularity problem as Shor’s factoring algorithm due to its reliance on amplitude amplification, or the number of iterations required for a quadratic complexity solution may simply be prohibitively high. If your application solution requires Grover search, then it’s likely not a good fit for quantum computing.
  6. Expectation of quantum circuits with millions or billions of gates. Simply not credible, either for feasibility or utility. Thousands of gates, yes. Tens of thousands of gates, maybe. But not more than that.
  7. Reliance on variational methods won’t achieve any significant quantum advantage. Variational methods break quantum parallelism and fail to exploit the full exponential speedup that quantum computers promise. Sure, these methods can work, at least for smaller amounts of data, but they have no potential for achieving an exponential speedup for production-scale applications. They may achieve some advantage, but not enough to achieve the grand promises made for quantum computing. In short, if the solution to an application problem requires a variational method, then it’s likely not a good fit for quantum computing. Other than that, the main challenge here is to shift to alternative approaches such as using quantum phase estimation.
  8. Even if execution of a quantum circuit is feasible, is the required shot count (circuit repetitions or shots) practical based on the probabilistic nature of quantum computation? How scalable is shot count as the input data and input parameters increase in size? Especially for circuits with hundreds or thousands of qubits and tens of thousands of gates or more. Very likely to be prohibitive, particularly for larger data sizes.
  9. Expectation and reliance on quantum error correction (QEC) is likely misplaced. Many people are pinning all of their hopes on quantum error correction to compensate for qubit errors, but I am not. I assess that it is likely to never happen. Most notably, no proposed quantum error correction scheme can handle continuous analog values or subtle gate execution errors. Gross and discrete errors can be handled, but subtle analog and gate execution errors are beyond the scope of digital quantum error correction schemes. But this may be more of an impediment than an absolute deal killer if near-perfect qubits become available on a timely basis. But even if near-perfect qubits are the ultimate solution, all of the time, effort, resources, and talent consumed by ongoing work on quantum error correction will have impeded other areas where those resources could have been more productively applied. In any case, lack of full quantum error correction, even if near-perfect qubits are available, will still likely fully preclude some applications or at least substantially curtail some applications. Some applications may indeed require more than six nines of qubit fidelity, which near-perfect qubits are unlikely to ever be able to provide.
  10. No prospect for accessing or processing Big Data from a quantum algorithm. Not now. Not ever. Quantum alternative is little data with a big solution space, exploiting quantum parallelism.

Never say never = 1 in 1,000 chance

Even when I make a fairly definitive statement, I tend to offer the caveat of my philosophy of never say never.

What I mean by never say never is that there may be very little chance, but there is at least some chance. How much of a chance? I use the model of never say never meaning no more than a 1 in 1,000 chance. Significantly less than a 1% chance. But still at least some possibility.

Even so, sometimes never really is never. It’s just that we can never know that in advance with any absolute sense of certainty.

Analog nature of probability amplitude and phase, dubious granularity

For more on my concerns related to precision and range for the continuous-value aspects of probability amplitude and phase, see my informal paper:

And for some additional perspective:

The reliance of the quantum Fourier transform and quantum phase estimation on fine granularity of probability amplitude and phase is a major concern. Ditto for amplitude amplification and quantum amplitude estimation. Some uses, such as Shor’s factoring algorithm, seem to be dead-end deal killers for quantum computing.

The analog nature of probability amplitude and phase, with dubious granularity, is likely to be the fatal Achilles heel which dooms quantum computing to severely limited utility

Just to emphasize and reinforce the point of the preceding section: the analog nature of probability amplitude and phase, with fairly limited granularity, really is likely to be the fatal Achilles heel which dooms quantum computing to severely limited utility.

See the citations in the preceding section.

Shor’s factoring algorithm will never be practical for large encryption key sizes, but people act as if it will be

Shor’s factoring algorithm, with its potential to crack large encryption keys, is unfortunately (or fortunately, depending on your perspective) one of the casualties of the limited granularity of the analog nature of probability amplitude and phase.

As noted, this is a double-edged sword:

  1. This is a deal killer for actually factoring very large encryption keys. Definitely a deal killer for hackers and government intelligence folks seeking to spy on people.
  2. But it’s great news for quantum computer vendors and organizations and individuals with tons of encrypted data files and network protocols. All encrypted data will be safe. No need to migrate to quantum-safe cryptography.

For details about these issues for Shor’s factoring algorithm, see my informal paper:

And for some additional perspective:

Expectation and reliance on quantum error correction (QEC) is likely misplaced

Many people are pinning all of their hopes on quantum error correction to compensate for qubit and gate errors, but I am not. I assess that it is likely to never happen.

Most notably, no proposed quantum error correction scheme can handle continuous analog values or subtle gate execution errors. Gross and discrete errors can be handled, but subtle analog and gate execution errors are beyond the scope of digital quantum error correction schemes.

For example, small or precise rotation angles can be especially problematic — no software, firmware, or hardware can possibly know what the qubit quantum state (or product state for multiple entangled qubits) should be after executing a gate for a qubit in some arbitrary (but unreadable) quantum state, with probability amplitude and phase as fine-grained analog values rather than discrete digital values.
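To illustrate the scale of the problem, here is a rough sketch (the 20-bit QFT angle is just an assumed example) comparing the amplitude shift of a tiny rotation against typical gate error rates:

```python
import cmath
import math

# Smallest rotation angle used in a 20-bit QFT (assumed example).
theta = 2 * math.pi / 2 ** 20

# Amplitude change induced by that rotation: |1 - e^{i*theta}| = 2*sin(theta/2).
shift = abs(1 - cmath.exp(1j * theta))
print(f"rotation angle: {theta:.2e} radians")
print(f"amplitude shift: {shift:.2e}")
# Typical two-qubit gate error rates today are around 1e-3 to 1e-4, so the
# intended amplitude change is orders of magnitude smaller than the gate
# error itself; no correction scheme can tell the tiny intended rotation
# apart from noise.
```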

But maybe this is more of an impediment than an absolute deal killer if near-perfect qubits become available on a timely basis.

But even if near-perfect qubits are the ultimate solution, all of the time, effort, resources, and talent consumed by ongoing work on quantum error correction will have impeded other areas where those resources could have been more productively applied.

For more on quantum error correction, see my informal paper:

And for more on my more recent thoughts about the infeasibility of quantum error correction, see my informal paper:

Near-perfect qubits are our only true hope

Despite the fact that so many people are pinning their hopes on quantum error correction to compensate for poor qubit quality, I continue to have faith that so-called near-perfect qubits have great promise to be sufficient for many quantum applications.

Many quantum algorithms and quantum applications should be feasible with near-perfect qubits with 3.5 to five nines of qubit fidelity, but achieving that qubit fidelity is no easy task. Achieving even a mere three nines of qubit fidelity has so far proven to be out of reach.

Four to five nines of qubit fidelity should be the goal.

Although 3.75 nines of qubit fidelity should be enough for some applications.

Even 3.5 nines of qubit fidelity should be enough for some applications.

And even 3.25 nines of qubit fidelity might be enough for some applications.

But a mere three nines of qubit fidelity is unlikely to enable many applications at all, beyond some smaller, niche applications.

For more detail on near-perfect qubits, see my informal paper:

My proposal for a practical quantum computer based on 48 near-perfect qubits:

Scalability of circuit repetitions (shot count or shots)

Even if execution of a quantum circuit is feasible, is the required shot count (circuit repetitions) practical (scalable) based on the probabilistic nature of quantum computation?

How scalable is shot count as the input data and input parameters increase in size?

It’s very likely to be prohibitive for any non-trivial algorithm with a non-trivial amount of data.

Especially for circuits with hundreds or thousands of qubits and tens of thousands of gates or more.

For n qubits, it might take something on the order of 2^n shots, or at least a significant fraction of that, in some cases.

A billion-billion shots might be required for 60 qubits.

A quadrillion-quadrillion shots might be required for 100 qubits.

Even if only a fraction of that, it could still be very impractical.
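A back-of-the-envelope calculation shows the scale (the sampling rate is purely an assumption for illustration):

```python
# Worst-case shot counts if on the order of 2^n shots were needed.
for n in (40, 60, 100):
    print(f"{n} qubits: 2^{n} = {2 ** n:.3e} shots")

# Even at an optimistic one million shots per second (assumption):
shots_per_second = 1_000_000
seconds_per_year = 365.25 * 24 * 3600
years = 2 ** 60 / shots_per_second / seconds_per_year
print(f"2^60 shots at {shots_per_second:,}/s: ~{years:,.0f} years")
```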

Little attention — NO attention — is being paid to this issue.

For more discussion of this issue, see my informal paper:

No prospect for accessing or processing Big Data from a quantum algorithm

All input data for a quantum algorithm must be encoded in the gate structure of the quantum circuit. And all output data comes from measurement of the qubits. That’s a very serious limitation on data size, both for input data and for output data.

So, there is no prospect for accessing or processing Big Data from a quantum algorithm.

Not now. Not ever.

So, there is no possibility — ever — of accessing Big Data (terabytes?) either as input data, output data, or updates to a Big Data database.

Instead, the sweet spot for quantum computing is little data with a big solution space.

Exploiting quantum parallelism for little data with a big solution space

The alternative to Big Data for quantum computing is little data with a big solution space.

The essence of little data with a big solution space to exploit quantum parallelism:

  1. A relatively small amount of input data. All input data must be encoded in the gate structure of the quantum circuit.
  2. A relatively small quantum computation. Quantum circuits cannot be very large.
  3. Perform that relatively small quantum computation over a big solution space. Exploiting quantum parallelism.
  4. Select a relatively small portion of that big solution space. As part of the quantum parallelism of the core computation. Generally accomplished by exploiting interference and phase of the probability amplitude. Result is in the output register of qubits.
  5. Measure the output register of qubits to get the relatively small result.
  6. Perform this sequence a number of times (shot count, shots, or circuit repetitions) to develop a statistically meaningful result or average: the expectation value of the result.

With this little data approach, only the inner loop of the classical computation can be implemented as a quantum circuit, with limited quantum advantage (if any), while any Big Data is processed classically as an outer loop which invokes the quantum circuit for each element of Big Data to be processed. Whether any significant quantum advantage can be achieved with such a Big Data outer loop is debatable and will vary widely between applications.
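The role of shots and expectation values (steps 5 and 6 above) can be illustrated with a purely classical toy sketch (no quantum stack involved; the probability value is an arbitrary assumption):

```python
import random

random.seed(42)  # fixed seed for reproducibility

p1 = 0.36  # assumed true probability of measuring |1> for illustration

def run_shots(shots):
    """Estimate the probability of measuring |1> from repeated shots."""
    ones = sum(1 for _ in range(shots) if random.random() < p1)
    return ones / shots

for shots in (100, 10_000, 1_000_000):
    est = run_shots(shots)
    print(f"{shots:>9} shots: estimate {est:.4f} (true {p1})")
# Statistical error shrinks only as 1/sqrt(shots): 100x more shots are
# needed for each additional decimal digit of precision in the result.
```

That 1/sqrt(shots) convergence is exactly why shot count scalability matters: each extra digit of precision in the expectation value multiplies the shot count by 100.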

For more detail on little data with a big solution space, see my informal paper:

Especially tough obstacles

Although these issues are not necessarily absolute deal killers for quantum computing overall, they do reduce or at least delay the potential impact for quantum computing in a very significant manner.

They may indeed block applications in some areas, even as other areas remain unimpeded.

And even if and when any of these tough obstacles are overcome, quantum computing will be significantly limited until that time.

  1. Great difficulty of translating application problems into solution approaches and into successful quantum implementations of those solution approaches. Some applications or application areas may remain insoluble. Others will simply be very difficult. Others may not be so difficult per se from a human effort perspective, but may require hardware advances that take years or decades to accomplish.
  2. We need to achieve The ENIAC Moment, Configurable Packaged Quantum Solutions, and The FORTRAN Moment. Until we do, everything is in question. Little meaningful progress has been made towards this level of capability.
  3. Achieving four to five nines of qubit fidelity with near-perfect qubits. Many quantum algorithms and quantum applications should be feasible with near-perfect qubits with 3.5 to five nines of qubit fidelity, but achieving that qubit fidelity is no easy task. Achieving even a mere three nines of qubit fidelity has so far proven to be out of reach.
  4. Achieving five to six or more nines of qubit fidelity. Larger and more complex quantum algorithms may not be feasible with only near-perfect qubits with 3.5 to five nines of qubit fidelity. Five to six or more nines of qubit fidelity might be required.
  5. Getting to quantum algorithms with tens of thousands of gates operating on even just a few thousand qubits will be a very daunting challenge. Getting to even ten thousand gates operating on a few hundred qubits will still be a fairly daunting challenge.
  6. Discovery or invention of alternative qubit technologies. I’m not convinced that any of the existing qubit technologies will be capable of fulfilling even an interesting fraction of the grand promises made for quantum computing. Much more research is needed, to enable the discovery or invention of better qubit technologies, such as for much higher qubit fidelity, longer coherence time, finer granularity of probability amplitude and phase, and easier and higher fidelity for connectivity between non-adjacent qubits.
  7. Quantum Fourier transform and quantum phase estimation are way too heavy and cumbersome an approach to quantum computing, especially simply to read out the probability amplitude or phase of qubits. And they rely on the analog nature of probability amplitude and phase, with its dubious granularity. Is it really true that there isn’t a much simpler and much more reliable method to do such a basic operation? It sure looks that way. Maybe not an absolute deal killer, but a real bummer that saps confidence in quantum computing and dramatically slows progress.
  8. Reliance on variational methods won’t achieve any significant quantum advantage. Variational methods break quantum parallelism and fail to exploit the full exponential speedup that quantum computers promise. Sure, these methods can work, at least for smaller amounts of data, but they have no potential for achieving an exponential speedup for production-scale applications. They may achieve some advantage, but not enough to achieve the grand promises made for quantum computing. In short, if the solution to an application problem requires a variational method, then it’s likely not a good fit for quantum computing. Other than that, the main challenge here is to shift to alternative approaches such as using quantum phase estimation.
  9. Expectation and reliance on quantum error correction (QEC) is likely misplaced. Many people are pinning all of their hopes on quantum error correction to compensate for qubit errors, but I am not. I assess that it is likely to never happen. But this may be more of an impediment than an absolute deal killer if near-perfect qubits become available on a timely basis. But even if near-perfect qubits are the ultimate solution, all of the time, effort, resources, and talent consumed by ongoing work on quantum error correction will have impeded other areas where those resources could have been more productively applied. In any case, lack of full quantum error correction, even if near-perfect qubits are available, will still likely fully preclude some applications or at least substantially curtail some applications. Some applications may indeed require more than six nines of qubit fidelity, which near-perfect qubits are unlikely to ever be able to provide.
  10. Reliance on NISQ devices is counterproductive. People act as if something good can come from NISQ devices, but they’re definitely a technological dead-end. Generally people will have to wait for Post-NISQ quantum computing, which should have some real potential, but even that’s speculative and an especially tough milestone to achieve.
  11. Where are all of the 40-qubit quantum algorithms? Even though there isn’t any hardware yet that can run them with high fidelity, we do have the capability of simulating 40-qubit quantum circuits. Yet quantum algorithm designers still aren’t producing any 40-qubit algorithms. Something is seriously wrong here. We have an unexplained gap that needs to be resolved.
  12. Where are all of the 32-qubit quantum algorithms? Ditto as per 40-qubit quantum algorithms and should be even easier, but still no 32-qubit quantum algorithms to be found.
  13. Where are all of the 28-qubit quantum algorithms? Ditto as per 32 and 40-qubit quantum algorithms and should be even easier, but still no 28-qubit quantum algorithms to be found.
  14. Where are all of the 24-qubit quantum algorithms? Ditto as per 28, 32, and 40-qubit quantum algorithms and should be even easier, but still no 24-qubit quantum algorithms to be found.
  15. And why so few 20-qubit algorithms? This should be a no-brainer, but as with 24, 28, 32, and 40-qubit algorithms, there seems to be some sort of invisible barrier, so that there are very few 20-qubit quantum algorithms to be found. Why is it so especially tough to achieve 20, 24, 28, 32, and 40-qubit quantum algorithms, even when capable classical quantum simulators are readily available?
  16. Need a focus on automatically scalable quantum algorithms. People are presuming that quantum algorithms can be easily, even trivially, scaled, but that is not the case. Virtually no attention is being given to this issue.

Need to achieve The ENIAC Moment, Configurable Packaged Quantum Solutions, and The FORTRAN Moment

The ability to run toy-like or relatively trivial quantum algorithms is not a good indicator of progress towards fulfilling any of the grand promises made for quantum computing.

Three major milestones must be achieved towards the adoption of quantum computing:

  1. The ENIAC Moment. Hand-crafted applications. Very limited deployments. Relying on super-elite technical teams at only the most elite of organizations.
  2. Configurable packaged quantum solutions. Widespread deployments of a relatively few applications. Requires no quantum expertise.
  3. The FORTRAN Moment. Higher-level programming model. Widespread development of custom applications. No longer requires super-elite technical teams and is no longer limited to only the most elite of organizations.

For more detail, see my informal paper:

Reliance on NISQ devices is counterproductive, post-NISQ quantum computing is needed

People act as if something good can come from NISQ devices, but they’re definitely a technological dead-end. With only NISQ, most of the grand promises for quantum computing will be beyond reach.

Generally people will have to wait for Post-NISQ quantum computing, which should have some real potential, but even that’s speculative and an especially tough milestone to achieve.

For more detail on NISQ being a dead-end, see my informal paper:

For more on the potential of post-NISQ quantum computing, see my informal paper:

Where are all of the 40-qubit quantum algorithms? Or even 32, 28, 24, or 20-qubit algorithms?!

Even though there isn’t any hardware yet that can execute quantum algorithms with over 16 qubits with reasonably high fidelity, we do have the capability of simulating 40-qubit quantum circuits. Yet quantum algorithm designers still aren’t producing any 40-qubit algorithms. Something is seriously wrong here. We have an unexplained gap that needs to be resolved.

Ditto for 36, 32, 28, 24, and even 20-qubit quantum algorithms.

This serious issue is discussed in my informal paper:

Need a focus on automatically scalable quantum algorithms

People are presuming that quantum algorithms can be easily, even trivially, scaled, but that is not always, or even usually, the case.

Virtually no attention is being given to this issue.

For more detail on this issue, see my informal paper:

Simulating more than 40 qubits for non-trivial quantum circuits

We need classical quantum simulators which can handle significantly higher qubit counts and deeper and more complex quantum circuits.

We need to get as high as possible since classical quantum simulators are needed to facilitate debugging of quantum algorithms, and quantum computers with more than 40 fully functional and connected high fidelity qubits for non-trivial quantum circuits are not on the near-term horizon.

Like up to 45, 50, or even 55 qubits and the capacity and performance to handle the exponential increase in qubit quantum states and deeper and more complex quantum circuits, such as hundreds or even a thousand or more gates.

I’d set a qubit count of 48 and a maximum circuit size of 750 gates as a primary target — and only as a stepping stone to 50 qubits and 1,000, 1,500, and 2,000 gates.

How to get to a capacity and performance beyond that is an open research question. Incremental advances in classical computing hardware will be needed as well.

Every qubit you add doubles the resource requirements, both time and memory — the 2^n exponential advantage of quantum computing.
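A quick calculation of statevector memory shows why (assuming the common representation of 16 bytes per complex amplitude, i.e., two 64-bit floats):

```python
# Full statevector simulation: 2^n complex amplitudes at 16 bytes each.
for n in (32, 40, 45, 48, 50):
    gib = (2 ** n) * 16 / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

Going from 40 to 50 qubits multiplies the memory requirement by 1,024, from tens of terabytes to tens of petabytes, which is why each additional simulated qubit is so hard-won.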

Major impediments and obstacles

These are issues which significantly impede the progress of quantum computing, but none of them, singly or even all together, would amount to deal killers for achieving any of the grand promises made about quantum computing.

  1. Qubit fidelity. Including gate fidelity and measurement fidelity. Still too low and not increasing at a palpable pace. But seems doable. Really need near-perfect qubits with four to five nines of qubit fidelity, or at least 3.5 or maybe 3.25 for some applications. At present, even three nines is beyond reach.
  2. Lack of full any-to-any qubit connectivity. Trapped-ion qubits have it, but superconducting transmon qubits don’t. No word or signs that silicon spin qubits will have it. But, it does seem technically feasible.
  3. SWAP error rate. In the absence of full any-to-any qubit connectivity, quantum algorithms must resort to SWAP networks to shuffle qubits around. Errors introduced by SWAP gates (typically three CNOT gates — each with a two-qubit gate error rate) can make it infeasible to implement some quantum algorithms, or at least increase their error rate, possibly to a prohibitive or at least disappointing level. SWAP networks are not a scalable approach to compensating for the lack of full any-to-any qubit connectivity — they do work for smaller quantum algorithms, but not for large and complex quantum algorithms using a large number of qubits.
  4. Lack of commitment to some reasonable level of granularity for phase and probability amplitude. Likely not enough granularity for the full grandiose promises, but there needs to be at least enough for some palpable level of quantum advantage and delivery of at least some interesting level of business value.
  5. Lack of sufficient coherence time. To support non-trivial quantum algorithms.
  6. Lack of sufficient maximum circuit size. To support non-trivial quantum algorithms. Driven by coherence time, and also gate execution time. Getting to hundreds or even a few thousand gates operating on dozens of qubits will be a major challenge in itself. Getting to more than a few thousand gates operating on a hundred or more qubits, or even ten thousand or more gates operating on hundreds of qubits will be an even greater major challenge.
  7. Inability to debug larger quantum algorithms. Classical quantum simulators can support debugging capabilities for smaller quantum algorithms, up to 32 or maybe 40 qubits, but not for quantum algorithms much larger than about 50 qubits. It’s the larger circuits which are in the greatest need for debugging.
  8. Slow progress on quantum error correction (QEC). Many people are pinning all of their hopes on quantum error correction to compensate for qubit errors, but I am not. I assess that it is likely to never happen, but maybe this is more of an impediment than an absolute deal killer, provided that near-perfect qubits become available on a timely basis. But even if near-perfect qubits are the ultimate solution, all of the time, effort, resources, and talent consumed by ongoing work on quantum error correction will have impeded other areas where those resources could have been more productively applied.
  9. Lack of a practical quantum computer. No machines are available that address all of these issues, shortfalls, and problems.
  10. Lack of a coherent high level programming model. Compared to the Turing machine model and algebraic expressions and support for very large and very small and very granular real numbers and arithmetic. The current programming model is too low level, comparable to classical assembly or machine language, too difficult to use, and results in very low productivity, and even prevents significant progress on the quantum algorithm and quantum application front.
  11. How to exploit quantum-inspired classical computing. Not a simple mechanical or automatic — or guaranteed — process. Requires significant cleverness and deft analytical skill. First requires a great quantum algorithm to be designed. Then requires the analysis and cleverness to discern how the quantum logic can be translated into a classical algorithm that is extremely efficient. Identifying opportunities and techniques for classical efficiency is a significant challenge. Sometimes it may be as simple as applying a large number of classical servers to a Monte Carlo simulation using heuristics to boost efficiency compared to a simple brute force Monte Carlo simulation. Much more research is needed.
  12. Nobody seems very interested or willing to prioritize many of the major impediments and obstacles that need to be overcome to achieve practical quantum computing.
  13. Difficult to achieve even a quantum computer with 48 fully-connected near-perfect qubits. This might be the lower end of what could be considered a practical quantum computer. Many quantum applications will need substantially more than this, but even this low end is well beyond our reach, now and for the near and maybe even medium term.
  14. No real attention being given to a universal quantum computer which fully merges and fully integrates quantum computing and classical computing. Current focus on hybrid operation of classical and quantum computers, including dynamic circuits, is a weak and poor substitute for a full universal quantum computer. That’s a longer-term effort, but without any ongoing and sustained effort, it will never happen.
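To make the SWAP-network concern (item 3 above) concrete, here is a rough sketch of how SWAP errors accumulate (the 1% two-qubit gate error rate is an assumption for illustration, not a measurement of any particular machine):

```python
# Each SWAP is typically three CNOTs, each carrying the two-qubit error rate.
cnot_error = 0.01  # assumed two-qubit gate error rate (illustrative)
swap_fidelity = (1 - cnot_error) ** 3

for swaps in (1, 10, 50, 100):
    fidelity = swap_fidelity ** swaps
    print(f"{swaps:>3} SWAPs: circuit fidelity ~{fidelity:.3f}")
# At 100 SWAPs the circuit succeeds only a few percent of the time,
# which is why SWAP networks don't scale to large algorithms.
```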

Low qubit fidelity is an ongoing, disturbing, and problematic issue

Put simply, qubit fidelity is too low.

This includes coherence time, susceptibility to noise, gate fidelity, and measurement fidelity.

It’s still too low and not increasing at a palpable pace.

Really need near-perfect qubits with four to five nines of qubit fidelity, or at least 3.5 or maybe 3.25 for some applications.

At present, even three nines is beyond reach.

Overall:

  1. No solution is in sight.
  2. Complicated quantum error correction (QEC) and error mitigation schemes are, well, excessive and appalling, and overall disappointing.
  3. My solution is to more quickly evolve to near-perfect qubit fidelity which eliminates the problem for most common use cases. But that isn’t happening, yet.

Lack of a practical quantum computer is a major problem

The simple fact that there are no quantum computers available that address all of the major issues, shortfalls, problems, and impediments is… a major problem.

Put simply, we need practical quantum computers.

For more detail on what constitutes a practical quantum computer, see my informal paper:

Difficult to achieve even a quantum computer with 48 fully-connected near-perfect qubits

I’ve endeavored to speculate about what the low end of practical quantum computing might be. My tentative conclusion was that 48 fully-connected near-perfect qubits might be the sweet spot.

This could enable a 20-bit quantum phase estimation (QPE) — quantum Fourier transform (QFT) on 20 qubits, with a 20-qubit output register.

This might be the lower end of what could be considered a practical quantum computer.

Many quantum applications will need substantially more than this.

But even this low end is well beyond our reach, now and for the near and maybe even medium term.
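For perspective on what a 20-bit QPE would deliver, a simple calculation (not a claim about any particular hardware):

```python
# A 20-qubit QPE output register divides the phase interval [0, 1)
# into 2^20 discrete bins.
bits = 20
values = 2 ** bits
print(f"{bits}-bit QPE: {values:,} distinct phase values")
print(f"phase resolution: ~{1 / values:.2e} (about 6 decimal digits)")
```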

For more detail on my proposal, see my informal paper:

No real attention being given to a universal quantum computer which fully merges and fully integrates quantum computing and classical computing

Pursuing a universal quantum computer which fully merges and fully integrates quantum computing and classical computing is a longer-term effort, but without any ongoing and sustained effort, it will never happen.

This would more fully leverage the unique capabilities of both classes of computing devices.

This will likely extend the range to more applications.

And this will likely extend performance for a wider range of applications.

The current focus on hybrid operation of classical and quantum computers, including dynamic circuits, is a weak and poor substitute for a full universal quantum computer.

For more on universal quantum computers, see my informal paper:

Moderate impediments

These are issues about quantum computing which have bothered me from the beginning of my quantum journey. None of them, singly or even all together, would amount to deal killers for achieving any of the grand promises made about quantum computing, but they still impede progress to varying degrees.

In some cases, it is not a technical impediment so much as a documentation, description, and intuition impediment which interferes with a full understanding of how the science and technology works and how it can be most effectively exploited.

  1. No coherent model for quantum information. In the context of quantum computing, rather than in the context of quantum mechanics. Classical computing has a high-level model of information that is very distinct from the physics underlying classical hardware, but so far the model of information in quantum computing is intimately tied to the physicist’s model of information in a quantum sense. That may work for physicists, but isn’t very useful for computing.
  2. Lack of a coherent high level programming model. Compared to the Turing machine model and algebraic expressions and support for very large and very small and very granular real numbers and arithmetic. The current programming model is too low level, comparable to classical assembly or machine language, too difficult to use, and results in very low productivity, and even prevents significant progress on the quantum algorithm and quantum application front.
  3. Lack of high-level data types.
  4. Lack of high-level algorithmic building blocks.
  5. Lack of a high-level quantum-native programming language.
  6. Quantum Fourier transform and quantum phase estimation are way too heavy and cumbersome an approach to quantum computing, especially simply to read out the probability amplitude or phase of qubits. Is it really true that there isn’t a much simpler and much more reliable method to do such a basic operation? It sure looks that way. Maybe not a deal killer, but a real bummer.
  7. Will quantum computing ever get to the scale and maturity of server blades in a data center? Or will it likely peter out as a few thousand mainframe-size systems used by a relatively few thousands of quantum experts for relatively narrow niches of applications?
  8. Must all quantum computation (algorithms) be reversible? And uncomputation as well. Early works explicitly said this religiously, as a matter of quantum computing dogma, but more recently, nobody says it, at all, ever, and nobody seems to do it. This needs to be clarified.
  9. Why exactly does a Clifford gate set plus the T gate constitute a universal gate set? Universal with respect to what? Plain language and intuition, please!
  10. What is the number of binary bits or decimal digits of precision supported or required for entries of unitary transformation matrices and gate parameters, such as rotation and phase? Single-precision floating point? Double? Quad precision? What does the software API support, what does the firmware support, what does the gate execution support, what do the lasers, microwaves or other qubit manipulation technology support, and what do the actual qubits support? How many bits or digits of precision do we need? How many bits or digits of precision can we use, at most? Clear guidelines and limitations need to be carefully documented.
  11. The documentation and training for CNOT is silent with regards to superimposed basis states. Usually speaks as if the qubits were pure 0 or 1 — “if 1”. Doesn’t give any hint that superposition can lead to Bell states.
  12. No plain language and intuitive description of pure and mixed states. The raw math is clear (albeit complicated), but the intuitive meaning is… unknown.
  13. No plain language and intuitive description of density matrix or density operator and how is it used or required in quantum computation.
  14. No plain language and intuitive description of the trace of a unitary transformation matrix. The math is clear, but not the meaning or purpose of the math. How should programmers be using the concept in algorithm design? It’s simply defined as the sum of the entries on the main diagonal of a square matrix, but phenomenologically, what is actually going on and why is it significant? It has something to do with a density matrix and pure vs. mixed states. A more clear definition of pure and mixed states might clarify the matter.
  15. No plain language and intuitive description of Clifford groups and Clifford gates. Why do single-qubit gates need to belong to a Clifford group? We need plain language. It needs to be intuitive.
  16. No plain language and intuitive description for global property. Including how it relates to interference and quantum parallelism.
  17. What does “up to a factor” or “up to a constant factor” or “equivalent up to global phase” really mean? A phase shift that has no impact on the outcome of measurement? Change in amplitude that leaves probability unchanged? What exactly is the point or purpose or effect of saying it? We need a technically correct and complete description. And one that is intuitive as well. Other phrases… Up to a constant phase factor. Up to a constant phase shift. Up to a relative phase shift. Up to an unobservable global phase factor. (What is a phase factor vs. phase shift?) Up to an unimportant constant factor. Up to an unimportant global phase T. Up to an unimportant global phase. Up to an unimportant global phase factor. Up to an unimportant global phase shift. Up to a global phase. Up to a global phase shift. Up to an irrelevant overall phase. Up to an irrelevant overall phase factor. Up to an irrelevant global phase. Up to a normalization factor. Up to an overall multiplicative factor. Up to an error on a single qubit of the output. Up to a constant factor. Up to a unitary transformation. Up to a relative phase shift. “equivalent up to global phase” — “Two quantum states |ψ⟩ and |ϕ⟩ are equivalent up to global phase if |ϕ⟩ = e^iθ |ψ⟩, where θ ∈ R. The phase e^iθ will not be observed upon measurement of either state.” — https://arxiv.org/abs/0705.0017.
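The quoted definition can be checked directly with a toy one-qubit example (the state and the phase angle are arbitrary choices for illustration):

```python
import cmath

# Global phase is unobservable: multiplying a state by e^{i*theta}
# leaves every measurement probability |amplitude|^2 unchanged.
psi = [0.6, 0.8j]  # a normalized one-qubit state (|0.6|^2 + |0.8|^2 = 1)
theta = 1.234      # arbitrary global phase angle
phi = [cmath.exp(1j * theta) * a for a in psi]

probs_psi = [abs(a) ** 2 for a in psi]
probs_phi = [abs(a) ** 2 for a in phi]
print(probs_psi)  # measurement probabilities of |psi>
print(probs_phi)  # identical (up to float rounding) for |phi>
```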

No coherent model for quantum information

There is no coherent model for information in the context of quantum computing, as distinct from the model of information in the context of quantum mechanics.

Classical computing has a high-level model of information that is very distinct from the physics underlying classical hardware.

But so far the model of information in quantum computing is intimately tied to the physicist’s model of information in a quantum sense. That may work for physicists, but isn’t very useful for computing.

For detail on my attempts to untangle and grasp quantum information — in the context of quantum computing, see my informal paper:

Minor impediments and annoyances

These are oddities and lesser issues about quantum computing which have bothered me from the beginning of my quantum journey. None of them, singly or even all together, would amount to a deal killer for achieving any of the grand promises made about quantum computing, but they still impede progress, albeit to lesser degrees than moderate or major impediments.

  1. Too many capabilities which are inscrutable and not amenable to developing a sensible intuition. Intuition really matters, a lot. This is equally true for quantum mechanics and quantum computing. But all too often it is simply out of reach for most people. Folklore and hype end up being used as crutches in substitution for real intuition.
  2. Special-purpose quantum computing devices are a significant distraction. Distracts from the needed focus on general-purpose quantum computing. Addressing the needs of niches is interesting, but not helpful if it pulls attention and resources from general-purpose quantum computing.
  3. The need for circuit repetitions (shots) and statistical analysis is a nontrivial annoyance and an extra level of effort compared to classical computing. The probabilistic nature of quantum computing is certainly odd and unsettling, but I understand it and accept it, so it’s not an issue for me, although it is still a level of extra effort. Better documentation and tools to support it are needed.
  4. Feeding input data and filtering results is tedious. All input data must be manually encoded into the gate structure of the quantum circuit. Raw qubit bit values are returned, which need to be statistically analyzed over a number of shots or circuit repetitions. Other post-processing is typically needed.
  5. Odd terminology, inconsistent with classical computing for no good reason. Qubits, quantum information, gates, circuits, etc.
  6. Obstinate refusal of people to use plain language. The root problem is that the field of quantum computing is still owned by quantum physicists. Obviously they have no incentive to give up their own cryptic vocabulary and conceptual model in favor of plain language.
  7. None of the set of introductory quantum algorithms, including Shor’s factoring and Grover’s search algorithms, has any real utility. Read any significant journal paper and you might see historical references to Shor, or maybe Grover, but published algorithms never build on those introductory quantum algorithms. We desperately need a better set of introductory quantum algorithms, ones that are easier to understand, enlightening, helpful for developing intuition, and actually useful in production-scale practical real-world quantum applications.
  8. There is no simple Hello World quantum algorithm and quantum application which demonstrates quantum parallelism and significant quantum advantage. How hard can that be? Why is that so hard? In any case, it is needed.
  9. Overall, the pace of real and meaningful technical advances in quantum hardware is much too slow for me to tolerate. Too boring, too depressing, and outright soul-crushing. Lack of meaningful breakthroughs. Quantum computing is becoming, or has already become, too much like nuclear fusion power, where grand promises are made, but fulfillment of those promises is always years away, even after decades of nominal progress.
  10. Too much hype with unwarranted excitement and enthusiasm. Much more hype than actual technical progress, and lack of meaningful breakthroughs. Hype can distract and mislead people, causing them to waste attention, time, energy, and resources on unproductive pursuits.
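To illustrate item 3’s point about shots and statistical analysis, here is a minimal sketch assuming a hypothetical circuit whose answer qubit measures as 1 with probability 0.3. The probability, seed, and shot counts are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed for reproducibility

# Hypothetical circuit output: the answer qubit measures as 1 with
# probability p_true. Each shot is one run-and-measure of the circuit.
p_true = 0.3

for shots in (100, 10_000, 1_000_000):
    ones = rng.binomial(shots, p_true)  # simulate counting the |1> outcomes
    p_est = ones / shots
    # The statistical error shrinks only as ~1/sqrt(shots), which is why
    # higher-precision results demand quadratically more repetitions.
    print(f"{shots:>9} shots: estimated probability {p_est:.4f}")
```

This is exactly the extra level of effort a classical program never incurs: a classical computation returns its answer once, while a quantum result must be reconstructed statistically from many repetitions.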

Special-purpose quantum computing devices are a significant distraction

Special-purpose quantum computing devices distract attention and resources from the needed focus on general-purpose quantum computing.

Addressing the needs of niches is interesting, but not helpful if it pulls attention and resources from the general-purpose quantum computing devices which will address the broader range of applications suitable for quantum computing.

For more detail on both general-purpose quantum computers and special-purpose quantum computing devices, see my informal paper:

Odd terminology, inconsistent with classical computing for no good reason

Every field should have simple, clear, complete, and consistent terminology. Quantum computing is no different.

Some examples of odd terminology in quantum computing:

  1. Qubit is a hardware device, not abstract information as a bit is in classical computing.
  2. There is no formal term for quantum information in quantum computing comparable to information (or data) in classical computing. Information is represented as quantum state in qubits, but there is no term for the abstract information, distinct from the raw quantum state of qubits, with its basis states, wave functions, and probability amplitudes and phases, as well as product states for entangled qubits.
  3. Gates are operations or instructions rather than hardware devices as in classical computing.
  4. Circuits are sequences (or a graph) of operations or instructions rather than hardware devices as in classical computing.
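To illustrate the distinction item 2 draws between abstract quantum information and the physical qubit device, here is a minimal numpy sketch of state vectors, product states, and an entangled state. The vectors are illustrative and not tied to any particular hardware:

```python
import numpy as np

# The abstract quantum information: a qubit's state is a 2-element vector
# of complex probability amplitudes over the basis states |0> and |1>,
# independent of whatever hardware device happens to hold it.
zero = np.array([1, 0], dtype=complex)  # |0>
one = np.array([0, 1], dtype=complex)   # |1>

# Two unentangled qubits: the joint state is a tensor (Kronecker) product.
product_state = np.kron(zero, one)  # |01> = amplitudes [0, 1, 0, 0]

# An entangled Bell state (|00> + |11>)/sqrt(2) cannot be factored into
# any product of two single-qubit states; the information is irreducibly joint.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(bell)  # amplitudes [0.7071, 0, 0, 0.7071]
```

Classical computing has a name for this abstract layer (information, data, bits); the point of item 2 is that quantum computing has no comparable formal term distinct from the raw quantum state of the hardware qubits.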

Too much hype with unwarranted excitement and enthusiasm

There has been much more hype than actual technical progress in quantum computing, and a lack of meaningful breakthroughs.

Hype can distract and mislead people, causing them to waste attention, time, energy, and resources on unproductive pursuits.

For a catalog of hype used (misused) in quantum computing, see my informal paper:

Support software and development tools are needed, but are not a significant obstacle to progress at present

As important as support software and development tools are, they are not an area of major or even minor concern at present.

If anything, I think quantum computing is overtooled: we have more tools than we need, especially relative to more pressing needs.

To be clear, overtooling is detrimental, not an advantage.

My only worry about this area at present is that there is too much attention and resources being given to it, with too little attention and resources being given to much more troublesome hardware and algorithm issues. The priorities are backwards.

We need to flip the priorities and focus on research and engineering for better hardware, better programming model, better algorithmic building blocks, better algorithm design methods, better algorithm design patterns, and better algorithms and applications, and then we can pursue a more rational approach to support software and development tools which are a much better fit for both the underlying hardware and the kinds of application solutions we really need.

Plenty of additional work is needed, but the focus here is on the deal killer issues that are an absolute barrier to the success of quantum computing

Yes, there is plenty of additional work needed to achieve practical quantum computing, to achieve significant quantum advantage, and to deliver significant business value for quantum computing, but the focus of this informal paper is on the deal killer issues which stand as an absolute barrier limiting quantum computing from achieving the many grand promises which have been made for it.

Some of this additional work has been mentioned or even summarized here in this informal paper, but mostly to point out that although these issues are important, they are not the deal killers.

Some other areas of work related to quantum computing, but not amounting to any deal killers for quantum computing itself overall:

  1. Quantum networking.
  2. Quantum-safe cryptography. Or post-quantum cryptography (PQC).
  3. Quantum computer clusters.
  4. Quantum memory (QRAM).
  5. Modular quantum computers.

Much research is needed, but it ultimately won’t eliminate the deal killers

Yes, issues which are deal killers are indeed deal killers, whether or not additional research is pursued. Research, or the lack of it, has no bearing on deal killers. They are what they are, independent of research.

For reference, here is my informal paper which enumerates and briefly summarizes many of the areas of research urgently needed for quantum computing:

This is work that needs to be done, but it won’t alleviate any of the issues which are deal killers.

But this research will help to address many of the issues which are the especially tough obstacles, major impediments, moderate impediments, and minor impediments.

Sure, a Quantum Winter is possible, but unlikely, though the risk is rising

Is a so-called Quantum Winter coming?

Well, sure, it’s very possible, but at present it still seems unlikely.

But the risk of a Quantum Winter is indeed rising.

For a more detailed discussion of Quantum Winter for quantum computing, see my informal paper:

Actually, since I wrote that over a year ago (March 2022), the timeframe would become one to two years from when I’m writing this (May 2023).

My quantum journey

I began my journey into quantum computing — my quantum journey — about five and a half years ago.

My main motive was not to actually use a quantum computer, but simply to discern whether the technology really worked and what it really did.

Overall, I’m interested in the capabilities, limitations, and issues related to quantum computing. For more discussion of this, see my informal paper:

For details of the first stage of my quantum journey, see my informal paper from August 2020:

And my follow-up after that, from September 2022:

My writing, in general

As part of my quantum journey, I’ve done a lot of writing, strictly informal writing, all posted on Medium. My informal papers are listed here:

My introductory writing about quantum computing

I’ve endeavored to write an introduction to quantum computing, consisting of a handful of my informal papers:

If you simply want the briefest of introductions to quantum computing, check out my elevator pitch:

Here’s my suggested reading list for those attempting to get started in quantum computing:

  1. What Is Quantum Computing?
  2. What Is a Quantum Computer?
  3. What Is Quantum Information?
  4. What Is Quantum Information Science?
  5. What Are Quantum Effects and How Do They Enable Quantum Information Science?

I don’t have my own paper on getting started with hands-on quantum programming, but the IBM Qiskit Textbook is a decent tutorial after digesting at least an introductory level from my papers listed above:

Lingering obstacles to my full and deep understanding of quantum computing

I actually wrote an informal paper on the topic of this paper over four years ago, in March 2019:

I have indeed made great progress since then, but some of these lingering obstacles still remain despite my best efforts, which is the point of this current informal paper.

Might this be my swan song for quantum computing?

To be honest, I really do think that I’ve written just about everything that I have urgently felt needs to be written about quantum computing.

Of course there’s always plenty more to write about, but if I stopped here, with this particular informal paper (and one more I already have in progress), maybe this really would be enough for me personally to write about the topic.

And even if I do stop here, I can always choose to start writing again.

Maybe this will simply mark the end of a phase or stage, with some other, distinct phase or stage of my quantum journey to follow.

Or maybe I’ll simply take a summer break for three or four months before deciding what to do next. Finishing up this month (May 2023) and starting June 2023 (or the Memorial Day weekend at the end of May) with a break or pause until September or fall might make sense, especially since technical advances are likely to come slower during the summer anyway. Then I might resume when technical advances pick up again after the summer season, or not.

With that, I’ll leave it as an open question as to whether this informal paper constitutes my swan song for quantum computing.

For now, stay tuned and see how it all really plays out.

Conclusions

  1. Practical quantum computing is the ultimate goal. But we are not even close.
  2. Expectations for significant quantum advantage. And delivery of dramatic business value. Again, we are not even close.
  3. Distinguish issues which are deal killers from mere impediments. The latter merely slow down progress and reduce productivity, while the former can never be overcome, forever limiting quantum computing, either outright precluding some applications or at least severely curtailing some applications — or their quantum advantage or their business value.
  4. Categories of impediments. Separate from the deal killers. Especially tough obstacles, major impediments, moderate impediments, minor impediments, and mere annoyances. These issues do not absolutely preclude any applications or even severely curtail them, but simply delay them or make them much more difficult.
  5. Deal killer: Analog nature of probability amplitude and phase, dubious granularity.
  6. The analog nature of probability amplitude and phase, with dubious granularity, is likely to be the fatal Achilles’ heel which dooms quantum computing to severely limited utility.
  7. Deal killer: Reliance on fine granularity of the analog probability amplitude and phase.
  8. Deal killer: No sense of the phenomenological basis for product states of entangled qubits, and how it could support a huge number of product states.
  9. Deal killer: Shor’s factoring algorithm will never be practical for large encryption key sizes, but people act as if it will be. This is a double-edged sword. It’s a deal killer for actually factoring very large encryption keys. Definitely a deal killer for hackers and government intelligence folks seeking to spy on people. But it’s great news for quantum computer vendors and organizations and individuals with tons of encrypted data files and network protocols. All encrypted data will be safe. No need to migrate to quantum-safe cryptography.
  10. Deal killer: Grover’s search algorithm will never be practical for data of any significant size, but people talk as if it will be.
  11. Deal killer: Expectation of quantum circuits with millions or billions of gates.
  12. Deal killer: Reliance on variational methods won’t achieve any significant quantum advantage. If the only viable quantum solution is variational, the application is not a viable candidate for a quantum solution.
  13. Deal killer: Even if execution of a quantum circuit is feasible, is the required shot count (circuit repetitions or shots) practical based on the probabilistic nature of quantum computation? Is it scalable?
  14. Deal killer: Expectation and reliance on quantum error correction (QEC) is likely misplaced. Focus on near-perfect qubits with four to five nines of qubit fidelity.
  15. Deal killer: No prospect for accessing or processing Big Data from a quantum algorithm. Quantum alternative is little data with a big solution space, exploiting quantum parallelism.
  16. Especially tough obstacle: Great difficulty of translating application problems into solution approaches and into successful quantum implementations of those solution approaches.
  17. Especially tough obstacle: We need to achieve The ENIAC Moment, Configurable Packaged Quantum Solutions, and The FORTRAN Moment. Until we do, everything is in question. Little meaningful progress has been made towards these levels of capability.
  18. Especially tough obstacle: Achieving four to five nines of qubit fidelity with near-perfect qubits. Many quantum algorithms and quantum applications should be feasible with near-perfect qubits with 3.5 to five nines of qubit fidelity, but achieving that qubit fidelity is no easy task. Achieving even a mere three nines of qubit fidelity has so far proven to be out of reach.
  19. Especially tough obstacle: Achieving five to six or more nines of qubit fidelity. Larger and more complex quantum algorithms may not be feasible with only near-perfect qubits with 3.5 to five nines of qubit fidelity. Five to six or more nines of qubit fidelity might be required.
  20. Especially tough obstacle: Getting to quantum algorithms with tens of thousands of gates operating on even just a few thousand qubits will be a very daunting challenge. Getting to even ten thousand gates operating on a few hundred qubits will still be a fairly daunting challenge.
  21. Especially tough obstacle: Discovery or invention of alternative qubit technologies. I’m not convinced that any of the existing qubit technologies will be capable of fulfilling even an interesting fraction of the grand promises made for quantum computing. Much more research is needed, to enable the discovery or invention of better qubit technologies, such as for much higher qubit fidelity, longer coherence time, finer granularity of probability amplitude and phase, and easier and higher fidelity for connectivity between non-adjacent qubits.
  22. Especially tough obstacle: Quantum Fourier transform and quantum phase estimation are way too heavy and cumbersome an approach to quantum computing, especially simply to read out the probability amplitude or phase of qubits.
  23. Especially tough obstacle: Reliance on variational methods won’t achieve any significant quantum advantage. The main challenge here is to shift to alternative approaches such as using quantum phase estimation.
  24. Especially tough obstacle: Expectation and reliance on quantum error correction (QEC) is likely misplaced. Lack of full quantum error correction, even if near-perfect qubits are available, will still likely fully preclude some applications or at least substantially curtail some applications. Some applications may indeed require more than six nines of qubit fidelity, which near-perfect qubits are unlikely to ever be able to provide.
  25. Especially tough obstacle: Reliance on NISQ devices is counterproductive. Really need to move on from NISQ, to post-NISQ quantum computing.
  26. Especially tough obstacle: Where are all of the 40-qubit quantum algorithms? Even though there isn’t any hardware yet, we do have the capability of simulating 40-qubit quantum circuits, but quantum algorithm designers still aren’t producing any 40-qubit algorithms. Something is seriously wrong here. We have an unexplained gap that needs to be resolved.
  27. Especially tough obstacle: Where are all of the 32-qubit quantum algorithms? Ditto as per 40-qubit quantum algorithms and should be even easier, but still no 32-qubit quantum algorithms to be found.
  28. Especially tough obstacle: Where are all of the 28-qubit quantum algorithms? Ditto as per 32 and 40-qubit quantum algorithms and should be even easier, but still no 28-qubit quantum algorithms to be found.
  29. Especially tough obstacle: Where are all of the 24-qubit quantum algorithms? Ditto as per 28, 32, and 40-qubit quantum algorithms and should be even easier, but still no 24-qubit quantum algorithms to be found.
  30. Especially tough obstacle: And why so few 20-qubit algorithms? This should be a no-brainer, but as with 24, 28, 32, and 40-qubit algorithms, there seems to be some sort of invisible barrier, so that there are very few 20-qubit quantum algorithms to be found. Why is it so especially tough to achieve 20, 24, 28, 32, and 40-qubit quantum algorithms, even when capable classical quantum simulators are readily available?
  31. Especially tough obstacle: Need a focus on automatically scalable quantum algorithms. People are presuming that quantum algorithms can be easily, even trivially, scaled, but that is not always or even usually the case. Virtually no attention is being given to this issue.
  32. Especially tough obstacle: Simulating more than 40 qubits for non-trivial quantum circuits. Like up to 45, 50, or even 55 qubits, and the capacity for an exponential increase in qubit quantum states, as well as greater performance to handle deeper and more complex quantum circuits, such as hundreds or even a thousand gates. Every qubit you add doubles the resource requirements, both time and memory — the 2^n exponential advantage of quantum computing. But we need to get as high as possible since classical quantum simulators are needed to facilitate debugging of quantum algorithms, and quantum computers with more than 40 fully functional and connected high fidelity qubits are not on the near-term horizon.
  33. Major impediment: Qubit fidelity. Including measurement fidelity. Still too low and not increasing at a palpable pace. But seems doable. Really need near-perfect qubits with four to five nines of qubit fidelity, or at least 3.5 nines or maybe three nines for some applications. At present, even three nines is currently beyond reach.
  34. Major impediment: Lack of full any-to-any qubit connectivity. Trapped-ion qubits have it, but superconducting transmon qubits don’t. No word or signs that silicon spin qubits will have it. But, it does seem technically feasible.
  35. Major impediment: Lack of commitment to some reasonable level of granularity for phase and probability amplitude. Not likely enough for the full grandiose promises, but at least enough for some palpable level of quantum advantage and delivery of at least some interesting level of business value.
  36. Major impediment: Lack of sufficient coherence time. To support non-trivial quantum algorithms.
  37. Major impediment: Lack of sufficient maximum circuit size. To support non-trivial quantum algorithms. Driven by coherence time, and also gate execution time.
  38. Major impediment: Inability to debug larger quantum algorithms. Classical quantum simulators can support debugging capabilities for smaller quantum algorithms, up to 32 or maybe 40 qubits, but not for quantum algorithms much larger than about 50 qubits. Ironically, it’s the larger circuits which are in the greatest need for debugging.
  39. Major impediment: Slow progress on quantum error correction (QEC). Many people are pinning all of their hopes on quantum error correction to compensate for qubit errors, but I am not. I assess that it is likely to never happen, but maybe this is more of an impediment than an absolute deal killer, provided that near-perfect qubits become available on a timely basis. But even if near-perfect qubits are the ultimate solution, all of the time, effort, resources, and talent consumed by ongoing work on quantum error correction will have impeded other areas where those resources could have been more productively applied.
  40. Major impediment: Lack of a practical quantum computer. No machines available that address all of these issues, impediments, obstacles, shortfalls, and problems.
  41. Major impediment: Lack of a coherent high level programming model. Compared to the Turing machine model and algebraic expressions and support for very large and very small and very granular real numbers and arithmetic. Current programming model is too low level, comparable to classical assembly or machine language, too difficult to use, and results in very low productivity, and even prevents significant progress on the quantum algorithm and quantum application front.
  42. Major impediment: How to exploit quantum-inspired classical computing. Not a simple mechanical or automatic — or guaranteed — process. Requires significant cleverness and deft analytical skill. First requires a great quantum algorithm to be designed. Then requires the analysis and cleverness to discern how the quantum logic can be translated into a classical algorithm that is extremely efficient. Identifying opportunities and techniques for classical efficiency is a significant challenge. Sometimes it may be as simple as applying a large number of classical servers to a Monte Carlo simulation using heuristics to boost efficiency compared to a simple brute force Monte Carlo simulation. Much more research is needed.
  43. Major impediment: Nobody seems very interested in, or willing to prioritize, many of the major impediments and obstacles that I see for achieving practical quantum computing.
  44. Major impediment: Difficult to achieve even a quantum computer with 48 fully-connected near-perfect qubits. This might be the lower end of what could be considered a practical quantum computer. Many quantum applications will need substantially more than this, but even this low end is well beyond our reach, now and for the near and maybe even medium term.
  45. Major impediment: No real attention being given to a universal quantum computer which fully merges and fully integrates quantum computing and classical computing. Current focus on hybrid operation of classical and quantum computers, including dynamic circuits, is a weak and poor substitute for a full universal quantum computer. That’s a longer-term effort, but without any ongoing and sustained effort, it will never happen.
  46. Plenty of moderate impediments. Generally won’t limit quantum computing in the long run, but delay getting to practical quantum computing.
  47. Plenty of minor impediments. Generally won’t limit quantum computing in the long run, but may delay getting to practical quantum computing, or at least reduce productivity.
  48. Plenty of annoyances. Generally won’t limit quantum computing in the long run, but interfere with learning, understanding, adopting and using quantum computing.
  49. Support software and development tools are needed, but are not a significant obstacle to progress at present. My only worry about this area at present is that there is too much attention and resources being given to it, with too little attention and resources being given to much more troublesome hardware and algorithm issues. The priorities are backwards.
  50. Plenty of additional work is needed, but the focus here is on the deal killer issues that are an absolute barrier to the success of quantum computing. Some of this additional work has been mentioned or even summarized here in this informal paper, but mostly to highlight important issues simply to point out that although they are important, they are not deal killers.
  51. Much research is needed, but it ultimately won’t eliminate the deal killers.
  52. Sure, a Quantum Winter is possible, but unlikely, though the risk is rising.
  53. Whether this paper might constitute my swan song for my quantum computing efforts, my quantum journey, remains to be seen, although I do already have one more informal paper in the works. I may simply take a summer break and then resume in the fall, or not. Stay tuned.
