Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization

Jack Krupansky
Nov 17, 2021 · 36 min read

Premature commercialization of quantum computing will be counterproductive and lead to unnecessary disenchantment with the long-term potential for quantum computing. Despite the dramatic progress of quantum computing in recent years, many significant obstacles remain before commercialization can even begin. This informal paper will discuss the obstacles and propose a path forward which can dramatically accelerate the advance of quantum computing.

Much more basic research, prototyping, and experimentation is needed in hardware, support software and tools, programming models, algorithmic building blocks, algorithms, and quantum applications before quantum computing will be ready to be exploited by organizations seeking to develop and deploy production-scale quantum applications for practical real-world problems. This pre-commercialization is required before commercialization can even begin.

Premature commercialization will be an extreme distraction, counterproductive, and outright harmful. Boosting research, both for hardware and algorithms, is essential. Algorithm and application research should focus on simulation configured to match the hardware expected at the time of commercialization, rather than attempting to distort and shoehorn advanced, complex algorithms into woefully inadequate near-term hardware.

This paper won’t delve too deeply into the many issues related to research and commercialization, as these were covered extensively in my preceding two papers, designed and explicitly written to provide the foundation for this paper:

Need for more extensive research: Essential and Urgent Research Areas for Quantum Computing

Commercialization, pre-commercialization, and premature commercialization: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization

The purpose of this paper is to pull together, briefly summarize, and highlight those two papers, and to emphasize a few key points:

  1. Commercialization of current technology will NOT lead to dramatic quantum advantage. The hardware is too primitive. Much research is needed.
  2. Little if any of the current technology will be relevant in 5–10 years. Better to focus algorithm research on expected hardware 2–7 years out and rely on simulation until the hardware is ready.
  3. Generally focus on simulation rather than running on actual quantum computing hardware since current hardware will rarely represent the ultimate target hardware to be available during commercialization. Or in subsequent stages of commercialization.
  4. Quantum algorithms should be designed to be automatically scalable to run on future hardware without change. Also to permit them to be simulated with fewer qubits than will be available on larger capacity hardware.
  5. It’s premature to even begin commercialization. The technology just isn’t ready. Not even close. Both hardware and algorithms, and applications.
  6. Much pre-commercialization work remains before commercialization can begin.
  7. Boost research, prototyping, and experimentation — pre-commercialization.
  8. Much research remains to fully characterize and resolve many technical obstacles. Many engineering challenges don’t have sufficient research results to guide them. Both hardware and algorithms, and applications.
  9. Hardware may seem to be the primary limiting factor, but algorithms are an even greater limiting factor. We can simulate 32- and 40-qubit algorithms, but such algorithms are nowhere to be found.
  10. The precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses. But we can make some good, tentative guesses now.
  11. Variational methods are an unproductive distraction and technical dead end — the focus should be on quantum Fourier transform (QFT) and quantum phase estimation (QPE). It will take years for the hardware to support this, but simulation can be used in the meantime.
  12. Quantum error correction (QEC) and logical qubits will come in a later stage of commercialization — near-perfect qubits should be good enough for many applications.
  13. Prototyping and experimentation for quantum algorithms and quantum applications should focus on simulation configured to match the hardware expected at the time of initial commercialization rather than focusing on current, very limited hardware.
  14. There should be no expectation of running or even testing algorithms or applications for 64 or more qubits during pre-commercialization. Not until the hardware can be confirmed to be approaching the target capabilities for the initial commercialization stage — not just raw qubit count, but quality of the qubits. Simulation-only during pre-commercialization. May be limited to 50, 48, 44, 40, or even 32 qubits based on the limits of the simulator and circuit depth.
  15. Even initial commercialization will be fairly limited and it could take ten or more subsequent commercialization stages before the full promise of quantum computing can be delivered.
  16. Any efforts at premature commercialization are doomed to be counterproductive and a distraction from research and simulation for prototyping and experimentation.
  17. Hardware and algorithm research and development should be allowed to be on their own, parallel but independent tracks. Very slow progress on hardware must not be permitted to slow algorithm progress.
  18. Double down on pre-commercialization? Double down is a gross understatement. It probably requires a 10X to 50X increase in research, prototyping, and experimentation. Both hardware and algorithms, and applications. Many more people, and much more time and money. Much more.
  19. Pre-commercialization will be the Wild West of quantum computing. Accept that or stay out until true commercialization begins or is imminent. Some people and organizations require excitement and rapid change while others require calm stability — individuals and organizations must decide clearly which they are.
  20. Pre-commercialization could take another 2 to 4 years — or longer.
  21. The initial commercialization stage could take another 2 to 3 years — or longer, beyond pre-commercialization.
  22. The initial commercialization stage, C1.0, might be ready in 4 to 7 years — or longer. That would be production-quality, with alpha, beta and pre-releases available earlier.
  23. Configurable packaged quantum solutions are the best bet for most organizations. Most organizations will not be in a position to design and implement or even understand their own quantum algorithms.

Topics covered in this paper:

  1. This paper merely summarizes and highlights the issues — see my two previous papers for details
  2. What do commercialization and pre-commercialization mean?
  3. The crux of the problem, the dilemma
  4. Premature commercialization is the problem now facing us
  5. No need for premature Quantum Ready
  6. Great for Fortune 500 companies to do their own research push
  7. Excessive hype is getting the best of us — we’re drinking too much of the Kool-Aid
  8. Current dramatic push for commercialization is a counterproductive distraction
  9. Commercialization of current technology will NOT lead to dramatic quantum advantage
  10. Little if any of the current technology will be relevant in 5–10 years
  11. Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation
  12. Big risk of hitting a Quantum Winter in two years
  13. Taming the hype may be impossible, so we need to push the reality to catch up
  14. Boost research, prototyping, and experimentation — pre-commercialization
  15. We need to push research much harder to try to catch up with the hype
  16. Distinguishing pre-commercialization from commercialization
  17. Avoid premature commercialization
  18. Critical technical gating factors for initial stage of commercialization
  19. Minimum viable product (MVP)
  20. Initial commercialization stage — C1.0
  21. Subsequent commercialization stages
  22. Quantum error correction (QEC) and logical qubits — later, not in C1.0
  23. Near-perfect qubits — good enough for most applications
  24. No, noisy NISQ quantum computers are not viable for commercialization
  25. Configurable packaged quantum solutions
  26. Critical hardware research issues
  27. Critical algorithm and application research areas
  28. Other critical research areas
  29. We need to decouple hardware development and algorithm and application research, prototyping, and experimentation
  30. Focus algorithm and application research, prototyping, and experimentation on simulation
  31. Sure, it can be intoxicating to run your algorithm on an actual quantum computer, but what does it prove and where does it get you?
  32. Hardware engineers should run their own functional tests, stress tests, and benchmark tests
  33. Use simulation to enable algorithm and application research, prototyping, and experimentation to proceed at their own pace independent of the hardware
  34. Functional enhancements and performance and capacity improvements are needed for simulation
  35. Where are all of the 40-qubit quantum algorithms?
  36. Scalability is essential for robust quantum algorithms
  37. Configure simulation to match expected commercial hardware
  38. Configure simulation to match expected improvements — or shortfalls — of the hardware
  39. Research will continue even as commercialization commences
  40. Risk of changes to support software and tools during pre-commercialization — beware of premature commercialization
  41. Risk of business development during pre-commercialization — beware of premature commercialization
  42. Pre-commercialization will be the Wild West of quantum computing — accept that or stay out until true commercialization
  43. When might the initial commercialization stage, C1.0, be available?
  44. IBM 127-qubit Eagle announcement is proof that we’re still in pre-commercialization — and at risk of premature commercialization
  45. My apologies — There’s so much more! See my two papers
  46. Summary and conclusions

This paper merely summarizes and highlights the issues — see my two previous papers for details

This paper won’t delve too deeply into the many issues related to research and commercialization, as these were covered extensively in my preceding two papers, designed and explicitly written to provide the foundation for this paper. The intent here is to get people to focus more on pre-commercialization and less on (premature) commercialization.

Need for more extensive research: Essential and Urgent Research Areas for Quantum Computing

Commercialization, pre-commercialization, and premature commercialization: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization

What do commercialization and pre-commercialization mean?

Taken from my previous paper on pre-commercialization and commercialization:

  1. Pre-commercialization means research as well as prototyping and experimentation. This will continue until the research advances to the stage where sufficient technology is available to produce a viable product that solves production-scale practical real-world problems. All significant technical issues have been resolved, so that commercialization can proceed with minimal technical risk.
  2. Commercialization means productization after research as well as prototyping and experimentation are complete. Productization means a shift in focus from research to a product engineering team — commercial product-oriented engineers and software developers rather than scientists.

For more detail on commercialization and pre-commercialization see my paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.

The crux of the problem, the dilemma

Quantum computing has been plodding along for several decades now, finally accelerating in the past few years, so it’s only natural that people would finally like to see the fruits of this labor put into action actually solving practical real-world problems. The desire is very real. It’s palpable. It’s irresistible.

But, despite the advances of recent years, the technology is far from ready for prime-time deployment. The hardware isn’t ready. Algorithms aren’t ready. Applications aren’t ready. Nothing’s ready. Except all of the hype!!

Much research is still required. Many technical questions and issues remain unresolved. Hardware, algorithms, and applications. And programming models, algorithmic building blocks, design patterns, frameworks, and programming languages. And support software and tools. You name it, much more research is required.

Much prototyping and experimentation is also needed, but it is generally still premature to do so until much more of the foundational research has occurred.

In short, much pre-commercialization is still needed for quantum computing. We’re not even close to being ready to begin commercialization.

Premature commercialization is the problem now facing us

Research in hardware, algorithms, and applications may be the clear technical problem in front of us, but the main problem right in front of us is the temptation of premature commercialization.

Everybody is being told to believe that they need to get Quantum Ready, even though the basic technology won't be ready to even begin commercialization for three to five years, or even seven years or longer.

Sure, researchers need to be laboring away right now, but not the individuals or organizations that will ultimately use the eventual results of that research.

In fact, the technology really isn’t ready for even advanced development groups at larger organizations. Much basic research remains before that can happen.

No need for premature Quantum Ready

Not all organizations or all individuals within a particular organization need to get Quantum Ready at the same time or pace, or even at all. It all depends on the needs, interests, criteria, and timing of the particular organization, department, project, team, role, or individual. It's not a one-size-fits-all proposition. The timing will vary, as will the pace.

Some may need to get Quantum Ready early in pre-commercialization, others later in pre-commercialization, others not until early commercialization, others not until later stages of commercialization, and others at any stage in between, and others not at all.

Of course nobody wants to get Quantum Ready too late, but there is no special merit in getting there too early, and there are real risks in doing so, such as the risk that work, knowledge, and skills from early in pre-commercialization may well be obsolete by the time commercialization rolls around. Investment is supposed to be leveraged, not discarded and redone.

Organizations, teams, and individuals should get Quantum Ready when they need it — not too late and not too soon. Most organizations and individuals will not need to rush. And now is not the time for most organizations and individuals to get Quantum Ready. In fact, for many or even most organizations and individuals it is still too soon to get Quantum Aware. Timing does matter.

For more on the various aspects of Quantum Ready, see my pre-commercialization paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.

Great for Fortune 500 companies to do their own research push

Large organizations with deep pockets for research and advanced development should of course be focusing on quantum computing, but…

  1. Focus on the long term. These research efforts should not be billed as intended to produce production solutions over the next few years.
  2. Demonstrations of the future, not the near-term. Research prototypes should be billed as demonstrations of possible production-scale technology in 5–7 years and production deployment in 10–15 years, but not near term in the next 2–4 years.
  3. Integration of quantum into mainline applications will take years. Integration of quantum technology into larger solutions could take 3–5 (or even 7) years alone, even once the quantum technology itself is ready.
  4. Some elite teams may develop ENIAC-class solutions in less time. Maybe to production deployment in 3–5 years, but most organizations will have to wait another 5–8 years for The FORTRAN Moment, or utilize configurable packaged quantum solutions acquired from outside vendors who do have the necessary elite teams.

Excessive hype is getting the best of us — we’re drinking too much of the Kool-Aid

The tremendous hype surrounding quantum computing is quite intoxicating. It’s one thing to be taken in by the promise of a brilliant future, but it’s an entirely different matter to treat that distant promise as if it were reality today or even the relatively near future.

The hype is well beyond the reality.

Current dramatic push for commercialization is a counterproductive distraction

The quantum computing field is plagued with excessive hype, not simply promises of great benefits to come, but even to the point of claims that the benefits are actually here, now, or at least in the very near future.

As some pundits (and, unfortunately, journalists) put it — and these are actual headlines:

  1. Quantum Computing May Be Closer Than You Think
  2. Quantum Computing Might Be Here Sooner Than You Think
  3. Quantum Computing May Be A Reality Sooner Than You Think
  4. Quantum Computing Will Change Everything, and Sooner Than You Expect
  5. Quantum Computing is coming to you faster than you think
  6. Quantum computing could be useful faster than anyone expected
  7. A Quantum Future will be here Sooner than You Think
  8. Quantum Computers Could Go Mainstream Sooner than We Think
  9. You’ll Be Using Quantum Computers Sooner Than You Think
  10. And more!

Vendors of hardware, software, algorithms, and applications are doing nothing to tamp down this rampant hype.

But the hardware, software, algorithms, and applications simply aren’t ready to be placed into production deployment. Not even close.

This dramatic push for rapid commercialization is a counterproductive distraction. It’s outright harmful. It will inevitably cause eventual disillusionment and an eventual pullback in investment.

The problem is not just the hype alone, since many people will sensibly ignore it, but unfortunately many people are actually acting as if they really did believe the hype.

Commercialization of current technology will NOT lead to dramatic quantum advantage

Current quantum computing technology is actually fairly impressive compared to just a few years ago, but is still well short of being suitable for solving production-scale practical real-world problems and achieving even a tiny fraction of dramatic quantum advantage. And this includes both hardware and algorithms.

So, any attempt to commercialize current quantum computing technology is doomed to be a guaranteed market flop in terms of solving production-scale practical real-world problems.

What’s needed, again, focused on pre-commercialization, is a lot more research, as well as prototyping and experimentation for quantum algorithms and quantum applications, but based on simulation rather than attempting to run on current, very-limited hardware.

Little if any of the current technology will be relevant in 5–10 years

Instead of expending inordinate energy on distorting and shoehorning stripped-down algorithms into current hardware, we should instead rely on simulation in the near term, and focus algorithm and application research on expected hardware 2–7 years out.

Expected hardware advances which will make life much easier for algorithm designers and application developers include:

  1. Much higher qubit fidelity.
  2. Greater qubit counts.
  3. Finer phase granularity.
  4. Quantum error correction (QEC) and logical qubits.
  5. Lower gate error rates.
  6. Lower measurement error rates.
  7. Greater qubit connectivity.
  8. Capable of supporting quantum Fourier transform (QFT) and quantum phase estimation (QPE).

Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation

Variational methods are quite popular right now, particularly for such applications as quantum computational chemistry, primarily because they work, in a fashion, on current NISQ hardware, but they are far from ideal and only work for smaller problems. In fact, they are a distraction and an absolute technical dead end — variational algorithms will never achieve dramatic quantum advantage, simply by the nature of how they work.

Variational methods are a short-term crutch, a stopgap measure, designed to compensate for the inability of current hardware to support the desired algorithmic approach of quantum Fourier transform (QFT) and quantum phase estimation (QPE). Limited granularity of phase, limited circuit depth, limited qubit fidelity, and excessive gate errors are some of the hardware limitations precluding QFT and QPE at present.

The preferred alternative at this stage should be to refrain from trying to implement algorithms on current hardware in favor of implementing them on classical quantum simulators. Granted, that limits implementations to 32 to 40 or maybe 50 qubits, but this puts more emphasis on designing automatically scalable algorithms, so that the algorithms can use the best and optimal technical approach and be ready for the day when the hardware really is ready for commercialization.

For more information on variational methods, quantum Fourier transform and quantum phase estimation, and automatically scalable algorithms, see my pre-commercialization paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.
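To make the idea of an automatically scalable algorithm developed purely on a classical simulator concrete, here is a minimal statevector sketch of the quantum Fourier transform in Python with NumPy. The circuit is parameterized by the qubit count n, so the same code runs unchanged at 2 qubits or at the 30-plus qubits a larger simulator permits. The helper names are mine, and this is a sketch under simple assumptions, not a framework recommendation.

```python
import numpy as np

def apply_h(state, q, n):
    """Apply a Hadamard gate to qubit q (qubit 0 = most significant bit)."""
    h = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, q, 0)
    psi = np.tensordot(h, psi, axes=([1], [0]))
    psi = np.moveaxis(psi, 0, q)
    return psi.reshape(-1)

def apply_cphase(state, theta, c, t, n):
    """Multiply by exp(i*theta) the amplitudes where qubits c and t are both 1."""
    idx = np.arange(2 ** n)
    both_one = ((idx >> (n - 1 - c)) & 1) & ((idx >> (n - 1 - t)) & 1)
    out = state.copy()
    out[both_one == 1] *= np.exp(1j * theta)
    return out

def qft(state, n):
    """Textbook QFT circuit, parameterized by the qubit count n."""
    for q in range(n):
        state = apply_h(state, q, n)
        for j in range(q + 1, n):
            # Controlled rotation R_(j-q+1): phase pi / 2^(j-q)
            state = apply_cphase(state, np.pi / 2 ** (j - q), j, q, n)
    # Final qubit-order reversal completes the QFT
    return state.reshape([2] * n).transpose(list(reversed(range(n)))).reshape(-1)
```

Because the QFT of a statevector coincides, up to normalization, with an inverse discrete Fourier transform, the construction can be cross-checked against np.fft.ifft. That kind of exact classical validation is exactly what simulation makes possible and noisy hardware does not.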

Big risk of hitting a Quantum Winter in two years

The biggest risk in quantum computing is an excess of investment flows into quantum computing with an expectation of payoff (presumed commercialization) within two or three years, followed by a Quantum Winter as vendors and organizations are unable to deliver on those bold promises by the time the patience of that investment money has been pushed to the limit.

Taming the hype may be impossible, so we need to push the reality to catch up

Sure, we can push back on the hype, to some degree, but that’s a Herculean task that would consume all of our energy. Instead, we can focus on how to advance the technology so that it catches up with at least a fair fraction of the hype.

We need to double down — even triple or quadruple down — on research, as well as prototyping and experimentation.

Boost research, prototyping, and experimentation — pre-commercialization

If there is only one message for the reader to take away from this paper it is to boost research, as well as to boost prototyping and experimentation.

But to be clear, the research has to come first.

Prototyping and experimentation with current quantum hardware is an unproductive distraction.

Quantum algorithms and quantum applications should instead be prototyped and experimented with using simulation — configured to match the typical quantum hardware configuration expected at the initial commercialization stage (C1.0).

To be clear, prototyping and experimentation with quantum algorithms and quantum applications can proceed in parallel with a lot of the hardware research, but it must be focused on simulation.

But research focused on the functional capabilities of qubits, programming models, and algorithmic building blocks must be completed before it makes any sense to begin prototyping algorithms and applications.

Granted, additional research on the functional capabilities of qubits, programming models, and algorithmic building blocks can proceed in parallel with prototyping and experimentation with quantum algorithms and quantum applications, but only to the extent that the capabilities of the initial commercialization stage (C1.0) are already clearly known, or at least a reasonable approximation.
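As a hypothetical illustration of what "simulation configured to match expected C1.0 hardware" could look like in practice, here is a minimal Python sketch: a target-hardware profile plus a check that a simulation harness could apply before accepting a circuit. The class and function names are mine, not any vendor's API, and the numbers are illustrative guesses echoing the gating-factor estimates discussed in this paper.

```python
from dataclasses import dataclass

@dataclass
class TargetHardwareProfile:
    """Hypothetical profile of the hardware expected at initial
    commercialization (C1.0). All values are illustrative, not vendor specs."""
    qubits: int
    qubit_fidelity: float   # e.g. 0.9999 = "four nines"
    max_circuit_depth: int
    full_connectivity: bool

# Illustrative C1.0 guess drawn from the gating-factor estimates in this paper.
C1_0 = TargetHardwareProfile(qubits=64, qubit_fidelity=0.9999,
                             max_circuit_depth=250, full_connectivity=True)

def fits_profile(n_qubits: int, depth: int, profile: TargetHardwareProfile) -> bool:
    """Check whether an algorithm's resource needs fit the target profile —
    the kind of gate a simulation harness could enforce before a run."""
    return n_qubits <= profile.qubits and depth <= profile.max_circuit_depth
```

Under this guess, a 40-qubit, depth-200 circuit fits the C1.0 target, while a depth-1,000 circuit does not and would be flagged for redesign rather than being shoehorned onto current hardware.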

For more on research, see my paper: Essential and Urgent Research Areas for Quantum Computing.

We need to push research much harder to try to catch up with the hype

We may have only another two or three years to begin delivering on at least some of the bold promises of quantum computing before patient capital loses patience.

We can probably get away without full delivery on all promises, but we at least need to deliver on a substantial fraction of the promises. Such as achieving The ENIAC Moment, when a quantum computer can finally run a production-scale practical real-world application, and achieve at least a substantial fraction of quantum advantage.

But that’s going to take a lot of research. A whole lot. Much more than we are currently pursuing.

Can we catch up with the hype? Maybe. Possibly. If we try hard enough. But, it’s not a slam dunk.

Distinguishing pre-commercialization from commercialization

Commercialization implies that you have the necessary science, technology, knowledge, and skills, but now you just have to do it. You have the science. You just need to do the engineering. It’s not quite that simple, but that’s the essence.

Pre-commercialization is the process of seeking and obtaining all of the science, technology, knowledge, and skills which can then be applied during commercialization to engineer a viable commercial product.

Premature commercialization is the attempt to engage in commercialization before the necessary science, technology, knowledge, and skills have been developed. In short, don’t do it!

For more detail on commercialization, pre-commercialization, and premature commercialization see my paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.

Avoid premature commercialization

Just to reemphasize the point, premature commercialization is the attempt to engage in commercialization before the necessary science, technology, knowledge and skills have been developed. In short, don’t do it!

Science, technology, knowledge, and skills for quantum computing are constantly changing as research progresses during pre-commercialization.

Any reliance on science, technology, knowledge, and skills from pre-commercialization is very risky — it may all change by the time that commercialization begins.

Critical technical gating factors for initial stage of commercialization

Here’s a brief summary of the more critical technical gating factors which must be addressed before quantum computing can be considered ready for commercialization, an expectation for the initial stage of commercialization, C1.0:

  1. Near-perfect qubits. At least four nines of qubit fidelity — 99.99%. Possibly five nines — 99.999%.
  2. Circuit depth. Generally limited by coherence time. No clear threshold at this stage but definitely going to be a critical gating factor. Whether it is 50, 100, 500, or 1,000 is unclear. Significantly more than it is now. Let’s call it 250 for the sake of argument.
  3. Qubit coherence time. Sufficient to support needed circuit depth.
  4. Near-full qubit connectivity. Either full any to any qubit connectivity or qubit fidelity high enough to permit SWAP networks to simulate near-full connectivity.
  5. 64 qubits. Roughly. No precise threshold. Maybe 48 qubits would be enough, or maybe 72 or 80 qubits might be more appropriate. Granted, I think people would prefer to see 128 to 256 qubits, but 64 to 80 might be sufficient for the initial commercialization stage.
  6. Alternative architectures may be required. Especially for more than 64 qubits. Or even for 64, 60, 56, 50, and 48 qubits in order to deal with limited qubit connectivity.
  7. Fine phase granularity to support quantum Fourier transform (QFT) and quantum phase estimation (QPE). 40 qubits = 2⁴⁰ gradations — one trillion gradations should be the preferred target for C1.0. At least 20 or 30 qubits = 2²⁰ to 2³⁰ gradations — one million to one billion gradations, at a minimum. Even 20 qubits may be a hard goal to achieve. 50 qubits needed for dramatic quantum advantage.
  8. Quantum Fourier transform (QFT) and quantum phase estimation (QPE). Needed for quantum computational chemistry and other applications. Needed to achieve quantum advantage through quantum parallelism. Relies on fine granularity of phase.
  9. Conceptualization and methods for calculating shot count (circuit repetitions) for quantum circuits. This will involve technical estimation based on quantum computer science coupled with engineering processes based on quantum software engineering. See my paper below.
  10. Moderate improvements to the programming model. Unlikely that a full higher-level programming model will be available soon (before The FORTRAN Moment), but some improvements should be possible.
  11. Moderate library of high-level algorithmic building blocks.
  12. The ENIAC Moment. A proof that something realistic is possible. The first production-scale practical real-world application.
  13. Substantial quantum advantage. Full, dramatic quantum advantage (one quadrillion X speedup) is not so likely, but an advantage of at least a million or a billion is a reasonable expectation — much less will be seen as not really worth the trouble. This will correspond to roughly 20 to 30 qubits in a single Hadamard transform — 2²⁰ = one million, 2³⁰ = one billion. An advantage of one trillion — 2⁴⁰ — may or may not be reachable by the initial stage of commercialization. Worst case, maybe minimal quantum advantage — 1,000X to 50,000X — might be acceptable for the initial stage of commercialization.
  14. 40-qubit quantum algorithms. Quantum algorithms utilizing 32 to 48 qubits should be common. Both the algorithms and hardware supporting those algorithms. 48 to 72-qubit algorithms may be possible, or not — they may require significantly greater qubit fidelity.
  15. Classical quantum simulators for 48-qubit algorithms. The more the better, but that may be the practical limit in the near term. We should push the researchers for 50 to 52 or even 54 qubits of full simulation.
  16. Overall the technology is ready for production deployment.
  17. No further significant research is needed to support the initial commercialization stage product, C1.0. Further research for subsequent commercialization stages, but not for the initial commercialization stage. The point is that research belongs in the pre-commercialization stage, not during commercialization.
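The arithmetic behind several of these gating factors is worth making concrete. A few lines of Python (helper names are mine) show how phase gradations and full-statevector simulation cost scale with qubit count, and why roughly 48 qubits is a practical simulator ceiling:

```python
def gradations(n_bits: int) -> int:
    """Phase gradations resolvable by an n-bit QFT/QPE register: 2**n."""
    return 2 ** n_bits

def statevector_bytes(n_qubits: int) -> int:
    """Memory for full statevector simulation: 2**n complex128 amplitudes,
    16 bytes each."""
    return 2 ** n_qubits * 16

assert gradations(20) == 1_048_576          # ~one million
assert gradations(30) == 1_073_741_824      # ~one billion

# 40 qubits: 16 TiB of amplitudes; 48 qubits: 4 PiB — near the practical limit.
print(f"40-qubit simulation needs ~{statevector_bytes(40) / 2**40:.0f} TiB")
```

The doubling of memory with every added qubit is why full simulation of 48-qubit algorithms is ambitious and 50 to 54 qubits would be a real stretch, as noted above.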

For more detail consult my pre-commercialization paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.

Minimum viable product (MVP)

One could accept all of the critical technical gating factors for the initial stage of commercialization (C1.0) as the requirements for a minimum viable product (MVP). That would be the preference. But, it may turn out that not all customers or users need all of those capabilities or features. Or, maybe everybody wants and needs all of those capabilities and features, but they simply aren’t technically or economically feasible in a reasonable timeframe. In such situations it may make sense or at least be tempting to define a minimum viable product (MVP) which is substantially less than the more capable desired initial product.

This paper won’t attempt to predict what sort of minimum viable product (MVP) will form the ultimate initial commercialization stage, C1.0, but it is worth considering.

Some obvious compromises:

  1. Qubit count. 128 or 256 qubits may be a clear preference, but maybe 72 or 64 or even 48 qubits might be the best that can be achieved — or that initial customers might need — in the desired timeframe.
  2. Qubit fidelity. Five nines or at least 4.5 nines of qubit fidelity might be the preference, but four or even 3.5 nines might be the best that can be achieved — or that initial customers might need — in the desired timeframe.
  3. Connectivity. Full connectivity might not be achievable. Maybe SWAP networks are feasible if qubit fidelity is high enough.
  4. Fineness of phase granularity. Critical for quantum Fourier transform and quantum phase estimation. Sufficient for at least 20 to 30 qubits = 2²⁰ to 2³⁰ gradations in a quantum Fourier transform, rather than the desired 50 to 64.
  5. Quantum Fourier transform and quantum phase estimation resolution. Preferably at least 20 to 30 qubits, but maybe only 16 bits of precision can be achieved — or even only 12, rather than 32 to 64 bits.
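To make the phase-granularity and precision compromises above concrete, here is a small plain-Python sketch (my own illustration, not from the referenced papers) of how the count of phase gradations and the smallest usable phase angle scale with QFT width:

```python
import math

def qft_phase_granularity(n_qubits: int):
    """Return (gradations, smallest phase step in radians) for an
    n-qubit quantum Fourier transform.

    An n-qubit QFT distinguishes 2**n phase gradations, so its finest
    controlled-phase rotation is 2*pi / 2**n radians.
    """
    gradations = 2 ** n_qubits
    return gradations, 2 * math.pi / gradations

# The compromise above: 20 bits of granularity vs. the desired 50.
print(qft_phase_granularity(20))  # about a million gradations
print(qft_phase_granularity(50))  # about 10**15 gradations
```

Each additional qubit halves the phase step the hardware must resolve, so hardware phase granularity, not qubit count alone, bounds usable QFT precision.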

Quantum error correction (QEC) would likely come in a later stage of commercialization in any case. Near-perfect qubits should be good enough for many applications.

In any case, the precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses — hopefully increasing in capabilities as more technical unknowns are resolved, favorably.

Initial commercialization stage — C1.0

The initial commercialization stage would be the very first product offering to result from the commercialization process. Call it C1.0. This initial product would meet all of the critical technical gating factors detailed in a preceding section of this paper, although it is possible that there might be a minimum viable product (MVP) which doesn’t meet all of those factors.

It is not the intention of C1.0 to fulfill all of the grand promises of quantum computing, but it’s the first real start.

With C1.0, quantum computing is no longer a mere laboratory curiosity. It’s actually ready to address at least some production-scale practical real-world problems.


Design of quantum algorithms and development of quantum applications should continue to rely on simulation as well as automatically scalable quantum algorithms, but with C1.0, quantum algorithms and quantum applications can finally run at full capacity, well beyond 50 qubits, at least to the extent of the capabilities of the hardware available at that time.

Also, with C1.0, customers, users, algorithm designers, and application developers can expect some sense of stability and compatibility as further stages of commercialization unfold.

For more on what to expect in the initial commercialization stage, see my paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.

Subsequent commercialization stages

Research continues even as the technical details of the initial commercialization stage, C1.0, stabilize. This ongoing research will form the basis for subsequent commercialization stages.

These subsequent stages can include hardware advances, support software and tool advances, algorithm advances, algorithmic building block advances, programming model advances, etc.

The first subsequent commercialization stage might be labeled C1.1, to be followed by C1.2, and so on. Eventually there will be a dramatic enough change to warrant a C2.0.

The intention is that with each subsequent commercialization stage customers, users, algorithm designers, and application developers can expect some sense of stability and compatibility as each stage of commercialization unfolds.

For more on what some of the expected subsequent commercialization stages might be, see my paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.

Quantum error correction (QEC) and logical qubits — later, not in C1.0

As valuable and essential as quantum error correction (QEC) and logical qubits will be for quantum computing, it seems unlikely that they will be ready for the initial commercialization stage, C1.0. In fact, it will likely take several additional stages of research before the technology advances far enough to be ready for commercialization. My best guess is that it will debut in C3.0, the third major commercialization stage.

But, that may not be a major issue, provided that research gets us to near-perfect qubits, which would be good enough for many if not most quantum algorithms and applications.

Near-perfect qubits — good enough for most applications

Although quantum error correction (QEC) and logical qubits are the ideal, most quantum algorithms and applications won’t actually need truly perfect logical qubits — a fairly tiny error may be acceptable. Near-perfect qubits may in fact be good enough in most situations.

Four to five nines of qubit fidelity — 99.99% to 99.999% — is likely enough for many if not most quantum algorithms and applications. More advanced or demanding algorithms may require more, but they would be the exception rather than the rule.
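A rough back-of-the-envelope model (my own sketch, not from the referenced paper) shows why the nines matter so much: if each gate succeeds independently with probability equal to its fidelity, the chance of an error-free circuit run decays exponentially with gate count.

```python
def circuit_success_probability(gate_fidelity: float, gate_count: int) -> float:
    """Probability that every gate in a circuit executes without error,
    assuming independent errors and a uniform per-gate fidelity."""
    return gate_fidelity ** gate_count

# Three nines (NISQ-class) vs. five nines (near-perfect) over 1,000 gates:
print(circuit_success_probability(0.999, 1000))    # ~0.37 -- mostly noise
print(circuit_success_probability(0.99999, 1000))  # ~0.99 -- mostly signal
```

Under this simple independence assumption, three nines leaves a 1,000-gate circuit failing most of the time, while five nines leaves it succeeding almost always.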

For more on nines of qubit fidelity, read my paper:

No, noisy NISQ quantum computers are not viable for commercialization

Absent support for quantum error correction (QEC), only near-perfect qubits can supply the qubit fidelity needed to support larger and more complex quantum algorithms and to achieve substantial quantum advantage, such as support for quantum Fourier transform (QFT) and quantum phase estimation (QPE), which are required for quantum computational chemistry.

By definition, a NISQ quantum computer has only noisy qubits with qubit fidelity of no more than two or three nines of reliability, 99% to 99.9%, rather than the four or five nines of reliability, 99.99% to 99.999%, required for supporting more complex quantum algorithms and quantum applications.

For more on nines of qubit fidelity, see my paper:

Configurable packaged quantum solutions

Design of production-scale quantum algorithms and development of production-scale quantum applications are likely to remain well beyond the technical ability of most organizations for quite some time. Rather, I expect that many organizations will be able to get started in quantum computing by acquiring and deploying configurable packaged quantum solutions, which provide them with all of the benefits of quantum computing without the need to go anywhere near a raw quantum algorithm.

All they need to do is prepare their input data and control parameters and the package will automatically generate and execute the needed quantum algorithms and output the final results.
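As a purely hypothetical sketch of what such a package's interface might look like (the class and method names here are my inventions, not any real product), the customer-facing surface could be as simple as data in, results out:

```python
class PackagedQuantumSolution:
    """Hypothetical facade for a configurable packaged quantum solution.
    The customer supplies input data and control parameters; all quantum
    algorithm generation and execution happens behind this interface."""

    def __init__(self, category: str, parameters: dict):
        self.category = category      # e.g. "quantum computational chemistry"
        self.parameters = parameters  # domain-level knobs, never raw gates

    def run(self, input_data: list) -> dict:
        # A real product would generate a quantum circuit, dispatch it to
        # a quantum computer or simulator, and post-process the results.
        # Both steps are stubbed out here purely for illustration.
        circuit = self._generate_algorithm(input_data)
        return self._execute(circuit)

    def _generate_algorithm(self, input_data: list):
        return ("circuit-for", self.category, len(input_data))

    def _execute(self, circuit) -> dict:
        return {"status": "ok", "circuit": circuit}

solution = PackagedQuantumSolution("quantum computational chemistry",
                                   {"precision_bits": 20})
print(solution.run(["H2O"]))
```

The design point is that the quantum circuitry lives entirely behind the facade, so the customer never touches a gate.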

I suspect that a subsequent commercialization release such as C2.0 might be the first to debut a configurable packaged quantum solution, with subsequent stages to debut a growing population of such quantum solutions.

Customers employing such an approach to quantum computing would not need to invest any resources in quantum computing during pre-commercialization.

Critical hardware research issues

A separate paper lays out much of the needed research for quantum computing hardware. This paper only briefly summarizes the critical areas:

  1. Need for a more ideal qubit technology. Current qubit technology demonstrates quantum computing capabilities, but is incapable of delivering on the full promises of quantum computing.
  2. Limited qubit capacity. Need a lot more qubits.
  3. Limited qubit fidelity. Too many errors.
  4. Limited qubit coherence. Limits circuit depth.
  5. Limited circuit depth. Basically limited by qubit coherence.
  6. Limited gate fidelity. Too many errors executing quantum logic gates.
  7. Limited granularity of phase. Need fine granularity for quantum Fourier transform (QFT) and quantum phase estimation (QPE).
  8. Limited measurement fidelity. Too many errors measuring a qubit to get results.
  9. Unable to support quantum Fourier transform (QFT) and quantum phase estimation (QPE). These rely on qubit fidelity, fine granularity of phase, gate fidelity, circuit depth, and measurement fidelity. Needed for application categories such as quantum computational chemistry.
  10. Unable to achieve substantial quantum advantage. A dramatic performance advantage over classical computing is the only point of even pursuing quantum computing. This will require a combination of sufficient hardware capabilities with quantum algorithms which exploit those hardware capabilities.

For much more detail on hardware research areas, see my paper: Essential and Urgent Research Areas for Quantum Computing.

Critical algorithm and application research areas

A separate paper lays out much of the needed research for quantum algorithms and quantum applications. This paper only briefly summarizes the critical areas:

  1. Need for a higher-level programming model. Current programming model is too low-level, too primitive, too much like classical machine and assembly language.
  2. Need for a robust collection of high-level algorithmic building blocks.
  3. Need for a high-level programming language. Tailored to the needs of quantum algorithms and quantum applications.
  4. Need for a robust collection of example algorithms. Which demonstrate production-scale quantum parallelism and show how practical real-world problems can be easily transformed into quantum algorithms and applications.
  5. Need for algorithm debugging capabilities. Difficult enough for relatively simple quantum algorithms, virtually impossible for complex quantum algorithms.
  6. Need for configurable packaged quantum solutions. Generalized applications for each major application category which allow the developer to present input data and input parameters in an easy way which can readily be automatically transformed into adaptations of the pre-written quantum algorithms and application framework. Still requires a lot of work, but not expertise in quantum circuits.
  7. Research in specific algorithms for each application category.

As mentioned elsewhere in this paper, algorithm and application research during pre-commercialization should focus on simulation with the simulator configured to match the expected target hardware capabilities for the initial commercialization stage (C1.0) or a subsequent stage. Attempting to perform quantum algorithm and quantum application research on actual near-term hardware would be counterproductive and a gross distraction.

For much more detail on quantum algorithm and quantum application research areas see my paper: Essential and Urgent Research Areas for Quantum Computing.

Other critical research areas

Besides hardware, algorithms, and applications, there are a number of other areas of critical and urgent research needed to fully exploit the promised potential of quantum computing. From my paper, here is the summary list of the areas:

  1. Physics.
  2. Hardware.
  3. Firmware (see: Hardware).
  4. Hardware support.
  5. Debugging.
  6. Classical quantum simulators.
  7. Quantum information science in general.
  8. Software. Support software, tools.
  9. Quantum software engineering. A new field.
  10. Quantum computer science. A new field.
  11. Cybersecurity.
  12. Quantum algorithm support.
  13. Quantum algorithms.
  14. Quantum application support.
  15. Quantum applications.
  16. Quantum application solutions. Particularly configurable packaged quantum solutions.
  17. Quantum general artificial intelligence.
  18. Quantum advantage and quantum supremacy.
  19. Other areas of QSTEM research.

For details, see my paper: Essential and Urgent Research Areas for Quantum Computing.

We need to decouple hardware development and algorithm and application research, prototyping, and experimentation

Trying to prototype and experiment with algorithms and applications on woefully-inadequate hardware is an exercise in futility. There’s another, better path: simulation.

Focus algorithm and application research, prototyping, and experimentation on simulation

Simulation can be slow and is limited to 40 to 50 or so qubits, but is much more reliable and ultimately more efficient and productive than attempting to prototype and experiment with hardware that simply isn’t up to the task.
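The qubit ceiling for simulation is fundamentally a memory ceiling: a full state-vector simulator must store 2^n complex amplitudes. A quick sketch of the arithmetic (my own illustration, assuming 16 bytes per double-precision complex amplitude):

```python
def statevector_memory_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory required for full state-vector simulation of n qubits,
    assuming one double-precision complex number (16 bytes) per amplitude."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (32, 40, 48):
    print(f"{n} qubits: {statevector_memory_bytes(n) / 2 ** 30:,.0f} GiB")
# 32 qubits: 64 GiB; 40 qubits: 16,384 GiB; 48 qubits: 4,194,304 GiB (~4 PiB)
```

Every additional qubit doubles the memory, which is why full state-vector simulation stalls in the high 40s of qubits even on the largest classical clusters.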

Plus, a simulator can provide support for analysis and debugging which is not physically feasible with a real quantum computer.

For more detail on simulation and needed research, see my paper: Essential and Urgent Research Areas for Quantum Computing.

Sure, it can be intoxicating to run your algorithm on an actual quantum computer, but what does it prove and where does it get you?

There are very few real algorithms that use more than about 23 qubits on a real quantum computer at present. That is likely because this is approximately the limit of current hardware, particularly with respect to qubit fidelity, coherence time, circuit depth, gate fidelity, and measurement errors.

Sure, it’s great to demonstrate a quantum algorithm on an actual quantum computer, but to what effect? Technically, we can run that same algorithm on a simulator.

Hardware engineers should run their own functional tests, stress tests, and benchmark tests

The quantum hardware engineers should run their own functional tests, stress tests, and benchmark tests to confirm that their hardware is performing as expected. No need to slow down algorithm and application research, prototyping, and experimentation just to test the hardware in a rather inefficient manner.

Use simulation to enable algorithm and application research, prototyping, and experimentation to proceed at their own pace independent of the hardware

There’s no good reason to slow down algorithm and application research, prototyping, and experimentation or to gate them by hardware research.

They can proceed just fine with simulation.

In fact, they can proceed better than just fine since:

  1. Hardware limits don’t interfere with progress.
  2. Simulation and analysis software can alert them to bugs and other issues in their algorithms and applications. Debugging on actual quantum hardware is very problematic.

Functional enhancements and performance and capacity improvements are needed for simulation

Simulation runs reasonably fine today. Yes, significant functional enhancements and performance and capacity improvements would be very beneficial, but simulators are generally more usable than actual hardware at the moment — and for the indefinite future.

For more detail on needed research for simulation, including enhancements and improvements, see my paper: Essential and Urgent Research Areas for Quantum Computing.

Where are all of the 40-qubit quantum algorithms?

Indeed, where are they? There doesn’t appear to be any technically valid reason that we don’t see a plethora of 40-qubit or even 32-qubit algorithms, other than the mere fact that it’s so intoxicating to run algorithms on actual quantum hardware. An increased focus on simulation should improve the situation.

See my paper on the merits of focusing on scalable 40-qubit algorithms:

Scalability is essential for robust quantum algorithms

We want to be able to develop quantum algorithms and applications today which will run on future hardware as it becomes available. We definitely don't want to have to redesign and reimplement quantum algorithms and quantum applications every time there is even a modest advance in hardware capabilities. Scalability, and in fact automatic scalability, is the key.

Scalability is also essential as we get to larger and more complex algorithms, so that a scaled-down version of the algorithm can be fully and accurately simulated and validated on a 32 to 40-qubit simulator. The presumption is that automatic analysis can confirm that the logic of the algorithm is fully, reliably, accurately, and automatically scalable to a larger number of qubits without introducing errors.

Current quantum algorithms generally aren't scalable. Why? Simply because it isn't currently considered a priority. That needs to change: scalability, and especially automatic scalability, needs to be a top research priority.
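As a toy illustration of automatic scalability (my own sketch, not any real framework): if the algorithm is generated programmatically from a qubit count, the identical code yields a small instance for full simulation today and a large instance for future hardware. Here is a plain-Python generator for the textbook QFT gate sequence:

```python
import math

def qft_gates(n_qubits: int) -> list:
    """Generate the textbook QFT gate sequence for n qubits as
    (gate, qubits, angle) tuples. The same code scales to any width:
    validate it at 8 qubits under simulation, run it at 64 later."""
    gates = []
    for target in range(n_qubits):
        gates.append(("h", (target,), None))
        # Controlled-phase rotations of angle 2*pi / 2**k, k = 2, 3, ...
        for k, control in enumerate(range(target + 1, n_qubits), start=2):
            gates.append(("cphase", (control, target), 2 * math.pi / 2 ** k))
    # Reverse the qubit order at the end, as the standard QFT requires.
    for i in range(n_qubits // 2):
        gates.append(("swap", (i, n_qubits - 1 - i), None))
    return gates

print(len(qft_gates(4)))   # 12 gates
print(len(qft_gates(40)))  # 840 gates -- same code, O(n^2) growth
```

The scaled-down instance can be simulated and validated exhaustively, while automatic analysis of the generator itself, rather than of any one instance, is what would certify correctness at larger widths.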

See my paper on the staged model for scalable quantum algorithms:

Configure simulation to match expected commercial hardware

Simulation can be configured to match any hardware configuration. The default should be the target hardware configuration for the initial commercialization stage of quantum computing, C1.0, with the only significant difference being fewer qubits since simulation is limited to 40 to 50 or so qubits.

Configuration factors include:

  1. Qubit count.
  2. Coherence time.
  3. Circuit depth.
  4. Connectivity.
  5. Phase granularity.
  6. Qubit fidelity.
  7. Gate fidelity.
  8. Measurement fidelity.
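Those factors map naturally onto a configuration record. A hypothetical sketch in plain Python (the field names and values are illustrative targets of my own choosing, not real specifications of any machine or simulator):

```python
from dataclasses import dataclass

@dataclass
class SimulatorConfig:
    """Hypothetical simulator configuration mirroring the factors above,
    set to an expected target hardware profile rather than any current
    machine."""
    qubit_count: int
    coherence_time_us: float      # microseconds
    max_circuit_depth: int
    connectivity: str             # e.g. "full" or "nearest-neighbor"
    phase_granularity_bits: int   # gradations = 2 ** bits
    qubit_fidelity: float
    gate_fidelity: float
    measurement_fidelity: float

# Illustrative C1.0-style target, capped at a simulable qubit count:
c1_target = SimulatorConfig(
    qubit_count=40,               # simulation ceiling, not the hardware goal
    coherence_time_us=1000.0,
    max_circuit_depth=1000,
    connectivity="full",
    phase_granularity_bits=20,
    qubit_fidelity=0.99999,       # five nines
    gate_fidelity=0.9999,         # four nines
    measurement_fidelity=0.9999,
)
print(c1_target)
```

Keeping the target profile in one explicit record makes it trivial to re-run the same algorithms under a shortfall scenario by editing a single field.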

Configure simulation to match expected improvements — or shortfalls — of the hardware

Simulation is flexible and can be configured to match any hardware configuration (limited only by the maximum capacity of simulation, roughly 40 to 50 qubits).

This makes it easy to test how algorithms and applications might behave on hardware that isn’t quite ready for commercialization, or for hardware improvements for subsequent commercialization stages beyond initial commercialization. Algorithms and applications can be tested even before the hardware becomes available.

Research will continue even as commercialization commences

Research for any vibrant field is never-ending. There’s always something new to discover, and some new problem to be overcome.

Granted, a true mountain of research must be completed during pre-commercialization before commercialization can begin, but that will not be the end of research in quantum computing.

The initial commercialization stage will be only the beginning of a long sequence of improvements and enhancements, many of which will only be possible as the result of additional research.

Generally, it could easily take five years from the onset of research in some area to produce results which will find their way into commercialization. Some research may produce results which can be commercialized in less than five years, but some may take seven to ten or more years to produce results suitable for commercialization.

This means that research for subsequent stages of commercialization after the initial stage of commercialization will need to commence even before commercialization commences — during pre-commercialization.

Risk of changes to support software and tools during pre-commercialization — beware of premature commercialization

Literally every aspect of quantum computing technology during pre-commercialization is subject to change. Hardware, software, tools, algorithms, applications, programming models, knowledge, methods — you name it, all of it has a high probability of changing by the time true commercialization begins. This especially includes support software and tools.

There is a significant risk of premature commercialization of support software and tools during pre-commercialization. Vendors, customers, and users alike will be very tempted to latch onto appealing technology and make an investment and commitment to stick to their choices, but a year or two or even less of change, evolution, and innovation could trivially make those choices obsolete.

I’m not saying that you can or should avoid such choices, but simply that you have to be aware that such commitments and work derived from those commitments will likely have to be reworked or started again from scratch when true commercialization does begin.

Vendors will want to sell support software, tools, algorithms, and applications, and customers will want to buy them, but just be aware that those expenditures and investments of time may have to be made again when commercialization does begin, maybe even multiple times, and especially as vendors come and go or as customers switch vendors as capabilities and features evolve.

Risk of business development during pre-commercialization — beware of premature commercialization

The same comments from the preceding section apply to business development. All factors involved in business decisions, including but not limited to the technology, are likely to change, evolve, and even radically change as pre-commercialization progresses.

Business deals arranged during pre-commercialization are unlikely to survive into commercialization. They should be considered short-term. Revision, rework, and restarts should be the expected norm during pre-commercialization — for both technology and business.

Pre-commercialization will be the Wild West of quantum computing — accept that or stay out until true commercialization

There will be plenty of temptations to blindly leap into quantum computing during pre-commercialization, but it's better to look before you leap, or in fact not to leap at all, waiting for true commercialization to actually begin or at least be imminent.

Just be aware of what you are leaping into — constant and incompatible change since quantum computing will remain a mere laboratory curiosity for the indefinite future. That’s fine for The Lunatic Fringe and more technologically sophisticated organizations, but not for most organizations.

When might the initial commercialization stage, C1.0, be available?

There’s no clarity or certainty as to the timeframe for commercialization of quantum computing, but it might be illuminating to speculate about minimal, nominal, and maximal paths to both pre-commercialization and initial commercialization, C1.0. These numbers are somewhat arbitrary, but hopefully helpful for bounding expectations.

So, here they are, as elapsed times:

  • Pre-commercialization. Minimal: 2 years. Nominal: 4 years. Maximal: 10 years.
  • Commercialization. Minimal: 2 years. Nominal: 3 years. Maximal: 5 years.
  • Total. Minimal: 4 years. Nominal: 7 years. Maximal: 15 years.

Four to seven years seems to be the best, if optimistic, bet for the timeframe for C1.0.

Commercialization here refers to the readiness of the initial commercialization stage, C1.0. This would be the first production-quality product. Alpha, beta, and pre-releases would be available earlier.

And to be clear, all of these numbers are highly speculative and subject to change.

IBM 127-qubit Eagle announcement is proof that we’re still in pre-commercialization — and at risk of premature commercialization

Late-breaking news: just as I was finishing up this paper, I saw that IBM has finally announced the availability of their 127-qubit Eagle quantum computer system. That’s a major accomplishment and a big step forward, finally breaking the 100-qubit barrier. But it’s not yet a commercial product, and despite the progress it represents, it is still woefully short of what is needed for true commercialization of quantum computing. Hence, it is solid evidence that we’re still deep in the realm of pre-commercialization.

Technical details are still sparse, but it doesn’t appear to have made any major breakthrough in qubit fidelity, qubit connectivity, or phase granularity, so other than the big jump in raw qubit count, it’s not terribly noteworthy.

Unfortunately, it has been presented in a slick and flashy manner, with all of the trappings of a commercial product, even though it is still fundamentally inadequate to support production-scale practical real-world applications. It is still very much a mere laboratory curiosity. I’m sure that The Lunatic Fringe desperately wants to get its hands on it, but that only emphasizes the conclusion that it is part of pre-commercialization, not even close to being a commercial product.

IBM explicitly admitted that they are not close to the key technical milestone of quantum advantage, without which there would be no good reason to proceed with commercialization of quantum computing:

The term quest is very appropriate, and once again emblematic of a research project, not the mundane tasks of an engineering project.

To be clear, IBM is making great progress, but the point is that they still have a long way to go. Hence, they’re still deep in pre-commercialization, even though they present the technology as if it were imminent and virtually ready to go, when it is not even close to being ready for development and deployment of production-scale practical real-world applications.

In short, it is unfortunately a prime example of premature commercialization.

I would have been happier if they had shown some raw lab photos rather than pretentious flashy graphics.

I don’t want to be too negative, but I do think we need to focus more attention on this being part of the pre-commercialization of quantum computing, not commercialization itself.

The IBM press release:

The IBM blog post:

Initial press coverage by Reuters:

My apologies — There’s so much more! See my two papers

I’ve tried to keep this paper as short as possible, so it’s limited to summaries and some highlights. The full details are in my two immediately preceding papers, which were designed and explicitly written to provide the foundation for this paper:

Need for more extensive research:

Commercialization, pre-commercialization, and premature commercialization:

Summary and conclusions

  1. Despite many advances, it still seems premature to attempt to commercialize quantum computing at this time — or any time soon.
  2. There has been great progress for quantum computing over the past few years.
  3. But significant deficits, shortfalls, limitations, and obstacles remain.
  4. Commercialization is not underway, not imminent, and not near.
  5. Too much talk as if commercialization were in fact underway, imminent, or near.
  6. We’re in the pre-commercialization stage for the indefinite future.
  7. Much more research is needed before the process of commercialization can even begin.
  8. Generally focus on simulation rather than running on actual quantum computing hardware, since current hardware will rarely represent the ultimate target hardware available during commercialization or in subsequent stages of commercialization.
  9. Quantum algorithms should be designed to be automatically scalable to run on future hardware without change. Also to permit them to be simulated with fewer qubits than will be available on larger capacity hardware.
  10. Much prototyping and experimentation are needed for algorithms and applications, but focus should be on simulation since current hardware is too limited and too much of an unproductive distraction.
  11. The precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses.
  12. Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation.
  13. Quantum error correction (QEC) will come in a later stage of commercialization — near-perfect qubits should be good enough for many applications.
  14. Simulators should be configured for what is expected once commercialization has occurred, or for some future hardware at a subsequent stage of commercialization.
  15. We need a bold push for 40-qubit algorithms — as a start, with more complex algorithms after that. Under simulation, for now.
  16. We need a bold push for automatically and provably scalable algorithms.
  17. We need a bold push for greater quantum parallelism and quantum advantage.
  18. We need a bold push for substantial quantum advantage, even if full, dramatic quantum advantage remains further in the future.
  19. Only much later in pre-commercialization — if not until initial commercialization, when the hardware has matured and stabilized — should prototyping and experimentation be attempted on actual hardware. Even then, more as a final test rather than as a primary design and development mode.
  20. Although hardware is a very limiting factor, algorithms and applications are a much greater limiting factor.
  21. Attempts at commercialization at this stage are unwarranted.
  22. Commercialization must wait until all of the technical uncertainties of hardware, algorithms, and applications are clearly identified and resolved — which is the purpose of pre-commercialization.
  23. Risk of changes to support software and tools during pre-commercialization. Accept that everything can and will change, more than once. Revision, rework, and restarts should be the expected norm during pre-commercialization.
  24. Risk of business development during pre-commercialization. All factors related to business deals, technology and business alike, should be expected to change, more than once. Revision, rework, and restarts should be the expected norm during pre-commercialization.
  25. Premature commercialization does more harm than good. An unproductive distraction. Algorithm research, prototyping, and experimentation should be focused on simulating the hardware expected for eventual commercialization.
  26. Double down on pre-commercialization is a gross understatement. It probably requires a 10X to 50X increase in research, prototyping, and experimentation, across hardware, algorithms, and applications alike. Many more people, much more time, and much more money. Much more.
  27. Pre-commercialization will be the Wild West of quantum computing. Accept that or stay out until true commercialization begins or is imminent.
  28. Pre-commercialization could take another 2 to 4 years — or longer.
  29. The initial commercialization stage could take another 2 to 3 years — or longer — beyond pre-commercialization.
  30. The initial commercialization stage, C1.0, might be ready in 4 to 7 years — or longer. That would be production-quality, with alpha, beta and pre-releases available earlier.
  31. Configurable packaged quantum solutions are the best bet for most organizations. Most organizations will not be in a position to design and implement or even understand their own quantum algorithms.
