Risks of Premature Commercialization of Quantum Computing

  • Double down on pre-commercialization — more basic research, more prototyping, and more experimentation.
  • Don’t even think about commercialization until we have answers to all of the important questions needed to succeed at commercialization.
  • Premature commercialization is a really, really, REALLY bad idea.
  1. In a nutshell
  2. What is premature commercialization?
  3. Overview
  4. The overall model
  5. Brief summary of pre-commercialization
  6. Research, more research, and even more research
  7. Premature commercialization risk
  8. Summary of risks of premature commercialization
  9. Overall the technology is NOT ready for production deployment
  10. No production deployment of quantum computing during pre-commercialization
  11. No great detail on commercialization proper here since focus here is on pre-commercialization
  12. For more on commercialization itself
  13. The crux of the problem, the dilemma
  14. Premature commercialization is the problem now facing us
  15. No need for premature Quantum Ready
  16. Great for Fortune 500 companies to do their own research push
  17. Excessive hype is getting the best of us — we’re drinking too much of the Kool-Aid
  18. Current dramatic push for commercialization is a counterproductive distraction
  19. Commercialization of current technology will NOT lead to dramatic quantum advantage
  20. Premature for any significant quantum advantage on any consistent basis across application categories
  21. Little if any of the current technology will be relevant in 5–10 years
  22. Wait a few years for the software technology to mature and evolve before getting started
  23. Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation
  24. Risk of backlash
  25. Big risk of hitting a Quantum Winter in two to three years
  26. Taming the hype may be impossible, so we need to push the reality to catch up
  27. Boost research, prototyping, and experimentation — pre-commercialization
  28. We need to push research much harder to try to catch up with the hype
  29. Distinguishing pre-commercialization from commercialization
  30. Avoid premature commercialization
  31. Critical technical gating factors for initial stage of commercialization
  32. Minimum viable product (MVP)
  33. No, noisy NISQ quantum computers are not viable for commercialization
  34. 48 fully-connected near-perfect qubits as the sweet spot goal for near-term quantum computing
  35. Critical hardware research issues
  36. Critical algorithm and application research areas
  37. Other critical research areas
  38. We need to decouple hardware development and algorithm and application research, prototyping, and experimentation
  39. Focus algorithm and application research, prototyping, and experimentation on simulation
  40. Sure, it can be intoxicating to run your algorithm on an actual quantum computer, but what does it prove and where does it get you?
  41. Hardware engineers should run their own functional tests, stress tests, and benchmark tests
  42. Use simulation to enable algorithm and application research, prototyping, and experimentation to proceed at their own pace independent of the hardware
  43. Functional enhancements and performance and capacity improvements are needed for simulation
  44. Where are all of the 40-qubit quantum algorithms?
  45. Scalability is essential for robust quantum algorithms
  46. Don’t imagine that scalability of quantum algorithms and applications is free, cheap, easy, obvious, or automatic — much hard work is needed during pre-commercialization
  47. Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications
  48. Configure simulation to match expected commercial hardware
  49. Configure simulation to match expected improvements — or shortfalls — of the hardware
  50. Research will continue even as commercialization commences
  51. Exception: Commercial viability of capabilities which support pre-commercialization
  52. Early commercial opportunities for selling tools and services to enable and facilitate research, prototyping, and experimentation
  53. Exception: Random number-based applications are actually commercially viable today
  54. Even for exceptions for commercialization during pre-commercialization, be especially wary
  55. Keep cost and service level agreements in mind even for the rare exceptions during pre-commercialization
  56. Beware of any capabilities available during pre-commercialization which might seem as if they are likely to apply to commercialization as well
  57. Products which enable quantum computing vs. products which are enabled by quantum computing
  58. Potential for commercial viability of quantum-enabling products during pre-commercialization
  59. Preliminary quantum-enabled products during pre-commercialization
  60. Risk of changes to support software and tools during pre-commercialization — beware of premature commercialization
  61. Risk of business development during pre-commercialization — beware of premature commercialization
  62. Quantum computing is still in the realm of the lunatic fringe
  63. Quantum Ready — It’s never too early for The Lunatic Fringe
  64. Quantum Aware is fine, but be careful about Quantum Ready
  65. Expect radical change — continually update vision of what quantum computing will look like
  66. Quantum computing is still a mere laboratory curiosity, not ready for production deployment
  67. Quantum computing is still more suited for elite technical teams than average, normal technical teams
  68. Pre-commercialization will be the Wild West of quantum computing — accept that or stay out until true commercialization
  69. Pre-commercialization is about constant change while commercialization is about stability and carefully controlled and compatible evolution
  70. Customers and users prefer carefully designed products, not cobbled prototypes
  71. Customers and users will seek the stability of methodical commercialization, not the chaos of pre-commercialization
  72. Quantum ecosystem
  73. Early, preliminary development of quantum ecosystem during pre-commercialization
  74. When might the initial commercialization stage, C1.0, be available?
  75. IBM 127-qubit Eagle announcement is proof that we’re still in pre-commercialization — and at risk of premature commercialization
  76. Must assure that there are no great unanswered questions hanging over the heads of the commercialization teams
  77. My apologies — There’s so much more! See my three papers
  78. Grand finale — So what do we do now??
  79. My original proposal for this topic
  80. Summary and conclusions

In a nutshell

  1. General risks of premature commercialization of quantum computing…
  2. The technology just isn’t ready. Too much is missing. Too much is too primitive. Too much research is incomplete. Too much needs further research. Too much is incapable of delivering on the many wild promises that have been made.
  3. Risk of disenchantment and loss of project funding and commitment.
  4. Failure to complete projects.
  5. Failure of completed projects to meet expectations.
  6. Critical project failures now could make it harder to fund credible projects in the future.
  7. Risk of backlash. Disenchantment could lead to pushback on quantum computing. Denial of the potential for quantum computing.
  8. Surest path to an early quantum winter. By hitting a critical mass of disenchantment due to unmet expectations.
  9. Constant rework needed as the technology constantly and radically evolves. That’s the nature of the pre-commercialization stage. It’s a good thing at this stage, but not good for commercialization.
  10. The technology is changing and evolving rapidly, so likely to be obsolete relatively soon, so it’s bad to bet on it in its current state.
  11. Insufficient research. Trying to skip too much of the needed research.
  12. Insufficient prototyping. Trying to skip too much of the needed prototyping.
  13. Insufficient experimentation. Trying to skip too much of the needed experimentation.
  14. Premature for any significant quantum advantage on any consistent basis across application categories. Very little, if any, quantum advantage available in the near term.
  15. Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications. Capabilities, tools, and methods are too primitive. Benchmarking not well developed.
  16. General comments…
  17. Commercialization of current technology will NOT lead to dramatic quantum advantage. The hardware is too primitive. Much research is needed.
  18. Little if any of the current technology will be relevant in 5–10 years. Better to focus algorithm research on expected hardware 2–7 years out and rely on simulation until the hardware is ready.
  19. Generally focus on simulation rather than running on actual quantum computing hardware since current hardware will rarely represent the ultimate target hardware to be available during commercialization. Or in subsequent stages of commercialization.
  20. Quantum algorithms should be designed to be automatically scalable to run on future hardware without change. Also to permit them to be simulated with fewer qubits than will be available on larger capacity hardware.
  21. Don’t imagine that scalability of quantum algorithms and applications is free, cheap, easy, obvious, or automatic. Much hard work is needed. And it needs to be done during pre-commercialization. Attempting scalability during commercialization is a really bad idea. All of the issues need to be identified and worked out before commercialization even begins.
  22. It’s premature to even begin commercialization. The technology just isn’t ready. Not even close. This is true of hardware, algorithms, and applications alike.
  23. Much pre-commercialization work remains before commercialization can begin.
  24. Boost research, prototyping, and experimentation — pre-commercialization.
  25. Much research remains to fully characterize and resolve many technical obstacles. Many engineering challenges don’t have sufficient research results to guide them. This is true of hardware, algorithms, and applications alike.
  26. Hardware may seem to be the primary limiting factor, but algorithms are an even greater limiting factor. We can simulate 32 and 40-qubit algorithms, but they’re nowhere to be found.
  27. The precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses. But we can make some good, tentative guesses now.
  28. Variational methods are an unproductive distraction and technical dead end — the focus should be on quantum Fourier transform (QFT) and quantum phase estimation (QPE). It will take years for the hardware to support this, but simulation can be used in the meantime.
  29. Quantum error correction (QEC) and logical qubits will come in a later stage of commercialization — near-perfect qubits should be good enough for many applications.
  30. Prototyping and experimentation for quantum algorithms and quantum applications should focus on simulation configured to match the hardware expected at the time of initial commercialization rather than focusing on current, very limited hardware.
  31. There should be no expectation of running or even testing algorithms or applications for 64 or more qubits during pre-commercialization. Not until the hardware can be confirmed to be approaching the target capabilities for the initial commercialization stage — not just raw qubit count, but quality of the qubits. Simulation-only during pre-commercialization. May be limited to 50, 48, 44, 40, or even 32 qubits based on the limits of the simulator and circuit depth.
  32. Even initial commercialization will be fairly limited and it could take ten or more subsequent commercialization stages before the full promise of quantum computing can be delivered.
  33. Any efforts at premature commercialization are doomed to be counterproductive and a distraction from research and simulation for prototyping and experimentation.
  34. Hardware and algorithm research and development should be allowed to be on their own, parallel but independent tracks. Very slow progress on hardware must not be permitted to slow algorithm progress.
  35. Double down on pre-commercialization? Double down is a gross understatement. It probably requires a 10X to 50X increase in research, prototyping, and experimentation, across hardware, algorithms, and applications. Many more people, much more time, and much more money. Much more.
  36. Pre-commercialization will be the Wild West of quantum computing. Accept that or stay out until true commercialization begins or is imminent. Some people and organizations require excitement and rapid change while others require calm stability — individuals and organizations must decide clearly which they are.
  37. Pre-commercialization could take another 2 to 4 years — or longer.
  38. The initial commercialization stage could take another 2 to 3 years — or longer, beyond pre-commercialization.
  39. The initial commercialization stage, C1.0, might be ready in 4 to 7 years — or longer. That would be production-quality, with alpha, beta and pre-releases available earlier.
  40. Configurable packaged quantum solutions are the best bet for most organizations. Most organizations will not be in a position to design and implement or even understand their own quantum algorithms.
  41. Quantum-enabled products. Products which are enabled by quantum computing. Such as quantum algorithms, quantum applications, and quantum computers themselves.
  42. Quantum-enabling products. Products which enable quantum computing. Such as software tools, compilers, classical quantum simulators, and support software. They run on classical computers and can be run even if quantum computing hardware is not available. Also includes classical hardware components and systems, as well as laboratory equipment.
  43. There are indeed exceptions: products or services which can actually thrive during pre-commercialization. Namely equipment, software, tools, and services which enable pre-commercialization, focused on research, prototyping, and experimentation. Anything but production deployment. Generally, quantum-enabling products.
  44. Even for exceptions for commercialization during pre-commercialization, be especially wary. Plenty of potential gotchas.
  45. Keep cost and service level agreements in mind even for the rare exceptions during pre-commercialization.
  46. The overall message is twofold…
  47. Double down on pre-commercialization — more basic research, more prototyping, and more experimentation.
  48. Don’t even think about commercialization until we have answers to all of the important questions needed to succeed at commercialization.
  49. Assure that there are no great unanswered questions hanging over the heads of professional product engineering teams that could interfere with their ability to develop commercial products by slowing their progress or putting their success at risk. Any needed research, prototyping, or experimentation must be complete and out of the way before commercialization can begin. No great questions can remain unanswered once commercialization commences.
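
The simulation limits cited above (roughly 32 to 50 qubits, per point 31) follow directly from memory arithmetic. The sketch below is a back-of-envelope estimate, assuming a full statevector simulator that stores one complex128 amplitude (16 bytes) per basis state; the function names are illustrative, and other simulator designs (tensor networks, for example) have different cost profiles.

```python
# Back-of-envelope memory requirement for full statevector simulation of
# n qubits, assuming one complex128 amplitude (16 bytes) per basis state.
# This is a common simulator design, but not the only one.

def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold all 2**n complex amplitudes."""
    return bytes_per_amplitude * (2 ** n_qubits)

def human(nbytes: float) -> str:
    """Render a byte count in binary units for readability."""
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if nbytes < 1024:
            return f"{nbytes:g} {unit}"
        nbytes /= 1024
    return f"{nbytes:g} EiB"

for n in (32, 40, 44, 48, 50):
    print(f"{n} qubits: {human(statevector_bytes(n))}")
```

At 32 qubits the statevector needs about 64 GiB, which fits on a large server; at 48 qubits it needs about 4 PiB, far beyond any single machine. That is why simulation caps out well short of the 64-qubit initial commercialization target.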

What is premature commercialization?

Premature commercialization is an attempt to commercialize a technology before the basic research has been completed and before there has been sufficient prototyping and experimentation to confirm that the technology really is ready for commercialization. Ready means having enough answers to enough questions that detailed specifications can be written for what commercial products should look like, so that professional product engineering teams can proceed to develop commercial products without great unanswered questions hanging over their heads, slowing their progress or risking their success.

Overview

Premature commercialization of quantum computing will be counterproductive and lead to unnecessary disenchantment with the long-term potential for quantum computing.

The overall model

The overall model for the development of a technology has three stages:

  1. Pre-commercialization. Research, prototyping, and experimentation. Start with some ideas and see what can be done with them. The end result is not a product, but sufficient answers to all of the relevant questions so that a professional product engineering team can write specifications for commercial products and services.
  2. Premature commercialization. A false stage, entered before pre-commercialization has been completed, in which people imagine or fantasize that all of the preliminary foundational research, prototyping, and experimentation has been completed when in fact it has not been completed, or in some cases not even started. The results will be rather less than impressive, if not outright disastrous.
  3. Commercialization. All of the pre-commercialization work has been completed and enough answers to enough questions are readily available so that a professional product engineering team can proceed to writing specifications for products and services and proceed to implementing, delivering, and even deploying them with minimal technical risk.

Brief summary of pre-commercialization

Three main activities are occurring during the pre-commercialization stage of a new technology:

  1. Research. The basic research needed to understand the initial ideas and concepts to develop the rudiments of the technology.
  2. Prototyping. Attempting to use the rudimentary new technology to develop portions of applications which are representative of how the technology might be used when fully developed and commercialized. Provides feedback into further research.
  3. Experimentation. Using the new technology and prototypes to gather data and get a sense of how well the technology is performing. Including some preliminary degree of performance and capacity testing and benchmarking. Provides additional feedback into further research.

Research, more research, and even more research

The single greatest need at this stage for quantum computing is more research at all levels and in all areas: theoretical, experimental, and applied.

Premature commercialization risk

A key motivation for this paper is to attempt to avoid the incredible technical and business risks that would come from premature commercialization of an immature technology — trying to create a commercial product before the technology is ready, feasible, or economically viable.

Summary of risks of premature commercialization

  1. The technology just isn’t ready. Too much is missing. Too much is too primitive. Too much research is incomplete. Too much needs further research. Too much is incapable of delivering on the many wild promises that have been made.
  2. Risk of disappointment.
  3. Risk of disenchantment.
  4. Risk of backlash. Disenchantment could lead to pushback on quantum computing. Denial of the potential for quantum computing.
  5. Risk of loss of enthusiasm and energy.
  6. Risk of loss of project funding.
  7. Risk of loss of commitment to the cause and promise of quantum computing.
  8. Failure to complete projects.
  9. Failure of completed projects to meet expectations.
  10. Critical project failures now could make it harder to fund credible projects in the future.
  11. Surest path to an early quantum winter. By hitting a critical mass of disenchantment due to unmet expectations.
  12. Constant rework needed as the technology constantly and radically evolves. That’s the nature of the pre-commercialization stage. It’s a good thing at this stage, but not good for commercialization.
  13. The technology is changing and evolving rapidly, so likely to be obsolete relatively soon, so it’s bad to bet on it in its current state.
  14. Insufficient research. Trying to skip too much of the needed research.
  15. Insufficient prototyping. Trying to skip too much of the needed prototyping.
  16. Insufficient experimentation. Trying to skip too much of the needed experimentation.
  17. Insufficient knowledge. Too many unanswered questions. Or questions that never even got asked. Pre-commercialization surfaces and answers all relevant questions.
  18. Premature for any significant quantum advantage on any consistent basis across application categories. Very little, if any, quantum advantage available in the near term.
  19. Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications. Capabilities, tools, and methods are too primitive. Benchmarking not well developed.

Overall the technology is NOT ready for production deployment

Tremendous progress is being made in quantum computing, but we have very far yet to go. We don’t even know how far we have to go. The simple truth is that wherever we are or however far we have to go, overall quantum computing is simply not ready for production deployment. Prototyping and experimentation, yes, but production deployment, no.

No production deployment of quantum computing during pre-commercialization

Saying it again, more clearly, quantum computing is still too immature to even consider production deployment during pre-commercialization.

No great detail on commercialization proper here since focus here is on pre-commercialization

Commercialization itself is discussed in this paper to some degree, but not as a main focus since the primary intent is to highlight what work should be considered during pre-commercialization vs. product engineering for commercialization.

For more on commercialization itself

As mentioned, commercialization itself is not the primary focus of this paper. For more detail on commercialization, see my two papers:

The crux of the problem, the dilemma

Quantum computing has been plodding along for several decades now, finally accelerating in recent years, so it’s only natural that people would finally like to see the fruits of this labor put into action actually solving practical real-world problems. The desire is very real. It’s palpable. It’s irresistible.

Premature commercialization is the problem now facing us

Research in hardware, algorithms, and applications may be the clear technical problem before us, but the more immediate problem is the temptation of premature commercialization.

No need for premature Quantum Ready

Not all organizations or all individuals within a particular organization need to get Quantum Ready at the same time or pace, or even at all. It all depends on the needs, interests, criteria, maturity, and timing of the particular organization, department, project, team, role, or individual. It’s not a one size fits all proposition. The timing will vary, as will the pace.

Great for Fortune 500 companies to do their own research push

Large organizations with deep pockets for research and advanced development should of course be focusing on quantum computing, but…

  1. Focus on the long term. These research efforts should not be billed as intended to produce production solutions over the next few years.
  2. Demonstrations of the future, not the near-term. Research prototypes should be billed as demonstrations of possible production-scale technology in 5–7 years and production deployment in 10–15 years, but not near term in the next 2–4 years.
  3. Integration of quantum into mainline applications will take years. Integration of quantum technology into larger solutions could take 3–5 (or even 7) years alone, even once the quantum technology itself is ready.
  4. Some elite teams may develop ENIAC-class solutions in less time. Maybe to production deployment in 3–5 years, but most organizations will have to wait another 5–8 years for The FORTRAN Moment, or utilize configurable packaged quantum solutions acquired from outside vendors who do have the necessary elite teams.

Excessive hype is getting the best of us — we’re drinking too much of the Kool-Aid

The tremendous hype surrounding quantum computing is quite intoxicating. It’s one thing to be taken in by the promise of a brilliant future, but it’s an entirely different matter to treat that distant promise as if it were reality today or even the relatively near future.

Current dramatic push for commercialization is a counterproductive distraction

The quantum computing field is plagued with excessive hype, not simply promises of great benefits to come, but even to the point of claims that the benefits are actually here, now, or at least in the very near future.

  1. Quantum Computing May Be Closer Than You Think
  2. Quantum Computing Might Be Here Sooner Than You Think
  3. Quantum Computing May Be A Reality Sooner Than You Think
  4. Quantum Computing Will Change Everything, and Sooner Than You Expect
  5. Quantum Computing is coming to you faster than you think
  6. Quantum computing could be useful faster than anyone expected
  7. A Quantum Future will be here Sooner than You Think
  8. Quantum Computers Could Go Mainstream Sooner than We Think
  9. You’ll Be Using Quantum Computers Sooner Than You Think
  10. And more!

Commercialization of current technology will NOT lead to dramatic quantum advantage

Current quantum computing technology is actually fairly impressive compared to just a few years ago, but is still well short of being suitable for solving production-scale practical real-world problems and achieving even a tiny fraction of dramatic quantum advantage. And this includes both hardware and algorithms.

Premature for any significant quantum advantage on any consistent basis across application categories

There may be some narrow niche cases where some minor quantum advantage can be achieved, but there is certainly no opportunity to achieve any significant quantum advantage in any broad sense across a wide swath of applications or application categories, and certainly not on any consistent basis.

  1. Minimal quantum advantage. A 1,000X performance advantage over classical solutions. 2X, 10X, and 100X (among others) are reasonable stepping stones.
  2. Substantial or significant quantum advantage. A 1,000,000X performance advantage over classical solutions. 20,000X, 100,000X, and 500,000X (among others) are reasonable stepping stones.
  3. Dramatic quantum advantage. A one quadrillion X (one million billion times) performance advantage over classical solutions. 100,000,000X, a billion X, and a trillion X (among others) are reasonable stepping stones.
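
To connect these tiers to qubit counts, a rule of thumb is that n qubits in a single Hadamard transform enable roughly a 2^n speedup via quantum parallelism. This is an idealization, since real advantage depends on the algorithm, and the helper below is an illustrative sketch, not a result from the paper:

```python
import math

# Under ideal quantum parallelism, n qubits in a single Hadamard transform
# evaluate 2**n cases at once, so the potential speedup is roughly 2**n.
# This is a simplification: real advantage depends heavily on the algorithm.

def qubits_needed(speedup: float) -> int:
    """Smallest n such that 2**n meets or exceeds the desired speedup."""
    return math.ceil(math.log2(speedup))

print(qubits_needed(1_000))       # minimal quantum advantage (1,000X)
print(qubits_needed(1_000_000))   # substantial quantum advantage (1,000,000X)
print(qubits_needed(10**15))      # dramatic quantum advantage (quadrillion X)
```

These line up with the thresholds above: about 10 qubits of parallelism for minimal advantage, 20 for substantial, and 50 for dramatic, assuming ideal quantum parallelism.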

Little if any of the current technology will be relevant in 5–10 years

Instead of expending inordinate energy on distorting and shoehorning stripped-down algorithms into current hardware, we should instead rely on simulation in the near term, and focus algorithm and application research on expected hardware 2–7 years out.

  1. Much higher qubit fidelity.
  2. Greater qubit counts.
  3. Finer phase granularity.
  4. Quantum error correction (QEC) and logical qubits.
  5. Lower gate error rates.
  6. Lower measurement error rates.
  7. Greater qubit connectivity.
  8. Capable of supporting quantum Fourier transform (QFT) and quantum phase estimation (QPE).

Wait a few years for the software technology to mature and evolve before getting started

Besides the hardware advancing, if you wait a few years the software and programming models will have evolved as well, making it much easier to get started.

  1. Higher-level programming models.
  2. Higher-level quantum-native programming languages.
  3. Higher-level algorithmic building blocks.
  4. More sophisticated design patterns.
  5. More sophisticated application frameworks.
  6. Availability of configurable packaged quantum solutions. Let somebody else do all of the really hard work. Or, do all of the hard work yourself and then get others to pay you for access.
  7. Plethora of operational quantum algorithms and applications to mimic. Ah, so that’s how you do it!

Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation

Variational methods are quite popular right now, particularly for applications such as quantum computational chemistry, primarily because they work, after a fashion, on current NISQ hardware, but they are far from ideal and only work for smaller problems. In fact, they are a distraction and an absolute technical dead end — variational algorithms will never achieve dramatic quantum advantage, by the very nature of how they work.

Risk of backlash

If sufficient progress is not made during the next year or two, there is a very real risk of developing a backlash against quantum computing.

Big risk of hitting a Quantum Winter in two to three years

The biggest risk in quantum computing is that an excess of investment flows into quantum computing with an expectation of payoff (presumed commercialization) within two or three years, followed by a Quantum Winter as vendors and organizations prove unable to deliver on those bold promises once the patience of that investment money has been pushed to the limit.

Taming the hype may be impossible, so we need to push the reality to catch up

Sure, we can push back on the hype, to some degree, but that’s a Herculean task that would consume all of our energy. Instead, we can focus on how to advance the technology so that it catches up with at least a fair fraction of the hype.

Boost research, prototyping, and experimentation — pre-commercialization

If there is only one message for the reader to take away from this paper it is to boost research, as well as to boost prototyping and experimentation.

We need to push research much harder to try to catch up with the hype

We may have only another two or three years to begin delivering on at least some of the bold promises of quantum computing before patient capital loses patience.

Distinguishing pre-commercialization from commercialization

Commercialization implies that you have the necessary science, technology, knowledge, and skills, and now you just have to do it. You have the science. You just need to do the engineering. It’s not quite that simple, but that’s the essence.

Avoid premature commercialization

Just to reemphasize the point, premature commercialization is the attempt to engage in commercialization before the necessary science, technology, knowledge, and skills have been developed. In short, don’t do it!

Critical technical gating factors for initial stage of commercialization

Here’s a brief summary of the more critical technical gating factors which must be addressed before quantum computing can be considered ready for commercialization, an expectation for the initial stage of commercialization, C1.0:

  1. Near-perfect qubits. At least four nines of qubit fidelity — 99.99%. Possibly five nines — 99.999%. Okay, maybe 3.75 nines or even 3.5 nines might be enough at least for some applications.
  2. Circuit depth. Generally limited by coherence time. No clear threshold at this stage but definitely going to be a critical gating factor. Whether it is 50, 100, 500, or 1,000 is unclear. Significantly more than it is now. Let’s call it 250 for the sake of argument.
  3. Qubit coherence time. Sufficient to support needed circuit depth.
  4. Near-full qubit connectivity. Either full any to any qubit connectivity or qubit fidelity high enough to permit SWAP networks to simulate near-full connectivity.
  5. 64 qubits. Roughly. No precise threshold. Maybe 48 qubits would be enough, or maybe 72 or 80 qubits might be more appropriate. Granted, I think people would prefer to see 128 to 256 qubits, but 64 to 80 (or maybe 48) might be sufficient for the initial commercialization stage.
  6. Alternative architectures may be required. Especially for more than 64 qubits. Or even for 64, 60, 56, 50, and 48 qubits in order to deal with limited qubit connectivity.
  7. Fine phase granularity to support quantum Fourier transform (QFT) and quantum phase estimation (QPE). 40 qubits = 2⁴⁰ gradations — one trillion gradations should be the preferred target for C1.0. At least 20 or 30 qubits = 2²⁰ to 2³⁰ gradations — one million to one billion gradations, at a minimum. Even 20 qubits may be a hard goal to achieve. 50 qubits needed for dramatic quantum advantage.
  8. Quantum Fourier transform (QFT) and quantum phase estimation (QPE). Needed for quantum computational chemistry and other applications. Needed to achieve quantum advantage through quantum parallelism. Relies on fine granularity of phase.
  9. Conceptualization and methods for calculating shot count (circuit repetitions) for quantum circuits. This will involve technical estimation based on quantum computer science coupled with engineering processes based on quantum software engineering. See my paper below.
  10. Moderate improvements to the programming model. Unlikely that a full higher-level programming model will be available soon (before The FORTRAN Moment), but some improvements should be possible.
  11. Moderate library of high-level algorithmic building blocks.
  12. The ENIAC Moment. A proof that something realistic is possible. The first production-scale practical real-world application.
  13. Substantial quantum advantage. Full, dramatic quantum advantage (one quadrillion X speedup) is not so likely, but an advantage of at least a million or a billion is a reasonable expectation — much less will be seen as not really worth the trouble. This will correspond to roughly 20 to 30 qubits in a single Hadamard transform — 2²⁰ = one million, 2³⁰ = one billion. An advantage of one trillion — 2⁴⁰ may or may not be reachable by the initial stage of commercialization. Worst case, maybe minimal quantum advantage — 1,000X to 50,000X — might be acceptable for the initial stage of commercialization.
  14. 40-qubit quantum algorithms. Quantum algorithms utilizing 32 to 48 qubits should be common. Both the algorithms and hardware supporting those algorithms. 48 to 72-qubit algorithms may be possible, or not — they may require significantly greater qubit fidelity.
  15. Classical quantum simulators for 48-qubit algorithms. The more the better, but that may be the practical limit in the near term. We should push the researchers for 50 to 52 or even 54 qubits of full simulation.
  16. Overall the technology is ready for production deployment. At least in some minimal sense.
  17. No further significant research is needed to support the initial commercialization stage product, C1.0. Further research for subsequent commercialization stages, but not for the initial commercialization stage. The point is that research belongs in the pre-commercialization stage, not during commercialization.
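The powers of two cited in the gating factors above (phase gradations, quantum advantage factors) are easy to sanity-check. A minimal, purely illustrative sketch:

```python
# Sanity-check the powers of two cited above: a quantum Fourier
# transform spanning n qubits distinguishes 2^n phase gradations,
# and n qubits in a Hadamard transform give a 2^n advantage factor.

def gradations(n_qubits: int) -> int:
    """Number of distinct phase gradations (or parallel states) for n qubits."""
    return 2 ** n_qubits

for n in (20, 30, 40, 50):
    print(f"{n} qubits -> {gradations(n):,} gradations")
# 20 qubits -> 1,048,576 gradations             (~one million)
# 30 qubits -> 1,073,741,824 gradations         (~one billion)
# 40 qubits -> 1,099,511,627,776 gradations     (~one trillion)
# 50 qubits -> 1,125,899,906,842,624 gradations (~one quadrillion)
```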

Minimum viable product (MVP)

One could accept all of the critical technical gating factors for the initial stage of commercialization (C1.0) as the requirements for a minimum viable product (MVP). That would be the preference. But it may turn out that not all customers or users need all of those capabilities or features. Or maybe everybody wants and needs all of those capabilities and features, but they simply aren’t technically or economically feasible in a reasonable timeframe. In such situations it may make sense, or at least be tempting, to define an MVP which is substantially less than the more capable desired initial product.

  1. Qubit count. 128 or 256 qubits may be a clear preference, but maybe 72 or 64 or even 48 qubits might be the best that can be achieved — or that initial customers might need — in the desired timeframe.
  2. Qubit fidelity. Five nines or at least 4.5 nines of qubit fidelity might be the preference, but four or even 3.5 nines might be the best that can be achieved — or that initial customers might need — in the desired timeframe.
  3. Connectivity. Full connectivity might not be achievable. Maybe SWAP networks are feasible if qubit fidelity is high enough.
  4. Fineness of phase granularity. Critical for quantum Fourier transform and quantum phase estimation. Sufficient for at least 20 to 30 qubits = 2²⁰ to 2³⁰ gradations in a quantum Fourier transform, rather than the desired 50 to 64.
  5. Quantum Fourier transform and quantum phase estimation resolution. Preferably at least 20 to 30 qubits, but maybe only 16 bits of precision can be achieved — or even only 12, rather than 32 to 64 bits.

No, noisy NISQ quantum computers are not viable for commercialization

Absent support for quantum error correction (QEC), only near-perfect qubits would supply the qubit fidelity needed to support larger and more complex quantum algorithms and to achieve substantial quantum advantage, such as supporting quantum Fourier transform (QFT) and quantum phase estimation (QPE) which are required for quantum computational chemistry.

48 fully-connected near-perfect qubits as the sweet spot goal for near-term quantum computing

After some careful thought, I came up with a proposal for what might be the most viable quantum computer to be available in two to three years or so.

  1. 48 qubits.
  2. Full any to any connectivity for all qubits.
  3. Near-perfect qubits. 3.25 to 4 nines of qubit fidelity.
  4. Fine granularity of phase to support a 20-bit quantum Fourier transform (QFT). And enable quantum phase estimation (QPE) to support quantum computational chemistry.
  5. Up to 2,000-gate circuits.

Critical hardware research issues

A separate paper lays out much of the needed research for quantum computing hardware. This paper only briefly summarizes the critical areas:

  1. Need for a more ideal qubit technology. Current qubit technology demonstrates quantum computing capabilities, but is incapable of delivering on the full promises of quantum computing.
  2. Limited qubit capacity. Need a lot more qubits.
  3. Limited qubit fidelity. Too many errors.
  4. Limited qubit coherence. Limits circuit depth.
  5. Limited circuit depth. Basically limited by qubit coherence.
  6. Limited gate fidelity. Too many errors executing quantum logic gates.
  7. Limited granularity of phase. Need fine granularity for quantum Fourier transform (QFT) and quantum phase estimation (QPE).
  8. Limited measurement fidelity. Too many errors measuring a qubit to get results.
  9. Unable to support quantum Fourier transform (QFT) and quantum phase estimation (QPE). Rely on qubit fidelity, qubit connectivity, fine granularity of phase, gate fidelity, circuit depth, and measurement fidelity. Needed for application categories such as quantum computational chemistry.
  10. Unable to achieve substantial quantum advantage. A dramatic performance advantage over classical computing is the only point of even pursuing quantum computing. This will require a combination of sufficient hardware capabilities with quantum algorithms which exploit those hardware capabilities.

Critical algorithm and application research areas

A separate paper lays out much of the needed research for quantum algorithms and quantum applications. This paper only briefly summarizes the critical areas:

  1. Need for a higher-level programming model. Current programming model is too low-level, too primitive, too much like classical machine and assembly language.
  2. Need for a robust collection of high-level algorithmic building blocks.
  3. Need for a high-level programming language. Tailored to the needs of quantum algorithms and quantum applications.
  4. Need for a robust collection of example algorithms. Which demonstrate production-scale quantum parallelism and show how practical real-world problems can be easily transformed into quantum algorithms and applications.
  5. Need for algorithm debugging capabilities. Difficult enough for relatively simple quantum algorithms, virtually impossible for complex quantum algorithms.
  6. Need for configurable packaged quantum solutions. Generalized applications for each major application category which allow the developer to present input data and input parameters in an easy way which can readily be automatically transformed into adaptations of the pre-written quantum algorithms and application frameworks. Still requires a lot of work, but not expertise in quantum circuits.
  7. Research in specific algorithms for each application category.

Other critical research areas

Besides hardware, algorithms, and applications, there are a number of other areas of critical and urgent research needed to fully exploit the promised potential of quantum computing. From my paper, here is the summary list of the areas:

  1. Physics.
  2. Hardware.
  3. Firmware (see: Hardware).
  4. Hardware support.
  5. Debugging.
  6. Classical quantum simulators.
  7. Quantum information science in general.
  8. Software. Support software, tools.
  9. Quantum software engineering. A new field.
  10. Quantum computer science. A new field.
  11. Cybersecurity.
  12. Quantum algorithm support.
  13. Quantum algorithms.
  14. Quantum application support.
  15. Quantum applications.
  16. Quantum application solutions. Particularly configurable packaged quantum solutions.
  17. Quantum general artificial intelligence.
  18. Quantum advantage and quantum supremacy.
  19. Other areas of QSTEM research.

We need to decouple hardware development and algorithm and application research, prototyping, and experimentation

Trying to prototype and experiment with algorithms and applications on woefully inadequate hardware is an exercise in futility. There’s another, better path: simulation.

Focus algorithm and application research, prototyping, and experimentation on simulation

Simulation can be slow and is limited to 40 to 50 or so qubits, but is much more reliable and ultimately more efficient and productive than attempting to prototype and experiment with hardware that simply isn’t up to the task.

Sure, it can be intoxicating to run your algorithm on an actual quantum computer, but what does it prove and where does it get you?

There are very few real algorithms that use more than about 23 qubits on a real quantum computer at present, likely because that is approximately the limit of current hardware, particularly with respect to qubit fidelity, qubit connectivity, coherence time, circuit depth, gate fidelity, and measurement errors.
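The hardware ceiling is one side of the coin; the simulation ceiling of 40 to 50 or so qubits is the other. A back-of-the-envelope sketch of why full statevector simulation tops out there, assuming the usual 16 bytes per complex amplitude:

```python
# Why classical simulation tops out around 40-50 qubits: a full
# statevector holds 2^n complex amplitudes, each typically 16 bytes
# (two 64-bit floats), so memory doubles with every added qubit.

def statevector_gib(n_qubits: int, bytes_per_amplitude: int = 16) -> float:
    """Memory for a full statevector of n qubits, in GiB."""
    return (2 ** n_qubits) * bytes_per_amplitude / 2 ** 30

for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_gib(n):,.0f} GiB")
# 30 qubits: 16 GiB         (a laptop)
# 40 qubits: 16,384 GiB     (a large cluster)
# 50 qubits: 16,777,216 GiB (out of reach)
```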

Hardware engineers should run their own functional tests, stress tests, and benchmark tests

The quantum hardware engineers should run their own functional tests, stress tests, and benchmark tests to confirm that their hardware is performing as expected. No need to slow down algorithm and application research, prototyping, and experimentation just to test the hardware in a rather inefficient manner.

Use simulation to enable algorithm and application research, prototyping, and experimentation to proceed at their own pace independent of the hardware

There’s no good reason to slow down algorithm and application research, prototyping, and experimentation or to gate them by hardware research. With simulation:

  1. Hardware limits don’t interfere with progress.
  2. Simulation and analysis software can alert algorithm and application developers to bugs and other issues in their algorithms and applications. Debugging on actual quantum hardware is very problematic.

Functional enhancements and performance and capacity improvements are needed for simulation

Simulation works reasonably well today. Yes, significant functional enhancements and performance and capacity improvements would be very beneficial, but simulators are generally more usable than actual hardware at the moment, and for the indefinite future.

Where are all of the 40-qubit quantum algorithms?

Indeed, where are they? There doesn’t appear to be any technically valid reason that we don’t see a plethora of 40-qubit or even 32-qubit algorithms, other than the mere fact that it’s so intoxicating to run algorithms on actual quantum hardware. An increased focus on simulation should improve the situation.

Scalability is essential for robust quantum algorithms

We want to be able to develop quantum algorithms and applications today which will run on future hardware as it becomes available. We definitely don’t want to have to redesign and reimplement quantum algorithms and quantum applications every time there is even a modest advance in hardware capabilities. Scalability is the key. And in fact automatic scalability.
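As an illustration of what automatic scalability means in practice, a quantum Fourier transform can be generated from a single width parameter rather than hard-coded for a fixed qubit count. This is a hypothetical, framework-neutral sketch with gates represented as plain tuples:

```python
# Hypothetical illustration of automatic scalability: the circuit is
# generated from a single width parameter rather than hard-coded, so
# the same code targets a small simulator today and wider hardware
# later. Framework-neutral: gates are represented as plain tuples.

from math import pi

def build_qft(n_qubits: int) -> list:
    """Textbook quantum Fourier transform circuit for any qubit count."""
    gates = []
    for target in range(n_qubits):
        gates.append(("H", target))
        for control in range(target + 1, n_qubits):
            # Controlled phase rotation by pi / 2^(qubit distance)
            angle = pi / 2 ** (control - target)
            gates.append(("CPHASE", control, target, angle))
    return gates

small = build_qft(4)    # fits any simulator today
large = build_qft(64)   # same code, future hardware
```

The design choice is that width is a parameter everywhere; nothing about the algorithm changes as hardware capacity grows.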

Don’t imagine that scalability of quantum algorithms and applications is free, cheap, easy, obvious, or automatic — much hard work is needed during pre-commercialization

Commercialization is a really, really, super-bad time to start thinking about how to scale your quantum algorithms or quantum applications.

Don’t imagine that scalability will be any of the following:

  1. Free.
  2. Cheap.
  3. Easy.
  4. Obvious.
  5. Automatic.

Rather, achieving scalability will require:

  1. A lot of effort.
  2. A lot of time.
  3. A lot of careful attention to detail.
  4. A lot of patience.
  5. A lot of diligence.
  6. A lot of focus.

And as for putting off scalability until commercialization:

  • Don’t do it!

Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications

Understanding the performance and capacity of quantum computers is essential for successful commercialization. Ditto for the performance and capacity of quantum algorithms and quantum applications.

Performance and capacity must be:

  1. Observed.
  2. Measured.
  3. Tabulated.
  4. Analyzed.
  5. Characterized.
  6. Reported.
  7. Used.

So that performance and capacity can be:

  1. Understood fully.
  2. Understood in detail.
  3. Understood accurately.

Configure simulation to match expected commercial hardware

Simulation can be configured to match any hardware configuration. The default should be the target hardware configuration for the initial commercialization stage of quantum computing, C1.0, with the only significant difference being fewer qubits, since simulation is limited to 40 to 50 or so qubits. Parameters to configure include:

  1. Qubit count.
  2. Coherence time.
  3. Circuit depth.
  4. Connectivity.
  5. Phase granularity.
  6. Qubit fidelity.
  7. Gate fidelity.
  8. Measurement fidelity.
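A sketch of what such a configuration might look like, with hypothetical parameter names and illustrative C1.0-style values drawn from the gating factors discussed earlier:

```python
# Hypothetical sketch of configuring a simulator to match expected
# C1.0 target hardware. Parameter names and values are illustrative,
# not any particular simulator's API. Only qubit count is reduced,
# to stay within practical simulation limits.

from dataclasses import dataclass

@dataclass
class SimulatorConfig:
    qubit_count: int             # reduced vs. real target hardware
    coherence_time_us: float     # must support the needed circuit depth
    max_circuit_depth: int
    full_connectivity: bool
    phase_granularity_bits: int  # 2^bits distinct phase gradations
    qubit_fidelity: float        # "nines", expressed as a probability
    gate_fidelity: float
    measurement_fidelity: float

c1_target = SimulatorConfig(
    qubit_count=40,              # vs. 64+ on the C1.0 target hardware
    coherence_time_us=1000.0,
    max_circuit_depth=250,       # the "for the sake of argument" figure
    full_connectivity=True,
    phase_granularity_bits=40,   # 2^40 = ~one trillion gradations
    qubit_fidelity=0.9999,       # four nines
    gate_fidelity=0.9999,
    measurement_fidelity=0.9999,
)
```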

Configure simulation to match expected improvements — or shortfalls — of the hardware

Simulation is flexible and can be configured to match any hardware configuration (limited only by the maximum qubits for simulation or roughly 40 to 50 or so qubits.)

Research will continue even as commercialization commences

Research for any vibrant field is never-ending. There’s always something new to discover, and some new problem to be overcome.

Exception: Commercial viability of capabilities which support pre-commercialization

Generally, this paper is arguing strongly against premature commercialization of quantum computing, especially the notion that quantum computers might be ready to be considered for deploying production quantum applications. But there is one category of exception: products and services targeted at supporting the pre-commercialization activities themselves, namely research, prototyping, and experimentation. Just not products or services geared towards production deployment and operational use of production applications.

  1. Equipment. For use by researchers and for prototyping and experimentation.
  2. Software. Classical quantum simulators. Support software.
  3. Software tools. Compilers. Algorithm analysis tools. Debugging tools.
  4. Services. Including consulting.

Early commercial opportunities for selling tools and services to enable and facilitate research, prototyping, and experimentation

Just emphasizing again the commercial opportunities that could be available during pre-commercialization — anything that enables or supports pre-commercialization activities — research, prototyping, and experimentation.

Exception: Random number-based applications are actually commercially viable today

Although much of quantum computing is still subject to research and intensive pre-commercialization, there is one narrow niche that actually is commercially viable right now, today: generation of true random numbers. True random numbers are not mathematically computable, and hence cannot be computed using a Turing machine or classical computer. Special hardware is needed to access the level of entropy required for true random numbers. Quantum effects can supply the necessary entropy. And quantum computers are able to access the necessary quantum effects via the Hadamard transform, built from the simple Hadamard gate. No further research is required to make use of this simple capability right now, today.
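Conceptually, the circuit is just a Hadamard gate followed by measurement: H|0⟩ has equal amplitude on |0⟩ and |1⟩, so each measurement yields a uniformly random bit. A classical sketch of the math (note: a classical simulation like this is of course only pseudorandom; the true entropy comes from the physical quantum hardware):

```python
# Conceptual sketch of quantum random bit generation: apply a
# Hadamard gate to |0>, yielding equal amplitudes on |0> and |1>,
# then measure. NOTE: this classical simulation is only pseudorandom;
# true entropy requires the physical quantum hardware.

import math
import random

def hadamard_on_zero() -> list:
    """Amplitudes of H|0> = (|0> + |1>) / sqrt(2)."""
    s = 1 / math.sqrt(2)
    return [s, s]

def measure(amplitudes: list) -> int:
    """Sample a basis state with Born-rule probabilities |a|^2."""
    p0 = amplitudes[0] ** 2
    return 0 if random.random() < p0 else 1

bits = [measure(hadamard_on_zero()) for _ in range(8)]
```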

Even for exceptions for commercialization during pre-commercialization, be especially wary

Even if early commercialization does seem warranted during pre-commercialization, there are plenty of potential gotchas. Be especially wary for:

  1. Sudden and unexpected technology evolution. Dramatic and expensive rework could be required.
  2. Incompatible changes. More rework.
  3. Potential for a Quantum Winter which could cause business to evaporate rapidly.
  4. Uncertain market conditions.
  5. Unpredictable markets.
  6. Unpredictable budgets.
  7. Sudden appearance of competitors.
  8. Rapid technology changes which could erase technical opportunities for commercial products. Of course, the opposite could occur as well — new technical opportunities can appear out of nowhere at any time.
  9. Costs. Unexpected costs, costs which you should have expected, and higher costs than you expected.
  10. Difficulty or impossibility of obtaining needed service level agreements (SLA). Contractual commitments are essential for services, including access to equipment and personnel.
  11. Need to offer service level agreements (SLA) to customers. If you’re an equipment or service provider, or offer staffing.

Keep cost and service level agreements in mind even for the rare exceptions during pre-commercialization

Yes, there are exceptions when it comes to commercialization during pre-commercialization, such as highlighted in the preceding sections, but even then there are significant caveats:

  1. Cost. Sure, a lot of access is free these days, but that’s not for production deployment. What exactly is the cost for production deployment? Especially if you require dedicated hardware for 24/7 operation, and for redundancy to protect against outages.
  2. Availability. Again, a lot of access is free, but what is availability for production deployment?
  3. Service level agreements (SLA). Production deployment requires a solid commitment for availability, performance, capacity, support, redundancy, etc. Be sure to have contractual commitments to all of the above, which is nominally in the form of a service level agreement (SLA). Be sure to read all of the fine print.
  4. Diversity of sourcing. Don’t be reliant on a single provider for any service, equipment, software, or tool. Companies can go out of business during a technological winter, or change their terms of service in an unacceptable manner at any time.

Beware of any capabilities available during pre-commercialization which might seem as if they are likely to apply to commercialization as well

To be clear, commercialization of quantum computing will have a clean slate compared to whatever might be available during pre-commercialization.

Products which enable quantum computing vs. products which are enabled by quantum computing

There are really two distinct categories of products covered by this paper:

  1. Quantum-enabled products. Products which are enabled by quantum computing. Such as quantum algorithms, quantum applications, and quantum computers themselves.
  2. Quantum-enabling products. Products which enable quantum computing. Such as software tools, compilers, classical quantum simulators, and support software. They run on classical computers and can be run even if quantum computing hardware is not available. Also includes classical hardware components and systems, as well as laboratory equipment.

Potential for commercial viability of quantum-enabling products during pre-commercialization

Although most quantum-related products, including quantum applications and quantum computers themselves, have no substantive real value until the commercialization stage of quantum computing, a whole range of quantum-enabling products do potentially have very real value, even commercial value, during pre-commercialization and even during early pre-commercialization. These include:

  1. Quantum software tools.
  2. Compilers and translators.
  3. Algorithm analysis tools.
  4. Support software.
  5. Classical quantum simulators.
  6. Hardware components used to build quantum computers.

Preliminary quantum-enabled products during pre-commercialization

Some vendors may in fact offer preliminary quantum-enabled products — or consulting services — during pre-commercialization, strictly for experimentation and evaluation, but with no prospect for commercial use during pre-commercialization.

Risk of changes to support software and tools during pre-commercialization — beware of premature commercialization

Literally every aspect of quantum computing technology during pre-commercialization is subject to change. Hardware, software, tools, algorithms, applications, programming models, knowledge, methods — you name it, all of it has a high probability of changing by the time true commercialization begins. This especially includes support software and tools.

Risk of business development during pre-commercialization — beware of premature commercialization

The same comments from the preceding section apply to business development. All factors involved in business decisions, including but not limited to the technology, are likely to change, evolve, and even radically change as pre-commercialization progresses.

Quantum computing is still in the realm of the lunatic fringe

The concept of the lunatic fringe in technology innovation refers to the leading edge, or really the bleeding edge, of the earliest adopters, who are literally willing to try any new technology long before it is proven and ready for commercial deployment. In fact, they don’t even mind if it doesn’t work properly yet, since they enjoy fixing products themselves.

Quantum Ready — It’s never too early for The Lunatic Fringe

We can debate exactly what the right moment is for a given organization to decide to become Quantum Ready, but for one group, The Lunatic Fringe, it is never too early. After all, by definition, they are willing to work with any technology at any stage, long before it is even remotely close to being usable. But they are not representative of most organizations or most normal, average technical staff.

Quantum Aware is fine, but be careful about Quantum Ready

There’s no real harm in getting a variety of levels of staff to be Quantum Aware, knowledgeable about the general capabilities and limitations of quantum computing. But it may not be so advisable to push many members of staff to the much higher level of Quantum Ready, where they have a deep enough level of training to actually design quantum algorithms and develop quantum applications.

Expect radical change — continually update vision of what quantum computing will look like

As research, prototyping, experimentation, and even product development progress, our notions and visions of what quantum computing will ultimately look like will evolve and have to be continually updated.

Quantum computing is still a mere laboratory curiosity, not ready for production deployment

At present, quantum computing is a mere laboratory curiosity, not yet ready for production deployment. There is much work to be completed, including and especially basic research, before quantum computing is ready for production deployment. Areas requiring substantial further work include:

  1. Features.
  2. Performance.
  3. Capacity.
  4. Reliability.
  5. Availability.
  6. Ease of use.
  7. Stability.
  8. Predictability.
  9. Support. Including commitment and service level agreements (SLA).
  10. Availability of working quantum algorithms.
  11. Availability of working quantum applications.
  12. Availability of a talent pool of technical staff.
  13. Development of quantum computer science as a mature field.
  14. Development of quantum computer engineering as a mature field.
  15. Development of quantum software engineering as a mature field.

Quantum computing is still more suited for elite technical teams than average, normal technical teams

Quantum computing is still much too complex and ill-suited for use by your average, normal technical team or IT staff. It takes a much more elite level of technical sophistication to master quantum computing.

Pre-commercialization will be the Wild West of quantum computing — accept that or stay out until true commercialization

There will be plenty of temptations to blindly leap into quantum computing during pre-commercialization, but… better to look before you leap, or in fact don’t leap at all, waiting for true commercialization to actually begin or at least be imminent.

Pre-commercialization is about constant change while commercialization is about stability and carefully controlled and compatible evolution

The whole point of pre-commercialization is that there are lots of open issues and questions without clear or stable answers. The process of resolving them will result in a continuous sequence of changes, sometimes relatively minor but sometimes very disruptive.

Customers and users prefer carefully designed products, not cobbled prototypes

Again, The Lunatic Fringe will be perfectly happy with the prototype products of pre-commercialization, which are cobbled together from disparate components and are far from polished: incomplete, inconsistent, and even cryptic and problematic to use. Worst case, they’re more than happy to roll up their sleeves and fix or even replace any problematic or balky products or components.

Customers and users will seek the stability of methodical commercialization, not the chaos of pre-commercialization

Customers and users of commercial products want to avoid all sense of drama in their IT products. They want the stability and consistency of commercialization rather than the chaos of pre-commercialization.

Quantum ecosystem

A successful quantum computing commercial product will require a thriving, vibrant, and mutually-supportive ecosystem, which consists of:

  1. Hardware vendors.
  2. Software vendors. Tools and support software.
  3. Consulting firms.
  4. Quantum algorithms.
  5. Quantum applications.
  6. Open source whenever possible. Algorithms, applications and tools. Hardware and firmware as well. Freely accessible plans so that anyone could build a quantum computer. Libraries, metaphors, design patterns, application frameworks, and configurable packaged quantum solutions. Training materials. Tutorials. Examples. All open source.
  7. Community. Including online discussion and networking. Meetups, both in-person and virtual.
  8. Analysts. Technical research as well as financial markets.
  9. Journalists. Technical and mainstream media.
  10. Publications. Academic journals, magazines, books. Videos and podcasts.
  11. Conferences. Presentation of papers, tutorials, and trade show exhibits. Personal professional networking opportunities.
  12. Vendors. Hardware, software, services, algorithms, applications, solutions, consulting, training, conferences.
  13. Research community. Academia, corporate, nonprofit, and government.

Early, preliminary development of quantum ecosystem during pre-commercialization

The preceding section summarized aspects of the quantum ecosystem which would be expected to thrive during commercialization, but early subsets of that ultimate ecosystem can be expected to begin to take root or occasionally even thrive during pre-commercialization.

When might the initial commercialization stage, C1.0, be available?

There’s no clarity or certainty as to the timeframe for commercialization of quantum computing, but it might be illuminating to speculate about maximum, nominal, and minimum paths to both pre-commercialization and initial commercialization — C1.0. These numbers are somewhat arbitrary, but hopefully helpful to bound expectations.

  • Pre-commercialization. Minimal: 2 years. Nominal: 4 years. Maximal: 10 years.
  • Commercialization. Minimal: 2 years. Nominal: 3 years. Maximal: 5 years.
  • Total. Minimal: 4 years. Nominal: 7 years. Maximal: 15 years.

IBM 127-qubit Eagle announcement is proof that we’re still in pre-commercialization — and at risk of premature commercialization

Late last year (2021), IBM finally announced the availability of their 127-qubit Eagle quantum computer system. That was a major accomplishment and a big step forward, finally breaking the 100-qubit barrier. But… it’s not yet a commercial product, and despite that achievement and progress it is still woefully short of what is needed for true commercialization of quantum computing. Hence, it is solid evidence that we’re still deep in the realm of pre-commercialization.

Must assure that there are no great unanswered questions hanging over the heads of the commercialization teams

Commercialization of a product cannot begin until all relevant great questions have been answered:

  1. All relevant research has been completed. No relevant questions remain unanswered.
  2. All relevant prototyping has been completed. No relevant questions remain unanswered.
  3. All relevant experimentation has been completed. No relevant questions remain unanswered.

My apologies — There’s so much more! See my three papers

I’ve tried to keep this paper as short as possible, so it’s limited to summaries and some highlights. Additional details are in my preceding three papers, upon which this paper was based:

Grand finale — So what do we do now??

Just to make sure that we end with the right note:

  • Premature commercialization is a really, really, REALLY bad idea.
  • Double down on pre-commercialization — more basic research, more prototyping, and more experimentation.
  • Don’t even think about commercialization until we have answers to all of the important questions needed to succeed at commercialization.

My original proposal for this topic

For reference, here is the original proposal I had for this topic. It may have some value for some people wanting a more concise summary of this paper.

Summary and conclusions

  1. General risks of premature commercialization of quantum computing…
  2. The technology just isn’t ready. Too much is missing. Too much is too primitive. Too much research is incomplete. Too much needs further research. Too much is incapable of delivering on the many wild promises that have been made.
  3. Risk of disenchantment and loss of project funding and commitment.
  4. Failure to complete projects.
  5. Failure of completed projects to meet expectations.
  6. Critical project failures now could make it harder to fund credible projects in the future.
  7. Risk of backlash. Disenchantment could lead to pushback on quantum computing. Denial of the potential for quantum computing.
  8. Surest path to an early quantum winter. By hitting a critical mass of disenchantment due to unmet expectations.
  9. Constant rework needed as the technology constantly and radically evolves. That’s the nature of the pre-commercialization stage. It’s a good thing at this stage, but not good for commercialization.
  10. The technology is changing and evolving rapidly, and is likely to be obsolete relatively soon, so it’s bad to bet on it in its current state.
  11. Insufficient research. Trying to skip too much of the needed research.
  12. Insufficient prototyping. Trying to skip too much of the needed prototyping.
  13. Insufficient experimentation. Trying to skip too much of the needed experimentation.
  14. Premature for any significant quantum advantage on any consistent basis across application categories. Very little, if any, quantum advantage available in the near term.
  15. Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications. Capabilities, tools, and methods are too primitive. Benchmarking not well developed.
  16. General comments…
  17. Commercialization of current technology will NOT lead to dramatic quantum advantage. The hardware is too primitive. Much research is needed.
  18. Little if any of the current technology will be relevant in 5–10 years. Better to focus algorithm research on expected hardware 2–7 years out and rely on simulation until the hardware is ready.
  19. Generally focus on simulation rather than running on actual quantum computing hardware, since current hardware will rarely resemble the target hardware that will be available during initial commercialization or in subsequent commercialization stages.
  20. Quantum algorithms should be designed to be automatically scalable to run on future hardware without change. Also to permit them to be simulated with fewer qubits than will be available on larger capacity hardware.
  21. Don’t imagine that scalability of quantum algorithms and applications is free, cheap, easy, obvious, or automatic. Much hard work is needed. And it needs to be done during pre-commercialization. Attempting scalability during commercialization is a really bad idea. All of the issues need to be identified and worked out before commercialization even begins.
  22. It’s premature to even begin commercialization. The technology just isn’t ready. Not even close. Hardware, algorithms, and applications alike.
  23. Much pre-commercialization work remains before commercialization can begin.
  24. Boost research, prototyping, and experimentation — pre-commercialization.
  25. Much research remains to fully characterize and resolve many technical obstacles. Many engineering challenges lack the research results needed to guide them. Hardware, algorithms, and applications alike.
  26. Hardware may seem to be the primary limiting factor, but algorithms are an even greater limiting factor. We can simulate 32- and 40-qubit algorithms, but such algorithms are nowhere to be found.
  27. The precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses. But we can make some good, tentative guesses now.
  28. Variational methods are an unproductive distraction and technical dead end — the focus should be on quantum Fourier transform (QFT) and quantum phase estimation (QPE). It will take years for the hardware to support this, but simulation can be used in the meantime.
  29. Quantum error correction (QEC) and logical qubits will come in a later stage of commercialization — near-perfect qubits should be good enough for many applications.
  30. Prototyping and experimentation for quantum algorithms and quantum applications should focus on simulation configured to match the hardware expected at the time of initial commercialization rather than focusing on current, very limited hardware.
  31. There should be no expectation of running or even testing algorithms or applications for 64 or more qubits during pre-commercialization. Not until the hardware can be confirmed to be approaching the target capabilities for the initial commercialization stage — not just raw qubit count, but quality of the qubits. Simulation-only during pre-commercialization. May be limited to 50, 48, 44, 40, or even 32 qubits based on the limits of the simulator and circuit depth.
  32. Even initial commercialization will be fairly limited and it could take ten or more subsequent commercialization stages before the full promise of quantum computing can be delivered.
  33. Any efforts at premature commercialization are doomed to be counterproductive and a distraction from research and simulation for prototyping and experimentation.
  34. Hardware and algorithm research and development should proceed on their own parallel but independent tracks. Very slow progress on hardware must not be permitted to slow algorithm progress.
  35. Double down on pre-commercialization? Double down is a gross understatement. It probably requires a 10X to 50X increase in research, prototyping, and experimentation. Hardware, algorithms, and applications alike. Many more people, much more time, and much more money.
  36. Pre-commercialization will be the Wild West of quantum computing. Accept that or stay out until true commercialization begins or is imminent. Some people and organizations require excitement and rapid change while others require calm stability — individuals and organizations must decide clearly which they are.
  37. Pre-commercialization could take another 2 to 4 years — or longer.
  38. The initial commercialization stage could take another 2 to 3 years — or longer, beyond pre-commercialization.
  39. The initial commercialization stage, C1.0, might be ready in 4 to 7 years — or longer. That would be production-quality, with alpha, beta, and pre-release versions available earlier.
  40. Configurable packaged quantum solutions are the best bet for most organizations. Most organizations will not be in a position to design and implement or even understand their own quantum algorithms.
  41. Quantum-enabled products. Products which are enabled by quantum computing. Such as quantum algorithms, quantum applications, and quantum computers themselves.
  42. Quantum-enabling products. Products which enable quantum computing. Such as software tools, compilers, classical quantum simulators, and support software. They run on classical computers and can be run even if quantum computing hardware is not available. Also includes classical hardware components and systems, as well as laboratory equipment.
  43. There are indeed exceptions: products or services which can actually thrive during pre-commercialization. Namely equipment, software, tools, and services which enable pre-commercialization, focused on research, prototyping, and experimentation. Anything but production deployment. Generally, quantum-enabling products.
  44. Even for exceptions for commercialization during pre-commercialization, be especially wary. Plenty of potential gotchas.
  45. Keep cost and service level agreements in mind even for the rare exceptions during pre-commercialization.
  46. The overall message is twofold…
  47. Double down on pre-commercialization — more basic research, more prototyping, and more experimentation.
  48. Don’t even think about commercialization until we have answers to all of the important questions needed to succeed at commercialization.
  49. Ensure that no great unanswered questions hang over the heads of professional product engineering teams, questions that could interfere with their ability to develop commercial products by slowing their progress or putting their success at risk. Any needed research, prototyping, or experimentation must be complete and out of the way before commercialization begins. No great questions can remain unanswered once commercialization commences.
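To make the simulation recommendations above concrete (a parameterized qubit count, QFT as the focus, classical simulation until the hardware catches up), here is a minimal toy sketch in plain Python. It builds the dense QFT unitary for n qubits and applies it to a state vector. This is illustrative only, not a production simulator: real simulators use far more efficient techniques, and the function names here are my own, not from any quantum SDK.

```python
import cmath
import math

def qft_matrix(n):
    """Dense QFT unitary for n qubits: F[j][k] = omega^(j*k) / sqrt(N)."""
    N = 2 ** n
    omega = cmath.exp(2j * math.pi / N)
    return [[omega ** (j * k) / math.sqrt(N) for k in range(N)] for j in range(N)]

def apply_unitary(matrix, state):
    """Multiply a dense matrix by a state vector (naive O(N^2) simulation)."""
    return [sum(row[k] * state[k] for k in range(len(state))) for row in matrix]

# The qubit count n is a parameter, so the same algorithm scales to larger
# hardware without change -- the point of items 20 and 30 above. Memory and
# time grow exponentially in n, which is why classical simulation tops out
# around 32 to 50 qubits even on the best simulators.
n = 3
N = 2 ** n
state = [1.0] + [0.0] * (N - 1)              # |000>
result = apply_unitary(qft_matrix(n), state)  # uniform superposition, 1/sqrt(N) each
```

The same code runs unchanged for any n a classical machine can afford; only the single parameter changes, which is exactly the scalability discipline the list argues must be worked out during pre-commercialization.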
