Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization

  1. Commercialization of current technology will NOT lead to dramatic quantum advantage. The hardware is too primitive. Much research is needed.
  2. Little if any of the current technology will be relevant in 5–10 years. Better to focus algorithm research on expected hardware 2–7 years out and rely on simulation until the hardware is ready.
  3. Generally focus on simulation rather than running on actual quantum computing hardware, since current hardware will rarely represent the ultimate target hardware to be available during commercialization, or in subsequent stages of commercialization.
  4. Quantum algorithms should be designed to be automatically scalable, so that they run on future hardware without change and can be simulated with fewer qubits than larger-capacity hardware will provide.
  5. It’s premature to even begin commercialization. The technology just isn’t ready. Not even close. That goes for hardware, algorithms, and applications alike.
  6. Much pre-commercialization work remains before commercialization can begin.
  7. Boost research, prototyping, and experimentation — pre-commercialization.
  8. Much research remains to fully characterize and resolve many technical obstacles. Many engineering challenges lack sufficient research results to guide them. That goes for hardware, algorithms, and applications alike.
  9. Hardware may seem to be the primary limiting factor, but algorithms are an even greater limiting factor. We can simulate 32- and 40-qubit algorithms, but such algorithms are nowhere to be found.
  10. The precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses. But we can make some good, tentative guesses now.
  11. Variational methods are an unproductive distraction and technical dead end — the focus should be on quantum Fourier transform (QFT) and quantum phase estimation (QPE). It will take years for the hardware to support this, but simulation can be used in the meantime.
  12. Quantum error correction (QEC) and logical qubits will come in a later stage of commercialization — near-perfect qubits should be good enough for many applications.
  13. Prototyping and experimentation for quantum algorithms and quantum applications should focus on simulation configured to match the hardware expected at the time of initial commercialization rather than focusing on current, very limited hardware.
  14. There should be no expectation of running or even testing algorithms or applications for 64 or more qubits during pre-commercialization. Not until the hardware can be confirmed to be approaching the target capabilities for the initial commercialization stage — not just raw qubit count, but quality of the qubits. Simulation-only during pre-commercialization. May be limited to 50, 48, 44, 40, or even 32 qubits based on the limits of the simulator and circuit depth.
  15. Even initial commercialization will be fairly limited and it could take ten or more subsequent commercialization stages before the full promise of quantum computing can be delivered.
  16. Any efforts at premature commercialization are doomed to be counterproductive and a distraction from research and simulation for prototyping and experimentation.
  17. Hardware and algorithm research and development should be allowed to be on their own, parallel but independent tracks. Very slow progress on hardware must not be permitted to slow algorithm progress.
  18. Double down on pre-commercialization? Double down is a gross understatement. It probably requires a 10X to 50X increase in research, prototyping, and experimentation, for hardware, algorithms, and applications alike. Many more people, much more time, and much more money. Much more.
  19. Pre-commercialization will be the Wild West of quantum computing. Accept that or stay out until true commercialization begins or is imminent. Some people and organizations require excitement and rapid change while others require calm stability — individuals and organizations must decide clearly which they are.
  20. Pre-commercialization could take another 2 to 4 years — or longer.
  21. The initial commercialization stage could take another 2 to 3 years — or longer, beyond pre-commercialization.
  22. The initial commercialization stage, C1.0, might be ready in 4 to 7 years — or longer. That would be production-quality, with alpha, beta and pre-releases available earlier.
  23. Configurable packaged quantum solutions are the best bet for most organizations. Most organizations will not be in a position to design and implement or even understand their own quantum algorithms.
  1. This paper merely summarizes and highlights the issues — see my two previous papers for details
  2. What do commercialization and pre-commercialization mean?
  3. The crux of the problem, the dilemma
  4. Premature commercialization is the problem now facing us
  5. No need for premature Quantum Ready
  6. Great for Fortune 500 companies to do their own research push
  7. Excessive hype is getting the best of us — we’re drinking too much of the Kool-Aid
  8. Current dramatic push for commercialization is a counterproductive distraction
  9. Commercialization of current technology will NOT lead to dramatic quantum advantage
  10. Little if any of the current technology will be relevant in 5–10 years
  11. Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation
  12. Big risk of hitting a Quantum Winter in two years
  13. Taming the hype may be impossible, so we need to push the reality to catch up
  14. Boost research, prototyping, and experimentation — pre-commercialization
  15. We need to push research much harder to try to catch up with the hype
  16. Distinguishing pre-commercialization from commercialization
  17. Avoid premature commercialization
  18. Critical technical gating factors for initial stage of commercialization
  19. Minimum viable product (MVP)
  20. Initial commercialization stage — C1.0
  21. Subsequent commercialization stages
  22. Quantum error correction (QEC) and logical qubits — later, not in C1.0
  23. Near-perfect qubits — good enough for most applications
  24. No, noisy NISQ quantum computers are not viable for commercialization
  25. Configurable packaged quantum solutions
  26. Critical hardware research issues
  27. Critical algorithm and application research areas
  28. Other critical research areas
  29. We need to decouple hardware development and algorithm and application research, prototyping, and experimentation
  30. Focus algorithm and application research, prototyping, and experimentation on simulation
  31. Sure, it can be intoxicating to run your algorithm on an actual quantum computer, but what does it prove and where does it get you?
  32. Hardware engineers should run their own functional tests, stress tests, and benchmark tests
  33. Use simulation to enable algorithm and application research, prototyping, and experimentation to proceed at their own pace independent of the hardware
  34. Functional enhancements and performance and capacity improvements are needed for simulation
  35. Where are all of the 40-qubit quantum algorithms?
  36. Scalability is essential for robust quantum algorithms
  37. Configure simulation to match expected commercial hardware
  38. Configure simulation to match expected improvements — or shortfalls — of the hardware
  39. Research will continue even as commercialization commences
  40. Risk of changes to support software and tools during pre-commercialization — beware of premature commercialization
  41. Risk of business development during pre-commercialization — beware of premature commercialization
  42. Pre-commercialization will be the Wild West of quantum computing — accept that or stay out until true commercialization
  43. When might the initial commercialization stage, C1.0, be available?
  44. IBM 127-qubit Eagle announcement is proof that we’re still in pre-commercialization — and at risk of premature commercialization
  45. My apologies — There’s so much more! See my two papers
  46. Summary and conclusions

This paper merely summarizes and highlights the issues — see my two previous papers for details

This paper won’t delve too deeply into the many issues related to research and commercialization, as these were covered extensively in my preceding two papers, designed and explicitly written to provide the foundation for this paper. The intent here is to get people to focus more on pre-commercialization and less on (premature) commercialization.

What do commercialization and pre-commercialization mean?

Taken from my previous paper on pre-commercialization and commercialization:

  1. Pre-commercialization means research as well as prototyping and experimentation. This will continue until the research advances to the stage where sufficient technology is available to produce a viable product that solves production-scale practical real-world problems and all significant technical issues have been resolved, so that commercialization can proceed with minimal technical risk.
  2. Commercialization means productization after research as well as prototyping and experimentation are complete. Productization means a shift in focus from research to a product engineering team — commercial product-oriented engineers and software developers rather than scientists.

The crux of the problem, the dilemma

Quantum computing has been plodding along for several decades now, finally accelerating in the past few years, so it’s only natural that people would finally like to see the fruits of this labor put into action actually solving practical real-world problems. The desire is very real. It’s palpable. It’s irresistible.

Premature commercialization is the problem now facing us

Research in hardware, algorithms, and applications may be the clear technical challenge ahead of us, but the most immediate problem is the temptation of premature commercialization.

No need for premature Quantum Ready

Not all organizations or all individuals within a particular organization need to get Quantum Ready at the same time or pace, or even at all. It all depends on the needs, interests, criteria, and timing of the particular organization, department, project, team, role, or individual. It’s not a one-size-fits-all proposition. The timing will vary, as will the pace.

Great for Fortune 500 companies to do their own research push

Large organizations with deep pockets for research and advanced development should of course be focusing on quantum computing, but…

  1. Focus on the long term. These research efforts should not be billed as intended to produce production solutions over the next few years.
  2. Demonstrations of the future, not the near term. Research prototypes should be billed as demonstrations of possible production-scale technology in 5–7 years and production deployment in 10–15 years, not as near-term solutions for the next 2–4 years.
  3. Integration of quantum into mainline applications will take years. Integration of quantum technology into larger solutions could take 3–5 (or even 7) years alone, even once the quantum technology itself is ready.
  4. Some elite teams may develop ENIAC-class solutions in less time. Perhaps reaching production deployment in 3–5 years, but most organizations will have to wait another 5–8 years for The FORTRAN Moment, or utilize configurable packaged quantum solutions acquired from outside vendors who do have the necessary elite teams.

Excessive hype is getting the best of us — we’re drinking too much of the Kool-Aid

The tremendous hype surrounding quantum computing is quite intoxicating. It’s one thing to be taken in by the promise of a brilliant future, but it’s an entirely different matter to treat that distant promise as if it were reality today or even the relatively near future.

Current dramatic push for commercialization is a counterproductive distraction

The quantum computing field is plagued with excessive hype, not simply promises of great benefits to come, but even to the point of claims that the benefits are actually here, now, or at least in the very near future.

  1. Quantum Computing May Be Closer Than You Think
  2. Quantum Computing Might Be Here Sooner Than You Think
  3. Quantum Computing May Be A Reality Sooner Than You Think
  4. Quantum Computing Will Change Everything, and Sooner Than You Expect
  5. Quantum Computing is coming to you faster than you think
  6. Quantum computing could be useful faster than anyone expected
  7. A Quantum Future will be here Sooner than You Think
  8. Quantum Computers Could Go Mainstream Sooner than We Think
  9. You’ll Be Using Quantum Computers Sooner Than You Think
  10. And more!

Commercialization of current technology will NOT lead to dramatic quantum advantage

Current quantum computing technology is actually fairly impressive compared to just a few years ago, but is still well short of being suitable for solving production-scale practical real-world problems and achieving even a tiny fraction of dramatic quantum advantage. And this includes both hardware and algorithms.

Little if any of the current technology will be relevant in 5–10 years

Instead of expending inordinate energy on distorting and shoehorning stripped-down algorithms onto current hardware, we should rely on simulation in the near term and focus algorithm and application research on the hardware expected 2–7 years out, which should offer:

  1. Much higher qubit fidelity.
  2. Greater qubit counts.
  3. Finer phase granularity.
  4. Quantum error correction (QEC) and logical qubits.
  5. Lower gate error rates.
  6. Lower measurement error rates.
  7. Greater qubit connectivity.
  8. Capable of supporting quantum Fourier transform (QFT) and quantum phase estimation (QPE).

Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation

Variational methods are quite popular right now, particularly for applications such as quantum computational chemistry, primarily because they work, after a fashion, on current NISQ hardware. But they are far from ideal and only work for smaller problems. In fact, they are a distraction and an absolute technical dead end: variational algorithms will never achieve dramatic quantum advantage, simply by the nature of how they work.
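To make the recommended focus concrete, here is a minimal sketch of the textbook QFT construction, written in Qiskit purely for illustration (the use of Qiskit is my assumption; it also ships a ready-made QFT library circuit). Note how the controlled-phase angles shrink geometrically with circuit width; that is precisely the fine phase granularity that current hardware cannot yet deliver but simulation can:

```python
from math import pi
from qiskit import QuantumCircuit

def qft(n: int) -> QuantumCircuit:
    """Textbook n-qubit QFT from Hadamards plus controlled-phase rotations."""
    qc = QuantumCircuit(n, name=f"QFT({n})")
    for target in range(n - 1, -1, -1):
        qc.h(target)
        # Each additional control contributes a finer rotation:
        # pi/2, pi/4, ..., down to pi/2**(n-1) for the full transform.
        for k, control in enumerate(range(target - 1, -1, -1), start=1):
            qc.cp(pi / 2 ** k, control, target)
    # Reverse qubit order to match the standard QFT output convention.
    for i in range(n // 2):
        qc.swap(i, n - 1 - i)
    return qc

# At n = 40 the smallest rotation is pi / 2**39 -- roughly 6e-12 radians,
# far finer than any current hardware can resolve, but exact under simulation.
print(qft(4).draw())
```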

Big risk of hitting a Quantum Winter in two years

The biggest risk in quantum computing is that an excess of investment flows in with an expectation of payoff (presumed commercialization) within two or three years, followed by a Quantum Winter when vendors and organizations prove unable to deliver on those bold promises before the patience of that investment money is pushed to the limit.

Taming the hype may be impossible, so we need to push the reality to catch up

Sure, we can push back on the hype, to some degree, but that’s a Herculean task that would consume all of our energy. Instead, we can focus on how to advance the technology so that it catches up with at least a fair fraction of the hype.

Boost research, prototyping, and experimentation — pre-commercialization

If there is only one message for the reader to take away from this paper it is to boost research, as well as to boost prototyping and experimentation.

We need to push research much harder to try to catch up with the hype

We may have only another two or three years to begin delivering on at least some of the bold promises of quantum computing before patient capital loses patience.

Distinguishing pre-commercialization from commercialization

Commercialization implies that you have the necessary science, technology, knowledge, and skills, but now you just have to do it. You have the science. You just need to do the engineering. It’s not quite that simple, but that’s the essence.

Avoid premature commercialization

Just to reemphasize the point, premature commercialization is the attempt to engage in commercialization before the necessary science, technology, knowledge, and skills have been developed. In short, don’t do it!

Critical technical gating factors for initial stage of commercialization

Here’s a brief summary of the more critical technical gating factors which must be addressed before quantum computing can be considered ready for commercialization, an expectation for the initial stage of commercialization, C1.0:

  1. Near-perfect qubits. At least four nines of qubit fidelity — 99.99%. Possibly five nines — 99.999%.
  2. Circuit depth. Generally limited by coherence time. No clear threshold at this stage but definitely going to be a critical gating factor. Whether it is 50, 100, 500, or 1,000 is unclear. Significantly more than it is now. Let’s call it 250 for the sake of argument.
  3. Qubit coherence time. Sufficient to support needed circuit depth.
  4. Near-full qubit connectivity. Either full any-to-any qubit connectivity, or qubit fidelity high enough to permit SWAP networks to simulate near-full connectivity.
  5. 64 qubits. Roughly. No precise threshold. Maybe 48 qubits would be enough, or maybe 72 or 80 qubits might be more appropriate. Granted, I think people would prefer to see 128 to 256 qubits, but 64 to 80 might be sufficient for the initial commercialization stage.
  6. Alternative architectures may be required. Especially for more than 64 qubits. Or even for 64, 60, 56, 50, and 48 qubits in order to deal with limited qubit connectivity.
  7. Fine phase granularity to support quantum Fourier transform (QFT) and quantum phase estimation (QPE). 40 qubits = 2⁴⁰ gradations — one trillion gradations — should be the preferred target for C1.0. At least 20 to 30 qubits = 2²⁰ to 2³⁰ gradations — one million to one billion gradations — at a minimum. Even 20 qubits may be a hard goal to achieve. 50 qubits are needed for dramatic quantum advantage. (See the arithmetic sketch after this list.)
  8. Quantum Fourier transform (QFT) and quantum phase estimation (QPE). Needed for quantum computational chemistry and other applications. Needed to achieve quantum advantage through quantum parallelism. Relies on fine granularity of phase.
  9. Conceptualization and methods for calculating shot count (circuit repetitions) for quantum circuits. This will involve technical estimation based on quantum computer science coupled with engineering processes based on quantum software engineering. See my separate paper on this topic.
  10. Moderate improvements to the programming model. Unlikely that a full higher-level programming model will be available soon (before The FORTRAN Moment), but some improvements should be possible.
  11. Moderate library of high-level algorithmic building blocks.
  12. The ENIAC Moment. A proof that something realistic is possible. The first production-scale practical real-world application.
  13. Substantial quantum advantage. Full, dramatic quantum advantage (one quadrillion X speedup) is not so likely, but an advantage of at least a million or a billion is a reasonable expectation — much less will be seen as not really worth the trouble. This will correspond to roughly 20 to 30 qubits in a single Hadamard transform — 2²⁰ = one million, 2³⁰ = one billion. An advantage of one trillion — 2⁴⁰ — may or may not be reachable by the initial stage of commercialization. Worst case, maybe minimal quantum advantage — 1,000X to 50,000X — might be acceptable for the initial stage of commercialization.
  14. 40-qubit quantum algorithms. Quantum algorithms utilizing 32 to 48 qubits should be common. Both the algorithms and hardware supporting those algorithms. 48- to 72-qubit algorithms may be possible, or not — they may require significantly greater qubit fidelity.
  15. Classical quantum simulators for 48-qubit algorithms. The more the better, but that may be the practical limit in the near term. We should push the researchers for 50 to 52 or even 54 qubits of full simulation.
  16. Overall the technology is ready for production deployment.
  17. No further significant research is needed to support the initial commercialization stage product, C1.0. Further research for subsequent commercialization stages, but not for the initial commercialization stage. The point is that research belongs in the pre-commercialization stage, not during commercialization.
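To make the powers of two in items 7 and 13 concrete, here is a quick arithmetic sketch in plain Python (illustrative only, not tied to any particular framework):

```python
# Phase gradations and nominal quantum-parallelism advantage both scale as
# 2**k for a k-qubit register (e.g., a k-qubit Hadamard transform).
for k in (20, 30, 40, 50):
    print(f"{k} qubits -> 2**{k} = {2 ** k:,}")
# 20 qubits -> ~1 million       (minimum substantial advantage)
# 30 qubits -> ~1 billion       (reasonable C1.0 expectation)
# 40 qubits -> ~1 trillion      (preferred C1.0 target)
# 50 qubits -> ~1 quadrillion   (full dramatic quantum advantage)
```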

Minimum viable product (MVP)

One could accept all of the critical technical gating factors for the initial stage of commercialization (C1.0) as the requirements for a minimum viable product (MVP). That would be the preference. But, it may turn out that not all customers or users need all of those capabilities or features. Or, maybe everybody wants and needs all of those capabilities and features, but they simply aren’t technically or economically feasible in a reasonable timeframe. In such situations it may make sense or at least be tempting to define a minimum viable product (MVP) which is substantially less than the more capable desired initial product.

  1. Qubit count. 128 or 256 qubits may be a clear preference, but maybe 72 or 64 or even 48 qubits might be the best that can be achieved — or that initial customers might need — in the desired timeframe.
  2. Qubit fidelity. Five nines or at least 4.5 nines of qubit fidelity might be the preference, but four or even 3.5 nines might be the best that can be achieved — or that initial customers might need — in the desired timeframe.
  3. Connectivity. Full connectivity might not be achievable. Maybe SWAP networks are feasible if qubit fidelity is high enough.
  4. Fineness of phase granularity. Critical for quantum Fourier transform and quantum phase estimation. Sufficient for at least 20 to 30 qubits = 2²⁰ to 2³⁰ gradations in a quantum Fourier transform, rather than the desired 50 to 64.
  5. Quantum Fourier transform and quantum phase estimation resolution. Preferably at least 20 to 30 qubits, but maybe only 16 bits of precision can be achieved — or even only 12, rather than 32 to 64 bits.

Initial commercialization stage — C1.0

The initial commercialization stage would be the very first product offering to result from the commercialization process. Call it C1.0. This initial product would meet all of the critical technical gating factors detailed in a preceding section of this paper, although it is possible that there might be a minimum viable product (MVP) which doesn’t meet all of those factors.

Subsequent commercialization stages

Research continues even as the technical details of the initial commercialization stage, C1.0, stabilize. This ongoing research will form the basis for subsequent commercialization stages.

Quantum error correction (QEC) and logical qubits — later, not in C1.0

As valuable and essential as quantum error correction (QEC) and logical qubits will be for quantum computing, it seems unlikely that they will be ready for the initial commercialization stage, C1.0. In fact, it will likely take several additional stages of research before the technology advances far enough to be ready for commercialization. My best guess is for it to debut in C3.0, the third major commercialization stage.

Near-perfect qubits — good enough for most applications

Although quantum error correction (QEC) and logical qubits are the ideal, most quantum algorithms and applications won’t actually need truly perfect logical qubits — a fairly tiny error may be acceptable. Near-perfect qubits may in fact be good enough in most situations.

No, noisy NISQ quantum computers are not viable for commercialization

Absent support for quantum error correction (QEC), only near-perfect qubits would supply the qubit fidelity needed to support larger and more complex quantum algorithms and to achieve substantial quantum advantage, such as supporting quantum Fourier transform (QFT) and quantum phase estimation (QPE) which are required for quantum computational chemistry.

Configurable packaged quantum solutions

Design of production-scale quantum algorithms and development of production-scale quantum applications is likely to remain well beyond the technical ability of most organizations for quite some time. Rather, I expect that many organizations will be able to get started in quantum computing by acquiring and deploying configurable packaged quantum solutions which provide them with all of the benefits of quantum computing without the need to go anywhere near a raw quantum algorithm.

Critical hardware research issues

A separate paper lays out much of the needed research for quantum computing hardware. This paper only briefly summarizes the critical areas:

  1. Need for a more ideal qubit technology. Current qubit technology demonstrates quantum computing capabilities, but is incapable of delivering on the full promises of quantum computing.
  2. Limited qubit capacity. Need a lot more qubits.
  3. Limited qubit fidelity. Too many errors.
  4. Limited qubit coherence. Limits circuit depth.
  5. Limited circuit depth. Basically limited by qubit coherence.
  6. Limited gate fidelity. Too many errors executing quantum logic gates.
  7. Limited granularity of phase. Need fine granularity for quantum Fourier transform (QFT) and quantum phase estimation (QPE).
  8. Limited measurement fidelity. Too many errors measuring a qubit to get results.
  9. Unable to support quantum Fourier transform (QFT) and quantum phase estimation (QPE). Rely on qubit fidelity, fine granularity of phase, gate fidelity, circuit depth, and measurement fidelity. Needed for application categories such as quantum computational chemistry.
  10. Unable to achieve substantial quantum advantage. A dramatic performance advantage over classical computing is the only point of even pursuing quantum computing. This will require a combination of sufficient hardware capabilities with quantum algorithms which exploit those hardware capabilities.

Critical algorithm and application research areas

A separate paper lays out much of the needed research for quantum algorithms and quantum applications. This paper only briefly summarizes the critical areas:

  1. Need for a higher-level programming model. Current programming model is too low-level, too primitive, too much like classical machine and assembly language.
  2. Need for a robust collection of high-level algorithmic building blocks.
  3. Need for a high-level programming language. Tailored to the needs of quantum algorithms and quantum applications.
  4. Need for a robust collection of example algorithms. Which demonstrate production-scale quantum parallelism and show how practical real-world problems can be easily transformed into quantum algorithms and applications.
  5. Need for algorithm debugging capabilities. Difficult enough for relatively simple quantum algorithms, virtually impossible for complex quantum algorithms.
  6. Need for configurable packaged quantum solutions. Generalized applications for each major application category that allow the developer to supply input data and input parameters in a simple form that can readily and automatically be transformed into adaptations of the pre-written quantum algorithms and application framework. Still requires a lot of work, but not expertise in quantum circuits.
  7. Research in specific algorithms for each application category.

Other critical research areas

Besides hardware, algorithms, and applications, there are a number of other areas of critical and urgent research needed to fully exploit the promised potential of quantum computing. From my paper, here is the summary list of the areas:

  1. Physics.
  2. Hardware.
  3. Firmware (see: Hardware).
  4. Hardware support.
  5. Debugging.
  6. Classical quantum simulators.
  7. Quantum information science in general.
  8. Software. Support software, tools.
  9. Quantum software engineering. A new field.
  10. Quantum computer science. A new field.
  11. Cybersecurity.
  12. Quantum algorithm support.
  13. Quantum algorithms.
  14. Quantum application support.
  15. Quantum applications.
  16. Quantum application solutions. Particularly configurable packaged quantum solutions.
  17. Quantum general artificial intelligence.
  18. Quantum advantage and quantum supremacy.
  19. Other areas of QSTEM research.

We need to decouple hardware development and algorithm and application research, prototyping, and experimentation

Trying to prototype and experiment with algorithms and applications on woefully inadequate hardware is an exercise in futility. There’s another, better path: simulation.

Focus algorithm and application research, prototyping, and experimentation on simulation

Simulation can be slow and is limited to 40 to 50 or so qubits, but is much more reliable and ultimately more efficient and productive than attempting to prototype and experiment with hardware that simply isn’t up to the task.
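The 40-to-50-qubit ceiling isn’t arbitrary; it falls directly out of the memory needed for full statevector simulation. A rough estimate in plain Python (illustrative only, independent of any particular simulator):

```python
# An n-qubit state has 2**n complex amplitudes; at double precision each
# amplitude takes 16 bytes, so memory doubles with every added qubit.
def statevector_gib(n: int) -> float:
    return (2 ** n) * 16 / 2 ** 30  # bytes -> GiB

for n in (32, 40, 48):
    print(f"{n} qubits: {statevector_gib(n):,.0f} GiB")
# 32 qubits:        64 GiB -- a beefy workstation
# 40 qubits:    16,384 GiB -- a large cluster
# 48 qubits: 4,194,304 GiB -- roughly 4 PiB, beyond today's largest machines
```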

Sure, it can be intoxicating to run your algorithm on an actual quantum computer, but what does it prove and where does it get you?

There are very few real algorithms that use more than about 23 qubits on a real quantum computer at present. That is likely because this is approximately the limit of current hardware, particularly with respect to qubit fidelity, coherence time, circuit depth, gate fidelity, and measurement errors.

Hardware engineers should run their own functional tests, stress tests, and benchmark tests

The quantum hardware engineers should run their own functional tests, stress tests, and benchmark tests to confirm that their hardware is performing as expected. No need to slow down algorithm and application research, prototyping, and experimentation just to test the hardware in a rather inefficient manner.

Use simulation to enable algorithm and application research, prototyping, and experimentation to proceed at their own pace independent of the hardware

There’s no good reason to slow down algorithm and application research, prototyping, and experimentation or to gate them by hardware research.

  1. Hardware limits don’t interfere with progress.
  2. Simulation and analysis software can alert algorithm and application developers to bugs and other issues in their algorithms and applications. Debugging on actual quantum hardware is very problematic. (See the sketch below.)
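As a concrete illustration of point 2, a minimal sketch (again assuming Qiskit for illustration): under simulation you can inspect the exact quantum state mid-development, something physically impossible on real hardware, where measurement collapses the state.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)  # the gate under test

# Exact amplitudes -- available only under simulation, never on hardware.
state = Statevector(qc)
print(state.probabilities_dict())  # expect {'00': 0.5, '11': 0.5} for a Bell pair
```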

Functional enhancements and performance and capacity improvements are needed for simulation

Simulation works reasonably well today. Yes, significant functional enhancements and performance and capacity improvements would be very beneficial, but simulators are generally more usable than actual hardware at the moment — and for the indefinite future.

Where are all of the 40-qubit quantum algorithms?

Indeed, where are they? There doesn’t appear to be any technically valid reason that we don’t see a plethora of 40-qubit or even 32-qubit algorithms, other than the mere fact that it’s so intoxicating to run algorithms on actual quantum hardware. An increased focus on simulation should improve the situation.

Scalability is essential for robust quantum algorithms

We want to be able to develop quantum algorithms and applications today that will run on future hardware as it becomes available. We definitely don’t want to have to redesign and reimplement quantum algorithms and quantum applications every time there is even a modest advance in hardware capabilities. Scalability is the key, and in fact automatic scalability.
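What automatic scalability can look like in practice, as a minimal sketch (Qiskit is again an assumption for illustration; build_algorithm and its simple kernel are hypothetical): the algorithm is a generator parameterized by qubit count, so the identical code runs at simulator scale today and on larger hardware later, unchanged.

```python
from qiskit import QuantumCircuit

def build_algorithm(num_qubits: int) -> QuantumCircuit:
    """Hypothetical scalable kernel: uniform superposition + entangling chain."""
    qc = QuantumCircuit(num_qubits)
    qc.h(range(num_qubits))            # width scales with the parameter
    for i in range(num_qubits - 1):    # so does the entangling structure
        qc.cx(i, i + 1)
    qc.measure_all()
    return qc

small = build_algorithm(8)    # fits comfortably on today's simulators
large = build_algorithm(64)   # the same code, targeting future hardware
```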

Configure simulation to match expected commercial hardware

Simulation can be configured to match any hardware configuration. The default should be the target hardware configuration for the initial commercialization stage of quantum computing, C1.0, with the only significant difference being fewer qubits, since simulation is limited to 40 to 50 or so qubits. The parameters to configure include (see the configuration sketch after this list):

  1. Qubit count.
  2. Coherence time.
  3. Circuit depth.
  4. Connectivity.
  5. Phase granularity.
  6. Qubit fidelity.
  7. Gate fidelity.
  8. Measurement fidelity.

Configure simulation to match expected improvements — or shortfalls — of the hardware

Simulation is flexible and can be configured to match any hardware configuration, limited only by simulator capacity of roughly 40 to 50 or so qubits.

Research will continue even as commercialization commences

Research for any vibrant field is never-ending. There’s always something new to discover, and some new problem to be overcome.

Risk of changes to support software and tools during pre-commercialization — beware of premature commercialization

Literally every aspect of quantum computing technology during pre-commercialization is subject to change. Hardware, software, tools, algorithms, applications, programming models, knowledge, methods — you name it, all of it has a high probability of changing by the time true commercialization begins. This especially includes support software and tools.

Risk of business development during pre-commercialization — beware of premature commercialization

The same comments from the preceding section apply to business development. All factors involved in business decisions, including but not limited to the technology, are likely to change, evolve, and even radically change as pre-commercialization progresses.

Pre-commercialization will be the Wild West of quantum computing — accept that or stay out until true commercialization

There will be plenty of temptations to blindly leap into quantum computing during pre-commercialization, but it is better to look before you leap, or in fact not to leap at all, waiting for true commercialization to actually begin or at least be imminent.

When might the initial commercialization stage, C1.0, be available?

There’s no clarity or certainty as to the timeframe for commercialization of quantum computing, but it might be illuminating to speculate about maximum, nominal, and minimum paths to both pre-commercialization and initial commercialization — C1.0. These numbers are somewhat arbitrary, but hopefully helpful to bound expectations.

  • Pre-commercialization. Minimal: 2 years. Nominal: 4 years. Maximal: 10 years.
  • Commercialization. Minimal: 2 years. Nominal: 3 years. Maximal: 5 years.
  • Total. Minimal: 4 years. Nominal: 7 years. Maximal: 15 years.

IBM 127-qubit Eagle announcement is proof that we’re still in pre-commercialization — and at risk of premature commercialization

Late-breaking news: just as I was finishing up this paper, I saw that IBM has finally announced the availability of their 127-qubit Eagle quantum computer system. That’s a major accomplishment and a big step forward, finally breaking the 100-qubit barrier. But it’s not yet a commercial product, and despite this achievement and progress it is still woefully short of what is needed for true commercialization of quantum computing. Hence, it is solid evidence that we’re still deep in the realm of pre-commercialization.

My apologies — There’s so much more! See my two papers

I’ve tried to keep this paper as short as possible, so it’s limited to summaries and some highlights. The full details are in my two immediately preceding papers, which were designed and explicitly written to provide the foundation for this paper.

Summary and conclusions

  1. Despite many advances, it still seems premature to attempt to commercialize quantum computing at this time — or any time soon.
  2. There has been great progress for quantum computing over the past few years.
  3. But significant deficits, shortfalls, limitations, and obstacles remain.
  4. Commercialization is not underway, not imminent, and not near.
  5. There is too much talk as if commercialization were in fact underway, imminent, or near.
  6. We’re in the pre-commercialization stage for the indefinite future.
  7. Much more research is needed before the process of commercialization can even begin.
  8. Generally focus on simulation rather than running on actual quantum computing hardware, since current hardware will rarely represent the ultimate target hardware to be available during commercialization, or in subsequent stages of commercialization.
  9. Quantum algorithms should be designed to be automatically scalable, so that they run on future hardware without change and can be simulated with fewer qubits than larger-capacity hardware will provide.
  10. Much prototyping and experimentation are needed for algorithms and applications, but focus should be on simulation since current hardware is too limited and too much of an unproductive distraction.
  11. The precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses.
  12. Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation.
  13. Quantum error correction (QEC) will come in a later stage of commercialization — near-perfect qubits should be good enough for many applications.
  14. Simulators should be configured for what is expected once commercialization has occurred, or for some future hardware at a subsequent stage of commercialization.
  15. We need a bold push for 40-qubit algorithms — as a start, with more complex algorithms after that. Under simulation, for now.
  16. We need a bold push for automatically and provably scalable algorithms.
  17. We need a bold push for greater quantum parallelism and quantum advantage.
  18. We need a bold push for substantial quantum advantage, even if full, dramatic quantum advantage is still a more distant future.
  19. Only much later in pre-commercialization — if not until initial commercialization, when the hardware has matured and stabilized — should prototyping and experimentation be attempted on actual hardware. Even then, more as a final test rather than as a primary design and development mode.
  20. Although hardware is a very limiting factor, algorithms and applications are a much greater limiting factor.
  21. Attempts at commercialization at this stage are unwarranted.
  22. Commercialization must wait until all of the technical uncertainties of hardware, algorithms, and applications are clearly identified and resolved — which is the purpose of pre-commercialization.
  23. Risk of changes to support software and tools during pre-commercialization. Accept that everything can and will change, more than once. Revision, rework, and restarts should be the expected norm during pre-commercialization.
  24. Risk of business development during pre-commercialization. All factors related to business deals, technology and business alike, should be expected to change, more than once. Revision, rework, and restarts should be the expected norm during pre-commercialization.
  25. Premature commercialization does more harm than good. An unproductive distraction. Algorithm research, prototyping, and experimentation should be focused on simulating the hardware expected for eventual commercialization.
  26. “Double down on pre-commercialization” is a gross understatement. It probably requires a 10X to 50X increase in research, prototyping, and experimentation, for hardware, algorithms, and applications alike. Many more people, much more time, and much more money. Much more.
  27. Pre-commercialization will be the Wild West of quantum computing. Accept that or stay out until true commercialization begins or is imminent.
  28. Pre-commercialization could take another 2 to 4 years — or longer.
  29. The initial commercialization stage could take another 2 to 3 years — or longer — beyond pre-commercialization.
  30. The initial commercialization stage, C1.0, might be ready in 4 to 7 years — or longer. That would be production-quality, with alpha, beta and pre-releases available earlier.
  31. Configurable packaged quantum solutions are the best bet for most organizations. Most organizations will not be in a position to design and implement or even understand their own quantum algorithms.
