Risks of Premature Commercialization of Quantum Computing
Premature commercialization of quantum computing will be counterproductive and lead to unnecessary disenchantment with the long-term potential for quantum computing. At the extreme, this disenchantment could lead to a Quantum Winter in two to three years due to failure to meet excessive expectations. Despite the dramatic progress of quantum computing in recent years, many significant obstacles remain before commercialization can even begin. Much more basic research, prototyping, and experimentation are needed in hardware, support software and tools, programming models, algorithmic building blocks, algorithms, and quantum applications before quantum computing will be ready to be exploited by organizations seeking to develop and deploy production-scale quantum applications for practical real-world problems. This pre-commercialization is required before commercialization can even begin to develop the specs for what commercial quantum computing products should look like, including details on performance, capacity, and functional capabilities. This informal paper discusses the risks and briefly proposes a path forward that can dramatically accelerate the advance of quantum computing. That path is discussed in greater detail in other papers.
If all of this, and this paper in general, feels way too negative, there is a positive message, a way out, a way to avoid this mess and a potentially looming disaster:
- Double down on pre-commercialization — more basic research, more prototyping, and more experimentation.
- Don’t even think about commercialization until we have answers to all of the important questions needed to succeed at commercialization.
But the focus of this paper is the risks if you don’t do that: if you try to skip too much of the basic research, if you don’t do enough prototyping, and if you don’t do enough experimentation to confirm that you really do have the important questions fully answered, giving you a solid foundation to begin a professional product engineering effort for commercialization.
Just to make sure that we start out on the right note:
- Premature commercialization is a really, really, REALLY bad idea.
That said, there are indeed exceptions, products or services which can actually thrive during pre-commercialization, namely equipment, software, tools, and services which enable pre-commercialization, focused on research, prototyping, and experimentation. Anything but production deployment.
A fair amount of the material in this paper was previously published in several of my earlier papers, but is coalesced and extended here to focus on premature commercialization itself. The previous papers:
Need for more extensive research:
- Essential and Urgent Research Areas for Quantum Computing
- https://jackkrupansky.medium.com/essential-and-urgent-research-areas-for-quantum-computing-302172b12176
Commercialization, pre-commercialization, and premature commercialization:
- Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization
- https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a
Emphasizing the need to double down on pre-commercialization before proceeding to commercialization, and to avoid premature commercialization:
- Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization
- https://jackkrupansky.medium.com/prescription-for-advancing-quantum-computing-much-more-rapidly-hold-off-on-commercialization-but-28d1128166a
Some of the material from those papers may be included here to make this paper easier to read and to provide adequate context so that this paper can generally stand on its own.
Topics discussed in this paper:
- In a nutshell
- What is premature commercialization?
- Overview
- The overall model
- Brief summary of pre-commercialization
- Research, more research, and even more research
- Premature commercialization risk
- Summary of risks of premature commercialization
- Overall the technology is NOT ready for production deployment
- No production deployment of quantum computing during pre-commercialization
- No great detail on commercialization proper here, since the focus is on pre-commercialization
- For more on commercialization itself
- The crux of the problem, the dilemma
- Premature commercialization is the problem now facing us
- No need for premature Quantum Ready
- Great for Fortune 500 companies to do their own research push
- Excessive hype is getting the best of us — we’re drinking too much of the Kool-Aid
- Current dramatic push for commercialization is a counterproductive distraction
- Commercialization of current technology will NOT lead to dramatic quantum advantage
- Premature for any significant quantum advantage on any consistent basis across application categories
- Little if any of the current technology will be relevant in 5–10 years
- Wait a few years for the software technology to mature and evolve before getting started
- Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation
- Risk of backlash
- Big risk of hitting a Quantum Winter in two to three years
- Taming the hype may be impossible, so we need to push the reality to catch up
- Boost research, prototyping, and experimentation — pre-commercialization
- We need to push research much harder to try to catch up with the hype
- Distinguishing pre-commercialization from commercialization
- Avoid premature commercialization
- Critical technical gating factors for initial stage of commercialization
- Minimum viable product (MVP)
- No, noisy NISQ quantum computers are not viable for commercialization
- 48 fully-connected near-perfect qubits as the sweet spot goal for near-term quantum computing
- Critical hardware research issues
- Critical algorithm and application research areas
- Other critical research areas
- We need to decouple hardware development and algorithm and application research, prototyping, and experimentation
- Focus algorithm and application research, prototyping, and experimentation on simulation
- Sure, it can be intoxicating to run your algorithm on an actual quantum computer, but what does it prove and where does it get you?
- Hardware engineers should run their own functional tests, stress tests, and benchmark tests
- Use simulation to enable algorithm and application research, prototyping, and experimentation to proceed at their own pace independent of the hardware
- Functional enhancements and performance and capacity improvements are needed for simulation
- Where are all of the 40-qubit quantum algorithms?
- Scalability is essential for robust quantum algorithms
- Don’t imagine that scalability of quantum algorithms and applications is free, cheap, easy, obvious, or automatic — much hard work is needed during pre-commercialization
- Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications
- Configure simulation to match expected commercial hardware
- Configure simulation to match expected improvements — or shortfalls — of the hardware
- Research will continue even as commercialization commences
- Exception: Commercial viability of capabilities which support pre-commercialization
- Early commercial opportunities for selling tools and services to enable and facilitate research, prototyping, and experimentation
- Exception: Random number-based applications are actually commercially viable today
- Even for exceptions for commercialization during pre-commercialization, be especially wary
- Keep cost and service level agreements in mind even for the rare exceptions during pre-commercialization
- Beware of any capabilities available during pre-commercialization which might seem as if they are likely to apply to commercialization as well
- Products which enable quantum computing vs. products which are enabled by quantum computing
- Potential for commercial viability of quantum-enabling products during pre-commercialization
- Preliminary quantum-enabled products during pre-commercialization
- Risk of changes to support software and tools during pre-commercialization — beware of premature commercialization
- Risk of business development during pre-commercialization — beware of premature commercialization
- Quantum computing is still in the realm of the lunatic fringe
- Quantum Ready — It’s never too early for The Lunatic Fringe
- Quantum Aware is fine, but be careful about Quantum Ready
- Expect radical change — continually update vision of what quantum computing will look like
- Quantum computing is still a mere laboratory curiosity, not ready for production deployment
- Quantum computing is still more suited for elite technical teams than average, normal technical teams
- Pre-commercialization will be the Wild West of quantum computing — accept that or stay out until true commercialization
- Pre-commercialization is about constant change while commercialization is about stability and carefully controlled and compatible evolution
- Customers and users prefer carefully designed products, not cobbled prototypes
- Customers and users will seek the stability of methodical commercialization, not the chaos of pre-commercialization
- Quantum ecosystem
- Early, preliminary development of quantum ecosystem during pre-commercialization
- When might the initial commercialization stage, C1.0, be available?
- IBM 127-qubit Eagle announcement is proof that we’re still in pre-commercialization — and at risk of premature commercialization
- Must assure that there are no great unanswered questions hanging over the heads of the commercialization teams
- My apologies — There’s so much more! See my three papers
- Grand finale — So what do we do now??
- My original proposal for this topic
- Summary and conclusions
In a nutshell
- General risks of premature commercialization of quantum computing…
- The technology just isn’t ready. Too much is missing. Too much is too primitive. Too much research is incomplete. Too much needs further research. Too much is incapable of delivering on the many wild promises that have been made.
- Risk of disenchantment and loss of project funding and commitment.
- Failure to complete projects.
- Failure of completed projects to meet expectations.
- Critical project failures now could make it harder to fund credible projects in the future.
- Risk of backlash. Disenchantment could lead to pushback on quantum computing. Denial of the potential for quantum computing.
- Surest path to an early quantum winter. By hitting a critical mass of disenchantment due to unmet expectations.
- Constant rework needed as the technology constantly and radically evolves. That’s the nature of the pre-commercialization stage. It’s a good thing at this stage, but not good for commercialization.
- The technology is changing and evolving rapidly, so it is likely to become obsolete relatively soon, making it a bad bet in its current state.
- Insufficient research. Trying to skip too much of the needed research.
- Insufficient prototyping. Trying to skip too much of the needed prototyping.
- Insufficient experimentation. Trying to skip too much of the needed experimentation.
- Premature for any significant quantum advantage on any consistent basis across application categories. Very little, if any, quantum advantage available in the near term.
- Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications. Capabilities, tools, and methods are too primitive. Benchmarking not well developed.
- General comments…
- Commercialization of current technology will NOT lead to dramatic quantum advantage. The hardware is too primitive. Much research is needed.
- Little if any of the current technology will be relevant in 5–10 years. Better to focus algorithm research on expected hardware 2–7 years out and rely on simulation until the hardware is ready.
- Generally focus on simulation rather than running on actual quantum computing hardware since current hardware will rarely represent the ultimate target hardware to be available during commercialization. Or in subsequent stages of commercialization.
- Quantum algorithms should be designed to be automatically scalable to run on future hardware without change. Also to permit them to be simulated with fewer qubits than will be available on larger capacity hardware.
- Don’t imagine that scalability of quantum algorithms and applications is free, cheap, easy, obvious, or automatic. Much hard work is needed. And it needs to be done during pre-commercialization. Attempting scalability during commercialization is a really bad idea. All of the issues need to be identified and worked out before commercialization even begins.
- It’s premature to even begin commercialization. The technology just isn’t ready. Not even close. Both hardware and algorithms, and applications.
- Much pre-commercialization work remains before commercialization can begin.
- Boost research, prototyping, and experimentation — pre-commercialization.
- Much research remains to fully characterize and resolve many technical obstacles. Many engineering challenges don’t have sufficient research results to guide them. Both hardware and algorithms, and applications.
- Hardware may seem to be the primary limiting factor, but algorithms are an even greater limiting factor. We can simulate 32 and 40-qubit algorithms, but they’re nowhere to be found.
- The precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses. But we can make some good, tentative guesses now.
- Variational methods are an unproductive distraction and technical dead end — the focus should be on quantum Fourier transform (QFT) and quantum phase estimation (QPE). It will take years for the hardware to support this, but simulation can be used in the meantime.
- Quantum error correction (QEC) and logical qubits will come in a later stage of commercialization — near-perfect qubits should be good enough for many applications.
- Prototyping and experimentation for quantum algorithms and quantum applications should focus on simulation configured to match the hardware expected at the time of initial commercialization rather than focusing on current, very limited hardware.
- There should be no expectation of running or even testing algorithms or applications for 64 or more qubits during pre-commercialization. Not until the hardware can be confirmed to be approaching the target capabilities for the initial commercialization stage — not just raw qubit count, but quality of the qubits. Simulation-only during pre-commercialization. May be limited to 50, 48, 44, 40, or even 32 qubits based on the limits of the simulator and circuit depth.
- Even initial commercialization will be fairly limited and it could take ten or more subsequent commercialization stages before the full promise of quantum computing can be delivered.
- Any efforts at premature commercialization are doomed to be counterproductive and a distraction from research and simulation for prototyping and experimentation.
- Hardware and algorithm research and development should be allowed to be on their own, parallel but independent tracks. Very slow progress on hardware must not be permitted to slow algorithm progress.
- Double down on pre-commercialization? Double down is a gross understatement. It probably requires a 10X to 50X increase in research, prototyping, and experimentation. Both hardware and algorithms, and applications. Many more people, and much more time and money. Much more.
- Pre-commercialization will be the Wild West of quantum computing. Accept that or stay out until true commercialization begins or is imminent. Some people and organizations require excitement and rapid change while others require calm stability — individuals and organizations must decide clearly which they are.
- Pre-commercialization could take another 2 to 4 years — or longer.
- The initial commercialization stage could take another 2 to 3 years — or longer, beyond pre-commercialization.
- The initial commercialization stage, C1.0, might be ready in 4 to 7 years — or longer. That would be production-quality, with alpha, beta and pre-releases available earlier.
- Configurable packaged quantum solutions are the best bet for most organizations. Most organizations will not be in a position to design and implement or even understand their own quantum algorithms.
- Quantum-enabled products. Products which are enabled by quantum computing. Such as quantum algorithms, quantum applications, and quantum computers themselves.
- Quantum-enabling products. Products which enable quantum computing. Such as software tools, compilers, classical quantum simulators, and support software. They run on classical computers and can be run even if quantum computing hardware is not available. Also includes classical hardware components and systems, as well as laboratory equipment.
- There are indeed exceptions: products or services which can actually thrive during pre-commercialization. Namely equipment, software, tools, and services which enable pre-commercialization, focused on research, prototyping, and experimentation. Anything but production deployment. Generally, quantum-enabling products.
- Even for exceptions for commercialization during pre-commercialization, be especially wary. Plenty of potential gotchas.
- Keep cost and service level agreements in mind even for the rare exceptions during pre-commercialization.
- The overall message is twofold…
- Double down on pre-commercialization — more basic research, more prototyping, and more experimentation.
- Don’t even think about commercialization until we have answers to all of the important questions needed to succeed at commercialization.
- Assure that there are no great unanswered questions hanging over the heads of professional product engineering teams that could interfere with their ability to develop commercial products by slowing their progress or putting their success at risk. Any needed research, prototyping, or experimentation must be complete and out of the way before commercialization can begin. No great questions can remain unanswered once commercialization commences.
What is premature commercialization?
Premature commercialization is an attempt to commercialize a technology before the basic research has been completed, and before there has been sufficient prototyping and experimentation with the new technology to confirm that it really is ready for commercialization. Ready means having enough answers to enough questions that detailed specifications can be written for what commercial products should look like, so that professional product engineering teams can proceed to develop commercial products without great unanswered questions hanging over their heads, slowing their progress or risking their success.
Overview
Premature commercialization of quantum computing will be counterproductive and lead to unnecessary disenchantment with the long-term potential for quantum computing.
At the extreme, this disenchantment could lead to a Quantum Winter due to failure to meet excessive expectations.
Despite the dramatic progress of quantum computing in recent years, many significant obstacles remain before commercialization can even begin.
This informal paper will discuss the obstacles and propose a path forward which can dramatically accelerate the advance of quantum computing.
Much more basic research, prototyping, and experimentation are needed in hardware, support software and tools, programming models, algorithmic building blocks, algorithms, and quantum applications before quantum computing will be ready to be exploited by organizations seeking to develop and deploy production-scale quantum applications for practical real-world problems. This pre-commercialization is required before commercialization can even begin.
Premature commercialization will be an extreme distraction, counterproductive, and outright harmful. Boosting research, for both hardware and algorithms, is essential. Algorithm and application research should focus on simulation configured to match the hardware expected at the time of commercialization, rather than attempting to distort and shoehorn advanced, complex algorithms into woefully inadequate near-term hardware.
This paper won’t delve too deeply into the many issues related to research and commercialization, as these were covered extensively in my preceding three papers, designed and explicitly written to provide the foundation for this paper:
Need for more extensive research:
- Essential and Urgent Research Areas for Quantum Computing
- https://jackkrupansky.medium.com/essential-and-urgent-research-areas-for-quantum-computing-302172b12176
Commercialization, pre-commercialization, and premature commercialization:
- Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization
- https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a
Emphasizing the need to double down on pre-commercialization before proceeding to commercialization, and to avoid premature commercialization:
- Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization
- https://jackkrupansky.medium.com/prescription-for-advancing-quantum-computing-much-more-rapidly-hold-off-on-commercialization-but-28d1128166a
The purpose of this paper is to pull together and briefly summarize and highlight portions of those three papers that emphasize premature commercialization.
The overall model
The overall model for the development of a technology has three stages:
- Pre-commercialization. Research, prototyping, and experimentation. Start with some ideas and see what can be done with them. The end result is not a product, but sufficient answers to all of the relevant questions so that a professional product engineering team can write specifications for commercial products and services.
- Premature commercialization. A false stage before pre-commercialization has been completed, where people imagine or fantasize that all of the preliminary foundational research, prototyping, and experimentation has been completed when in fact it has not been completed or in some cases not even started. The results will be rather less than impressive if not outright disastrous.
- Commercialization. All of the pre-commercialization work has been completed and enough answers to enough questions are readily available so that a professional product engineering team can proceed to writing specifications for products and services and proceed to implementing, delivering, and even deploying them with minimal technical risk.
Brief summary of pre-commercialization
Three main activities are occurring during the pre-commercialization stage of a new technology:
- Research. The basic research needed to understand the initial ideas and concepts to develop the rudiments of the technology.
- Prototyping. Attempting to use the rudimentary new technology to develop portions of applications which are representative of how the technology might be used when fully developed and commercialized. Provides feedback into further research.
- Experimentation. Using the new technology and prototypes to gather data and get a sense of how well the technology is performing. Including some preliminary degree of performance and capacity testing and benchmarking. Provides additional feedback into further research.
To be clear, none of the work products of pre-commercialization will be suitable as commercial products or services, or for operational production deployment. The only value of such work products is to gain knowledge about the technology and to feed back into further research.
For more detail on pre-commercialization, see my paper:
- Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization
- https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a
Research, more research, and even more research
The single greatest need at this stage of quantum computing is more research at all levels and in all areas. Theoretical, experimental, and applied.
I’m hopeful that the more research that people see being funded, the more they will viscerally realize that the technology simply isn’t even close to being ready for prime-time commercial production deployment — commercialization.
For more in-depth details on needed research, see my paper:
- Essential and Urgent Research Areas for Quantum Computing
- https://jackkrupansky.medium.com/essential-and-urgent-research-areas-for-quantum-computing-302172b12176
Premature commercialization risk
A key motivation for this paper is to attempt to avoid the incredible technical and business risks that would come from premature commercialization of an immature technology — trying to create a commercial product before the technology is ready, feasible, or economically viable.
Quantum computing has come a long way over several decades and especially in the past few years, but still has a long way to go before it is ready for prime-time production deployment of production-scale practical real-world applications.
So, this paper focuses on pre-commercialization — all of the work that needs to be completed before a product engineering team can even begin serious, low-risk planning and development of a commercial product.
Summary of risks of premature commercialization
- The technology just isn’t ready. Too much is missing. Too much is too primitive. Too much research is incomplete. Too much needs further research. Too much is incapable of delivering on the many wild promises that have been made.
- Risk of disappointment.
- Risk of disenchantment.
- Risk of backlash. Disenchantment could lead to pushback on quantum computing. Denial of the potential for quantum computing.
- Risk of loss of enthusiasm and energy.
- Risk of loss of project funding.
- Risk of loss of commitment to the cause and promise of quantum computing.
- Failure to complete projects.
- Failure of completed projects to meet expectations.
- Critical project failures now could make it harder to fund credible projects in the future.
- Surest path to an early quantum winter. By hitting a critical mass of disenchantment due to unmet expectations.
- Constant rework needed as the technology constantly and radically evolves. That’s the nature of the pre-commercialization stage. It’s a good thing at this stage, but not good for commercialization.
- The technology is changing and evolving rapidly, so it is likely to become obsolete relatively soon, making it a bad bet in its current state.
- Insufficient research. Trying to skip too much of the needed research.
- Insufficient prototyping. Trying to skip too much of the needed prototyping.
- Insufficient experimentation. Trying to skip too much of the needed experimentation.
- Insufficient knowledge. Too many unanswered questions. Or questions that never even got asked. Pre-commercialization surfaces and answers all relevant questions.
- Premature for any significant quantum advantage on any consistent basis across application categories. Very little, if any, quantum advantage available in the near term.
- Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications. Capabilities, tools, and methods are too primitive. Benchmarking not well developed.
Overall the technology is NOT ready for production deployment
Tremendous progress is being made in quantum computing, but we have very far yet to go. We don’t even know how far we have to go. The simple truth is that wherever we are or however far we have to go, overall quantum computing is simply not ready for production deployment. Prototyping and experimentation, yes, but production deployment, no.
No production deployment of quantum computing during pre-commercialization
Saying it again, more clearly, quantum computing is still too immature to even consider production deployment during pre-commercialization.
No great detail on commercialization proper here, since the focus is on pre-commercialization
Commercialization itself is discussed in this paper to some degree, but not as a main focus since the primary intent is to highlight what work should be considered during pre-commercialization vs. product engineering for commercialization.
The primary focus here is on pre-commercialization — the need for deep research, prototyping, and experimentation which answers all of the hard questions so that product engineering teams can focus on engineering rather than research — and on being methodical rather than relying on experimentation and trial and error.
The other reason to discuss commercialization to any extent in this paper is simply to clarify what belongs in commercialization rather than in pre-commercialization — to avoid the premature commercialization risk discussed in the preceding sections.
For more on commercialization itself
As mentioned, commercialization itself is not a primary focus of this paper. For more detail on commercialization itself, see my two papers:
- Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization
- https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a
- Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization
- https://jackkrupansky.medium.com/prescription-for-advancing-quantum-computing-much-more-rapidly-hold-off-on-commercialization-but-28d1128166a
The crux of the problem, the dilemma
Quantum computing has been plodding along for several decades now, finally accelerating in recent years, so it’s only natural that people would finally like to see the fruits of this labor put into action actually solving practical real-world problems. The desire is very real. It’s palpable. It’s irresistible.
But, despite the advances of recent years, the technology is far from ready for prime-time deployment. The hardware isn’t ready. Algorithms aren’t ready. Applications aren’t ready. Nothing’s ready. Except all of the hype.
Much research is still required. Many technical questions and issues remain unresolved. Hardware, algorithms, and applications. And programming models, algorithmic building blocks, design patterns, frameworks, and programming languages. And support software and tools. You name it, much more research is required.
Much prototyping and experimentation is also needed, but it is generally still premature to do so until much more of the foundational research has occurred.
In short, much pre-commercialization is still needed for quantum computing. We’re not even close to being ready to begin commercialization.
Premature commercialization is the problem now facing us
Research in hardware, algorithms, and applications may be the clear technical problem in front of us, but the main problem right in front of us is the temptation of premature commercialization.
Everybody is being told to believe that they need to get Quantum Ready, even though the basic technology won’t be ready to even begin commercialization for three to five or even seven years, or longer.
Sure, researchers must be laboring away right now, but the individuals and organizations that will ultimately use the eventual results of that research need not be.
In fact, the technology really isn’t ready for even advanced development groups at larger organizations. Much basic research remains before that can happen.
No need for premature Quantum Ready
Not all organizations or all individuals within a particular organization need to get Quantum Ready at the same time or pace, or even at all. It all depends on the needs, interests, criteria, maturity, and timing of the particular organization, department, project, team, role, or individual. It’s not a one-size-fits-all proposition. The timing will vary, as will the pace.
Some may need to get Quantum Ready early in pre-commercialization, others later in pre-commercialization, others not until early commercialization, others not until later stages of commercialization, and others at any stage in between, and others not at all.
Of course nobody wants to get Quantum Ready too late, but there is no special merit in getting there too early, and there are real risks with doing so, such as the risk that work, knowledge, and skills from early in pre-commercialization may well be obsolete by the time commercialization rolls around. Investment is supposed to be leveraged, not discarded and redone.
Organizations, teams, and individuals should get Quantum Ready when they need it — not too late and not too soon. Most organizations and individuals will not need to rush. And now is not the time for most organizations and individuals to get Quantum Ready. In fact, for many or even most organizations and individuals it is still too soon to get Quantum Aware. Timing does matter.
For more on the various aspects of Quantum Ready, see my pre-commercialization paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.
Great for Fortune 500 companies to do their own research push
Large organizations with deep pockets for research and advanced development should of course be focusing on quantum computing, but…
- Focus on the long term. These research efforts should not be billed as intended to produce production solutions over the next few years.
- Demonstrations of the future, not the near-term. Research prototypes should be billed as demonstrations of possible production-scale technology in 5–7 years and production deployment in 10–15 years, but not near term in the next 2–4 years.
- Integration of quantum into mainline applications will take years. Integration of quantum technology into larger solutions could take 3–5 (or even 7) years alone, even once the quantum technology itself is ready.
- Some elite teams may develop ENIAC-class solutions in less time. Maybe to production deployment in 3–5 years, but most organizations will have to wait another 5–8 years for The FORTRAN Moment, or utilize configurable packaged quantum solutions acquired from outside vendors who do have the necessary elite teams.
Excessive hype is getting the best of us — we’re drinking too much of the Kool-Aid
The tremendous hype surrounding quantum computing is quite intoxicating. It’s one thing to be taken in by the promise of a brilliant future, but it’s an entirely different matter to treat that distant promise as if it were reality today or even the relatively near future.
The hype is well beyond the reality.
Current dramatic push for commercialization is a counterproductive distraction
The quantum computing field is plagued with excessive hype, not simply promises of great benefits to come, but even to the point of claims that the benefits are actually here, now, or at least in the very near future.
As some pundits (and, unfortunately, journalists) put it — and these are actual headlines:
- “Quantum Computing May Be Closer Than You Think”
- “Quantum Computing Might Be Here Sooner Than You Think”
- “Quantum Computing May Be A Reality Sooner Than You Think”
- “Quantum Computing Will Change Everything, and Sooner Than You Expect”
- “Quantum Computing is coming to you faster than you think”
- “Quantum computing could be useful faster than anyone expected”
- “A Quantum Future will be here Sooner than You Think”
- “Quantum Computers Could Go Mainstream Sooner than We Think”
- “You’ll Be Using Quantum Computers Sooner Than You Think”
- And more!
Vendors of hardware, software, algorithms, and applications are doing nothing to tamp down this rampant hype.
But the hardware, software, algorithms, and applications simply aren’t ready to be placed into production deployment. Not even close.
This dramatic push for rapid commercialization is a counterproductive distraction. It’s outright harmful. It will inevitably cause eventual disillusionment and an eventual pullback in investment.
The problem is not just the hype alone, since many people will sensibly ignore it, but that unfortunately many people are actually acting as if they really do believe the hype.
Commercialization of current technology will NOT lead to dramatic quantum advantage
Current quantum computing technology is actually fairly impressive compared to just a few years ago, but is still well short of being suitable for solving production-scale practical real-world problems and achieving even a tiny fraction of dramatic quantum advantage. And this includes both hardware and algorithms.
So, any attempt to commercialize current quantum computing technology is doomed to be a market flop in terms of solving production-scale practical real-world problems.
What’s needed, again, with the focus on pre-commercialization, is a lot more research, as well as prototyping and experimentation for quantum algorithms and quantum applications, based on simulation rather than attempts to run on current, very limited hardware.
For more on dramatic quantum advantage, see my paper:
- What Is Dramatic Quantum Advantage?
- https://jackkrupansky.medium.com/what-is-dramatic-quantum-advantage-e21b5ffce48c
Premature for any significant quantum advantage on any consistent basis across application categories
There may be some narrow niche cases where some minor quantum advantage can be achieved, but there is certainly no opportunity to achieve any significant quantum advantage in any broad sense across a wide swath of applications or application categories, and certainly not on any consistent basis.
Again, to be clear from the preceding section, there is no opportunity to achieve dramatic quantum advantage with current or near-term quantum computing technologies.
There is a wide range of degrees of quantum advantage well short of dramatic quantum advantage, which I refer to as fractional quantum advantage.
In the broadest terms, I suggest three levels of quantum advantage:
- Minimal quantum advantage. A 1,000X performance advantage over classical solutions. 2X, 10X, and 100X (among others) are reasonable stepping stones.
- Substantial or significant quantum advantage. A 1,000,000X performance advantage over classical solutions. 20,000X, 100,000X, and 500,000X (among others) are reasonable stepping stones.
- Dramatic quantum advantage. A one quadrillion X (one million billion times) performance advantage over classical solutions. 100,000,000X, a billion X, and a trillion X (among others) are reasonable stepping stones.
To put it most simply, anything less than a dramatic quantum advantage would be considered a fractional quantum advantage.
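To make these tiers concrete, here is a tiny illustrative calculation (my own sketch, nothing more) mapping each tier to the rough number of qubits of quantum parallelism it implies, using the rule of thumb discussed later in this paper that an n-qubit Hadamard transform yields 2ⁿ-way quantum parallelism:

```python
import math

# Illustrative arithmetic only: map each quantum advantage tier to the
# rough number of qubits of quantum parallelism it implies, using the
# rule of thumb that an n-qubit Hadamard transform yields 2**n-way
# quantum parallelism.
tiers = {
    "minimal": 1_000,          # 1,000X
    "substantial": 1_000_000,  # 1,000,000X
    "dramatic": 10 ** 15,      # one quadrillion X
}
for name, speedup in tiers.items():
    qubits = math.ceil(math.log2(speedup))
    print(f"{name:>12}: {speedup:,}X ~ {qubits} qubits of parallelism")
# minimal ~ 10 qubits, substantial ~ 20 qubits, dramatic ~ 50 qubits
```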
For more on fractional quantum advantage, see my paper:
- Fractional Quantum Advantage — Stepping Stones to Dramatic Quantum Advantage
- https://jackkrupansky.medium.com/fractional-quantum-advantage-stepping-stones-to-dramatic-quantum-advantage-6c8014700c61
For more on dramatic quantum advantage, see my paper:
- What Is Dramatic Quantum Advantage?
- https://jackkrupansky.medium.com/what-is-dramatic-quantum-advantage-e21b5ffce48c
Little if any of the current technology will be relevant in 5–10 years
Instead of expending inordinate energy on distorting and shoehorning stripped-down algorithms into current hardware, we should instead rely on simulation in the near term, and focus algorithm and application research on expected hardware 2–7 years out.
Expected hardware advances which will make life much easier for algorithm designers and application developers include:
- Much higher qubit fidelity.
- Greater qubit counts.
- Finer phase granularity.
- Quantum error correction (QEC) and logical qubits.
- Lower gate error rates.
- Lower measurement error rates.
- Greater qubit connectivity.
- Capable of supporting quantum Fourier transform (QFT) and quantum phase estimation (QPE).
Wait a few years for the software technology to mature and evolve before getting started
Besides the hardware advancing, if you wait a few years the software and programming models will have evolved as well, making it much easier to get started.
If you get started today, even with simulators simulating the hardware of 2–7 years from now, you’ll have to make do with much more primitive software and programming models.
Advances which can be expected on the software side:
- Higher-level programming models.
- Higher-level quantum-native programming languages.
- Higher-level algorithmic building blocks.
- More sophisticated design patterns.
- More sophisticated application frameworks.
- Availability of configurable packaged quantum solutions. Let somebody else do all of the really hard work. Or, do all of the hard work yourself and then get others to pay you for access.
- Plethora of operational quantum algorithms and applications to mimic. Ah, so that’s how you do it!
Variational methods are an unproductive distraction and technical dead end — focus on quantum Fourier transform (QFT) and quantum phase estimation (QPE) using simulation
Variational methods are quite popular right now, particularly for applications such as quantum computational chemistry, primarily because they work, in a fashion, on current NISQ hardware. But they are far from ideal and only work for smaller problems. In fact, they are a distraction and an absolute technical dead end — variational algorithms will never achieve dramatic quantum advantage, just by the nature of how they work.
Variational methods are a short-term crutch, a stopgap measure, designed to compensate for the inability of current hardware to support the desired algorithmic approach of quantum Fourier transform (QFT) and quantum phase estimation (QPE). Limited granularity of phase, limited circuit depth, limited qubit fidelity, and excessive gate errors are some of the hardware limitations precluding QFT and QPE at present.
The preferred alternative at this stage is to refrain from trying to implement algorithms on current hardware in favor of implementing them on classical quantum simulators. Granted, that limits implementations to 32 to 40 or maybe 50 qubits, but it puts more emphasis on designing automatically scalable algorithms, so that the algorithms can use the optimal technical approach and be ready for the day when the hardware really is ready for commercialization.
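As a minimal sketch of this simulation-first, scalability-first approach (assuming Qiskit and its Aer simulator, which are just one possible toolset, and a deliberately trivial circuit), the algorithm below is written as a function of the qubit count n, so the identical code can be re-run unchanged as simulators and, eventually, hardware grow:

```python
# Minimal sketch, assuming Qiskit and Qiskit Aer are installed
# (pip install qiskit qiskit-aer). The circuit is deliberately trivial;
# the point is that it is parameterized by n, so it scales automatically.
from qiskit import QuantumCircuit, transpile
from qiskit.circuit.library import QFT
from qiskit_aer import AerSimulator

def qft_demo(n: int) -> QuantumCircuit:
    """Prepare a simple n-qubit input state, then apply an n-qubit QFT."""
    qc = QuantumCircuit(n)
    qc.x(0)                       # simple, scalable input preparation
    qc.append(QFT(n), range(n))   # the QFT itself scales with n
    qc.measure_all()
    return qc

sim = AerSimulator()
for n in (4, 8, 16):              # the same algorithm at growing qubit counts
    result = sim.run(transpile(qft_demo(n), sim), shots=1024).result()
    print(n, "qubits:", len(result.get_counts()), "distinct outcomes")
```

The same function would run on real hardware once the hardware catches up, which is exactly the point of designing for automatic scalability.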
For more information on variational methods, quantum Fourier transform and quantum phase estimation, and automatically scalable algorithms, see my pre-commercialization paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.
Risk of backlash
If sufficient progress is not made during the next year or two, there is a very real risk of developing a backlash against quantum computing.
We could see some significant pushback on quantum computing.
We could see the development of outright denial of the potential for quantum computing.
Big risk of hitting a Quantum Winter in two to three years
The biggest risk in quantum computing is that an excess of investment flows into quantum computing with an expectation of payoff (presumed commercialization) within two or three years, followed by a Quantum Winter as vendors and organizations prove unable to deliver on those bold promises before the patience of that investment money is pushed to its limit.
For more on the risks of a Quantum Winter, see my paper:
- Risk Is Rising for a Quantum Winter for Quantum Computing in Two to Three Years
- https://jackkrupansky.medium.com/risk-is-rising-for-a-quantum-winter-for-quantum-computing-in-two-to-three-years-70b3ba974eca
Taming the hype may be impossible, so we need to push the reality to catch up
Sure, we can push back on the hype, to some degree, but that’s a Herculean task that would consume all of our energy. Instead, we can focus on how to advance the technology so that it catches up with at least a fair fraction of the hype.
We need to double down — even triple or quadruple down — on research, as well as prototyping and experimentation.
Boost research, prototyping, and experimentation — pre-commercialization
If there is only one message for the reader to take away from this paper it is to boost research, as well as to boost prototyping and experimentation.
But to be clear, the research has to come first.
Prototyping and experimentation with current quantum hardware is an unproductive distraction.
Quantum algorithms and quantum applications should instead be prototyped and experimented with using simulation — configured to match the typical quantum hardware configuration expected at the initial commercialization stage (C1.0 — whatever that really might be).
To be clear, prototyping and experimentation with quantum algorithms and quantum applications can proceed in parallel with a lot of the hardware research, but it must be focused on simulation.
But research focused on the functional capabilities of qubits, programming models, and algorithmic building blocks must be completed before it makes any sense to begin prototyping algorithms and applications.
Granted, additional research on the functional capabilities of qubits, programming models, and algorithmic building blocks can proceed in parallel with prototyping and experimentation with quantum algorithms and quantum applications, but only to the extent that the capabilities of the initial commercialization stage (C1.0) are already clearly known, or at least a reasonable approximation.
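As a concrete illustration of configuring simulation to match expected hardware (again assuming Qiskit Aer, and with error rates that are purely my guesses at what C1.0 hardware might deliver), a simulator can be given a noise model approximating the expected future hardware rather than today’s devices:

```python
# Hedged sketch: configure a simulator to approximate a guessed C1.0
# target. The numbers are assumptions for illustration only: roughly
# four nines (99.99%) of fidelity for one-qubit gates, with two-qubit
# gates assumed to be an order of magnitude worse.
from qiskit_aer import AerSimulator
from qiskit_aer.noise import NoiseModel, depolarizing_error

ONE_QUBIT_ERROR = 1e-4   # four nines of fidelity for one-qubit gates
TWO_QUBIT_ERROR = 1e-3   # assumed 10X worse for two-qubit gates

noise = NoiseModel()
noise.add_all_qubit_quantum_error(
    depolarizing_error(ONE_QUBIT_ERROR, 1), ["x", "sx", "h"])
noise.add_all_qubit_quantum_error(
    depolarizing_error(TWO_QUBIT_ERROR, 2), ["cx"])

sim = AerSimulator(noise_model=noise)
# Algorithms prototyped against this simulator see roughly the error
# behavior expected of future hardware, not today's noisier devices.
```

Note that simulating anywhere near 48 qubits is impractical, so, as noted elsewhere in this paper, 32 to 40 qubits is a more realistic simulation ceiling; the noise configuration is the point here.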
For more on research, see my paper: Essential and Urgent Research Areas for Quantum Computing.
For thoughts on the initial commercialization stage, C1.0, see my paper:
- Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization
- https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a
We need to push research much harder to try to catch up with the hype
We may have only another two or three years to begin delivering on at least some of the bold promises of quantum computing before patient capital loses patience.
We can probably get away without full delivery on all promises, but we at least need to deliver on a substantial fraction of the promises. Such as achieving The ENIAC Moment, when a quantum computer can finally run a production-scale practical real-world application, and achieve at least a substantial fraction of quantum advantage.
But that’s going to take a lot of research. A whole lot. Much more than we are currently pursuing.
Can we catch up with the hype? Maybe. Possibly. If we try hard enough. But, it’s not a slam dunk.
Distinguishing pre-commercialization from commercialization
Commercialization implies that you have the necessary science, technology, knowledge, and skills, and now you just have to do it. You have the science. You just need to do the engineering. It’s not quite that simple, but that’s the essence.
Pre-commercialization is the process of seeking and obtaining all of the science, technology, knowledge, and skills which can then be applied during commercialization to engineer a viable commercial product.
Premature commercialization is the attempt to engage in commercialization before the necessary science, technology, knowledge, and skills have been developed. In short, don’t do it!
For more detail on commercialization, pre-commercialization, and premature commercialization see my paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.
Avoid premature commercialization
Just to reemphasize the point, premature commercialization is the attempt to engage in commercialization before the necessary science, technology, knowledge, and skills have been developed. In short, don’t do it!
Science, technology, knowledge, and skills for quantum computing are constantly changing as research progresses during pre-commercialization.
Any reliance on science, technology, knowledge, and skills from pre-commercialization while pre-commercialization is in progress is very risky — it may all change by the time that commercialization begins.
Critical technical gating factors for initial stage of commercialization
Here’s a brief summary of the more critical technical gating factors which must be addressed before quantum computing can be considered ready for commercialization, an expectation for the initial stage of commercialization, C1.0:
- Near-perfect qubits. At least four nines of qubit fidelity — 99.99%. Possibly five nines — 99.999%. Okay, maybe 3.75 nines or even 3.5 nines might be enough at least for some applications.
- Circuit depth. Generally limited by coherence time. No clear threshold at this stage but definitely going to be a critical gating factor. Whether it is 50, 100, 500, or 1,000 is unclear. Significantly more than it is now. Let’s call it 250 for the sake of argument.
- Qubit coherence time. Sufficient to support needed circuit depth.
- Near-full qubit connectivity. Either full any to any qubit connectivity or qubit fidelity high enough to permit SWAP networks to simulate near-full connectivity.
- 64 qubits. Roughly. No precise threshold. Maybe 48 qubits would be enough, or maybe 72 or 80 qubits might be more appropriate. Granted, I think people would prefer to see 128 to 256 qubits, but 64 to 80 (or maybe 48) might be sufficient for the initial commercialization stage.
- Alternative architectures may be required. Especially for more than 64 qubits. Or even for 64, 60, 56, 50, and 48 qubits in order to deal with limited qubit connectivity.
- Fine phase granularity to support quantum Fourier transform (QFT) and quantum phase estimation (QPE). 40 qubits = 2⁴⁰ gradations — one trillion gradations — should be the preferred target for C1.0. At least 20 or 30 qubits = 2²⁰ to 2³⁰ gradations — one million to one billion gradations — at a minimum. Even 20 qubits may be a hard goal to achieve. 50 qubits are needed for dramatic quantum advantage. (See the worked example just after this list.)
- Quantum Fourier transform (QFT) and quantum phase estimation (QPE). Needed for quantum computational chemistry and other applications. Needed to achieve quantum advantage through quantum parallelism. Relies on fine granularity of phase.
- Conceptualization and methods for calculating shot count (circuit repetitions) for quantum circuits. This will involve technical estimation based on quantum computer science coupled with engineering processes based on quantum software engineering. See my paper below.
- Moderate improvements to the programming model. Unlikely that a full higher-level programming model will be available soon (before The FORTRAN Moment), but some improvements should be possible.
- Moderate library of high-level algorithmic building blocks.
- The ENIAC Moment. A proof that something realistic is possible. The first production-scale practical real-world application.
- Substantial quantum advantage. Full, dramatic quantum advantage (one quadrillion X speedup) is not so likely, but an advantage of at least a million or a billion is a reasonable expectation — much less will be seen as not really worth the trouble. This will correspond to roughly 20 to 30 qubits in a single Hadamard transform — 2²⁰ = one million, 2³⁰ = one billion. An advantage of one trillion — 2⁴⁰ may or may not be reachable by the initial stage of commercialization. Worst case, maybe minimal quantum advantage — 1,000X to 50,000X — might be acceptable for the initial stage of commercialization.
- 40-qubit quantum algorithms. Quantum algorithms utilizing 32 to 48 qubits should be common. Both the algorithms and hardware supporting those algorithms. 48 to 72-qubit algorithms may be possible, or not — they may require significantly greater qubit fidelity.
- Classical quantum simulators for 48-qubit algorithms. The more the better, but that may be the practical limit in the near term. We should push the researchers for 50 to 52 or even 54 qubits of full simulation.
- Overall the technology is ready for production deployment. At least in some minimal sense.
- No further significant research is needed to support the initial commercialization stage product, C1.0. Further research for subsequent commercialization stages, but not for the initial commercialization stage. The point is that research belongs in the pre-commercialization stage, not during commercialization.
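As a worked example for the phase granularity factor above (simple arithmetic, nothing more), an n-bit quantum Fourier transform requires controlled-phase rotations as fine as 2π divided by 2ⁿ radians:

```python
import math

# Worked arithmetic for the phase granularity gating factor: an n-bit
# quantum Fourier transform needs phase rotations as fine as 2*pi/2**n.
for n in (20, 30, 40, 50):
    finest = 2 * math.pi / 2 ** n
    print(f"{n}-bit QFT: finest rotation ~ {finest:.2e} radians "
          f"({2 ** n:,} gradations)")
# 20 bits: ~5.99e-06 radians. 40 bits: ~5.71e-12 radians. Calibrating
# hardware to anywhere near that precision is why this is a hard
# gating factor.
```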
For more detail consult my pre-commercialization paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.
Minimum viable product (MVP)
One could accept all of the critical technical gating factors for the initial stage of commercialization (C1.0) as the requirements for a minimum viable product (MVP). That would be the preference. But, it may turn out that not all customers or users need all of those capabilities or features. Or, maybe everybody wants and needs all of those capabilities and features, but they simply aren’t technically or economically feasible in a reasonable timeframe. In such situations it may make sense or at least be tempting to define a minimum viable product (MVP) which is substantially less than the more capable desired initial product.
This paper won’t attempt to predict what sort of minimum viable product (MVP) will form the ultimate initial commercialization stage, C1.0, but it is worth considering.
Some obvious compromises:
- Qubit count. 128 or 256 qubits may be a clear preference, but maybe 72 or 64 or even 48 qubits might be the best that can be achieved — or that initial customers might need — in the desired timeframe.
- Qubit fidelity. Five nines or at least 4.5 nines of qubit fidelity might be the preference, but four or even 3.5 nines might be the best that can be achieved — or that initial customers might need — in the desired timeframe.
- Connectivity. Full connectivity might not be achievable. Maybe SWAP networks are feasible if qubit fidelity is high enough.
- Fineness of phase granularity. Critical for quantum Fourier transform and quantum phase estimation. Sufficient for at least 20 to 30 qubits = 2²⁰ to 2³⁰ gradations in a quantum Fourier transform, rather than the desired 50 to 64.
- Quantum Fourier transform and quantum phase estimation resolution. Preferably at least 20 to 30 qubits, but maybe only 16 bits of precision can be achieved — or even only 12, rather than 32 to 64 bits.
Quantum error correction (QEC) would likely come in a later stage of commercialization in any case. Near-perfect qubits should be good enough for many applications.
In any case, the precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses — hopefully increasing in capabilities as more technical unknowns are resolved, favorably.
No, noisy NISQ quantum computers are not viable for commercialization
Absent support for quantum error correction (QEC), only near-perfect qubits can supply the qubit fidelity needed to support larger and more complex quantum algorithms and to achieve substantial quantum advantage, for example, supporting quantum Fourier transform (QFT) and quantum phase estimation (QPE), which are required for quantum computational chemistry.
By definition, a NISQ quantum computer has only noisy qubits with qubit fidelity of no more than two or three nines of reliability, 99% to 99.9%, rather than the four or five nines of reliability, 99.99% to 99.999%, required for supporting more complex quantum algorithms and quantum applications.
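A quick back-of-the-envelope calculation (my own simplistic model, which assumes independent gate errors) shows why those extra nines matter so much as circuit depth grows:

```python
# Simplistic model assuming independent gate errors: the probability
# that a depth-d circuit completes without any gate error is roughly
# fidelity ** d. Depths of 250 and 1,000 match the circuit depth
# discussion elsewhere in this paper.
for nines, fidelity in [(2, 0.99), (3, 0.999), (4, 0.9999), (5, 0.99999)]:
    runs = [f"depth {d}: {fidelity ** d:6.1%}" for d in (250, 1000)]
    print(f"{nines} nines ({fidelity}):", ", ".join(runs))
# Two or three nines (NISQ) collapse rapidly with depth, while four or
# five nines (near-perfect qubits) leave most runs error-free.
```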
For more on nines of qubit fidelity, see my papers.
48 fully-connected near-perfect qubits as the sweet spot goal for near-term quantum computing
After some careful thought, I came up with a proposal for what might be the most viable quantum computer to be available in two to three years or so.
Oversimplified:
- 48 qubits.
- Full any to any connectivity for all qubits.
- Near-perfect qubits. 3.25 to 4 nines of qubit fidelity.
- Fine granularity of phase to support a 20-bit quantum Fourier transform (QFT). And enable quantum phase estimation (QPE) to support quantum computational chemistry.
- Up to 2,000-gate circuits.
Trying to get anything useful done with less than this is likely to be a counterproductive distraction.
The hope is that simulators can be configured for this or something close to this so that quantum algorithms can be designed and tested and used in quantum applications long before the hardware is actually available.
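As a small sketch of how an algorithm designer might check a candidate algorithm against this proposed gate budget (assuming Qiskit purely for illustration, with the 20-bit QFT taken from the sweet-spot spec above):

```python
# Hedged sketch, assuming Qiskit: count gates and depth for a circuit
# containing a 20-bit QFT to see whether it fits the 2,000-gate budget.
from qiskit import QuantumCircuit, transpile
from qiskit.circuit.library import QFT

qc = QuantumCircuit(20)
qc.h(range(20))                  # trivial input preparation
qc.append(QFT(20), range(20))    # the 20-bit QFT from the sweet-spot spec
qc.measure_all()

# Flatten to a simple basis so the counts reflect executable gates.
flat = transpile(qc, basis_gates=["cx", "rz", "sx", "x"])
print("depth:", flat.depth(), " total operations:", flat.size())
# If the totals blow past the 2,000-gate budget, the algorithm needs
# restructuring before the sweet-spot hardware could ever run it.
```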
For more details, see my paper:
- 48 Fully-connected Near-perfect Qubits As the Sweet Spot Goal for Near-term Quantum Computing
- https://jackkrupansky.medium.com/48-fully-connected-near-perfect-qubits-as-the-sweet-spot-goal-for-near-term-quantum-computing-7d29e330f625
Critical hardware research issues
A separate paper lays out much of the needed research for quantum computing hardware. This paper only briefly summarizes the critical areas:
- Need for a more ideal qubit technology. Current qubit technology demonstrates quantum computing capabilities, but is incapable of delivering on the full promises of quantum computing.
- Limited qubit capacity. Need a lot more qubits.
- Limited qubit fidelity. Too many errors.
- Limited qubit coherence. Limits circuit depth.
- Limited circuit depth. Basically limited by qubit coherence.
- Limited gate fidelity. Too many errors executing quantum logic gates.
- Limited granularity of phase. Need fine granularity for quantum Fourier transform (QFT) and quantum phase estimation (QPE).
- Limited measurement fidelity. Too many errors measuring a qubit to get results.
- Unable to support quantum Fourier transform (QFT) and quantum phase estimation (QPE). These rely on qubit fidelity, qubit connectivity, fine granularity of phase, gate fidelity, circuit depth, and measurement fidelity. Needed for application categories such as quantum computational chemistry.
- Unable to achieve substantial quantum advantage. A dramatic performance advantage over classical computing is the only point of even pursuing quantum computing. This will require a combination of sufficient hardware capabilities with quantum algorithms which exploit those hardware capabilities.
For much more detail on hardware research areas, see my paper: Essential and Urgent Research Areas for Quantum Computing.
Critical algorithm and application research areas
A separate paper lays out much of the needed research for quantum algorithms and quantum applications. This paper only briefly summarizes the critical areas:
- Need for a higher-level programming model. Current programming model is too low-level, too primitive, too much like classical machine and assembly language.
- Need for a robust collection of high-level algorithmic building blocks.
- Need for a high-level programming language. Tailored to the needs of quantum algorithms and quantum applications.
- Need for a robust collection of example algorithms. Which demonstrate production-scale quantum parallelism and show how practical real-world problems can be easily transformed into quantum algorithms and applications.
- Need for algorithm debugging capabilities. Difficult enough for relatively simple quantum algorithms, virtually impossible for complex quantum algorithms.
- Need for configurable packaged quantum solutions. Generalized applications for each major application category which allow the developer to present input data and input parameters in an easy way which can readily be automatically transformed into adaptations of the pre-written quantum algorithms and application frameworks. Still requires a lot of work, but not expertise in quantum circuits.
- Research in specific algorithms for each application category.
As mentioned elsewhere in this paper, algorithm and application research during pre-commercialization should focus on simulation with the simulator configured to match the expected target hardware capabilities for the initial commercialization stage (C1.0) or a subsequent stage. Attempting to perform quantum algorithm and quantum application research on actual near-term hardware would be counterproductive and a gross distraction.
For much more detail on quantum algorithm and quantum application research areas see my paper: Essential and Urgent Research Areas for Quantum Computing.
Other critical research areas
Besides hardware, algorithms, and applications, there are a number of other areas of critical and urgent research needed to fully exploit the promised potential of quantum computing. From my paper, here is the summary list of the areas:
- Physics.
- Hardware.
- Firmware (see: Hardware).
- Hardware support.
- Debugging.
- Classical quantum simulators.
- Quantum information science in general.
- Software. Support software, tools.
- Quantum software engineering. A new field.
- Quantum computer science. A new field.
- Cybersecurity.
- Quantum algorithm support.
- Quantum algorithms.
- Quantum application support.
- Quantum applications.
- Quantum application solutions. Particularly configurable packaged quantum solutions.
- Quantum general artificial intelligence.
- Quantum advantage and quantum supremacy.
- Other areas of QSTEM research.
For details, see my paper: Essential and Urgent Research Areas for Quantum Computing.
We need to decouple hardware development and algorithm and application research, prototyping, and experimentation
Trying to prototype and experiment with algorithms and applications on woefully-inadequate hardware is an exercise in futility. There’s another, better path: simulation.
Focus algorithm and application research, prototyping, and experimentation on simulation
Simulation can be slow and is limited to 40 to 50 or so qubits, but is much more reliable and ultimately more efficient and productive than attempting to prototype and experiment with hardware that simply isn’t up to the task.
Plus, a simulator can provide support for analysis and debugging which is not physically feasible with a real quantum computer.
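The roughly 40-to-50-qubit ceiling is easy to derive from first principles: a full statevector simulation must store 2ⁿ complex amplitudes. A small sketch of the arithmetic (illustrative only):

```python
# Sketch: why classical statevector simulation tops out around 40 to 50
# qubits. A full statevector holds 2**n complex amplitudes, at 16 bytes
# per double-precision complex amplitude.

def statevector_bytes(num_qubits: int) -> int:
    return (2 ** num_qubits) * 16

for n in (30, 40, 44, 48, 50):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

A 30-qubit statevector fits in 16 GiB, a 40-qubit statevector needs 16 TiB, and a 50-qubit statevector needs 16 PiB, which is why simulation much beyond 50 qubits is impractical even on large clusters (specialized techniques can push further for restricted classes of circuits).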
For more detail on simulation and needed research, see my paper: Essential and Urgent Research Areas for Quantum Computing.
Sure, it can be intoxicating to run your algorithm on an actual quantum computer, but what does it prove and where does it get you?
There are very few real algorithms that use more than about 23 qubits on a real quantum computer at present. That is likely because roughly 23 qubits is the practical limit of current hardware, particularly with respect to qubit fidelity, qubit connectivity, coherence time, circuit depth, gate fidelity, and measurement fidelity.
Sure, it’s great to demonstrate a quantum algorithm on an actual quantum computer, but to what effect? Technically, we can run that same algorithm on a simulator.
Hardware engineers should run their own functional tests, stress tests, and benchmark tests
The quantum hardware engineers should run their own functional tests, stress tests, and benchmark tests to confirm that their hardware is performing as expected. No need to slow down algorithm and application research, prototyping, and experimentation just to test the hardware in a rather inefficient manner.
Use simulation to enable algorithm and application research, prototyping, and experimentation to proceed at their own pace independent of the hardware
There’s no good reason to slow down algorithm and application research, prototyping, and experimentation or to gate them by hardware research.
They can proceed just fine with simulation.
In fact, they can proceed better than just fine since:
- Hardware limits don’t interfere with progress.
- Simulation and analysis software can alert them to bugs and other issues in their algorithms and applications. Debugging on actual quantum hardware is very problematic.
Functional enhancements and performance and capacity improvements are needed for simulation
Simulation works reasonably well today. Yes, significant functional enhancements and performance and capacity improvements would be very beneficial, but simulators are generally more usable than actual hardware at the moment, and for the indefinite future.
For more detail on needed research for simulation, including enhancements and improvements, see my paper: Essential and Urgent Research Areas for Quantum Computing.
Where are all of the 40-qubit quantum algorithms?
Indeed, where are they? There appears to be no technically valid reason for the absence of a plethora of 40-qubit or even 32-qubit algorithms, other than the fact that it's so intoxicating to run algorithms on actual quantum hardware. An increased focus on simulation should improve the situation.
See my paper on the merits of focusing on scalable 40-qubit algorithms:
- Where Are All of the 40-qubit Quantum Algorithms?
- https://jackkrupansky.medium.com/where-are-all-of-the-40-qubit-quantum-algorithms-14b711017086
Scalability is essential for robust quantum algorithms
We want to be able to develop quantum algorithms and applications today which will run on future hardware as it becomes available. We definitely don’t want to have to redesign and reimplement quantum algorithms and quantum applications every time there is even a modest advance in hardware capabilities. Scalability is the key. And in fact automatic scalability.
Scalability is also essential as we get to larger and more complex algorithms so that a scaled-down version of the algorithm can be fully and accurately simulated and validated on a 32 to 40-qubit simulator, under the presumption that automatic analysis can confirm that the logic of the algorithm is fully, reliably, and accurately — and automatically — scalable to a larger number of qubits without introducing errors.
Current quantum algorithms generally aren't scalable. Why? Simply because it isn't currently considered a priority. That needs to change: scalability, and especially automatic scalability, needs to be a top research priority.
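To make "designed for scalability" a little more concrete, here is a minimal sketch in Python and NumPy. The idea (the builder function is my own illustration, not anyone's standard) is that the algorithm is parameterized by qubit count, so the identical code can be validated at small scale on a simulator today and rerun at larger scale as capacity grows:

```python
import numpy as np

# Minimal sketch of a scalable, parameterized algorithm component: the
# quantum Fourier transform (QFT) unitary for n qubits, built directly
# as a matrix. Fine for small n under simulation; the same parameterized
# builder scales to any n that a simulator or future hardware can handle.

def qft_matrix(num_qubits: int) -> np.ndarray:
    dim = 2 ** num_qubits
    omega = np.exp(2j * np.pi / dim)
    j, k = np.meshgrid(np.arange(dim), np.arange(dim))
    return omega ** (j * k) / np.sqrt(dim)

# Validate a scaled-down instance (3 qubits) today...
U = qft_matrix(3)
assert np.allclose(U @ U.conj().T, np.eye(2 ** 3))  # unitarity check

# ...and the same code runs unchanged for larger n later.
```

The point is not this particular function, but the discipline: no hard-coded qubit counts, and automated validation that holds at every scale.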
See my paper on the staged model for scalable quantum algorithms:
- Staged Model for Scaling of Quantum Algorithms
- https://jackkrupansky.medium.com/staged-model-for-scaling-of-quantum-algorithms-d1070056907f
Don’t imagine that scalability of quantum algorithms and applications is free, cheap, easy, obvious, or automatic — much hard work is needed during pre-commercialization
Commercialization is a really, really, super-bad time to start thinking about how to scale your quantum algorithms or quantum applications.
Scalability of quantum algorithms or quantum applications is not:
- Free.
- Cheap.
- Easy.
- Obvious.
- Automatic.
Designing a quantum algorithm or application to be scalable takes:
- A lot of effort.
- A lot of time.
- A lot of careful attention to detail.
- A lot of patience.
- A lot of diligence.
- A lot of focus.
It takes all the things that are in very short supply during the mad rush called commercialization.
So when it comes to making quantum algorithms and applications scalable during commercialization, there’s really only one thing to keep in mind:
- Don’t do it!
Scalability won’t work as an afterthought.
Pre-commercialization is exactly the right time when scalability can and should be addressed.
Premature commercialization is the single worst thing you can do to the scalability of quantum algorithms and quantum applications.
Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications
Understanding the performance and capacity of quantum computers is essential for successful commercialization. Ditto for the performance and capacity of quantum algorithms and quantum applications.
Performance and capacity need to be:
- Observed.
- Measured.
- Tabulated.
- Analyzed.
- Characterized.
- Reported.
- Used.
Characterization of performance and capacity is critical. It needs to be:
- Understood fully.
- Understood in detail.
- Understood accurately.
The capabilities, tools, and methods needed to properly characterize performance and capacity are too primitive, or even nonexistent.
Capabilities, tools, and methods for benchmarking are not well developed either.
Much research, engineering, prototyping, and experimentation is needed before characterization of performance and capacity of quantum computers, algorithms, and applications will be ready for commercialization.
Configure simulation to match expected commercial hardware
Simulation can be configured to match any hardware configuration. The default should be the target hardware configuration for the initial commercialization stage of quantum computing, C1.0, with the only significant difference being fewer qubits since simulation is limited to 40 to 50 or so qubits.
Configuration factors include the following (a code sketch of such a configuration follows the list):
- Qubit count.
- Coherence time.
- Circuit depth.
- Connectivity.
- Phase granularity.
- Qubit fidelity.
- Gate fidelity.
- Measurement fidelity.
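As a thought experiment, the factors above might be captured in a simple configuration record like the following hypothetical Python sketch. None of these names come from any actual simulator API, and the field values are placeholders drawn from the targets discussed earlier in this paper, not predictions.

```python
from dataclasses import dataclass

# Hypothetical configuration record matching the factors listed above.
# These names are illustrative only, not an actual simulator API.

@dataclass
class SimulatorConfig:
    qubit_count: int
    coherence_time_us: float          # coherence time, in microseconds
    max_circuit_depth: int
    full_connectivity: bool
    phase_granularity_bits: int       # fineness of phase, in bits
    qubit_fidelity_nines: float
    gate_fidelity_nines: float
    measurement_fidelity_nines: float

# Speculative C1.0-style target, capped at a simulable qubit count
# (placeholder values, not predictions).
c1_target = SimulatorConfig(
    qubit_count=40,                   # simulation limit, not the hardware goal
    coherence_time_us=1000.0,
    max_circuit_depth=2000,
    full_connectivity=True,
    phase_granularity_bits=20,
    qubit_fidelity_nines=4.0,
    gate_fidelity_nines=4.0,
    measurement_fidelity_nines=4.0,
)
```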
Configure simulation to match expected improvements — or shortfalls — of the hardware
Simulation is flexible and can be configured to match any hardware configuration, limited only by the maximum capacity of simulation, roughly 40 to 50 qubits.
This makes it easy to test how algorithms and applications might behave on hardware that isn’t quite ready for commercialization, or for hardware improvements for subsequent commercialization stages beyond initial commercialization. Algorithms and applications can be tested even before the hardware becomes available.
Research will continue even as commercialization commences
Research for any vibrant field is never-ending. There’s always something new to discover, and some new problem to be overcome.
Granted, a true mountain of research must be completed during pre-commercialization before commercialization can begin, but that will not be the end of research in quantum computing.
The initial commercialization stage will be only the beginning of a long sequence of improvements and enhancements, many of which will only be possible as the result of additional research.
Generally, on average, it could easily take five years from the onset of research in some area to produce results which will find their way into commercialization. Some research may produce results which can be commercialized in less than five years, but some research may take seven to ten or more years to produce results suitable for commercialization.
This means that research for subsequent stages of commercialization after the initial stage of commercialization will need to commence even before commercialization commences — during pre-commercialization.
Exception: Commercial viability of capabilities which support pre-commercialization
Generally, this paper is arguing strongly against premature commercialization of quantum computing, especially the notion that quantum computers might be ready to be considered for deploying production quantum applications. But there is one category of exception to this opposition: products and services which are targeted at supporting pre-commercialization activities themselves, namely research, prototyping, and experimentation. Just not products or services geared towards production deployment and operational use of production applications.
Exceptions and candidates for commercialization during pre-commercialization include:
- Equipment. For use by researchers and for prototyping and experimentation.
- Software. Classical quantum simulators. Support software.
- Software tools. Compilers. Algorithm analysis tools. Debugging tools.
- Services. Including consulting.
To be clear, all of these items would be intended only to support research, prototyping, and experimentation.
Some of these items might resemble what would be expected of commercial products and services, but that is irrelevant; the only issue here is whether they specifically support pre-commercialization activities, regardless of whether they might be or resemble capabilities of potential eventual commercial products and services.
Early commercial opportunities for selling tools and services to enable and facilitate research, prototyping, and experimentation
Just emphasizing again the commercial opportunities that could be available during pre-commercialization — anything that enables or supports pre-commercialization activities — research, prototyping, and experimentation.
Anything but… production deployment or operational use.
Exception: Random number-based applications are actually commercially viable today
Although much of quantum computing is still subject to research and intensive pre-commercialization, there is one narrow niche that actually is commercially viable right now, today — generation of true random numbers. True random numbers are not mathematically computable, and hence cannot be computed using a Turing machine or classical computer. Special hardware is needed to access the level of entropy required for true random numbers. Quantum effects can supply the necessary entropy. And quantum computers are able to access the necessary quantum effects, performing a Hadamard transform using the simple Hadamard gate. No further research is required to make use of this simple capability right now, today.
This ability to generate true random numbers has significant applications for cybersecurity, such as encryption key generation.
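The underlying circuit could hardly be simpler, as the following NumPy sketch of the concept shows. One caveat worth stating loudly: a classical simulation of this circuit yields only pseudo-random bits; the true-randomness claim holds only when the measurement is performed on actual quantum hardware, where quantum entropy does the work.

```python
import numpy as np

# Conceptual sketch: a Hadamard gate puts a qubit into equal superposition,
# and measuring it yields 0 or 1 with probability 1/2 each. NOTE: run
# classically this is only pseudo-random; true randomness requires the
# measurement to happen on real quantum hardware.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
state = H @ np.array([1, 0])                    # |0> -> (|0> + |1>)/sqrt(2)
probs = np.abs(state) ** 2                      # [0.5, 0.5]

rng = np.random.default_rng()                   # classical stand-in for measurement
bits = rng.choice([0, 1], size=32, p=probs)     # 32 "measured" bits
print("".join(map(str, bits)))                  # e.g., raw material for a key
```

On real hardware, each measured bit draws on genuine quantum entropy, which is exactly the property that makes this niche commercially viable today.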
For more detail on true random number generation and quantum computers, see my paper:
- Quantum Advantage Now: Generation of True Random Numbers
- https://jackkrupansky.medium.com/quantum-advantage-now-generation-of-true-random-numbers-237d89f8a7f2
In fact, there is already at least one commercial product exploiting this specific feature of quantum computing:
- Cambridge Quantum launches Quantum Origin — a quantum-enhanced cryptographic key generation platform to protect data from advancing threats
- Quantum Origin is the world’s first commercial product built using quantum computers that delivers an outcome that classical computers could not achieve
- Quantum Origin is the first platform to derive cryptographic keys using the output of a quantum computer to ensure data is protected at foundational level against evolving attacks
- https://www.prnewswire.com/news-releases/cambridge-quantum-launches-quantum-origin--a-quantum-enhanced-cryptographic-key-generation-platform-to-protect-data-from-advancing-threats-301438568.html
Even for exceptions for commercialization during pre-commercialization, be especially wary
Even if early commercialization does seem warranted during pre-commercialization, there are plenty of potential gotchas. Be especially wary for:
- Sudden and unexpected technology evolution. Dramatic and expensive rework could be required.
- Incompatible changes. More rework.
- Potential for a Quantum Winter which could cause business to evaporate rapidly.
- Uncertain market conditions.
- Unpredictable markets.
- Unpredictable budgets.
- Sudden appearance of competitors.
- Rapid technology changes which could erase technical opportunities for commercial products. Of course, the opposite could occur as well — new technical opportunities can appear out of nowhere at any time.
- Costs. Unexpected costs. Costs which you should have expected. Higher costs than you did expect. More costs than you expected.
- Difficulty or impossibility of obtaining needed service level agreements (SLA). Contractual commitments are essential for services, including access to equipment and personnel.
- Need to offer service level agreements (SLA) to customers. If you’re an equipment or service provider, or offer staffing.
Keep cost and service level agreements in mind even for the rare exceptions during pre-commercialization
Yes, there are exceptions when it comes to commercialization during pre-commercialization, such as highlighted in the preceding sections, but even then there are significant caveats:
- Cost. Sure, a lot of access is free these days, but that’s not for production deployment. What exactly is the cost for production deployment? Especially if you require dedicated hardware for 24/7 operation, and for redundancy to protect against outages.
- Availability. Again, a lot of access is free, but what is availability for production deployment?
- Service level agreements (SLA). Production deployment requires a solid commitment for availability, performance, capacity, support, redundancy, etc. Be sure to have contractual commitments to all of the above, which is nominally in the form of a service level agreement (SLA). Be sure to read all of the fine print.
- Diversity of sourcing. Don’t be reliant on a single provider for any service, equipment, software, or tool. Companies can go out of business during a technological winter, or change their terms of service in an unacceptable manner at any time.
Beware of any capabilities available during pre-commercialization which might seem as if they are likely to apply to commercialization as well
To be clear, commercialization of quantum computing will have a clean slate compared to whatever might be available during pre-commercialization.
Sure, some capabilities available during pre-commercialization might actually carry over intact to commercialization, but they should be viewed as the exception rather than the rule.
As a general rule, presume that all work products developed during pre-commercialization will need to be discarded and redeveloped from scratch, even though some amount of reuse might be possible.
In particular, earlier research might be rendered completely or partially obsolete by later research.
Early tools and support software may be fully or partially replaced or upgraded.
Sure, some knowledge will or might carry over, but a significant fraction of knowledge will not carry over and will have to be learned from scratch.
And, of course, quantum computer hardware available during pre-commercialization is very likely to be completely obsolete and superseded by newer hardware by the time commercialization of quantum computing becomes viable.
Products which enable quantum computing vs. products which are enabled by quantum computing
There are really two distinct categories of products covered by this paper:
- Quantum-enabled products. Products which are enabled by quantum computing. Such as quantum algorithms, quantum applications, and quantum computers themselves.
- Quantum-enabling products. Products which enable quantum computing. Such as software tools, compilers, classical quantum simulators, and support software. They run on classical computers and can be run even if quantum computing hardware is not available. Also includes classical hardware components and systems, as well as laboratory equipment.
The former are not technically practical until quantum computing has exited the pre-commercialization stage and entered (or even completed) the commercialization stage.
The latter can be implemented at any time, even and especially during pre-commercialization. Some may in fact be focused on pre-commercialization, such as lab equipment and classical hardware used in conjunction with quantum lab experiments.
The point is that some quantum-related (quantum-enabling) products can in fact be commercially viable even before quantum computing has entered the commercialization stage, while any products which are enabled by quantum computing must wait until commercialization before they are commercially viable.
And of course any production deployment of quantum algorithms and quantum applications must wait until commercialization has been completed.
Potential for commercial viability of quantum-enabling products during pre-commercialization
Although most quantum-related products, including quantum applications and quantum computers themselves, have no substantive real value until the commercialization stage of quantum computing, a whole range of quantum-enabling products do potentially have very real value, even commercial value, during pre-commercialization and even during early pre-commercialization. These include:
- Quantum software tools.
- Compilers and translators.
- Algorithm analysis tools.
- Support software.
- Classical quantum simulators.
- Hardware components used to build quantum computers.
Organizations involved with research or prototyping and experimentation may be willing to pay not insignificant amounts of money for such quantum-enabling products, even during pre-commercialization.
Preliminary quantum-enabled products during pre-commercialization
Some vendors may in fact offer preliminary quantum-enabled products — or consulting services — during pre-commercialization, strictly for experimentation and evaluation, but with no prospect for commercial use during pre-commercialization.
Personally, I would refer to these as preview products, even if offered for financial compensation.
These could include quantum algorithms and quantum applications, as well as a variety of quantum-enabling tools as discussed in the preceding section, but again, focused on experimentation and evaluation, not commercial use or production deployment.
Risk of changes to support software and tools during pre-commercialization — beware of premature commercialization
Literally every aspect of quantum computing technology during pre-commercialization is subject to change. Hardware, software, tools, algorithms, applications, programming models, knowledge, methods — you name it, all of it has a high probability of changing by the time true commercialization begins. This especially includes support software and tools.
There is a significant risk of premature commercialization of support software and tools during pre-commercialization. Vendors, customers, and users alike will be very tempted to latch onto appealing technology and make an investment and commitment to stick to their choices, but a year or two or even less of change, evolution, and innovation could trivially make those choices obsolete.
I’m not saying that you can or should avoid such choices, but simply that you have to be aware that such commitments and work products derived from those commitments will likely have to be reworked or started again from scratch when true commercialization does begin.
Vendors will want to sell support software, tools, algorithms, and applications, and customers will want to buy them, but just be aware that those expenditures and investments of time may have to be made again when commercialization does begin, maybe even multiple times, and especially as vendors come and go or as customers switch vendors as capabilities and features evolve.
Risk of business development during pre-commercialization — beware of premature commercialization
The same comments from the preceding section apply to business development. All factors involved in business decisions, including but not limited to the technology, are likely to change, evolve, and even radically change as pre-commercialization progresses.
Business deals arranged during pre-commercialization are unlikely to survive into commercialization. They should be considered short-term. Revision, rework, and restarts should be the expected norm during pre-commercialization — for both technology and business.
Quantum computing is still in the realm of the lunatic fringe
The concept of the lunatic fringe in technology innovation refers to the leading edge, or actually the bleeding edge, of early-early adopters who are willing to try any new technology long before it is proven and ready for commercial deployment. In fact, they don't even mind if it doesn't work properly yet, since they enjoy fixing products themselves.
The lunatic fringe are drawn to quantum computing like moths to a flame — they cannot resist even though they may get burned very badly before the technology really is ready for commercial deployment.
The lunatic fringe needs a special mindset, exceptional resilience, and deep tolerance for uncertainty and change. Indeed, uncertainty and change are key motivators for them.
Pre-commercialization is the ideal environment for the lunatic fringe.
That can also make it a less reasonable environment for those who are not of the same mindset as the lunatic fringe.
Normal, average technical staff should probably wait until pre-commercialization is complete, and in fact until commercialization is complete.
For more on the lunatic fringe, especially related to quantum computing, see my paper:
- When Will Quantum Computing Be Ready to Move Beyond the Lunatic Fringe?
- https://jackkrupansky.medium.com/when-will-quantum-computing-be-ready-to-move-beyond-the-lunatic-fringe-27cc8ddd776e
For deeper background on the basic concept of the lunatic fringe, see my paper:
- What Is the Lunatic Fringe (of Technology)?
- https://jackkrupansky.medium.com/what-is-the-lunatic-fringe-of-technology-ce96297de21b
Quantum Ready — It’s never too early for The Lunatic Fringe
We can debate exactly what the right moment is for a given organization to decide to become Quantum Ready, but for one group, The Lunatic Fringe, it is never too early. After all, by definition, they are willing to work with any technology at any stage, even before it is remotely close to being usable. But, they are not representative of most organizations or most normal, average technical staff.
Quantum Aware is fine, but be careful about Quantum Ready
There’s no real harm in getting a variety of levels of staff to be Quantum Aware, knowledgeable about the general capabilities — and limitations — of quantum computing. But, it may not be so advisable to get many members of staff to the much higher level of Quantum Ready, where they have a deep enough level of training to actually design quantum algorithms and develop quantum applications.
So much can and will change — it’s virtually impossible to stay current for a rapidly moving target.
Expect radical change — continually update vision of what quantum computing will look like
As research, prototyping, experimentation, and even product development progress, our notions and visions of what quantum computing will ultimately look like will evolve and have to be continually updated.
If anybody seriously imagines that they know what a quantum computer will look like in five to seven or ten years, they are seriously mistaken. Expect radical change.
Quantum computing is still a mere laboratory curiosity, not ready for production deployment
At present, quantum computing is a mere laboratory curiosity — not yet ready for production deployment. There is much work to be completed, including and especially basic research, before quantum computing is ready for production deployment.
Some of the aspects of quantum computing which are not ready for production deployment:
- Features.
- Performance.
- Capacity.
- Reliability.
- Availability.
- Ease of use.
- Stability.
- Predictability.
- Support. Including commitment and service level agreements (SLA).
- Availability of working quantum algorithms.
- Availability of working quantum applications.
- Availability of a talent pool of technical staff.
- Development of quantum computer science as a mature field.
- Development of quantum computer engineering as a mature field.
- Development of quantum software engineering as a mature field.
For more on quantum computing as still just a mere laboratory curiosity, see my paper:
- When Will Quantum Computing Advance Beyond Mere Laboratory Curiosity?
- https://jackkrupansky.medium.com/when-will-quantum-computing-advance-beyond-mere-laboratory-curiosity-2e1b88329136
Quantum computing is still more suited for elite technical teams than average, normal technical teams
Quantum computing is still much too complex and ill-suited for use by your average, normal technical team or IT staff. It takes a much more elite level of technical sophistication to master quantum computing.
Maybe five to seven years from now the situation will be different (it should be by then), but for the next few years the design of quantum algorithms and development of quantum applications will need to be reserved for only the most elite technical teams.
Pre-commercialization will be the Wild West of quantum computing — accept that or stay out until true commercialization
There will be plenty of temptations to blindly leap into quantum computing during pre-commercialization, but… better to look before you leap, or in fact don’t leap at all, waiting for true commercialization to actually begin or at least be imminent.
Just be aware of what you are leaping into — constant and incompatible change since quantum computing will remain a mere laboratory curiosity for the indefinite future. That’s fine for The Lunatic Fringe and more technologically sophisticated organizations, but not for most organizations or most normal, average technical staff.
Pre-commercialization is about constant change while commercialization is about stability and carefully controlled and compatible evolution
The whole point of pre-commercialization is that there are lots of open issues and questions without clear or stable answers. The process of resolving them will result in a continuous sequence of changes, sometimes relatively minor but sometimes very disruptive.
The Lunatic Fringe will be perfectly happy with such constant change, but most organizations cannot cope with such constant change. That’s the point of commercialization, to provide a sense of stability, with any change being carefully controlled and done in a compatible manner to preserve technological investments.
During pre-commercialization, there’s no guarantee that work products that work today will work tomorrow, or that worked yesterday will work today. The hardware is evolving rapidly, as is the software.
But once we enter commercialization, customers and users will need to be able to rely on products, algorithms, and applications that work the same one day, one week, one month, one year, five years, and ten years in the future.
Compatibility and consistency will be the watchwords — during commercialization, but during pre-commercialization not so much.
Customers and users prefer carefully designed products, not cobbled prototypes
Again, The Lunatic Fringe will be perfectly happy with the prototype products of pre-commercialization, which are cobbled together from disparate components and which are unpolished, incomplete, inconsistent, and even cryptic and problematic to use. Worst case, they’re more than happy to roll up their sleeves and fix or even replace any problematic or balky products or components.
Commercial customers and users on the other hand would prefer carefully and even artfully designed products, which do everything that is expected of them, do it well, and do it smoothly and without any hitches. Prototypes simply won’t be acceptable to them. And this is what commercialization is for, to design, develop, test, deliver, and deploy carefully designed products that are completely free of any drama.
Customers and users will seek the stability of methodical commercialization, not the chaos of pre-commercialization
Customers and users of commercial products want to avoid all sense of drama in their IT products. They want the stability and consistency of commercialization rather than the chaos of pre-commercialization.
Again, The Lunatic Fringe is perfectly happy and at home with the chaos of pre-commercialization.
To each his own. In truth, there are two distinct audiences for pre-commercialization and commercialization. They don’t see eye to eye. They are not compatible. But, they both have their strengths despite their weaknesses, so both are to be valued and catered to.
Quantum ecosystem
A successful quantum computing commercial product will require a thriving, vibrant, and mutually-supportive ecosystem, which consists of:
- Hardware vendors.
- Software vendors. Tools and support software.
- Consulting firms.
- Quantum algorithms.
- Quantum applications.
- Open source whenever possible. Algorithms, applications and tools. Hardware and firmware as well. Freely accessible plans so that anyone could build a quantum computer. Libraries, metaphors, design patterns, application frameworks, and configurable packaged quantum solutions. Training materials. Tutorials. Examples. All open source.
- Community. Including online discussion and networking. Meetups, both in-person and virtual.
- Analysts. Technical research as well as financial markets.
- Journalists. Technical and mainstream media.
- Publications. Academic journals, magazines, books. Videos and podcasts.
- Conferences. Presentation of papers, tutorials, and trade show exhibits. Personal professional networking opportunities.
- Vendors. Hardware, software, services, algorithms, applications, solutions, consulting, training, conferences.
- Research community. Academia, corporate, nonprofit, and government.
Early, preliminary development of quantum ecosystem during pre-commercialization
The preceding section summarized aspects of the quantum ecosystem which would be expected to thrive during commercialization, but early subsets of that ultimate ecosystem can be expected to begin to take root or occasionally even thrive during pre-commercialization.
Anything that enables and supports pre-commercialization activities — research, prototyping, and experimentation — would be a great candidate for the quantum ecosystem during pre-commercialization.
When might the initial commercialization stage, C1.0, be available?
There’s no clarity or certainty as to the timeframe for commercialization of quantum computing, but it might be illuminating to speculate about maximum, nominal, and minimum paths to both pre-commercialization and initial commercialization — C1.0. These numbers are somewhat arbitrary, but hopefully helpful to bound expectations.
So, here they are, as elapsed times:
- Pre-commercialization. Minimal: 2 years. Nominal: 4 years. Maximal: 10 years.
- Commercialization. Minimal: 2 years. Nominal: 3 years. Maximal: 5 years.
- Total. Minimal: 4 years. Nominal: 7 years. Maximal: 15 years.
Four to seven years seems to be the best, albeit optimistic, bet for the timeframe for C1.0.
Commercialization here refers to the readiness of the initial commercialization stage, C1.0. This would be the first production-quality product. Alpha, beta, and pre-releases would be available earlier.
And to be clear, all of these numbers are highly speculative and subject to change.
IBM 127-qubit Eagle announcement is proof that we’re still in pre-commercialization — and at risk of premature commercialization
Late last year (2021), IBM finally announced the availability of their 127-qubit Eagle quantum computer system. That was a major accomplishment and a big step forward, finally breaking the 100-qubit barrier. But… it’s not yet a commercial product, and despite the achievement and progress it represents, it is still woefully short of what is needed for true commercialization of quantum computing. Hence, it is solid evidence that we’re still deep in the realm of pre-commercialization.
For my own preliminary review of Eagle, see my paper:
- Preliminary Thoughts on the IBM 127-qubit Eagle Quantum Computer
- https://jackkrupansky.medium.com/preliminary-thoughts-on-the-ibm-127-qubit-eagle-quantum-computer-e3b1ea7695a3
Technical details are still sparse, but it doesn’t appear to have made any major breakthrough in qubit fidelity, qubit connectivity, or phase granularity, so other than the big jump in raw qubit count, it’s not terribly noteworthy.
Unfortunately, it has been presented in a slick and flashy manner, with all of the trappings of a commercial product even though it is still fundamentally inadequate to support production-scale practical real-world applications. It is still very much a mere laboratory curiosity. I’m sure that the lunatic fringe desperately want to get their hands on it, but that only emphasizes the conclusion that it is part of pre-commercialization, not even close to being a commercial product.
IBM explicitly admitted that they are not close to the key technical milestone of quantum advantage, without which there would be no good reason to proceed with commercialization of quantum computing:
- “We believe that we will be able to reach a demonstration of quantum advantage — something that can have practical value — within the next couple of years. That is our quest,” Gil said.
- IBM says quantum chip could beat standard chips in two years
- https://www.reuters.com/technology/ibm-says-quantum-chip-could-beat-standard-chips-two-years-2021-11-15/
The term quest is very appropriate, and once again emblematic of a research project, not the mundane tasks of an engineering project.
To be clear, IBM is making great progress, but the point is that they still have a long way to go. Hence, they’re still deep in pre-commercialization, even though they present the technology as if it were imminent and virtually ready to go, when it is not even close to being ready for development and deployment of production-scale practical real-world applications.
In short, it is unfortunately a prime example of premature commercialization.
I would have been happier if they showed some raw lab photos rather than pretentious flashy graphics.
I don’t want to be too negative, but I do think we need to focus more attention on this being part of the pre-commercialization of quantum computing, not commercialization itself.
The IBM press release:
- IBM Unveils Breakthrough 127-Qubit Quantum Processor
- Delivers 127 qubits on a single IBM quantum processor for the first time with breakthrough packaging technology
- New processor furthers IBM’s industry-leading roadmaps for advancing the performance of its quantum systems
- Previews design for IBM Quantum System Two, a next generation quantum system to house future quantum processors
- https://newsroom.ibm.com/2021-11-16-IBM-Unveils-Breakthrough-127-Qubit-Quantum-Processor
The IBM blog post:
- IBM Quantum breaks the 100‑qubit processor barrier
- https://research.ibm.com/blog/127-qubit-quantum-processor-eagle
Initial press coverage by Reuters:
- IBM says quantum chip could beat standard chips in two years
- https://www.reuters.com/technology/ibm-says-quantum-chip-could-beat-standard-chips-two-years-2021-11-15/
Must assure that there are no great unanswered questions hanging over the heads of the commercialization teams
Commercialization of a product cannot begin until all relevant great questions have been answered:
- All relevant research has been completed. No relevant questions remain unanswered.
- All relevant prototyping has been completed. No relevant questions remain unanswered.
- All relevant experimentation has been completed. No relevant questions remain unanswered.
The ultimate goal of pre-commercialization is to assure that there are no great unanswered questions hanging over the heads of professional product engineering teams that could interfere with their ability to develop commercial products by slowing their progress or putting their success at risk. Any needed research, prototyping, or experimentation must be complete and out of the way before commercialization can begin.
No great questions can remain unanswered once commercialization commences.
My apologies — There’s so much more! See my three papers
I’ve tried to keep this paper as short as possible, so it’s limited to summaries and some highlights. Additional details are in my preceding three papers, upon which this paper was based:
Need for more extensive research:
- Essential and Urgent Research Areas for Quantum Computing
- https://jackkrupansky.medium.com/essential-and-urgent-research-areas-for-quantum-computing-302172b12176
Commercialization, pre-commercialization, and premature commercialization:
- Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization
- https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a
Emphasizing the need to double down on pre-commercialization before proceeding to commercialization, and to avoid premature commercialization:
- Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization
- https://jackkrupansky.medium.com/prescription-for-advancing-quantum-computing-much-more-rapidly-hold-off-on-commercialization-but-28d1128166a
Grand finale — So what do we do now??
Just to make sure that we end with the right note:
- Premature commercialization is a really, really, REALLY bad idea.
But just to end on more of a positive, action-oriented note, if all of this and this paper in general feel way too negative, there is a positive message, a way out, a way to avoid this mess and potentially looming disaster:
- Double down on pre-commercialization — more basic research, more prototyping, and more experimentation.
- Don’t even think about commercialization until we have answers to all of the important questions needed to succeed at commercialization.
And I have a paper which focuses exactly on that message:
- Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization
- https://jackkrupansky.medium.com/prescription-for-advancing-quantum-computing-much-more-rapidly-hold-off-on-commercialization-but-28d1128166a
So, that’s it, that’s where we are and our best path forward.
My original proposal for this topic
For reference, here is the original proposal I had for this topic. It may have some value for some people wanting a more concise summary of this paper.
- Risks of premature commercialization of quantum computing. People are jumping the gun when much more research, prototyping, and experimentation is needed. Disappointment. Disenchantment. Backlash. Even the risk of Quantum Winter. The solution is Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization.
Summary and conclusions
- General risks of premature commercialization of quantum computing…
- The technology just isn’t ready. Too much is missing. Too much is too primitive. Too much research is incomplete. Too much needs further research. Too much is incapable of delivering on the many wild promises that have been made.
- Risk of disenchantment and loss of project funding and commitment.
- Failure to complete projects.
- Failure of completed projects to meet expectations.
- Critical project failures now could make it harder to fund credible projects in the future.
- Risk of backlash. Disenchantment could lead to pushback on quantum computing. Denial of the potential for quantum computing.
- Surest path to an early quantum winter. By hitting a critical mass of disenchantment due to unmet expectations.
- Constant rework needed as the technology constantly and radically evolves. That’s the nature of the pre-commercialization stage. It’s a good thing at this stage, but not good for commercialization.
- The technology is changing and evolving rapidly, so likely to be obsolete relatively soon, so it’s bad to bet on it in its current state.
- Insufficient research. Trying to skip too much of the needed research.
- Insufficient prototyping. Trying to skip too much of the needed prototyping.
- Insufficient experimentation. Trying to skip too much of the needed experimentation.
- Premature for any significant quantum advantage on any consistent basis across application categories. Very little, if any, quantum advantage available in the near term.
- Inadequate characterization of performance and capacity of quantum computers, algorithms, and applications. Capabilities, tools, and methods are too primitive. Benchmarking not well developed.
- General comments…
- Commercialization of current technology will NOT lead to dramatic quantum advantage. The hardware is too primitive. Much research is needed.
- Little if any of the current technology will be relevant in 5–10 years. Better to focus algorithm research on expected hardware 2–7 years out and rely on simulation until the hardware is ready.
- Generally focus on simulation rather than running on actual quantum computing hardware since current hardware will rarely represent the ultimate target hardware to be available during commercialization. Or in subsequent stages of commercialization.
- Quantum algorithms should be designed to be automatically scalable to run on future hardware without change. Also to permit them to be simulated with fewer qubits than will be available on larger capacity hardware.
- Don’t imagine that scalability of quantum algorithms and applications is free, cheap, easy, obvious, or automatic. Much hard work is needed. And it needs to be done during pre-commercialization. Attempting scalability during commercialization is a really bad idea. All of the issues need to be identified and worked out before commercialization even begins.
- It’s premature to even begin commercialization. The technology just isn’t ready. Not even close. Both hardware and algorithms, and applications.
- Much pre-commercialization work remains before commercialization can begin.
- Boost research, prototyping, and experimentation — pre-commercialization.
- Much research remains to fully characterize and resolve many technical obstacles. Many engineering challenges don’t have sufficient research results to guide them. Both hardware and algorithms, and applications.
- Hardware may seem to be the primary limiting factor, but algorithms are an even greater limiting factor. We can simulate 32 and 40-qubit algorithms, but they’re nowhere to be found.
- The precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses. But we can make some good, tentative guesses now.
- Variational methods are an unproductive distraction and technical dead end — the focus should be on quantum Fourier transform (QFT) and quantum phase estimation (QPE). It will take years for the hardware to support this, but simulation can be used in the meantime.
- Quantum error correction (QEC) and logical qubits will come in a later stage of commercialization — near-perfect qubits should be good enough for many applications.
- Prototyping and experimentation for quantum algorithms and quantum applications should focus on simulation configured to match the hardware expected at the time of initial commercialization rather than focusing on current, very limited hardware.
- There should be no expectation of running or even testing algorithms or applications for 64 or more qubits during pre-commercialization. Not until the hardware can be confirmed to be approaching the target capabilities for the initial commercialization stage — not just raw qubit count, but quality of the qubits. Simulation-only during pre-commercialization. May be limited to 50, 48, 44, 40, or even 32 qubits based on the limits of the simulator and circuit depth.
- Even initial commercialization will be fairly limited and it could take ten or more subsequent commercialization stages before the full promise of quantum computing can be delivered.
- Any efforts at premature commercialization are doomed to be counterproductive and a distraction from research and simulation for prototyping and experimentation.
- Hardware and algorithm research and development should be allowed to be on their own, parallel but independent tracks. Very slow progress on hardware must not be permitted to slow algorithm progress.
- Double down on pre-commercialization? Double down is a gross understatement. It probably requires a 10X to 50X increase in research, prototyping, and experimentation. Both hardware and algorithms, and applications. Many more people, and much more time and money. Much more.
- Pre-commercialization will be the Wild West of quantum computing. Accept that or stay out until true commercialization begins or is imminent. Some people and organizations require excitement and rapid change while others require calm stability — individuals and organizations must decide clearly which they are.
- Pre-commercialization could take another 2 to 4 years — or longer.
- The initial commercialization stage could take another 2 to 3 years — or longer, beyond pre-commercialization.
- The initial commercialization stage, C1.0, might be ready in 4 to 7 years — or longer. That would be production-quality, with alpha, beta, and pre-releases available earlier.
- Configurable packaged quantum solutions are the best bet for most organizations. Most organizations will not be in a position to design and implement or even understand their own quantum algorithms.
- Quantum-enabled products. Products which are enabled by quantum computing. Such as quantum algorithms, quantum applications, and quantum computers themselves.
- Quantum-enabling products. Products which enable quantum computing. Such as software tools, compilers, classical quantum simulators, and support software. They run on classical computers and can be run even if quantum computing hardware is not available. Also includes classical hardware components and systems, as well as laboratory equipment.
- There are indeed exceptions: products or services which can actually thrive during pre-commercialization. Namely equipment, software, tools, and services which enable pre-commercialization, focused on research, prototyping, and experimentation. Anything but production deployment. Generally, quantum-enabling products.
- Even for exceptions for commercialization during pre-commercialization, be especially wary. Plenty of potential gotchas.
- Keep cost and service level agreements in mind even for the rare exceptions during pre-commercialization.
- The overall message is twofold…
- Double down on pre-commercialization — more basic research, more prototyping, and more experimentation.
- Don’t even think about commercialization until we have answers to all of the important questions needed to succeed at commercialization.
- Assure that there are no great unanswered questions hanging over the heads of professional product engineering teams that could interfere with their ability to develop commercial products by slowing their progress or putting their success at risk. Any needed research, prototyping, or experimentation must be complete and out of the way before commercialization can begin. No great questions can remain unanswered once commercialization commences.
For more of my writing: List of My Papers on Quantum Computing.