# Three Stages of Adoption for Quantum Computing: The ENIAC Moment, Configurable Packaged Quantum Solutions, and The FORTRAN Moment

Designing and developing a sophisticated and complex computer application is a challenging, expensive, and risky proposition. A quantum application with quantum algorithms is even more so. Quantum technology is evolving rapidly, from the hardware, algorithm, and application perspectives, so that the optimal approaches to the conceptualization, design, and development of quantum algorithms and applications will continue to evolve dramatically for years to come. This informal paper identifies and describes a proposal for the three major stages of evolution that adoption of quantum applications (and solutions) will likely follow over the next five to ten years on the path to achieving widespread adoption of quantum computing.

**Caveats:** There are two major caveats for this paper:

- The concern here is only for *production and production-scale operational* deployment of quantum applications. This excludes toy, mockup, experimental, and prototype applications. The focus here is the *commercial deployment* of practical real-world quantum applications, not activities which occur during the *pre-commercialization* stage of the evolution of quantum computing.
- This paper *proposes*, *envisions*, and *advocates* a *potential future* for quantum applications and solutions. There is no guarantee that this envisioned future will come to fruition, nor is there any certainty or confidence as to when it might come to fruition. The only confidence is that it won’t happen in the very near future (the next year or two), and is likely to take at least a few years to develop — and require significant progress in the advancement of quantum computing hardware.

**Counter-caveat:** The proposal of this paper may not be a certainty, but it certainly is a golden *opportunity*.

**Topics covered by this paper:**

- In a nutshell
- The essential goal: Gain as widespread adoption of quantum computing as possible in the shortest amount of time
- Focus on production-scale practical real-world quantum applications
- Comparing to adoption of classical computing
- The three stages of adoption for quantum computing
- Stage 1, Stage 2, and Stage 3
- Reliance on super-elite technical professionals
- Gradual declining of reliance on super-elite technical professionals
- Stage 1 — The ENIAC Moment
- Stage 2 — Configurable packaged quantum solutions
- Stage 3 — The FORTRAN Moment
- The FORTRAN Moment will herald widespread full-custom quantum algorithms and applications
- Technical gating factors for each stage of adoption
- Technical gating factors for Stage 1 (The ENIAC Moment)
- Need for near-perfect qubits
- Need for automatically scalable algorithms
- Technical gating factors for Stage 2 (Configurable packaged quantum solutions)
- Technical gating factors for Stage 3 (The FORTRAN Moment)
- Even in stage 3, not all quantum applications will need quantum error correction
- Stages of quantum advantage
- Expectations after the advent of stage 1
- Expectations after the advent of stage 2
- Expectations after the advent of stage 3
- What stage are we at now?
- Stage 0 — where we are today — pre-commercialization
- Where are all of the 40-qubit algorithms?
- Timing of the stages
- Lessons from the evolution of classical computing
- Beyond stage 3 — stage 4, stage 5, stage 6
- And even beyond stage 6 — to stage 10 and beyond
- Need for ongoing research
- What is quantum expertise?
- What is the potential for quantum-inspired computing?
- Model for stages of adoption for a new technology
- Original proposed topic
- Summary and conclusions

# In a nutshell

Design and development of quantum applications is not amenable to any sort of *boil the ocean* solution which will lead to a rapid breakthrough that solves all problems for everyone for all of time all at once.

The approach advocated in this paper has three main stages to get to relatively widespread adoption of quantum computing for production-scale practical real-world quantum applications:

1. **A few hand-crafted applications (The ENIAC Moment).** Limited to super-elite technical teams.
2. **A few configurable packaged quantum solutions.** Focus super-elite technical teams on generalized, flexible, configurable applications which can then be configured and deployed by non-elite technical teams. Each such solution can be acquired and deployed by a fairly wide audience of users and organizations without any quantum expertise required.
3. **A higher-level programming model (The FORTRAN Moment).** Which can be used by more normal, average, non-elite technical teams to develop custom quantum applications. Also predicated on perfect logical qubits based on full, automatic, and transparent quantum error correction (QEC).

Key points:

- The essential goal is to gain as widespread adoption of quantum computing as possible in the shortest amount of time.
- There will be many stages in the adoption of quantum computing.
- The first three stages will be critical and establish the initial widespread adoption and usage.
- The first stage will mark the first significant production-scale practical real-world quantum application.
- That initial success will be replicated and extended by numerous organizations.
- But, all of the stage 1 work will require super-elite technical teams, placing such efforts beyond the reach of most organizations.
- Full quantum error correction (QEC) will not be needed for Stage 1. Near-perfect qubits coupled with some minimal degree of manual error mitigation should be sufficient.
- Stage 1 will have a lot of visibility, but only minimal actual application results.
- The second stage, the deployment of configurable packaged quantum solutions, will mark the first wave of widespread adoption.
- Super-elite technical teams will still be required to design and create configurable packaged quantum solutions.
- But normal, average, non-elite technical teams at even average organizations will be able to configure and deploy those configurable packaged quantum solutions without any of the intensive quantum expertise that was needed to design and create those solutions. All of the quantum expertise lies buried deep under the hood of the configurable packaged quantum solutions.
- As with stage 1, full quantum error correction (QEC) will not be needed for Stage 2. Near-perfect qubits coupled with some minimal degree of manual error mitigation should be sufficient.
- A relatively modest collection of configurable packaged quantum solutions will be able to meet many of the quantum needs of a wide range of organizations. Certainly not all of their needs, but enough that quantum computing is now not only very visible but achieving a fairly high volume of very practical applications.
- What remains unaddressed after that second stage are custom applications.
- The third stage will finally enable non-elite technical teams to design, implement, and deploy full-custom quantum applications with literally no quantum expertise. As with configurable packaged quantum solutions, all of the hard-core quantum expertise lies buried deep under the hood of more advanced programming models, application frameworks, higher-level algorithmic building blocks, rich libraries, and even quantum-native programming languages, which enable non-elite professionals to develop solutions from scratch without the direct involvement or dependence on super-elite quantum professionals or even any quantum expertise.
- The third stage also ushers in the era of fault-tolerant quantum computing with perfect logical qubits which are enabled by full, automatic, and transparent quantum error correction (QEC).
- Even in stage 3, not all quantum applications will need quantum error correction. Some applications will run fine with only near-perfect qubits.
- Each stage builds on and extends the previous stage, so that by the third stage there will be a mix of very high-end applications designed and developed by super-elite technical teams, widespread deployment of configurable packaged quantum solutions, and a growing population of custom quantum applications based on fault-tolerant quantum computing and higher-level programming models.
- But it doesn’t end with these three stages. They are only the beginning, the start of widespread adoption, comparable to where classical computing was in the late 1950’s and early 1960’s — very impressive and widespread, but with an even brighter future in the years and decades ahead.
- Stage 3 will be not unlike the confluence of the FORTRAN programming language and the transistor which really kicked classical computing into high gear in 1958.
- Each of these stages requires an increasing level of hardware capability. Some of that improved hardware gets used for raw performance and capacity, but a fair amount of it gets used to make it easier for non-elite technical teams to design and implement quantum solutions.
- Or, put another way, a *decreasing* level of quantum expertise is needed for successive stages.
- How quantum-inspired computing might fit into this staged model of adoption is an open question, but it does appear to have significant potential. I discuss some possibilities, but leave it as a separate exercise.
- When might all of this happen? Stage 1 (The ENIAC Moment) in two to four years, nominally in three years. Stage 2 (Configurable packaged quantum solutions) in another one to three years, three to seven years from now, nominally five years from now. Stage 3 (The FORTRAN Moment) in another two to four years, five to eleven years from now, nominally eight years from now. Or six to nine years from now on a looser basis. Those are all just wild but educated guesses.

# The essential goal: Gain as widespread adoption of quantum computing as possible in the shortest amount of time

We currently have a significant amount of quantum computing technology which could be used for a wide range of applications, but there are many obstacles which preclude widespread adoption of quantum computing at this time. This paper proposes a model for how to pursue adoption of quantum computing which will address these many obstacles to achieving the central, essential goal to:

*Gain as widespread adoption of quantum computing as possible in the shortest amount of time.*

To be clear, it is not the goal of this paper to describe getting to the *maximal* or necessarily *optimal* degree, level, and extent of adoption of quantum computing, but simply to *widespread* adoption in the *shortest* amount of time. There will still be plenty of room for improvement and expansion beyond that proposed by this paper.

In fact, this paper also hints at expansion of adoption beyond the three stages covered.

In summary:

- Stage 1 is reasonably useful, at least for some niche applications and a relatively small number of organizations.
- Stage 2 is quite useful, for a broader range of applications and for a fairly wide range of organizations.
- Stage 3 is the broadest, deepest, and richest adoption, for an even broader range of organizations and for a deeper and richer set of applications.
- Beyond stage 3 gets incrementally broader, deeper, and richer over time.

# Focus on production-scale practical real-world quantum applications

The central focus of this paper is on commercial-grade and industrial-grade applications for quantum computing — what we call *production-scale practical real-world quantum applications*.

This paper is *not* focused on:

- **Toy and small-scale algorithms and applications.**
- **Experimentation.**
- **Prototyping.**
- **Demonstration projects.**
- **Mockups.**
- **Proof of concept projects.**
- **Testing.**
- **Benchmarking.**
- **Evaluation.**
- **Education.**
- **Training.**
- **Familiarization.**
- **Research.**

# Comparing to adoption of classical computing

The three stages covered by this paper are comparable to classical computing in the 1940’s, 1950’s, and early 1960’s.

Classical computers were in fairly widespread use and for a fairly wide range of applications by the early 1960’s.

Of course there was a veritable explosion of classical computing from the mid-1960’s, 1970’s, 1980’s, and beyond. But getting to where classical computing was in the late 1950’s and early 1960’s was a phenomenal achievement.

Similarly, this paper is concerned primarily with a comparable process of really getting the ball rolling with quantum computing to the degree that we saw with classical computing in the mid to late 1950’s and early 1960’s, with widespread applications in:

- **Science.**
- **Engineering.**
- **Business.**
- **Government.**

# The three stages of adoption for quantum computing

Without further ado, here are the likely three stages of adoption for quantum computing for production-scale practical real-world quantum applications:

1. **The ENIAC Moment.** Hand-crafted applications. Very limited deployments. Relying on super-elite technical teams at only the most elite of organizations.
2. **Configurable packaged quantum solutions.** Widespread deployments of a relatively few applications. Requires no quantum expertise.
3. **The FORTRAN Moment.** Higher-level programming model. Widespread development of custom applications. No longer requires super-elite technical teams and is no longer limited to only the most elite of organizations.

# Stage 1, Stage 2, and Stage 3

For convenience, the three stages of adoption for quantum computing will be referred to by number:

1. **Stage 1.** The ENIAC Moment.
2. **Stage 2.** Configurable packaged quantum solutions.
3. **Stage 3.** The FORTRAN Moment.

# Reliance on super-elite technical professionals

One thread that runs through all stages of the adoption of quantum computing for production-scale practical real-world quantum applications is a reliance on super-elite technical professionals. Normal, average non-elite technical staff just won’t cut it. That’s an unfortunate truth that will persist through all stages of adoption of quantum computing, but there is a recognition of an evolution of technologies and methods to moderate and reduce that reliance as time passes.

# Gradual declining of reliance on super-elite technical professionals

Stage 1 will be the toughest, being 100% reliant on super-elite technical professionals for all levels of quantum technologies, from hardware, firmware, and support software, to algorithms and applications.

Stage 2 will still be reliant on super-elite technical professionals for development of configurable packaged quantum solutions, but normal, average, non-elite technical professionals will be able to configure and deploy sophisticated quantum solutions. That will be a big improvement. So super-elite technical professionals will still have a central role, but significantly reduced.

Stage 3 will finally enable normal, average, non-elite technical professionals to develop custom quantum applications. Super-elite technical professionals will still have key roles in designing and developing core, underlying technology, as well as a smaller collection of high-value niche applications, but no longer for all applications in general.

So there will always be a need for elite and super-elite technical professionals, especially in the earlier stages of the adoption of quantum computing, but there does need to be a sense of urgency on reducing that need as rapidly as possible.

# Stage 1 — The ENIAC Moment

The initial stage of adoption for quantum computing — ushered in by The ENIAC Moment — relies on super-elite technical professionals using a wide range of tricks and supreme cleverness to achieve workable quantum solutions. For example, manual error mitigation. Only the few and most elite organizations will have the talent, skill, and resources to design, develop, and deploy quantum applications at this stage.

The ENIAC Moment will mark the start of this stage, to be followed by other teams and organizations attempting to replicate the success of The ENIAC Moment, but only at great cost and using the most elite of technical teams.

Anybody can design and develop a *toy quantum application*, but achieving production-scale practical real-world quantum applications will be a monumental feat requiring great skill, effort, resources, and risk.

Some of the more elite consulting firms will likely be able to assist large organizations design, develop, and deploy quantum applications at this stage, but only at great cost — and great risk.

For more details on The ENIAC Moment, see my paper:

- *When Will Quantum Computing Have Its ENIAC Moment?*: https://jackkrupansky.medium.com/when-will-quantum-computing-have-its-eniac-moment-8769c6ba450d

# Stage 2 — Configurable packaged quantum solutions

The second stage of adoption for quantum computing — based on *configurable packaged quantum solutions*, as discussed in a separate paper — also relies on similar *super-elite professionals* to create complete frameworks for solutions which can then be configured and deployed by non-elite professionals and less-elite organizations to achieve practical, production-scale quantum solutions. Those non-elite professionals are able to prepare their domain-specific input data in a convenient form compatible with their non-elite capabilities, without having to comprehend, touch, or even examine the underlying quantum algorithms or application code.

Deployment of quantum applications will become widespread in this second stage, but the total number of unique quantum applications will remain quite modest and limited.

Consulting firms will facilitate planning, deployment, validation, monitoring, and maintenance of configurable packaged quantum solutions at a reasonably modest cost. Modest being a relative term. Although non-elite technical teams can accomplish all of this work, less-elite organizations will tend to seek the comfort of knowing that a more-sophisticated and experienced technical team is watching over the entire process. At least at the start.

More-sophisticated organizations will develop their own talent pool to manage planning, deployment, validation, monitoring, and maintenance of configurable packaged quantum solutions at a lower cost (they hope!) without the need to rely on expensive consulting firms.

But overall, organizations deploying configurable packaged quantum solutions won’t need to know much at all about *quantum technology* or even have any quantum expertise since it’s all buried deep under the hood and they will be able to rely on external organizations for any hard-core *quantum expertise*.

For more details on configurable packaged quantum solutions, see my paper:

- *Configurable Packaged Quantum Solutions Are the Greatest Opportunity for Widespread Adoption of Quantum Computing*: https://jackkrupansky.medium.com/configurable-packaged-quantum-solutions-are-the-greatest-opportunity-for-widespread-adoption-of-f66bed126a36

# Stage 3 — The FORTRAN Moment

The final stage of adoption for quantum computing — led by The FORTRAN Moment — relies on much more advanced and high-level programming models, higher-level algorithmic building blocks, true higher-level quantum programming languages, application frameworks, and libraries, as well as logical qubits based on full, automatic, and transparent quantum error correction to enable non-elite professionals to develop full-custom quantum solutions from scratch without the need for any quantum expertise themselves and without the direct involvement or dependence on super-elite technical professionals for their quantum expertise. This will eventually happen, but quite a few years down the road. For now, the target to aim for is configurable packaged quantum solutions.

By *final* we don’t mean to suggest that nothing follows, but simply that quantum computing has finally achieved a *level of maturity* suitable for a wide range of applications, rather than being a specialized, niche technology.

For more detail on The FORTRAN Moment for quantum computing, see my paper:

- *When Will Quantum Computing Have Its FORTRAN Moment?*: https://jackkrupansky.medium.com/when-will-quantum-computing-have-its-fortran-moment-995f510605cd

# The FORTRAN Moment will herald widespread full-custom quantum algorithms and applications

As just mentioned, The FORTRAN Moment for quantum computing will finally usher in the era of full-custom quantum algorithms and applications, when the vast majority of organizations will finally be able to easily design and develop their own custom quantum algorithms and applications.

The key capabilities which will enable The FORTRAN Moment include:

- **Advanced and high-level programming models.**
- **Higher-level algorithmic building blocks.**
- **True higher-level quantum programming languages.**
- **Application frameworks.**
- **Libraries.**
- **Perfect logical qubits based on full, automatic, and transparent quantum error correction (QEC).**

The FORTRAN Moment will:

- **Remove the remaining obstacles and impediments to designing and developing custom quantum algorithms and applications.**
- **But… still require at least some degree of skill and expertise at design and development.** Classical computing skills should be sufficient, but still required.
- **Open the door for much more widespread custom quantum algorithm and application development.**
- **But the need, benefits, and opportunities for configurable packaged quantum solutions will continue and remain dominant.** Half to 80% of organizations may still prefer configurable packaged quantum solutions even though custom quantum algorithms and applications are now within reach.
- **Custom quantum algorithms will expand the richness of applications in a typical organization.**

# Technical gating factors for each stage of adoption

Overall, the three stages of adoption for quantum computing are driven primarily by technical factors.

Each stage has its own unique factors in three areas:

- **Level of sophistication of technical staff.** To design and implement quantum algorithms and quantum applications.
- **Hardware.**
- **Algorithms, applications, and software.** Including architecture and software engineering.

# Technical gating factors for Stage 1 (The ENIAC Moment)

**Level of sophistication of technical staff:**

- **Super-elite technical teams will be required to design, implement, and deploy quantum algorithms and applications in stage 1.** Very deep quantum expertise required.
- **In short, quantum applications will be out of reach for normal, average, non-elite technical teams.**
- **Extreme technical cleverness will be required.** Especially to take advantage of very limited hardware.

**Hardware:**

- **Qubit count.** Somewhere in the range of 40 to 160.
- **Qubit fidelity.** Near-perfect qubits — generally four to five nines. Possibly 3.5 nines for some applications. Slim chance that three nines might be sufficient.
- **Qubit connectivity.** Significantly better than merely nearest-neighbor.
- **Greater qubit coherence.** To support deeper, more complex algorithms.
- **Some degree of manual error mitigation.**
- **Finer granularity for phase and probability amplitude.** Needed for quantum Fourier transform (QFT), quantum phase estimation (QPE), and quantum amplitude estimation (QAE).

**Algorithms and applications:**

- **More sophisticated algorithms.** Require elite technical teams.
- **Automatically scalable algorithms.** See the section below.
- **More sophisticated applications.** Require elite technical teams.

# Need for near-perfect qubits

*Noisy qubits* and so-called *NISQ devices* just won’t cut it for achieving production-scale practical real-world quantum applications. And full-blown *quantum error correction (QEC)* won’t be available for quite a few years. So, what we really need now, in the near future, like within two years, to begin serious adoption of quantum computing are *near-perfect qubits* — qubits with a *qubit fidelity* of at least *four nines* of fidelity.
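To see why four nines matter, consider some back-of-the-envelope arithmetic (a rough sketch; the independent-error assumption and the 1,000-gate circuit size are illustrative assumptions, not figures from this paper):

```python
# Rough error-budget arithmetic for near-perfect qubits (illustrative only).
# Assumes gate errors are independent and that per-gate fidelity dominates,
# which is a simplification; real devices also suffer correlated errors
# and decoherence.

def circuit_success_probability(gate_fidelity: float, gate_count: int) -> float:
    """Probability that a circuit of gate_count gates runs with no gate error."""
    return gate_fidelity ** gate_count

# A modest 1,000-gate circuit under different qubit fidelities:
for label, fidelity in [("three nines", 0.999),
                        ("four nines", 0.9999),
                        ("five nines", 0.99999)]:
    p = circuit_success_probability(fidelity, 1000)
    print(f"{label}: ~{p:.0%} chance of an error-free 1,000-gate run")
```

Under these assumptions, three nines leaves a 1,000-gate circuit failing most of the time, while four nines gets roughly nine out of ten runs through cleanly, which is the difference between unusable and usable with modest error mitigation.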

For more on *near-perfect qubits*, see my paper:

- *What Is a Near-perfect Qubit?*: https://jackkrupansky.medium.com/what-is-a-near-perfect-qubit-4b1ce65c7908

# Need for automatically scalable algorithms

Even stage 1 needs algorithms which are *automatically scalable*. This means:

- **Use generative coding.** To generate the algorithm and quantum circuit dynamically based on input size and parameters.
- **Test on smaller systems.** For small input data.
- **Test on simulators up to 32–40 qubits.** For a range of input data.
- **Automatically analyze the algorithm to assure that it is scalable.** Assure that it doesn’t use any coding patterns which don’t scale.
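As a minimal illustration of the generative-coding point (pure Python, no particular quantum SDK assumed; the tuple-based gate representation is purely hypothetical), a single circuit-builder function can emit a quantum Fourier transform at any requested size, so the identical code validated at simulator scale also generates the larger production circuit:

```python
from math import pi

def generate_qft(num_qubits: int) -> list[tuple]:
    """Generate the gate list for a QFT on num_qubits qubits.

    The circuit is produced programmatically from the input size, so the
    same code scales from simulator-sized tests up to larger hardware.
    Gates are returned as simple tuples here; a real application would
    emit them through whatever SDK it targets.
    """
    gates = []
    for target in range(num_qubits):
        gates.append(("h", target))
        # Controlled-phase rotations, halving the angle with distance.
        for control in range(target + 1, num_qubits):
            angle = pi / 2 ** (control - target)
            gates.append(("cphase", control, target, angle))
    # Final qubit-order reversal.
    for i in range(num_qubits // 2):
        gates.append(("swap", i, num_qubits - 1 - i))
    return gates

# Same generator, different scales — test small, deploy large:
print(f"4-qubit QFT: {len(generate_qft(4))} gates; "
      f"40-qubit QFT: {len(generate_qft(40))} gates")
```

The automatic scalability analysis mentioned above could then, for example, inspect the generated gate list for patterns whose counts grow faster than the target hardware can support.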

For more on *scalable quantum algorithms*, see my paper:

- *Staged Model for Scaling of Quantum Algorithms*: https://jackkrupansky.medium.com/staged-model-for-scaling-of-quantum-algorithms-d1070056907f

# Technical gating factors for Stage 2 (Configurable packaged quantum solutions)

**Level of sophistication of technical staff:**

- **Although super-elite technical teams will be required to design and implement configurable packaged quantum solutions in stage 2, the adoption and deployment of such solutions can be performed by normal, average, non-elite technical teams.** No quantum expertise required.
- **In short, the focus and essential need is for normal, average, non-elite technical teams.** No quantum expertise required.
- **Extreme technical cleverness will be required for the super-elite teams who design and implement configurable packaged quantum solutions.** Very deep quantum expertise required.
- **No technical cleverness will be required for the normal, average, non-elite technical teams who acquire, configure, and deploy those configurable packaged quantum solutions.**

**Hardware:**

- **Modest to moderate incremental improvements of hardware…** No specific requirements overall, but each application will have its own requirements.
- **Moderately more qubits.**
- **Moderately better qubit fidelity.**
- **Moderately better qubit connectivity.**
- **Moderately greater qubit coherence.** To support deeper, more complex algorithms.
- **Moderately finer granularity of phase and probability amplitude.**

**Algorithms and applications:**

- **More sophisticated algorithms.** Generative coding.
- **Sophisticated software architecture.** Focus on software engineering rather than mere coding.
- **Focus on resilience.**
- **Focus on configurability.**

# Technical gating factors for Stage 3 (The FORTRAN Moment)

**Level of sophistication of technical staff:**

- **Although super-elite technical teams will be required to design and implement the underlying technology, the adoption, design, implementation, and deployment of quantum applications which utilize that technology can be performed by normal, average, non-elite technical teams.** Depth of required quantum expertise will be dramatically reduced.
- **In short, the focus and essential need is for normal, average, non-elite technical teams.** Less focus on quantum expertise.

**Hardware:**

- **Quantum error correction (QEC).** Full, automatic, and transparent. Support for perfect logical qubits.
- **Much higher physical qubit count.** Needed to support QEC and perfect logical qubits.
- **General qubit improvements.** Fidelity, connectivity, coherence for circuit depth, finer granularity of phase and probability amplitude. Although QEC will cover most of these needs, better physical qubits will lead to better logical qubits, and fewer physical qubits needed for each logical qubit.
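To give a feel for why the physical qubit count must be much higher, here is a rough sketch (the surface code and the 2·d² overhead per logical qubit are common rule-of-thumb assumptions, not figures from this paper):

```python
# Illustrative arithmetic for QEC overhead, assuming a surface code where
# one logical qubit needs roughly 2 * d**2 physical qubits at code
# distance d. Both the factor and the distances below are rough
# assumptions for illustration, not hard requirements.

def physical_qubits_needed(logical_qubits: int, code_distance: int) -> int:
    """Approximate physical qubits for a given logical qubit count."""
    return logical_qubits * 2 * code_distance ** 2

# 100 perfect logical qubits at a few plausible code distances:
for d in (7, 15, 25):
    print(f"distance {d}: ~{physical_qubits_needed(100, d):,} physical qubits")
```

This is also why the last item above matters: better physical qubits permit a smaller code distance, which cuts the physical-per-logical overhead quadratically.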

**Algorithms, applications, and software:**

- **Improved programming models.** Much higher level.
- **High-level algorithmic building blocks.** Much higher level. Richer functionality.
- **True high-level quantum programming language.** Native support for the higher-level programming models and algorithmic building blocks.
- **Much more advanced algorithms.**
- **Application frameworks.**
- **Framework-based applications.**
- **Libraries.**
- **Analysis tools.** To detect potential problems in quantum algorithms, including coding patterns which are unlikely to scale well.

# Even in stage 3, not all quantum applications will need quantum error correction

Design and development of custom production-scale quantum applications by non-elite technical staff will generally have to wait for stage 3 (The FORTRAN Moment) with the advent of *perfect logical qubits* enabled by full, automatic, and transparent *quantum error correction (QEC)*.

Before then — in stages 1 and 2 — quantum error correction won’t even be available, so all applications and configurable packaged quantum solutions will have to get by with *near-perfect qubits* and manual error mitigation.

But even during stage 3, when quantum error correction (QEC) is available, not all applications will require it. Some exceptions:

- **High-end applications may require more qubits than are available as logical qubits.**
- **High-end applications may have higher performance requirements than offered by logical qubits.**
- **High-end applications may not need the additional qubit fidelity of perfect logical qubits.**
- **Configurable packaged quantum solutions will have been optimized to get by with near-perfect qubits and some manual error mitigation.**
- **Even some applications developed by non-elite technical staff may not require the additional qubit fidelity.** Near-perfect qubits may be sufficient, enabling the applications to be run on smaller quantum computers, using physical qubits rather than logical qubits.

In short, even though stage 3 is predicated on the availability of quantum error correction, a fair fraction of applications won’t require it. But, it may not always be possible to tell in advance which applications will need it and which won’t need it.

# Stages of quantum advantage

There are three rough stages on the path towards achieving *quantum advantage* over classical computing:

- **Minimal quantum advantage.** Roughly 1,000X a classical solution.
- **Substantial quantum advantage.** Roughly 1,000,000X a classical solution.
- **Dramatic quantum advantage.** Roughly one quadrillion X a classical solution.

For more on *dramatic quantum advantage*, see my paper:

- *What Is Dramatic Quantum Advantage?*: https://jackkrupansky.medium.com/what-is-dramatic-quantum-advantage-e21b5ffce48c

The first two stages are sometimes known as *fractional quantum advantage*. For more on fractional quantum advantage, see my paper:

- *Fractional Quantum Advantage — Stepping Stones to Dramatic Quantum Advantage*: https://jackkrupansky.medium.com/fractional-quantum-advantage-stepping-stones-to-dramatic-quantum-advantage-6c8014700c61

The stages for quantum advantage during the stages of adoption for quantum computing:

- **The ENIAC Moment.** Hopefully at least the lower end of *substantial quantum advantage*, but possibly still somewhere in the range of *minimal quantum advantage*. *Dramatic quantum advantage* is a real possibility, but not highly likely or probable.
- **Configurable packaged quantum solutions.** Well within the range of *substantial quantum advantage*, but hopefully at least some applications are achieving *dramatic quantum advantage*.
- **The FORTRAN Moment.** *Dramatic quantum advantage* is readily within reach of all applications which utilize sufficient quantum parallelism (at least 50 qubits in a single parallel quantum computation — 2⁵⁰ = one quadrillion parallel operations), but some applications may only use enough quantum parallelism to achieve *substantial quantum advantage* or even only *minimal quantum advantage*. It almost purely depends on the size of the *solution space* being evaluated by the quantum computation.
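The mapping from an advantage ratio to the qubits needed in a single parallel computation is simple arithmetic, sketched here (assuming the best case where the full 2ⁿ solution space is exploited):

```python
from math import ceil, log2

def qubits_for_advantage(advantage: float) -> int:
    """Minimum qubits n in one parallel computation so that 2**n >= advantage."""
    return ceil(log2(advantage))

# The three rough stages of quantum advantage:
print(qubits_for_advantage(1_000))       # minimal: 2**10 = 1,024
print(qubits_for_advantage(1_000_000))   # substantial: 2**20 ~ one million
print(qubits_for_advantage(10 ** 15))    # dramatic: 2**50 ~ one quadrillion
```

So roughly 10, 20, and 50 qubits of quantum parallelism correspond to minimal, substantial, and dramatic quantum advantage, respectively, in this best-case view.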

There are a lot of factors which go into the calculation of quantum advantage. For example, *shot count* or *circuit repetitions*. See my paper:

- *What Is the Quantum Advantage of Your Quantum Algorithm?*: https://jackkrupansky.medium.com/what-is-the-quantum-advantage-of-your-quantum-algorithm-8f481aefe8d0

# Expectations after the advent of stage 1

Once **The ENIAC Moment** has been reached, marking the advent of stage 1, we can expect:

- **Copycat applications.** The same exact application from different vendors, other sources, and customers.
- **Derivative applications.** Almost identical, but with incremental differences.
- **Applications with complexity comparable to the initial stage 1 application.** Distinct applications, but the initial stage 1 application blazed the trail and showed the way.
- **Incrementally more complex than the initial stage 1 application.** Each new application will reach a little further.
- **Incremental hardware advances enable incrementally more complex applications.**
- **But all of this will continue to require super-elite technical teams.** Putting development of such applications out of the reach of most organizations.

# Expectations after the advent of stage 2

Once the first *configurable packaged quantum solution* has been produced and deployed, marking the advent of stage 2, we can expect:

- **Copycat configurable packaged quantum solutions.** The same exact solution (at least functionally equivalent), but from different vendors and other sources.
- **Derivative configurable packaged quantum solutions.** Almost identical, but with incremental differences.
- **Solutions with complexity comparable to the initial configurable packaged quantum solution.** Distinct solutions, but the initial configurable packaged quantum solution blazed the trail and showed the way.
- **Incrementally more complex than the initial configurable packaged quantum solution.** Each new solution will reach a little further.
- **Incremental hardware advances enable incrementally more complex solutions.**
- **Super-elite (or even merely elite) technical teams won’t be required for most organizations.** Only those organizations designing and creating configurable packaged quantum solutions will require super-elite technical teams.
- **Still occasional stage 1-style applications.** Very high complexity. Super-elite technical teams required.

# Expectations after the advent of stage 3

Once **The FORTRAN Moment** has been reached, marking the advent of stage 3, we can expect:

- **Copycat applications.** The same exact applications, but from different customers.
- **Derivative applications.** Almost identical, but with incremental differences.
- **Applications with complexity comparable to the initial stage 3 applications.** Distinct applications, but the initial high-level applications blazed the trail and showed the way.
- **Incrementally more complex than the initial stage 3 applications.** Each new application will reach a little further and become a little richer.
- **Incremental hardware advances enable incrementally more complex applications.**
- **Most applications won’t require elite technical teams.**
- **Some applications will require semi-elite technical teams.** Greater complexity.
- **A few applications will still require super-elite technical teams.** Very high-end. Very high performance. Comparable to the effort that went into stage 1, but with a much higher degree of complexity.
- **Configurable packaged quantum solutions will still drive the bulk of adoption.** The ongoing sweet spot for widespread adoption. A broader and richer collection of configurable packaged quantum solutions will be developed, marketed, and deployed for the bread-and-butter applications that don’t require custom applications.

# What stage are we at now?

Sorry to say it, but we’re not even close to being at even *stage 1*.

We’re essentially at… *stage 0*…

# Stage 0 — where we are today — pre-commercialization

Sorry to say it, but for all of the recent advances in quantum computing, we’re still at what we could call *stage 0*. Simply meaning whatever occurs before stage 1.

By definition, we still haven’t achieved a *production-scale practical real-world quantum application*. If we had, then by definition we would be at The ENIAC Moment.

So, to be clear, stage 0 is by definition *pre-production*.

Stage 0 also fits in with my conceptualization of *pre-commercialization*. See my paper:

- *Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization* - https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a

Stage 0 is characterized by:

- **Primarily focused on research.**
- **Lots of academic research papers, but little of practical utility.** Somewhat limited by current hardware, but also limited by the lack of off-the-shelf algorithms that will be ready to go when the stage 1 hardware is available.
- **Familiarization.** Working with algorithms and applications with no intended purpose other than simply becoming familiar with the technology.
- **Small, toy algorithms and applications.** Nothing serious. Nothing that fully solves any production-scale practical real-world problems. A desire and initial effort to move beyond simple familiarization.
- **Modest niche applications.** They do accomplish something practical, but nothing of any great complexity. No significant or dramatic quantum advantage. With the one exception of generating random numbers, which even the simplest of quantum computers can do very well.
- **Experimentation.** Trying things to see where they lead, or as attempts to achieve some goal.
- **Prototyping.** A more serious attempt to produce something resembling a product, or at least portions of a product.
- **Demonstration.** Of capabilities, or possibly of a product. Suitable for presentation and review by others.
- **Proof of concept projects.** Determine what is actually feasible.
- **No production-scale practical real-world quantum applications.**
- **No production-scale quantum algorithms.**
- **Educational.**
- **Training.**
- **Minimal standardized high-level algorithmic building blocks.**
- **Great plans, expectations, and promises for the future.** But nothing delivered any time soon.
- **Limited hardware.** Limited qubit count. Limited qubit fidelity. Limited qubit connectivity. Limited qubit coherence and circuit depth. Coarse granularity of phase and probability amplitude. Limited ability to support quantum Fourier transform (QFT), quantum phase estimation (QPE), and quantum amplitude estimation (QAE).
- **No significant quantum advantage.** Essentially no quantum advantage at all, other than generating random numbers.

# Need to avoid premature commercialization

Sure, we’d like to get started on stage 1 ASAP, but we have to be careful not to jump the gun, careful not to engage in *premature commercialization*.

I discuss this in depth in my paper:

- *Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization* - https://jackkrupansky.medium.com/prescription-for-advancing-quantum-computing-much-more-rapidly-hold-off-on-commercialization-but-28d1128166a

We need to focus much more intently on *pre-commercialization* — particularly *research*, as well as *experimentation* and *prototyping* — rather than jumping too quickly into *commercialization*, which at this point would be *premature commercialization*.

For more on *pre-commercialization* itself, see my paper:

- *Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization* - https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a

# Where are all of the 40-qubit algorithms?

Just to highlight what a primitive stage we are at right now, I note that we don’t have much in the way of 40-qubit or even 32-qubit algorithms being discussed.

Sure, we don’t have 40-qubit or even 32-qubit real quantum hardware with sufficient qubit fidelity, connectivity, and coherence time, but we do have 32-qubit and 40-qubit simulators, so at least researchers should be able to work with such algorithms.
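
As a rough sketch of why classical simulation itself tops out at around 40 qubits (my own back-of-the-envelope arithmetic, assuming the common representation of 16 bytes per complex amplitude), the memory for a full statevector doubles with each added qubit:

```python
def statevector_bytes(qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory for a full statevector: 2**n complex amplitudes of 16 bytes."""
    return (2 ** qubits) * bytes_per_amplitude

# 32 qubits fits on a single large server; 40 qubits needs a sizable cluster.
assert statevector_bytes(32) == 64 * 2 ** 30   # 64 GiB
assert statevector_bytes(40) == 16 * 2 ** 40   # 16 TiB
```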

Just the other day I saw a very recent research paper which used all of… 5 qubits. Yes, five — F-I-V-E qubits. That’s not atypical. Or seven or eleven. Even 16 or 23 qubits are considered extreme, leading edge. Not even close to my 32-qubit minimum.

Read more about this depressing state of affairs in my paper:

- *Where Are All of the 40-qubit Quantum Algorithms?* - https://jackkrupansky.medium.com/where-are-all-of-the-40-qubit-quantum-algorithms-14b711017086

# Timing of the stages

There’s no clear and definitive estimation of the timing of the three stages of adoption of quantum computing, but a fair guess might be:

- **Stage 1 — The ENIAC Moment.** Two to four years from now. Nominally three years from now.
- **Stage 2 — Configurable packaged quantum solutions.** One to three years after The ENIAC Moment. Three to seven years from now. Nominally five years from now.
- **Stage 3 — The FORTRAN Moment.** Two to four years after the advent of configurable packaged quantum solutions. Five to eleven years from now. Nominally eight years from now. Or six to nine years from now on a looser basis.

# Lessons from the evolution of classical computing

The commencement of stage 3 will be somewhat analogous to 1958 for classical computing. Classical computers evolved dramatically from the 1940’s through the mid-1950’s to 1958: from electromechanical relays to vacuum tubes, from very primitive forms of memory to core memory, with large increases in processing speed and capacity.

The transistor was invented in 1947, but it took a full decade before it had evolved to be practical for large-scale computers. There were a very few transistor-based computers in the mid 1950’s, but it was only in 1958 that the transistor was finally mature enough that all of the major computing companies switched from vacuum tubes to transistors. From 1958 on, all new computers were based on transistors. This transition opened the floodgates for many more interesting computing advances to come, such as:

- **Improvements to the transistor.** Smaller, faster, cheaper, more reliable.
- **Operating systems.**
- **Minicomputers.**
- **High-speed and high-capacity computing.**
- **Real-time and avionics computing.** High-performance aircraft. Missiles. Rockets and spacecraft.
- **Small-scale integrated circuits.**
- **Medium-scale integrated circuits.**
- **Large-scale integrated circuits.**
- **Multics, UNIX.** More sophisticated operating systems.
- **Semiconductor memory.** Quickly replacing core memory.
- **Microprocessors.**
- **Electronic calculators.**
- **Personal computers.**
- **Very large-scale integrated circuits.**
- **Network interface computers.** ARPANET.
- **Network routers.**
- **Workstations.** High-speed, high-capacity, very interactive, large displays.
- **Graphical user interfaces.**
- **High-speed and high-capacity networked server computers.**
- **Productivity applications.**
- **Personal computing applications.**
- **Office applications.**
- **Email.**
- **Internet.**
- **World Wide Web.**
- **Media.** Audio. Video. Images.
- **Web applications.**
- **Smart phones.**
- **Tablets.**
- **Wearable computers.**
- **Internet of Things.**
- **And so much more.**

The point is that the deployment of discrete transistors did not mark the end of the evolution of classical computing; rather, it enabled and kicked off many further innovations, many of which couldn’t even be envisioned at the time.

For a timeline of the early stages of classical computing, see my paper:

- *Timeline of Early Classical Computers* - https://jackkrupansky.medium.com/timeline-of-early-classical-computers-9539d5bf82d8

# Beyond stage 3 — stage 4, stage 5, stage 6

The commencement of stage 3 is not the end of the evolution of quantum computing, but simply the beginning of widespread development of quantum computing applications by non-elite technical teams. There will still be plenty of room for further and even more dramatic progress.

Nobody can predict what stages of evolution of quantum computing might occur after stage 3, The FORTRAN Moment. Sure, I can envision and suggest a bunch, but there will likely be so much more beyond my own imagination.

It’s easy to list some areas of further development:

- **Hardware.** Better, faster, higher-capacity, higher-fidelity, more reliable, cheaper, smaller.
- **Programming models.**
- **Algorithmic building blocks.**
- **Algorithms.**
- **Support software.**
- **Merging of quantum and classical computing.** Evolution towards a universal quantum computer.
- **Networking and distributed quantum processing.**
- **Integration of quantum sensing, imaging, and computing.**
- **Quantum storage.**
- **Applications.**
- **Quantum personal computing.**
- **Quantum general artificial intelligence.**

Based on hardware alone, and somewhat analogous to how classical computing evolved, one could surmise at least three additional hardware evolutionary stages.

- **Stage 4.** Hundreds of logical qubits. Or 1,000 near-perfect qubits.
- **Stage 5.** Thousands of logical qubits. Or 50,000 near-perfect qubits.
- **Stage 6.** Millions of logical qubits. Or millions of near-perfect qubits.

And some stages of evolutionary development for key technical metrics:

**Qubit count:**

- **Hundreds of logical qubits.** Or thousands of near-perfect qubits.
- **Thousands of logical qubits.** Or hundreds of thousands of near-perfect qubits.
- **One million logical qubits.** Or tens of millions of near-perfect qubits.

**Qubit fidelity:**

- **9 nines.**
- **12 nines.**
- **15 nines.**
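
For a sense of scale (this is my framing, not a formal definition), k nines of qubit fidelity corresponds to an error probability of roughly 10⁻ᵏ per gate, which very roughly permits on the order of 10ᵏ gate operations before errors dominate:

```python
import math

def error_rate(nines: int) -> float:
    """k nines of fidelity is roughly an error probability of 10**-k per gate."""
    return 10.0 ** -nines

def rough_feasible_gates(nines: int) -> float:
    """Very roughly, on the order of 1/error gates before errors dominate."""
    return 1.0 / error_rate(nines)

assert math.isclose(error_rate(9), 1e-9)
assert math.isclose(rough_feasible_gates(15), 1e15)
```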

**Fine granularity of phase and probability amplitude:**

- **Billions of gradations.**
- **Trillions of gradations.**
- **Quadrillions of gradations.** This begs the question of what is actually theoretically possible — is there some Planck-level minimum unit of angle?

And one can envision stages beyond those additional three as well, but the main focus of this paper is to get us to a comparable stage as classical computing in the late 1950’s when the transistor and FORTRAN (and other languages) finally took over.

This paper is drawing a rough equivalence between *The Transistor Moment* of classical computing and The FORTRAN Moment of quantum computing.

FORTRAN was invented in 1954 (the year I was born!), but wasn’t a commercial success until 1957, just a year before the transistor took over on the hardware front. For all intents and purposes, The FORTRAN Moment for both classical computing and quantum computing are very similar, unleashing many applications and enabling and incentivizing many further innovations.

# And even beyond stage 6 — to stage 10 and beyond

As I said in the preceding section, one can envision stages beyond those additional three as well. In fact, in my paper on stages of commercialization, I envisioned up to *ten stages of commercialization*, without intending for even that to be the end of the evolution.

See the section *Subsequent commercialization stages — Beyond the initial ENIAC Moment* of my paper:

- *Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization* - https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a

For ease of reference, here is the full list of stages of commercialization from that paper, which isn’t exactly in sync with this paper, but close enough:

- **C1.0** — Reached The ENIAC Moment. All of the pieces are in place.
- **C1.5** — Reached multiple ENIAC Moments.
- **C2.0** — First configurable packaged quantum solution.
- **C2.5** — Reached multiple configurable packaged quantum solutions. And maybe or hopefully finally achieve full, dramatic quantum advantage somewhere along the way as well.
- **C3.0** — Quantum Error Correction (QEC) and logical qubits. Very small number of logical qubits.
- **C3.5** — Incremental improvements to QEC and increases in logical qubit capacity.
- **C4.0** — Reached The FORTRAN Moment. And maybe full, dramatic quantum advantage as well.
- **C4.5** — Widespread custom applications based on QEC, logical qubits, and the FORTRAN Moment programming model. Presumption that full, dramatic quantum advantage is the norm by this stage.
- **C5.0** — The BASIC Moment. Much easier to develop more modest applications. Anyone can develop a quantum application achieving dramatic quantum advantage.
- **C5.5** — Ubiquitous quantum computing a la personal computing.
- **C6.0** — More general AI, although not full AGI.
- **C7.0** — Quantum networking. Networked quantum state.
- **C8.0** — Integration of quantum sensing and quantum imaging with quantum computing. Real-time quantum image processing.
- **C9.0** — Incremental advances along the path to a mature technology.
- **C10.0** — Universal quantum computer. Merging full classical computing.

# Need for ongoing research

Plenty of research is needed even to get to stage 1.

And even more, much more, research is needed for all stages beyond stage 1.

In truth, research is an *ongoing* need, not just a one-time, one-shot, short-term task.

To be clear, a truly *massive amount of research* is needed to achieve widespread adoption of quantum computing.

For more on research needed for quantum computing, see my paper:

- *Essential and Urgent Research Areas for Quantum Computing* - https://jackkrupansky.medium.com/essential-and-urgent-research-areas-for-quantum-computing-302172b12176

# What is quantum expertise?

There are a variety of areas of education, training, knowledge, skill, expertise, and experience that can be involved with quantum computing, including:

- **Raw native intellect.** Natural mental capacity and abilities for working with concepts, abstractions, and details.
- **Formal education.** Physics, quantum chemistry, mathematics, computer science, computer engineering, electrical engineering, engineering in general.
- **Technical education.** All aspects of technology related to quantum computing.
- **Specialized training.** Specific technologies, specific products, specific methods, specific tools.
- **Quantum mindset.** A general awareness of and intuition about quantum effects.
- **Quantum experience.** Actual experience using quantum technologies and exposure to work which is based on quantum effects.
- **Quantum expertise.** The whole package of raw native intellect, education, training, and experience which an individual, team, or organization brings to the table for projects which have a significant quantum aspect.
- **Quantum-trained.** Individuals, teams, and organizations whose raw native intellect, education, training, knowledge, skill, expertise, and experience in quantum effects, quantum technologies, and quantum computing qualify them to work on projects with a significant quantum aspect.
- **Elite.** A much more select set of individuals, teams, and organizations whose raw native intellect, education, training, knowledge, skill, expertise, and experience in quantum effects, quantum technologies, and quantum computing are well beyond what is typical for individuals, teams, and organizations who work in the quantum field. Well above the average for the *quantum-trained*.
- **Super-elite.** An even more select set of individuals, teams, and organizations whose quantum knowledge and expertise are far beyond what is typical for even *elite* quantum professionals, teams, and organizations.
- **Quantum Aware.** A much broader set of individuals, teams, and organizations whose raw native intellect, education, training, knowledge, skill, expertise, and experience in quantum effects, quantum technologies, and quantum computing are far more limited than the *elite*, but who do possess a general awareness of the concepts of quantum computing. May or may not be *quantum-trained*.
- **Quantum Ready.** A more select set of individuals, teams, and organizations whose raw native intellect, education, training, knowledge, skill, expertise, and experience in quantum effects, quantum technologies, and quantum computing are much more limited than the *elite*, but much more sophisticated and useful than those who are merely *Quantum Aware*. More likely to be *quantum-trained*.

*Quantum expertise* puts it all together. It’s the capabilities that individuals, teams, and organizations bring to the table for projects which have a significant quantum aspect.

# What is the potential for quantum-inspired computing?

Personally, I think there is great potential for *quantum-inspired computing* — looking at quantum approaches to algorithms and applications and then attempting to *approximate* or *simulate* them using classical computing. In many situations, *approximations* are actually good enough, especially when the full quantum solutions are not yet *feasible*, especially in the earlier stages of adoption of quantum computing.

One example is using *Monte Carlo simulation* to approximate (actually, *sample*) a full quantum parallel computation. Take, say, a traveling salesman optimization problem as an example. Again, it’s an approximation, not a full and accurate solution, but in many situations it may be good enough. Or at least better than a traditional classical solution when a full quantum solution is not yet feasible or otherwise not yet available.
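
As a minimal sketch of that quantum-inspired approach (a toy example of my own, not from any particular paper), Monte Carlo sampling draws random tours and keeps the best one seen, approximating the exhaustive evaluation that a quantum computation would perform in parallel:

```python
import itertools
import math
import random

def tour_length(tour, points):
    # Total length of a closed tour visiting the points in the given order.
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def monte_carlo_tsp(points, samples=20_000, seed=0):
    """Sample random tours; keep the shortest seen (an approximation)."""
    rng = random.Random(seed)
    cities = list(range(len(points)))
    best = min((rng.sample(cities, len(cities)) for _ in range(samples)),
               key=lambda t: tour_length(t, points))
    return best, tour_length(best, points)

# Six cities on a 2x1 grid; the optimal tour is the perimeter, length 6.0.
points = [(0, 0), (0, 1), (1, 1), (2, 1), (2, 0), (1, 0)]
tour, length = monte_carlo_tsp(points)

# Sanity check: the sampled tour can never beat the brute-force optimum.
optimum = min(tour_length(t, points) for t in itertools.permutations(range(6)))
assert length >= optimum - 1e-9
```

For a tiny instance like this, sampling almost always finds the true optimum; the point is that the same sampling strategy still produces a usable approximation at sizes where exhaustive evaluation is hopeless.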

Whether to consider quantum-inspired computing as under the umbrella of quantum computing proper is an interesting semantic problem. I can see it both ways. But that does not take away from its potential. And, it doesn’t take away from the starting point, which is research into quantum approaches to computation.

Develop the quantum algorithms first (possibly using simulators), and then evaluate possible classical approximations that are inspired by the quantum approach.

The question here is how to fit it or blend it into this staged-model of adoption of quantum computing. But I’ll leave that as a separate exercise. For now, I’ll just say that it fits into all stages of the adoption of quantum computing, to varying degrees.

Maybe quantum-inspired solutions could be more common in the early stages when fewer quantum solutions are available, and then less common as quantum solutions mature.

Or, the opposite — that maturing quantum solutions in successive stages of adoption are even more inspiring and open doors that previously were unknown during earlier stages.

Or, anywhere between those two ends of a spectrum.

And maybe all of the above! The possibilities are literally endless.

# Model for stages of adoption for a new technology

The *adoption* of a new technology tends not to be a quick, instant, one-time action, but an intensive and rather drawn-out process. The overall thrust of this paper is not the detailed process of adopting a new technology, but to highlight and suggest that there will likely be three distinct technology adoption processes. So, a two-level hierarchy of stages — the three top-level stages, plus the many detailed adoption stages of this model within each of those three top-level stages.

For any new technology, here’s the sequence of *stages* that I’ve identified for *the adoption process*, so each of this paper’s three top-level stages has all (or some subset) of these detailed stages:

- **Ignorance.** No clue as to the existence of the new technology.
- **Glimmer.** The new technology catches your eye, somehow.
- **Awareness.** Starting to think about it.
- **Initial reaction.** Form an opinion — good, bad, neutral, indifferent, lukewarm, or whatever.
- **Resistance/Denial.** Feeling it’s an undesirable distraction and has no credible value. Could deny relevance, significance, or importance.
- **Acknowledgment.** Recognize that it does have some value.
- **Familiarization.** Informally coming up to speed on the new technology.
- **Knowledge.** Learn about it. In detail.
- **Evaluation.** Experimentation and prototyping to learn more about the technology in action, not just the theory on paper.
- **Acceptance.** Know enough to accept that it really is a viable proposition.
- **Conviction.** Strong feeling about its value.
- **Commitment.** You’re hooked.
- **Adoption decision.** Explicit decision to follow through on commitment: the decision to make it happen, coupled with everything that follows to make it actually happen.
- **Prioritize.** Even if committed, how high a priority will it be?
- **Advance planning.** General scoping of the issue. Overview of what needs to be done, what it will take to do it, and how to go about it.
- **Budgeting.** Add it to the budget — targeted at some chosen time in the future.
- **Funding.** The money is actually available to spend.
- **Planning.** The details of making it happen. From strategic planning to fine details. Vision. Mission. Values. Strategic objectives. Strategy. Tactics.
- **Staffing.** Recruiting and assigning members to the team.
- **Education and training.** Formally coming up to speed on the new technology. The underlying technology.
- **Design.** The technical details of utilization of the technology.
- **Implementation.** Realizing the technical details.
- **Development.** Including both design and implementation. And testing and validation. Including performance testing, performance characterization, and benchmarking.
- **Validation.** Everything required to test and confirm that the technology really does work as claimed.
- **Deployment.** Making it available to users and customers.
- **Customer and user education and training.** Formally supporting customers and users to come up to speed on using the deployed new technology.
- **Access.** Actually putting it in people’s hands.
- **Usage.** People are actually using it.
- **Customer support and engagement.** Assuring that real users can actually use it, and use it effectively.
- **Ecosystem development.** Nurturing all of the partners and allied technologies that support and enable the new technology.
- **Realization of potential.** And business value. People can finally actually see its value in action — they experience its value. The whole point of adoption in the first place.
- **Maintenance.** Sometimes things go wrong or break, or something changes.
- **Enhancement and renewal.** Always striving to do more and do it better.
- **Evolution.** Rinse and repeat on major changes. Each requires a new decision, a new commitment, and new follow-through.
- **Evangelism.** Raise awareness in others.
- **Obsolescence.** Loss of relevance. Or, no longer delivers value or sufficient value.
- **Retirement.** Eventually something replaces it, so it’s no longer needed. Or it’s no longer needed even if nothing replaces it.
- **Historical.** A record of what transpired and why. The results. An analysis and critique.

Note to self: Turn all of this section into a separate paper on the technology adoption process.

# Original proposed topic

For reference, here is the original proposal I had for this topic. It may have some value for some people wanting a more concise summary of this paper.

**Three stages of adoption for quantum computing: The ENIAC Moment, configurable packaged quantum solutions, and The FORTRAN Moment.** The initial stage of adoption — The ENIAC Moment — for quantum computing solutions relies on super-elite STEM professionals using a wide range of tricks and supreme cleverness to achieve solutions. For example, manual error mitigation. The second stage — configurable packaged quantum solutions — also relies on similar super-elite professionals to create frameworks for solutions which can then be configured by non-elite professionals to achieve solutions. Those non-elite professionals are able to prepare their domain-specific input data in a convenient form compatible with their non-elite capabilities, but don’t have to comprehend or even touch the underlying quantum algorithms or code. The final stage — The FORTRAN Moment — relies on a much more advanced and high-level programming model, application frameworks, and libraries, as well as logical qubits based on full, automatic, and transparent quantum error correction, to enable non-elite professionals to develop solutions from scratch without the direct involvement of or dependence on super-elite professionals.

# Summary and conclusions

- The essential goal is to gain as widespread adoption of quantum computing as possible in the shortest amount of time.
- There will be many stages in the adoption of quantum computing.
- The first three stages will be critical and establish the initial widespread adoption and usage.
- The first stage, The ENIAC Moment, will mark the first significant production-scale practical real-world quantum application.
- That initial success will be replicated and extended.
- But, all of the stage 1 work will require super-elite technical teams, placing such efforts beyond the reach of most organizations.
- Full quantum error correction (QEC) will not be needed for Stage 1. Near-perfect qubits coupled with some minimal degree of manual error mitigation should be sufficient.
- Stage 1 will have a lot of visibility, but only minimal actual application results. No widespread usage, but a necessary technical milestone.
- The second stage, the deployment of configurable packaged quantum solutions, will mark the first wave of widespread adoption.
- Super-elite technical teams will still be required to design and create configurable packaged quantum solutions.
- But normal, average, non-elite technical teams at even average organizations will be able to configure and deploy those configurable packaged quantum solutions without any of the intensive quantum expertise that was needed to design and create those solutions. All of the quantum expertise lies buried deep under the hood of the configurable packaged quantum solutions.
- As with stage 1, full quantum error correction (QEC) will not be needed for Stage 2. Near-perfect qubits coupled with some minimal degree of manual error mitigation should be sufficient.
- A relatively modest collection of configurable packaged quantum solutions will be able to meet many of the quantum needs of a wide range of organizations. Certainly not all of their needs, but enough that quantum computing is now not only very visible but achieving a fairly high volume of very practical applications.
- What remains unaddressed after that second stage is custom applications.
- The third stage, The FORTRAN Moment, finally enables non-elite technical teams to design, implement, and deploy full-custom quantum applications with literally no quantum expertise. As with configurable packaged quantum solutions, all of the hard-core quantum expertise lies buried deep under the hood of more advanced programming models, application frameworks, higher-level algorithmic building blocks, rich libraries, and even quantum-native programming languages, which enable non-elite professionals to develop solutions from scratch without the direct involvement or dependence on super-elite quantum professionals or even any quantum expertise.
- The third stage also ushers in the era of fault-tolerant quantum computing with perfect logical qubits which are enabled by full, automatic, and transparent quantum error correction.
- Even in stage 3, not all quantum applications will need quantum error correction. Some applications will run fine with only near-perfect qubits.
- Each stage builds on and extends the previous stage, so that by the third stage there will be a mix of very high-end applications designed and developed by super-elite technical teams, widespread deployment of configurable packaged quantum solutions, and a growing population of custom quantum applications based on fault-tolerant quantum computing and higher-level programming models.
- But it doesn’t end with these three stages. They are only the beginning, the start of widespread adoption, comparable to where classical computing was in the late 1950’s and early 1960’s — very impressive and widespread, but with an even brighter future in the years and decades ahead.
- Stage 3 will be not unlike the confluence of the FORTRAN programming language and the transistor, which really kicked classical computing into high gear in 1958.
- Each of these stages requires an increasing level of hardware capability. Some of that improved hardware gets used for raw performance and capacity, but a fair amount of it gets used to make it easier for non-elite technical teams to design and implement quantum solutions.
- Or, put another way, a *decreasing* level of quantum expertise is needed for successive stages.
- How quantum-inspired computing might fit into this staged model of adoption is an open question, but it does appear to have significant potential. I discuss some possibilities, but leave it as a separate exercise.
- When might all of this happen? Stage 1 (The ENIAC Moment) in two to four years, nominally in three years. Stage 2 (Configurable packaged quantum solutions) in another one to three years, three to seven years from now, nominally five years from now. Stage 3 (The FORTRAN Moment) in another two to four years, five to eleven years from now, nominally eight years from now. Or six to nine years from now on a looser basis. Those are all just wild but educated guesses.

For more of my writing: **List of My Papers on Quantum Computing**.