Quantum Computing Advances We Need to See Over the Coming 12 to 18 to 24 Months to Stay on Track

  1. In a nutshell
  2. On track for what?
  3. Preparing for The ENIAC Moment
  4. Will we have achieved The ENIAC Moment in two years?
  5. How close might we be to achieving The ENIAC Moment in two years?
  6. I suspect that we’ll see The ENIAC Moment in roughly three years
  7. To be clear, The ENIAC Moment is neither assured nor required nor assumed in two years
  8. The ENIAC Moment is a demonstration, not necessarily ready for production deployment
  9. Hardware advances needed
  10. Yes, we need more qubits, but…
  11. How many qubits do we need? 160 to 256 for some, 64 to 80 for most, 48 as a minimum
  12. Trapped-ion quantum computers need more qubits
  13. Neutral-atom quantum computers need to be delivered
  14. Algorithms need to be automatically scalable
  15. Rich set of sample quantum algorithms
  16. The seven main quantum application categories
  17. Rich set of sample quantum applications
  18. Take the guesswork out of modeling shot count (circuit repetitions)
  19. Higher standards for documenting algorithms
  20. Near-perfect qubits are required
  21. Perfect logical qubits and quantum error correction (QEC) are not needed in this timeframe
  22. Architectural improvements needed to enhance transmon qubit connectivity
  23. Transmon qubit proponents need to announce a connectivity strategy and roadmap
  24. RIP: Ode to SWAP networks
  25. Unclear how much improvement in coherence time is needed in this time period
  26. Quantum advantage requires dramatic improvement
  27. Still in pre-commercialization
  28. Not yet time for commercialization
  29. Even in two years we may still be deep in pre-commercialization
  30. Avoid premature commercialization
  31. Ongoing research
  32. Need a replacement for Quantum Volume for higher qubit counts
  33. Need application category-specific benchmarks
  34. We need some real application using 100 or even 80 qubits
  35. Some potential for algorithms up to 160 qubits
  36. We’re still in stage 0 of the path to commercialization and widespread adoption of quantum computing
  37. A critical mass of these advances is needed
  38. My Christmas and New Year wish list
  39. Is a Quantum Winter likely in two years? No, but…
  40. Critical technical gating factors which could presage a Quantum Winter in two years
  41. A roadmap for the next two years?
  42. Where might we be in one year?
  43. RIP: Ode to NISQ
  44. On track to… end of the NISQ era and the advent of the Post-NISQ era
  45. RIP: Noisy qubits
  46. Details for other advances
  47. And what advances are needed beyond two years?
  48. My original proposal for this topic
  49. Summary and conclusions

In a nutshell

  1. Current technology is too limited. Current quantum computers are simply not up to the task of supporting any production-scale practical real-world applications.
  2. The ENIAC Moment is not in sight. No hardware, algorithm, or application is capable of achieving The ENIAC Moment for quantum computing at this time. It’s a coin flip whether it will be achievable within two years.
  3. Still deep in the pre-commercialization stage. Overall, we’re still deep in the pre-commercialization stage of quantum computing. Nowhere close to being ready for commercialization. Much research is needed.
  4. We’re still in stage 0 of the path to commercialization of quantum computing. Achievement of the technical goals outlined in this paper will leave us poised to begin stage 1, the final path to The ENIAC Moment. The work outlined here lays the foundation for stage 1 to follow.
  5. A critical mass of these advances is needed. But exactly what that critical mass is and exactly what path will be needed to get there will be a discovery process rather than some detailed plan known in advance.
  6. Which advance is the most important, critical, and urgent — the top priority? There are so many advances vying to be part of the critical mass.
  7. Achievement of these technical goals will leave the field poised to achieve The ENIAC Moment.
  8. Achievement of these technical goals will mark the end of the NISQ era.
  9. Achievement of these technical goals will mark the advent of the Post-NISQ era.
  10. No Quantum Winter. I’m not predicting a Quantum Winter in two years and think it’s unlikely, but it’s not out of the question if a substantial fraction (critical mass) of the advances from this paper are not achieved.
  11. Beyond two years? A combination of fresh research, extrapolation of the next two years, and topics I’ve written about in my previous papers. But this paper focuses on the next two years.
  1. Hardware. And firmware.
  2. Support software. And tools.
  3. Algorithms. And algorithm support.
  4. Applications. And application support.
  5. Research in general. In every area.
  1. Qubit fidelity.
  2. Qubit connectivity.
  3. Qubit coherence time and circuit depth.
  4. Qubit phase granularity.
  5. Qubit measurement.
  6. Quantum advantage.
  7. Near-perfect qubits are required.
  8. Trapped-ion quantum computers need more qubits.
  9. Neutral-atom quantum computers need to support 32 to 48 qubits.
  10. Support quantum Fourier transform (QFT) and quantum phase estimation (QPE) on 16 to 20 qubits. And more — 32 to 48, but 16 to 20 minimum. Most of the hardware advances are needed to support QFT and QPE.
  1. A replacement for the Quantum Volume metric is needed. For circuits using more than 32–40 qubits and certainly for 50 qubits and beyond.
  2. Need application category-specific benchmarks. Generic benchmarks aren’t very helpful to real users.
  3. 44-qubit classical quantum simulators for reasonably deep quantum circuits.
  1. Quantum algorithms using 32 to 40 qubits are needed. Need to be very common, typical.
  2. Some real algorithm using 100 or even 80 qubits. Something that really pushes the limits.
  3. Some potential for algorithms up to 160 qubits.
  4. Reasonably nontrivial use of quantum Fourier transform (QFT) and quantum phase estimation (QPE).
  5. Algorithms need to be automatically scalable. Generative coding and automated analysis to detect scaling problems.
  6. Higher-level algorithmic building blocks. At least a decent start, although much research is needed.
  7. Rich set of sample quantum algorithms. A better starting point for new users.
  8. Modeling circuit repetitions. Take the guesswork out of shot count (circuit repetitions).
  9. Higher standards for documenting algorithms. Especially discussion of scaling and quantum advantage.
  1. Some real applications using 100 or even 80 qubits. Something that really pushes the limits.
  2. At least a few applications which are candidates or at least decent stepping stones towards The ENIAC Moment.
  3. Applications using quantum Fourier transform (QFT) and quantum phase estimation (QPE) on 16 to 20 qubits. And more — 32 to 48, but 16 to 20 minimum.
  4. At least a few stabs at application frameworks.
  5. Rich set of sample quantum applications. A better starting point for new users.
  1. Higher-level programming models.
  2. Much higher-level and richer algorithmic building blocks.
  3. Rich libraries.
  4. Application frameworks.
  5. Foundations for configurable packaged quantum solutions.
  6. Quantum-native programming languages.
  7. More innovative hardware architectures.
  8. Quantum error correction (QEC).

On track for what?

  1. Experiments.
  2. Demonstrations.
  3. Prototypes.
  4. Proofs of concept.
  5. Toy algorithms.
  6. Toy applications.

Preparing for The ENIAC Moment

Will we have achieved The ENIAC Moment in two years?

How close might we be to achieving The ENIAC Moment in two years?

  1. Indeed we’ve achieved it.
  2. A mere matter of a few months away.
  3. Another six months.
  4. Another year.
  5. Another two years.
  6. Another three years?

I suspect that we’ll see The ENIAC Moment in roughly three years

To be clear, The ENIAC Moment is neither assured nor required nor assumed in two years

The ENIAC Moment is a demonstration, not necessarily ready for production deployment

Hardware advances needed

  1. Qubit fidelity. Requires dramatic improvement. Gate execution errors too high. See near-perfect qubits.
  2. Qubit connectivity. Requires dramatic improvement. Well beyond nearest-neighbor. May require significant architectural improvements, at least for transmon qubits.
  3. Qubit coherence time and circuit depth. Requires dramatic improvement.
  4. Qubit phase granularity. Requires dramatic improvement. Need much finer granularity of phase (and probability amplitude) in order to support large quantum Fourier transforms (QFT) and quantum phase estimation (QPE). (See the sketch after this list.)
  5. Qubit measurement. Requires dramatic improvement. Error rate is too high.
  6. Quantum advantage. Requires dramatic improvement. Need to hit minimal quantum advantage first — 1,000X a classical solution, and then evolve towards significant quantum advantage — 1,000,000X a classical solution. True, dramatic quantum advantage of one quadrillion X is still off over the horizon.
  7. Near-perfect qubits are required. We need to move beyond noisy NISQ qubits ASAP. 3.5 to four nines of qubit fidelity are needed, maybe three nines for some applications.
  8. But perfect logical qubits enabled by full quantum error correction (QEC) are not needed in this timeframe. They should remain a focus for research, theory, prototyping, and experimentation, but not serious use for realistic applications in this timeframe.
  9. More qubits are not the most urgent need. We already have 65- and 127-qubit machines, with a couple of 100-qubit machines due imminently (neutral atoms), but we can’t use all of those qubits effectively for the reasons listed above (qubit fidelity, connectivity, phase granularity, etc.).
  10. 160 to 256 qubits would be useful for some applications. Not needed for most applications, but some could exploit them. Or at least 110 to 125 qubits for more apps.
  11. 64 to 80 qubits would be a sweet spot for many applications. Plenty of opportunity for significant or even dramatic quantum advantage.
  12. 48 qubits as the standard minimum for all applications. Achieving any significant quantum advantage on fewer qubits is not practical.
  13. Trapped-ion quantum computers need more qubits. 32 to 48 qubits in two years, if not 64 to 72 or even 80.
  14. Neutral-atom quantum computers need to support 32 to 48 qubits. 100 qubits have been promised, but something needs to be delivered.
  15. Support quantum Fourier transform (QFT) and quantum phase estimation (QPE) on 16 to 20 qubits. And more — 32 to 48, but 16 to 20 minimum. Most of the hardware advances are needed to support QFT and QPE.
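
To make the phase granularity requirement (item 4) and its tie to QFT and QPE (item 15) concrete, here is a rough sketch of my own, not a figure from any vendor: the finest controlled-phase rotation in an n-qubit QFT is 2π/2^n radians, so supporting QFT or QPE on n qubits means the hardware must resolve roughly 2^n distinct phase gradations.

```python
# Illustrative arithmetic only (my sketch, not from the article): the finest
# controlled-phase rotation in an n-qubit QFT is 2*pi / 2**n radians, so the
# hardware must resolve roughly 2**n distinct phase gradations.
import math

def qft_phase_requirements(n_qubits: int):
    gradations = 2 ** n_qubits                 # distinct phase steps needed
    smallest_angle = 2 * math.pi / gradations  # finest rotation angle (radians)
    return gradations, smallest_angle

for n in (16, 20, 30):
    gradations, angle = qft_phase_requirements(n)
    print(f"{n}-qubit QFT: ~{gradations:,} gradations, finest angle ~{angle:.1e} rad")
```

At 20 to 30 qubits that works out to roughly a million to a billion gradations, which is where the granularity figures later in this paper come from.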

Yes, we need more qubits, but…

  1. We don’t have algorithms ready to use those qubits. Or even the qubits that we already have. No 27-qubit, 65-qubit, or 127-qubit algorithms.
  2. Mediocre qubit fidelity prevents full exploitation of more qubits. Or even the qubits that we already have.
  3. Mediocre qubit connectivity prevents exploitation of more qubits. Or even the qubits that we already have.
  4. Limited coherence time and limited circuit depth prevent exploitation of more qubits. Or even the qubits that we already have.
  1. We definitely need more qubits.
  2. But not as the top priority. Or even the second or third priority.
  3. Quantum error correction (QEC) will indeed require many more qubits, but that’s a longer-term priority. Three to five years, not one to two years.

How many qubits do we need? 160 to 256 for some, 64 to 80 for most, 48 as a minimum

Trapped-ion quantum computers need more qubits

Neutral-atom quantum computers need to be delivered

Algorithms need to be automatically scalable
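
As a minimal illustration of what automatic scalability via generative coding could look like (my own sketch in Python; the gate names are placeholder notation, not any particular framework's API): the algorithm is written as a generator parameterized by the problem size, so the same code produces a 4-qubit toy circuit or a 40-qubit circuit, and automated analysis can then check how the resource requirements grow.

```python
import math

def generate_qft(n: int):
    """Yield (gate, qubits, params) tuples for an n-qubit QFT, for any n."""
    for target in range(n):
        yield ("h", (target,), ())
        for k, control in enumerate(range(target + 1, n), start=2):
            # controlled phase rotation by 2*pi / 2**k
            yield ("cphase", (control, target), (2 * math.pi / 2 ** k,))
    for i in range(n // 2):  # final qubit-order reversal
        yield ("swap", (i, n - 1 - i), ())

# The same code scales from a toy to a full-width run; automated analysis can
# then flag the O(n^2) gate-count growth before it hits real hardware.
print(sum(1 for _ in generate_qft(4)), "gates at n = 4")    # 12
print(sum(1 for _ in generate_qft(40)), "gates at n = 40")  # 840
```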

Rich set of sample quantum algorithms

The seven main quantum application categories

  1. Simulating physics.
  2. Simulating chemistry.
  3. Material design.
  4. Drug design.
  5. Business process optimization.
  6. Finance. Including portfolio optimization.
  7. Machine learning.

Rich set of sample quantum applications

Take the guesswork out of modeling shot count (circuit repetitions)

Higher standards for documenting algorithms

  1. Discussion of scaling.
  2. Discussion of calculating shot count (circuit repetitions). A more analytic approach rather than guesswork and trial and error. Including estimation beyond the input sizes tested in the paper itself. (See the sketch after this list for one simple analytic model.)
  3. Discussion of quantum advantage.
  4. Discussion of specific hardware requirements to achieve acceptable results for problems which cannot be adequately solved today.
  5. Place all code, test data, and results in a GitHub repository. In a standardized form and organization.
  6. Commenting conventions for code. Both quantum and classical.
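
For item 2, one simple analytic model (my own illustration; the paper does not prescribe a specific formula) is the standard binomial sampling bound: the standard error of an estimated probability shrinks with the square root of the shot count, so a target precision implies a shot count directly.

```python
import math

# Worst-case binomial sampling: the standard error of an estimated probability
# p is sqrt(p * (1 - p) / shots), so a target standard error eps implies
# shots ~= p * (1 - p) / eps**2 (maximized at p = 0.5).
def shots_for_precision(eps: float, p: float = 0.5) -> int:
    return math.ceil(p * (1 - p) / eps ** 2)

print(shots_for_precision(0.01))   # 2,500 shots for about +/-1% standard error
print(shots_for_precision(0.001))  # 250,000 shots for about +/-0.1%
```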

Near-perfect qubits are required

Perfect logical qubits and quantum error correction (QEC) are not needed in this timeframe

Architectural improvements needed to enhance transmon qubit connectivity

Transmon qubit proponents need to announce a connectivity strategy and roadmap

RIP: Ode to SWAP networks
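
To make concrete why SWAP networks get a eulogy here, a back-of-the-envelope sketch with assumed figures (the 99% per-CNOT fidelity is a placeholder, not a measurement): on nearest-neighbor hardware, each SWAP costs three CNOTs, so routing a two-qubit gate between distant qubits burns fidelity quickly.

```python
# Rough illustration (my arithmetic, not the author's): a two-qubit gate
# between qubits d positions apart on a line costs roughly 3 * (d - 1) extra
# CNOTs just to move the operands next to each other.
def swap_overhead_cnots(distance: int) -> int:
    return 3 * max(distance - 1, 0)

def routed_gate_fidelity(distance: int, cnot_fidelity: float = 0.99) -> float:
    # assumed 99% per-CNOT fidelity, a placeholder rather than a measured figure
    return cnot_fidelity ** (swap_overhead_cnots(distance) + 1)

for d in (1, 5, 10, 20):
    print(f"distance {d}: {swap_overhead_cnots(d)} extra CNOTs, "
          f"fidelity ~{routed_gate_fidelity(d):.0%}")
# At distance 20 the routed gate is already down around 56% fidelity, which is
# the core argument against relying on SWAP networks for connectivity.
```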

Unclear how much improvement in coherence time is needed in this time period

  1. A few dozen gates. A slam-dunk minimum.
  2. 50 gates.
  3. 75 gates.
  4. 100 gates.
  5. 175 gates.
  6. 250 gates.
  7. 500 gates.
  8. 750 gates.
  9. 1,000 gates.
  10. 1,500 gates.
  11. 2,500 gates.
  12. 5,000 gates.
  13. 10,000 gates.
  14. More?
  1. Total circuit size vs. maximum circuit depth. Does it matter or not?
  2. How might algorithms evolve over the next two years? The long slow march to production-scale. Advent of use of quantum Fourier transform (QFT) and quantum phase estimation (QPE) for nontrivial input sizes.
  3. Requirements for larger quantum Fourier transforms (QFT). For transform sizes in the range of 8 to 32 or even 40 bits. What will their key limiting factor(s) be? Will qubit fidelity and connectivity be bigger issues than coherence time and maximum circuit depth?
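
To relate the gate counts above to coherence time, here is a rough bit of arithmetic using placeholder figures (illustrative orders of magnitude, not vendor specifications): usable circuit depth is roughly the coherence window divided by the time per gate layer.

```python
# Illustrative only, with placeholder figures (not vendor specifications):
# if each gate layer takes roughly gate_time_ns, the coherence window caps
# the usable circuit depth at about coherence_time_us * 1000 / gate_time_ns.
def max_depth_layers(coherence_time_us: float, gate_time_ns: float) -> int:
    return int(coherence_time_us * 1_000 / gate_time_ns)

# Example: ~100 us coherence and ~300 ns two-qubit gates (rough orders of
# magnitude for current transmons; treat both numbers as assumptions).
print(max_depth_layers(100, 300))   # ~333 layers before decoherence dominates
print(max_depth_layers(300, 100))   # ~3,000 layers with 3x better coherence
                                    # and 3x faster gates
```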

Quantum advantage requires dramatic improvement
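
One rough way to picture the advantage tiers mentioned earlier (1,000X minimal, 1,000,000X significant, one quadrillion X dramatic), purely as my own illustration: if an algorithm can genuinely exploit quantum parallelism across n qubits, the raw advantage scales roughly as 2^n, so the three tiers correspond to roughly 10, 20, and 50 qubits of effective quantum parallelism.

```python
import math

# Purely illustrative: if an algorithm genuinely exploits quantum parallelism
# over n qubits, the raw advantage over brute-force classical evaluation is
# roughly 2**n (ignoring overheads, shot counts, and classical heuristics).
tiers = {
    "minimal quantum advantage (1,000X)": 1_000,
    "significant quantum advantage (1,000,000X)": 1_000_000,
    "dramatic quantum advantage (one quadrillion X)": 1_000_000_000_000_000,
}

for name, factor in tiers.items():
    qubits = math.ceil(math.log2(factor))
    print(f"{name}: ~{qubits} qubits of effective quantum parallelism")
# Roughly 10, 20, and 50 qubits respectively, which is one reason QFT and QPE
# on even 16 to 20 qubits would be such a significant milestone.
```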

Still in pre-commercialization

  1. Many technical questions remain unanswered.
  2. Much research is needed.
  3. Experimentation is the norm.
  4. Prototyping is common.
  5. Premature commercialization is a very real risk. The technology just isn’t ready. Avoid commercialization until the technology is ready. But premature attempts to commercialize quantum computing will be common.

Not yet time for commercialization

Even in two years we may still be deep in pre-commercialization

Avoid premature commercialization

  1. The technology just isn’t ready.
  2. Avoid commercialization until the technology is ready.
  3. But premature attempts to commercialize quantum computing will be common.
  4. Premature commercialization can lead to disenchantment over broken or unfulfilled promises.
  5. Disenchantment due to premature commercialization is the surest path to a Quantum Winter.

Ongoing research

  1. Higher-level programming models.
  2. Much higher-level and richer algorithmic building blocks.
  3. Rich libraries.
  4. Application frameworks.
  5. Foundations for configurable packaged quantum solutions.
  6. Quantum-native programming languages.
  7. Identify and develop more advanced qubit technologies. Which are more inherently isolated. More readily and reliably connectable. Finer granularity of phase and probability amplitude.
  8. More innovative hardware architectures. Such as modular and bus architectures for qubit connectivity.
  9. Quantum error correction (QEC). To enable perfect logical qubits.
  10. Application category-specific benchmarks.
  11. Some real application using 100 or even 80 qubits. Something that really pushes the limits. We do need these within two years, but it’s a stretch goal, so it may take another year or two after that.
  12. Modeling circuit repetitions. Take the guesswork out of shot count (circuit repetitions).
  13. And more.

Need a replacement for Quantum Volume for higher qubit counts

Need application category-specific benchmarks

We need some real application using 100 or even 80 qubits

Some potential for algorithms up to 160 qubits

  1. 16 qubits.
  2. 24 qubits.
  3. 32 qubits.
  4. 40 qubits.
  5. 56 qubits.
  6. 64 qubits.
  7. 72 qubits.
  8. 80 qubits.
  9. 96 qubits.
  10. 112 qubits.
  11. 128 qubits.
  12. 136 qubits.
  13. 144 qubits.
  14. 156 qubits.
  15. 160 qubits.

We’re still in stage 0 of the path to commercialization and widespread adoption of quantum computing

  1. A few hand-crafted applications (The ENIAC Moment). Limited to super-elite technical teams.
  2. A few configurable packaged quantum solutions. Focus super-elite technical teams on generalized, flexible, configurable applications which can then be configured and deployed by non-elite technical teams. Each such solution can be acquired and deployed by a fairly wide audience of users and organizations without any quantum expertise required.
  3. Higher-level programming model (The FORTRAN Moment). Which can be used by more normal, average, non-elite technical teams to develop custom quantum applications. Also predicated on perfect logical qubits based on full, automatic, and transparent quantum error correction (QEC).

A critical mass of these advances is needed

Which advance is the most important, critical, and urgent — the top priority?

  1. Higher qubit fidelity is needed.
  2. Greater qubit connectivity is needed.
  3. Greater qubit coherence time and circuit depth are needed.
  4. Finer granularity of qubit phase is needed.
  5. Higher fidelity qubit measurement is needed.
  6. Quantum advantage needs to be reached.
  7. Near-perfect qubits are required.
  8. Algorithms need to be automatically scalable. Generative coding and automated analysis to detect scaling problems.
  9. Support quantum Fourier transform (QFT) and quantum phase estimation (QPE) on 16 to 20 qubits. And more — 32 to 48, but 16 to 20 minimum. Most of the hardware advances are needed to support QFT and QPE.
  1. Hardware capable of supporting quantum Fourier transform (QFT) and quantum phase estimation (QPE). Hopefully for at least 20 to 30 qubits, but even 16 or 12 or 10 or even 8 qubits would be better than nothing. Support for 40 to 50 if not 64 qubits would be my more ideal wish, but just feels way too impractical at this stage. QFT and QPE are the only viable path to quantum parallelism and ultimately to dramatic quantum advantage.
  1. Qubit fidelity. Low error rate.
  2. Qubit connectivity. Much better than only nearest neighbor. Sorry, SWAP networks don’t cut it.
  3. Gate fidelity. Low error rate. Especially 2-qubit gates.
  4. Measurement fidelity. Low error rate.
  5. Coherence time and circuit depth. Can’t do much with current hardware.
  6. Fine granularity of phase. At least a million to a billion gradations — to support 20 to 30 qubits in a single QFT or QPE.
  1. Fine granularity of phase. At least a million to a billion gradations — to support 20 to 30 qubits in a single QFT or QPE.
  1. Qubit fidelity of 3.5 nines. Approaching near-perfect qubits. I really want to see four nines (99.99%) within two years, but even IBM is saying that they won’t be able to deliver that until 2024. Trapped-ion machines may do better.
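
To put those nines in perspective, here is my own back-of-the-envelope arithmetic (not vendor data): if gate errors compound independently, per-gate fidelity raised to the power of the gate count approximates the fraction of runs that finish error-free.

```python
# My own arithmetic, not vendor data: if gate errors compound independently,
# the fraction of runs that survive a g-gate circuit error-free is roughly
# (per-gate fidelity) ** g.
def circuit_success(gate_fidelity: float, gates: int) -> float:
    return gate_fidelity ** gates

for fidelity in (0.99, 0.999, 0.9995, 0.9999):   # 2, 3, 3.5, and 4 nines
    print(f"{fidelity:.2%} per gate, 1,000 gates: "
          f"~{circuit_success(fidelity, 1000):.0%} error-free")
# Roughly 0%, 37%, 61%, and 90% respectively: the jump from 3 to 3.5 to 4
# nines is what makes circuits of nontrivial depth viable at all.
```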

My Christmas and New Year wish list

  1. Qubit fidelity of 3.5 nines. Approaching near-perfect qubits. I really want to see four nines (99.99%) within two years, but even IBM is saying that they won’t be able to deliver that until 2024. Trapped-ion machines may do better.
  2. Hardware capable of supporting quantum Fourier transform (QFT) and quantum phase estimation (QPE). Hopefully for at least 20 to 30 qubits, but even 16 or 12 or 10 or even 8 qubits would be better than nothing. Support for 40 to 50 if not 64 qubits would be my more ideal wish, but just feels way too impractical at this stage. QFT and QPE are the only viable path to quantum parallelism and ultimately to dramatic quantum advantage.
  3. Fine granularity of phase. Any improvement or even an attempt to characterize phase granularity at all. My real wish is for at least a million to a billion gradations — to support 20 to 30 qubits in a single QFT or QPE, but I’m not holding my breath for this in 2022.
  4. At least a handful of automatically scalable 40-qubit algorithms. And plenty of 32-qubit algorithms. Focused on simulation rather than real hardware since qubit quality is too low. Hopefully using quantum Fourier transform (QFT) and quantum phase estimation (QPE).
  5. Simulation of 44 qubits. I really want to see 48-qubit simulation — and plans for research for 50 and 54-qubit simulation. Including support for deeper circuits, not just raw qubit count. And configurability to closely match real and expected quantum hardware over the next five to seven years. (See the memory arithmetic sketch after this list.)
  6. At least a few programming model improvements. Or at least some research projects initiated.
  7. At least a handful of robust algorithmic building blocks. Which are applicable to the majority of quantum algorithms.
  8. At least one new qubit technology. At least one a year until we find one that really does the job.
  9. At least some notable progress on research for quantum error correction (QEC) and logical qubits. Possibly a roadmap for logical qubit capacity. When can we see even a single logical qubit and then two logical qubits, and then five logical qubits and then eight logical qubits?
  10. A strong uptick in research spending. More new programs and projects as well as more spending for existing programs and projects.
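
For the 44-qubit simulation item, a quick reminder of why it is a stretch goal (my own arithmetic, assuming a straightforward full state-vector simulator with 16 bytes per double-precision complex amplitude):

```python
# Back-of-the-envelope memory estimate for full state-vector simulation,
# assuming one double-precision complex amplitude (16 bytes) per basis state.
def statevector_tib(n_qubits: int) -> float:
    return (2 ** n_qubits) * 16 / 1024 ** 4

for n in (40, 44, 48):
    print(f"{n} qubits: ~{statevector_tib(n):,.0f} TiB of amplitudes")
# ~16 TiB at 40 qubits, ~256 TiB at 44, and ~4,096 TiB at 48, which is why
# pushing much past 44 qubits for deep circuits is such a stretch goal.
```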

Is a Quantum Winter likely in two years? No, but…

  1. Critical technical metrics not achieved. Such as low qubit fidelity and connectivity, or still no 32 to 40-qubit algorithms.
  2. Quantum Winter alert. Early warning signs. Beyond the technical metrics. Such as projects seeing their budgets and hiring frozen.
  3. Quantum Winter warning. Serious warning signs. Some high-profile projects failing.
  4. Quantum Winter. Difficulty getting funding for many projects. Funding and staffing cuts. Slow pace of announced breakthroughs.
  5. Severe Quantum Winter. Getting even worse. Onset of psychological depression. People leaving projects. People switching careers.

Critical technical gating factors which could presage a Quantum Winter in two years

  1. Lack of near-perfect qubits.
  2. Still only nearest-neighbor connectivity for transmon qubits. Transmon qubits haven’t adapted architecturally.
  3. Lack of reasonably fine granularity of phase. Needed for quantum Fourier transform (QFT) and quantum phase estimation (QPE) to enable quantum advantage, such as for quantum computational chemistry in particular.
  4. Less than 32 qubits on trapped-ion machines.
  5. Dearth of 32 to 40-qubit algorithms.
  6. Failure to achieve even minimal quantum advantage.
  7. Rampant and restrictive IP (intellectual property) impeding progress. Intellectual property (especially patents) can either help or hinder adoption of quantum computing.

A roadmap for the next two years?

Where might we be in one year?

RIP: Ode to NISQ

  1. NPTSQ. Near-perfect tiny/toy-scale quantum devices. 1 to 23 near-perfect qubits.
  2. NPMSQ. Near-perfect medium-scale quantum devices. 24 to 49 near-perfect qubits.

On track to… end of the NISQ era and the advent of the Post-NISQ era

RIP: Noisy qubits

Details for other advances

And what advances are needed beyond two years?

My original proposal for this topic

  • Quantum computing advances we need to see over the coming 12 to 18 to 24 months to stay on track. Qubit fidelity — 3 to 4 nines — 2.5, 2.75, 3, 3.25, 3.5, 3.75 — moving towards near-perfect qubits. Qubit connectivity — Trapped ions or Bus architecture for transmon qubits. Fine phase granularity. Reasonably nontrivial use of quantum Fourier transform (QFT) and quantum phase estimation (QPE). To be viable and relevant, trapped ion machines need to grow to at least 32 to 48 qubits, if not 64 to 72 or even 80. Need to see lots of 32 to 40-qubit algorithms. Need alternative to Quantum Volume capacity and quality metric to support hardware and algorithms with more than 50 qubits. Need to see 44-qubit classical quantum simulators for reasonably deep quantum circuits. Is that about it? 64 to 128 qubits should be enough for this time period.

Summary and conclusions

  1. Current quantum computing technology is too limited. Current quantum computers are simply not up to the task of supporting any production-scale practical real-world applications.
  2. The ENIAC Moment is not in sight. No hardware, algorithm, or application is capable of achieving The ENIAC Moment for quantum computing at this time. It’s a coin flip whether it will be achievable within two years.
  3. We’re still deep in the pre-commercialization stage. Overall, we’re still deep in the pre-commercialization stage of quantum computing. Nowhere close to being ready for commercialization. Much research is needed.
  4. Advances are needed in all of the major areas. Hardware (and firmware), support software (and tools), algorithms (and algorithm support), applications (and application support), and research in general (in every area).
  5. Higher qubit fidelity is needed.
  6. Greater qubit connectivity is needed.
  7. Greater qubit coherence time and circuit depth are needed.
  8. Finer granularity of qubit phase is needed.
  9. Higher fidelity qubit measurement is needed.
  10. Quantum advantage needs to be reached.
  11. Near-perfect qubits are required.
  12. Trapped-ion quantum computers need more qubits.
  13. Neutral-atom quantum computers need to support 32 to 48 qubits.
  14. Support quantum Fourier transform (QFT) and quantum phase estimation (QPE) on 16 to 20 qubits. And more — 32 to 48, but 16 to 20 minimum. Most of the hardware advances are needed to support QFT and QPE.
  15. A replacement for the Quantum Volume metric is needed. For circuits using more than 32–40 qubits and certainly for 50 qubits and beyond.
  16. Need application category-specific benchmarks. Generic benchmarks aren’t very helpful to real users.
  17. Need 44-qubit classical quantum simulators for reasonably deep quantum circuits.
  18. Quantum algorithms using 32 to 40 qubits are needed. Need to be very common, typical.
  19. Need to see some real algorithm using 100 or even 80 qubits. Something that really pushes the limits.
  20. Need to see some potential for algorithms up to 160 qubits.
  21. Need to see reasonably nontrivial use of quantum Fourier transform (QFT) and quantum phase estimation (QPE).
  22. Algorithms need to be automatically scalable. Generative coding and automated analysis to detect scaling problems.
  23. Higher-level algorithmic building blocks are needed. At least a decent start, although much research is needed.
  24. Need a rich set of sample quantum algorithms. A better starting point for new users.
  25. Need support for modeling circuit repetitions. Take the guesswork out of shot count (circuit repetitions).
  26. Need higher standards for documenting algorithms. Especially discussion of scaling and quantum advantage.
  27. Need to see some real applications using 100 or even 80 qubits. Something that really pushes the limits.
  28. Need to see at least a few applications which are candidates or at least decent stepping stones towards The ENIAC Moment.
  29. Need to see applications using quantum Fourier transform (QFT) and quantum phase estimation (QPE) on 16 to 20 qubits. And more — 32 to 48, but 16 to 20 minimum.
  30. Need to see at least a few stabs at application frameworks.
  31. Need a rich set of sample quantum applications. A better starting point for new users.
  32. Research is needed in general, in all areas. Including higher-level programming models, much higher-level and richer algorithmic building blocks, rich libraries, application frameworks, foundations for configurable packaged quantum solutions, quantum-native programming languages, more innovative hardware architectures, and quantum error correction (QEC).
  33. We’re still in stage 0 of the path to commercialization of quantum computing. Achievement of the technical goals outlined in this paper will leave us poised to begin stage 1, the final path to The ENIAC Moment. The work outlined here lays the foundation for stage 1 to follow.
  34. A critical mass of these advances is needed. But exactly what that critical mass is and exactly what path will be needed to get there will be a discovery process rather than some detailed plan known in advance.
  35. Which advance is the most important, critical, and urgent — the top priority? There are so many advances vying to be part of the critical mass.
  36. Achievement of these technical goals will leave the field poised to achieve The ENIAC Moment.
  37. Achievement of these technical goals will mark the end of the NISQ era.
  38. Achievement of these technical goals will mark the advent of the Post-NISQ era.
  39. Quantum Winter is unlikely. I’m not predicting a Quantum Winter in two years and think it’s unlikely, but it’s not out of the question if a substantial fraction (critical mass) of the advances from this paper are not achieved.
  40. Beyond two years? A combination of fresh research, extrapolation of the next two years, and topics I’ve written about in my previous papers. But this paper focuses on the next two years.
