Thoughts on the IBM Quantum Hardware Roadmap

  1. When will quantum computers support production-scale applications?
  2. When will quantum computers achieve quantum advantage (or quantum supremacy) for production-scale applications?
  1. Positive highlights.
  2. Negative highlights.
  3. My own interests.
  4. The IBM roadmap itself.
  5. Graphic for the IBM quantum hardware roadmap.
  6. Earlier hint of a roadmap.
  7. I’m not so interested in support software and tools.
  8. Too short — need more detail for longer-term aims, beyond 2023, just two years from now.
  9. Too brief — need more detail on each milestone.
  10. Limited transparency — I’m sure IBM has the desired detail in their internal plans.
  11. When will quantum error correction (QEC) be achieved?
  12. Need roadmap milestones for nines of qubit fidelity.
  13. Need roadmap milestones for qubit measurement fidelity.
  14. When might IBM get to near-perfect qubits?
  15. What will the actual functional transition milestones be on the path to logical qubits?
  16. Will there be any residual error for logical qubits or will they be as perfect as classical bits?
  17. Will future machines support only logical qubits or will physical qubit circuits still be supported?
  18. What functional advantages might come from larger numbers of qubits?
  19. Need milestones for granularity of phase and probability amplitude.
  20. Need timeframes and milestones for size supported for both quantum phase estimation and quantum Fourier transform.
  21. When will quantum chemists (among others) be able to rely on quantum phase estimation and quantum Fourier transform?
  22. When, if ever, will IBM support a higher-level programming model?
  23. When will larger algorithms — like using 40 qubits — become possible?
  24. When could a Quantum Volume of 2⁴⁰ be expected?
  25. When will IBM develop a replacement for the Quantum Volume metric?
  26. When will IBM need a replacement for the Quantum Volume metric?
  27. How large could algorithms be on a 1,121-qubit Condor?
  28. When might The ENIAC Moment be achieved?
  29. When might The FORTRAN Moment be achieved?
  30. When might quantum advantage be achieved?
  31. Will IBM achieve even minimal quantum advantage by the end of their hardware roadmap?
  32. How many bits can Shor’s algorithm handle at each stage of the roadmap?
  33. What applications or types of applications might be enabled in terms of support for production-scale data at each milestone?
  34. Not clear whether or when quantum networking will be supported.
  35. Quantum is still a research program at IBM — and much more research is required.
  36. Quantum computers are still a laboratory curiosity, not a commercial product.
  37. When will IBM offer production-scale quantum computing as a commercial product (or service)?
  38. Quantum Ready? For Whom? For What?
  39. Quantum Hardware Ready is needed.
  40. Need for higher-quality (and higher-capacity) simulators.
  41. Need for debugging capabilities.
  42. Need for testing capabilities.
  43. Need for dramatic improvements in documentation and technical specifications at each milestone.
  44. Brief comments on IBM’s roadmap for building an open quantum software ecosystem.
  45. Maybe many of the milestones and details which interest me occur beyond the end of the current roadmap.
  46. Heads up for other quantum computing vendors — all of these comments apply to you as well!
  47. Summary and conclusions.

Positive highlights

  1. IBM’s transparency on putting out such a roadmap.
  2. The view into the future beyond the next year or two — including a path to 1,000 qubits, a million qubits, and beyond.
  3. The mention of error correction and logical qubits.
  4. The mention of linking quantum computers to create a massively parallel quantum computer.
  5. The prospect of achieving 100 qubits sometime this year.

Negative highlights

  1. Disappointing that it took so long to put the roadmap out. I first heard mention that they had a roadmap back in 2018.
  2. Raises more questions than it answers.
  3. Too short — need more detail for longer-term aims, beyond 2023, just two years from now.
  4. Too brief — need more detail on each milestone.
  5. Needs more milestones. Intermediate stages and further stages. I certainly hope that they are working on more machines than listed over the next three to five years.
  6. Other than raw number of qubits, roughly what can algorithm designers and application developers expect to see in the next two machines, 127-qubit Eagle — in the coming six months, by the end of 2021, and the 433-qubit Osprey in 2022? Obviously a lot can change over the next six to eighteen months, but some sort of expectations need to be set.
  7. Not clear when the quantum processing unit will become modular. When will there be support for more qubits than will fit on a single chip?
  8. Not clear when or whether multiple quantum computers can be directly connected at the quantum level. Comparable to a classical multiprocessor, either tightly-coupled or loosely-coupled.
  9. Not clear whether or when quantum networking will be supported.
  10. Silent as to when error correction and logical qubits will become available.
  11. No milestones given for the path to error correction and logical qubits. What will the actual milestones, the functional transitions really be?
  12. Silent as to when qubit counts will begin to refer to logical qubits. I’m presuming that all qubit counts on the current roadmap are for physical qubits.
  13. Silent as to milestones for capacities of logical qubits, especially for reaching support for practical, production-scale applications.
  14. Silent as to any improvements in connectivity between qubits. Each milestone should indicate degree of connectivity. Will SWAP networks still be required? Will full any-to-any connectivity be achieved by some milestone?
  15. Silent as to milestones for improvements to qubit and gate fidelity. No hints for nines of qubit fidelity at each milestone.
  16. Silent as to milestones for improvements to qubit measurement fidelity.
  17. Silent as to when near-perfect qubits might be achieved. High enough fidelity that many algorithms won’t need full quantum error correction.
  18. Silent as to milestones for granularity of phase and probability amplitude.
  19. Silent as to when quantum chemists (among others) will be able to rely on quantum phase estimation and quantum Fourier transform of various sizes. When will quantum phase estimation become practical?
  20. Silent as to the metric to replace quantum volume, which doesn’t work for more than about 50 qubits. Can’t practically simulate a quantum circuit using more than about 50 qubits.
  21. Silent as to the stage at which quantum volume exceeds the number of qubits which can be practically simulated on a classical computer.
  22. Silent as to when larger algorithms — like using 40 qubits — will become possible. When could a Quantum Volume of 2⁴⁰ be expected?
  23. Silent as to how large algorithms could be on a 1,121-qubit Condor. What equivalent of Quantum Volume — number of qubits and depth of circuit — could be expected?
  24. Silent as to when quantum advantage might be expected to be achieved — for any real, production-scale, practical application. Should we presume that means that IBM doesn’t expect quantum advantage until some time after the end of the roadmap?
  25. Silent as to what applications or types of applications might be enabled in terms of support for production-scale data at each milestone.
  26. Silent on the roadmap for machine simulators, including maximum qubit count which can be simulated at each milestone. Silent as to where they think the ultimate wall is for the maximum number of qubits which can be simulated.
  27. Silent as to improvements in qubit coherence and circuit depth at each stage.
  28. Silent as to maximum circuit size and maximum circuit depth which can be supported at each stage.
  29. Silent as to how far they can go with NISQ and which machines might be post-NISQ.
  30. Silent as to when fault-tolerant machines will become available.
  31. Silent as to milestones for various intra-circuit hybrid quantum/classical programming capabilities.
  32. Open question: Will there be any residual error for logical qubits or will they be as perfect as classical bits?
  33. Open question: At some stage, will future machines support only logical qubits or will physical qubit circuits still be supported?
  34. Open question: What will be the smallest machine supporting logical qubit circuits?
  35. Silent as to debugging capabilities.
  36. Silent as to testing capabilities.
  37. It is quite clear that quantum computing is still a research program at IBM, not a commercial product suitable for production use.
  38. Silent as to when quantum computing might transition from mere laboratory curiosity to front-line commercial product suitable for production-scale use cases.
  39. Silent as to how much additional research, beyond the end of the current roadmap, may be necessary before a transition to a commercial product.
  40. Silent as to improvements in documentation and technical specifications at each milestone.

My own interests

  1. When might quantum advantage be achieved — for any real, production-scale, practical application? For minimal quantum advantage (e.g., 2X, 10X, 100X), significant quantum advantage (e.g., 1,000X to 1,000,000X), and dramatic quantum advantage (one quadrillion X)?
  2. How close will each stage come to full quantum advantage? What fractional quantum advantage is achieved at each stage?
  3. What applications might achieve quantum advantage at each stage?
  4. What applications will be supported at each stage which weren’t feasible at earlier stages?
  5. Each successive stage should have some emblematic algorithm which utilizes the new capabilities of that stage, such as more qubits, deeper circuit depth, not just running the same old algorithms with the same number of qubits and circuit depth as for earlier, smaller machines.
  6. What functional advantages might come from larger numbers of qubits, beyond simply that algorithms can handle more data?
  7. Is there any reason to believe that there might be a better qubit technology (alternative to superconducting transmon qubits) down the road, or any reason to believe that no better qubit technology is needed? Does IBM anticipate that there might be a dramatic technology transition at some stage, maybe five, ten, or more years down the road?
  8. Does IBM anticipate that they might actually support more than one qubit technology at some stage? Like, trapped ion?
  9. When, if ever, will IBM support a higher-level programming model, with higher-level algorithmic building blocks, which makes it feasible for non-quantum experts to translate application problems into quantum solutions without knowledge of quantum logic gates and quantum states?
  10. When might The ENIAC Moment be achieved? First production-scale application.
  11. When might The FORTRAN Moment be achieved? Higher-level programming model which makes it easy for most organizations to develop quantum applications — without elite teams.
  12. How many bits can Shor’s algorithm handle at each stage of the roadmap?
  13. Need for a broad set of benchmark tests to evaluate performance, capacity, and precision of various algorithmic building blocks, such as phase estimation, along with target benchmark results for each hardware milestone.
  14. Milestones for optimizing various algorithmic building blocks, such as phase estimation, based on hardware improvements at each stage.
  15. The maximum size of algorithms which can correctly run on the physical hardware at each milestone but can no longer be classically simulated. Number of qubits and circuit depth. Maybe several thresholds for the fraction of correct executions. For now, this could parallel projections of log2(Quantum Volume) and estimate when log2(QV) exceeds the maximum classical quantum simulator capacity.

The IBM roadmap itself

Graphic for the IBM quantum hardware roadmap

Earlier hint of a roadmap

I’m not so interested in support software and tools

Too short — need more detail for longer-term aims, beyond 2023, just two years from now

  1. 3 years.
  2. 5 years.
  3. 7 years.
  4. 10 years.
  5. 12 years.
  6. 15 years.
  7. 20 years.
  8. 25 years. Where is the technology really headed?

Too brief — need more detail on each milestone

  1. Qubit fidelity.
  2. Qubit lattice layout.
  3. Qubit connectivity.
  4. Gate cycle time.
  5. Qubit coherence.
  6. Maximum circuit depth.
  7. Maximum circuit size.
  8. Maximum circuit executions per second.

Limited transparency — I’m sure IBM has the desired detail in their internal plans

When will quantum error correction (QEC) be achieved?

  1. … as we scale up the number of physical qubits, we will also be able to explore how they’ll work together as error-corrected logical qubits — every processor we design has fault tolerance considerations taken into account.
  2. We think of Condor as an inflection point, a milestone that marks our ability to implement error correction and scale up our devices…

Need roadmap milestones for nines of qubit fidelity

  1. Coherence time.
  2. Gate errors. Both single-qubit and two-qubit.
  3. Measurement errors.
  1. Two nines — 99%.
  2. Three nines — 99.9%.
  3. Four nines — 99.99%.
  4. Five nines — 99.999%.
  5. Six nines — 99.9999%.
  6. Whether IBM has intentions or plans for more than six nines of qubit fidelity should be specified. Seven, eight, nine, and higher nines of qubit fidelity would be great, but will likely be out of reach in the next two to four years.
  7. What maximum qubit fidelity, short of quantum error correction, could be achieved in the longer run, beyond the published roadmap, should also be specified.
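To make concrete why each additional nine matters, here is a back-of-the-envelope sketch under a deliberately naive model (my own illustration, not from the roadmap): if every gate succeeds independently with fidelity f, a circuit of g gates succeeds with probability roughly f^g, which caps the circuit size usable at a given fidelity.

```python
# Rough illustration of why nines of gate fidelity matter, assuming a
# naive model where a circuit succeeds only if every gate succeeds:
# success probability ~ fidelity ** gate_count. Ignores crosstalk,
# measurement error, and error structure -- purely illustrative.
import math

def circuit_success(fidelity: float, gate_count: int) -> float:
    """Probability that all gates in a circuit succeed."""
    return fidelity ** gate_count

for nines, fidelity in [(2, 0.99), (3, 0.999), (4, 0.9999), (5, 0.99999)]:
    # Largest circuit that still succeeds at least half the time.
    max_gates = int(math.log(0.5) / math.log(fidelity))
    print(f"{nines} nines ({fidelity}): ~{max_gates} gates at >= 50% success")
```

Under this crude model, each extra nine of fidelity buys roughly a tenfold increase in usable circuit size — which is why per-milestone fidelity targets matter as much as qubit counts.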

Need roadmap milestones for qubit measurement fidelity

When might IBM get to near-perfect qubits?

What will the actual functional transition milestones be on the path to logical qubits?

Will there be any residual error for logical qubits or will they be as perfect as classical bits?

  1. Six nines — one error in a million operations.
  2. Nine nines — one error in a billion operations.
  3. Twelve nines — one error in a trillion operations.
  4. Fifteen nines — one error in a quadrillion operations.

Will future machines support only logical qubits or will physical qubit circuits still be supported?

What functional advantages might come from larger numbers of qubits?

Need milestones for granularity of phase and probability amplitude

Need timeframes and milestones for size supported for both quantum phase estimation and quantum Fourier transform

  1. 4-bit.
  2. 8-bit.
  3. 12-bit.
  4. 16-bit.
  5. 20-bit.
  6. 24-bit.
  7. 32-bit.
  8. 40-bit.
  9. 48-bit.
  10. 56-bit.
  11. 64-bit.
  12. 80-bit.
  13. 96-bit.
  14. 128-bit.
  15. 192-bit.
  16. 256-bit.

When will quantum chemists (among others) be able to rely on quantum phase estimation and quantum Fourier transform?

When, if ever, will IBM support a higher-level programming model?

When might The ENIAC Moment be achieved?

When might The FORTRAN Moment be achieved?

When will larger algorithms — like using 40 qubits — become possible?

  1. 24 qubits.
  2. 28 qubits.
  3. 32 qubits.
  4. 36 qubits.
  5. 40 qubits.
  6. 44 qubits.
  7. 48 qubits.
  8. 50 qubits.
  9. 56 qubits.
  10. 60 qubits.
  11. 64 qubits.
  12. And more.

When could a Quantum Volume of 2⁴⁰ be expected?

When will IBM develop a replacement for the Quantum Volume metric?

When will IBM need a replacement for the Quantum Volume metric?
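The practical driver here is classical simulability: Quantum Volume is validated by classically simulating the benchmark circuits, and a full state-vector simulation of n qubits must store 2^n complex amplitudes. A quick sketch of that memory wall (my own illustration, assuming 16 bytes per complex amplitude):

```python
# Why Quantum Volume breaks down past roughly 50 qubits: measuring QV
# requires classically simulating the benchmark circuits, and a full
# state vector of n qubits holds 2**n complex amplitudes.

BYTES_PER_AMPLITUDE = 16  # one double-precision complex value

def state_vector_bytes(n_qubits: int) -> int:
    """Memory needed for a full state-vector simulation of n qubits."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (30, 40, 50, 60):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

At 50 qubits the state vector already needs on the order of 16 petabytes, so any replacement metric will have to be verifiable without brute-force simulation.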

How large could algorithms be on a 1,121-qubit Condor?

When might quantum advantage be achieved?

  1. Minimal quantum advantage. A 1,000X performance advantage over classical solutions. 2X, 10X, and 100X (among others) are reasonable stepping stones.
  2. Substantial or significant quantum advantage. A 1,000,000X performance advantage over classical solutions. 20,000X, 100,000X, and 500,000X (among others) are reasonable stepping stones.
  3. Dramatic quantum advantage. A one quadrillion X (one million billion times) performance advantage over classical solutions. 100,000,000X, a billion X, and a trillion X (among others) are reasonable stepping stones.

Will IBM achieve even minimal quantum advantage by the end of their hardware roadmap?

How many bits can Shor’s algorithm handle at each stage of the roadmap?

  1. 5-bit.
  2. 6-bit.
  3. 7-bit.
  4. 8-bit.
  5. 10-bit.
  6. 12-bit.
  7. 16-bit.
  8. 20-bit.
  9. 24-bit.
  10. 32-bit.
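As one rough yardstick for the bit sizes above (my own illustration, not from the roadmap): Beauregard's well-known circuit for Shor's algorithm factors an n-bit number using about 2n + 3 logical qubits, though other constructions trade qubit count against circuit depth.

```python
# Rough logical-qubit requirements for Shor's algorithm at each bit size,
# using Beauregard's 2n + 3 qubit construction as one common estimate.
# Other circuits trade qubit count for depth; this is illustrative only.

def shor_qubits(bits: int) -> int:
    """Approximate logical qubits to factor a `bits`-bit number."""
    return 2 * bits + 3

for bits in (5, 6, 7, 8, 10, 12, 16, 20, 24, 32):
    print(f"{bits:>2}-bit: ~{shor_qubits(bits)} logical qubits")
```

Note these are logical qubits — with error correction overhead, the physical qubit counts would be orders of magnitude larger, which is why per-milestone logical-qubit capacity matters so much.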

What applications or types of applications might be enabled in terms of support for production-scale data at each milestone?

Not clear whether or when quantum networking will be supported

Quantum is still a research program at IBM — and much more research is required

Quantum computers are still a laboratory curiosity, not a commercial product

When will IBM offer production-scale quantum computing as a commercial product (or service)?

Quantum Ready? For Whom? For What?

Quantum Hardware Ready is needed

Need for higher-quality (and higher-capacity) simulators

Need for debugging capabilities

Need for testing capabilities

  1. Unit testing.
  2. Module testing.
  3. System testing.
  4. Performance testing.
  5. Logic analysis.
  6. Coverage analysis.
  7. Shot count and circuit repetitions — analyzing results for multiple executions of the same circuit.
  8. Calibration.
  9. Diagnostics.
  10. Hardware fault detection.

Need for dramatic improvements in documentation and technical specifications at each milestone

Brief comments on IBM’s roadmap for building an open quantum software ecosystem

  1. IBM’s software roadmap is too brief, too terse, and too vague to make many definitive comments about it.
  2. It sort of hints at a higher-level programming model, but in a fragmentary manner, not fully integrated, and doesn’t even use the term programming model at all.
  3. It does indeed have some interesting fragmentary thoughts, but just too little in terms of a coherent overarching semantic model. Some pieces of the puzzle are there, but not the big picture that puts it all together.
  4. I heartily endorse open source software, but there is a wide range of variations on support for open source software. Will IBM cede 100% of control to outside actors or maintain 100% control but simply allow user submissions? Who ultimately has veto authority about the direction of the software — the community or IBM?
  5. I heartily endorse ecosystems as well, but that can be easier said than done.
  6. I nominally support their three levels (they call them segments) of kernel, algorithms, and models, but I would add two levels: custom applications, and then packaged solutions (generalized applications). From the perspective of this (my) paper, I’m focused on the programming model(s) to be used by algorithm developers and application developers.
  7. I personally use the term algorithmic building blocks, which may or may not be compatible with IBM’s notion of modules. My algorithmic building blocks would apply primarily to algorithm designers, but also to application developers (custom and packaged) and application framework developers as well.
  8. IBM also refers to application-specific modules for natural science, optimization, machine learning, and finance, which I do endorse, but I also personally place attention on general-purpose algorithmic building blocks which can be used across application domains. Personally, I would substitute domain-specific for application-specific.
  9. I personally use the term application framework, which may be a match to IBM’s concept of a model.
  10. In their visual diagram, IBM refers to Enterprise Clients, but that seems to refer to enterprise developers.
  11. I appreciate IBM’s commitment to a frictionless development framework, but it’s all a bit too vague for me to be very confident about what it will actually do in terms of specific semantics for algorithms and applications. Again, I’m not so interested in support services and tools as I am in the actual semantics of the programming model.
  12. IBM says “where the hardware is no longer a concern to users or developers”, but that’s a bit too vague. Does it mean they aren’t writing code at all? Or does it simply mean a machine-independent programming model? Or does it mean a higher-level programming model, such as what I have been proposing? Who knows! IBM needs to supply more detail.
  13. I’m all in favor of domain-specific pre-built runtimes — if I understand IBM’s vague description, which seems consistent with my own thinking about packaged solutions which allow the user to focus on preparing input data and parameters, and then processing output data, without even touching or viewing the actual quantum algorithms or application source code. That said, I worry a little that their use of runtime may imply significant application logic that invokes the runtime rather than focusing the user on data and configuration parameters. I do see that the vast majority of users of quantum applications won’t even be writing any code, but how we get there is an open question. In any case, this paper of mine is focused on quantum algorithm designers and quantum application developers and how they see and use the hardware.
  14. Kernel-level code is interesting, but not so much to me. Maybe various algorithmic building blocks, such as quantum Fourier transform or SWAP networks could be implemented at kernel level, but ultimately, all I really care about is the high-level interface that would be available to algorithm designers and application developers — the programming model, their view of the hardware. The last thing I want to see is algorithm designers and application developers working way down at machine-specific kernel level.
  15. I heartily endorse application-specific modules for natural science, optimization, machine learning, and finance — at least at a conceptual level. Anything that enables users or application developers to perform computations at the application level without being burdened by details about either the hardware or quantum mechanics. All of that said, I can’t speak to whether I would approve of how IBM is approaching the design of these modules. Also, I am skeptical as to when the hardware will be sufficiently mature to support such modules at production-scale.
  16. I nominally endorse quantum model services for natural science, optimization, machine learning, and finance — at least at a conceptual level. If I read the IBM graphic properly, such model services won’t be available until 2023 at the earliest and possibly not until 2026. Even there, it’s not clear if it’s simply that all of the lower-level capabilities are in place to enable model developers to develop such application-specific models, or whether such models will then be ready for use by application developers.
  17. No mention of any dependencies on hardware advances, such as quantum error correction and logical qubits, improvements in qubit fidelity, improvements in qubit connectivity.
  18. No mention of Quantum Volume or matching size of algorithms and required hardware.
  19. No sense of synchronizing the hardware roadmap and the software roadmap.
  20. No mention of networked applications or quantum networking.
  21. No mention of evolution towards vendor-neutral technical standards. The focus is clearly on IBM setting the standards for others to follow. That may not be so much a negative as simply a statement of how young and immature the sector remains.

Maybe many of the milestones and details which interest me occur beyond the end of the current roadmap

Heads up for other quantum computing vendors — all of these comments apply to you as well!

Summary and conclusions

  1. Great that IBM has shared what they have for a roadmap.
  2. Disappointing that it took so long to get it out.
  3. More questions than answers.
  4. Much greater detail is needed.
  5. Full error correction is still far over the horizon.
  6. Evolution of qubit fidelity between milestones is unclear.
  7. Not very clear what developers will really have to work with at each milestone, especially in terms of coherence time, qubit fidelity, gate error rate, measurement error rate, and connectivity.
  8. Waiting to hear what will succeed Quantum Volume once more than 50 qubits can be used reliably in a deep algorithm.
  9. This is all still just a research program, a laboratory curiosity, not a commercial product (or service) suitable for production use for production-scale practical applications.
  10. Unclear how much more research will be required after the end of the current IBM hardware roadmap before quantum computing can transition to a commercial product suitable for production-scale practical quantum applications.
  11. Unclear what the timeframe will be for transition to a commercial product (or service).
  12. No sense of when they might achieve The ENIAC Moment — first production-scale application.
  13. No sense of when they might achieve The FORTRAN Moment — easy for most organizations to develop quantum applications — without elite teams.
  14. Unclear whether IBM will achieve even minimal quantum advantage (1,000X classical solutions) by the end of their hardware roadmap (2023 with 1,121-qubit Condor) or whether we’ll have to await the “and beyond” stages after the end of the roadmap.
  15. It’s very possible that many of the milestones or details which interest me might occur well beyond the end of IBM’s current quantum hardware roadmap — in the “and beyond” stage beyond the 1,121-qubit Condor, the current end of the roadmap.
  16. Many, most, if not virtually all of my comments here apply to any vendor in quantum computing, including IBM’s competitors and everyone’s partners, suppliers, and customers as well. And researchers in academia as well. Show your roadmaps, your milestones, and the details I have noted.
  17. For now, we remain waiting for the next machine on the roadmap — 127-qubit Eagle — in the coming six months, by the end of 2021, and the 433-qubit Osprey in 2022.
  18. Overall, we’re still in the early innings for quantum computing — and not close to being ready for prime time with production-scale practical applications — or even close to achieving any significant degree of quantum advantage, the purpose for even considering quantum computing.

Jack Krupansky

Freelance Consultant