Speculative Preview of the IBM 433-qubit Osprey Quantum Computer

Jack Krupansky
Mar 23, 2022


After being disappointed by the IBM 127-qubit Eagle quantum computer, I’m anxious about what’s in store with the upcoming 433-qubit Osprey quantum computer due to be introduced by IBM towards the end of 2022. Will it be basically just more of the same, just with more qubits (and a new refrigerator), or will it be a real breakthrough in any of the areas which really matter to algorithm designers and quantum application developers? This informal paper explores what we do know and speculates about what we don’t know, both what is most likely and what would be best. My hope is that this paper might help to cajole IBM into doing a better job of setting expectations for Osprey and of detailing milestones for specific technical factors in their hardware roadmap going forward.

Updates

  1. Since I wrote and posted this back in March 2022, there have been a few additional tidbits of information and speculation to add. See the section UPDATE on October 28, 2022 — Additional information.
  2. And some additional tidbits have come out with the actual unveiling of Osprey at the IBM Quantum Summit on November 9, 2022. See the section UPDATE on November 18, 2022 — Formal unveiling at IBM Quantum Summit 2022 on November 9, 2022. Unfortunately, even with the unveiling, there is still very little actual information available since Osprey won’t actually be available for a few more months.
  3. In short, stay tuned for actual availability sometime in Q1 of 2023.
  4. Oops… as of April 9, 2023, five full months since Osprey’s formal unveiling, IBM has still not made Osprey publicly available, so ignore my previous comment above. For more fresh comments, see the section UPDATE on April 9, 2023 — Five full months since unveiling but still no hint of public availability.
  5. May 8, 2023 — BREAKING NEWS — Osprey has landed! Now publicly available! See initial comments in the section UPDATE on May 8, 2023 — Osprey is now publicly available.
  6. September 7, 2023 — BREAKING NEWS — Osprey is now retired. See the section UPDATE on September 7, 2023 — Osprey is now retired. No reason given.

Topics discussed in this paper:

  1. In a nutshell
  2. References
  3. Quantum processor vs. quantum computer
  4. My motivation
  5. Goals for this paper
  6. What I include in this paper
  7. Timing of my paper
  8. Osprey is poised for disappointment
  9. Osprey may offer enough improvement to not be an absolute flop, but disappointing enough to be a relative flop
  10. IBM needs much better messaging
  11. IBM needs to announce preliminary benchmarking results when Osprey is formally announced and made available
  12. IBM needs to provide full technical documentation and technical specifications when Osprey is formally announced and made available
  13. What are IBM’s intentions with Osprey?
  14. Are there other intentions of IBM that we just don’t know about, yet?
  15. Osprey is still an impressive engineering achievement under the hood
  16. What value is IBM attempting to offer quantum algorithm designers and quantum application developers with Osprey?
  17. Extrapolation from Eagle
  18. The key technical factors for Osprey that could impact quantum algorithms and quantum applications
  19. The Big Four of the key technical factors which matter the most
  20. General support for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) is critical
  21. Fine granularity of phase is needed to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE)
  22. What qubit count has IBM committed for Osprey?
  23. What qubit count is likely in Osprey?
  24. What qubit count does Osprey need to inspire confidence?
  25. How many qubits is enough to keep people happy?
  26. What qubit fidelity has IBM committed for Osprey?
  27. Nines of qubit fidelity
  28. What qubit fidelity is likely in Osprey?
  29. What qubit fidelity does Osprey need to inspire confidence?
  30. Does Osprey need near-perfect qubits to inspire significant excitement? Probably.
  31. What about qubit measurement fidelity for Osprey?
  32. Mediocre qubit measurement fidelity would be a critical flaw for Osprey
  33. What qubit connectivity has IBM committed for Osprey?
  34. What qubit connectivity is likely in Osprey?
  35. What qubit connectivity does Osprey need to inspire confidence?
  36. Will Osprey need some sort of quantum state bus to satisfy demand for greater qubit connectivity?
  37. Is some other hardware mechanism needed to increase qubit connectivity for Osprey?
  38. No, SWAP networks are not a viable alternative to enhanced qubit connectivity
  39. What granularity of phase and probability amplitude has IBM committed for Osprey?
  40. What granularity of phase and probability amplitude is likely in Osprey?
  41. What granularity of phase and probability amplitude does Osprey need to inspire confidence?
  42. What coherence time and circuit depth has IBM committed for Osprey?
  43. What coherence time and circuit depth is likely in Osprey?
  44. What coherence time and circuit depth does Osprey need to inspire confidence?
  45. What has IBM committed to for quantum error correction (QEC) in Osprey?
  46. What do I hope that IBM will deliver for quantum error correction (QEC) in Osprey?
  47. Do I expect a demonstration of quantum error correction in Osprey? No, but…
  48. Maybe a demonstration of quantum error correction in a subsequent revision to Osprey? Possibly, but…
  49. Is the credibility of quantum error correction at stake with Osprey?
  50. IBM technical credibility will suffer if they neither show progress with quantum error correction nor progress towards near-perfect qubits
  51. What Quantum Volume has IBM committed for Osprey?
  52. What Quantum Volume is likely in Osprey?
  53. What Quantum Volume does Osprey need to inspire confidence?
  54. Quantum Volume cannot use more than 50 qubits (and maybe only 40 or even 32 qubits)
  55. Topology and connectivity of Osprey’s 433 qubits
  56. Will Osprey be modular?
  57. IBM Quantum System Two
  58. Will the new dilution refrigerator offer any functional advantage to quantum applications?
  59. What speed or throughput will Osprey have?
  60. Will Osprey support Qiskit Runtime?
  61. Is Osprey simply a bigger Eagle (more qubits)?
  62. Concerns from Eagle which may or may not apply to Osprey as well
  63. Competition from other qubit technologies
  64. IBM quantum is still in research mode, not mainline commercial product engineering, yet
  65. When might IBM transition their quantum efforts from research to commercial product development?
  66. Unfortunately, parts of IBM are acting as if quantum was a commercial product and engaging in premature commercialization
  67. Research orientation of Osprey reemphasizes that quantum computing is still in the pre-commercialization stage, still well short of being ready for commercialization
  68. Still a mere laboratory curiosity
  69. Still more appropriate for the lunatic fringe rather than mainstream application developers
  70. Risk for Quantum Winter
  71. What can we expect from future revisions of Osprey?
  72. I’d much rather see an upgraded 27-qubit Falcon than a 433-qubit Osprey
  73. UPDATE on October 28, 2022 — Additional information
  74. UPDATE on November 18, 2022 — Formal unveiling at IBM Quantum Summit 2022 on November 9, 2022
  75. UPDATE on April 9, 2023 — Five full months since unveiling but still no hint of public availability
  76. UPDATE on May 8, 2023 — Osprey is now publicly available
  77. UPDATE on September 7, 2023 — Osprey is now retired
  78. My original proposal for this topic
  79. Summary and conclusions

In a nutshell

There is really so much to say, but let me try to summarize it as briefly as possible…

  1. Osprey is poised for disappointment. Numerous technical obstacles.
  2. Plenty of hope for Osprey. But hope is not a plan or certainty.
  3. IBM needs to up their game to avoid disaster. Technical obstacles can be overcome, but only through much more serious effort.
  4. Overall, Osprey is just an upsized Eagle. More qubits, miniaturized, but individually not much better functionally.
  5. My overall disappointment with the IBM 127-qubit Eagle. No improvement in qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
  6. My overall disappointment with the IBM hardware roadmap. No milestones or detail for quantum error correction (QEC), qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
  7. Low qubit fidelity and weak qubit connectivity. No dramatic improvement over Eagle.
  8. Mediocre qubit measurement fidelity. Same issues as with Eagle. A critical weakness.
  9. Still an impressive engineering achievement under the hood. But most of that internal engineering effort doesn’t affect fidelity or performance from the perspective of a quantum algorithm designer or quantum application developer.
  10. Maybe enough improvement to not be an absolute flop, but disappointing enough to be a relative flop.
  11. IBM needs much better messaging. Needs to set expectations more accurately.
  12. Osprey probably needs near-perfect qubits to inspire any significant excitement.
  13. Only four details we know for sure about Osprey…
  14. Osprey will have 433 qubits.
  15. Key advancement of Osprey will be miniaturization of components.
  16. Osprey will be based on the new quantum hardware infrastructure of the IBM Quantum System Two.
  17. The IBM Quantum System Two incorporates a new cryogenic refrigerator from Bluefors.
  18. IBM hasn’t committed to any improvement in qubit fidelity.
  19. IBM hasn’t committed to any improvement in granularity of phase and probability amplitude.
  20. Fine granularity of phase is needed to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE). Essential for advanced applications such as quantum computational chemistry.
  21. IBM hasn’t committed to any improvement in coherence time and circuit depth.
  22. IBM hasn’t committed to any advances in quantum error correction (QEC) in Osprey.
  23. But I would hope that IBM would demonstrate at least a small handful of perfect logical qubits in Osprey. Maybe six or seven.
  24. IBM hasn’t committed to any improvement in Quantum Volume (QV).
  25. Advances or lack of advances in Osprey will confirm, set, or undermine IBM’s technical credibility.
  26. Research orientation of Osprey reemphasizes that quantum computing is still in the pre-commercialization stage, still well short of being ready for commercialization.
  27. Still a mere laboratory curiosity. Not ready for production-scale practical real-world quantum applications.
  28. Still more appropriate for the lunatic fringe rather than mainstream application developers.
  29. IBM needs to announce preliminary benchmarking results when Osprey is formally announced and made available. Including Quantum Volume (QV) and qubit fidelity.
  30. IBM needs to provide full technical documentation and technical specifications when Osprey is formally announced and made available. Including a Principles of Operation document which details the programming model.
  31. I’d much rather see an upgraded 27-qubit Falcon than a 433-qubit Osprey. A 27-qubit Falcon with enhanced connectivity and another 1.5 nines of qubit fidelity would be much more beneficial for quantum algorithm designers and quantum application developers than whatever Osprey might deliver. Hummingbird and Eagle were distractions rather than substantial advances for quantum algorithm designers and quantum application developers.
  32. My hope is that this paper might help to cajole IBM into doing a better job of setting expectations for Osprey and to do a better job of detailing milestones for specific technical factors in their hardware roadmap going forward.
  33. Some late-breaking updates. See the section UPDATE on October 28, 2022 — Additional information.
  34. Some information at the unveiling at the IBM Quantum Summit. See the section UPDATE on November 18, 2022 — Formal unveiling at IBM Quantum Summit 2022 on November 9, 2022.
  35. In short, stay tuned for actual availability sometime in Q1 of 2023. Oops… that was the expectation, but obviously that didn’t happen.
  36. Stay tuned for actual public availability, whenever. No idea when that will be now. See the section UPDATE on April 9, 2023 — Five full months since unveiling but still no hint of public availability.
  37. UPDATE on May 8, 2023 — Osprey is now publicly available.
  38. UPDATE on September 7, 2023 — Osprey is now retired.

References

My preliminary comments on the IBM 127-qubit Eagle:

There are a number of references for Eagle in that paper.

IBM quantum hardware roadmap which contains the first reference to Osprey:

IBM press release for Eagle which also previews the IBM Quantum System Two and mentions that it will work with 433-qubit processors (and beyond), although it doesn’t name Osprey by name:

Random IBM post (no date) which mentions Osprey:

  • 5 Things to Know About the IBM Roadmap to Scaling Quantum Technology
  • Eagle will be followed by the 433-qubit “Osprey” processor in 2022. Osprey continues to push the boundaries of fabrication techniques to build a smaller chip to ensure more logical qubits that don’t sacrifice performance. Its more-efficient and denser controls and cryogenic infrastructure will ensure that scaling up future processors doesn’t sacrifice the performance of individual qubits, introduce further sources of noise, or take up too large a footprint.
  • https://newsroom.ibm.com/IBM-research?item=32425

Quantum processor vs. quantum computer

Technically, Osprey is a quantum processor rather than a quantum computer per se.

The quantum processor is where all of the computation is performed. The actual chip. All of the rest of the hardware is the quantum computer system, or simply quantum computer, or as IBM refers to it, the quantum system.

Most of the quantum system is common, regardless of the actual quantum processor chip. So, the 127-qubit Eagle, the 65-qubit Hummingbird, and the 27-qubit Falcon all share the same overall quantum system, called the IBM Quantum System One.

Osprey and future quantum processors, including the 1,121-qubit Condor, will all share a new overall quantum system, called the IBM Quantum System Two. All of that hardware other than the processor chip is the same, regardless of which processor chip is used.

There are also the wiring and the electronics to drive the wiring, but those are essentially the same as well: one set per qubit, or one per group of qubits for newer systems with serial readout.

All of that said, I personally will continue to refer to these as quantum computers — the 433-qubit Osprey quantum computer. Maybe that’s because I’m a software guy and it’s the functions under the hood which matter most, regardless of how it is all sliced, diced, and packaged.

My motivation

My personal motivation for this informal paper is twofold:

  1. My overall disappointment with Eagle. A big advance in qubit count and great internal engineering improvements, but no benefits in any of the main technical areas. No advances which would benefit quantum algorithm designers or quantum application developers.
  2. My overall disappointment with the IBM hardware roadmap. Lack of technical detail. Other than a brief mention and qubit count, virtually nothing. No milestones for quantum error correction (QEC). No milestones for qubit fidelity. No milestones for the other key technical factors.

I will endeavor to:

  1. List all known details about Osprey.
  2. Predict or speculate about possible details about Osprey.

Goals for this paper

Briefly, my intentions are to:

  1. Detail what we know.
  2. Speculate on what we think is likely.
  3. Speculate on what we don’t know.
  4. Express aspirations.
  5. Express concerns with the intent that maybe IBM will respond in some constructive manner.
  6. Overall, set expectations for what Osprey will, might, could, won’t, and is unlikely to be.

What I include in this paper

My writing in this informal paper focuses on:

  1. What I expect from IBM and Osprey.
  2. What I am concerned about.
  3. What I worry about.
  4. Opportunities which IBM might miss.
  5. Missteps with Eagle which IBM might repeat with Osprey.
  6. Premonitions, nightmares.
  7. Poor messaging. Missed opportunities for great messaging from IBM.

Timing of my paper

It will be quite a few months from when I post this paper until IBM actually introduces Osprey later this year, 2022. My rationale for posting in advance of the formal announcement and this far in advance is twofold:

  1. Long enough lead time to have some hope of influencing IBM’s planning and efforts. I don’t actually expect that IBM will pay any attention to what I write, but I do feel an ethical obligation to let them know about what I see that concerns me.
  2. Hope to have some impact on IBM’s messaging. Try to get them to communicate more explicitly about the key benefits of Osprey over Eagle and Falcon. They need to do a much better job of setting expectations than they’ve done in the past, especially for Eagle, but for all of their other quantum processors as well.

Osprey is poised for disappointment

Overall, there are just too many technical obstacles for Osprey to overcome for it to be anything more than a significant disappointment. Yes, there is plenty of hope for Osprey, but hope is not a plan or certainty.

IBM can still overcome the technical challenges, but they need to up their game to avoid disaster. Technical obstacles can be overcome, but only through much more serious effort.

Some of the reasons why Osprey is poised to be a disappointment:

  1. Overall, Osprey is just an upsized Eagle. More qubits, miniaturized, but individually not much better functionally.
  2. My overall disappointment with the IBM 127-qubit Eagle. No improvement in qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
  3. My overall disappointment with the IBM hardware roadmap. No milestones or detail for quantum error correction (QEC), qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
  4. Low qubit fidelity and weak qubit connectivity. No dramatic improvement over Eagle.
  5. Mediocre qubit measurement fidelity. Same issues as with Eagle. A critical weakness.
  6. Still an impressive engineering achievement under the hood. But most of that internal engineering effort doesn’t affect fidelity or performance from the perspective of a quantum algorithm designer or quantum application developer.
  7. Maybe enough improvement to not be an absolute flop, but disappointing enough to be a relative flop.
  8. IBM needs much better messaging. Needs to set expectations more accurately.

Osprey may offer enough improvement to not be an absolute flop, but disappointing enough to be a relative flop

I seriously don’t expect Osprey to be a total flop. I do expect a variety of improvements, but I still suspect that they won’t be enough to avoid serious disappointment.

Eagle was much more disappointing than I expected.

I do hope that IBM learned some valuable and insightful lessons from Eagle which will result in fewer missteps and more improvements in Osprey.

But on net, I fully expect sentiment to run significantly against Osprey.

IBM needs much better messaging

Quite a few of the issues I’ve raised in previous papers concerning Eagle and the IBM quantum hardware roadmap really come down to just basic messaging, just setting expectations realistically.

The overall goal of messaging for quantum computers is to give quantum algorithm designers and quantum application developers the critical information they need to make the best use of the technology.

Some specific action items:

  1. IBM needs to set expectations more accurately.
  2. IBM needs to avoid overpromising.
  3. IBM needs to focus attention on where they excel, their strong points.
  4. IBM needs to be much more direct and honest as to their technical shortcomings — and offer action plans for how and when they will be addressing each technical shortcoming.
  5. The IBM quantum hardware roadmap needs to be much more explicit in terms of milestones and specific technical features. Whether it’s increments of improvement in qubit fidelity, architectural changes for qubit connectivity, or the staging of quantum error correction.
  6. Benchmarking results need to be a formal aspect of required messaging. This includes Quantum Volume (QV) and qubit fidelity.
  7. Full technical documentation and technical specifications also need to be a formal aspect of required messaging.

IBM needs to announce preliminary benchmarking results when Osprey is formally announced and made available

Benchmarking results for Osprey should be formally disclosed at the time that Osprey is announced and made available. They should be fairly comprehensive, and certainly include Quantum Volume (QV) and qubit fidelity (nines of qubit fidelity) very prominently. This is all part of messaging.

Qubit fidelity is an excellent surrogate for the overall capabilities of a quantum computer.

For more on nines of qubit fidelity, see my paper:

IBM needs to provide full technical documentation and technical specifications when Osprey is formally announced and made available

Full technical documentation and technical specifications also need to be a formal aspect of required messaging, and be provided when Osprey is formally announced and made available.

Technical specifications must include a formal Principles of Operation document which provides a sufficient level of detail about the programming model of the quantum computer so that quantum algorithm designers and quantum application developers can effectively exploit the technical capabilities of the quantum computer.

I describe the concept of a Principles of Operation document in my paper:

What are IBM’s intentions with Osprey?

Some questions about IBM’s intentions with Osprey:

  1. What problem is IBM trying to solve?
  2. What did IBM intend to do with Osprey?
  3. What’s the point of Osprey?
  4. What is IBM attempting to accomplish with Osprey?
  5. What value is IBM attempting to offer quantum algorithm designers and quantum application developers? Other than simply more of the same qubits as Eagle.
  6. Was the intent simply to offer more qubits?
  7. Was the intent simply to miniaturize qubits and components — but not make them function any different or better?
  8. Is Osprey simply an internal interim engineering stepping stone rather than explicitly offering additional features or functions or enhancements per se?
  9. Is Osprey more of an internal hardware improvement rather than functionally better?
  10. Who might actually benefit from all of these extra qubits?
  11. Are there other intentions that we just don’t know about, yet?

Unfortunately, none of these questions has answers at this stage. IBM hasn’t said. We could speculate, but that’s about it.

The closest we can come to an answer is simply:

  1. Osprey offers a lot more qubits. Each qubit is likely comparable to a qubit on Eagle.
  2. IBM focused on miniaturization to enable more qubits.
  3. Functionally, the qubits should behave approximately the same as on Eagle from the perspective of quantum algorithm designers and quantum application developers. Just that there are more of them.

Are there other intentions of IBM that we just don’t know about, yet?

I presume that there may very well be additional intentions that we won’t know about until IBM introduces Osprey late this year. And maybe not even then.

What might they be? All we can do is speculate.

But if indeed Osprey does have functional, feature, and performance improvements, they will quickly become apparent when Osprey is introduced, but until then we will be flying blind.

Osprey is still an impressive engineering achievement under the hood

Despite all of the potential technical shortcomings that I might highlight for Osprey, it still will have quite a bit of impressive engineering achievement under the hood.

  1. Miniaturizing qubits and control circuits is no trivial feat.
  2. Isolating, maintaining, and connecting 433 qubits is no trivial feat.
  3. A new cryogenic dilution refrigerator is no trivial feat.

IBM’s hardware engineers and scientists should be applauded.

But most of that internal engineering effort doesn’t affect qubit fidelity or performance from the perspective of a quantum algorithm designer or quantum application developer.

What value is IBM attempting to offer quantum algorithm designers and quantum application developers with Osprey?

Other than simply more of the same qubits as Eagle, as far as I can tell at this juncture, IBM doesn’t seem to be intending to offer any real, significant value to quantum algorithm designers and quantum application developers with Osprey.

So, if you are a quantum algorithm designer or quantum application developer, you might as well stick with Eagle or even Falcon.

Without dramatic improvements in qubit fidelity or qubit connectivity, even the 27-qubit Falcon has more qubits than most quantum algorithm designers and quantum application developers can effectively use.

Just a reminder that Eagle offers only a Quantum Volume (QV) of 64 and Falcon offers a Quantum Volume of 128. That’s 6 and 7 qubits respectively as the maximum number that can be used in a quantum circuit to deliver a relatively high-fidelity result.

Extrapolation from Eagle

The default in this paper for any unknown details about Osprey will be to extrapolate from Eagle, just with many more qubits.

In many cases this will be fairly accurate.

In most cases this will be fairly reasonable.

In all cases this is all we can do.

The key technical factors for Osprey that could impact quantum algorithms and quantum applications

Each of these key technical factors will be explored in subsequent sections:

  1. Qubit count.
  2. Qubit fidelity. Including qubit measurement fidelity. How close to near-perfect qubits.
  3. Qubit connectivity.
  4. Granularity of phase and probability amplitude.
  5. Coherence time and circuit depth.
  6. Quantum error correction (QEC).
  7. Quantum Volume (QV).

For each of these key technical factors this paper will discuss:

  1. What has IBM committed for Osprey?
  2. What is likely in Osprey?
  3. What does Osprey need to inspire confidence?

There are two other technical factors that will be mentioned but only in passing:

  1. Speed or throughput. Measured in circuit layer operations per second (CLOPS).
  2. Support for Qiskit Runtime.

The Big Four of the key technical factors which matter the most

The previous section identified a number of important technical factors which impact quantum algorithms and quantum applications. Here we identify the Big Four of the technical factors which matter the most:

  1. Higher qubit fidelity.
  2. Greater qubit connectivity.
  3. Finer granularity of phase and probability amplitude. Essential for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).
  4. Greater circuit depth. Either greater coherence time for the same gate execution time, or faster gate execution time for the same coherence time, or both greater coherence time and faster gate execution time.

This is not to say that the other technical factors don’t matter or might not matter more to some people, but simply that generally these four key technical factors will be the primary determinants of whether people will be excited or disappointed with Osprey when it is finally announced and available for use.

General support for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) is critical

Quantum Fourier transform (QFT) and quantum phase estimation (QPE) may be the single most powerful and important algorithmic tool available to quantum algorithm designers and quantum application developers. It is critical, for example, to quantum computational chemistry. Without it, dramatic quantum advantage may not even be possible. Sure, small, toy algorithms can get by without it, but any major, large, complex, and sophisticated quantum algorithm is likely to require it.

People will likely get very excited if Osprey does indeed support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).

Similarly, people will likely be rather disappointed if Osprey does not support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).

Nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) does require all of the Big Four key technical factors:

  1. Higher qubit fidelity.
  2. Greater qubit connectivity.
  3. Finer granularity of phase and probability amplitude.
  4. Greater circuit depth.

Fine granularity of phase is needed to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE)

Fine granularity of phase is tricky. Granularity of phase and probability amplitude are never adequately documented or even understood for current quantum computers. This needs to change since fine granularity of phase is essential to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE), which are needed for advanced quantum applications such as quantum computational chemistry.

For more detail on this issue, see my paper:

What qubit count has IBM committed for Osprey?

A qubit count of 433 qubits is solidly committed by IBM for Osprey.

What qubit count is likely in Osprey?

A qubit count of 433 qubits is likely for Osprey.

What qubit count does Osprey need to inspire confidence?

A qubit count of 433 is just fine and actually a lot more than most people will need in the near term.

How many qubits is enough to keep people happy?

People don’t seem ecstatic with the 127 qubits of Eagle, but I don’t think their lack of ecstasy is due to a need for more qubits; rather, it reflects a need for higher qubit fidelity and greater connectivity.

This leaves open the question of exactly how many qubits people really do need to design, develop, and use quantum algorithms to address production-scale practical real-world problems. The possibilities:

  1. 127 or 128 qubits. Personally, I think this is enough qubits. I recall some people suggesting that 105 qubits or 125 qubits would be needed for various quantum computational chemistry applications, for example. But they need qubit fidelity, qubit connectivity, fine granularity of phase, and greater circuit depth before they can actually use that many qubits.
  2. 160 qubits. Should be a sufficient margin for many applications.
  3. 192 qubits. Ditto.
  4. 256 qubits. I’m not even sure which applications need this, but most applications should be covered.
  5. 384 qubits. Ditto.

So, 433 qubits should be fine for Osprey. But only if they have higher fidelity, greater connectivity, finer granularity of phase, and greater circuit depth.

What qubit fidelity has IBM committed for Osprey?

IBM has made no commitment as to what qubit fidelity can be expected in Osprey.

Shocking, but true, believe it or not.

Nines of qubit fidelity

Qubit fidelity (reliability) is an excellent surrogate indicator of the overall capability of a quantum computer. It doesn’t matter how many qubits you have if they are not reliable.

Reliability of qubits can best be expressed as nines of qubit fidelity — 99.9% reliability is three nines of qubit fidelity.
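
As a quick illustration of the convention (this is my own shorthand, not an official IBM metric), fidelity and nines convert back and forth with a simple base-10 logarithm:

```python
# Converting between qubit fidelity and "nines of qubit fidelity".
import math

def nines(fidelity: float) -> float:
    """Nines of fidelity, e.g. 0.999 -> ~3.0, 0.99 -> ~2.0."""
    return -math.log10(1.0 - fidelity)

def fidelity_from_nines(n: float) -> float:
    """Inverse: nines back to a fidelity fraction, e.g. 1.8 -> ~0.984."""
    return 1.0 - 10.0 ** (-n)

print(round(nines(0.999), 2))               # 3.0  -- three nines
print(round(fidelity_from_nines(1.8), 4))   # 0.9842 -- roughly Eagle-class (my estimate)
```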

For more on nines of qubit fidelity, see my paper:

What qubit fidelity is likely in Osprey?

A qubit fidelity roughly comparable to that of Eagle is most likely in Osprey.

Unfortunately, that might only be 1.8 nines at best.

Possibly even worse.

Unlikely to be much better.

But, that’s based purely on extrapolating from Eagle.

What qubit fidelity does Osprey need to inspire confidence?

I don’t think quantum algorithm designers and quantum application developers will get excited by anything less than 3 nines. Preferably 3.5 nines.

Three nines would be the bare minimum that would maintain confidence in Osprey.

3.5 nines would engender at least a modicum of excitement in Osprey.

Four nines of qubit fidelity would definitely inspire confidence in Osprey, but that seems beyond reach at this stage.

Some possibilities:

  1. 1.8 nines. Comparable to Eagle. Not impressive. In fact, very disappointing.
  2. Two nines. Better than Eagle, but not great.
  3. 2.5 nines. Substantially better than Eagle, but still not great.
  4. 2.75 nines. Maybe the minimal acceptable, but still not great.
  5. Three nines. Target for minimum acceptable.
  6. 3.25 nines. Starting to look appealing.
  7. 3.5 nines. Moderately appealing.
  8. 3.75 nines. Moderately impressive.
  9. Four nines. Impressive. Near-perfect qubits. This is what we really should be seeing to believe that we really are on track for supporting production-scale practical real-world quantum applications.

For more on nines of qubit fidelity, see my paper:

Does Osprey need near-perfect qubits to inspire significant excitement? Probably.

As just noted, I do think that near-perfect qubits would inspire significant excitement in Osprey, but the question is whether Osprey can inspire excitement without near-perfect qubits. I’m not sure what the right answer is.

It is very possible that other technical factors could inspire some excitement.

But it just feels less likely that Osprey could inspire any truly dramatic excitement if qubit fidelity is still low, under 3 to 3.5 nines.

So I lean towards answering that near-perfect qubits are probably needed for Osprey to inspire any significant excitement or confidence.

For more on near-perfect qubits, see my paper:

What about qubit measurement fidelity for Osprey?

For the purposes of this paper, I consider qubit measurement fidelity to be bundled into qubit fidelity even though it is generally characterized separately.

All of my comments above about qubit fidelity generally apply to qubit measurement fidelity as well.

Technically, qubit measurement fidelity in Osprey could diverge from overall qubit fidelity in Osprey, but we can’t know that at this time, so I presume that qubit measurement fidelity will be roughly comparable to what it is for Eagle.

Unfortunately, qubit measurement fidelity in Eagle was fairly mediocre, so it will be somewhat of a disappointment if there is no dramatic improvement in qubit measurement fidelity (and overall qubit fidelity) in Osprey.

Be prepared to be surprised, but don’t be surprised if there is no surprise.

Mediocre qubit measurement fidelity would be a critical flaw for Osprey

Osprey is likely to have the same issues with qubit measurement fidelity as did Eagle. If so, this will be a critical weakness for Osprey.

There’s no point in accurately performing a quantum computation if you can’t accurately read the result. To repeat…

  • There’s no point in accurately performing a quantum computation if you can’t accurately read the result.

What qubit connectivity has IBM committed for Osprey?

IBM has publicly committed nothing in terms of improvements in qubit connectivity beyond what is available in Eagle, Hummingbird, and Falcon.

What qubit connectivity is likely in Osprey?

Osprey is likely to offer only the same limited nearest-neighbor qubit connectivity which is currently found in Eagle, Hummingbird, and Falcon.

What qubit connectivity does Osprey need to inspire confidence?

In my opinion, Osprey needs something more than limited nearest-neighbor qubit connectivity to inspire the confidence of quantum algorithm designers and quantum application developers.

Osprey doesn’t need full any-to-any qubit connectivity, but at least something more than limited nearest-neighbor qubit connectivity.

And to be clear, the use of so-called SWAP networks to shuffle the state of qubits around to simulate qubit connectivity is not an acceptable alternative to full qubit connectivity. Qubit fidelity is far too low to rely on SWAP networks to move quantum state around.

Actually, my own opinion is that Osprey really does need full any-to-any qubit connectivity, even though I concede that it likely won’t get it.

Will Osprey need some sort of quantum state bus to satisfy demand for greater qubit connectivity?

Yes, I personally do think that Osprey — or some other future IBM quantum computer — really does need some new hardware architecture to enable significantly greater qubit connectivity.

But, as much as I think that it is needed, I don’t expect it to happen for Osprey or in the relatively near future.

In my own thinking I refer to two terms:

  1. Quantum state bus.
  2. Dynamically-routable resonator.

In a traditional superconducting transmon qubit quantum computer there is a resonator connecting each pair of qubits which can be operated on by a two-qubit quantum logic gate.

It would be impractical to provide such a resonator for every pair of qubits when the number of qubits is large:

  1. 27 qubits would require 27 * 26 / 2 = 351 resonators.
  2. 65 qubits would require 65 * 64 / 2 = 2,080 resonators.
  3. 127 qubits would require 127 * 126 / 2 = 8,001 resonators.
  4. 433 qubits would require 433 * 432 / 2 = 93,528 resonators.
  5. 1,121 qubits would require 1,121 * 1,120 / 2 = 627,760 resonators.
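
Those numbers are just the standard pairwise count, n * (n - 1) / 2. A trivial sketch that reproduces them:

```python
# Number of dedicated resonators needed for full any-to-any connectivity:
# one per pair of qubits, i.e. n * (n - 1) / 2.
def resonators_for_full_connectivity(num_qubits: int) -> int:
    return num_qubits * (num_qubits - 1) // 2

for n in (27, 65, 127, 433, 1121):
    print(f"{n:5d} qubits -> {resonators_for_full_connectivity(n):,} resonators")
# 27 -> 351, 65 -> 2,080, 127 -> 8,001, 433 -> 93,528, 1,121 -> 627,760
```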

With a quantum state bus or dynamically-routable resonator, each qubit would have an entry ramp and an exit ramp to a single, shared resonator, plus a hardware device to dynamically enable the entry and exit ramps for the pair of qubits to be used in a two-qubit quantum logic gate.

This hardware mechanism would provide full any-to-any qubit connectivity.

This is a purely speculative hardware mechanism on my part.

I don’t have any expectation that Osprey or any other near-term transmon qubit quantum computer would have such a hardware mechanism, but it is definitely needed.

Is some other hardware mechanism needed to increase qubit connectivity for Osprey?

Absent my suggested quantum state bus, which would provide full any-to-any qubit connectivity, there may be simpler hardware support that could be added to Osprey or some other near-term transmon qubit quantum computer to enable at least some additional qubit connectivity.

Something is definitely needed. Anything more than the very limited nearest-neighbor connectivity of Eagle, Hummingbird, and Falcon.

And to be clear, the use of so-called SWAP networks to shuffle the state of qubits around to simulate qubit connectivity is not an acceptable alternative to full qubit connectivity. Qubit fidelity is far too low to rely on SWAP networks to move quantum state around.

That said, IBM has made no such commitment, and I have no expectation that they will provide such a capability in Osprey, but… we’ll have to see, later this year.

No, SWAP networks are not a viable alternative to enhanced qubit connectivity

To be clear, the use of so-called SWAP networks to shuffle the state of qubits around to simulate qubit connectivity is not an acceptable alternative to full qubit connectivity. Qubit fidelity is far too low to rely on SWAP networks to move quantum state around.

SWAP networks are a significant part of the reason that Quantum Volume (QV) is so low for IBM’s quantum computers, even for their most advanced quantum computer, the 127-qubit Eagle, which has a Quantum Volume of only 64, meaning that at most six qubits (log2(64) = 6) can be used in a reasonably high-fidelity quantum computation.
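
As a quick sketch of that arithmetic, the number of usable qubits implied by a Quantum Volume figure is simply the base-2 logarithm of the QV:

```python
# Effective qubits implied by a Quantum Volume (QV) figure: log2(QV).
import math

def effective_qubits(quantum_volume: int) -> int:
    return int(math.log2(quantum_volume))

print(effective_qubits(64))    # 6  -- Eagle's reported QV of 64
print(effective_qubits(128))   # 7  -- Falcon's reported QV of 128
print(effective_qubits(2048))  # 11 -- the trapped-ion QV mentioned later in this paper
```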

What granularity of phase and probability amplitude has IBM committed for Osprey?

IBM has publicly committed nothing in terms of improvements in granularity of phase and probability amplitude beyond what is available in Eagle, Hummingbird, and Falcon.

What granularity of phase and probability amplitude is likely in Osprey?

Osprey is likely to offer only the same granularity of phase and probability amplitude which is currently found in Eagle, Hummingbird, and Falcon.

Granularity could be a little finer than in Eagle.

But granularity could be coarser if engineering tradeoffs were needed in order to accommodate the dramatic boost in qubit count.

What granularity of phase and probability amplitude does Osprey need to inspire confidence?

The main interest in finer granularity of phase and probability amplitude is to support nontrivial quantum Fourier transform (QFT), quantum phase estimation (QPE), and quantum amplitude estimation (QAE). So what would inspire confidence in Osprey would be the ability to support those algorithms for some number of bits of precision beyond a small handful (a rough sketch of the implied phase granularity follows this list), such as:

  1. 6 bits. Too trivial for any practical application.
  2. 8 bits. Bare minimum.
  3. 10 bits. Still rather bare-bones minimum.
  4. 12 bits. Still rather bare-bones minimum.
  5. 14 bits. Starting to get nontrivial.
  6. 16 bits. Lower bound of nontrivial.
  7. 20 bits. Nontrivial. Minimum to achieve minimal quantum advantage.
  8. 24 bits. Nontrivial. Starting to get interesting.
  9. 28 bits. Nontrivial. More interesting.
  10. 32 bits. Nontrivial. Possibly even useful. Possibly even enough to achieve substantial quantum advantage.
  11. 48 bits. Definitely nontrivial, and likely needed to achieve dramatic or at least substantial quantum advantage, but well beyond any near-term expectations.
  12. 50 bits. Enough to achieve dramatic quantum advantage, but well beyond near-term expectations.
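
To make the granularity requirement a bit more concrete (this is my own rough framing, not anything IBM has published), an n-bit quantum Fourier transform requires controlled phase rotations as fine as 2 * pi / 2^n radians, so the required gradations of phase grow exponentially with the precision:

```python
# Rough sketch: an n-bit QFT uses controlled phase rotations as fine as
# 2*pi / 2**n radians, i.e. 2**n distinct gradations of phase.
import math

def finest_phase_rotation(bits: int) -> float:
    """Smallest phase angle (in radians) used by an n-bit QFT."""
    return 2.0 * math.pi / (2 ** bits)

for bits in (8, 16, 20, 32, 50):
    print(f"{bits:2d} bits -> 2^{bits} = {2**bits:,} gradations, "
          f"finest rotation ~ {finest_phase_rotation(bits):.3e} rad")
```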

I’ll be impressed if Osprey can support even a 10- or 12-bit quantum Fourier transform.

I’ll be very impressed if Osprey can achieve a 16-bit quantum Fourier transform. But others may still be disappointed.

What coherence time and circuit depth has IBM committed for Osprey?

IBM has publicly committed nothing in terms of improvements in coherence time and circuit depth beyond what is available in Eagle, Hummingbird, and Falcon.

What coherence time and circuit depth is likely in Osprey?

Osprey is likely to offer only the same coherence time and circuit depth which is currently found in Eagle, Hummingbird, and Falcon.

Coherence time and circuit depth could be a little greater than in Eagle.

But coherence time and circuit depth could be less than in Eagle if engineering tradeoffs were needed in order to accommodate the dramatic boost in qubit count.

What coherence time and circuit depth does Osprey need to inspire confidence?

Coherence time and circuit depth are essential for larger, more complex, and more sophisticated quantum algorithms, but they are moot if qubit fidelity, qubit connectivity, and granularity of phase and probability amplitude are not dramatically improved in Osprey relative to Eagle.

So, even a dramatic improvement in coherence time and circuit depth won’t inspire confidence in Osprey unless dramatic improvements are simultaneously made in qubit fidelity, qubit connectivity, and granularity of phase and probability amplitude.

All of that said, I’d be impressed if IBM could double coherence time and circuit depth in Osprey relative to Eagle.
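
As a rough illustration of why doubling either factor matters, usable circuit depth scales roughly as coherence time divided by gate execution time. The numbers in this sketch are my own illustrative ballpark assumptions, not published Eagle or Osprey specifications:

```python
# Rough rule of thumb: usable circuit depth ~ coherence time / gate time.
# The inputs below are illustrative ballpark assumptions, not IBM specs.
def rough_circuit_depth(coherence_time_us: float, gate_time_ns: float) -> int:
    return int((coherence_time_us * 1000.0) / gate_time_ns)

baseline = rough_circuit_depth(coherence_time_us=100.0, gate_time_ns=400.0)
doubled_coherence = rough_circuit_depth(coherence_time_us=200.0, gate_time_ns=400.0)
halved_gate_time = rough_circuit_depth(coherence_time_us=100.0, gate_time_ns=200.0)

print(baseline)            # ~250 gate layers
print(doubled_coherence)   # ~500 -- double the coherence time
print(halved_gate_time)    # ~500 -- or halve the gate time
```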

What has IBM committed to for quantum error correction (QEC) in Osprey?

IBM has made an explicit commitment to pursuing quantum error correction (QEC), promising that each new machine will come one step closer to supporting quantum error correction:

That statement with more context:

  • … we’ve struck a delicate balance of connectivity and reduction of crosstalk error with our fixed-frequency approach to two-qubit gates and hexagonal qubit arrangement introduced by Falcon. This qubit layout will allow us to implement the “heavy-hexagonal” error-correcting code that our team debuted last year, so as we scale up the number of physical qubits, we will also be able to explore how they’ll work together as error-corrected logical qubits — every processor we design has fault tolerance considerations taken into account.

Unfortunately, I haven’t been able to find any mention of any advances in quantum error correction in the 127-qubit Eagle quantum computer.

And IBM has made no explicit mention of specific advances in quantum error correction in the 433-qubit Osprey quantum computer.

In fact, even the 1,121-qubit Condor quantum computer lacks any explicit mention of quantum error correction.

The only mention of quantum error correction in the roadmap diagram is for the “and beyond — Path to 1 million qubits and beyond” category after 2023 and the 1,121-qubit Condor quantum computer, which has the key advancement caption:

  • Build new infrastructure,
    Quantum error correction

In short, IBM has not committed to any additional support for quantum error correction in Osprey.

That doesn’t mean that they won’t deliver any additional support for quantum error correction, just that there has been no commitment.

What do I hope that IBM will deliver for quantum error correction (QEC) in Osprey?

This is only my own personal speculative hope for some sort of advance for quantum error correction (QEC) in Osprey.

IBM hasn’t made any explicit and specific promises or commitments about the 433-qubit Osprey which is due out later this year (2022), but given the large number of qubits (which no existing algorithms would be ready to utilize), I would hope that IBM would at least attempt to demonstrate a handful of perfect logical qubits.

In my personal view, IBM will need to demonstrate five to eight logical qubits with quantum error correction and six nines of qubit fidelity for logical qubits later this year to maintain any sense of technical credibility.

Based on reading some IBM papers, the possibilities would be:

  1. 6 logical qubits. If 65 physical qubits are needed for each logical qubit.
  2. 7 logical qubits. If 57 physical qubits are needed for each logical qubit.

IBM published a paper in 2019/2020 which contains some formulas for calculating physical qubits per logical qubit for a couple of approaches to quantum error correction. This is where the 57 and 65 numbers came from.

The 2019/2020 paper:

The IBM researchers evaluated two approaches:

  1. Heavy hexagon code. 57 physical qubits per logical qubit.
  2. Heavy square code. 65 physical qubits per logical qubit.
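
The 6 and 7 logical qubit possibilities above are just floor division of Osprey’s 433 physical qubits by those per-logical-qubit costs:

```python
# Floor division of Osprey's 433 physical qubits by the physical-qubits-per-
# logical-qubit costs quoted above from the IBM paper.
OSPREY_PHYSICAL_QUBITS = 433

codes = {
    "heavy hexagon code": 57,  # physical qubits per logical qubit
    "heavy square code": 65,
}

for name, cost in codes.items():
    logical = OSPREY_PHYSICAL_QUBITS // cost
    print(f"{name}: {logical} logical qubits ({cost} physical qubits each)")
# heavy hexagon code: 7 logical qubits
# heavy square code: 6 logical qubits
```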

Failure to demonstrate even a small handful of perfect logical qubits on Osprey will send shock waves through the field. It may not be enough to trigger a full Quantum Winter by itself, but it would be enough to put everyone on edge so that even some minor additional setbacks, failed advances, or delayed advances could be the straw that breaks the camel’s back and kicks off a deep Quantum Winter.

For more on quantum error correction, logical qubits, and fault-tolerant quantum computing, see my paper:

Do I expect a demonstration of quantum error correction in Osprey? No, but…

Even though I don’t believe that there is a high likelihood of a demonstration of quantum error correction in Osprey, there may still be some level of gossip and rumor which might lead others to expect such a demonstration.

Maybe a demonstration of quantum error correction in a subsequent revision to Osprey? Possibly, but…

Even though the initial release of Osprey may not have any support for or a demonstration of quantum error correction, it’s very possible that a subsequent revision to Osprey might offer such a demonstration.

I can’t discount that possibility, but I can’t confirm it or suggest that it is likely either.

And IBM has made no such commitment or even dropped the slightest hint.

Is the credibility of quantum error correction at stake with Osprey?

IBM and others have been talking up a storm about the potential for quantum error correction (QEC) for quite a few years now with nothing to show for it. They’ve been able to get away with this simply because there were no quantum computers with enough physical qubits to implement even two logical qubits. But now with Osprey they will have enough physical qubits to implement a small handful of logical qubits, not enough to do anything useful, but enough to at least demonstrate several functional and interacting logical qubits, maybe six or seven logical qubits.

If even Osprey isn’t good enough to demonstrate a few logical qubits, then what will be enough?

If Osprey is unable to demonstrate even a few logical qubits, IBM will have a lot of explaining to do — or suffer the consequences of a loss of technical credibility.

That said, IBM didn’t commit, promise, suggest, or even hint that Osprey might support any logical qubits, and didn’t even do that for the 1,121-qubit Condor due out in 2023, so a reasonable case can be made that IBM doesn’t have their reputation at risk per se.

Still, I suspect that a lot of people are getting tired of hearing about quantum error correction as always being just beyond the horizon. Seeing actual hardware with more than enough physical qubits to support quantum error correction, but without even a hint of trying to support it, might be the straw that breaks the camel’s back of confidence in IBM.

Or maybe it doesn’t completely break the camel’s back, but simply takes enough of the wind out of the sails of enthusiasm for quantum computing — and for IBM — that momentum is severely damaged even if raw technical credibility remains mostly intact.

Failure to demonstrate at least some support for quantum error correction in Osprey won’t likely be the primary trigger for commencing a slide down into a Quantum Winter, but it could sure grease the skids.

The only way out that I can see for IBM if they aren’t going to demonstrate quantum error correction in Osprey is to at least announce a fairly detailed roadmap for how they will progress towards full support for quantum error correction in some future quantum computers.

IBM technical credibility will suffer if they neither show progress with quantum error correction nor progress towards near-perfect qubits

IBM’s technical credibility may also hinge on any progress with qubit fidelity — if they are getting close enough to near-perfect qubits (three to four nines of reliability) so that most people won’t even need quantum error correction, then IBM can get a free pass on lack of progress towards quantum error correction.

On the flip side, if IBM is not able to demonstrate a significant improvement in qubit fidelity in Osprey, then the pressure will be on to show real progress with quantum error correction.

Either alternative can work. Doing neither will cost IBM dearly in terms of technical credibility.

What Quantum Volume has IBM committed for Osprey?

IBM has not committed to any particular Quantum Volume (QV) for Osprey.

IBM has not given any guidance whatsoever.

What Quantum Volume is likely in Osprey?

A Quantum Volume (QV) comparable to Eagle at 64 is most likely.

A QV a little better, at 128, is very possible.

A QV of even 256 might be possible.

It all depends on whether qubit fidelity is much better than Eagle.

And with mediocre qubit connectivity it may be very difficult to get beyond a QV of 256.

Expecting a QV of even 512 may be unrealistic given relatively low expectations for qubit fidelity and qubit connectivity.

What Quantum Volume does Osprey need to inspire confidence?

Anything less than a Quantum Volume (QV) of 256 would be seen as a severe disappointment.

Even a QV of 512 or 1024 would be seen as rather disappointing compared to Honeywell achieving 2048.

A QV of 2048 would at least make Osprey seem competitive with trapped-ion qubits.

A QV of 4096 or higher would more clearly inspire confidence, but I don’t see that in the cards due to relatively low expectations for qubit fidelity and qubit connectivity.

Quantum Volume cannot use more than 50 qubits (and maybe only 40 or even 32 qubits)

Note that Quantum Volume tops out at roughly 2⁵⁰, or more likely 2⁴⁰ or even only 2³², since measurement of Quantum Volume requires simulation of the quantum circuit being tested and simulators can’t handle more than 50 qubits, and even 40 or 32 qubits can be problematic.

A quantum computer can have more than 50 or 40 or 32 qubits, but only 50 or 40 or 32 can be used at a time for measuring Quantum Volume.
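
The reason for that ceiling, in back-of-the-envelope terms, is that measuring Quantum Volume requires classically simulating the test circuits, and a full state vector for n qubits needs 2^n complex amplitudes, which blows past practical classical memory somewhere in the range of 40 to 50 qubits:

```python
# Back-of-the-envelope memory needed to hold a full state vector of n qubits:
# 2**n complex amplitudes at 16 bytes each (double-precision complex).
def statevector_bytes(num_qubits: int) -> int:
    return (2 ** num_qubits) * 16

for n in (32, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB of memory")
# 32 qubits -> 64 GiB
# 40 qubits -> 16,384 GiB (16 TiB)
# 50 qubits -> 16,777,216 GiB (16 PiB)
```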

For more details on this limit, see my paper:

Topology and connectivity of Osprey’s 433 qubits

Although IBM hasn’t committed or explicitly stated what the qubit topology and connectivity will be for Osprey’s 433 qubits, it’s a fairly safe bet that it will be an extrapolation of Eagle’s 127 qubits.

Essentially, the topology will look like a brick wall with layers or rows of staggered bricks, but with more rows and more bricks in each row, similar to the layout of Eagle.

For details of the actual topology of qubits in Eagle, see my paper:

Again, this is all mere speculation about Osprey due to lack of detail from IBM.

Will Osprey be modular?

Eagle isn’t modular. All of its qubits are on a single chip, one module.

Rigetti’s new 80-qubit processor is modular, based on their 40-qubit quantum computer module.

Might Osprey be modular? Sure, it’s possible, but IBM has given no hint that it might be, so my presumption is that it won’t be modular.

IBM Quantum System Two

An IBM quantum computer can be thought of as two parts:

  1. The quantum processor chip. The heart and brain of the quantum computer.
  2. The overall quantum system. The mechanical, packaging, cryogenics, and classical electronics needed to support the operation of the quantum processor chip.

Technically, none of the details of the overall quantum system should have any effect on what a quantum algorithm designer or a quantum application developer can design and develop, except as overall system improvements might have some impact on technical feasibility of technical features of the quantum processor chip.

The quantum processor chip typically has a name, such as:

  1. Falcon.
  2. Hummingbird.
  3. Eagle.
  4. Osprey.

The overall quantum system is common across quantum processor chips, although some chips may require more advanced quantum systems.

At present, IBM has a single overall quantum system:

  1. IBM Quantum System One. Supports Falcon, Hummingbird, and Eagle chips.

Beginning with Osprey, IBM will be introducing a new overall quantum system:

  1. IBM Quantum System Two.

Actually, IBM Quantum System Two was announced at the same time as Eagle, back in November 2021:

The technical details of IBM Quantum System Two are currently only sketchy — as per the IBM press release:

  1. Modular architecture. Control hardware has the flexibility and resources necessary to scale.
  2. A new generation of scalable qubit control electronics.
  3. Higher-density cryogenic components and cabling.
  4. A new cryogenic platform. Based on a new cryogenic dilution refrigerator. Designed in conjunction with Bluefors, featuring a novel, innovative structural design to maximize space for the support hardware required by larger processors while ensuring that engineers can easily access and service the hardware.
  5. The possibility to provide a larger shared cryogenic work-space. Ultimately leading to the potential linking of multiple quantum processors.

But none of the details of the IBM Quantum System Two should have any effect on what a quantum algorithm designer or a quantum application developer can design and develop, except as overall system improvements might have some impact on technical feasibility of technical features of the quantum processor chip.

Will the new dilution refrigerator offer any functional advantage to quantum applications?

As noted above, the new IBM Quantum System Two which will debut with Osprey will include a new cryogenic dilution refrigerator. The question is whether this new system component is simply cheaper and more efficient but otherwise functionally identical to the old dilution refrigerator. Will the overall system simply operate more efficiently, or will quantum application developers see any functional benefit to applications?

Some questions I have:

  1. Will the new refrigerator have better shielding or otherwise reduce environmental interference so that there is a net improvement in qubit fidelity?
  2. Will the temperature be more stable and result in more consistent results?
  3. Will the refrigerator be substantially cheaper?
  4. Will the refrigerator be substantially cheaper to operate? Less electrical power? Less loss of refrigerant?
  5. Might the new refrigerator have engineering tradeoffs which have a negative impact on qubit fidelity, such as less shielding to reduce cost, or cheaper components to increase qubit count but with lower qubit fidelity?
  6. Will quantum algorithm designers or quantum application developers have to adjust their algorithms to work effectively with the new refrigerator?
  7. Will quantum algorithm designers or quantum application developers be able to take advantage of or otherwise exploit the new refrigerator?
  8. In short, will the new refrigerator be a net gain for quantum algorithms and quantum applications, or a net loss, or a wash, or an uneven mix of gains and losses?

My hope and expectation would be that the new refrigerator would be completely transparent to the work of quantum algorithm designers and quantum application developers. Or, there would be some net improvements. But certainly no net losses.

What speed or throughput will Osprey have?

IBM has neither committed nor hinted what speed or throughput (CLOPS) Osprey will have.

All we can do is extrapolate from Eagle, whose circuit layer operations per second (CLOPS) rating of 850 seemed low compared to roughly 1.5K for the 65-qubit Hummingbird and 2K (1.8K to 2.4K) for the 27-qubit Falcon.

So, let’s assume that Osprey will come in at 850 CLOPS. But… don’t hold me to it.
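
To put a CLOPS rating in perspective, here is a rough back-of-the-envelope sketch of what it implies for wall-clock time, based on my understanding of IBM's published benchmark definition (CLOPS = M x K x S x D / time, with M = 100 circuit templates, K = 10 parameter updates per template, S = 100 shots, and D = log2(QV) layers). The numbers are purely illustrative, not measured results.

```python
# Rough CLOPS arithmetic (a sketch, not IBM's official benchmark tooling).
# Assumes CLOPS = M * K * S * D / time, with D = log2(QV) layers.
from math import log2

def estimated_seconds(clops: float, quantum_volume: int,
                      templates: int = 100, updates: int = 10, shots: int = 100) -> float:
    """Estimate wall-clock time for a CLOPS-style workload at a given CLOPS rating."""
    layers = log2(quantum_volume)                      # D
    return templates * updates * shots * layers / clops

# Hypothetical comparison, assuming QV 32 model circuits for both processors.
for name, clops in [("Falcon-class (~2K CLOPS)", 2000), ("Eagle-class (850 CLOPS)", 850)]:
    print(f"{name}: ~{estimated_seconds(clops, quantum_volume=32):.0f} seconds")
```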

Will Osprey support Qiskit Runtime?

Oddly, Eagle didn’t initially support Qiskit Runtime and still doesn’t, even though both Hummingbird and Falcon support it.

IBM has given no explanation for this lack of support for Qiskit Runtime by Eagle.

All we can do is presume that Qiskit Runtime support in Osprey will be the same as it is in Eagle, which is to say not supported. But… don’t hold me to it.
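
For what it's worth, if Osprey (or a successor) does end up supporting Qiskit Runtime, here is a minimal sketch of what that looks like for a developer, assuming the qiskit-ibm-runtime package, a saved IBM Quantum account, and its Sampler primitive. The backend name is purely a placeholder (the cloud simulator), and the exact primitive API has changed across qiskit-ibm-runtime versions.

```python
# Minimal Qiskit Runtime sketch: run a Bell-state circuit via the Sampler primitive.
# The backend name is a placeholder -- nothing here implies Osprey support.
from qiskit import QuantumCircuit
from qiskit_ibm_runtime import QiskitRuntimeService, Sampler, Session

qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)
qc.measure_all()

service = QiskitRuntimeService()                       # uses a saved account
backend = service.backend("ibmq_qasm_simulator")       # placeholder backend name

with Session(service=service, backend=backend) as session:
    sampler = Sampler(session=session)
    result = sampler.run(qc).result()
    print(result.quasi_dists[0])                       # quasi-probability distribution
```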

Is Osprey simply a bigger Eagle (more qubits)?

To be honest, we just don’t know yet whether Osprey is simply a bigger version of Eagle, with the only main difference being more qubits. We do know that components for Osprey are intended to be smaller (miniaturized), and that there will be more qubits, but what we don’t know is whether those qubits will be functionally any different from the qubits of Eagle in terms of the key technical factors that could impact quantum algorithms and quantum applications, as mentioned earlier:

  1. Qubit count. Guaranteed to be different.
  2. Qubit fidelity.
  3. Qubit connectivity.
  4. Granularity of phase and probability amplitude.
  5. Coherence time and circuit depth.
  6. Quantum error correction (QEC).
  7. Quantum Volume (QV).

In short, one of two propositions will be true once Osprey is unveiled later this year:

  1. It’s just a larger version of Eagle — more qubits.
  2. It’s functionally somewhat different from Eagle — besides count of qubits.

Personally, I hope it is different, with significant improvements in all of those key technical factors.

Concerns from Eagle which may or may not apply to Osprey as well

After carefully reviewing the limited detail available from the announcement of Eagle, I developed this list of concerns, posted in my paper on Eagle in December 2021. Some, many, most, or maybe even all of them may apply to Osprey as well. For the sake of argument, presume that one out of every three or four of the concerns will likely still apply to Osprey.

To be clear, this is the exact, literal list from my Eagle paper. In many cases references to Falcon can be read as references to Eagle and references to Eagle can be read as references to Osprey.

Any references to actual performance of Eagle should be read as speculating and extrapolating on comparable results for Osprey.

Once Osprey is finally announced and becomes available for testing, a fresh list can be developed that is specific to Osprey.

  1. No significant benefits to most typical near-term quantum algorithm designers or quantum application developers. All of the engineering is under the hood where most typical users won’t see it. Low qubit fidelity — no significant improvement from previous processors — precludes using more than 20 or so qubits in a single circuit — which can already be done with a 27-qubit Falcon, so the dramatic increase in qubit count isn’t generally functionally useful for most typical users, at present.
  2. No hint of any significant change to the basic core qubit technology. Despite the dramatic overall engineering redesign, there is no hint that the core qubit technology has changed. Presumably IBM would have touted that if it had been improved.
  3. No significant increase in qubit fidelity. Some 27-qubit Falcon processors are better.
  4. No hint of improvement in fine granularity of phase and probability amplitude. Needed for quantum Fourier transform (QFT) and quantum phase estimation (QPE), as well as for more complex algorithms utilizing quantum amplitude estimation (QAE). Needed for quantum computational chemistry, so no significant advance on this front.
  5. No hint of any significant improvement in measurement fidelity. Sorely needed.
  6. No improvement in qubit connectivity. Same topology. Low qubit fidelity limits use of SWAP networks to simulate connectivity.
  7. No significant increase in qubit coherence time. Many 27-qubit Falcon processors are better, some by a lot.
  8. No significant improvement in gate execution time. The minimum does seem to show significant improvement, but the average is not quite as good as ibm_hanoi (27-qubit Falcon), although somewhat better than ibmq_brooklyn (65-qubit Hummingbird.)
  9. No significant increase in circuit depth. Follows qubit coherence time and gate execution time.
  10. No improvement in Quantum Volume (QV). Measured at only 32 as of December 8, 2021. Very disappointing. Worse than Falcon (64 and 128). Matches 65-qubit Hummingbird. I had hoped for 256.
  11. No significant progress in two of the three metrics for progress given by IBM. Scale increased, but no significant increase in quality (QV) or speed (CLOPS).
  12. No support for Qiskit Runtime. At least not initially, but I presume that will come, eventually.
  13. Unlikely to attain any substantial degree of quantum advantage. Due to limited qubit fidelity and limited connectivity.
  14. No documented attempt to implement quantum error correction (QEC) or logical qubits.
  15. Clearly Eagle and IBM are still deep in the pre-commercialization stage of quantum computing, not yet ready to even begin commercialization. Many questions and issues and much research remains. Not even close to commercialization.
  16. No roadmap for enhancements to Eagle. Other than Osprey and Condor being successors. But I want to know about r2, r3, r4, and r5.

To read my original concerns about Eagle in context, consult my paper on Eagle:

Competition from other qubit technologies

Trapped-ion qubits and neutral-atom qubits seem poised to give superconducting transmon qubits such as those from IBM a run for their money.

It will be interesting to see how far these alternative qubit technologies will have advanced over the coming months in contrast to where Osprey will be when it is announced and made available later this year.

There are more exotic qubit technologies under development, but not likely to have any impact this year:

  1. Topological qubits. Microsoft.
  2. Silicon spin qubits. Intel.

Ultimately, comparison of qubit technologies will come down to comparing the big four key technical factors:

  1. Higher qubit fidelity.
  2. Greater qubit connectivity.
  3. Finer granularity of phase and probability amplitude.
  4. Greater circuit depth.

(Assuming the technology supports a sufficient number of qubits as needed by common applications.)

As well as the degree to which nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) are supported.
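
To make "granularity of phase" concrete: the smallest controlled-phase rotation in an n-qubit quantum Fourier transform is 2π/2^n, so the required angles shrink exponentially with the number of qubits. Here is a minimal sketch using Qiskit's textbook QFT circuit (qiskit.circuit.library.QFT); the qubit counts are illustrative.

```python
# The smallest controlled-phase rotation in an n-qubit QFT is 2*pi / 2**n,
# which is why fine phase granularity is needed for nontrivial QFT/QPE.
from math import pi
from qiskit.circuit.library import QFT

for n in (4, 8, 16, 20):
    print(f"{n}-qubit QFT: smallest rotation angle ~ {2 * pi / 2**n:.2e} radians")

qft = QFT(num_qubits=8, do_swaps=False)                # textbook QFT circuit
print(qft.decompose().count_ops())                     # mostly H and controlled-phase (cp) gates
```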

IBM quantum is still in research mode, not mainline commercial product engineering, yet

Something to keep in mind is that IBM quantum is still a research project at IBM, not a mainstream commercial business unit. The focus remains research, not the development, distribution, deployment, and support of revenue-producing commercial products. The product of their current efforts is research, not revenue.

As such, we need to refrain from judging IBM’s quantum efforts as if they were commercial products.

If Eagle and Osprey are necessary stepping stones for research in quantum computing, that’s fine. We shouldn’t judge them as if they were true commercial products. Except… to the degree that IBM and others may talk as if they were commercial products. But we should continually remind everyone that they are not commercial products.

When might IBM transition their quantum efforts from research to commercial product development?

I don’t think anybody does or can know how many more years of research might be required before IBM (or anybody else) finally stumbles on the right formula for a practical quantum computer capable of supporting production-scale practical real-world quantum applications.

Some possibilities:

  1. Two years. Very unlikely.
  2. Three years. Still very unlikely.
  3. Four years. Unlikely. But possible.
  4. Five years. Possibly. Fair bet.
  5. Seven years. More likely.
  6. Ten years. Probably.

Unfortunately, parts of IBM are acting as if quantum was a commercial product and engaging in premature commercialization

IBM’s big push for Quantum Ready is clearly an effort at premature commercialization. IBM and everyone else should let the research get much more settled before attempting to treat the technology as if it were a commercial product.

The main focus right now needs to be on what I call pre-commercialization, focused on research, prototyping, and experimentation, quite a long way from commercialization.

For more on pre-commercialization, see my paper:

And for more on the risk of premature commercialization and the need to focus on research, see my paper:

Research orientation of Osprey reemphasizes that quantum computing is still in the pre-commercialization stage, still well short of being ready for commercialization

Overall, Osprey just feels like a research experiment, which is what it is. And that’s actually okay since that’s what we should expect in the pre-commercialization stage of quantum computing, long before the technology is ready for true commercialization.

IBM does some research, and then the community prototypes and experiments with applications using the results of that research, providing IBM with feedback to improve the next cycle of research. Rinse and repeat. This is a perfectly sane and responsible and productive process — provided that nobody gets the misguided idea that these research results are viable commercial products.

For more on pre-commercialization, see my paper:

Still a mere laboratory curiosity

As with the rest of quantum computing, Osprey will likely be still at the stage of being a mere laboratory curiosity, not even close to being ready for development and deployment of production-scale practical real-world quantum applications.

Much research is still required. Many technical issues remain to be resolved.

Granted, as a laboratory curiosity it is indeed quite appropriate to prototype systems and to experiment with quantum algorithms and quantum applications.

But prototyping and experimentation should not be confused with product engineering and development and deployment of production-scale practical real-world quantum applications.

Being a mere laboratory curiosity is fine for where we are today, focused on prototyping and experimentation, but we run the risk of slipping into a Quantum Winter if we’re still at this stage of being a mere laboratory curiosity two to three years from now.

For more discussion of quantum computing being a mere laboratory curiosity, see my paper:

Still more appropriate for the lunatic fringe rather than mainstream application developers

The lunatic fringe are those super-elite technical staff who are capable and interested in working with a new technology regardless of whether the technology is ready for commercial deployment. As with the rest of quantum computing, Osprey will still be at the stage where its primary appeal is to the lunatic fringe rather than to mainstream application developers.

This is okay for where we are today, but two to three years from now it will be necessary to cater to mainstream application developers rather than the lunatic fringe.

Quantum computing runs the rising risk of falling into a Quantum Winter if it still only appeals to the lunatic fringe two to three years from now.

For more on the lunatic fringe, see my paper:

Risk for Quantum Winter

Disappointment over Osprey alone is unlikely to trigger the onset of a Quantum Winter by itself, but it could end up being a contributing factor and help to set the stage.

Disappointment over Osprey could help to set the stage for a Quantum Winter, and then maybe disappointment over Condor a year later could then become the straw that breaks the camel’s back which helps to trigger the Quantum Winter.

Of course IBM is not the only game in town — other vendors might leapfrog ahead of IBM, so that even if IBM were to have its own Quantum Winter, the overall quantum computing field could still continue in a thriving and vibrant Quantum Summer. Could. How things turn out remains to be seen.

The main technical factors driving whether people become excited or disappointed in Osprey are:

  1. Qubit fidelity. Including qubit measurement fidelity. How close to near-perfect qubits.
  2. Qubit connectivity. Something better than nearest neighbor.
  3. Granularity of phase and probability amplitude. Support for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).
  4. Coherence time and circuit depth.
  5. Quantum error correction (QEC). Some significant sense of progress.
  6. Quantum Volume (QV). Hopefully much better than Eagle, and Falcon as well.

For more on Quantum Winter, see my paper:

What can we expect from future revisions of Osprey?

At this stage, given how speculative expectations for the initial revision of Osprey are, I hesitate to guess what an Osprey 1.1, 1.2, or 2.1 or 2.2 might look like, other than gradual incremental improvements.

And of course we would like to see incremental improvements in all of the Big Four key technical factors:

  1. Higher qubit fidelity.
  2. Greater qubit connectivity.
  3. Finer granularity of phase and probability amplitude.
  4. Greater circuit depth.

Such improvements would drive improvements in Quantum Volume (QV) and support for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).

I’d much rather see an upgraded 27-qubit Falcon than a 433-qubit Osprey

A 27-qubit Falcon with enhanced connectivity and another 1.5 nines of qubit fidelity would be much more beneficial for quantum algorithm designers and quantum application developers than whatever Osprey might deliver.
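
For concreteness, using the common convention that "nines of fidelity" is roughly the negative base-10 logarithm of the error rate (different people round these slightly differently), adding 1.5 nines means cutting the error rate by a factor of about 32. The starting error rate below is purely illustrative.

```python
# "Nines of fidelity" via the common convention nines = -log10(error rate).
# Adding 1.5 nines divides the error rate by 10**1.5 (about 32x).
from math import log10

def nines(error_rate: float) -> float:
    return -log10(error_rate)

current_error = 1e-2                       # illustrative two-qubit gate error (2.0 nines)
improved_error = current_error / 10**1.5   # 1.5 more nines
print(f"current:  error {current_error:.1e} -> {nines(current_error):.1f} nines")
print(f"improved: error {improved_error:.1e} -> {nines(improved_error):.1f} nines")
```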

The extra qubits of 65-qubit Hummingbird and 127-qubit Eagle were distractions rather than substantial advances for quantum algorithm designers and quantum application developers. The extra qubits were essentially useless since algorithm complexity was severely limited by low qubit fidelity and weak qubit connectivity.

Full any-to-any qubit connectivity would be a huge win for algorithm complexity. That may be too much to ask, but any significant improvement in qubit connectivity would still be a big win.
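
To illustrate why, here is a small sketch that routes a single two-qubit gate between far-apart qubits onto a nearest-neighbor line topology (a simplification of IBM's heavy-hex lattice, used here just for illustration). The transpiler has to insert SWAPs, each costing three CX gates, which is exactly why low qubit fidelity makes SWAP networks so painful. With any-to-any connectivity the routed circuit would need just the one CX.

```python
# Cost of limited connectivity: one CX between distant qubits on a line topology
# gets routed through a chain of SWAPs (3 CX each), inflating the gate count.
from qiskit import QuantumCircuit, transpile
from qiskit.transpiler import CouplingMap

n = 12
qc = QuantumCircuit(n)
qc.cx(0, n - 1)                            # a single two-qubit gate between far-apart qubits

line = CouplingMap.from_line(n)            # nearest-neighbor line, a stand-in topology
routed = transpile(qc, coupling_map=line,
                   basis_gates=["cx", "rz", "sx", "x"],
                   optimization_level=1, seed_transpiler=42)
print(routed.count_ops())                  # far more than the original single CX
```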

Even if 1.5 nines of additional qubit fidelity is beyond reach, almost any increase in qubit fidelity would be a huge win.

But simply offering more of the same old noisy and weakly-connected qubits is a great loss, not a win by any measure.

I’d like to see IBM perfect 27-qubit Falcon — in a scalable manner — before pursuing a lot more mediocre qubits.

And then scaling an upgraded Falcon to 32, 36, 40, 44, and 48 qubits with that enhanced connectivity and higher qubit fidelity would similarly be a huge win relative to the distraction of simply adding more noisy and weakly-connected qubits.

Once IBM gets to a 48-qubit quantum computer with high qubit fidelity and full or near-full qubit connectivity, as well as finer granularity of phase and probability amplitude, then and only then will much higher qubit counts be warranted or even useful.

UPDATE on October 28, 2022 — Additional information

Not much information has come out since I originally posted this informal paper back in March. As of this date, I only have the following additional items to discuss:

  1. Improved prospect of Quantum Volume (QV) of 1024. Since March, IBM has achieved Quantum Volume (QV) of 256 and 512 for the Falcon processor. In addition, IBM has publicly committed to achieving QV of 1024 by the end of 2022. Since they haven’t achieved 1024 elsewhere by now, this raises the prospect that Osprey might be where they achieve QV of 1024 in 2022. IBM hasn’t made any commitment to Quantum Volume for Osprey, but at least this is a reasonably strong possibility. Achieving a QV even better than 1024 is possible, but it is neither committed nor a slam dunk, nor even highly likely.
  2. Renewed uncertainty over whether the IBM Quantum System Two will be used. IBM has made conflicting statements as to when the IBM Quantum System Two will become available, sometimes saying it won’t be available until 2023 and sometimes saying it will be used by all new systems — and Osprey is certainly a new system, but Osprey will supposedly be available in late 2022, ahead of 2023. So, maybe Osprey will use IBM Quantum System Two, but maybe not.
  3. Some possibility for an on-chip non-local coupler for improved qubit connectivity. IBM posted a paper preprint on arXiv in September which briefly alluded to “on-chip non-local couplers” which would provide the ability to perform two-qubit operations on qubits which are not adjacent to each other, without requiring a SWAP network. Most of the references to this feature referred to error correction, such as “On-chip non-local couplers for Non-planar error-correcting code.” Whether these new couplers can be used generally to connect any two qubits on the chip or are reserved solely for error correction for logical qubits is unclear. The paper didn’t indicate when or on which systems this feature would become available. I’m not terribly hopeful that it will be available on Osprey, but it is at least a faint possibility.
  4. Uncertainty as to when and where Osprey will actually be unveiled. Last year Eagle was unveiled at IBM’s Quantum Summit event in November, but no such event is currently scheduled for November, so there is no indication as to when and where Osprey will actually be unveiled. An article in Politico dated October 18, 2022 says “[Jay] Gambetta … says IBM will unveil Osprey, its latest 433-qubit quantum chip, in a matter of weeks.” Unfortunately, “a matter of weeks” could range from two to ten weeks, so that could be anytime in November, or even in December, or maybe not until January. November is still a better bet, although likely not Thanksgiving week, but we just can’t say for sure.

UPDATE on November 18, 2022 — Formal unveiling at IBM Quantum Summit 2022 on November 9, 2022

Unfortunately, very little technical detail was made available when Osprey was formally unveiled at the IBM Quantum Summit 2022 on November 9, 2022.

Well, technically, Osprey was unveiled — they showed the actual chip, but no actual system was shown or available.

As far as availability, IBM stated “which we’ll be making available to our clients in a few months”, but there was no hint whether a few months was more like January, February, or even March. Call it Q1.

Overall, I am not impressed by this release of Osprey. It offers users no meaningful benefit over the 27-qubit Falcon processor.

The press release:

Blog post with some additional information:

IBM Quantum Summit 2022 keynote by SVP and head of research Dario Gil, mentions a little about Osprey:

Technical session of IBM Quantum Summit 2022 with a little more about Osprey:

Some specific points to make at this stage:

  1. 433 qubits is impressive any way you look at it.
  2. Impressive engineering development under the hood.
  3. But despite the qubit count and engineering development, a typical user won’t see ANY benefit from using Osprey. Might as well stick with the 27-qubit Falcon, and in fact see better performance than Osprey, at least at this stage.
  4. Virtually no technical data.
  5. No hint of Quantum Volume. So, maybe they won’t be achieving QV 1024 this year or at least not on Osprey this year.
  6. No hint of qubit fidelity.
  7. Minor, tentative hint of coherence time. A brief mention of T1 of roughly 70 to 100 microseconds. That’s actually a step backwards from Eagle, but IBM said that they expected to improve on that significantly with the next revision of Osprey. No clarity which revision of Osprey would be the first available to users in the coming months.
  8. No hint on gate execution time.
  9. No hint on maximum circuit size.
  10. Osprey is not currently available.
  11. Osprey is not expected to be available in the very near future. Compared to Eagle, which was available within just a couple of weeks.
  12. Suggestion that Osprey would be available in a few months. IBM stated “which we’ll be making available to our clients in a few months”, but there was no hint whether a few months was more like January, February, or even March. Call it Q1.
  13. A second revision of the Osprey chip is expected relatively soon. The revision should have better performance, but details were too minimal and sketchy.
  14. Overall, IBM has done rather poorly at transparency and technical disclosure with this unveiling of Osprey. They shouldn’t have unveiled Osprey at this time if it wasn’t technically ready to be unveiled. They should have tested and benchmarked the system before an unveiling. They should have released a technical data sheet and/or white paper loaded with the kind of critical technical details that prospective users would need to properly and fully evaluate the system. And if they weren’t ready to do that, they should have admitted it, publicly. It was their choice to unveil Osprey at this time — they chose poorly.
  15. Overall, I am not impressed by this release of Osprey. The second revision may show some improvement, but nothing IBM has said publicly gives me the impression that my overall impression will change significantly in a few months or whenever the second revision is actually made available and full technical detail is finally available for evaluation.
  16. In short, Osprey is yet another dud/flop/misfire for IBM Quantum. Following on the heels of their three preceding duds with the two Hummingbirds and Eagle. There’s no meaningful benefit to most end users of these machines over the 27-qubit Falcon.
  17. The new IBM Quantum System Two dilution refrigerator and packaging won’t be available until late next year. To be shown at IBM Quantum Summit 2023.
  18. Therefore, Osprey is not likely to be based on IBM Quantum System Two.
  19. I counted 13 rows of 34 qubits per row in the Osprey graphic rendering. That’s 442 qubits, which suggests that 9 of those qubits aren’t used or available.
  20. I only counted 12 rows of readouts on the chip graphic. That seems odd for 13 rows of qubits. Maybe that’s simply a human error when manually creating the graphic.
  21. The 100 x 100 challenge in 2024. IBM is committing to be able to execute a circuit 100 gates deep for 100 qubits. That would mean executing a total of 10,000 gates. But that’s likely for the 133-qubit Heron processor, not Osprey, but it’s not possible to tell for sure at this stage.
  22. IBM is hoping to get better than three nines of qubit (gate) fidelity on Heron in 2024. Part of the 100 x 100 challenge.
  23. No hint of an on-chip non-local coupler for direct connectivity between non-adjacent qubits. In fact, that feature wasn’t mentioned at all during the summit even though it was discussed in the September paper by IBM and other new forms of modularity and connectivity were discussed at the summit — but none for Osprey.

In short, we have to wait for the actual availability of Osprey in the coming months before we get some real answers to many of these questions, so a lot of my speculation will remain relevant for a bit longer.

UPDATE on April 9, 2023 — Five full months since unveiling but still no hint of public availability

As of April 9, 2023, it’s been five full months since IBM unveiled their 433-qubit Osprey quantum processor on November 9, 2022, but still no sign or hint of public availability or any transparency and technical disclosure of functional details or performance metrics.

IBM has dropped no fresh hints as to public availability.

And no hints or signs of any technical transparency or technical disclosure for functional details or performance metrics of Osprey.

I have to applaud IBM for such great operational security and secrecy at preventing leaks.

There has been some hint of IBM partners and customers getting early access for testing, but that’s not general public availability or transparency or technical disclosure — publicly.

I had expected that public availability in Q1 would have been a slam dunk, but obviously that didn’t happen.

I have no expectations of when public availability or any transparency and technical disclosure of functional details or performance metrics might occur. It could be within days, weeks, or months.

Overall, I still expect that Osprey will be a flop, with a lot of solid engineering improvements, but nothing that offers quantum algorithm designers or application developers any functional or performance benefit over the 27-qubit Falcon.

It almost doesn’t matter when Osprey becomes publicly available since it is likely to end up being a flop anyway — in my own personal opinion.

Some possible factors or reasons for the big delay of Osprey, not implying that any of them actually was a factor at all, let alone a significant one:

  1. Hardware wasn’t fully functional.
  2. Hardware was functional but not reliable or stable.
  3. Hardware was reliably functional but performance was unacceptable. Qubit quality or circuit execution time.
  4. Hardware is now so complex that much more testing is required, which takes significantly more time, and more of other resources as well.
  5. Qubit quality was unacceptable.
  6. Quantum Volume (QV) metric was embarrassing. Didn’t hit 1024 or maybe even worse than Eagle, which is worse than Falcon. Maybe they didn’t hit 512 or 256, which might have been their minimal goal.
  7. Decided to integrate error suppression logic to compensate for poor qubit fidelity and low Quantum Volume. And it was just harder and took longer than they thought. Or maybe the results were disappointing and they had to go back to the drawing board and try again.
  8. Third-party add-ons for error suppression were not ready.
  9. Coherence time was too short or gate execution time was too slow. Maximum circuit size was too small.
  10. Calibration time was excessive. Google has warned about exponential growth of calibration time as qubit count grows. This would interfere with circuit execution time — circuits per second.
  11. Overall, maybe IBM was simply embarrassed by the functional and performance aspects of Osprey and was deep into damage control to reclaim some sense of being an advance in quantum computing other than raw qubit count.

I noticed that IBM has updated their 2022 quantum roadmap to indicate that their commitment for Osprey in 2022 has been fulfilled, but I don’t buy it. Unveiling is one step, but delivery or public availability in the cloud, coupled with public technical disclosure of functional details and performance metrics, is what should matter for IBM to claim that their roadmap commitment has been fulfilled.

Overall, IBM has done a particularly lousy job of setting expectations for Osprey. That’s setting up potential users, customers, and partners for a major disappointment whenever Osprey actually is finally made publicly available.

Even at this late stage, IBM should set at least some expectations for Osprey.

I may focus an unusual level of attention on IBM, but just as in the mainframe days in the 1960’s, it’s IBM and… “the seven dwarfs” — IBM’s competitors. IBM leads in many ways, so it’s not worth focusing much on those trailing behind IBM.

Stay tuned for the next stage of the great saga of Osprey.

UPDATE on May 8, 2023 — Osprey is now publicly available

Several times a day for almost six months now I’ve been refreshing the IBM quantum system dashboard web page, patiently waiting for Osprey to appear, but no such luck, until today (technically yesterday, Monday, May 8, 2023; these comments are my thoughts from that day). Osprey is here! Osprey has landed! Osprey is publicly available!

The overall IBM quantum system dashboard web page:

The detail page for the 433-qubit Osprey ibm_seattle quantum system is here:

My initial comments, originally posted on LinkedIn:

BREAKING NEWS: Osprey has landed! (Public availability.)

IBM’s 433-qubit Osprey quantum processor is now showing up on IBM’s quantum system dashboard as ibm_seattle — see below.

I may update with additional details later, but here are my initial observations (a sketch for pulling these calibration numbers programmatically follows the list):

  1. This is Osprey r1. I thought they had already moved on to a second revision. They do indeed have a second revision, internally, but not yet public.
  2. No Quantum Volume (QV) listed.
  3. No CLOPS listed.
  4. Last calibrated: 6 days ago. Although IBM had talked about calibrating less to make error mitigation more consistent. No jobs in queue, so although it shows as Status: Online, maybe it’s not yet really ready for use. Oops… now I see 1 pending job!
  5. Version: 1.0.0.
  6. Median ECR Error: 2.139e-2. Not so great — 1.78 nines of qubit fidelity. Max 1.0 (not working) — messes up the average. Min 7.537e-3 — not as bad, but not great — 2.25 nines of qubit fidelity.
  7. Median SX Error: 5.973e-4. Seems decent — 3.4 nines of qubit fidelity.
  8. Median Readout Error: 6.333e-2. Not so great — 1.37 nines of qubit fidelity. Max 1.0 (not working) messes up the average. Min 6.200e-3 — not so bad — 2.4 nines of qubit fidelity.
  9. Median T1: 86.14 us. Not so great. Min 2.67 usec — drags down the average. Max 320.88 — much, much better.
  10. Median T2: 60.77 us. Not so great. Min 2.22 usec — drags down the average. Max 164.58 — much better.
  11. Gate execution time: 660 ns. Not great. Min 135 ns, much better. Max 1,404 ns (1.404 usec) — lousy.
  12. I counted 19 qubits on the topology map which had a readout assignment error of 1.0 — readout never works. Jay Gambetta posted that 20 qubits were non-functional, but I was unable to find a 20th qubit.
  13. OpenQASM3 is not yet supported, so no support for dynamic circuits.

Some additional comments I posted on LinkedIn on Jay Gambetta’s post that announced availability of Osprey:

  1. I look forward to the first batch of papers on experiments to actually make use of these qubits for practical real-world applications — rather than oddball computer science experiments and simplistic physics experiments such as cross-entropy benchmarking and simulation of time crystals and wormholes.
  2. A report on Quantum Volume (QV) would be a great start — what it turns out to be, what specific factors limit it at present, and what specific improvements need to and can be made for future quantum processors (Condor, Heron, Flamingo, et al) to achieve some sort of quantum leap in QV — more than just a couple of qubits. (A sketch of the QV protocol’s model circuits follows this list.)
  3. At present, the highest QV of the systems on the IBM quantum system dashboard is 128 — even worse than the 256, 512, and (promised but not delivered) 1024 from last year.
  4. At present, Eagle has never achieved a QV beyond 32 or 64.
  5. Does IBM have a transparent expectation for QV for Condor and Heron that can be shared in public?
  6. In any case, some transparency and technical disclosure on QV and its future would be greatly appreciated.
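
As noted in item 2 above, here is a minimal sketch of the Quantum Volume protocol's model circuits. A device claims QV = 2^n when n-qubit, depth-n random circuits produce "heavy" outputs (bitstrings above the median of the ideal distribution) more than two-thirds of the time, with statistical confidence. This sketch just builds and simulates one such circuit using qiskit.circuit.library.QuantumVolume and the Aer simulator; a real QV run against hardware would use something like qiskit-experiments and many circuits.

```python
# Quantum Volume model-circuit sketch: build one n-qubit, depth-n QV circuit,
# compute its ideal heavy-output set, and measure the heavy-output fraction
# on a noiseless simulator (a stand-in for real hardware).
from statistics import median
from qiskit import transpile
from qiskit.circuit.library import QuantumVolume
from qiskit.quantum_info import Statevector
from qiskit_aer import AerSimulator

n = 5                                                  # testing for QV = 2**5 = 32
model = QuantumVolume(n, depth=n, seed=123)

# Heavy outputs: bitstrings whose ideal probability exceeds the median probability.
ideal_probs = Statevector(model).probabilities_dict()
heavy_set = {b for b, p in ideal_probs.items() if p > median(ideal_probs.values())}

measured = model.copy()
measured.measure_all()
sim = AerSimulator()
counts = sim.run(transpile(measured, sim), shots=2000).result().get_counts()
heavy_fraction = sum(c for b, c in counts.items() if b in heavy_set) / sum(counts.values())
print(f"heavy-output fraction for the depth-{n}, {n}-qubit model circuit: {heavy_fraction:.3f}")
```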

Overall, nothing substantial has really changed since my original speculative preview from over a year ago. In fact, we still don’t know the Quantum Volume (QV), although we do have preliminary metrics for qubit fidelity.

So, overall, I have to judge this initial version of Osprey as simultaneously a significant engineering achievement (the physical engineering and packaging), but a complete flop from a functional product perspective.

  • It remains true that quantum algorithm designers and quantum application developers will actually do best sticking with the 27-qubit Falcon quantum processor (or 33-qubit Egret), for now.

Additional details may come to light over the coming days, weeks, and months.

UPDATE on September 7, 2023 — Osprey is now retired

Wow, that was fast. Osprey only lasted for not quite four months, since May 8.

It had been offline for a time before that anyway.

No reason has been given by IBM for its retirement.

Indeed, I saw no papers or reports of usage of Osprey. As if it never existed.

Here’s IBM’s web page listing retired quantum systems, which lists ibm_seattle (Osprey):

Technically, this is just this one specific system being retired. But that’s the only Osprey processor that has ever been made available by IBM. Technically, IBM could introduce a new Osprey-based system, but that hasn’t happened in the almost-month since the retirement of ibm_seattle.

In theory, we should see at least the public unveiling of the 133-qubit Heron and 1,121-qubit Condor quantum processors at the upcoming annual IBM Quantum Summit in November.

Heron is theoretically deployable as multiple linked processors, so if IBM unveils it with three processors, that would be a total of 399 qubits, close to Osprey’s 433 qubits. Give it four processors, and it would come in at 532 qubits, well ahead of Osprey.

And in theory, Heron should have significantly improved qubit and gate fidelity. So, Osprey would be technically obsolete once Heron came out anyway.

My original proposal for this topic

For reference, here is the original proposal I had for this topic. It may have some value for some people wanting a more concise summary of this paper.

  • Speculative preview of the IBM 433-qubit Osprey. This is less a preview than a wish list for what I hope that IBM will deliver later this year (2022). I would dearly love to know what capabilities the upcoming IBM 433-qubit Osprey quantum processor will deliver later this year, but other than 433 qubits, we simply don’t know. I would hope that it has improved qubit fidelity, but we just don’t know at this stage. We’ll have to wait and see. The recent 127-qubit Eagle didn’t deliver improved qubit fidelity, so I’m primed for further disappointment, but still hopeful. I would hope that it has some improvement to qubit connectivity, but IBM hasn’t hinted at any, so I’m not holding my breath. But I do hope they at least deliver a roadmap and set expectations for qubit connectivity improvements in future years and future processors. We’ll simply have to wait until November to see.

Summary and conclusions

  1. Osprey is poised for disappointment. Numerous technical obstacles.
  2. Plenty of hope for Osprey. But hope is not a plan or certainty.
  3. IBM needs to up their game to avoid disaster. Technical obstacles can be overcome, but only through much more serious effort.
  4. Overall, Osprey is just an upsized Eagle. More qubits, miniaturized, but individually not much better functionally.
  5. My overall disappointment with the IBM 127-qubit Eagle. No improvement in qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
  6. My overall disappointment with the IBM hardware roadmap. No milestones or detail for quantum error correction (QEC), qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
  7. Low qubit fidelity and weak qubit connectivity. No dramatic improvement over Eagle.
  8. Mediocre qubit measurement fidelity. Same issues as with Eagle. A critical weakness.
  9. Still an impressive engineering achievement under the hood. But most of that internal engineering effort doesn’t affect fidelity or performance from the perspective of a quantum algorithm designer or quantum application developer.
  10. Maybe enough improvement to not be an absolute flop, but disappointing enough to be a relative flop.
  11. IBM needs much better messaging. Needs to set expectations more accurately.
  12. Osprey probably needs near-perfect qubits to inspire any significant excitement.
  13. Only four details we know for sure about Osprey…
  14. Osprey will have 433 qubits.
  15. Key advancement of Osprey will be miniaturization of components.
  16. Osprey will be based on the new quantum hardware infrastructure of the IBM Quantum System Two.
  17. The IBM Quantum System Two incorporates a new cryogenic refrigerator from Bluefors.
  18. IBM hasn’t committed to any improvement in qubit fidelity.
  19. IBM hasn’t committed to any improvement in granularity of phase and probability amplitude.
  20. Fine granularity of phase is needed to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE). Essential for advanced applications such as quantum computational chemistry.
  21. IBM hasn’t committed to any improvement in coherence time and circuit depth.
  22. IBM hasn’t committed to any advances in quantum error correction (QEC) in Osprey.
  23. But I would hope that IBM would demonstrate at least a small handful of perfect logical qubits in Osprey. Maybe six or seven.
  24. IBM hasn’t committed to any improvement in Quantum Volume (QV).
  25. Advances or lack of advances in Osprey will confirm, set, or undermine IBM’s technical credibility.
  26. Research orientation of Osprey reemphasizes that quantum computing is still in the pre-commercialization stage, still well short of being ready for commercialization.
  27. Still a mere laboratory curiosity. Not ready for production-scale practical real-world quantum applications.
  28. Still more appropriate for the lunatic fringe rather than mainstream application developers.
  29. IBM needs to announce preliminary benchmarking results when Osprey is formally announced and made available. Including Quantum Volume (QV) and qubit fidelity.
  30. IBM needs to provide full technical documentation and technical specifications when Osprey is formally announced and made available. Including a Principles of Operation document which details the programming model.
  31. I’d much rather see an upgraded 27-qubit Falcon than a 433-qubit Osprey. A 27-qubit Falcon with enhanced connectivity and another 1.5 nines of qubit fidelity would be much more beneficial for quantum algorithm designers and quantum application developers than whatever Osprey might deliver. Hummingbird and Eagle were distractions rather than substantial advances for quantum algorithm designers and quantum application developers.
  32. My hope is that this paper might help to cajole IBM into doing a better job of setting expectations for Osprey and to do a better job of detailing milestones for specific technical factors in their hardware roadmap going forward.
  33. Some late-breaking updates. See the section UPDATE on October 28, 2022 — Additional information.
  34. Some information at the unveiling at the IBM Quantum Summit. See the section UPDATE on November 18, 2022 — Formal unveiling at IBM Quantum Summit 2022 on November 9, 2022.
  35. In short, stay tuned for actual availability sometime in Q1 of 2023.
  36. Stay tuned for actual public availability, whenever. No idea when that will be now. See the section UPDATE on April 9, 2023 — Five full months since unveiling but still no hint of public availability.
  37. UPDATE on May 8, 2023 — Osprey is now publicly available.
  38. UPDATE on September 7, 2023 — Osprey is now retired.
