# Speculative Preview of the IBM 433-qubit Osprey Quantum Computer

After being disappointed by the IBM 127-qubit Eagle quantum computer, I’m anxious as to what’s in store with the upcoming 433-qubit Osprey quantum computer due to be introduced by IBM towards the end of 2022. Will it be basically just more of the same, just with more qubits (and a new refrigerator), or will it be a real breakthrough in any of the areas which really matter to algorithm designers and quantum application developers? This informal paper explores what we do know and speculates about what we don’t know, both what is most likely and what would be best. My hope is that this paper might help to cajole IBM into doing a better job of setting expectations for Osprey and to do a better job of detailing milestones for specific technical factors in their hardware roadmap going forward.

**Topics discussed in this paper:**

- In a nutshell
- References
- Quantum processor vs. quantum computer
- My motivation
- Goals for this paper
- What I include in this paper
- Timing of my paper
- Osprey is poised for disappointment
- Osprey may offer enough improvement to not be an absolute flop, but disappointing enough to be a relative flop
- IBM needs much better messaging
- IBM needs to announce preliminary benchmarking results when Osprey is formally announced and made available
- IBM needs to provide full technical documentation and technical specifications when Osprey is formally announced and made available
- What are IBM’s intentions with Osprey?
- Are there other intentions of IBM that we just don’t know about, yet?
- Osprey is still an impressive engineering achievement under the hood
- What value is IBM attempting to offer quantum algorithm designers and quantum application developers with Osprey?
- Extrapolation from Eagle
- The key technical factors for Osprey that could impact quantum algorithms and quantum applications
- The Big Four of the key technical factors which matter the most
- General support for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) is critical
- Fine granularity of phase is needed to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE)
- What qubit count has IBM committed for Osprey?
- What qubit count is likely in Osprey?
- What qubit count does Osprey need to inspire confidence?
- How many qubits is enough to keep people happy?
- What qubit fidelity has IBM committed for Osprey?
- Nines of qubit fidelity
- What qubit fidelity is likely in Osprey?
- What qubit fidelity does Osprey need to inspire confidence?
- Does Osprey need near-perfect qubits to inspire significant excitement? Probably.
- What about qubit measurement fidelity for Osprey?
- Mediocre qubit measurement fidelity would be a critical flaw for Osprey
- What qubit connectivity has IBM committed for Osprey?
- What qubit connectivity is likely in Osprey?
- What qubit connectivity does Osprey need to inspire confidence?
- Will Osprey need some sort of quantum state bus to satisfy demand for greater qubit connectivity?
- Is some other hardware mechanism needed to increase qubit connectivity for Osprey?
- No, SWAP networks are not a viable alternative to enhanced qubit connectivity
- What granularity of phase and probability amplitude has IBM committed for Osprey?
- What granularity of phase and probability amplitude is likely in Osprey?
- What granularity of phase and probability amplitude does Osprey need to inspire confidence?
- What coherence time and circuit depth has IBM committed for Osprey?
- What coherence time and circuit depth is likely in Osprey?
- What coherence time and circuit depth does Osprey need to inspire confidence?
- What has IBM committed to for quantum error correction (QEC) in Osprey?
- What do I hope that IBM will deliver for quantum error correction (QEC) in Osprey?
- Do I expect a demonstration of quantum error correction in Osprey? No, but…
- Maybe a demonstration of quantum error correction in a subsequent revision to Osprey? Possibly, but…
- Is the credibility of quantum error correction at stake with Osprey?
- IBM technical credibility will suffer if they neither show progress with quantum error correction nor progress towards near-perfect qubits
- What Quantum Volume has IBM committed for Osprey?
- What Quantum Volume is likely in Osprey?
- What Quantum Volume does Osprey need to inspire confidence?
- Quantum Volume cannot use more than 50 qubits (and maybe only 40 or even 32 qubits)
- Topology and connectivity of Osprey’s 433 qubits
- Will Osprey be modular?
- IBM Quantum System Two
- Will the new dilution refrigerator offer any functional advantage to quantum applications?
- What speed or throughput will Osprey have?
- Will Osprey support Qiskit Runtime?
- Is Osprey simply a bigger Eagle (more qubits)?
- Concerns from Eagle which may or may not apply to Osprey as well
- Competition from other qubit technologies
- IBM quantum is still in research mode, not mainline commercial product engineering, yet
- When might IBM transition their quantum efforts from research to commercial product development?
- Unfortunately, parts of IBM are acting as if quantum was a commercial product and engaging in premature commercialization
- Research orientation of Osprey reemphasizes that quantum computing is still in the pre-commercialization stage, still well short of being ready for commercialization
- Still a mere laboratory curiosity
- Still more appropriate for the lunatic fringe rather than mainstream application developers
- Risk for Quantum Winter
- What can we expect from future revisions of Osprey?
- I’d much rather see an upgraded 27-qubit Falcon than a 433-qubit Osprey
- My original proposal for this topic
- Summary and conclusions

# In a nutshell

There is really so much to say, but let me try to summarize it as briefly as possible…

- **Osprey is poised for disappointment.** Numerous technical obstacles.
- **Plenty of hope for Osprey.** But hope is not a plan or certainty.
- **IBM needs to up their game to avoid disaster.** Technical obstacles can be overcome, but only through much more serious effort.
- **Overall, Osprey is just an upsized Eagle.** More qubits, miniaturized, but individually not much better functionally.
- **My overall disappointment with the IBM 127-qubit Eagle.** No improvement in qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
- **My overall disappointment with the IBM hardware roadmap.** No milestones or detail for quantum error correction (QEC), qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
- **Low qubit fidelity and weak qubit connectivity.** No dramatic improvement over Eagle.
- **Mediocre qubit measurement fidelity.** Same issues as with Eagle. A critical weakness.
- **Still an impressive engineering achievement under the hood.** But most of that internal engineering effort doesn’t affect fidelity or performance from the perspective of a quantum algorithm designer or quantum application developer.
- **Maybe enough improvement to not be an absolute flop, but disappointing enough to be a relative flop.**
- **IBM needs much better messaging.** Needs to set expectations more accurately.
- **Osprey probably needs near-perfect qubits to inspire any significant excitement.**
- **Only four details we know for sure about Osprey…**
  - **Osprey will have 433 qubits.**
  - **Key advancement of Osprey will be miniaturization of components.**
  - **Osprey will be based on the new quantum hardware infrastructure of the IBM Quantum System Two.**
  - **The IBM Quantum System Two incorporates a new cryogenic refrigerator from Bluefors.**
- **IBM hasn’t committed to any improvement in qubit fidelity.**
- **IBM hasn’t committed to any improvement in granularity of phase and probability amplitude.**
- **Fine granularity of phase is needed to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).** Essential for advanced applications such as quantum computational chemistry.
- **IBM hasn’t committed to any improvement in coherence time and circuit depth.**
- **IBM hasn’t committed to any advances in quantum error correction (QEC) in Osprey.**
- **But I would hope that IBM would demonstrate at least a small handful of perfect logical qubits in Osprey.** Maybe six or seven.
- **IBM hasn’t committed to any improvement in Quantum Volume (QV).**
- **Advances or lack of advances in Osprey will confirm, set, or undermine IBM’s technical credibility.**
- **Research orientation of Osprey reemphasizes that quantum computing is still in the pre-commercialization stage, still well short of being ready for commercialization.**
- **Still a mere laboratory curiosity.** Not ready for production-scale practical real-world quantum applications.
- **Still more appropriate for the lunatic fringe rather than mainstream application developers.**
- **IBM needs to announce preliminary benchmarking results when Osprey is formally announced and made available.** Including Quantum Volume (QV) and qubit fidelity.
- **IBM needs to provide full technical documentation and technical specifications when Osprey is formally announced and made available.** Including a *Principles of Operation* document which details the programming model.
- **I’d much rather see an upgraded 27-qubit Falcon than a 433-qubit Osprey.** A 27-qubit Falcon with enhanced connectivity and another 1.5 nines of qubit fidelity would be much more beneficial for quantum algorithm designers and quantum application developers than whatever Osprey might deliver. Hummingbird and Eagle were distractions rather than substantial advances for quantum algorithm designers and quantum application developers.
- **My hope is that this paper might help to cajole IBM into doing a better job of setting expectations for Osprey and to do a better job of detailing milestones for specific technical factors in their hardware roadmap going forward.**

# References

My preliminary comments on the IBM 127-qubit Eagle:

*Preliminary Thoughts on the IBM 127-qubit Eagle Quantum Computer*
- https://jackkrupansky.medium.com/preliminary-thoughts-on-the-ibm-127-qubit-eagle-quantum-computer-e3b1ea7695a3

There are a number of references for Eagle in that paper.

IBM quantum hardware roadmap which contains the first reference to Osprey:

*IBM’s roadmap for scaling quantum technology*- September 15, 2020
- https://research.ibm.com/blog/ibm-quantum-roadmap

IBM press release for Eagle which also previews the IBM Quantum System Two and mentions that it will work with 433-qubit processors (and beyond), although it doesn’t mention Osprey by name:

*IBM Unveils Breakthrough 127-Qubit Quantum Processor*
- *Delivers 127 qubits on a single IBM quantum processor for the first time with breakthrough packaging technology*
- *New processor furthers IBM’s industry-leading roadmaps for advancing the performance of its quantum systems*
- *Previews design for IBM Quantum System Two, a next generation quantum system to house future quantum processors*
- https://newsroom.ibm.com/2021-11-16-IBM-Unveils-Breakthrough-127-Qubit-Quantum-Processor

Random IBM post (no date) which mentions Osprey:

*5 Things to Know About the IBM Roadmap to Scaling Quantum Technology*
- *Eagle will be followed by the 433-qubit “Osprey” processor in 2022. Osprey continues to push the boundaries of fabrication techniques to build a smaller chip to ensure more logical qubits that don’t sacrifice performance. Its more-efficient and denser controls and cryogenic infrastructure will ensure that scaling up future processors doesn’t sacrifice the performance of individual qubits, introduce further sources of noise, or take up too large a footprint.*
- https://newsroom.ibm.com/IBM-research?item=32425

# Quantum processor vs. quantum computer

Technically, Osprey is a *quantum processor* rather than a *quantum computer* per se.

The quantum processor is where all of the computation is performed. The actual *chip*. All of the rest of the hardware is the *quantum computer system*, or simply *quantum computer*, or as IBM refers to it, the *quantum system*.

Most of the *quantum system* is common, regardless of the actual *quantum processor chip*. So, the 127-qubit Eagle, the 65-qubit Hummingbird, and the 27-qubit Falcon all share the same overall quantum system, called the *IBM Quantum System One*.

Osprey and future quantum processors, including the 1,121-qubit Condor, will all share a new overall quantum system, called the *IBM Quantum System Two*. All of that hardware other than the processor chip is the same, regardless of which processor chip is used.

There is also the wiring and the electronics that drive it, but that too is common, with one line per qubit, or one per sequence of qubits for newer systems with serial readout.

All of that said, I personally will continue to refer to these as *quantum computers* — the *433-qubit Osprey quantum computer*. Maybe that’s because I’m a software guy and it’s the functions under the hood which matter most, regardless of how it is all sliced, diced, and packaged.

# My motivation

My personal motivation for this informal paper is twofold:

- **My overall disappointment with Eagle.** A big advance in qubit count and great internal engineering improvements, but no benefits in any of the main technical areas. No advances which would benefit quantum algorithm designers or quantum application developers.
- **My overall disappointment with the IBM hardware roadmap.** Lack of technical detail. Other than a brief mention and qubit count, virtually nothing. No milestones for quantum error correction (QEC). No milestones for qubit fidelity. No milestones for the other key technical factors.

I will endeavor to:

- **List all known details about Osprey.**
- **Predict or speculate about possible details about Osprey.**

# Goals for this paper

Briefly, my intentions are to:

- **Detail what we know.**
- **Speculate on what we think is likely.**
- **Speculate on what we don’t know.**
- **Express aspirations.**
- **Express concerns with the intent that maybe IBM will respond in some constructive manner.**
- **Overall, set expectations for what Osprey will, might, could, won’t, and is unlikely to be.**

# What I include in this paper

My writing in this informal paper focuses on:

- **What I expect from IBM and Osprey.**
- **What I am concerned about.**
- **What I worry about.**
- **Opportunities which IBM might miss.**
- **Missteps with Eagle which IBM might repeat with Osprey.**
- **Premonitions, nightmares.**
- **Poor messaging.** Missed opportunities for great messaging from IBM.

# Timing of my paper

It will be quite a few months from when I post this paper until IBM actually introduces Osprey later this year, 2022. My rationale for posting in advance of the formal announcement and this far in advance is twofold:

- **Long enough lead time to have some hope of influencing IBM’s planning and efforts.** I don’t actually expect that IBM will pay any attention to what I write, but I do feel an ethical obligation to let them know about what I see that concerns me.
- **Hope to have some impact on IBM’s messaging.** Try to get them to communicate more explicitly about the key benefits of Osprey over Eagle and Falcon. They need to do a much better job of setting expectations than they’ve done in the past, especially for Eagle, but for all of their other quantum processors as well.

# Osprey is poised for disappointment

Overall, there are just too many technical obstacles for Osprey to overcome for it to be anything more than a significant disappointment. Yes, there is plenty of hope for Osprey, but hope is not a plan or certainty.

IBM can still overcome the technical challenges, but they need to up their game to avoid disaster. Technical obstacles can be overcome, but only through much more serious effort.

Some of the reasons why Osprey is poised to be a disappointment:

- **Overall, Osprey is just an upsized Eagle.** More qubits, miniaturized, but individually not much better functionally.
- **My overall disappointment with the IBM 127-qubit Eagle.** No improvement in qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
- **My overall disappointment with the IBM hardware roadmap.** No milestones or detail for quantum error correction (QEC), qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
- **Low qubit fidelity and weak qubit connectivity.** No dramatic improvement over Eagle.
- **Mediocre qubit measurement fidelity.** Same issues as with Eagle. A critical weakness.
- **Still an impressive engineering achievement under the hood.** But most of that internal engineering effort doesn’t affect fidelity or performance from the perspective of a quantum algorithm designer or quantum application developer.
- **Maybe enough improvement to not be an absolute flop, but disappointing enough to be a relative flop.**
- **IBM needs much better messaging.** Needs to set expectations more accurately.

# Osprey may offer enough improvement to not be an absolute flop, but disappointing enough to be a relative flop

I seriously don’t expect Osprey to be a total flop. I do expect a variety of improvements, but I still suspect that they won’t be enough to avoid serious disappointment.

Eagle was much more disappointing than I expected.

I do hope that IBM learned some valuable and insightful lessons from Eagle which will result in fewer missteps and more improvements in Osprey.

But on net, I fully expect sentiment to run significantly against Osprey.

# IBM needs much better messaging

Quite a few of the issues I’ve raised in previous papers concerning Eagle and the IBM quantum hardware roadmap really come down to just basic *messaging*, just *setting expectations* realistically.

The overall goal of messaging for quantum computers is to give quantum algorithm designers and quantum application developers the critical information they need to make the best use of the technology.

Some specific action items:

- **IBM needs to set expectations more accurately.**
- **IBM needs to avoid overpromising.**
- **IBM needs to focus attention on where they excel, their strong points.**
- **IBM needs to be much more direct and honest as to their technical shortcomings — and offer action plans for how and when they will be addressing each technical shortcoming.**
- **The IBM quantum hardware roadmap needs to be much more explicit in terms of milestones and specific technical features.** Whether it’s increments of improvement in qubit fidelity, architectural changes for qubit connectivity, or the staging of quantum error correction.
- **Benchmarking results need to be a formal aspect of required messaging.** This includes Quantum Volume (QV) and qubit fidelity.
- **Full technical documentation and technical specifications also need to be a formal aspect of required messaging.**

# IBM needs to announce preliminary benchmarking results when Osprey is formally announced and made available

Benchmarking results for Osprey should be formally disclosed at the time that Osprey is announced and made available. They should be fairly comprehensive, and certainly include Quantum Volume (QV) and qubit fidelity (nines of qubit fidelity) very prominently. This is all part of messaging.

Qubit fidelity is an excellent surrogate for the overall capabilities of a quantum computer.

For more on *nines of qubit fidelity*, see my paper:

# IBM needs to provide full technical documentation and technical specifications when Osprey is formally announced and made available

Full technical documentation and technical specifications also need to be a formal aspect of required messaging, and be provided when Osprey is formally announced and made available.

Technical specifications must include a formal *Principles of Operation* document which provides a sufficient level of detail about the *programming model* of the quantum computer so that quantum algorithm designers and quantum application developers can effectively exploit the technical capabilities of the quantum computer.

I describe the concept of a *Principles of Operation* document in my paper:

*Framework for Principles of Operation for a Quantum Computer*- https://jackkrupansky.medium.com/framework-for-principles-of-operation-for-a-quantum-computer-652ead10bc48

# What are IBM’s intentions with Osprey?

Some questions about IBM’s intentions with Osprey:

- **What problem is IBM trying to solve?**
- **What did IBM intend to do with Osprey?**
- **What’s the point of Osprey?**
- **What is IBM attempting to accomplish with Osprey?**
- **What value is IBM attempting to offer quantum algorithm designers and quantum application developers?** Other than simply more of the same qubits as Eagle.
- **Was the intent simply to offer more qubits?**
- **Was the intent simply to miniaturize qubits and components — but not make them function any different or better?**
- **Is Osprey simply an internal interim engineering stepping stone rather than explicitly offering additional features or functions or enhancements per se?**
- **Is Osprey more of an internal hardware improvement rather than functionally better?**
- **Who might actually benefit from all of these extra qubits?**
- **Are there other intentions that we just don’t know about, yet?**

Unfortunately, none of these questions has an answer at this stage. IBM hasn’t said. We could speculate, but that’s about it.

The closest we can come to an answer is simply:

- **Osprey offers a lot more qubits.** Each qubit is likely comparable to a qubit on Eagle.
- **IBM focused on miniaturization to enable more qubits.**
- **Functionally, the qubits should behave approximately the same as on Eagle from the perspective of quantum algorithm designers and quantum application developers.** Just that there are more of them.

# Are there other intentions of IBM that we just don’t know about, yet?

I presume that there may very well be additional intentions that we won’t know about until IBM introduces Osprey late this year. And maybe not even then.

What might they be? All we can do is speculate.

But if indeed Osprey does have functional, feature, and performance improvements, they will quickly become apparent when Osprey is introduced, but until then we will be flying blind.

# Osprey is still an impressive engineering achievement under the hood

Despite all of the potential technical shortcomings that I might highlight for Osprey, it still will have quite a bit of impressive engineering achievement under the hood.

- **Miniaturizing qubits and control circuits is no trivial feat.**
- **Isolating, maintaining, and connecting 433 qubits is no trivial feat.**
- **A new cryogenic dilution refrigerator is no trivial feat.**

IBM’s hardware engineers and scientists should be applauded.

But most of that internal engineering effort doesn’t affect qubit fidelity or performance from the perspective of a quantum algorithm designer or quantum application developer.

# What value is IBM attempting to offer quantum algorithm designers and quantum application developers with Osprey?

Other than simply more of the same qubits as Eagle, as far as I can tell at this juncture, IBM doesn’t seem to be intending to offer *any* real, significant value to quantum algorithm designers and quantum application developers with Osprey.

So, if you are a quantum algorithm designer or quantum application developer, you might as well stick with Eagle or even Falcon.

Without dramatic improvements in qubit fidelity or qubit connectivity, even 27-qubit Falcon has more qubits than most quantum algorithm designers and quantum application developers can effectively use.

Just a reminder that Eagle offers only a Quantum Volume (QV) of 64 and Falcon offers a Quantum Volume of 128. That’s 6 and 7 qubits respectively as the maximum number that can be used in a quantum circuit to deliver a relatively high fidelity result.
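The arithmetic behind that claim can be sketched briefly. Quantum Volume is defined so that QV = 2^n, where n is the width (and depth) of the largest “square” circuit the machine can run with reasonably high fidelity, so the effective usable qubit count is just log2 of QV (this is my own restatement of the metric, not an IBM formula):

```python
import math

def effective_qubits(quantum_volume: int) -> int:
    """Effective width of the largest 'square' circuit a machine can run
    reliably, given Quantum Volume QV = 2^n (so n = log2(QV))."""
    return int(math.log2(quantum_volume))

print(effective_qubits(64))   # Eagle, QV 64 -> 6 qubits
print(effective_qubits(128))  # Falcon, QV 128 -> 7 qubits
```

So even a machine with hundreds of physical qubits may only deliver high-fidelity results on a handful of them.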

# Extrapolation from Eagle

The default in this paper for any unknown details about Osprey will be to extrapolate from Eagle, just with many more qubits.

In many cases this will be fairly accurate.

In most cases this will be fairly reasonable.

In all cases this is all we can do.

# The key technical factors for Osprey that could impact quantum algorithms and quantum applications

Each of these key technical factors will be explored in subsequent sections:

- **Qubit count.**
- **Qubit fidelity.** Including qubit measurement fidelity. How close to near-perfect qubits.
- **Qubit connectivity.**
- **Granularity of phase and probability amplitude.**
- **Coherence time and circuit depth.**
- **Quantum error correction (QEC).**
- **Quantum Volume (QV).**

For each of these key technical factors this paper will discuss:

- **What has IBM committed for Osprey?**
- **What is likely in Osprey?**
- **What does Osprey need to inspire confidence?**

There are two other technical factors that will be mentioned but only in passing:

- **Speed or throughput.** Measured in *circuit layer operations per second* (CLOPS).
- **Support for Qiskit Runtime.**

# The Big Four of the key technical factors which matter the most

The previous section identified a number of important technical factors which impact quantum algorithms and quantum applications. Here we identify the *Big Four* of the technical factors which matter the most:

- **Higher qubit fidelity.**
- **Greater qubit connectivity.**
- **Finer granularity of phase and probability amplitude.** Essential for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).
- **Greater circuit depth.** Either greater coherence time for the same gate execution time, or faster gate execution time for the same coherence time, or both greater coherence time and faster gate execution time.

This is not to say that the other technical factors don’t matter or might not matter more to some people, but simply that generally these four key technical factors will be the primary determinants of whether people will be excited or disappointed with Osprey when it is finally announced and available for use.

# General support for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) is critical

Quantum Fourier transform (QFT) and quantum phase estimation (QPE) may together be the single most powerful and important algorithmic tool available to quantum algorithm designers and quantum application developers. They are critical, for example, to quantum computational chemistry. Without them, dramatic quantum advantage may not even be possible. Sure, small, toy algorithms can get by without them, but any major, large, complex, and sophisticated quantum algorithm is likely to require them.

People will likely get very excited if Osprey does indeed support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).

Similarly, people will likely be rather disappointed if Osprey does not support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).

Nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) does require all of the Big Four key technical factors:

- **Higher qubit fidelity.**
- **Greater qubit connectivity.**
- **Finer granularity of phase and probability amplitude.**
- **Greater circuit depth.**

# Fine granularity of phase is needed to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE)

Fine granularity of phase is tricky. Granularity of phase and probability amplitude are never adequately documented or even understood for current quantum computers. This needs to change since fine granularity of phase is essential to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE), which are needed for advanced quantum applications such as quantum computational chemistry.

For more detail on this issue, see my paper:

*Beware of Quantum Algorithms Dependent on Fine Granularity of Phase*- https://jackkrupansky.medium.com/beware-of-quantum-algorithms-dependent-on-fine-granularity-of-phase-525bde2642d8
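A back-of-the-envelope illustration of why this matters (my own sketch, not from the paper above): in a textbook n-qubit QFT, the finest controlled-phase rotation applied is 2π/2^n radians, which shrinks exponentially as circuits get wider, so hardware phase granularity quickly becomes the limiting factor:

```python
import math

def smallest_qft_phase(n_qubits: int) -> float:
    """Finest controlled-phase rotation angle (in radians) used in a
    textbook n-qubit quantum Fourier transform: 2*pi / 2^n."""
    return 2.0 * math.pi / (2 ** n_qubits)

for n in (5, 10, 20):
    print(f"{n}-qubit QFT needs phase steps of {smallest_qft_phase(n):.2e} rad")
```

A 20-qubit QFT already calls for phase rotations of roughly a millionth of a radian, far finer than anything current hardware has been documented to resolve.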

# What qubit count has IBM committed for Osprey?

A qubit count of 433 qubits is solidly committed by IBM for Osprey.

# What qubit count is likely in Osprey?

A qubit count of 433 qubits is likely for Osprey.

# What qubit count does Osprey need to inspire confidence?

A qubit count of 433 is just fine and actually a lot more than most people will need in the near term.

# How many qubits is enough to keep people happy?

People don’t seem ecstatic with the 127 qubits of Eagle, but I don’t think their lack of ecstasy is due to a need for more qubits, but a need for higher qubit fidelity and greater connectivity.

This leaves open the question of exactly how many qubits people really do need to design, develop, and use quantum algorithms to address production-scale practical real-world problems. The possibilities:

- **127 or 128 qubits.** Personally, I think this is enough qubits. I recall some people suggesting that 105 qubits or 125 qubits would be needed for various quantum computational chemistry applications, for example. But they need qubit fidelity, qubit connectivity, fine granularity of phase, and greater circuit depth before they can actually use that many qubits.
- **160 qubits.** Should be a sufficient margin for many applications.
- **192 qubits.** Ditto.
- **256 qubits.** I’m not even sure which applications need this, but most applications should be covered.
- **384 qubits.** Ditto.

So, 433 qubits should be fine for Osprey. But only if they have higher fidelity, greater connectivity, finer granularity of phase, and greater circuit depth.

# What qubit fidelity has IBM committed for Osprey?

IBM has made no commitment as to what qubit fidelity can be expected in Osprey.

Shocking, but true, believe it or not.

# Nines of qubit fidelity

Qubit fidelity (reliability) is an excellent surrogate overall indicator of the overall capability of a quantum computer. It doesn’t matter how many qubits you have if they are not reliable.

Reliability of qubits can best be expressed as *nines of qubit fidelity* — 99.9% reliability is *three nines of qubit fidelity*.
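The conversion is simple arithmetic (my own formulation of the convention above): nines = -log10(1 - fidelity), so 99.9% reliability works out to 3.0 nines, and the mapping inverts just as easily:

```python
import math

def nines(fidelity: float) -> float:
    """Nines of qubit fidelity: e.g. 99.9% reliability -> 3.0 nines."""
    return -math.log10(1.0 - fidelity)

def fidelity_from_nines(n: float) -> float:
    """Inverse mapping: e.g. 3.0 nines -> 0.999 fidelity."""
    return 1.0 - 10.0 ** (-n)

print(round(nines(0.999), 3))              # 99.9% -> 3.0 nines
print(round(fidelity_from_nines(1.8), 4))  # 1.8 nines -> ~98.4% fidelity
```

By this measure, 1.8 nines (roughly what Eagle delivers) corresponds to about a 1.6% error rate per operation, which compounds rapidly over a deep circuit.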

For more on *nines of qubit fidelity*, see my paper:

# What qubit fidelity is likely in Osprey?

A qubit fidelity roughly comparable to that of Eagle is most likely in Osprey.

Unfortunately, that might only be 1.8 nines at best.

Possibly even worse.

Unlikely to be much better.

But, that’s based purely on extrapolating from Eagle.

# What qubit fidelity does Osprey need to inspire confidence?

I don’t think quantum algorithm designers and quantum application developers will get excited by less than a minimum of 3 nines. Preferably 3.5 nines.

Three nines would be the bare minimum that would maintain confidence in Osprey.

3.5 nines would engender at least a modicum of *excitement* in Osprey.

Four nines of qubit fidelity would definitely inspire confidence in Osprey, but that seems beyond reach at this stage.

Some possibilities:

- **1.8 nines.** Comparable to Eagle. Not impressive. In fact, very disappointing.
- **Two nines.** Better than Eagle, but not great.
- **2.5 nines.** Substantially better than Eagle, but still not great.
- **2.75 nines.** Maybe the minimum acceptable, but still not great.
- **Three nines.** Target for minimum acceptable.
- **3.25 nines.** Starting to look appealing.
- **3.5 nines.** Moderately appealing.
- **3.75 nines.** Moderately impressive.
- **Four nines.** Impressive.
- **Near-perfect qubits.** This is what we really should be seeing to believe that we really are on track for supporting production-scale practical real-world quantum applications.

For more on *nines of qubit fidelity*, see my paper:

# Does Osprey need near-perfect qubits to inspire significant excitement? Probably.

As just noted, I do think that *near-perfect qubits* would inspire significant excitement in Osprey, but the question is whether Osprey can inspire excitement without near-perfect qubits? I’m not sure what the right answer is.

It is very possible that other technical factors could inspire some excitement.

But it just feels less likely that Osprey could inspire any truly dramatic excitement if qubit fidelity is still low, below three to 3.5 nines.

So I lean towards answering that near-perfect qubits are probably needed for Osprey to inspire any significant excitement or confidence.

For more on *near-perfect qubits*, see my paper:

*What Is a Near-perfect Qubit?*
- https://jackkrupansky.medium.com/what-is-a-near-perfect-qubit-4b1ce65c7908

# What about qubit measurement fidelity for Osprey?

For the purposes of this paper, I consider qubit measurement fidelity to be bundled into qubit fidelity even though it is generally characterized separately.

All of my comments above about qubit fidelity generally apply to qubit measurement fidelity as well.

Technically, qubit measurement fidelity in Osprey could diverge from overall qubit fidelity in Osprey, but we can’t know that at this time, so I presume that qubit measurement fidelity will be roughly comparable to what it is for Eagle.

Unfortunately, qubit measurement fidelity in Eagle was fairly mediocre, so it will be somewhat of a disappointment if there is no dramatic improvement in qubit measurement fidelity (and overall qubit fidelity) in Osprey.

Be prepared to be surprised, but don’t be surprised if there is no surprise.

# Mediocre qubit measurement fidelity would be a critical flaw for Osprey

Osprey is likely to have the same issues with qubit measurement fidelity as did Eagle. If so, this will be a critical weakness for Osprey.

There’s no point in accurately performing a quantum computation if you can’t accurately read the result. To repeat…

*There’s no point in accurately performing a quantum computation if you can’t accurately read the result.*

# What qubit connectivity has IBM committed for Osprey?

IBM has publicly committed *nothing* in terms of improvements in qubit connectivity beyond what is available in Eagle, Hummingbird, and Falcon.

# What qubit connectivity is likely in Osprey?

Osprey is likely to offer only the same limited nearest-neighbor qubit connectivity which is currently found in Eagle, Hummingbird, and Falcon.

# What qubit connectivity does Osprey need to inspire confidence?

In my opinion, Osprey needs *something* more than limited nearest-neighbor qubit connectivity to inspire the confidence of quantum algorithm designers and quantum application developers.

Osprey doesn’t need *full any-to-any* qubit connectivity, but *at least something* more than limited nearest-neighbor qubit connectivity.

And to be clear, the use of so-called *SWAP networks* to shuffle the state of qubits around to simulate qubit connectivity is *not* an acceptable alternative to full qubit connectivity. Qubit fidelity is far too low to rely on SWAP networks to move quantum state around.
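A rough back-of-envelope calculation shows why. Assuming a SWAP is compiled as three CNOT gates (the usual decomposition) and a hypothetical two-qubit gate fidelity of 99%, here is a sketch of how fidelity decays when routing a two-qubit gate across nearest-neighbor hops:

```python
def routed_gate_fidelity(two_qubit_fidelity: float, hops: int) -> float:
    """Rough fidelity of one two-qubit gate between distant qubits:
    each hop of routing costs one SWAP (~3 CNOTs), plus the gate itself."""
    gates = 3 * hops + 1
    return two_qubit_fidelity ** gates

# Hypothetical 99% two-qubit gates, routing across 10 nearest-neighbor hops:
print(round(routed_gate_fidelity(0.99, 0), 3))   # 0.99  -- adjacent qubits
print(round(routed_gate_fidelity(0.99, 10), 3))  # 0.732 -- much of the signal lost
```

Even at 99% per gate, a single long-range interaction quickly degrades to coin-flip territory, which is the essence of the objection to SWAP networks.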

Actually, my own opinion is that Osprey really does need full any-to-any qubit connectivity, even though I concede that it likely won’t get it.

# Will Osprey need some sort of quantum state bus to satisfy demand for greater qubit connectivity?

Yes, I personally do think that Osprey — or some other future IBM quantum computer — really does need some new hardware architecture to enable significantly greater qubit connectivity.

But, as much as I think that it is needed, I don’t expect it to happen for Osprey or in the relatively near future.

In my own thinking I refer to two terms:

- **Quantum state bus.**
- **Dynamically-routable resonator.**

In a traditional superconducting transmon qubit quantum computer there is a *resonator* connecting each pair of qubits which can be operated on by a two-qubit quantum logic gate.

It would be impractical to provide such a resonator for every pair of qubits when the number of qubits is large:

- **27 qubits would require 27 * 26 / 2 = 351 resonators.**
- **65 qubits would require 65 * 64 / 2 = 2,080 resonators.**
- **127 qubits would require 127 * 126 / 2 = 8,001 resonators.**
- **433 qubits would require 433 * 432 / 2 = 93,528 resonators.**
- **1,121 qubits would require 1,121 * 1,120 / 2 = 627,760 resonators.**
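The figures above are just the unordered-pair count, n * (n − 1) / 2, one resonator per pair of qubits; a minimal sketch:

```python
def resonators_for_full_connectivity(qubits: int) -> int:
    """One dedicated resonator per unordered pair of qubits: n * (n - 1) / 2."""
    return qubits * (qubits - 1) // 2

for n in (27, 65, 127, 433, 1121):
    print(n, resonators_for_full_connectivity(n))
# 27 -> 351, 65 -> 2080, 127 -> 8001, 433 -> 93528, 1121 -> 627760
```

The quadratic growth is exactly why dedicated pairwise resonators stop being practical as qubit counts climb.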

With a quantum state bus or dynamically-routable resonator each qubit would have an *entry ramp* and *exit ramp* to a single, shared resonator and a hardware device to dynamically enable the entry ramp and exit ramp for the pair of qubits to be used in a two-qubit quantum logic gate.

This hardware mechanism would provide *full any-to-any* qubit connectivity.

This is a purely-speculative hardware mechanism on my part.

I don’t have any expectation that Osprey or any other near-term transmon qubit quantum computer would have such a hardware mechanism, but it is definitely *needed*.

# Is some other hardware mechanism needed to increase qubit connectivity for Osprey?

Absent my suggested *quantum state bus*, which would provide full any-to-any qubit connectivity, there may be simpler hardware support that could be added to Osprey or some other near-term transmon qubit quantum computer to enable *at least some* additional qubit connectivity.

*Something* is definitely needed. *Anything* more than the very limited nearest-neighbor connectivity of Eagle, Hummingbird, and Falcon.

And to be clear, the use of so-called *SWAP networks* to shuffle the state of qubits around to simulate qubit connectivity is *not* an acceptable alternative to full qubit connectivity. Qubit fidelity is far too low to rely on SWAP networks to move quantum state around.

That said, IBM has made no such commitment, and I have no expectation that they will provide such a capability in Osprey, but… we’ll have to see, later this year.

# No, SWAP networks are not a viable alternative to enhanced qubit connectivity

To be clear, the use of so-called *SWAP networks* to shuffle the state of qubits around to simulate qubit connectivity is *not* an acceptable alternative to full qubit connectivity. Qubit fidelity is far too low to rely on SWAP networks to move quantum state around.

SWAP networks are a significant part of the reason that Quantum Volume (QV) is so low for IBM’s quantum computers, even for their most advanced quantum computer, the 127-qubit Eagle, which has a Quantum Volume of only 64, meaning that at most six qubits (log2(64) = 6) can be used in a reasonably high-fidelity quantum computation.
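The relationship between Quantum Volume and usable qubits is just the base-2 logarithm; a quick sketch:

```python
import math

def qv_circuit_size(quantum_volume: int) -> int:
    """Quantum Volume is 2**n for the largest n-qubit, n-layer 'square'
    random circuit that still passes; log2(QV) recovers that n."""
    return int(math.log2(quantum_volume))

print(qv_circuit_size(64))    # 6  -- Eagle: about six usable high-fidelity qubits
print(qv_circuit_size(2048))  # 11
```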

# What granularity of phase and probability amplitude has IBM committed for Osprey?

IBM has publicly committed *nothing* in terms of improvements in granularity of phase and probability amplitude beyond what is available in Eagle, Hummingbird, and Falcon.

# What granularity of phase and probability amplitude is likely in Osprey?

Osprey is likely to offer only the same granularity of phase and probability amplitude which is currently found in Eagle, Hummingbird, and Falcon.

Granularity could be a little finer than in Eagle.

But granularity could be coarser if engineering tradeoffs were needed in order to accommodate the dramatic boost in qubit count.

# What granularity of phase and probability amplitude does Osprey need to inspire confidence?

The main interest in finer granularity of phase and probability amplitude is to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE), as well as quantum amplitude estimation (QAE). So what would inspire confidence in Osprey would be the ability to support those algorithms at more than a small handful of bits of precision, such as:

- **6 bits.** Too trivial for any practical application.
- **8 bits.** Bare minimum.
- **10 bits.** Still rather bare-bones minimum.
- **12 bits.** Still rather bare-bones minimum.
- **14 bits.** Starting to get nontrivial.
- **16 bits.** Lower bound of nontrivial.
- **20 bits.** Nontrivial. Minimum to achieve minimal quantum advantage.
- **24 bits.** Nontrivial. Starting to get interesting.
- **28 bits.** Nontrivial. More interesting.
- **32 bits.** Nontrivial. Possibly even useful. Possibly even enough to achieve substantial quantum advantage.
- **48 bits.** Definitely nontrivial, and likely needed to achieve dramatic or at least substantial quantum advantage, but well beyond any near-term expectations.
- **50 bits.** Enough to achieve dramatic quantum advantage, but well beyond near-term expectations.

I’ll be impressed if Osprey can support even a 10 or 12-bit quantum Fourier transform.

I’ll be very impressed if Osprey can achieve a 16-bit quantum Fourier transform. But others may still be disappointed.
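The granularity requirement comes from the smallest controlled-phase rotation a k-bit QFT must resolve, 2π / 2^k; a quick illustration:

```python
import math

def finest_qft_rotation(bits: int) -> float:
    """Smallest controlled-phase angle (in radians) that a k-bit quantum
    Fourier transform must resolve: 2*pi / 2**k."""
    return 2.0 * math.pi / (2 ** bits)

for k in (8, 16, 20, 32):
    print(k, finest_qft_rotation(k))
# A 32-bit QFT needs rotations near 1.5e-9 radians -- extraordinarily fine granularity.
```

Each additional bit of precision halves the finest angle required, which is why the jump from 16 to 32 bits is so much harder than it sounds.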

# What coherence time and circuit depth has IBM committed for Osprey?

IBM has publicly committed *nothing* in terms of improvements in coherence time and circuit depth beyond what is available in Eagle, Hummingbird, and Falcon.

# What coherence time and circuit depth is likely in Osprey?

Osprey is likely to offer only the same coherence time and circuit depth which is currently found in Eagle, Hummingbird, and Falcon.

Coherence time and circuit depth could be a little greater than in Eagle.

But coherence time and circuit depth could be less than in Eagle if engineering tradeoffs were needed in order to accommodate the dramatic boost in qubit count.

# What coherence time and circuit depth does Osprey need to inspire confidence?

Coherence time and circuit depth are essential for large and more complex and more sophisticated quantum algorithms, but are moot if qubit fidelity, qubit connectivity, and granularity of phase and probability amplitude are not dramatically improved in Osprey relative to Eagle.

So, even a dramatic improvement in coherence time and circuit depth won’t inspire confidence in Osprey unless dramatic improvements are simultaneously made in qubit fidelity, qubit connectivity, and granularity of phase and probability amplitude.

All of that said, I’d be impressed if IBM could *double* coherence time and circuit depth in Osprey relative to Eagle.
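As a crude illustration of why coherence time bounds circuit depth, here is a sketch using purely hypothetical numbers, not measured Eagle or Osprey figures:

```python
def rough_max_depth(coherence_us: float, layer_time_ns: float) -> int:
    """Crude ceiling on circuit depth: how many gate layers fit within the
    coherence time. Real limits are tighter, since decoherence is gradual."""
    return int(coherence_us * 1_000 / layer_time_ns)

# Purely hypothetical numbers for illustration:
print(rough_max_depth(100, 500))  # 200 -- 100 us coherence, 500 ns per layer
print(rough_max_depth(200, 500))  # 400 -- doubling coherence doubles the ceiling
```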

# What has IBM committed to for quantum error correction (QEC) in Osprey?

IBM has made an explicit commitment to pursuing quantum error correction (QEC), promising that each new machine will come one step closer to supporting quantum error correction:

*IBM’s roadmap for scaling quantum technology*: “*every processor we design has fault tolerance considerations taken into account*.”
- https://research.ibm.com/blog/ibm-quantum-roadmap

That statement with more context:

*… we’ve struck a delicate balance of connectivity and reduction of crosstalk error with our fixed-frequency approach to two-qubit gates and hexagonal qubit arrangement introduced by Falcon. This qubit layout will allow us to implement the “heavy-hexagonal” error-correcting code that our team debuted last year, so as we scale up the number of physical qubits, we will also be able to explore how they’ll work together as error-corrected logical qubits — every processor we design has fault tolerance considerations taken into account.*

Unfortunately, I haven’t been able to find any mention of any advances in quantum error correction in the 127-qubit Eagle quantum computer.

And IBM has made no explicit mention of specific advances in quantum error correction in the 433-qubit Osprey quantum computer.

In fact, even the 1,121-qubit Condor quantum computer lacks any explicit mention of quantum error correction.

The only mention of quantum error correction in the roadmap diagram is for the “*and beyond — Path to 1 million qubits and beyond*” category *after* 2023 and the 1,121-qubit Condor quantum computer, which has the *key advancement* caption:

*Build new infrastructure, Quantum error correction*

In short, IBM has not committed to *any* additional support for quantum error correction in Osprey.

That doesn’t mean that they won’t *deliver* any additional support for quantum error correction, just that there has been no commitment.

# What do I hope that IBM will deliver for quantum error correction (QEC) in Osprey?

This is only my own personal *speculative hope* for some sort of advance for quantum error correction (QEC) in Osprey.

IBM hasn’t made any explicit and specific promises or commitments about the 433-qubit Osprey which is due out later this year (2022), but given the large number of qubits (which no existing algorithms would be ready to utilize), I would hope that IBM would at least attempt to *demonstrate* a handful of perfect logical qubits.

In my personal view, IBM will need to demonstrate *five to eight logical qubits* with quantum error correction and *six nines of qubit fidelity for logical qubits* later this year to maintain any sense of technical credibility.

Based on reading some IBM papers, the possibilities would be:

- **6 logical qubits.** If 65 physical qubits are needed for each logical qubit.
- **7 logical qubits.** If 57 physical qubits are needed for each logical qubit.

IBM published a paper in 2019/2020 which contains some formulas for calculating physical qubits per logical qubit for a couple of approaches to quantum error correction. This is where the 57 and 65 numbers came from.

The 2019/2020 paper:

*Topological and subsystem codes on low-degree graphs with flag qubits*
- Chamberland, Zhu, Yoder, Hertzberg, Cross
- https://arxiv.org/abs/1907.09528
- https://journals.aps.org/prx/abstract/10.1103/PhysRevX.10.011022

The IBM researchers evaluated two approaches:

- **Heavy hexagon code.** 57 physical qubits per logical qubit.
- **Heavy square code.** 65 physical qubits per logical qubit.
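Dividing Osprey’s 433 physical qubits by those per-logical-qubit costs gives the counts above; a trivial sketch, ignoring any additional routing or ancilla overhead beyond the code itself:

```python
def logical_qubits(physical_qubits: int, physical_per_logical: int) -> int:
    """Logical qubits that fit in a physical qubit budget, ignoring any
    overhead beyond the error-correcting code itself."""
    return physical_qubits // physical_per_logical

print(logical_qubits(433, 57))  # 7 -- heavy hexagon code
print(logical_qubits(433, 65))  # 6 -- heavy square code
```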

Failure to demonstrate even a small handful of perfect logical qubits on Osprey will send shock waves through the field. It may not be enough to trigger a full Quantum Winter by itself, but it would be enough to put everyone on edge so that even some minor additional setbacks, failed advances, or delayed advances could be the straw that breaks the camel’s back and kicks off a deep Quantum Winter.

For more on *quantum error correction*, *logical qubits*, and *fault-tolerant quantum computing*, see my paper:

*Preliminary Thoughts on Fault-Tolerant Quantum Computing, Quantum Error Correction, and Logical Qubits*
- https://jackkrupansky.medium.com/preliminary-thoughts-on-fault-tolerant-quantum-computing-quantum-error-correction-and-logical-1f9e3f122e71

# Do I expect a demonstration of quantum error correction in Osprey? No, but…

Even though I don’t believe that there is a high likelihood of a demonstration of quantum error correction in Osprey, there may still be some level of gossip and rumor which might lead others to *expect* such a demonstration.

# Maybe a demonstration of quantum error correction in a subsequent revision to Osprey? Possibly, but…

Even though the initial release of Osprey may not have any support for or a demonstration of quantum error correction, it’s very possible that a subsequent revision to Osprey might offer such a demonstration.

I can’t discount that possibility, but I can’t confirm it or suggest that it is likely either.

And IBM has made no such commitment or even dropped the slightest hint.

# Is the credibility of quantum error correction at stake with Osprey?

IBM and others have been talking up a storm about the potential for *quantum error correction* (QEC) for quite a few years now with nothing to show for it. They’ve been able to get away with this simply because there were no quantum computers with enough physical qubits to implement even two logical qubits. But now with Osprey they will have enough physical qubits to implement a small handful of logical qubits. That's not enough to do anything useful, but enough to at least demonstrate several functional and interacting logical qubits, maybe six or seven.

If even Osprey isn’t good enough to demonstrate a few logical qubits, then what will be enough?

If Osprey is unable to demonstrate even a few logical qubits, IBM will have a lot of explaining to do — or suffer the consequences of a loss of technical credibility.

That said, IBM didn’t commit, promise, suggest, or even hint that Osprey might support any logical qubits, and didn’t even do that for the 1,121-qubit Condor due out in 2023, so a reasonable case can be made that IBM doesn’t have their reputation at risk per se.

Still, I suspect that a lot of people are getting tired of hearing about quantum error correction as always being beyond the horizon. Seeing actual hardware with more than enough physical qubits to support quantum error correction, but without even a hint of trying to support it, might be the straw that breaks the camel’s back of confidence in IBM.

Or maybe it doesn’t completely break the camel’s back, but simply takes enough of the wind out of the sails of enthusiasm for quantum computing — and for IBM — that momentum is severely damaged even if raw technical credibility remains mostly intact.

Failure to demonstrate at least some support for quantum error correction in Osprey won’t likely be the primary trigger for commencing a slide down into a Quantum Winter, but it could sure grease the skids.

The only way out that I can see for IBM if they aren’t going to demonstrate quantum error correction in Osprey is to at least announce a fairly detailed roadmap for how they will progress towards full support for quantum error correction in some future quantum computers.

# IBM technical credibility will suffer if they neither show progress with quantum error correction nor progress towards near-perfect qubits

IBM’s technical credibility may also hinge on any progress with qubit fidelity — if they are getting close enough to near-perfect qubits (three to four nines of reliability) so that most people won’t even need quantum error correction, then IBM can get a free pass on lack of progress towards quantum error correction.

On the flip side, if IBM is not able to demonstrate a significant improvement in qubit fidelity in Osprey, then the pressure will be on to show real progress with quantum error correction.

Either alternative can work. Doing neither will cost IBM dearly in terms of technical credibility.

# What Quantum Volume has IBM committed for Osprey?

IBM has not committed to any particular Quantum Volume (QV) for Osprey.

IBM has not given any guidance whatsoever.

# What Quantum Volume is likely in Osprey?

A Quantum Volume (QV) comparable to Eagle at 64 is most likely.

A QV a little better, at 128, is very possible.

A QV of even 256 might be possible.

It all depends on whether qubit fidelity is much better than Eagle.

And with mediocre qubit connectivity it may be very difficult to get beyond a QV of 256.

Expecting a QV of even 512 may be unrealistic given relatively low expectations for qubit fidelity and qubit connectivity.

# What Quantum Volume does Osprey need to inspire confidence?

Anything less than a Quantum Volume (QV) of 256 would be seen as a severe disappointment.

Even a QV of 512 or 1024 would be seen as rather disappointing compared to Honeywell achieving 2048.

A QV of 2048 would at least make Osprey seem competitive with trapped-ion qubits.

A QV of 4096 or higher would more clearly inspire confidence, but I don’t see that in the cards due to relatively low expectations for qubit fidelity and qubit connectivity.

# Quantum Volume cannot use more than 50 qubits (and maybe only 40 or even 32 qubits)

Note that Quantum Volume tops out at roughly 2⁵⁰, or more likely 2⁴⁰ or even only 2³², since measurement of Quantum Volume requires simulation of the quantum circuit being tested and simulators can’t handle more than 50 qubits, and even 40 or 32 qubits can be problematic.

A quantum computer can have more than 50 or 40 or 32 qubits, but only 50 or 40 or 32 can be used at a time for measuring Quantum Volume.
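The underlying constraint is memory: a full statevector simulation stores 2^n complex amplitudes. A quick sketch, assuming 16 bytes per amplitude (two float64 values):

```python
def statevector_bytes(qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory to hold a full statevector: 2**n complex amplitudes,
    16 bytes each (two float64 values)."""
    return (2 ** qubits) * bytes_per_amplitude

for n in (32, 40, 50):
    print(n, statevector_bytes(n) // 2 ** 30, "GiB")
# 32 -> 64 GiB, 40 -> 16384 GiB (16 TiB), 50 -> 16777216 GiB (16 PiB)
```

Each added qubit doubles the memory requirement, which is why even 40 qubits strains most classical clusters.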

For more details on this limit, see my paper:

*Why Is IBM’s Notion of Quantum Volume Only Valid up to About 50 Qubits?*
- https://jackkrupansky.medium.com/why-is-ibms-notion-of-quantum-volume-only-valid-up-to-about-50-qubits-7a780453e32c

# Topology and connectivity of Osprey’s 433 qubits

Although IBM hasn’t committed or explicitly stated what the qubit topology and connectivity will be for Osprey’s 433 qubits, it’s a fairly safe bet that it will be an extrapolation of Eagle’s 127 qubits.

Essentially, the topology will look like a brick wall with layers or rows of staggered bricks, but with more rows, with more bricks in each row, similar to the layout of Eagle.

For details of the actual topology of qubits in Eagle, see my paper:

*Preliminary Thoughts on the IBM 127-qubit Eagle Quantum Computer*
- https://jackkrupansky.medium.com/preliminary-thoughts-on-the-ibm-127-qubit-eagle-quantum-computer-e3b1ea7695a3

Again, this is all mere speculation about Osprey due to lack of detail from IBM.

# Will Osprey be modular?

Eagle wasn’t modular. All qubits are on a single chip, one module.

Rigetti’s new 80-qubit processor is modular, based on their 40-qubit quantum computer module.

Might Osprey be modular? Sure, it’s possible, but IBM has given no hint that it might be, so my presumption is that it won’t be modular.

# IBM Quantum System Two

An IBM quantum computer can be thought of as two *parts*:

- **The quantum processor chip.** The heart and brain of the quantum computer.
- **The overall quantum system.** The mechanical, packaging, cryogenics, and classical electronics needed to support the operation of the quantum processor chip.

Technically, none of the details of the overall quantum system should have any effect on what a quantum algorithm designer or a quantum application developer can design and develop, except as overall system improvements might have some impact on technical feasibility of technical features of the quantum processor chip.

The *quantum processor chip* typically has a name, such as:

- **Falcon.**
- **Hummingbird.**
- **Eagle.**
- **Osprey.**

The *overall quantum system* is common across quantum processor chips, although some chips may require more advanced quantum systems.

At present, IBM has a single overall quantum system:

- **IBM Quantum System One.** Supports Falcon, Hummingbird, and Eagle chips.

Beginning with Osprey, IBM will be introducing a new overall quantum system:

- **IBM Quantum System Two.**

Actually, IBM Quantum System Two was announced at the same time as Eagle, back in November 2021:

**IBM Unveils Breakthrough 127-Qubit Quantum Processor**
- *Delivers 127 qubits on a single IBM quantum processor for the first time with breakthrough packaging technology*
- *New processor furthers IBM’s industry-leading roadmaps for advancing the performance of its quantum systems*
- *Previews design for IBM Quantum System Two, a next generation quantum system to house future quantum processors*
- November 16, 2021
- https://newsroom.ibm.com/2021-11-16-IBM-Unveils-Breakthrough-127-Qubit-Quantum-Processor

The technical details of *IBM Quantum System Two* are currently only sketchy — as per the IBM press release:

- **Modular architecture.** Control hardware has the flexibility and resources necessary to scale.
- **A new generation of scalable qubit control electronics.**
- **Higher-density cryogenic components and cabling.**
- **A new cryogenic platform.** Based on a new cryogenic dilution refrigerator. Designed in conjunction with Bluefors, featuring a novel, innovative structural design to maximize space for the support hardware required by larger processors while ensuring that engineers can easily access and service the hardware.
- **The possibility to provide a larger shared cryogenic work-space.** Ultimately leading to the potential linking of multiple quantum processors.

But none of the details of the IBM Quantum System Two should have any effect on what a quantum algorithm designer or a quantum application developer can design and develop, except as overall system improvements might have some impact on technical feasibility of technical features of the quantum processor chip.

# Will the new dilution refrigerator offer any functional advantage to quantum applications?

As noted above, the new IBM Quantum System Two which will debut with Osprey will include a new *cryogenic dilution refrigerator*. The question is whether this new system component is simply cheaper and more efficient but otherwise functionally identical to the old dilution refrigerator. Will the overall system simply operate more efficiently, or will quantum application developers see any functional benefit to applications?

Some questions I have:

- **Will the new refrigerator have better shielding or otherwise reduce environmental interference so that there is a net improvement in qubit fidelity?**
- **Will the temperature be more stable and result in more consistent results?**
- **Will the refrigerator be substantially cheaper?**
- **Will the refrigerator be substantially cheaper to operate?** Less electrical power? Less loss of refrigerant?
- **Might the new refrigerator have engineering tradeoffs which have a negative impact on qubit fidelity, such as less shielding to reduce cost, or cheaper components to increase qubit count but with lower qubit fidelity?**
- **Will quantum algorithm designers or quantum application developers have to adjust their algorithms to work effectively with the new refrigerator?**
- **Will quantum algorithm designers or quantum application developers be able to take advantage of or otherwise exploit the new refrigerator?**
- **In short, will the new refrigerator be a net gain for quantum algorithms and quantum applications, or a net loss, or a wash, or an uneven mix of gains and losses?**

My hope and expectation would be that the new refrigerator would be completely transparent to the work of quantum algorithm designers and quantum application developers. Or, there would be some net improvements. But certainly no net losses.

# What speed or throughput will Osprey have?

IBM has neither committed nor hinted what speed or throughput (CLOPS) Osprey will have.

All we can do is extrapolate from Eagle, which had a *circuit layer operations per second* (CLOPS) of 850, which seemed low compared to a CLOPS of 1.5K for the 65-qubit Hummingbird and 2K (1.8K to 2.4K) for the 27-qubit Falcon.

So, let’s assume that Osprey will come in at 850 CLOPS. But… don’t hold me to it.
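As a rough illustration of what CLOPS means in practice (my own simplistic model, not IBM’s benchmark methodology), total circuit layers divided by CLOPS gives a wall-time estimate:

```python
def estimated_seconds(circuits: int, layers_per_circuit: int, clops: float) -> float:
    """Rough wall time for a batch of circuits: total layers / CLOPS
    (CLOPS = circuit layer operations per second)."""
    return circuits * layers_per_circuit / clops

# 1,000 circuits of 100 layers each at Eagle's measured 850 CLOPS:
print(round(estimated_seconds(1000, 100, 850), 1))  # 117.6 seconds
```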

# Will Osprey support Qiskit Runtime?

Oddly, Eagle didn’t initially support *Qiskit Runtime* and still doesn’t, even though both Hummingbird and Falcon support it.

IBM has given no explanation for this lack of support for Qiskit Runtime by Eagle.

All we can do is presume that support for Qiskit Runtime will be the same in Osprey as it is in Eagle, which is that it is *not* supported. But… don’t hold me to it.

# Is Osprey simply a bigger Eagle (more qubits)?

To be honest, we just don’t know yet whether Osprey is simply a bigger version of Eagle with the only main difference being more qubits. We do know that components for Osprey are intended to be smaller (miniaturized), and that there will be more qubits, but what we don’t know is whether those qubits will be functionally any different than the qubits of Eagle in terms of the key technical factors that could impact quantum algorithms and quantum applications, as mentioned earlier:

- **Qubit count.** Guaranteed to be different.
- **Qubit fidelity.**
- **Qubit connectivity.**
- **Granularity of phase and probability amplitude.**
- **Coherence time and circuit depth.**
- **Quantum error correction (QEC).**
- **Quantum Volume (QV).**

In short, one of two propositions will be true once Osprey is unveiled later this year:

- **It’s just a larger version of Eagle — more qubits.**
- **It’s functionally somewhat different from Eagle — besides count of qubits.**

Personally, I hope it is different, with significant improvements in all of those key technical factors.

# Concerns from Eagle which may or may not apply to Osprey as well

After carefully reviewing the limited detail available from the announcement of Eagle, I developed this list of concerns, posted in my paper on Eagle in December 2021. Some, many, most, or maybe even all of them may apply to Osprey as well. For the sake of argument, presume that one out of every three or four of the concerns will likely still apply to Osprey.

To be clear, this is the exact, literal list from my Eagle paper. In many cases references to Falcon can be read as references to Eagle and references to Eagle can be read as references to Osprey.

Any references to actual performance of Eagle should be read as speculating and extrapolating on comparable results for Osprey.

Once Osprey is finally announced and becomes available for testing, a fresh list can be developed that is specific to Osprey.

- **No significant benefits to most typical near-term quantum algorithm designers or quantum application developers.** All of the engineering is under the hood where most typical users won’t see it. Low qubit fidelity — no significant improvement from previous processors — precludes using more than 20 or so qubits in a single circuit — which can already be done with a 27-qubit Falcon, so the dramatic increase in qubit count isn’t generally functionally useful for most typical users, at present.
- **No hint of any significant change to the basic core qubit technology.** Despite the dramatic overall engineering redesign, there is no hint that the core qubit technology has changed. Presumably IBM would have touted that if it had been improved.
- **No significant increase in qubit fidelity.** Some 27-qubit Falcon processors are better.
- **No hint of improvement in fine granularity of phase and probability amplitude.** Needed for quantum Fourier transform (QFT) and quantum phase estimation (QPE), as well as for more complex algorithms utilizing quantum amplitude estimation (QAE). Needed for quantum computational chemistry, so no significant advance on this front.
- **No hint of any significant improvement in measurement fidelity.** Sorely needed.
- **No improvement in qubit connectivity.** Same topology. Low qubit fidelity limits use of SWAP networks to simulate connectivity.
- **No significant increase in qubit coherence time.** Many 27-qubit Falcon processors are better, some by a lot.
- **No significant improvement in gate execution time.** The minimum does seem to show significant improvement, but the average is not quite as good as **ibm_hanoi** (27-qubit Falcon), although somewhat better than **ibmq_brooklyn** (65-qubit Hummingbird).
- **No significant increase in circuit depth.** Follows qubit coherence time and gate execution time.
- **No improvement in Quantum Volume (QV).** Measured at only 32 as of December 8, 2021. Very disappointing. Worse than Falcon (64 and 128). Matches 65-qubit Hummingbird. I had hoped for 256.
- **No significant progress in two of the three metrics for progress given by IBM.** Scale increased, but no significant increase in quality (QV) or speed (CLOPS).
- **No support for Qiskit Runtime.** At least not initially, but I presume that will come, eventually.
- **Unlikely to attain any substantial degree of quantum advantage.** Due to limited qubit fidelity and limited connectivity.
- **No documented attempt to implement quantum error correction (QEC) or logical qubits.**
- **Clearly Eagle and IBM are still deep in the pre-commercialization stage of quantum computing, not yet ready to even begin commercialization.** Many questions and issues and much research remain. Not even close to commercialization.
- **No roadmap for enhancements to Eagle.** Other than Osprey and Condor being successors. But I want to know about r2, r3, r4, and r5.

To read my original concerns about Eagle in context, consult my paper on Eagle:

*Preliminary Thoughts on the IBM 127-qubit Eagle Quantum Computer*
- https://jackkrupansky.medium.com/preliminary-thoughts-on-the-ibm-127-qubit-eagle-quantum-computer-e3b1ea7695a3

# Competition from other qubit technologies

Trapped-ion qubits and neutral-atom qubits seem poised to give superconducting transmon qubits such as those from IBM a run for their money.

It will be interesting to see how far these alternative qubit technologies will have advanced over the coming months in contrast to where Osprey will be when it is announced and made available later this year.

There are more exotic qubit technologies under development, but not likely to have any impact this year:

- **Topological qubits.** Microsoft.
- **Silicon spin qubits.** Intel.

Ultimately, comparison of qubit technologies will come down to comparing the big four key technical factors:

- **Higher qubit fidelity.**
- **Greater qubit connectivity.**
- **Finer granularity of phase and probability amplitude.**
- **Greater circuit depth.**

(Assuming the technology supports a sufficient number of qubits as needed by common applications.)

As well as the degree to which nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) are supported.
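For a sense of what “fine granularity of phase” demands: a textbook n-qubit quantum Fourier transform uses controlled-phase rotations as small as 2π/2ⁿ, so the hardware must resolve exponentially fine angles as circuits grow. A back-of-envelope sketch (the qubit counts are illustrative, not an IBM specification):

```python
import math


def smallest_qft_rotation(n_qubits: int) -> float:
    """Smallest controlled-phase angle (in radians) in a textbook
    n-qubit quantum Fourier transform: 2*pi / 2**n."""
    return 2 * math.pi / 2 ** n_qubits


for n in (4, 10, 20):
    print(f"{n:2d}-qubit QFT: finest rotation = {smallest_qft_rotation(n):.2e} rad")
```

At 20 qubits the finest rotation is already below ten microradians, which is why nontrivial QFT and QPE are such a demanding test of analog control precision.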

# IBM quantum is still in research mode, not mainline commercial product engineering, yet

Something to keep in mind is that IBM quantum is still a research project at IBM, not a mainstream commercial business unit. The focus remains research, not the development, distribution, deployment, and support of revenue-producing commercial products. The product of their current efforts is research, not revenue.

As such, we need to refrain from judging IBM’s quantum efforts as if they were commercial products.

If Eagle and Osprey are necessary stepping stones for *research* in quantum computing, that’s fine. We shouldn’t judge them *as if* they were true commercial products. Except… to the degree that IBM and others may talk as if they were commercial products. But we should continually remind everyone that they are not commercial products.

# When might IBM transition their quantum efforts from research to commercial product development?

I don’t think anybody does or can know how many more years of research might be required before IBM (or anybody else) finally stumbles on the right formula for a *practical quantum computer* capable of supporting *production-scale practical real-world quantum applications*.

Some possibilities:

- **Two years.** Very unlikely.
- **Three years.** Still very unlikely.
- **Four years.** Unlikely. But possible.
- **Five years.** Possibly. A fair bet.
- **Seven years.** More likely.
- **Ten years.** Probably.

# Unfortunately, parts of IBM are acting as if quantum was a commercial product and engaging in premature commercialization

IBM’s big push for *Quantum Ready* is clearly an effort at *premature commercialization*. IBM and everyone else should let the research get much more settled before attempting to treat the technology as if it were a commercial product.

The main focus right now needs to be on what I call *pre-commercialization*, focused on research, prototyping, and experimentation, quite a long way from commercialization.

For more on pre-commercialization, see my paper:

*Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization* - https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a

And for more on the risk of premature commercialization and the need to focus on research, see my paper:

*Prescription for Advancing Quantum Computing Much More Rapidly: Hold Off on Commercialization but Double Down on Pre-commercialization* - https://jackkrupansky.medium.com/prescription-for-advancing-quantum-computing-much-more-rapidly-hold-off-on-commercialization-but-28d1128166a

# Research orientation of Osprey reemphasizes that quantum computing is still in the pre-commercialization stage, still well short of being ready for commercialization

Overall, Osprey just feels like a research experiment, which is what it is. And that’s actually okay since that’s what we should expect in the *pre-commercialization stage of quantum computing*, long before the technology is ready for true commercialization.

IBM does some research, and then the community prototypes and experiments with applications using the results of that research, providing IBM with feedback to improve the next cycle of research. Rinse and repeat. This is a perfectly sane and responsible and productive process — provided that nobody gets the misguided idea that these research results are viable commercial products.

For more on *pre-commercialization*, see my paper:

*Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization* - https://jackkrupansky.medium.com/model-for-pre-commercialization-required-before-quantum-computing-is-ready-for-commercialization-689651c7398a

# Still a mere laboratory curiosity

As with the rest of quantum computing, Osprey will likely be still at the stage of being *a mere laboratory curiosity*, not even close to being ready for development and deployment of production-scale practical real-world quantum applications.

Much research is still required. Many technical issues remain to be resolved.

Granted, as a laboratory curiosity it is indeed quite appropriate to prototype systems and to experiment with quantum algorithms and quantum applications.

But prototyping and experimentation should not be confused with product engineering and development and deployment of production-scale practical real-world quantum applications.

Being a mere laboratory curiosity is fine for where we are today, focused on prototyping and experimentation, but we run the risk of slipping into a Quantum Winter if we’re still at this stage of being a mere laboratory curiosity two to three years from now.

For more discussion of quantum computing being a mere laboratory curiosity, see my paper:

*When Will Quantum Computing Advance Beyond Mere Laboratory Curiosity?* - https://jackkrupansky.medium.com/when-will-quantum-computing-advance-beyond-mere-laboratory-curiosity-2e1b88329136

# Still more appropriate for the lunatic fringe rather than mainstream application developers

The *lunatic fringe* are those super-elite technical staff who are capable and interested in working with a new technology regardless of whether the technology is ready for commercial deployment. As with the rest of quantum computing, Osprey will still be at the stage where its primary appeal is to *the lunatic fringe* rather than to mainstream application developers.

This is okay for where we are today, but two to three years from now it will be necessary to cater to mainstream application developers rather than the lunatic fringe.

Quantum computing runs the rising risk of falling into a Quantum Winter if it still only appeals to the lunatic fringe two to three years from now.

For more on the lunatic fringe, see my paper:

*When Will Quantum Computing Be Ready to Move Beyond the Lunatic Fringe?* - https://jackkrupansky.medium.com/when-will-quantum-computing-be-ready-to-move-beyond-the-lunatic-fringe-27cc8ddd776e

# Risk for Quantum Winter

Disappointment over Osprey alone is unlikely to trigger the onset of a Quantum Winter by itself, but it could end up being a contributing factor and help to set the stage.

Disappointment over Osprey could help to set the stage for a Quantum Winter, and disappointment over Condor a year later could then become the straw that breaks the camel’s back, triggering the Quantum Winter.

Of course IBM is not the only game in town — other vendors might leapfrog ahead of IBM, so that even if IBM were to have its own Quantum Winter, the overall quantum computing field could still continue in a thriving and vibrant Quantum Summer. Could. How things turn out remains to be seen.

The main technical factors driving whether people become excited or disappointed in Osprey are:

- **Qubit fidelity.** Including qubit measurement fidelity. How close to near-perfect qubits.
- **Qubit connectivity.** Something better than nearest neighbor.
- **Granularity of phase and probability amplitude.** Support for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).
- **Coherence time and circuit depth.**
- **Quantum error correction (QEC).** Some significant sense of progress.
- **Quantum Volume (QV).** Hopefully much better than Eagle, and Falcon as well.
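On the Quantum Volume point, the metric relates directly to usable circuit size: QV = 2ⁿ, where n is the largest n-qubit, n-layer “square” random circuit the machine can pass. A simple arithmetic sketch of that published definition shows why a QV of 32 is so underwhelming against a hope of 256:

```python
import math


def qv_to_square_size(quantum_volume: int) -> int:
    """Quantum Volume is defined as QV = 2**n, where n is the largest
    n-qubit, n-layer square random circuit the machine passes.
    Return that effective square size n."""
    return int(math.log2(quantum_volume))


print(qv_to_square_size(32))   # Eagle's measured QV -> effectively 5 qubits
print(qv_to_square_size(256))  # the hoped-for QV -> effectively 8 qubits
```

In other words, a 127-qubit (or 433-qubit) machine with QV 32 can only be trusted on square circuits five qubits wide and five layers deep, regardless of how many physical qubits sit on the chip.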

For more on Quantum Winter, see my paper:

*Risk Is Rising for a Quantum Winter for Quantum Computing in Two to Three Years* - https://jackkrupansky.medium.com/risk-is-rising-for-a-quantum-winter-for-quantum-computing-in-two-to-three-years-70b3ba974eca

# What can we expect from future revisions of Osprey?

At this stage, given how speculative expectations for the initial revision of Osprey are, I hesitate to guess what an Osprey 1.1, 1.2, or 2.1 or 2.2 might look like, other than gradual incremental improvements.

And of course we would like to see incremental improvements in all of the Big Four key technical factors:

- **Higher qubit fidelity.**
- **Greater qubit connectivity.**
- **Finer granularity of phase and probability amplitude.**
- **Greater circuit depth.**

Such improvements would drive improvements in Quantum Volume (QV) and support for nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).

# I’d much rather see an upgraded 27-qubit Falcon than a 433-qubit Osprey

A 27-qubit Falcon with enhanced connectivity and another 1.5 nines of qubit fidelity would be much more beneficial for quantum algorithm designers and quantum application developers than whatever Osprey might deliver.

The extra qubits of 65-qubit Hummingbird and 127-qubit Eagle were distractions rather than substantial advances for quantum algorithm designers and quantum application developers. The extra qubits were essentially useless since algorithm complexity was severely limited by low qubit fidelity and weak qubit connectivity.

Full any-to-any qubit connectivity would be a huge win for algorithm complexity. That may be too much to ask, but any significant improvement in qubit connectivity would still be a big win.

Even if 1.5 nines of additional qubit fidelity is beyond reach, almost any increase in qubit fidelity would be a huge win.
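To make “1.5 nines” concrete: nines of fidelity is just the negative log10 of the error rate, so adding 1.5 nines to a two-nines (99%) qubit yields roughly 99.97% fidelity. A minimal sketch with illustrative numbers, not measured device data:

```python
import math


def fidelity_to_nines(f: float) -> float:
    """Nines of fidelity: -log10 of the error rate (1 - f)."""
    return -math.log10(1.0 - f)


def nines_to_fidelity(nines: float) -> float:
    """Inverse: convert nines back to a fidelity."""
    return 1.0 - 10.0 ** (-nines)


base = 0.99  # two nines, an illustrative starting point
improved = nines_to_fidelity(fidelity_to_nines(base) + 1.5)
print(f"{base:.4f} -> {improved:.6f}")  # ~0.999684
```

The payoff compounds: since circuit fidelity decays roughly as per-gate fidelity raised to the gate count, that seemingly modest jump multiplies the usable circuit size dramatically.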

But simply offering more of the same old noisy and weakly-connected qubits is a great loss, not a win by any measure.

I’d like to see IBM perfect 27-qubit Falcon — in a scalable manner — before pursuing a lot more mediocre qubits.

And then scaling an upgraded Falcon to 32, 36, 40, 44, and 48 qubits with that enhanced connectivity and higher qubit fidelity would similarly be a huge win relative to the distraction of simply adding more noisy and weakly-connected qubits.

Once IBM gets to a 48-qubit quantum computer with high qubit fidelity and full or near-full qubit connectivity, as well as finer granularity of phase and probability amplitude, then and only then will much higher qubit counts be warranted or even useful.

# My original proposal for this topic

For reference, here is the original proposal I had for this topic. It may have some value for some people wanting a more concise summary of this paper.

**Speculative preview of the IBM 433-qubit Osprey.** This is less a preview than a wish list for what I hope that IBM will deliver later this year (2022). I would dearly love to know what capabilities the upcoming IBM 433-qubit Osprey quantum processor will deliver later this year, but other than 433 qubits, we simply don’t know. I would hope that it has improved qubit fidelity, but we just don’t know at this stage. We’ll have to wait and see. The recent 127-qubit Eagle didn’t deliver improved qubit fidelity, so I’m primed for further disappointment, but still hopeful. I would hope that it has some improvement to qubit connectivity, but IBM hasn’t hinted at any, so I’m not holding my breath. But I do hope they at least deliver a roadmap and set expectations for qubit connectivity improvements in future years and future processors. We’ll simply have to wait until November to see.

# Summary and conclusions

- **Osprey is poised for disappointment.** Numerous technical obstacles.
- **Plenty of hope for Osprey.** But hope is not a plan or certainty.
- **IBM needs to up their game to avoid disaster.** Technical obstacles can be overcome, but only through much more serious effort.
- **Overall, Osprey is just an upsized Eagle.** More qubits, miniaturized, but individually not much better functionally.
- **My overall disappointment with the IBM 127-qubit Eagle.** No improvement in qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
- **My overall disappointment with the IBM hardware roadmap.** No milestones or detail for quantum error correction (QEC), qubit fidelity, qubit connectivity, granularity of phase and probability amplitude, or circuit depth.
- **Low qubit fidelity and weak qubit connectivity.** No dramatic improvement over Eagle.
- **Mediocre qubit measurement fidelity.** Same issues as with Eagle. A critical weakness.
- **Still an impressive engineering achievement under the hood.** But most of that internal engineering effort doesn’t affect fidelity or performance from the perspective of a quantum algorithm designer or quantum application developer.
- **Maybe enough improvement to not be an absolute flop, but disappointing enough to be a relative flop.**
- **IBM needs much better messaging.** Needs to set expectations more accurately.
- **Osprey probably needs near-perfect qubits to inspire any significant excitement.**
- **Only four details we know for sure about Osprey…**
  - **Osprey will have 433 qubits.**
  - **Key advancement of Osprey will be miniaturization of components.**
  - **Osprey will be based on the new quantum hardware infrastructure of the IBM Quantum System Two.**
  - **The IBM Quantum System Two incorporates a new cryogenic refrigerator from Bluefors.**
- **IBM hasn’t committed to any improvement in qubit fidelity.**
- **IBM hasn’t committed to any improvement in granularity of phase and probability amplitude.**
- **Fine granularity of phase is needed to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE).** Essential for advanced applications such as quantum computational chemistry.
- **IBM hasn’t committed to any improvement in coherence time and circuit depth.**
- **IBM hasn’t committed to any advances in quantum error correction (QEC) in Osprey.**
- **But I would hope that IBM would demonstrate at least a small handful of perfect logical qubits in Osprey.** Maybe six or seven.
- **IBM hasn’t committed to any improvement in Quantum Volume (QV).**
- **Advances or lack of advances in Osprey will confirm, set, or undermine IBM’s technical credibility.**
- **Research orientation of Osprey reemphasizes that quantum computing is still in the pre-commercialization stage, still well short of being ready for commercialization.**
- **Still a mere laboratory curiosity.** Not ready for production-scale practical real-world quantum applications.
- **Still more appropriate for the lunatic fringe rather than mainstream application developers.**
- **IBM needs to announce preliminary benchmarking results when Osprey is formally announced and made available.** Including Quantum Volume (QV) and qubit fidelity.
- **IBM needs to provide full technical documentation and technical specifications when Osprey is formally announced and made available.** Including a *Principles of Operation* document which details the programming model.
- **I’d much rather see an upgraded 27-qubit Falcon than a 433-qubit Osprey.** A 27-qubit Falcon with enhanced connectivity and another 1.5 nines of qubit fidelity would be much more beneficial for quantum algorithm designers and quantum application developers than whatever Osprey might deliver. Hummingbird and Eagle were distractions rather than substantial advances for quantum algorithm designers and quantum application developers.
- **My hope is that this paper might help to cajole IBM into doing a better job of setting expectations for Osprey and to do a better job of detailing milestones for specific technical factors in their hardware roadmap going forward.**

For more of my writing: **List of My Papers on Quantum Computing**.