What Single Advance in Quantum Computing Is Most Needed in the Near Future?

Jack Krupansky
35 min read · Feb 23, 2022

There are many technical obstacles to enabling production-scale practical real-world quantum applications, but which singular advance in capability would deliver the greatest benefit in the near future — the next six to twelve months or so? There are so many advances which are needed, but which would enable the greatest leap in progress for quantum algorithm designers and quantum application developers? This informal paper will explore the possibilities.

And the answer is…

  • Higher qubit fidelity

But why that particular advance and not others? This informal paper will elaborate my thinking about selecting that advance.

Topics covered in this paper:

  1. Near future or near term — the next six months to a year or so
  2. In a nutshell
  3. Overview
  4. Higher qubit fidelity is the most needed advance in quantum computing in the near term
  5. Personal historical perspective
  6. Advance vs. capability or feature or characteristic
  7. Urgent need, importance, and priority
  8. Criteria for selecting the single most important near-term advance in quantum computing
  9. Different audiences may have different needs and priorities
  10. The audience focus here is on quantum algorithm designers and quantum application developers
  11. Researchers as an audience: what do they need in the near-term?
  12. What is a near-perfect qubit?
  13. What are nines of qubit fidelity?
  14. Qubit fidelity includes coherence, gate errors, and measurement errors
  15. Higher qubit fidelity not likely to achieve near-perfect qubits in the near term
  16. Coherence time will limit the degree to which SWAP networks can be used to simulate connectivity
  17. Two paths to greater circuit depth — longer coherence time or faster gate execution time
  18. The IBM 127-qubit Eagle didn’t address the qubit fidelity issue
  19. Preview of the IBM 433-qubit Osprey
  20. Further improvement to qubit fidelity in the IBM 27-qubit Falcon?
  21. Is 27 qubits the best we can do for the near term?
  22. Where are all of the 40-qubit algorithms?
  23. Need for automatically scalable quantum algorithms
  24. Limited connectivity is more of an absolute barrier — all or nothing, incremental advances are not really possible
  25. Advances not likely in the near term
  26. Quantum error correction (QEC) is a critical priority, but not in the near term, except for research
  27. Advances and capabilities not considered as critical gating or limiting factors in the near term
  28. An alternative to the Quantum Volume metric is not essential for the near term
  29. Advances in fine granularity of phase and probability amplitude not so likely in the near term
  30. A more advanced quantum programming model is not likely in the near term
  31. Advances in qubit fidelity have the added benefit of enabling other advances
  32. Many advances will eventually bump into limitations in other capabilities
  33. Ordering of advances — not so easy to predict or plan
  34. Possible that some qubit technologies might do better than others in the near term
  35. Limiting and critical gating factors may be algorithm-specific or application-specific
  36. Special needs for variational methods
  37. More capable simulators really are needed, but…
  38. Use simulation to find limits for benefits
  39. Any dispute as to the most urgent advance?
  40. What would be the second most needed advance?
  41. My original proposal for this topic
  42. Summary and conclusions

Near future or near term — the next six months to a year or so

When this paper refers to the near future or near term, it is generally a reference to the next year or so. Maybe a little less, but not much more. It’s definitely not a reference to the 18-month or two-year timeframe. And it’s definitely looking out past what is currently available in existing real quantum computers.

Technically, the next few months are the near term as well, but in the context of this paper I don’t expect major advances over just a few months.

Maybe six to nine months as the lower bound for near term.

And maybe just another few months beyond that — one or two or three — as the upper bound for near term.

But thinking of the near term as roughly a year or so is a close-enough approximation of the intended meaning of near term in the context of this paper.

In a nutshell

The goal of this paper is to identify the single most important advance needed for quantum computing in the near future. What advance is most urgent and deserves the highest priority?

Some of the more obvious candidate advances to be considered are:

  1. More qubits.
  2. Higher qubit fidelity.
  3. Greater qubit connectivity.
  4. Longer coherence time.
  5. Faster gate execution time.
  6. Greater circuit depth.
  7. Finer granularity for phase and probability amplitude.
  8. Support for nontrivial quantum Fourier transform and quantum phase estimation.
  9. Richer collection of algorithmic building blocks.
  10. More capable simulators.

To make a long story short, the single advance that appears to be most important, most urgent, and most deserving of the highest priority is:

  • Higher qubit fidelity.

Other advances such as connectivity and coherence time are vitally important, but qubit fidelity really stands out as the single most beneficial advance, in the near term.

And coherence time and circuit depth won’t matter if qubit fidelity is too low to deliver meaningful results.

Some of the criteria which make it the most beneficial advance are:

  1. Urgent need. People are struggling without it.
  2. Technical benefit is very high.
  3. Delivers the most bang for the buck.
  4. Applies to all algorithms and applications.
  5. Doesn’t rely on other advances to get started or to make progress.
  6. Incremental progress is possible.
  7. Enables other advances.
  8. It would give the field a boost in momentum.

It’s worth noting that different audiences may have different needs and priorities relative to each possible advance. This paper focuses on quantum algorithm designers in general as well as quantum application developers in general. Specific niche categories may have different needs and priorities than discussed here.

Caveat: Higher qubit fidelity does not necessarily mean that near-perfect qubits will be achieved in the near term, just incrementally better qubit fidelity. This will be discussed more in a subsequent section.

Where are all of the 40-qubit algorithms? There are a number of limiting factors, with low qubit fidelity being at the top of the list. Higher qubit fidelity alone may not be enough to open the floodgates for 40-qubit algorithms, but it’s the top priority step to take.

Incidentally, the single advance which is most likely to actually happen over the next year is also the least beneficial:

  • More qubits.

We’ve progressed to the stage where more qubits without higher qubit fidelity and improved connectivity are of negligible value. Sure, we definitely need more qubits for the longer term, including for quantum error correction (QEC), but they’re not as high a priority for the near term as higher qubit fidelity.

Overview

In a recent informal paper I explored the full range of technical advances which are needed over the next two years for quantum computing to remain on track to enabling production-scale practical real-world quantum applications. It’s a rather long list, even in summary form. Read it here:

Just to give a flavor, here are some of the obvious advances needed, not in any priority order per se:

  1. More qubits.
  2. Higher qubit fidelity.
  3. Higher gate fidelity.
  4. Higher qubit measurement fidelity.
  5. Near-perfect qubits.
  6. Greater qubit connectivity.
  7. Longer coherence time.
  8. Faster gate execution time.
  9. Greater circuit depth.
  10. Finer granularity for phase and probability amplitude.
  11. Support for nontrivial quantum Fourier transform and quantum phase estimation.
  12. Richer collection of algorithmic building blocks.
  13. Scalable quantum algorithms.
  14. Algorithms using 32 to 40 qubits.
  15. More capable simulators.
  16. New qubit technologies.
  17. More advanced qubit technologies.
  18. Modular quantum processor architectures.
  19. Research for advances two to five years from now.

That’s a rather tall order, even for two years. And that’s just a sampler. Where to start?!

Unfortunately, I suspect that the correct answer to the headline question is none of the above or maybe all of the above. Really. Seriously.

Or, I can cheat as I did on my Christmas and New Year wish list and select support for nontrivial quantum Fourier transform and quantum phase estimation — which is a cheat since that indirectly requires virtually all of the preceding advances on the list.

If you twisted my arm and forced me to pick just one — and it couldn’t be the umbrella advance of support for quantum Fourier transform and quantum phase estimation — I suppose it would have to be higher qubit fidelity since virtually everything relies on it — all algorithms and all applications.

Another possible and important criterion is whether an advance enables other advances.

One criterion which leads to my selection of higher qubit fidelity is simply that it’s a place to start — it doesn’t rely on any other advances.

Another criterion leading to the selection of higher qubit fidelity is that a wide range of quantum algorithms and quantum applications could immediately use it without any change to the algorithms or application source code. That contrasts with greater qubit connectivity, which may be a greater priority overall but would require significant rework and fresh algorithm design and fresh application development in order to even moderately exploit it. Ditto for my preference for quantum Fourier transform support — existing algorithms and applications are not set up to just drop it in without major design rework.

Overall, higher qubit fidelity would deliver the most bang for the buck.

Those are just a few of the possible criteria. This paper will explore other criteria as well.

Caveat: Higher qubit fidelity does not necessarily mean that near-perfect qubits will be achieved in the near term, just incrementally better qubit fidelity. This will be discussed more in a subsequent section.

Another approach to the headline topic question is to simply settle on a short list of top candidates with the assurance that focusing on any of them in any order in the near future will deliver significant value in the long slow march to enabling production-scale practical real-world quantum applications.

Although coherence time and circuit depth will certainly be critical advances needed soon enough, they won’t really matter until qubit fidelity is high enough for longer quantum circuits to deliver meaningful, low-error results.

Where are all of the 40-qubit algorithms? There are a number of limiting factors, with low qubit fidelity being at the top of the list. Higher qubit fidelity alone may not be enough to open the floodgates for 40-qubit algorithms, but it’s the top priority step to take.

Two significant caveats:

  1. Raw qubit count is not a current limiting factor for most use cases.
  2. Quantum error correction is far over the horizon, so not a near-term priority. Research for two to five years, yes, but not for practical use in the coming months to a year — or even two years.

It’s worth noting that different audiences may have different needs and priorities relative to each possible advance. This paper focuses on quantum algorithm designers in general as well as quantum application developers in general. Specific niche categories may have different needs and priorities than discussed here.

Higher qubit fidelity is the most needed advance in quantum computing in the near term

Just to emphasize that higher qubit fidelity is the most important, most urgent, most deserving, and highest priority advance needed in quantum computing — in the near term.

Personal historical perspective

When I first started digging into quantum computing seriously in 2018, coherence time was the thing everybody talked about. There was little mention of qubit fidelity or gate errors. And no mention of limited connectivity as a problem either.

Actually, in truth, IBM had announced a coherence time of 100 microseconds back in 2012.

And, qubit fidelity was a real issue — actually, gate errors — but not clearly identified as the primary limiting factor.

In truth, there was confusion as to whether low qubit fidelity or coherence time was the key limiting factor, although in hindsight I can confidently say that low qubit fidelity was the key limiting factor.

People were chattering as if a limited qubit count — 20 qubits — was a key limiting factor, but there were few algorithms using more than five to eight qubits anyway.

Now, we have plenty of qubits, but qubit fidelity (gate errors) and limited qubit connectivity are the major obstacles to near-term progress for quantum algorithms.

Actually, I would say that the inability to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) is the key limiting factor to achieving dramatic quantum advantage, such as for quantum computational chemistry.

Coherence time is still out there as an ultimate limiting factor, but qubit fidelity and qubit connectivity will come into play long before we get to coherence time — at least in the very near term.

In all of those years from 2018 right up to the present moment, nobody except me has been talking about how quantum computing is severely limited by lack of fine granularity for phase and probability amplitude and how that limits quantum Fourier transform (QFT) and quantum phase estimation (QPE). Of course, low qubit fidelity and limited connectivity are also key limiting factors as well.

In truth, most quantum algorithms are so small — so they can run on existing real quantum computers — that coherence time and circuit depth and phase granularity are not at the front and center of attention. It will be interesting to see how long it takes before that changes.

I can only speculate what people will be chattering about in terms of priorities a year from now. It does depend in large part on how much progress is made over the next year.

But for now, qubit fidelity does have most of the attention. With limited qubit connectivity looming out there as well.

Advance vs. capability or feature or characteristic

This paper uses advance as a synonym for capability or feature or characteristic. Or, more explicitly, an advance is taken to be either:

  1. The introduction or addition of a new capability or feature or characteristic.
  2. The improvement or enhancement of an existing capability or feature or characteristic.

And a capability or feature or characteristic could refer to:

  1. Function. What it does.
  2. Performance. How fast it does it. Units of processing per unit of time. Or units of time per unit of processing.
  3. Capacity. How large or how many units of information can be processed.

Generally, this paper will refer to any of these concepts using advance as a catchall umbrella term.

Urgent need, importance, and priority

The focus in this paper is on advances which are important and should be assigned a high priority, but ultimately the main focus is on advances for which there is the greatest urgent need.

Indeed, some features may be more important for the longer term, but be a lower priority in the near term. Quantum error correction (QEC) is an example. Or greater connectivity, which is critical for the longer term, but may not be practical in the near term anyway — and may not even be needed for some technologies such as trapped-ion and neutral-atom qubits.

Criteria for selecting the single most important near-term advance in quantum computing

These are some of the criteria which can be used to determine which capabilities might be prime candidates for being the single most needed advance for quantum computing in the near term.

There is no implication that all or most of the criteria must be met. And the criteria are not in any absolute order of importance or priority.

  1. Urgent need. People are struggling without it.
  2. Technical benefit is very high.
  3. Delivers the most bang for the buck.
  4. It’s essential. Real progress is not possible without it.
  5. It would give the field a boost in momentum. Accelerate progress beyond the raw technical benefit of the advance itself.
  6. It’s a place to start. It requires nothing else. Helps to start building momentum.
  7. It’s an easy place to start. It requires little effort to get started.
  8. Doesn’t rely on other advances to get started or to make progress.
  9. It’s easy to implement. Minimal effort to complete.
  10. Incremental progress is possible. It can be implemented and used incrementally. It’s not an all or nothing proposition. Subsets of the full capability are reasonably useful.
  11. Already in progress. Needs to be finished, but risk is low.
  12. Immediate use without any change to algorithms or applications. A wide range of quantum algorithms and quantum applications could immediately use it without any change to the algorithms or application source code.
  13. It applies to a moderate range of algorithms and applications.
  14. It applies to a very wide range of algorithms and applications. Wider range makes it more valuable.
  15. It applies to all algorithms and applications. Universal benefit makes it extremely valuable.
  16. Research is ripe to pursue. No major open issues which might take years to resolve. Completed research is sitting on the shelf and published.
  17. Science and tech is ready. No research is needed. Everything that is needed is sitting on the shelf and published.
  18. Technical feasibility. Can be implemented within a few months to a year.
  19. Feasible in the near term. Everything that it requires is already in place — or will be completed as part of the task.
  20. People are clamoring for it. Satisfies a market demand.
  21. Needed for the two-year horizon. Foundation for the next stage.
  22. It enables other advances.
  23. It enables higher-priority capabilities.
  24. Helps to enable quantum error correction. A longer-term goal, but all of the pieces need to be put in place over an extended period of time.
  25. Helps to enable quantum parallelism. This is the whole point of quantum computing.
  26. Helps to achieve quantum advantage. Also the whole point of quantum computing.
  27. Simplifies quantum algorithm design.
  28. Simplifies quantum application development.
  29. Makes quantum algorithms more efficient.
  30. Makes quantum applications more efficient.

Different audiences may have different needs and priorities

Each audience is entitled to its own perceptions of its needs and its priorities.

Some of the potential audiences for quantum computing:

  1. Researchers. They’re developing the capabilities described in this paper rather than using them. What they need is more money, talent, management support, and time. And access to underlying research.
  2. Algorithm designers.
  3. Application developers.
  4. Business customers.
  5. Science customers.
  6. Engineering customers.
  7. End users.
  8. IT staff.
  9. Management.
  10. Executives.
  11. Each application category.

The audience focus here is on quantum algorithm designers and quantum application developers

This paper focuses on the needs for the audience of quantum algorithm designers as well as quantum application developers — the technical staff who are actually focused on using the technical features of the quantum computer to perform quantum computations.

It’s not that other audiences are not important, but just that quantum computation is the whole reason for the existence of quantum computers, so it makes sense that the audience focus is on technical staff engaged in quantum computation.

Researchers as an audience: what do they need in the near-term?

Although we just indicated that the intended audience focus for this paper is quantum algorithm designers and quantum application developers (the technical staff actually engaged in quantum computation, actually using the technical features of the quantum computer to perform quantum computations), it also makes sense to highlight the needs and priorities of the researchers (and engineers) who are actually doing the research and engineering to develop those technical features.

What do these researchers and engineers need?

  1. Priority on research.
  2. Funding for research. And engineering.
  3. Talent pool for research. And engineering.
  4. Management support.
  5. Time.
  6. Access to underlying research. Building on the work of other researchers. Hopefully not hidden or protected by intellectual property (IP) protections.
  7. Feedback from users, algorithm designers, and application developers to develop better technical capabilities.

What is a near-perfect qubit?

As the term suggests, a near-perfect qubit is not quite perfect and error-free, but is close enough for the needs of the vast majority of quantum algorithms and quantum applications, without the need for the complexity and cost of full-blown (and expensive) quantum error correction (QEC). For more detail, see my paper:

What are nines of qubit fidelity?

Nines of qubit fidelity basically refers to the reliability of a qubit. It’s the number of leading nines in the percentage of reliability. For example:

  1. 90% = one nine.
  2. 95% = 1.5 nines.
  3. 98% = 1.8 nines.
  4. 98.5% = 1.85 nines.
  5. 99% = two nines.
  6. 99.5% = 2.5 nines.
  7. 99.9% = three nines.
  8. 99.95% = 3.5 nines.
  9. 99.99% = four nines.
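
As a small illustration of that convention, here is a minimal Python sketch: count the leading nines of the percentage, then keep the remaining digits as the fractional part. It reproduces the list above; it is just the convention as I use it, not a standard formula.

    def nines(reliability_pct: float) -> float:
        # Count the leading 9s of the percentage, then keep the remaining
        # digits as the fractional part (e.g. 99.95% -> 3 nines + ".5" -> 3.5).
        digits = f"{reliability_pct:.4f}".rstrip("0").rstrip(".").replace(".", "")
        count = 0
        while count < len(digits) and digits[count] == "9":
            count += 1
        rest = digits[count:]
        return float(f"{count}.{rest}") if rest else float(count)

    for pct in (90, 95, 98, 98.5, 99, 99.5, 99.9, 99.95, 99.99):
        print(pct, nines(pct))   # reproduces the list above: 1.0, 1.5, 1.8, 1.85, 2.0, ...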

For more detail on nines of qubit fidelity, see my paper:

Qubit fidelity includes coherence, gate errors, and measurement errors

Qubit fidelity is actually a vague, umbrella term. I use it to mean anything which can impact the reliability or quality of the quantum state of a qubit. This includes:

  1. Coherence. Quantum state can decay or decohere over time, the coherence time.
  2. Single-qubit gate execution errors. Potential for errors executing a quantum logic gate on even a single qubit.
  3. Two-qubit gate execution errors. Potential for errors executing a quantum logic gate on two qubits. This is generally the limiting factor which determines overall qubit fidelity.
  4. Measurement errors. Even measurement of a qubit is not 100% reliable. And it tends to be less reliable than even two-qubit gate execution.
  5. Variations between qubits. Not all qubits in a given quantum processor have the same fidelity. Even different pairs of qubits can have different two-qubit gate execution errors.

Technically, qubit fidelity should be some sort of composite of all of these factors, although listing them all separately can be quite enlightening as well. Both the aggregate and the detail are useful.

For the purposes of this paper, qubit fidelity is taken to be the composite aggregate of all of these factors, as vague and ambiguous as that may be. For simplicity, the average two-qubit gate execution reliability is a reasonable surrogate for overall qubit fidelity.
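
As a minimal sketch of that surrogate, assuming purely illustrative two-qubit gate error rates (not taken from any real device), the composite is just one minus the average error:

    # Illustrative per-pair two-qubit gate error rates -- not from any real backend.
    two_qubit_gate_errors = [0.012, 0.009, 0.015, 0.011, 0.008]

    average_error = sum(two_qubit_gate_errors) / len(two_qubit_gate_errors)
    surrogate_fidelity = 1.0 - average_error   # surrogate for overall qubit fidelity
    print(f"surrogate qubit fidelity: {surrogate_fidelity:.3%}")   # 98.900%, just under two nines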

For more on these distinctions and possible aggregations of them, see my paper on nines of qubit fidelity:

Higher qubit fidelity not likely to achieve near-perfect qubits in the near term

Although I have much higher expectations for achieving near-perfect qubits in the two-year timeframe, I’m not so optimistic for the near term, other than a modest to moderate improvement in qubit fidelity.

I do expect reasonable progress towards the longer-term goal of full near-perfect qubits in the near term. We’ll likely get reasonably close, just not all the way there.

Some possibilities for the near term for qubit fidelity:

  1. Full near-perfect qubit fidelity. A full four nines or better. Rather unlikely in the near term. The primary goal for the two-year timeframe. Good enough for the vast majority of quantum algorithms and quantum applications.
  2. 3.75 nines. May be close enough to near-perfect for many quantum algorithms and quantum applications. But not so likely in the near term. Good enough for a sizable majority of quantum algorithms and quantum applications.
  3. 3.50 nines. Perfectly reasonable goal for the near term. May be the best to hope for in the near term. Reasonably acceptable minimum achievement for the two-year timeframe. Good enough for many or even most quantum algorithms and quantum applications.
  4. 3.25 nines. Reasonable goal for the near term. Marginally acceptable achievement for the two-year timeframe. Good enough for a significant fraction of quantum algorithms and quantum applications.
  5. Three nines. Marginally reasonable goal for the near term. Bare minimum achievement for the two-year timeframe. Good enough for some quantum algorithms and quantum applications.
  6. 2.75 nines. Minimal reasonable goal for the near term. Disappointing achievement for the two-year timeframe. Good enough for some niche quantum algorithms and quantum applications.
  7. 2.50 nines. Disappointing achievement for the near term. Dismal failure for the two-year timeframe. Generally not good enough for any quantum algorithms and quantum applications.
  8. 2.25 nines. Only acceptable as a stepping stone, a milestone on the path to higher qubit fidelity. But not really usable in any meaningful manner.
  9. Under two nines. Fairly dismal failure, even in the near term.

So, I’d hope for 3.5 nines in the near term and settle for three to 3.25 nines. I’d only settle for 2.75 nines if that’s the best that the hardware guys can deliver — and if clever algorithm designers and application developers can work around or mitigate the remaining errors.
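
To see roughly what each rung of that ladder buys, here is a crude sketch assuming independent gate errors, so that the expected circuit success rate is roughly fidelity raised to the number of two-qubit gates. The 50% cutoff is arbitrary and purely illustrative:

    import math

    # Roughly how many two-qubit gates fit in a circuit before the expected
    # success rate (fidelity ** gates, assuming independent errors) drops below 50%.
    def max_gates(fidelity: float, target_success: float = 0.5) -> int:
        return int(math.log(target_success) / math.log(fidelity))

    for label, fidelity in (("two nines", 0.99), ("three nines", 0.999), ("four nines", 0.9999)):
        print(label, max_gates(fidelity))   # ~68, ~692, and ~6931 gates respectively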

Anything less and I’d consider the field to have failed in the near term and likely to get stuck in at least a mini Quantum Winter until three to 3.50 nines of qubit fidelity can be achieved.

Coherence time will limit the degree to which SWAP networks can be used to simulate connectivity

Although coherence time and circuit depth will ultimately be limiting factors, my thought in the context of this paper was that they would not be the critical limiting factors until qubit fidelity improved dramatically. So, people shouldn’t worry about coherence time and circuit depth until after qubit fidelity is enhanced.

An exception to that logic is that for quantum computers with weak connectivity, such as transmon qubits with only nearest-neighbor connectivity, the extensive SWAP networks required to simulate connectivity could quickly balloon the size of quantum circuits so that coherence time and circuit depth do indeed become critical gating factors much sooner. But unless qubit fidelity improves significantly, such extensive SWAP networks won’t be feasible anyway without incurring excessive gate errors.
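
As a rough sketch of why, assume a linear nearest-neighbor layout where each SWAP decomposes into three CNOTs and a qubit must be routed over to its partner and back. The numbers are illustrative only:

    # Rough fidelity cost of one two-qubit gate between qubits that sit
    # 'distance' positions apart on a linear nearest-neighbor device.
    def routed_gate_fidelity(distance: int, cnot_fidelity: float) -> float:
        swaps = 2 * (distance - 1)        # route the qubit over and back
        cnots = 3 * swaps + 1             # 3 CNOTs per SWAP, plus the gate itself
        return cnot_fidelity ** cnots

    print(routed_gate_fidelity(5, 0.99))    # ~0.78 -- a ~22% error for a single routed gate
    print(routed_gate_fidelity(5, 0.999))   # ~0.975 -- near-perfect qubits make SWAP routing viable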

Of course, my preferred solution would be to architect improved connectivity, but as previously noted that may not be likely in the near term.

So, improved coherence time can enable:

  1. Deeper circuits.
  2. Larger circuits.
  3. Improved effective connectivity — if relying on SWAP networks.

Two paths to greater circuit depth — longer coherence time or faster gate execution time

Coherence time does limit circuit depth, but there is another path to greater circuit depth besides increasing coherence time — faster gate execution time.

Which path will be easier or more feasible will vary from time to time, but greater coherence time or faster gate execution time are both viable paths to greater circuit depth.
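
Here is a back-of-the-envelope sketch of the tradeoff, with purely illustrative timing numbers rather than any particular machine’s specifications:

    # Roughly how many gate layers fit within the coherence window.
    def depth_budget(coherence_time_us: float, gate_time_ns: float) -> int:
        return int(coherence_time_us * 1000 / gate_time_ns)

    print(depth_budget(100, 300))   # ~333 layers -- baseline
    print(depth_budget(200, 300))   # ~666 layers -- longer-coherence path
    print(depth_budget(100, 100))   # ~1000 layers -- faster-gate path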

Faster gate execution time is also another path to using SWAP networks to simulate connectivity for transmon qubits, but only to the degree that circuit depth was the limiting factor. If qubit fidelity is the limiting factor, faster gate execution time won’t improve the degree to which SWAP networks can be used to simulate qubit connectivity.

The IBM 127-qubit Eagle didn’t address the qubit fidelity issue

I was really disappointed when IBM introduced the brand new 127-qubit Eagle quantum processor in November 2021 but failed to deliver any improvement in qubit fidelity.

Worse, no explanation or excuse was given by IBM.

I’m hopeful that IBM may deliver several upgrades to Eagle over the next year as they did over the past year with their 27-qubit Falcon quantum processor, but they haven’t made any such commitments, so we’ll have to see.

So, as it stands now, in general, there’s no real benefit to moving up from a 27-qubit Falcon to a 127-qubit Eagle. There are likely some niche exceptions, but that’s the general case.

Preview of the IBM 433-qubit Osprey

I would dearly love to know what capabilities the upcoming IBM 433-qubit Osprey quantum processor will deliver later this year, but other than 433 qubits, we simply don’t know.

I would hope that it has improved qubit fidelity, but we just don’t know at this stage. We’ll have to wait and see. The recent 127-qubit Eagle didn’t deliver improved qubit fidelity, so I’m primed for further disappointment, but still hopeful.

I would hope that it has some improvement to qubit connectivity, but IBM hasn’t hinted at any, so I’m not holding my breath. But I do hope they at least deliver a roadmap and set expectations for qubit connectivity improvements in future years and future processors.

We’ll simply have to wait until November to see.

Further improvement to qubit fidelity in the IBM 27-qubit Falcon?

IBM did deliver improvements to qubit fidelity over the past year for their 27-qubit Falcon quantum processor, including hinting at achieving three nines of qubit fidelity for a two-qubit CNOT gate in at least one test configuration. It would be nice if they delivered further improvements over the next year, but they’ve made no such commitments or promises. Still, I remain hopeful.

Is 27 qubits the best we can do for the near term?

Yes, we have quantum computers with more than 27 qubits, but with limited qubit fidelity and limited qubit connectivity, as well as limited coherence time, we’re very limited as to the size of quantum circuit which can be executed and deliver high-quality results. In fact I don’t recall any published papers for algorithms using more than about 23 qubits to address the kinds of applications often touted as ideal for quantum computers.

Sure, there are esoteric computer science experiments that can use more qubits, but they’re not addressing practical, real-world problems.

So if somebody has an algorithm using 16 to 24 qubits on a 27-qubit or 53-qubit quantum computer and you give them a 40, 65, 80, 100, or 127-qubit quantum computer, they generally can’t easily take advantage of all of those extra qubits.

Where are all of the 40-qubit algorithms?

I keep saying that we need to see 32 to 40-qubit algorithms, even if run only on simulators, but I’m not seeing such algorithms.

To me, this is a measure of how limited current quantum computers are today. It’s unclear exactly which limiting factors are causing this dearth of 40-qubit algorithms, but a short list of candidates is:

  1. Low qubit fidelity.
  2. Limited qubit connectivity.
  3. Limited coherence time.
  4. Limited circuit depth.
  5. Lack of rich algorithmic building blocks.
  6. Lack of experience designing complex algorithms.

And for the purposes of this paper, low qubit fidelity is the #1 limiting technical factor precluding 40-qubit algorithms. With limited qubit connectivity a strong runner-up.

For more on this dearth of 40-qubit algorithms, see my paper:

A big part of the problem is scalability of quantum algorithms…

Need for automatically scalable quantum algorithms

One issue that is under the control of quantum algorithm designers, and not an inherent limitation of the available hardware per se, is that quantum algorithms have generally not been designed to be automatically scalable. Generally, they are designed to fit the limitations of a particular quantum computer and can’t easily be moved to a new, larger quantum computer to use the additional qubits to handle larger input data or larger calculations. Sometimes, yes, but generally, no.

The real point here is that this is an advance which is fully in the hands of quantum algorithm designers. They can make progress without any hardware advances.

Granted, enhanced qubit fidelity and qubit connectivity would be a big help for scaling many quantum algorithms, but the basic scaling design is in the hands of the quantum algorithm designer. They can at least design for scaling even if the hardware is not quite ready yet. And they can also use simulation while waiting for the hardware to catch up.
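
As a minimal sketch of what designing for scaling can look like, here is a width-parameterized circuit generator, assuming Qiskit is available. The GHZ-style circuit is purely illustrative; the point is that the same code targets 5, 27, or 40+ qubits without redesign:

    from qiskit import QuantumCircuit

    def ghz_circuit(num_qubits: int) -> QuantumCircuit:
        qc = QuantumCircuit(num_qubits)
        qc.h(0)                          # put the first qubit in superposition
        for i in range(num_qubits - 1):
            qc.cx(i, i + 1)              # entangle each qubit with the next
        qc.measure_all()
        return qc

    small = ghz_circuit(5)    # fits today's hardware
    large = ghz_circuit(40)   # same code, once fidelity and connectivity (or simulators) allow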

For more on automatic scalability of quantum algorithms, see my paper:

Limited connectivity is more of an absolute barrier — all or nothing, incremental advances are not really possible

Many advances are compatible with an incremental approach — they are not all or nothing propositions. But limited connectivity is an exception. Trapped-ion and neutral-atom quantum computers don’t face this problem since they have full, unlimited any to any connectivity, but the connectivity of transmon qubits is hardwired and very limited, such as nearest neighbor only. Adding further connectivity would be very expensive and problematic at best, so there is no real potential for incremental improvement short of an architectural change which might offer full any to any connectivity or something close to it.

Granted, higher qubit fidelity does enable longer SWAP networks which can effectively yield incremental progress on connectivity, but that’s a poor-man’s connectivity and not a very satisfying long-term solution to limited qubit connectivity. And it can accelerate bumping into coherence time as a limiting factor as well.

In short, limited connectivity is a serious problem which can only be addressed with a dramatic architectural change rather than incremental improvement over time.

Advances not likely in the near term

Some of the more interesting advances which are unlikely to arrive in the near term include:

  1. Quantum error correction (QEC). Some number of years required.
  2. Larger quantum Fourier transform (QFT). Lucky if we can get even 12 or 16 qubits in the near term.
  3. Greater connectivity. Significant architectural changes required.
  4. Fine granularity of phase and probability amplitude. Unless it’s a quick fix in the firmware. But more sophisticated hardware and architectural changes are likely required.
  5. Quantum networking. Much research is needed.
  6. Quantum volume alternative. Unless somebody comes up with one shortly.
  7. Quantum-native programming languages. Depend on more advanced programming models.
  8. More advanced quantum programming models. Still too hard and too unknown, even if it is desperately needed. Much research is required. Not really needed until we have a moderate range of qubits with high fidelity anyway. Figure three to five years.

Quantum error correction (QEC) is a critical priority, but not in the near term, except for research

It may be three to seven years before quantum error correction (QEC) becomes practical, generally available, and commonly used, but definitely not in the near term, not in the next two years and not in the next year for sure.

Research is of course needed over the next few years, but that won’t result in any useful capability in the near term.

For more on quantum error correction, see my paper:

Advances and capabilities not considered as critical gating or limiting factors in the near term

These are advances or capabilities which might indeed have significant value and might even be feasible in the near term but simply aren’t critical gating or limiting factors for significant progress of quantum computing in the near term, over the next year:

  1. More qubits. We already have plenty for many use cases — they just don’t have sufficient fidelity, connectivity, or fine enough granularity of phase or probability amplitude.
  2. Support software and tools. They are important, but generally they can be designed and implemented relatively easily and with low technical risk so that they are not true and substantial advances per se. Generally they will make life easier, but quantum algorithm designers and quantum application developers can generally get along (or occasionally limp along) without them, or with only primitive support software and tools.
  3. An alternative to the Quantum Volume metric. Such as for more than 50 qubits, or even 40, 32, 28, or 24 qubits.
  4. A more advanced programming model. This will be essential at some stage in the future, but there are much more pressing needs in the near term.
  5. A quantum-native programming language. Ditto.

An alternative to the Quantum Volume metric is not essential for the near term

The basic issue or limitation with the Quantum Volume metric is that it is limited to roughly 50 qubits — or maybe even to only 40, 32, 28, 24, or fewer qubits — since it requires simulation of the quantum circuit which is being run on the quantum computer. So it’s really a limitation of the classical simulation software.

This will be a critical limiting factor at some stage, but right now and probably for the rest of the coming year there will be any number of critical limiting factors which prevent quantum circuits using 24 or more qubits from running correctly on real quantum computers.

Qubit fidelity and qubit connectivity are the top two candidates for critical limiting factors which prevent Quantum Volume from running into the 50-qubit simulation limit.

For example, the new IBM 127-qubit Eagle quantum processor has a Quantum Volume of only 64, meaning essentially only six qubits can be used before low qubit fidelity and limited connectivity cause circuits to deliver incorrect results, so there is no need to simulate circuits with more than six qubits.
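
Since Quantum Volume is reported as a power of two, the effective circuit width is just the base-2 logarithm of the metric, as this small sketch shows:

    import math

    # Quantum Volume is 2**n for the largest n-by-n "square" circuit that runs
    # successfully, so the usable width is simply log2 of the reported value.
    def effective_width(quantum_volume: int) -> int:
        return int(math.log2(quantum_volume))

    print(effective_width(64))      # 6  -- the Eagle example above
    print(effective_width(2**50))   # 50 -- roughly where classical simulation of the metric gives out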

But if a trapped-ion or neutral-atom quantum computer can manage to support circuits with 50 or more qubits over the coming year, they may hit the 50-qubit simulation limit. They have unlimited qubit connectivity, but qubit fidelity is still only so-so and may prevent them from delivering correct results for 50-qubit circuits anyway.

Advances in fine granularity of phase and probability amplitude not so likely in the near term

Although I really would like to see use of quantum Fourier transform (QFT), quantum phase estimation (QPE), quantum amplitude estimation (QAE), and amplitude amplification over the coming year, it does seem unlikely since all of these capabilities depend on fine granularity of phase and probability amplitude, at least for nontrivial use cases.

Sure, it is still possible that we could see some incremental advance, especially if it focuses on firmware or the classical digital and analog hardware of qubit control, but anything more than a modest incremental advance would likely require a significant hardware redesign, and may run into theoretical and practical limitations as well.

The granularity of phase and probability amplitude may be driven by a combination of:

  1. Reduction and limitation of noise and other interference.
  2. Precision of the digital to analog converters (DACs) used to convert digital data to analog form to be applied to the qubit hardware.

High-precision DACs are expensive, use lots of power, have lots of wiring, and are susceptible to noise. They’re difficult to work with. And there are limits to their precision.

Lower-precision DACs are cheaper, more efficient, require less wiring, and are more resilient in the face of noise and other interference. Unfortunately, they are incapable of supporting very-fine granularity of phase and probability amplitude.

It’s a difficult tradeoff. You can understand why engineers might choose coarser control of phase and probability amplitude.

I expect improvement over time, but not necessarily in the very near future.

The more difficult factors are two-fold:

  1. Practical considerations. Availability and cost of fine-granularity DACs. And practical limitations on the precision of DACs. 8-bit, 16-bit, 18-bit, 20-bit, and 32-bit precision are available. What’s actually practical in the context of a quantum computer and qubit control is another matter, unclear and never documented.
  2. Theoretical considerations and limits of physics. What if an application really does need a 48-bit or 80-bit quantum Fourier transform (or Shor’s factoring algorithm needs a 4096 or 8192-bit QFT)? What does the underlying physics support even if you had ideal digital and analog logic?

Vendors of quantum computers need to clearly document their capabilities in terms of the fine granularity of phase and probability amplitude.

We need to know if it even makes sense to contemplate future quantum algorithms using quantum Fourier transforms using:

  1. 8 qubits. Hopefully a slam dunk. I would hope this could happen over the next year.
  2. 12 qubits. Hopefully a slam dunk. But when? Again, one can hope for the next year, but that would require a number of other advances, and possibly upgrades to the qubit control hardware.
  3. 16 qubits. Unlikely over the next year.
  4. 20 qubits. Should be feasible. But not over the next year.
  5. 24 qubits. May or may not be feasible.
  6. 28 qubits. May or may not be feasible.
  7. 32 qubits. May or may not be feasible.
  8. 40 qubits. Questionable feasibility.
  9. 48 qubits. Dubious feasibility.
  10. 56 qubits. Beyond speculation at this stage.
  11. 64 qubits. Ditto.
  12. 72 qubits. Ditto.
  13. 80 qubits. Ditto.
  14. 96 qubits. Ditto.
  15. 128 qubits. Ditto.
  16. And beyond 128 qubits. Ditto.

We can speculate about some sort of idealistic fantasy Vunder-DAC which supports far more than 32 bits of precision. But such a device does not exist today, and there is no prospect of it existing in a year, two years, or maybe even ever. The hardware engineers and researchers need to come clean as to what we can expect and when we can expect it.
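
As a crude sketch of the underlying arithmetic, assume a k-bit DAC can resolve on the order of 2^k phase gradations while an n-qubit QFT needs controlled rotations as small as 2π / 2^n. This ignores noise, calibration, and control-electronics realities, but it shows why DAC precision caps QFT width:

    import math

    # Can an n-qubit QFT's smallest controlled-phase rotation (2*pi / 2**n) be
    # resolved by a k-bit DAC offering roughly 2*pi / 2**k gradations of phase?
    def qft_within_dac_reach(qft_qubits: int, dac_bits: int) -> bool:
        smallest_qft_angle = 2 * math.pi / 2 ** qft_qubits
        dac_resolution = 2 * math.pi / 2 ** dac_bits
        return smallest_qft_angle >= dac_resolution

    print(qft_within_dac_reach(16, 20))   # True  -- a 16-qubit QFT is within a 20-bit DAC's reach
    print(qft_within_dac_reach(48, 32))   # False -- a 48-bit QFT is far beyond a 32-bit DAC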

For more on phase, see my paper:

A more advanced quantum programming model is not likely in the near term

I do indeed strongly believe that we need a much more powerful and advanced quantum programming model to enable non-elite technical staff to exploit the power of quantum computers, but…

  1. It’s not the critical technical gating factor. Lack of high-fidelity qubits would preclude its effective use: the hardware would only support relatively short quantum circuits, while an advanced quantum programming model would be aimed at larger, more sophisticated quantum algorithms.
  2. The conceptualization of a more advanced quantum programming model is still an open research question. Could take at least a few years of conceptual development and experimentation before the conceptual model matures and serious, production-scale implementation can begin.
  3. A more advanced programming model would be more focused on larger algorithms with many more qubits. Limited utility for 32 to 40 or maybe even 80 qubits.

In short, as desirable as a more advanced quantum programming model would be in the near term, it wouldn’t be the top, #1 priority.

Advances in qubit fidelity have the added benefit of enabling other advances

Enhanced qubit fidelity doesn’t enable all other advances, but many of them. Some examples:

  1. Greater simulated connectivity using SWAP networks. Critically limited by low qubit fidelity.
  2. Greater circuit size.
  3. Greater circuit depth.
  4. Nontrivial quantum Fourier transform (QFT).
  5. Nontrivial quantum phase estimation (QPE).
  6. Nontrivial quantum amplitude estimation (QAE).
  7. Nontrivial amplitude amplification.
  8. More advanced and more sophisticated quantum circuits.
  9. Much richer collection of algorithmic building blocks.

Many advances will eventually bump into limitations in other capabilities

One of the advantages of pursuing qubit fidelity as the top priority advance is that so many other advances won’t be able to progress very far without sufficient qubit fidelity. For example, quantum Fourier transform and quantum phase estimation, or greater circuit depth.

Generally, every advance will eventually run into or bump up against the limitations of some other capability. Or multiple limitations.

Even qubit fidelity will eventually bump into limitations of other capabilities, such as:

  1. Limited connectivity. SWAP networks can dilute or consume too much of the available qubit fidelity.
  2. Coherence time. Even with very high fidelity, eventually qubits (or gates, actually) will bump up against limited coherence time.

Even further advances in qubit fidelity will have minimal or no beneficial effect once such limits have been hit.

In any case, one must always be cognizant of how far each advance can go before it hits some limit or limits of other capabilities.

Ordering of advances — not so easy to predict or plan

As just discussed, any given advance may be limited in its progress by other advances which haven’t yet occurred. This suggests that there is some ideal ordering of advances, so that any given advance isn’t attempted until the other advances it depends on are already in place.

That is likely true, but also somewhat idealistic, primarily because a lot of advances may be limited by a lack of knowledge or technical capabilities needed to implement that advance, such that advances cannot always be simply scheduled as desired or planned in detail in advance.

Some advances may not be feasible or practical over the coming year or even the next two years, or even longer. And some advances may not be feasible or practical for six or nine months from now.

In short, it’s not possible for me to lay out the precise ordering of advances at this time in this paper. Some of that will become more apparent as the year unfolds, and some of it may not even be known before the moment that a given advance commences — or is finished.

Possible that some qubit technologies might do better than others in the near term

Trapped-ion and neutral-atom quantum computers have some automatic advantages over transmon qubits, including:

  1. Longer coherence time.
  2. Generally better qubit fidelity. Although transmon qubits may be catching up.
  3. Greater connectivity. Full any to any connectivity.

It’s not completely clear, but since gate execution time is supposedly slower for trapped-ion and neutral-atom qubits, transmon qubits may have an advantage for circuit depth even though trapped-ion and neutral-atom qubits technically have longer coherence time. Better documentation and specifications from vendors would help to answer such questions.

There may be other advantages that I am not aware of.

Limiting and critical gating factors may be algorithm-specific or application-specific

Although low qubit fidelity may be the most critical gating factor for progress for many or most quantum algorithms and quantum applications, there may be some quantum algorithms or quantum applications for which other advances or capabilities are the critical gating or limiting factors, such as:

  1. Qubit connectivity.
  2. Coherence time or circuit depth.
  3. Fine granularity of phase or probability amplitude.
  4. General lack of support for nontrivial quantum Fourier transform (QFT) or quantum phase estimation (QPE).

Special needs for variational methods

Variational methods likely have their own needs and priorities, which are likely rather distinct from the needs and priorities for quantum Fourier transform (QFT) and quantum phase estimation (QPE). But I won’t attempt to ferret them out here since I consider variational methods to be a dead-end, poor man’s substitute for the raw power of QFT and QPE, particularly for quantum computational chemistry.

I can surmise that higher qubit fidelity would be equally beneficial to variational methods, but I haven’t confirmed that.

Ditto for greater connectivity.

Whether finer granularity for phase and probability amplitude would benefit variational methods is an open question from my perspective.

More capable simulators really are needed, but…

Part of me really wants to give more capable simulators a significant priority over the near term and longer, but there are already so many high priorities. So it doesn’t make it to the top as the highest near-term priority, but it deserves an honorable mention.

Some improvements needed are:

  1. Greater capacity.
  2. Higher performance.
  3. More qubits.
  4. Greater circuit depth.
  5. More analysis tools.
  6. More debugging tools.
  7. Better and more accurate noise models. Exactly match existing and proposed quantum computers, so that a simulation run is an accurate reflection of running on a real quantum computer.
  8. Exploit distributed computing. Much greater capacity and performance.
  9. In summary, deliver great simulation of 32 to 40-qubit quantum circuits (see the rough memory sketch after this list).
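
To give a sense of why greater capacity and distributed computing matter, here is the standard back-of-the-envelope memory estimate for exact state vector simulation, assuming 16 bytes per complex amplitude:

    # State-vector memory needed to simulate n qubits exactly,
    # at 16 bytes per complex (double-precision) amplitude.
    def statevector_gib(num_qubits: int) -> float:
        return (2 ** num_qubits) * 16 / 2 ** 30

    for n in (32, 36, 40, 44):
        print(n, statevector_gib(n), "GiB")
    # 32 -> 64 GiB, 36 -> 1024 GiB (1 TiB), 40 -> 16384 GiB (16 TiB), 44 -> 262144 GiB (256 TiB)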

In short, I really hope somebody gives it a top priority for the near term even if I can’t right now.

Use simulation to find limits for benefits

Frequently it will be unclear how extensive or pervasive the benefits of an advance will be, or how far the advance can go. For example, how high qubit fidelity can be pushed, or when further improvement hits diminishing returns. Simulation can help to evaluate such questions and issues.

But, this depends on the ability of the simulator to be configured to accurately reflect the details and consequences of any particular advance as well as the details of the overall underlying quantum computer even before the advance enters the picture.

All tools have their limits, but simulation should have some very significant value for evaluating the benefits of proposed advances to quantum computing technology even before the advance is incorporated into a real quantum computer.

Any dispute as to the most urgent advance?

I do think my selection of higher qubit fidelity as the most urgent advance for quantum computing for the near term is quite reasonable, but I do recognize that some may dispute it and assign a higher priority to some other capability, feature, or characteristic.

Some may feel that connectivity is a more urgent priority, or raw qubit count. Or maybe even quantum error correction (QEC).

And some may agree with my real preference, support for nontrivial quantum Fourier transform and quantum phase estimation (QPE), even though that effectively puts a high priority on higher qubit fidelity anyway. As well as support for finer granularity of phase and probability amplitude.

What would be the second most needed advance?

Once the need for higher qubit fidelity is taken care of (or in progress), what would be next on the list, the second most needed advance in the near term? That’s difficult to say and it may be subjective and depend on the audience.

Some top candidates:

  1. Greater qubit connectivity.
  2. Longer coherence time.
  3. Faster gate execution time.
  4. Greater circuit depth.
  5. Finer granularity for phase and probability amplitude. To enable quantum Fourier transform (QFT), quantum phase estimation (QPE), quantum amplitude estimation (QAE), and amplitude amplification.
  6. Support for nontrivial quantum Fourier transform and quantum phase estimation.
  7. Richer collection of algorithmic building blocks.
  8. More capable simulators.

Part of what makes it subjective is that transmon qubits desperately need greater connectivity, but trapped-ion and neutral-atom qubits don’t need it since they already have full any to any connectivity.

Longer coherence time and increased circuit depth will be needed once qubit fidelity is much higher, but may need to wait for greater connectivity. And once again, transmon qubits need it more, while trapped-ion and neutral-atom qubits don’t need it since they already have longer coherence time.

I’m in favor of finer granularity for phase to enable nontrivial quantum Fourier transform, but not all algorithms need that capability, at least in the near term. And it may be that full near-perfect qubits are needed before quantum Fourier transform is feasible, which may not happen in the near term anyway.

And maybe richer algorithmic building blocks would be a safe bet in the near term. But, they may depend on some of these other hardware advances anyway.

I really do favor more capable simulators, but… maybe it isn’t such a top priority, so far. Ask me again in six months.

Maybe there needs to be a split:

  1. Focus on connectivity for transmon qubits.
  2. Focus on finer phase granularity for trapped-ion and neutral-atom qubits. And possibly for transmon qubits as well if greater connectivity is not feasible in the near term.

Maybe that would offer the biggest bang for the buck.

I think I’ll leave it at that, for now — inconclusive as to the second most needed advance.

My original proposal for this topic

For reference, here is the original proposal I had for this topic. It may have some value for some people wanting a more concise summary of this paper.

  • What single advance in quantum computing is most needed in the near future? There are so many! What criteria to use? How near-term — three months, six months, nine months, one year? Maybe qubit fidelity, or maybe qubit connectivity, or…?

Summary and conclusions

  1. Higher qubit fidelity is the most needed advance in the near term — the next year or so.
  2. Higher qubit fidelity will deliver the most bang for the buck.
  3. Higher qubit fidelity enables numerous other advances, including quantum Fourier transform (QFT) and quantum phase estimation (QPE), use of SWAP networks to simulate qubit connectivity for transmon qubits, and deeper quantum circuits.
  4. Greater qubit connectivity is very important, but will likely require architectural changes for transmon qubits.
  5. Quantum error correction (QEC) is very important for the longer term and certainly a research priority in the near term, but not a practical priority for the near term.
  6. Higher qubit fidelity is not likely or guaranteed to achieve true near-perfect qubits in the near term — over the next year. It might take two years to get there.
  7. Greater coherence time and circuit depth are important for the medium term, but are not critical until qubit fidelity and qubit connectivity are addressed — only relatively shallow circuits can be executed reliably with low qubit fidelity and limited qubit connectivity.
  8. Different audiences may have different needs and priorities relative to each possible advance. This paper focuses on quantum algorithm designers in general as well as quantum application developers in general. Specific niche categories may have different needs and priorities than discussed here.
  9. Different algorithm and application categories may have different requirements for which advances should have top priority.
  10. There are a wide range of criteria that can be used to judge which advances should have higher priority, including: urgent need — people are struggling without it, technical benefit is very high, applies to all algorithms and applications, doesn’t rely on other advances to get started or to make progress, incremental progress is possible, enables other advances, and it would give the field a boost in momentum. And many other criteria.
  11. What should be the next priority after higher qubit fidelity? That’s too difficult to say — there are so many urgent priorities. But once qubit fidelity is no longer the big holdup for most algorithms, the next big holdup will quickly become obvious. I suspect it will be limited connectivity.
  12. The one advance that isn’t a candidate for top priority in the near term is actually the advance that gets so much of the attention in recent months and years: more qubits. We already have enough qubits for many or even most algorithms, but low qubit fidelity and limited qubit connectivity make it very difficult for many algorithms to utilize any significant fraction of those qubits.
  13. I do think we definitely need a much richer collection of algorithmic building blocks as soon as possible, but once again low qubit fidelity and limited qubit connectivity render this goal unachievable in the near term.
  14. Where are all of the 40-qubit algorithms? There are a number of limiting factors, with low qubit fidelity being at the top of the list. Higher qubit fidelity alone may not be enough to open the floodgates for 40-qubit algorithms, but it’s the top priority step to take.
  15. More capable simulators would be a big win in the near term, but I’d rather keep the priority focus on qubit fidelity for now. That said, I’d push hard for further research in simulators, near term, medium term, and longer term.
