What Single Advance in Quantum Computing Is Most Needed in the Near Future?
There are many technical obstacles to enabling production-scale practical real-world quantum applications, but which single advance in capability would deliver the greatest benefit in the near future — the next six to twelve months or so? Many advances are needed, but which one would enable the greatest leap in progress for quantum algorithm designers and quantum application developers? This informal paper will explore the possibilities.
And the answer is…
- Higher qubit fidelity
But why that particular advance and not others? This informal paper will elaborate on my thinking in selecting that advance.
Topics covered in this paper:
- Near future or near term — the next six months to a year or so
- In a nutshell
- Overview
- Higher qubit fidelity is the most needed advance in quantum computing in the near term
- Personal historical perspective
- Advance vs. capability or feature or characteristic
- Urgent need, importance, and priority
- Criteria for selecting the single most important near-term advance in quantum computing
- Different audiences may have different needs and priorities
- The audience focus here is on quantum algorithm designers and quantum application developers
- Researchers as an audience: what do they need in the near-term?
- What is a near-perfect qubit?
- What are nines of qubit fidelity?
- Qubit fidelity includes coherence, gate errors, and measurement errors
- Higher qubit fidelity not likely to achieve near-perfect qubits in the near term
- Coherence time will limit the degree to which SWAP networks can be used to simulate connectivity
- Two paths to greater circuit depth — longer coherence time or faster gate execution time
- The IBM 127-qubit Eagle didn’t address the qubit fidelity issue
- Preview of the IBM 433-qubit Osprey
- Further improvement to qubit fidelity in the IBM 27-qubit Falcon?
- Is 27 qubits the best we can do for the near term?
- Where are all of the 40-qubit algorithms?
- Need for automatically scalable quantum algorithms
- Limited connectivity is more of an absolute barrier — all or nothing, incremental advances are not really possible
- Advances not likely in the near term
- Quantum error correction (QEC) is a critical priority, but not in the near term, except for research
- Advances and capabilities not considered as critical gating or limiting factors in the near term
- An alternative to the Quantum Volume metric is not essential for the near term
- Advances in fine granularity of phase and probability amplitude not so likely in the near term
- A more advanced quantum programming model is not likely in the near term
- Advances in qubit fidelity have the added benefit of enabling other advances
- Many advances will eventually bump into limitations in other capabilities
- Ordering of advances — not so easy to predict or plan
- Possible that some qubit technologies might do better than others in the near term
- Limiting and critical gating factors may be algorithm-specific or application-specific
- Special needs for variational methods
- More capable simulators really are needed, but…
- Use simulation to find limits for benefits
- Any dispute as to the most urgent advance?
- What would be the second most needed advance?
- My original proposal for this topic
- Summary and conclusions
Near future or near term — the next six months to a year or so
When this paper refers to the near future or near term, it is generally a reference to the next year or so. Maybe a little less, but not much more. It’s definitely not a reference to the 18-month or two-year timeframe. And it’s definitely looking out past what is currently available in existing real quantum computers.
Technically the next few months is the near term as well, but in the context of this paper I don’t expect major advances over just a few months.
Maybe six to nine months as the lower bound for near term.
And maybe a year plus another few months — one or two or three — as the upper bound for near term.
But thinking of the near term as roughly a year or so is a close-enough approximation of the intended meaning of near term in the context of this paper.
In a nutshell
The goal of this paper is to identify the single most important advance needed for quantum computing in the near future. What advance is most urgent and deserves the highest priority?
Some of the more obvious candidate advances to be considered are:
- More qubits.
- Higher qubit fidelity.
- Greater qubit connectivity.
- Longer coherence time.
- Faster gate execution time.
- Greater circuit depth.
- Finer granularity for phase and probability amplitude.
- Support for nontrivial quantum Fourier transform and quantum phase estimation.
- Richer collection of algorithmic building blocks.
- More capable simulators.
To make a long story short, the single advance that appears to be most important, most urgent, and most deserving of the highest priority is:
- Higher qubit fidelity.
Other advances such as connectivity and coherence time are vitally important, but qubit fidelity really stands out as the single most beneficial advance, in the near term.
And coherence time and circuit depth won’t matter if qubit fidelity is too low to deliver meaningful results.
Some of the criteria which make it the most beneficial advance are:
- Urgent need. People are struggling without it.
- Technical benefit is very high.
- Delivers the most bang for the buck.
- Applies to all algorithms and applications.
- Doesn’t rely on other advances to get started or to make progress.
- Incremental progress is possible.
- Enables other advances.
- It would give the field a boost in momentum.
It’s worth noting that different audiences may have different needs and priorities relative to each possible advance. This paper focuses on quantum algorithm designers in general as well as quantum application developers in general. Specific niche categories may have different needs and priorities than discussed here.
Caveat: Higher qubit fidelity does not necessarily mean that near-perfect qubits will be achieved in the near term, just incrementally better qubit fidelity. This will be discussed more in a subsequent section.
Where are all of the 40-qubit algorithms? There are a number of limiting factors, with low qubit fidelity being at the top of the list. Higher qubit fidelity alone may not be enough to open the floodgates for 40-qubit algorithms, but it’s the top priority step to take.
Incidentally, the single advance which is most likely over the next year is also the least beneficial:
- More qubits.
We’ve progressed to the stage where more qubits without higher qubit fidelity and improved connectivity are of negligible value. Sure, we definitely need more qubits for the longer term, including for quantum error correction (QEC), but they’re not as high a priority for the near term as higher qubit fidelity.
Overview
In a recent informal paper I explored the full range of technical advances which are needed over the next two years for quantum computing to remain on track to enabling production-scale practical real-world quantum applications. It’s a rather long list, even in summary form. Read it here:
- Quantum Computing Advances We Need to See Over the Coming 12 to 18 to 24 Months to Stay on Track
- https://jackkrupansky.medium.com/quantum-computing-advances-we-need-to-see-over-the-coming-12-to-18-to-24-months-to-stay-on-track-b98b89e8bb8f
Just to give a flavor, here are some of the obvious advances needed, not in any priority order per se:
- More qubits.
- Higher qubit fidelity.
- Higher gate fidelity.
- Higher qubit measurement fidelity.
- Near-perfect qubits.
- Greater qubit connectivity.
- Longer coherence time.
- Faster gate execution time.
- Greater circuit depth.
- Finer granularity for phase and probability amplitude.
- Support for nontrivial quantum Fourier transform and quantum phase estimation.
- Richer collection of algorithmic building blocks.
- Scalable quantum algorithms.
- Algorithms using 32 to 40 qubits.
- More capable simulators.
- New qubit technologies.
- More advanced qubit technologies.
- Modular quantum processor architectures.
- Research for advances two to five years from now.
That’s a rather tall order, even for two years. And that’s just a sampler. Where to start?!
Unfortunately, I suspect that the correct answer to the headline question is none of the above or maybe all of the above. Really. Seriously.
Or, I can cheat as I did on my Christmas and New Year wish list and select support for nontrivial quantum Fourier transform and quantum phase estimation — which is a cheat since that indirectly requires all nine of the preceding advances on the list.
If you twisted my arm and forced me to pick just one — and it couldn’t be the umbrella advance of support for quantum Fourier transform and quantum phase estimation — I suppose it would have to be higher qubit fidelity since virtually everything relies on it — all algorithms and all applications.
Another possible and important criterion is whether an advance enables other advances.
One criterion which leads to my selection of higher qubit fidelity is simply that it’s a place to start — it doesn’t rely on any other advances.
Another criterion leading to the selection of higher qubit fidelity is that a wide range of quantum algorithms and quantum applications could immediately use it without any change to the algorithms or application source code. That's in contrast to greater qubit connectivity, which may be a greater priority overall, but would require significant rework and fresh algorithm design and fresh application development in order to even moderately exploit it. Ditto for my preference for quantum Fourier transform support — existing algorithms and applications are not set up to just drop it in without major design rework.
Overall, higher qubit fidelity would deliver the most bang for the buck.
Those are just a few of the possible criteria. This paper will explore other criteria as well.
Caveat: Higher qubit fidelity does not necessarily mean that near-perfect qubits will be achieved in the near term, just incrementally better qubit fidelity. This will be discussed more in a subsequent section.
Another approach to the headline topic question is to simply settle on a short list of top candidates with the assurance that focusing on any of them in any order in the near future will deliver significant value in the long slow march to enabling production-scale practical real-world quantum applications.
Although coherence time and circuit depth will certainly be critical advances needed soon enough, they won’t really matter until qubit fidelity is high enough for longer quantum circuits to deliver meaningful, low-error results.
Where are all of the 40-qubit algorithms? There are a number of limiting factors, with low qubit fidelity being at the top of the list. Higher qubit fidelity alone may not be enough to open the floodgates for 40-qubit algorithms, but it’s the top priority step to take.
Two significant caveats:
- Raw qubit count is not a current limiting factor for most use cases.
- Quantum error correction is far over the horizon, so not a near-term priority. Research for two to five years, yes, but not for practical use in the coming months to a year — or even two years.
It’s worth noting that different audiences may have different needs and priorities relative to each possible advance. This paper focuses on quantum algorithm designers in general as well as quantum application developers in general. Specific niche categories may have different needs and priorities than discussed here.
Higher qubit fidelity is the most needed advance in quantum computing in the near term
Just to emphasize: higher qubit fidelity is the most important, most urgent, most deserving, and highest priority advance needed in quantum computing — in the near term.
Personal historical perspective
When I first started digging into quantum computing seriously in 2018, coherence time was the thing everybody talked about. There was little mention of qubit fidelity or gate errors. And no mention of limited connectivity as a problem either.
Actually, in truth, IBM had announced a coherence time of 100 microseconds back in 2012.
And, qubit fidelity was a real issue — actually, gate errors — but not clearly identified as the primary limiting factor.
In truth, there was confusion as to whether low qubit fidelity or coherence time was the key limiting factor, although in hindsight I can confidently say that low qubit fidelity was the key limiting factor.
People were chattering as if a limited qubit count — 20 qubits — was a key limiting factor, but there were few algorithms using more than five to eight qubits anyway.
Now, we have plenty of qubits, but qubit fidelity (gate errors) and limited qubit connectivity are the major obstacles to near-term progress for quantum algorithms.
Actually, I would say that the inability to support nontrivial quantum Fourier transform (QFT) and quantum phase estimation (QPE) is the key limiting factor to achieving dramatic quantum advantage, such as for quantum computational chemistry.
Coherence time is still out there as an ultimate limiting factor, but qubit fidelity and qubit connectivity will come into play long before we get to coherence time — at least in the very near term.
In all of those years from 2018 right up to the present moment, nobody except me has been talking about how quantum computing is severely limited by lack of fine granularity for phase and probability amplitude and how that limits quantum Fourier transform (QFT) and quantum phase estimation (QPE). Of course, low qubit fidelity and limited connectivity are also key limiting factors as well.
In truth, most quantum algorithms are so small — so they can run on existing real quantum computers — that coherence time and circuit depth and phase granularity are not at the front and center of attention. It will be interesting to see how long it takes before that changes.
I can only speculate what people will be chattering about in terms of priorities a year from now. It does depend in large part on how much progress is made over the next year.
But for now, qubit fidelity does have most of the attention. With limited qubit connectivity looming out there as well.
Advance vs. capability or feature or characteristic
This paper uses advance as a synonym for capability or feature or characteristic. Or, more explicitly, an advance is taken to be either:
- The introduction or addition of a new capability or feature or characteristic.
- The improvement or enhancement of an existing capability or feature or characteristic.
And an improvement or enhancement could refer to:
- Function. What it does.
- Performance. How fast it does it. Units of processing per unit of time. Or units of time per unit of processing.
- Capacity. How large or how many units of information can be processed.
Generally, this paper will refer to any of these concepts using advance as a catchall umbrella term.
Urgent need, importance, and priority
The focus in this paper is on advances which are important and should be assigned a high priority, but ultimately the main focus is on advances for which there is the greatest urgent need.
Indeed, some features may be more important for the longer term, but be a lower priority in the near term. Quantum error correction (QEC) is an example. Or greater connectivity, which is critical for the longer term, but may not be practical in the near term anyway — and may not even be needed for some technologies such as trapped-ion and neutral-atom qubits.
Criteria for selecting the single most important near-term advance in quantum computing
These are some of the criteria which can be used to attempt to determine which capabilities might be prime candidates for being the single most needed advance for quantum computing in the near term.
There is no implication that all or most of the criteria must be met. And the criteria are not in any absolute order of importance or priority.
- Urgent need. People are struggling without it.
- Technical benefit is very high.
- Delivers the most bang for the buck.
- It’s essential. Real progress is not possible without it.
- It would give the field a boost in momentum. Accelerate progress beyond the raw technical benefit of the advance itself.
- It’s a place to start. It requires nothing else. Helps to start building momentum.
- It’s an easy place to start. It requires little effort to get started.
- Doesn’t rely on other advances to get started or to make progress.
- It’s easy to implement. Minimal effort to complete.
- Incremental progress is possible. It can be implemented and used incrementally. It’s not an all or nothing proposition. Subsets of the full capability are reasonably useful.
- Already in progress. Needs to be finished, but risk is low.
- Immediate use without any change to algorithms or applications. A wide range of quantum algorithms and quantum applications could immediately use it without any change to the algorithms or application source code.
- It applies to a moderate range of algorithms and applications.
- It applies to a very wide range of algorithms and applications. Wider range makes it more valuable.
- It applies to all algorithms and applications. Universal benefit makes it extremely valuable.
- Research is ripe to pursue. No major open issues which might take years to resolve. Completed research is sitting on the shelf and published.
- Science and tech is ready. No research is needed. Everything that is needed is sitting on the shelf and published.
- Technical feasibility. Can be implemented within a few months to a year.
- Feasible in the near term. Everything that it requires is already in place — or will be completed as part of the task.
- People are clamoring for it. Satisfies a market demand.
- Needed for the two-year horizon. Foundation for the next stage.
- It enables other advances.
- It enables higher-priority capabilities.
- Helps to enable quantum error correction. A longer-term goal, but all of the pieces need to be put in place over an extended period of time.
- Helps to enable quantum parallelism. This is the whole point of quantum computing.
- Helps to achieve quantum advantage. Also the whole point of quantum computing.
- Simplifies quantum algorithm design.
- Simplifies quantum application development.
- Makes quantum algorithms more efficient.
- Makes quantum applications more efficient.
Different audiences may have different needs and priorities
Each audience is entitled to its own perceptions of its needs and its priorities.
Some of the potential audiences for quantum computing:
- Researchers. They’re developing the capabilities described in this paper rather than using them. What they need is more money, talent, management support, and time. And access to underlying research.
- Algorithm designers.
- Application developers.
- Business customers.
- Science customers.
- Engineering customers.
- End users.
- IT staff.
- Management.
- Executives.
- Each application category.
The audience focus here is on quantum algorithm designers and quantum application developers
This paper focuses on the needs for the audience of quantum algorithm designers as well as quantum application developers — the technical staff who are actually focused on using the technical features of the quantum computer to perform quantum computations.
It’s not that other audiences are not important, but just that quantum computation is the whole reason for the existence of quantum computers, so it makes sense that the audience focus is on technical staff engaged in quantum computation.
Researchers as an audience: what do they need in the near-term?
Although we just indicated that quantum algorithm designers and quantum application developers are the intended audience focus for this paper (the technical staff actually engaged in quantum computation, actually using the technical features of the quantum computer to perform quantum computations), it also makes sense to highlight the needs and priorities of the researchers (and engineers) who are actually doing the research and engineering to develop those technical features.
What do these researchers and engineers need?
- Priority on research.
- Funding for research. And engineering.
- Talent pool for research. And engineering.
- Management support.
- Time.
- Access to underlying research. Building on the work of other researchers. Hopefully not hidden or protected by intellectual property (IP) protections.
- Feedback from users, algorithm designers, and application developers to develop better technical capabilities.
What is a near-perfect qubit?
As the term suggests, a near-perfect qubit is not quite perfect and error-free, but is close enough for the needs of the vast majority of quantum algorithms and quantum applications, without the need for the complexity and cost of full-blown (and expensive) quantum error correction (QEC). For more detail, see my paper:
- What Is a Near-perfect Qubit?
- https://jackkrupansky.medium.com/what-is-a-near-perfect-qubit-4b1ce65c7908
What are nines of qubit fidelity?
Nines of qubit fidelity is basically a measure of the reliability of a qubit. It's the number of leading nines in the reliability percentage. For example (a small conversion sketch follows the list below):
- 90% = one nine.
- 95% = 1.5 nines.
- 98% = 1.8 nines.
- 98.5% = 1.85 nines.
- 99% = two nines.
- 99.5% = 2.5 nines.
- 99.9% = three nines.
- 99.95% = 3.5 nines.
- 99.99% = four nines.
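To make the conversion concrete, here is a minimal Python sketch that mirrors the counting convention used in the list above: count the leading nines in the reliability fraction and treat whatever digits follow as the fractional part. Note that some authors instead compute nines as -log10(1 - reliability), which gives slightly different fractional values; this sketch follows the informal convention used in this paper.

```python
def nines_of_fidelity(reliability: float) -> float:
    """Count the leading nines in a reliability fraction.
    Examples: 0.99 -> 2.0, 0.995 -> 2.5, 0.9995 -> 3.5, 0.9999 -> 4.0."""
    # Take up to ten decimal digits and drop trailing zeros.
    digits = f"{reliability:.10f}".split(".")[1].rstrip("0")
    count = 0
    for d in digits:
        if d != "9":
            break
        count += 1
    rest = digits[count:]  # remaining digits become the fractional part
    return count + (int(rest) / 10 ** len(rest) if rest else 0.0)

for r in (0.90, 0.95, 0.98, 0.985, 0.99, 0.995, 0.999, 0.9995, 0.9999):
    print(f"{r:.2%} = {nines_of_fidelity(r)} nines")
```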
For more detail on nines of qubit fidelity, see my separate paper on that topic.
Qubit fidelity includes coherence, gate errors, and measurement errors
Qubit fidelity is actually a vague, umbrella term. I use it to mean anything which can impact the reliability or quality of the quantum state of a qubit. This includes:
- Coherence. Quantum state can decay or decohere over time, the coherence time.
- Single-qubit gate execution errors. Potential for errors executing a quantum logic gate on even a single qubit.
- Two-qubit gate execution errors. Potential for errors executing a quantum logic gate on two qubits. This is generally the limiting factor which determines overall qubit fidelity.
- Measurement errors. Even measurement of a qubit is not 100% reliable. And it tends to be less reliable than even two-qubit gate execution.
- Variations between qubits. Not all qubits in a given quantum processor have the same fidelity. Even different pairs of qubits can have different two-qubit gate execution errors.
Technically, qubit fidelity should be some sort of composite of all of these factors, although listing them all separately can be quite enlightening as well. Both the aggregate and the detail are useful.
For the purposes of this paper, qubit fidelity is taken to be the composite aggregate of all of these factors, as vague and ambiguous as that may be. For simplicity, the average two-qubit gate execution reliability is a reasonable surrogate for overall qubit fidelity.
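As a tiny illustration of that surrogate, here is a sketch that averages hypothetical per-pair two-qubit gate error rates (the kind of numbers a vendor's calibration data might report) into a single fidelity figure. The specific error values are made up purely for illustration.

```python
# Hypothetical per-pair two-qubit gate error rates (purely illustrative values).
two_qubit_gate_errors = {
    (0, 1): 6.2e-3,
    (1, 2): 8.9e-3,
    (1, 4): 1.3e-2,
    (3, 5): 7.4e-3,
}

# Surrogate for overall qubit fidelity: average two-qubit gate fidelity.
avg_error = sum(two_qubit_gate_errors.values()) / len(two_qubit_gate_errors)
avg_fidelity = 1.0 - avg_error
print(f"Average two-qubit gate fidelity: {avg_fidelity:.4%}")  # roughly 99.11%
```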
For more on these distinctions and possible aggregations of them, see my paper on nines of qubit fidelity.
Higher qubit fidelity not likely to achieve near-perfect qubits in the near term
Although I have much higher expectations for achieving near-perfect qubits in the two-year timeframe, I’m not so optimistic for the near term, other than a modest to moderate improvement in qubit fidelity.
I do expect reasonable progress towards the longer-term goal of full near-perfect qubits in the near term. We'll likely get reasonably close, just not all the way there.
Some possibilities for the near term for qubit fidelity:
- Full near-perfect qubit fidelity. A full four nines or better. Rather unlikely in the near term. The primary goal for the two-year timeframe. Good enough for the vast majority of quantum algorithms and quantum applications.
- 3.75 nines. May be close enough to near-perfect for many quantum algorithms and quantum applications. But not so likely in the near term. Good enough for a sizable majority of quantum algorithms and quantum applications.
- 3.50 nines. Perfectly reasonable goal for the near term. May be the best to hope for in the near term. Reasonably acceptable minimum achievement for the two-year timeframe. Good enough for many or even most quantum algorithms and quantum applications.
- 3.25 nines. Reasonable goal for the near term. Marginally acceptable achievement for the two-year timeframe. Good enough for a significant fraction of quantum algorithms and quantum applications.
- Three nines. Marginally reasonable goal for the near term. Bare minimum achievement for the two-year timeframe. Good enough for some quantum algorithms and quantum applications.
- 2.75 nines. Minimal reasonable goal for the near term. Disappointing achievement for the two-year timeframe. Good enough for some niche quantum algorithms and quantum applications.
- 2.50 nines. Disappointing achievement for the near term. Dismal failure for the two-year timeframe. Generally not good enough for any quantum algorithms and quantum applications.
- 2.25 nines. Only acceptable as a stepping stone, a milestone on the path to higher qubit fidelity. But not really usable in any meaningful manner.
- Under two nines. Fairly dismal failure, even in the near term.
So, I’d hope for 3.5 nines in the near term and settle for three to 3.25 nines. I’d only settle for 2.75 nines if that’s the best that the hardware guys can deliver — and if clever algorithm designers and application developers can work around or mitigate the remaining errors.
Anything less and I’d consider the field to have failed in the near term and likely to get stuck in at least a mini Quantum Winter until three to 3.50 nines of qubit fidelity can be achieved.
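To give a rough feel for what these nines targets buy, here is a back-of-the-envelope sketch that converts a nines figure (under the counting convention above) back to a two-qubit gate error rate and estimates how many two-qubit gates a circuit can contain before its overall success probability drops below 50%. It assumes independent, uncorrelated gate errors and ignores measurement errors and decoherence, so treat the numbers as order-of-magnitude guidance only.

```python
import math

def error_rate(nines: float) -> float:
    """Convert a nines count back to an error rate under the counting
    convention used in this paper, e.g. 3.5 nines -> 99.95% -> 5e-4."""
    whole = int(nines)
    frac = round(nines - whole, 6)
    digits = "9" * whole + (str(frac)[2:] if frac else "")
    return 1.0 - float("0." + digits)

def max_two_qubit_gates(nines: float, target_success: float = 0.5) -> int:
    """Rough ceiling on two-qubit gate count before the overall circuit
    success probability falls below target_success, assuming independent errors."""
    e = error_rate(nines)
    return int(math.log(target_success) / math.log(1.0 - e))

for n in (2.75, 3.0, 3.25, 3.5, 4.0):
    print(f"{n} nines -> roughly {max_two_qubit_gates(n)} two-qubit gates")
```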
Coherence time will limit the degree to which SWAP networks can be used to simulate connectivity
Although coherence time and circuit depth will eventually be limiting factors, my thought in the context of this paper was that they would not be the critical limiting factors until qubit fidelity improved dramatically. So, people shouldn't worry about coherence time and circuit depth until after qubit fidelity is enhanced.
An exception to that logic is that for quantum computers with weak connectivity, such as transmon qubits with only nearest-neighbor connectivity, the excessive SWAP networks required to simulate connectivity could quickly balloon the size of quantum circuits so that coherence time and circuit depth do indeed become critical gating factors much sooner. But… unless qubit fidelity improves significantly, such excessive SWAP networks won't be feasible anyway without incurring excessive gate errors.
Of course, my preferred solution would be to architect improved connectivity, but as previously noted that may not be likely in the near term.
So, improved coherence time can enable:
- Deeper circuits.
- Larger circuits.
- Improved effective connectivity — if relying on SWAP networks (see the rough cost sketch below).
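To make the SWAP-network concern concrete, here is a rough sketch of the two-qubit gate cost of interacting two qubits that sit some distance apart on a simple linear nearest-neighbor chain, and how quickly the residual fidelity decays, assuming an illustrative 99.9% two-qubit gate fidelity. Real transpilers and richer layouts will do better; this is only meant to show the flavor of the overhead.

```python
def swap_overhead(distance: int) -> int:
    """Two-qubit gates needed to perform one CNOT between qubits that are
    'distance' apart on a linear nearest-neighbor chain: route one qubit
    over with SWAPs (3 CNOTs each), then apply the target CNOT."""
    swaps = max(distance - 1, 0)
    return 3 * swaps + 1

def residual_fidelity(distance: int, gate_fidelity: float = 0.999) -> float:
    """Rough fidelity remaining after the SWAP network, assuming independent errors."""
    return gate_fidelity ** swap_overhead(distance)

for d in (1, 2, 5, 10, 20):
    print(f"distance {d:2d}: {swap_overhead(d):3d} gates, "
          f"residual fidelity ~ {residual_fidelity(d):.3f}")
```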
Two paths to greater circuit depth — longer coherence time or faster gate execution time
Coherence time does limit circuit depth, but there is another path to greater circuit depth besides increasing coherence time — faster gate execution time.
Which path will be easier or more feasible will vary from time to time, but greater coherence time or faster gate execution time are both viable paths to greater circuit depth.
Faster gate execution time is also another path to using SWAP networks to simulate connectivity for transmon qubits, but only to the degree that circuit depth was the limiting factor. If qubit fidelity is the limiting factor, faster gate execution time won’t improve the degree to which SWAP networks can be used to simulate qubit connectivity.
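A quick sketch of the tradeoff: the number of sequential gate layers that fit within the coherence window scales with coherence time divided by gate execution time, so doubling the former or halving the latter buys roughly the same depth. The numbers and the 10% safety factor below (the rule of thumb that a circuit should use only a small fraction of the coherence time) are illustrative assumptions, not measured device values.

```python
def circuit_depth_budget(coherence_time_us: float, gate_time_ns: float,
                         safety_factor: float = 0.1) -> int:
    """Rough number of sequential gate layers that fit comfortably within
    the coherence time, keeping the circuit to a small fraction of it."""
    return int(safety_factor * coherence_time_us * 1000.0 / gate_time_ns)

print(circuit_depth_budget(100.0, 300.0))  # ~33 layers (baseline assumption)
print(circuit_depth_budget(200.0, 300.0))  # ~66 layers (double the coherence time)
print(circuit_depth_budget(100.0, 150.0))  # ~66 layers (halve the gate time)
```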
The IBM 127-qubit Eagle didn’t address the qubit fidelity issue
I was really disappointed in November 2021 when the brand new IBM 127-qubit Eagle quantum processor was introduced by IBM but failed to deliver any improvement in qubit fidelity.
Worse, no explanation or excuse was given by IBM.
I'm hopeful that IBM may deliver several upgrades to Eagle over the next year, as they did over the past year with their 27-qubit Falcon quantum processor, but they haven't made any such commitments, so we'll have to see.
So, as it stands now, in general, there's no real benefit to moving up from a 27-qubit Falcon to a 127-qubit Eagle. There may be some niche exceptions, but this is the general case.
Preview of the IBM 433-qubit Osprey
I would dearly love to know what capabilities the upcoming IBM 433-qubit Osprey quantum processor will deliver later this year, but other than 433 qubits, we simply don’t know.
I would hope that it has improved qubit fidelity, but we just don’t know at this stage. We’ll have to wait and see. The recent 127-qubit Eagle didn’t deliver improved qubit fidelity, so I’m primed for further disappointment, but still hopeful.
I would hope that it has some improvement to qubit connectivity, but IBM hasn’t hinted at any, so I’m not holding my breath. But I do hope they at least deliver a roadmap and set expectations for qubit connectivity improvements in future years and future processors.
We’ll simply have to wait until November to see.
Further improvement to qubit fidelity in the IBM 27-qubit Falcon?
IBM did deliver improvements to qubit fidelity over the past year for their 27-qubit Falcon quantum processor, including hinting at achieving three nines of qubit fidelity for a 2-qubit CNOT gate, at least in one test configuration. It would be nice if they delivered further improvements over the next year, but they've made no commitments or even informal promises. Still, I remain hopeful.
Is 27 qubits the best we can do for the near term?
Yes, we have quantum computers with more than 27 qubits, but with limited qubit fidelity and limited qubit connectivity, as well as limited coherence time, we’re very limited as to the size of quantum circuit which can be executed and deliver high-quality results. In fact I don’t recall any published papers for algorithms using more than about 23 qubits to address the kinds of applications often touted as ideal for quantum computers.
Sure, there are esoteric computer science experiments that can use more qubits, but they’re not addressing practical, real-world problems.
So if somebody has an algorithm using 16 to 24 qubits on a 27-qubit or 53-qubit quantum computer and you give them a 40, 65, 80, 100, or 127-qubit quantum computer, they generally can't easily take advantage of all of those extra qubits.
Where are all of the 40-qubit algorithms?
I keep saying that we need to see 32 to 40-qubit algorithms, even if run only on simulators, but I’m not seeing such algorithms.
To me, this is a measure of how limited current quantum computers are today. It’s unclear exactly which limiting factors are causing this dearth of 40-qubit algorithms, but a short list of candidates is:
- Low qubit fidelity.
- Limited qubit connectivity.
- Limited coherence time.
- Limited circuit depth.
- Lack of rich algorithmic building blocks.
- Lack of experience designing complex algorithms.
And for the purposes of this paper, low qubit fidelity is the #1 limiting technical factor precluding 40-qubit algorithms. With limited qubit connectivity a strong runner-up.
For more on this dearth of 40-qubit algorithms, see my paper:
- Where Are All of the 40-qubit Quantum Algorithms?
- https://jackkrupansky.medium.com/where-are-all-of-the-40-qubit-quantum-algorithms-14b711017086
A big part of the problem is scalability of quantum algorithms…
Need for automatically scalable quantum algorithms
One issue that is under the control of quantum algorithm designers and not an inherent limitation of the available hardware per se is that generally quantum algorithms have not been designed to be automatically scalable. Generally, they are designed to fit the limitations of a particular quantum computer and can’t easily be moved to a new, larger quantum computer and easily use the additional qubits to handle larger input data or larger calculations. Sometimes, yes, but generally, no.
The real point here is that this is an advance which is fully in the hands of quantum algorithm designers. They can make progress without any hardware advances.
Granted, enhanced qubit fidelity and qubit connectivity would be a big help for scaling of many quantum algorithms, but the basic scaling design is within the hands of the quantum algorithm designer. They can at least design for scaling even if the hardware is not quite ready yet. And they can also use simulation while waiting for the hardware to catch up.
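As a minimal illustration of designing for scaling, here is a sketch of a circuit builder that is parameterized by qubit count rather than hardwired to a particular machine, assuming Qiskit is available. The algorithm itself (a GHZ state) is trivial; the point is that the same source code targets a small simulator today and a larger machine later by changing only one parameter.

```python
from qiskit import QuantumCircuit

def build_ghz(num_qubits: int) -> QuantumCircuit:
    """Build an n-qubit GHZ circuit; nothing here depends on a device size."""
    qc = QuantumCircuit(num_qubits)
    qc.h(0)                      # put the first qubit in superposition
    for i in range(num_qubits - 1):
        qc.cx(i, i + 1)          # entangle each qubit with the next
    qc.measure_all()
    return qc

small = build_ghz(5)    # fits today's hardware and simulators
large = build_ghz(40)   # same code, larger input when the hardware catches up
```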
For more on automatic scalability of quantum algorithms, see my paper:
- Staged Model for Scaling of Quantum Algorithms
- https://jackkrupansky.medium.com/staged-model-for-scaling-of-quantum-algorithms-d1070056907f
Limited connectivity is more of an absolute barrier — all or nothing, incremental advances are not really possible
Many advances are compatible with an incremental approach — they are not all or nothing propositions. But limited connectivity is an exception. Trapped-ion and neutral-atom quantum computers are an exception since they have full, unlimited any to any connectivity, but the connectivity of transmon qubits is hardwired and very limited, such as nearest neighbor. Adding further connectivity would be very expensive and problematic, at best, so there is no real potential for incremental improvement short of an architectural change which might offer full any to any connectivity or something close to it.
Granted, higher qubit fidelity does enable longer SWAP networks which can effectively yield incremental progress on connectivity, but that's a poor man's connectivity and not a very satisfying long-term solution to limited qubit connectivity. And it can accelerate bumping into coherence time as a limiting factor as well.
In short, limited connectivity is a serious problem which can only be addressed with a dramatic architectural change rather than incremental improvement over time.
Advances not likely in the near term
Some of the more interesting advances which are unlikely to get included in the near term include:
- Quantum error correction (QEC). Some number of years required.
- Larger quantum Fourier transform (QFT). Lucky if we can get even 12 or 16 qubits in the near term.
- Greater connectivity. Significant architectural changes required.
- Fine granularity of phase and probability amplitude. Unless it’s a quick fix in the firmware. But more sophisticated hardware and architectural changes are likely required.
- Quantum networking. Much research is needed.
- Quantum volume alternative. Unless somebody comes up with one shortly.
- Quantum-native programming languages. Depend on more advanced programming models.
- More advanced quantum programming models. Still too hard and too unknown, even if it is desperately needed. Much research is required. Not really needed until we have a moderate range of qubits with high fidelity anyway. Figure three to five years.
Quantum error correction (QEC) is a critical priority, but not in the near term, except for research
It may be three to seven years before quantum error correction (QEC) becomes practical, generally available, and commonly used. It definitely won't happen in the near term, not in the next year, and not even in the next two years.
Research is of course needed over the next few years, but that won’t result in any useful capability in the near term.
For more on quantum error correction, see my paper:
- Preliminary Thoughts on Fault-Tolerant Quantum Computing, Quantum Error Correction, and Logical Qubits
- https://jackkrupansky.medium.com/preliminary-thoughts-on-fault-tolerant-quantum-computing-quantum-error-correction-and-logical-1f9e3f122e71
Advances and capabilities not considered as critical gating or limiting factors in the near term
These are advances or capabilities which might indeed have significant value and might even be feasible in the near term but simply aren’t critical gating or limiting factors for significant progress of quantum computing in the near term, over the next year:
- More qubits. We already have plenty for many use cases — they just don’t have sufficient fidelity, connectivity, or fine enough granularity of phase or probability amplitude.
- Support software and tools. They are important, but generally they can be designed and implemented relatively easily and with low technical risk so that they are not true and substantial advances per se. Generally they will make life easier, but quantum algorithm designers and quantum application developers can generally get along (or occasionally limp along) without them, or with only primitive support software and tools.
- An alternative to the Quantum Volume metric. Such as for more than 50 qubits, or even 40, 32, or 28, or 24 qubits.
- A more advanced programming model. This will be essential at some stage in the future, but there are much more pressing needs in the near term.
- A quantum-native programming language. Ditto.
An alternative to the Quantum Volume metric is not essential for the near term
The basic issue or limitation with the Quantum Volume metric is that it is limited to roughly 50 qubits — or maybe even to only 40, 32, 28, 24, or fewer qubits — since it requires simulation of the quantum circuit which is being run on the quantum computer. So it’s really a limitation of the classical simulation software.
This will be a critical limiting factor at some stage, but right now and probably for the rest of the coming year there will be any number of critical limiting factors which prevent quantum circuits using 24 or more qubits from running correctly on real quantum computers.
Qubit fidelity and qubit connectivity are the top two candidates for critical limiting factors which prevent Quantum Volume from running into the 50-qubit simulation limit.
For example, the new IBM 127-qubit Eagle quantum processor has a Quantum Volume of only 64, so essentially only six qubits can be used before low qubit fidelity and limited connectivity cause circuits to deliver incorrect results, so there is no need to simulate circuits with more than six qubits.
But if a trapped-ion or neutral-atom quantum computer can manage to support circuits with 50 or more qubits over the coming year, they may hit the 50-qubit simulation limit. They have unlimited qubit connectivity, but qubit fidelity is still only so-so and may prevent them from delivering correct results for 50-qubit circuits anyway.
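For reference, Quantum Volume is reported as 2^n, where n is the size of the largest square circuit (n qubits by n layers) that the machine can run reliably, so the usable circuit width is just the base-2 logarithm of the reported number. That is why a Quantum Volume of 64 corresponds to only about six usable qubits, far below any 50-qubit simulation ceiling.

```python
import math

def effective_qubits(quantum_volume: int) -> int:
    """Usable 'square circuit' width implied by a reported Quantum Volume."""
    return int(math.log2(quantum_volume))

print(effective_qubits(64))     # 6  -- e.g. a machine reporting QV 64
print(effective_qubits(2**50))  # 50 -- roughly where classically simulating
                                #       the QV test circuits becomes impractical
```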
Advances in fine granularity of phase and probability amplitude not so likely in the near term
Although I really would like to see use of quantum Fourier transform (QFT), quantum phase estimation (QPE), quantum amplitude estimation (QAE), and amplitude amplification over the coming year, it does seem unlikely since all of these capabilities depend on fine granularity of phase and probability amplitude, at least for nontrivial use cases.
Sure, it is still possible that we could see some incremental advance, especially if it focuses on firmware or the classical digital and analog hardware of qubit control, but anything more than a modest incremental advance would likely require a significant hardware redesign, and may run into theoretical and practical limitations as well.
The granularity of phase and probability amplitude may be driven by a combination of:
- Reduction and limitation of noise and other interference.
- Precision of the digital to analog converters (DACs) used to convert digital data to analog form to be applied to the qubit hardware.
High-precision DACs are expensive, use lots of power, have lots of wiring, and are susceptible to noise. They’re difficult to work with. And there are limits to their precision.
Lower-precision DACs are cheaper, more efficient, require less wiring, and are more resilient in the face of noise and other interference. Unfortunately, they are incapable of supporting very-fine granularity of phase and probability amplitude.
It’s a difficult tradeoff. You can understand why engineers might choose coarser control of phase and probability amplitude.
I expect improvement over time, but not necessarily in the very near future.
The more difficult factors are two-fold:
- Practical considerations. Availability and cost of fine-granularity DACs. And practical limitations on the precision of DACs. 8-bit, 16-bit, 18-bit, 20-bit, and 32-bit precision is available. What's actually practical in the context of a quantum computer and qubit control is another matter, unclear and never documented.
- Theoretical considerations and limits of physics. What if an application really does need a 48-bit or 80-bit quantum Fourier transform (or Shor's factoring algorithm needs 4096 or 8192-bit QFT)? What does the underlying physics support even if you had ideal digital and analog logic? (A rough counting sketch follows this list.)
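To put rough numbers on the theoretical side, the textbook n-qubit quantum Fourier transform contains controlled-phase rotations as small as 2 pi / 2^n, so distinguishing those angles requires on the order of n bits of effective phase resolution in the control electronics, before noise, drift, and analog imperfections push the real requirement higher. The sketch below is only that crude counting argument, not a statement about any particular hardware.

```python
import math

def smallest_qft_phase(num_qubits: int) -> float:
    """Smallest controlled-phase angle in a textbook n-qubit QFT: 2*pi / 2^n."""
    return 2.0 * math.pi / 2.0 ** num_qubits

def phase_bits_needed(num_qubits: int) -> int:
    """Crude lower bound: distinguishing 2^n phase values takes at least
    n bits of effective phase resolution (ignoring noise and drift)."""
    return num_qubits

for n in (8, 16, 32, 48):
    print(f"{n}-qubit QFT: smallest angle ~ {smallest_qft_phase(n):.2e} rad, "
          f"needs >= {phase_bits_needed(n)} bits of phase resolution")
```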
Vendors of quantum computers need to clearly document their capabilities in terms of the fine granularity of phase and probability amplitude.
We need to know if it even makes sense to contemplate future quantum algorithms using quantum Fourier transforms using:
- 8 qubits. Hopefully a slam dunk. I would hope this could happen over the next year.
- 12 qubits. Hopefully a slam dunk. But when? Again, one can hope for the next year, but that would require a number of other advances, and possibly upgrades to the qubit control hardware.
- 16 qubits. Unlikely over the next year.
- 20 qubits. Should be feasible. But not over the next year.
- 24 qubits. May or may not be feasible.
- 28 qubits. May or may not be feasible.
- 32 qubits. May or may not be feasible.
- 40 qubits. Questionable feasibility.
- 48 qubits. Dubious feasibility.
- 56 qubits. Beyond speculation at this stage.
- 64 qubits. Ditto.
- 72 qubits. Ditto.
- 80 qubits. Ditto.
- 96 qubits. Ditto.
- 128 qubits. Ditto.
- And beyond 128 qubits. Ditto.
We can speculate about some sort of idealistic fantasy Vunder-DAC which supports far more than 32 bits of precision. But such a device does not exist today, and there is no prospect of it existing in a year, two years, or maybe even ever. The hardware engineers and researchers need to come clean as to what we can expect and when we can expect it.
For more on phase, see my paper:
- Beware of Quantum Algorithms Dependent on Fine Granularity of Phase
- https://jackkrupansky.medium.com/beware-of-quantum-algorithms-dependent-on-fine-granularity-of-phase-525bde2642d8
A more advanced quantum programming model is not likely in the near term
I do indeed strongly believe that we need a much more powerful and advanced quantum programming model to enable non-elite technical staff to exploit the power of quantum computers, but…
- It's not the critical technical gating factor. Lack of high-fidelity qubits means the hardware can support only relatively short quantum circuits, while an advanced quantum programming model would be aimed at larger, more sophisticated quantum algorithms.
- The conceptualization of a more advanced quantum programming model is still an open research question. Could take at least a few years of conceptual development and experimentation before the conceptual model matures and serious, production-scale implementation can begin.
- A more advanced programming model would be more focused on larger algorithms with many more qubits. Limited utility for 32 to 40 or maybe even 80 qubits.
In short, as desirable as a more advanced quantum programming model would be in the near term, it wouldn’t be the top, #1 priority.
Advances in qubit fidelity have the added benefit of enabling other advances
Enhanced qubit fidelity doesn’t enable all other advances, but many of them. Some examples:
- Greater simulated connectivity using SWAP networks. Critically limited by low qubit fidelity.
- Greater circuit size.
- Greater circuit depth.
- Nontrivial quantum Fourier transform (QFT).
- Nontrivial quantum phase estimation (QPE).
- Nontrivial quantum amplitude estimation (QAE).
- Nontrivial amplitude amplification.
- More advanced and more sophisticated quantum circuits.
- Much richer collection of algorithmic building blocks.
Many advances will eventually bump into limitations in other capabilities
One of the reasons for pursuing qubit fidelity as the top priority advance is that so many other advances won't be able to progress very far without sufficient qubit fidelity. For example, quantum Fourier transform, quantum phase estimation, and greater circuit depth all depend on it.
Generally, every advance will eventually run into or bump up against the limitations of some other capability. Or multiple limitations.
Even qubit fidelity will eventually bump into limitations of other capabilities, such as:
- Limited connectivity. SWAP networks can dilute or consume too much of the available qubit fidelity.
- Coherence time. Even with very high fidelity, eventually qubits (or gates, actually) will bump up against limited coherence time.
Even further advances in qubit fidelity will have minimal or no beneficial effect once such limits have been hit.
In any case, one must always be cognizant of how far each advance can go before it hits some limit or limits of other capabilities.
Ordering of advances — not so easy to predict or plan
As just discussed, any given advance may be limited in its progress by other advances which haven’t yet occurred. This suggests that there is some ideal ordering of advances so that any given advance isn’t attempted until there are no further advances that would limit the given advance.
That is likely true, but also somewhat idealistic, primarily because a lot of advances may be limited by a lack of knowledge or technical capabilities needed to implement that advance, such that advances cannot always be simply scheduled as desired or planned in detail in advance.
Some advances may not be feasible or practical over the coming year or even the next two years, or even longer. And some advances may not be feasible or practical until six or nine months from now.
In short, it’s not possible for me to lay out the precise ordering of advances at this time in this paper. Some of that will become more apparent as the year unfolds, and some of it may not even be known before the moment that a given advance commences — or is finished.
Possible that some qubit technologies might do better than others in the near term
Trapped-ion and neutral-atom quantum computers have some inherent advantages over transmon qubits, including:
- Longer coherence time.
- Generally better qubit fidelity. Although transmon qubits may be catching up.
- Greater connectivity. Full any to any connectivity.
It’s not completely clear, but since gate execution time is supposedly slower for trapped-ion and neutral-atom qubits, transmon qubits may have an advantage for circuit depth even though trapped-ion and neutral-atom qubits technically have longer coherence time. Better documentation and specifications from vendors would help to answer such questions.
There may be other advantages that I am not aware of.
Limiting and critical gating factors may be algorithm-specific or application-specific
Although low qubit fidelity may be the most critical gating factor for progress for many or most quantum algorithms and quantum applications, there may be some quantum algorithms or quantum applications for which other advances or capabilities are the critical gating or limiting factors, such as:
- Qubit connectivity.
- Coherence time or circuit depth.
- Fine granularity of phase or probability amplitude.
- General lack of support for nontrivial quantum Fourier transform (QFT) or quantum phase estimation (QPE).
Special needs for variational methods
Variational methods likely have their own needs and priorities, which are likely rather distinct from the needs and priorities for quantum Fourier transform (QFT) and quantum phase estimation (QPE). But I won't attempt to ferret them out here in this paper since I consider variational methods to be a dead-end, poor man's substitute for the raw power of QFT and QPE, particularly for quantum computational chemistry.
I can surmise that higher qubit fidelity would be equally beneficial to variational methods, but I haven’t confirmed that.
Ditto for greater connectivity.
Whether finer granularity for phase and probability amplitude would benefit variational methods is an open question from my perspective.
More capable simulators really are needed, but…
Part of me really wants to give more capable simulators a significant priority over the near term and longer, but there are already so many high priorities. So it doesn't make it to the top as the highest near-term priority, but it does deserve an honorable mention.
Some improvements needed are:
- Greater capacity.
- Higher performance.
- More qubits.
- Greater circuit depth.
- More analysis tools.
- More debugging tools.
- Better and more accurate noise models. Exactly match existing and proposed quantum computers, so that a simulation run is an accurate reflection of running on a real quantum computer.
- Exploit distributed computing. Much greater capacity and performance.
- In summary, deliver great simulation of 32 to 40-qubit quantum circuits.
In short, I really hope somebody gives it a top priority for the near term even if I can’t right now.
Use simulation to find limits for benefits
Frequently it will be unclear how extensive or pervasive the benefits of an advance will be, or how far the advance can go. For example, how high can qubit fidelity be pushed, and at what point does further improvement yield diminishing returns? Simulation can help to evaluate such questions and issues.
But, this depends on the ability of the simulator to be configured to accurately reflect the details and consequences of any particular advance as well as the details of the overall underlying quantum computer even before the advance enters the picture.
All tools have their limits, but simulation should have some very significant value for evaluating the benefits of proposed advances to quantum computing technology even before the advance is incorporated into a real quantum computer.
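As one concrete way to do this, here is a minimal sketch that sweeps a two-qubit depolarizing error rate in a noisy simulator and watches how often a simple 12-qubit GHZ circuit still returns a correct result. It assumes Qiskit and Qiskit Aer are installed (module paths can vary across versions), uses a generic depolarizing noise model rather than a calibrated device model, and is meant only to illustrate the approach.

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator
from qiskit_aer.noise import NoiseModel, depolarizing_error

def ghz_success_rate(num_qubits: int, two_qubit_error: float,
                     shots: int = 4096) -> float:
    """Fraction of shots landing on the two correct GHZ outcomes (all zeros
    or all ones) when every CX gate suffers a depolarizing error."""
    qc = QuantumCircuit(num_qubits)
    qc.h(0)
    for i in range(num_qubits - 1):
        qc.cx(i, i + 1)
    qc.measure_all()

    noise = NoiseModel()
    noise.add_all_qubit_quantum_error(depolarizing_error(two_qubit_error, 2), ["cx"])
    sim = AerSimulator(noise_model=noise)
    counts = sim.run(transpile(qc, sim), shots=shots).result().get_counts()
    good = counts.get("0" * num_qubits, 0) + counts.get("1" * num_qubits, 0)
    return good / shots

# Sweep the two-qubit error rate to see where results stop being meaningful.
for error in (1e-2, 5e-3, 1e-3, 5e-4):
    print(f"error {error:.0e}: success {ghz_success_rate(12, error):.2f}")
```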
Any dispute as to the most urgent advance?
I do think my selection of higher qubit fidelity as the most urgent advance for quantum computing for the near term is quite reasonable, but I do recognize that some may dispute it and assign a higher priority to some other capability, feature, or characteristic.
Some may feel that connectivity is a more urgent priority, or raw qubit count. Or maybe even quantum error correction (QEC).
And some may agree with my real preference, support for nontrivial quantum Fourier transform and quantum phase estimation (QPE), even though that effectively puts a high priority on higher qubit fidelity anyway. As well as support for finer granularity of phase and probability amplitude.
What would be the second most needed advance?
Once the need for higher qubit fidelity is taken care of (or in progress), what would be next on the list, the second most needed advance in the near term? That’s difficult to say and it may be subjective and depend on the audience.
Some top candidates:
- Greater qubit connectivity.
- Longer coherence time.
- Faster gate execution time.
- Greater circuit depth.
- Finer granularity for phase and probability amplitude. To enable quantum Fourier transform (QFT), quantum phase estimation (QPE), quantum amplitude estimation (QAE), and amplitude amplification.
- Support for nontrivial quantum Fourier transform and quantum phase estimation.
- Richer collection of algorithmic building blocks.
- More capable simulators.
Part of what makes it subjective is that transmon qubits desperately need greater connectivity, but trapped-ion and neutral-atom qubits don’t need it since they already have full any to any connectivity.
Longer coherence time and increased circuit depth will be needed once qubit fidelity is much higher, but may need to wait for greater connectivity. And once again, transmon qubits need it more, while trapped-ion and neutral-atom qubits don’t need it since they already have longer coherence time.
I’m in favor of finer granularity for phase to enable nontrivial quantum Fourier transform, but not all algorithms need that capability, at least in the near term. And it may be that full near-perfect qubits are needed before quantum Fourier transform is feasible, which may not happen in the near term anyway.
And maybe richer algorithmic building blocks would be a safe bet in the near term. But, they may depend on some of these other hardware advances anyway.
I really do favor more capable simulators, but… maybe it isn’t such a top priority, so far. Ask me again in six months.
Maybe there needs to be a split:
- Focus on connectivity for transmon qubits.
- Focus on finer phase granularity for trapped-ion and neutral-atom qubits. And possibly for transmon qubits as well if greater connectivity is not feasible in the near term.
Maybe that would offer the biggest bang for the buck.
I think I’ll leave it at that, for now — inconclusive as to the second most needed advance.
My original proposal for this topic
For reference, here is the original proposal I had for this topic. It may have some value for some people wanting a more concise summary of this paper.
- What single advance in quantum computing is most needed in the near future? There are so many! What criteria to use. How near-term — three months, six months, nine months, one year? Maybe qubit fidelity, or maybe qubit connectivity, or…?
Summary and conclusions
- Higher qubit fidelity is the most needed advance in the near term — the next year or so.
- Higher qubit fidelity will deliver the most bang for the buck.
- Higher qubit fidelity enables numerous other advances, including quantum Fourier transform (QFT) and quantum phase estimation (QPE), use of SWAP networks to simulate qubit connectivity for transmon qubits, and deeper quantum circuits.
- Greater qubit connectivity is very important, but will likely require architectural changes for transmon qubits.
- Quantum error correction (QEC) is very important for the longer term and certainly a research priority in the near term, but not a practical priority for the near term.
- Higher qubit fidelity is not likely or guaranteed to achieve true near-perfect qubits in the near term — over the next year. It might take two years to get there.
- Greater coherence time and circuit depth are important for the medium term, but are not critical until qubit fidelity and qubit connectivity are addressed — only relatively shallow circuits can be executed reliably with low qubit fidelity and limited qubit connectivity.
- Different audiences may have different needs and priorities relative to each possible advance. This paper focuses on quantum algorithm designers in general as well as quantum application developers in general. Specific niche categories may have different needs and priorities than discussed here.
- Different algorithm and application categories may have different requirements for which advances should have top priority.
- There are a wide range of criteria that can be used to judge which advances should have higher priority, including: urgent need — people are struggling without it, technical benefit is very high, applies to all algorithms and applications, doesn’t rely on other advances to get started or to make progress, incremental progress is possible, enables other advances, and it would give the field a boost in momentum. And many other criteria.
- What should be the next priority after higher qubit fidelity? That’s too difficult to say — there are so many urgent priorities. But once qubit fidelity is no longer the big holdup for most algorithms, the next big holdup will quickly become obvious. I suspect it will be limited connectivity.
- The one advance that isn’t a candidate for top priority in the near term is actually the advance that gets so much of the attention in recent months and years: more qubits. We already have enough qubits for many or even most algorithms, but low qubit fidelity and limited qubit connectivity make it very difficult for many algorithms to utilize any significant fraction of those qubits.
- I do think we definitely need a much richer collection of algorithmic building blocks as soon as possible, but once again low qubit fidelity and limited qubit connectivity render this goal unachievable in the near term.
- Where are all of the 40-qubit algorithms? There are a number of limiting factors, with low qubit fidelity being at the top of the list. Higher qubit fidelity alone may not be enough to open the floodgates for 40-qubit algorithms, but it’s the top priority step to take.
- More capable simulators would be a big win in the near term, but I’d rather keep the priority focus on qubit fidelity for now. That said, I’d push hard for further research in simulators, near term, medium term, and longer term.
For more of my writing: List of My Papers on Quantum Computing.