48 Fully-connected Near-perfect Qubits As the Sweet Spot Goal for Near-term Quantum Computing

  1. In a nutshell
  2. Quantum computing desperately needs to show some realistic results addressing more than toy-like problems within the next two to three years if a Quantum Winter is to be averted
  3. What’s wrong with current quantum computers?
  4. Background
  5. Essence of the proposal
  6. Finally enable the use of quantum phase estimation to support quantum computational chemistry and finally enable at least a hint of actual quantum advantage
  7. Quantum Fourier transform (QFT) as the critical algorithmic building block
  8. Little data with a big solution space — the sweet spot for quantum computing
  9. Near-term: The next two to three years
  10. Quantum parallelism — simultaneously evaluate many alternative solutions in parallel
  11. Quantum computational leverage — the degree of quantum parallelism — how many alternative solutions are simultaneously evaluated in parallel
  12. Quantum advantage in the range of 1,000 to 1,000,000 X
  13. Sorry, but dramatic quantum advantage will remain out of reach
  14. Quantum advantage depends on the algorithm
  15. Raw quantum computational leverage must be discounted by shot count to get net quantum advantage
  16. Start with an upgraded 27-qubit quantum computer
  17. Maybe even a 36-qubit stepping stone
  18. Need near-perfect qubits with 3.25 to 4 nines of qubit fidelity
  19. Lack of fine granularity of phase and probability amplitude limits quantum Fourier transform to 20 to 32 qubits
  20. 20 bits of granularity for phase will support a 20-qubit quantum Fourier transform for quantum computational leverage of one million to one
  21. Quantum phase estimation (QPE) will finally be enabled
  22. Quantum amplitude estimation (QAE) will finally be enabled
  23. Why 48 qubits and not 56, 64, 72, 80, 96, 128, 160, or 256?
  24. More than 48 qubits doesn’t help if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits
  25. Lack of fine granularity of phase may be the ultimate limit which makes 48 qubits the largest configuration which can effectively use a quantum Fourier transform
  26. 48 qubits is likely about as high as we can go with full state vector simulation
  27. Fantasizing about a 72-qubit quantum computer
  28. Need for a quantum state bus or dynamically-routable resonators for enhanced connectivity for transmon qubits
  29. Quantum state bus or dynamically-routable resonators
  30. Is extensive classical IP a severe impediment to pursuing a quantum state bus or dynamically-routable resonators?
  31. Greater total circuit size and maximum circuit depth
  32. Where are all of the 40-qubit algorithms?
  33. Where are all of the scalable algorithms?
  34. Quantum Volume (QV) may be limited
  35. Other technical risks
  36. Will this even be feasible?
  37. When? Two to three years, or so, seems like a solid goal
  38. Earliest availability?
  39. Will this be enough? For who? Maybe… it will vary
  40. Will this be enough for quantum computational chemistry with quantum phase estimation? Maybe… it will vary
  41. What about the impact on other quantum application categories? Impact will vary
  42. How best to prepare for this 48-qubit future? Focus on simulation of 24 to 40-qubit quantum algorithms, and scalable algorithms
  43. Ramp up efforts for more powerful simulators — shoot for 48 qubits
  44. Of course we want to get way beyond this ASAP, but we’re not even close to this yet
  45. A credible and palpable goal to aim at
  46. A clear path to avert a potential Quantum Winter
  47. What’s next after this proposal has been fully implemented? Unclear and uncharted territory
  48. No clear roadmap to outline the path for quantum algorithms that can effectively exploit more than about 48 qubits for real quantum computers
  49. Quantum error correction (QEC)? Not yet required
  50. This proposal doesn’t rely on the distant fantasy of full quantum error correction
  51. Rigetti?
  52. Qubit counts for IBM
  53. What about Shor’s factoring algorithm?
  54. My original proposal for this topic
  55. Summary and conclusions

In a nutshell

  1. Quantum computing desperately needs to show some realistic results addressing more than toy-like problems within the next two to three years if a Quantum Winter is to be averted.
  2. This configuration should be sufficient to finally enable the use of quantum phase estimation to support quantum computational chemistry, at least for relatively modest molecules, and finally enable at least a hint of actual quantum advantage.
  3. Quantum Fourier transform (QFT) as the critical algorithmic building block. It enables other algorithms, including quantum phase estimation and quantum computational chemistry.
  4. Little data with a big solution space — the sweet spot for quantum computing. k qubits of input define a solution space of 2^k quantum states. So 20 qubits may seem like a very small amount of data, but 20 qubits define a solution space of 2²⁰ or one million quantum states. This means that a million possible solutions can be evaluated in parallel, simultaneously.
  5. Near-term in this paper refers to the next two to three years.
  6. Unable to use enough qubits to perform quantum computations of sufficient complexity. Plenty of qubits, but unable to use them effectively. Unable to marshal them to their full capacity.
  7. Qubits need to be usable. There are still too many impediments to effectively using the qubits we already have.
  8. Specific insufficiencies…
  9. Insufficient qubit fidelity. Mostly insufficient gate fidelity.
  10. Insufficient measurement fidelity. Generally lumped in with qubit fidelity.
  11. Insufficient qubit connectivity. Need full any to any connectivity.
  12. Need fine granularity of phase for nontrivial quantum Fourier transform.
  13. Support larger total circuit size and maximum circuit depth. Coherence time comes into play here.
  14. Need scalable algorithms. Don’t blame the hardware alone.
  15. Key points…
  16. Priority is quality over quantity. 48 qubits should be enough, but they need to be high quality.
  17. Full connectivity. But that’s a big leap from where we are today.
  18. 3.25 to 4 nines of qubit fidelity. Near-perfect qubits.
  19. Likely will still require some degree of error mitigation.
  20. 20 bits of granularity for phase. One million gradations. A 20-bit DAC.
  21. Support a 20-bit quantum Fourier transform.
  22. Quantum phase estimation (QPE) will finally be enabled. A consequence of supporting quantum Fourier transform. Enables quantum computational chemistry for greater accuracy and performance. Also enables quantum amplitude estimation (QAE).
  23. Quantum amplitude estimation (QAE) will finally be enabled.
  24. Quantum computational leverage of one million to one — 2²⁰. One parallel quantum circuit execution can simultaneously evaluate as many alternative solutions as a million classical code executions. This is the quantum advantage.
  25. Quantum advantage in the range of 1,000 to 1,000,000 X. Assuming a 10 to 20-qubit quantum Fourier transform.
  26. Sorry, but dramatic quantum advantage will remain out of reach. Not one quadrillion X (2⁵⁰) or more advantage over best classical solutions.
  27. Quantum advantage depends on the algorithm. Not the hardware itself, which simply enables the quantum algorithm.
  28. Raw quantum computational leverage must be discounted by shot count to get net quantum advantage. Even if the raw quantum advantage (quantum computational leverage) with a 20-qubit quantum Fourier transform was 1,000,000 X, a shot count of 500 would mean a net quantum advantage of only 2,000.
  29. Greater total circuit size and maximum circuit depth. No specific goal here, but some significant improvement is needed. For the sake of completeness here, we presume 2,000 gates may be executed.
  30. Scalable algorithms are needed. Even with better hardware, basic algorithm research is still needed.
  31. Of course we want to get way beyond this ASAP. But we’re not even close to this yet.
  32. A solid goal for two to three years.
  33. A credible and palpable goal to aim at.
  34. Should be able to stretch classical simulation to 48 qubits as well, which will enable debugging of 48-qubit quantum algorithms.
  35. Maybe a few more bits of granularity could be achieved in subsequent years, or not. But focus on a goal that seems achievable and then build on it as opportunities arise in future years.
  36. Lack of fine granularity of phase and probability amplitude limits quantum Fourier transform to 20 to 32 qubits.
  37. Lack of fine granularity of phase may be the ultimate limit which makes 48 qubits the largest configuration which can effectively use a quantum Fourier transform. This paper is an expansion of a section from my preceding paper: “48 fully-connected near-perfect qubits may be the sweet spot goal for near-term quantum computing” in Is Lack of Fine Granularity of Phase and Probability Amplitude the Fatal Achilles Heel Which Dooms Quantum Computing to Severely Limited Utility?
  38. Why 48 qubits and not 56, 64, 72, 80, 96, 128, 160, or 256? More than 48 qubits doesn’t help if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits.
  39. 48 qubits is likely about as high as we can go with full state vector simulation.
  40. Fantasizing about a 72-qubit quantum computer. Pure fantasy, but computational leverage of four, eight, or sixteen billion X is rather appealing.
  41. Need for a quantum state bus or dynamically-routable resonators for enhanced connectivity for transmon qubits.
  42. Is extensive classical IP a severe impediment to pursuing a quantum state bus or dynamically-routable resonators?
  43. Where are all of the 40-qubit algorithms? This proposal should enable them.
  44. Where are all of the scalable algorithms? Ditto. But basic algorithm research is also required.
  45. Quantum Volume (QV) may be limited. May be limited by maximum simulator capacity or maximum circuit size. Or maybe these limiting factors can be resolved so that a full QV 2⁴⁸ can be achieved. But QV 2⁴⁰, QV 2³⁶ or even QV 2³² may be the best that can be hoped for in the 48-qubit quantum computer proposed by this paper.
  46. How best to prepare for this 48-qubit future? Focus on simulation of 24 to 40-qubit quantum algorithms. And scalable algorithms.
  47. Ramp up efforts for more powerful simulators. Really should aim for 48 qubits. As well as better performance to handle deeper circuit depth and greater circuit size.
  48. A clear path to avert a potential Quantum Winter. If the proposed quantum computer is available, people should see some interesting and even impressive results.
  49. This proposal doesn’t rely on the distant fantasy of full quantum error correction. Near-perfect qubits will generally be good enough. And maybe some modest manual error mitigation.
  50. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity. Well short of the 48-qubit goal, but could be made available much sooner. Achieve 1,000 X minimal quantum advantage with a 10-qubit quantum Fourier transform.
  51. Maybe even a 36-qubit stepping stone. Not all that the 48-qubit goal would offer, but maybe available much sooner. Support 15 or 16-qubit quantum Fourier transform to achieve 32K to 64K X quantum advantage.
  52. Will this even be feasible? Maybe not, but I do think it is feasible based on a fair extrapolation from the current technology and recent trends, even if I can’t actually prove it at this time.
  53. When? Two to three years, or so, seems like a solid goal.
  54. Will this be enough? For who? Maybe. It will vary.
  55. Will this be enough for quantum computational chemistry with quantum phase estimation? Maybe. It will vary.
  56. What about the impact on other quantum application categories? Impact will vary.
  57. What’s next after this proposal has been fully implemented? Unclear. Uncharted territory. Maybe a few incremental improvements are likely.
  58. No clear roadmap to outline the path for quantum algorithms that can effectively exploit more than about 48 qubits for real quantum computers. Not even in theory, particularly given the inherent limitation on fine granularity of phase and probability amplitude.

Quantum computing desperately needs to show some realistic results addressing more than toy-like problems within the next two to three years if a Quantum Winter is to be averted

Tremendous progress has been made with quantum computers in the past six years, but we’re still only working with toy-like solutions to toy-like problems. We need to advance to something resembling realistic solutions to realistic real-world problems, otherwise we run the risk of spiraling down into a Quantum Winter of unfulfilled expectations.

What’s wrong with current quantum computers?

Overall, the problem with current quantum algorithms and applications:

  • Unable to use enough qubits to perform quantum computations of sufficient complexity.
  • Qubits need to be usable. There are still too many impediments to effectively using the qubits we already have. The specific insufficiencies:
  1. Qubit fidelity.
  2. Measurement fidelity. Generally lumped in with qubit fidelity.
  3. Qubit connectivity.
  4. Fine granularity of phase for nontrivial quantum Fourier transform.
  5. Total circuit size and maximum circuit depth.
  6. Scalable algorithms.

Background

Too many people and quantum computer vendors are focusing on and obsessing over pursuing hundreds, thousands, millions, and even billions of qubits. In my view that is a silly waste of effort and resources, and a distraction, if qubit quality and connectivity are not sufficient to effectively utilize those qubits in practical quantum algorithms which address practical real-world problems.

Essence of the proposal

Key points of the proposal:

  1. Priority is quality over quantity. 48 qubits should be enough, but they need to be high quality.
  2. Full connectivity. But that’s a big leap from where we are today.
  3. 3.25 to 4 nines of qubit fidelity. Near-perfect qubits.
  4. Likely will still require some degree of error mitigation. But not full, transparent, and automatic quantum error correction (QEC), which is still years in the future, over the horizon.
  5. 20 bits of granularity for phase. One million gradations. A 20-bit DAC.
  6. Support a 20-bit quantum Fourier transform.
  7. Quantum phase estimation (QPE) will finally be enabled. A consequence of supporting quantum Fourier transform. Enables quantum computational chemistry for greater accuracy and performance. Also enables quantum amplitude estimation (QAE).
  8. Quantum amplitude estimation (QAE) will finally be enabled.
  9. Quantum computational leverage of one million to one — 2²⁰. One parallel quantum circuit execution can simultaneously evaluate as many alternative solutions as a million classical code executions. This is the quantum advantage.
  10. Quantum advantage in the range of 1,000 to 1,000,000 X. Assuming a 10 to 20-qubit quantum Fourier transform.
  11. Sorry, but dramatic quantum advantage will remain out of reach. Not one quadrillion X (2⁵⁰) or more advantage over best classical solutions.
  12. Greater total circuit size and maximum circuit depth. There is no firm specific goal here yet, but some significant improvement is clearly needed. For the sake of completeness here, we presume 2,000 gates may be executed.
  13. Scalable algorithms are needed. Even with better hardware, basic algorithm research is still needed.
  14. Of course we want to get way beyond this ASAP, but we’re not even close to this yet.
  15. A solid goal for two to three years.
  16. A credible and palpable goal to aim at.
  17. Doesn’t rely on the distant fantasy of full quantum error correction. Near-perfect qubits will generally be good enough. And maybe some modest manual error mitigation.
  18. Should be able to stretch classical simulation to 48 qubits as well, which will enable debugging of 48-qubit quantum algorithms.
  19. Maybe a few more bits of granularity could be achieved in subsequent years, or not. But focus on a goal that seems achievable and then build on it as opportunities arise in future years.
  20. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity. Well short of the 48-qubit goal, but could be made available much sooner. Achieve 1,000 X minimal quantum advantage with a 10-qubit quantum Fourier transform.
  21. Maybe even a 36-qubit stepping stone. Not all that the 48-qubit goal would offer, but maybe available much sooner. Support 15 or 16-qubit quantum Fourier transform to achieve 32K to 64K X quantum advantage.

Finally enable the use of quantum phase estimation to support quantum computational chemistry and finally enable at least a hint of actual quantum advantage

This configuration should be sufficient to finally enable the use of quantum phase estimation to support quantum computational chemistry, at least for relatively modest molecules, and finally enable at least a hint of actual quantum advantage.

Quantum Fourier transform (QFT) as the critical algorithmic building block

Quantum Fourier transform (QFT) is a quantum algorithm in its own right, but it’s really only an algorithmic building block which enables other algorithms, including quantum phase estimation and quantum algorithms for quantum computational chemistry, as well as for more sophisticated algorithms such as order finding and Shor’s factoring algorithm.
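To make the building-block role concrete, here is a minimal sketch of a textbook quantum Fourier transform circuit, written against Qiskit purely for illustration (the library choice and function name are my own assumptions, not part of any vendor's offering). Note how the controlled-phase angles shrink geometrically: the finest rotation in an n-qubit QFT is 2π/2^n, which is exactly what drives the phase-granularity discussion later in this paper.

```python
from math import pi
from qiskit import QuantumCircuit

def qft_circuit(n: int) -> QuantumCircuit:
    """Textbook quantum Fourier transform on n qubits (illustrative sketch)."""
    qc = QuantumCircuit(n, name=f"qft{n}")
    for target in reversed(range(n)):
        qc.h(target)                     # Hadamard on the current qubit
        for control in range(target):
            # Controlled-phase rotation; the smallest angle used is pi / 2**(n - 1), i.e. 2*pi / 2**n.
            qc.cp(pi / 2 ** (target - control), control, target)
    for q in range(n // 2):              # reverse qubit order to complete the transform
        qc.swap(q, n - 1 - q)
    return qc

print(qft_circuit(4).draw())             # small example; the same code builds a 20-qubit QFT
```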

Little data with a big solution space — the sweet spot for quantum computing

48 qubits may not seem like a lot or seem to support quantum applications of any significant sophistication, but the best way to look at quantum computing is what I call little data with a big solution space. The general idea of quantum parallelism is that k qubits of input define a solution space of 2^k quantum states. So 20 qubits may seem like a very small amount of data, but 20 qubits define a solution space of 2²⁰ or one million quantum states. This means that a million possible solutions can be evaluated in parallel, simultaneously.
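A quick arithmetic check of the claim, in plain Python:

```python
# Little data, big solution space: k qubits of input span 2**k basis states.
for k in (10, 20, 30, 40, 48):
    print(f"{k} qubits -> {2 ** k:,} quantum states in the solution space")
# 20 qubits -> 1,048,576 states, roughly the "one million" cited above;
# 48 qubits -> about 281 trillion states.
```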

Near-term: The next two to three years

By near-term this paper is referring to the next two to three years. And this is referring to the goal or target, not what we actually have right now or in the very near future.

Quantum parallelism — simultaneously evaluate many alternative solutions in parallel

Quantum advantage basically comes down to the ability of a quantum computer to evaluate a large number of alternative solutions in parallel, which is known as quantum parallelism.

Quantum computational leverage — the degree of quantum parallelism — how many alternative solutions are simultaneously evaluated in parallel

The number of parallel evaluations performed using quantum parallelism is the quantum computational leverage of the quantum circuit or algorithm.

Quantum advantage in the range of 1,000 to 1,000,000 X

The quantum computational leverage of a quantum algorithm is essentially its quantum advantage.

Sorry, but dramatic quantum advantage will remain out of reach

The reasons why dramatic quantum advantage — computational advantage of one quadrillion X and more over a classical computer — will remain out of reach are discussed shortly, and in more detail in a separate paper of mine.

Quantum advantage depends on the algorithm

Technically, it is not the quantum computer itself which has a particular quantum advantage, but the quantum algorithm running on that quantum computer since it depends on how many qubits are used by the quantum algorithm and how they are used. This paper is presuming that a 20-qubit quantum Fourier transform is used, but that may not be the case for a particular quantum algorithm, which may use a larger or smaller quantum Fourier transform.

Raw quantum computational leverage must be discounted by shot count to get net quantum advantage

Even if a quantum algorithm has a raw quantum computational leverage of 2^k by using a quantum Fourier transform on k qubits, that raw advantage needs to be discounted (divided) by the shot count (circuit repetitions) to get the net quantum advantage.

Circuit repetitions (shots) are needed for two reasons:

  1. To compensate for hardware errors.
  2. To compensate for the inherent probabilistic nature of quantum computing, even if the hardware were ideal and had no errors.

Assuming a raw quantum computational leverage of one million (a 20-qubit quantum Fourier transform), the discounting works out as follows (see the sketch after this list):

  1. A shot count of 1,000 would yield a net quantum advantage of 1,000. A million divided by 1,000.
  2. A shot count of 100 would yield a net quantum advantage of 10,000. A million divided by 100.
  3. A shot count of 20,000 would yield a net quantum advantage of 50. A million divided by 20,000.

With the near-perfect qubits proposed by this paper:

  1. Hardware errors will decline dramatically. Fewer circuit repetitions will be required to compensate for hardware errors. The exact change is unknown at this time.
  2. The probabilistic nature of quantum computing will be unchanged. The exact same number of circuit repetitions will be required to account for the probabilistic nature of quantum computing. The total shot count will decline, but the probabilistic contribution will not change.
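A small illustrative calculation of this discounting, using the rounded one-million figure from the examples above and a few hypothetical shot counts:

```python
raw_leverage = 1_000_000          # 2**20, rounded to one million as in the examples above

for shots in (100, 500, 1_000, 20_000):
    net_advantage = raw_leverage / shots      # divide raw leverage by circuit repetitions
    print(f"shots = {shots:>6,} -> net quantum advantage ~ {net_advantage:,.0f} X")
# 100 -> 10,000 X; 500 -> 2,000 X; 1,000 -> 1,000 X; 20,000 -> 50 X
```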

Start with an upgraded 27-qubit quantum computer

A quantum computer with 48 fully-connected near-perfect qubits is the clear goal, but even that may be a bit too ambitious as a first step. It may be better to start with an upgrade to current 27-qubit designs (e.g., IBM).

Maybe even a 36-qubit stepping stone

Even once a 27-qubit design has been upgraded and proved to support a ten to twelve-qubit quantum Fourier transform, 48 qubits may still seem too ambitious, and the underlying technology just not quite ready. Maybe 36 fully-connected near-perfect qubits would be a reasonable additional stepping stone on the way to 48 qubits.

Need near-perfect qubits with 3.25 to 4 nines of qubit fidelity

The starting point for the ideal goal for two to three years is the need for near-perfect qubits with 3.25 to 4 nines of qubit fidelity. Without qubit fidelity, you have nothing.

Lack of fine granularity of phase and probability amplitude limits quantum Fourier transform to 20 to 32 qubits

The precision of a quantum Fourier transform may be the primary factor limiting the number of qubits which can effectively be used. A resolution of 24 bits may be at or beyond the practical limit for what a digital to analog converter (DAC) and the associated analog circuitry in a quantum computer can process. Optimistically, maybe 28, 30, or even 32 bits will eventually prove possible, but 20 or maybe 22 bits seems like the realistic practical limit for the digital and analog circuitry of a quantum computer. For background, see this section of my preceding paper:

  • 48 fully-connected near-perfect qubits may be the sweet spot goal for near-term quantum computing

20 bits of granularity for phase will support a 20-qubit quantum Fourier transform for quantum computational leverage of one million to one

Achieving 20 bits of granularity for phase will enable support for 20-qubit quantum Fourier transform which will provide a quantum computational leverage of 2²⁰ or one million to one. A quantum algorithm could evaluate a million possible solutions simultaneously.
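A quick back-of-the-envelope check of what 20 bits of phase granularity implies for the control electronics (the 20-bit DAC framing follows the assumption stated earlier):

```python
from math import pi

bits = 20
gradations = 2 ** bits                    # distinct phase settings a 20-bit DAC could select
smallest_angle = 2 * pi / gradations      # finest controlled-phase rotation in a 20-qubit QFT

print(f"{gradations:,} gradations of phase")            # 1,048,576
print(f"smallest rotation ~ {smallest_angle:.2e} rad")  # about 6e-06 radians
```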

Quantum phase estimation (QPE) will finally be enabled

A consequence of supporting the quantum Fourier transform is that it enables quantum phase estimation (QPE).

Quantum amplitude estimation (QAE) will finally be enabled

Another consequence of supporting the quantum Fourier transform is that it enables quantum amplitude estimation (QAE).

Why 48 qubits and not 56, 64, 72, 80, 96, 128, 160, or 256?

Quantum Fourier transform (QFT) is the key limiting factor here. It’s the most powerful algorithmic building block we have, but appears to be limited by lack of fine granularity of phase. 20 to 24 qubits may be the limit of what can be processed using a quantum Fourier transform. Or more specifically, 24 may be at or beyond the practical limit for bits which can be processed by a digital to analog converter (DAC). 24 qubits is too much of a stretch and not so likely, but 20 to 22 qubits may be within reach.

More than 48 qubits doesn’t help if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits

There is no advantage to having more than 48 qubits if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits.

Lack of fine granularity of phase may be the ultimate limit which makes 48 qubits the largest configuration which can effectively use a quantum Fourier transform

Qubit fidelity can continue to improve and eventually we are likely to achieve full connectivity, leaving lack of fine granularity of phase as possibly the ultimate limit which makes 48 qubits the largest configuration that can effectively support a nontrivial quantum Fourier transform. Again, see this section of my preceding paper:

  • 48 fully-connected near-perfect qubits may be the sweet spot goal for near-term quantum computing

48 qubits is likely about as high as we can go with full state vector simulation

Another reason for stopping at 48 qubits is that it is likely about as high as we can go with full state vector simulation.
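A rough memory estimate shows why. The sketch assumes 16 bytes per amplitude (double-precision complex), which is a common default for state vector simulators:

```python
BYTES_PER_AMPLITUDE = 16   # complex128: an 8-byte real plus an 8-byte imaginary part

for qubits in (40, 44, 48):
    amplitudes = 2 ** qubits
    tib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 40    # tebibytes for the full state vector
    print(f"{qubits} qubits -> {amplitudes:,} amplitudes -> {tib:,.0f} TiB of memory")
# 40 -> 16 TiB, 44 -> 256 TiB, 48 -> 4,096 TiB (about 4 PiB), before any workspace overhead
```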

Fantasizing about a 72-qubit quantum computer

Yes, 48 qubits is probably the limit of what we can expect to be practical, but it's at least worth contemplating, as a fantasy, the theoretical possibility of a 72-qubit configuration.

Need for a quantum state bus or dynamically-routable resonators for enhanced connectivity for transmon qubits

Trapped-ion and neutral-atom qubits implicitly support full qubit connectivity, meaning any qubit can interact directly with any other qubit, but superconducting transmon qubits are currently architected for no better than nearest-neighbor connectivity, forcing a reliance on inefficient and unreliable SWAP networks to achieve full connectivity.
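To see why SWAP networks are so costly, consider routing a two-qubit gate between qubits that are several hops apart on a nearest-neighbor layout. The arithmetic below is purely illustrative; real compilers vary in how they route:

```python
CNOTS_PER_SWAP = 3   # a SWAP gate is typically compiled into three CNOTs

def routing_overhead(distance_in_hops: int) -> int:
    """Extra CNOTs to bring two qubits adjacent on a nearest-neighbor layout (one-way;
    some routers also swap back afterwards, roughly doubling the cost)."""
    return (distance_in_hops - 1) * CNOTS_PER_SWAP

for d in (2, 4, 8):
    print(f"{d} hops apart -> ~{routing_overhead(d)} extra CNOTs, vs. 0 with full any-to-any connectivity")
```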

Quantum state bus or dynamically-routable resonators

In my own thinking I refer to two terms:

  1. Quantum state bus.
  2. Dynamically-routable resonator.

For perspective on the scale of the connectivity challenge, one dedicated resonator per pair of qubits works out to n * (n - 1) / 2 resonators (see the quick check after this list):
  1. 27 qubits would require 27 * 26 / 2 = 351 resonators.
  2. 36 qubits would require 36 * 35 / 2 = 630 resonators.
  3. 48 qubits would require 48 * 47 / 2 = 1,128 resonators.
  4. 65 qubits would require 65 * 64 / 2 = 2,080 resonators.
  5. 127 qubits would require 127 * 126 / 2 = 8,001 resonators.
  6. 433 qubits would require 433 * 432 / 2 = 93,528 resonators.
  7. 1,121 qubits would require 1,121 * 1,120 / 2 = 627,760 resonators.
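The resonator counts above are simply the number of distinct qubit pairs; a one-line check of the formula:

```python
def resonators_for_full_connectivity(n: int) -> int:
    """One dedicated pairwise resonator per distinct pair of qubits: n choose 2."""
    return n * (n - 1) // 2

for n in (27, 36, 48, 65, 127, 433, 1121):
    print(f"{n} qubits -> {resonators_for_full_connectivity(n):,} resonators")
```

The quadratic growth is presumably a large part of the appeal of a shared quantum state bus or dynamically-routable resonators, rather than a dedicated resonator for every pair.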

Is extensive classical IP a severe impediment to pursuing a quantum state bus or dynamically-routable resonators?

Buses are standard fare on classical computer systems. And there is plenty of proprietary intellectual property (IP) as well. In other words, patents. I just wonder if that classical IP is getting in the way of incorporating bus technology into quantum computers. Maybe. Or maybe not.

Greater total circuit size and maximum circuit depth

Quantum Fourier transform alone will require a significant increase in total quantum circuit size and maximum quantum circuit depth. There is no firm specific goal here yet, but some significant improvement is clearly needed. For the sake of completeness, we presume 2,000 gates may be executed. 5,000 gates would be a more desirable target, especially to enable achieving Quantum Volume (QV) of 2⁴⁸, but even 2,000 seems quite a stretch at this stage. Two hardware factors come into play (see the rough arithmetic after this list):

  1. Longer qubit coherence time. This enables more quantum logic gates to be executed.
  2. Shorter gate execution time. Shorter time to execute each quantum logic gate means that more gates can be executed in the available qubit coherence time.
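As a rough illustration of how these two factors combine, with purely hypothetical numbers (the 100 microsecond coherence time and 50 nanosecond gate time below are illustrative assumptions, not vendor specifications):

```python
coherence_time_us = 100.0    # hypothetical qubit coherence time, in microseconds
gate_time_ns = 50.0          # hypothetical average gate execution time, in nanoseconds

gate_budget = (coherence_time_us * 1_000) / gate_time_ns
print(f"~{gate_budget:,.0f} gates fit within the coherence window")   # ~2,000, matching the presumption above
```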

Where are all of the 40-qubit algorithms?

It’s rare to find quantum algorithms using even 16 or 18 qubits, and very few using more than 20 qubits. Part of the problem is the limitations of current real quantum computers, but I would note that we have classical quantum simulators which can handle 32 qubits and even 40 qubits, so one would expect that we should see some reasonable body of 32 and 40-qubit algorithms, but… we don’t. So, where are all of the 32 or 40-qubit algorithms?

Where are all of the scalable algorithms?

We wouldn’t have to ask where all of the 40-qubit algorithms are if most quantum algorithms were scalable in the first place. More specifically, dynamically scalable — the size of the algorithm is automatically adjusted based on the size of the input data and input parameters.
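For a sense of what dynamically scalable means in code, here is a toy sketch (Qiskit-style, with hypothetical function and variable names) in which the circuit width is computed from the input data rather than hard-coded:

```python
from math import ceil, log2
from qiskit import QuantumCircuit

def build_scaled_circuit(input_values: list) -> QuantumCircuit:
    """Toy example of a dynamically scalable circuit: its width is derived from the
    input data, so the same code runs unchanged on 10, 20, or 40 qubits."""
    n = max(1, ceil(log2(max(input_values) + 1)))   # qubits needed to encode the largest value
    qc = QuantumCircuit(n)
    for q in range(n):
        qc.h(q)                                     # uniform superposition over 2**n states
    # ... problem-specific gates would go here, also parameterized by n ...
    return qc

qc = build_scaled_circuit([3, 17, 1_000_000])       # yields a 20-qubit circuit
print(qc.num_qubits)
```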

Quantum Volume (QV) may be limited

Ideally, the Quantum Volume (QV) metric value for a 48-qubit quantum computer would be 2⁴⁸, but there are factors which may conspire to cause that ideal to fail to be achieved. It may be limited by maximum simulator capacity or maximum circuit size. And qubit fidelity could be a factor as well for deeper circuits. Qubit connectivity shouldn’t be an issue if full any to any qubit connectivity is supported, as required by the proposal of this paper.
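One way to see why QV 2⁴⁸ may be out of reach with a 2,000-gate budget: the Quantum Volume protocol for 2^m runs random model circuits on m qubits with m layers, each layer applying a random two-qubit block to roughly m/2 pairs, and each such block typically costs about three CNOTs. A rough estimate (protocol details simplified):

```python
CNOTS_PER_TWO_QUBIT_BLOCK = 3   # a generic two-qubit (SU(4)) block typically needs ~3 CNOTs

def qv_two_qubit_gate_estimate(m: int) -> int:
    """Rough two-qubit gate count for a Quantum Volume 2**m model circuit:
    m layers, with about m // 2 random two-qubit blocks per layer."""
    return m * (m // 2) * CNOTS_PER_TWO_QUBIT_BLOCK

for m in (32, 36, 40, 48):
    print(f"QV 2^{m}: ~{qv_two_qubit_gate_estimate(m):,} two-qubit gates")
# QV 2^48 comes to ~3,456 two-qubit gates, beyond a 2,000-gate budget,
# which is why 5,000 gates is the more desirable target mentioned earlier.
```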

Other technical risks

This paper has endeavored to cover all known technical risks. There may be some more, but at present no additional technical risks are known to the author.

Will this even be feasible?

That’s a great and fair question. This paper presumes that the technical advances needed to achieve the proposed 48-qubit quantum computer are indeed feasible, but it is very possible that they are not. I do believe that they are feasible, but that’s more of a belief, extrapolation, and speculation on my part than demonstrable fact or provable knowledge.

When? Two to three years, or so, seems like a solid goal

There are so many factors which have to come together or could conspire to subvert the goal of this paper, but for now, two to three years seems like a credible goal.

The stepping stones along the way:

  1. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity. Well short of the 48-qubit goal, but could be made available much sooner.
  2. Maybe even a 36-qubit stepping stone. Not all that the 48-qubit goal would offer, but maybe available much sooner.

Earliest availability?

Even though I would be loath to set expectations for availability any sooner than two to three years, there are many factors which if they occurred together or in rapid succession could result in earlier availability — or delayed availability if any of the factors fall short of requirements and expectations. Some factors:

  1. Roadmap of qubit fidelity. Each vendor will have their own roadmap towards greater qubit fidelity, each proceeding at their own pace, hopefully not too leisurely.
  2. Technical risk for connectivity. Improving connectivity is a huge question mark for most vendors (other than trapped-ion and neutral-atom vendors). The notion of a quantum state bus or dynamically-routable resonators may seem obvious to some of us, but devilishly difficult for vendors themselves who have to grapple with science, engineering, software, and business challenges.
  3. Trapped-ion and neutral-atom have an advantage for connectivity. At least at present, vendors of trapped-ion and neutral-atom qubit technologies have an implicit advantage since they have full any to any connectivity by the nature of their design.
  4. Lack of fine granularity of phase and probability amplitude. Too little transparency to get any reliable sense of where we are. No sense of what progress might occur and when.
And the stepping stones along the way:

  1. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity.
  2. Maybe even a 36-qubit stepping stone.

Will this be enough? For who? Maybe… it will vary

The 48-qubit quantum computer proposed by this paper would be a major advance over current offerings, but will it be enough? It will depend and vary. It may be enough for some but not enough for others. It’s hard to say.

Will this be enough for quantum computational chemistry with quantum phase estimation? Maybe… it will vary

Of particular interest is whether this proposal will be sufficient to support quantum phase estimation (QPE) to enable more sophisticated quantum computational chemistry than can be supported with variational methods as is done at present. The answer is unknown. A 20-bit quantum Fourier transform may be enough for some quantum computational chemistry applications but not for others.
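A rough sense of what a 20-bit quantum Fourier transform buys for quantum phase estimation: the eigenphase is resolved to one part in 2^20, so the energy resolution is that same fraction of whatever spectral range the chosen evolution time maps onto a full phase turn. The numbers below are purely illustrative; the 1 hartree range is an assumption, not a chemistry result:

```python
phase_bits = 20
phase_resolution = 1 / 2 ** phase_bits         # fraction of a full turn resolvable by QPE

spectral_range_hartree = 1.0                   # hypothetical energy range mapped onto one full phase turn
energy_resolution = spectral_range_hartree * phase_resolution

print(f"phase resolved to 1 part in {2 ** phase_bits:,}")
print(f"~{energy_resolution:.1e} hartree resolution over a 1 hartree range")   # ~9.5e-07 hartree
# Chemical accuracy is commonly quoted as ~1.6e-3 hartree, so 20 bits is comfortably finer,
# at least in this idealized, error-free picture.
```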

What about the impact on other quantum application categories? Impact will vary

I explicitly call out the quantum application category of quantum computational chemistry because it is computationally intensive and close enough to physics to be amenable to a quantum computing solution.

How best to prepare for this 48-qubit future? Focus on simulation of 24 to 40-qubit quantum algorithms, and scalable algorithms

Since the 48-qubit quantum computer hardware proposed by this paper is unlikely to be available for two years or so, the question is what to do before then to prepare for that eventuality. The answer is simulation — using classical quantum simulators to simulate quantum algorithms using 24 to 40 qubits and configured to emulate the proposal of this paper as closely as possible in terms of limits and error rates.
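As one concrete way to set up such a simulation, here is a minimal sketch using the Qiskit Aer simulator with a simple depolarizing noise model standing in for near-perfect qubits; the package choice and the specific error rates are my own illustrative assumptions:

```python
from qiskit_aer import AerSimulator
from qiskit_aer.noise import NoiseModel, depolarizing_error

# Roughly "3.5 nines" of gate fidelity: two-qubit error ~3e-4 (illustrative assumption)
one_qubit_error = depolarizing_error(3e-5, 1)
two_qubit_error = depolarizing_error(3e-4, 2)

noise_model = NoiseModel()
noise_model.add_all_qubit_quantum_error(one_qubit_error, ["rz", "sx", "x"])
noise_model.add_all_qubit_quantum_error(two_qubit_error, ["cx"])

# Simulator approximating the near-perfect-qubit proposal; run 24 to 40-qubit algorithms against it
simulator = AerSimulator(noise_model=noise_model)
```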

Ramp up efforts for more powerful simulators — shoot for 48 qubits

Theoretically, an enterprising engineering team should be able to put together a simulator for 42 to 44 qubits, although only for relatively shallow circuit depth.

Of course we want to get way beyond this ASAP, but we’re not even close to this yet

The proposal of this paper is not intended as the ultimate ideal for quantum computing, but simply where we need to be in two to three years or so to have a sense that we really are on track to progress towards the larger promise of quantum computing.

A credible and palpable goal to aim at

Ultimately, all that is really necessary here is to set a credible goal that gives people the feeling that we really are making decent progress.

A clear path to avert a potential Quantum Winter

Reiterating what was mentioned in the preceding section, following the proposal of this paper may well be the best we can do to establish and follow a clear path which averts the very real prospect of a Quantum Winter which could easily occur if we don’t have capable hardware such as proposed by this paper.

What’s next after this proposal has been fully implemented? Unclear and uncharted territory

A large part of the motivation for this paper was the fact that 20 to 24 qubits may be the practical limit for a quantum Fourier transform, so this proposal is based on support for a 20-qubit quantum Fourier transform. A few incremental improvements beyond that may be possible, but the territory beyond this proposal remains uncharted.

No clear roadmap to outline the path for quantum algorithms that can effectively exploit more than about 48 qubits for real quantum computers

There may be relatively clear roadmaps to get to hundreds, thousands, millions, and even billions of qubits, but the path to algorithms which can effectively exploit more than about 48 qubits (especially given the limitation on fine granularity of phase and probability amplitude) is certainly unknown at this time.

Quantum error correction (QEC)? Not yet required

The proposal of this paper does not rely on the distant promise of full-blown automatic and transparent quantum error correction (QEC). Near-perfect qubits with 3.25 to 4 nines of qubit fidelity should be good enough for many or maybe even most quantum algorithms and applications. Some manual (or compiler-generated) error mitigation may be appropriate in some situations, but near-perfect qubits should do much or most if not all of the heavy lifting, at least for modest to moderate-sized quantum algorithms.

This proposal doesn’t rely on the distant fantasy of full quantum error correction

Just to be clear, the proposal of this paper does not rely on the distant promise of full-blown automatic and transparent quantum error correction (QEC). But it does rely on near-perfect qubits, with 3.25 to 4 nines of qubit fidelity.

Rigetti?

Generally I’m not calling out specific vendors, but there are a few points about Rigetti worth noting relative to this paper:

  1. 40 qubits vs. 80 qubits. They have a new modular architecture based on 40-qubit modules. 40 qubits falls short of the 48 qubits proposed by this paper. 80 qubits is overkill, and may add overhead and reduce qubit fidelity and connectivity. It would be better if they redesigned their 40-qubit module as a 48-qubit module.
  2. Weak connectivity. All transmon qubit devices have this issue.
  3. Weak qubit fidelity. All vendors have this issue. They need to publish a roadmap for getting to near-perfect qubits.
  4. Granularity of phase. All transmon qubit devices have this issue. Transparency is needed. And a roadmap for getting to fine granularity.

Qubit counts for IBM

There is nothing special about my specific suggestions of 36 and 48 qubits; a few qubits more wouldn’t impact the proposal of this paper. That said, I wouldn’t revise the qubit counts of my proposal, with the possible exception of IBM, which has its own subtle method for deciding how many qubits appear in a given quantum computer. Some IBM-specific qubit count issues related to the proposal of this paper:

  1. 27 qubits. The existing IBM configuration. Simply with upgraded capabilities for qubit fidelity, qubit connectivity, and fine granularity of phase and probability amplitude.
  2. 36 qubits. Maybe that would be 38 under the IBM scheme — based on using a subset of the 65-qubit Hummingbird qubit topology.
  3. 48 qubits. Maybe this would be 49 under the IBM scheme, again based on the Hummingbird qubit topology.

What about Shor’s factoring algorithm?

Shor’s factoring algorithm has been hyped as the greatest capability of quantum computing and the greatest threat to encryption, but so far it’s not been a factor at all. It’s not clear when it might become a factor, or even whether it will ever be feasible for factoring very large semiprime numbers such as 2048- and 4096-bit public encryption keys. In fact, the tiny number 21 is the largest semiprime integer known to have been factored by a real quantum computer using Shor’s algorithm.

My original proposal for this topic

For reference, here is the original proposal I had for this topic. It may have some value for some people wanting a more concise summary of this paper.

  • 48 fully-connected near-perfect qubits as the sweet spot goal for near-term quantum computing. Priority is quality over quantity. Full connectivity. Near-perfect qubits with 3.25 to 4 nines of qubit fidelity. 20 bits of granularity for phase. Support a 20-bit quantum Fourier transform for computational leverage of one million to one. Of course we want to get way beyond this ASAP, but we’re not even close to this yet. A solid goal for two to three years. A credible and palpable goal to aim at. Doesn’t rely on the distant fantasy of full quantum error correction. Should be able to stretch classical simulation to 48 bits as well, which will enable debugging of 48-qubit quantum algorithms. Start with an upgraded 27-qubit processor supporting 10 to 12-qubit quantum Fourier transform. Then a 36-qubit processor supporting 15 to 16-qubit quantum Fourier transform before moving up to the full 48-qubit processor.

Summary and conclusions

  1. Quantum computing desperately needs to show some realistic results addressing more than toy-like problems within the next two to three years if a Quantum Winter is to be averted.
  2. This configuration should be sufficient to finally enable the use of quantum phase estimation to support quantum computational chemistry, at least for relatively modest molecules, and finally enable at least a hint of actual quantum advantage.
  3. Quantum Fourier transform (QFT) as the critical algorithmic building block. It enables other algorithms, including quantum phase estimation and quantum computational chemistry.
  4. Little data with a big solution space — the sweet spot for quantum computing. k qubits of input define a solution space of 2^k quantum states. So 20 qubits may seem like a very small amount of data, but 20 qubits define a solution space of 2²⁰ or one million quantum states. This means that a million possible solutions can be evaluated in parallel, simultaneously.
  5. Near-term in this paper refers to the next two to three years.
  6. Unable to use enough qubits to perform quantum computations of sufficient complexity. Plenty of qubits, but unable to use them effectively. Unable to marshal them to their full capacity.
  7. Qubits need to be usable. There are still too many impediments to effectively using the qubits we already have.
  8. Specific insufficiencies…
  9. Insufficient qubit fidelity. Mostly insufficient gate fidelity.
  10. Insufficient measurement fidelity. Generally lumped in with qubit fidelity.
  11. Insufficient qubit connectivity. Need full any to any connectivity.
  12. Need fine granularity of phase for nontrivial quantum Fourier transform.
  13. Support larger total circuit size and maximum circuit depth. Coherence time comes into play here.
  14. Need scalable algorithms. Don’t blame the hardware alone.
  15. Key points…
  16. Priority is quality over quantity. 48 qubits should be enough, but they need to be high quality.
  17. Full connectivity. But that’s a big leap from where we are today.
  18. 3.25 to 4 nines of qubit fidelity. Near-perfect qubits.
  19. Likely will still require some degree of error mitigation.
  20. 20 bits of granularity for phase. One million gradations. A 20-bit DAC.
  21. Support a 20-bit quantum Fourier transform.
  22. Quantum phase estimation (QPE) will finally be enabled. A consequence of supporting quantum Fourier transform. Enables quantum computational chemistry for greater accuracy and performance. Also enables quantum amplitude estimation (QAE).
  23. Quantum amplitude estimation (QAE) will finally be enabled.
  24. Quantum computational leverage of one million to one — 2²⁰. One parallel quantum circuit execution can simultaneously evaluate as many alternative solutions as a million classical code executions. This is the quantum advantage.
  25. Quantum advantage in the range of 1,000 to 1,000,000 X. Assuming a 10 to 20-qubit quantum Fourier transform.
  26. Sorry, but dramatic quantum advantage will remain out of reach. Not one quadrillion X (2⁵⁰) or more advantage over best classical solutions.
  27. Quantum advantage depends on the algorithm. Not the hardware itself, which simply enables the quantum algorithm.
  28. Raw quantum computational leverage must be discounted by shot count to get net quantum advantage. Even if the raw quantum advantage (quantum computational leverage) with a 20-qubit quantum Fourier transform was 1,000,000 X, a shot count of 500 would mean a net quantum advantage of only 2,000.
  29. Greater total circuit size and maximum circuit depth. No specific goal here, but some significant improvement is needed. For the sake of completeness here, we presume 2,000 gates may be executed.
  30. Scalable algorithms are needed. Even with better hardware, basic algorithm research is still needed.
  31. Of course we want to get way beyond this ASAP. But we’re not even close to this yet.
  32. A solid goal for two to three years.
  33. A credible and palpable goal to aim at.
  34. Should be able to stretch classical simulation to 48 qubits as well, which will enable debugging of 48-qubit quantum algorithms.
  35. Maybe a few more bits of granularity could be achieved in subsequent years, or not. But focus on a goal that seems achievable and then build on it as opportunities arise in future years.
  36. Lack of fine granularity of phase and probability amplitude limits quantum Fourier transform to 20 to 32 qubits.
  37. Lack of fine granularity of phase may be the ultimate limit which makes 48 qubits the largest configuration which can effectively use a quantum Fourier transform. This paper is an expansion of a section from my preceding paper: “48 fully-connected near-perfect qubits may be the sweet spot goal for near-term quantum computing” in Is Lack of Fine Granularity of Phase and Probability Amplitude the Fatal Achilles Heel Which Dooms Quantum Computing to Severely Limited Utility?
  38. Why 48 qubits and not 56, 64, 72, 80, 96, 128, 160, or 256? More than 48 qubits doesn’t help if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits.
  39. 48 qubits is likely about as high as we can go with full state vector simulation.
  40. Fantasizing about a 72-qubit quantum computer. Pure fantasy, but computational leverage of four, eight, or sixteen billion X is rather appealing.
  41. Need for a quantum state bus or dynamically-routable resonators for enhanced connectivity for transmon qubits.
  42. Is extensive classical IP a severe impediment to pursuing a quantum state bus or dynamically-routable resonators?
  43. Where are all of the 40-qubit algorithms? This proposal should enable them.
  44. Where are all of the scalable algorithms? Ditto. But basic algorithm research is also required.
  45. Quantum Volume (QV) may be limited. May be limited by maximum simulator capacity or maximum circuit size. Or maybe these limiting factors can be resolved so that a full QV 2⁴⁸ can be achieved. But QV 2⁴⁰, QV 2³⁶ or even QV 2³² may be the best that can be hoped for in the 48-qubit quantum computer proposed by this paper.
  46. How best to prepare for this 48-qubit future? Focus on simulation of 24 to 40-qubit quantum algorithms. And scalable algorithms.
  47. Ramp up efforts for more powerful simulators. Really should aim for 48 qubits. As well as better performance to handle deeper circuit depth and greater circuit size.
  48. A clear path to avert a potential Quantum Winter. If the proposed quantum computer is available, people should see some interesting and even impressive results.
  49. This proposal doesn’t rely on the distant fantasy of full quantum error correction. Near-perfect qubits will generally be good enough. And maybe some modest manual error mitigation.
  50. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity. Well short of the 48-qubit goal, but could be made available much sooner. Achieve 1,000 X minimal quantum advantage with a 10-qubit quantum Fourier transform.
  51. Maybe even a 36-qubit stepping stone. Not all that the 48-qubit goal would offer, but maybe available much sooner. Support 15 or 16-qubit quantum Fourier transform to achieve 32K to 64K X quantum advantage.
  52. Will this even be feasible? Maybe not, but I do think it is feasible based on a fair extrapolation from the current technology and recent trends, even if I can’t actually prove it at this time.
  53. When? Two to three years, or so, seems like a solid goal.
  54. Will this be enough? For who? Maybe. It will vary.
  55. Will this be enough for quantum computational chemistry with quantum phase estimation? Maybe. It will vary.
  56. What about the impact on other quantum application categories? Impact will vary.
  57. What’s next after this proposal has been fully implemented? Unclear. Uncharted territory. Maybe a few incremental improvements are likely.
  58. No clear roadmap to outline the path for quantum algorithms that can effectively exploit more than about 48 qubits for real quantum computers. Not even in theory, particularly given the inherent limitation on fine granularity of phase and probability amplitude.
