48 Fully-connected Near-perfect Qubits As the Sweet Spot Goal for Near-term Quantum Computing

Jack Krupansky
46 min read · Jun 15, 2022


Quantum computing desperately needs to show some realistic results addressing more than toy-like problems within the next two to three years if a Quantum Winter is to be averted. This informal paper proposes a quantum computer with 48 fully-connected near-perfect qubits as an ideal goal for the next two to three years to enable such results. 20 bits of granularity for phase are also required, supporting one million gradations, which should be sufficient for a 20-qubit quantum Fourier transform which can both deliver accurate results and achieve quantum computational leverage of a million to one over the best classical solutions.

This configuration should be sufficient to finally enable the use of quantum phase estimation (QPE) to support quantum computational chemistry, at least for relatively modest molecules, and finally enable at least a hint of actual quantum advantage. Quantum amplitude estimation (QAE) will also be enabled.

As intermediate milestones, maybe start with an upgraded 27-qubit quantum computer with enhanced qubit fidelity, full connectivity, and enhanced phase granularity. Then maybe a 36-qubit stepping stone for increased capacity and greater quantum advantage, somewhat sooner than the 48-qubit quantum computer would become available.

This paper is an expansion of the same topic mentioned briefly as a section in my preceding paper:

Topics discussed in this paper:

  1. In a nutshell
  2. Quantum computing desperately needs to show some realistic results addressing more than toy-like problems within the next two to three years if a Quantum Winter is to be averted
  3. What’s wrong with current quantum computers?
  4. Background
  5. Essence of the proposal
  6. Finally enable the use of quantum phase estimation to support quantum computational chemistry and finally enable at least a hint of actual quantum advantage
  7. Quantum Fourier transform (QFT) as the critical algorithmic building block
  8. Little data with a big solution space — the sweet spot for quantum computing
  9. Near-term: The next two to three years
  10. Quantum parallelism — simultaneously evaluate many alternative solutions in parallel
  11. Quantum computational leverage — the degree of quantum parallelism — how many alternative solutions are simultaneously evaluated in parallel
  12. Quantum advantage in the range of 1,000 to 1,000,000 X
  13. Sorry, but dramatic quantum advantage will remain out of reach
  14. Quantum advantage depends on the algorithm
  15. Raw quantum computational leverage must be discounted by shot count to get net quantum advantage
  16. Start with an upgraded 27-qubit quantum computer
  17. Maybe even a 36-qubit stepping stone
  18. Need near-perfect qubits with 3.25 to 4 nines of qubit fidelity
  19. Lack of fine granularity of phase and probability amplitude limits quantum Fourier transform to 20 to 32 qubits
  20. 20 bits of granularity for phase will support a 20-qubit quantum Fourier transform for quantum computational leverage of one million to one
  21. Quantum phase estimation (QPE) will finally be enabled
  22. Quantum amplitude estimation (QAE) will finally be enabled
  23. Why 48 qubits and not 56, 64, 72, 80, 96, 128, 160, or 256?
  24. More than 48 qubits doesn’t help if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits
  25. Lack of fine granularity of phase may be the ultimate limit which makes 48 qubits the largest configuration which can effectively use a quantum Fourier transform
  26. 48 qubits is likely about as high as we can go with full state vector simulation
  27. Fantasizing about a 72-qubit quantum computer
  28. Need for a quantum state bus or dynamically-routable resonators for enhanced connectivity for transmon qubits
  29. Quantum state bus or dynamically-routable resonators
  30. Is extensive classical IP a severe impediment to pursuing a quantum state bus or dynamically-routable resonators?
  31. Greater total circuit size and maximum circuit depth
  32. Where are all of the 40-qubit algorithms?
  33. Where are all of the scalable algorithms?
  34. Quantum Volume (QV) may be limited
  35. Other technical risks
  36. Will this even be feasible?
  37. When? Two to three years, or so, seems like a solid goal
  38. Earliest availability?
  39. Will this be enough? For who? Maybe… it will vary
  40. Will this be enough for quantum computational chemistry with quantum phase estimation? Maybe… it will vary
  41. What about the impact on other quantum application categories? Impact will vary
  42. How best to prepare for this 48-qubit future? Focus on simulation of 24 to 40-qubit quantum algorithms, and scalable algorithms
  43. Ramp up efforts for more powerful simulators — shoot for 48 qubits
  44. Of course we want to get way beyond this ASAP, but we’re not even close to this yet
  45. A credible and palpable goal to aim at
  46. A clear path to avert a potential Quantum Winter
  47. What’s next after this proposal has been fully implemented? Unclear and uncharted territory
  48. No clear roadmap to outline the path for quantum algorithms that can effectively exploit more than about 48 qubits for real quantum computers
  49. Quantum error correction (QEC)? Not yet required
  50. This proposal doesn’t rely on the distant fantasy of full quantum error correction
  51. Rigetti?
  52. Qubit counts for IBM
  53. What about Shor’s factoring algorithm?
  54. My original proposal for this topic
  55. Summary and conclusions

In a nutshell

  1. Quantum computing desperately needs to show some realistic results addressing more than toy-like problems within the next two to three years if a Quantum Winter is to be averted.
  2. This configuration should be sufficient to finally enable the use of quantum phase estimation to support quantum computational chemistry, at least for relatively modest molecules, and finally enable at least a hint of actual quantum advantage.
  3. Quantum Fourier transform (QFT) as the critical algorithmic building block. It enables other algorithms, including quantum phase estimation and quantum computational chemistry.
  4. Little data with a big solution space — the sweet spot for quantum computing. k qubits of input define a solution space of 2^k quantum states. So 20 qubits may seem like a very small amount of data, but 20 qubits define a solution space of 2²⁰ or one million quantum states. This means that a million possible solutions can be evaluated in parallel, simultaneously.
  5. Near-term in this paper refers to the next two to three years.
  6. Unable to use enough qubits to perform quantum computations of sufficient complexity. Plenty of qubits, but unable to use them effectively. Unable to marshal them to their full capacity.
  7. Qubits need to be usable. There are still too many impediments to effectively using the qubits we already have.
  8. Specific insufficiencies…
  9. Insufficient qubit fidelity. Mostly insufficient gate fidelity.
  10. Insufficient measurement fidelity. Generally lumped in with qubit fidelity.
  11. Insufficient qubit connectivity. Need full any to any connectivity.
  12. Need fine granularity of phase for nontrivial quantum Fourier transform.
  13. Support larger total circuit size and maximum circuit depth. Coherence time comes into play here.
  14. Need scalable algorithms. Don’t blame the hardware alone.
  15. Key points…
  16. Priority is quality over quantity. 48 qubits should be enough, but they need to be high quality.
  17. Full connectivity. But that’s a big leap from where we are today.
  18. 3.25 to 4 nines of qubit fidelity. Near-perfect qubits.
  19. Likely will still require some degree of error mitigation.
  20. 20 bits of granularity for phase. One million gradations. A 20-bit DAC.
  21. Support a 20-qubit quantum Fourier transform.
  22. Quantum phase estimation (QPE) will finally be enabled. A consequence of supporting quantum Fourier transform. Enables quantum computational chemistry for greater accuracy and performance. Also enables quantum amplitude estimation (QAE).
  23. Quantum amplitude estimation (QAE) will finally be enabled.
  24. Quantum computational leverage of one million to one — 2²⁰. One parallel quantum circuit execution can simultaneously evaluate as many alternative solutions as a million classical code executions. This is the quantum advantage.
  25. Quantum advantage in the range of 1,000 to 1,000,000 X. Assuming a 10 to 20-qubit quantum Fourier transform.
  26. Sorry, but dramatic quantum advantage will remain out of reach. Not one quadrillion X (2⁵⁰) or more advantage over best classical solutions.
  27. Quantum advantage depends on the algorithm. Not the hardware itself, which simply enables the quantum algorithm.
  28. Raw quantum computational leverage must be discounted by shot count to get net quantum advantage. Even if the raw quantum advantage (quantum computational leverage) with a 20-qubit quantum Fourier transform was 1,000,000 X, a shot count of 500 would mean a net quantum advantage of only 2,000.
  29. Greater total circuit size and maximum circuit depth. No specific goal here, but some significant improvement is needed. For the sake of completeness here, we presume 2,000 gates may be executed.
  30. Scalable algorithms are needed. Even with better hardware, basic algorithm research is still needed.
  31. Of course we want to get way beyond this ASAP. But we’re not even close to this yet.
  32. A solid goal for two to three years.
  33. A credible and palpable goal to aim at.
  34. Should be able to stretch classical simulation to 48 qubits as well, which will enable debugging of 48-qubit quantum algorithms.
  35. Maybe a few more bits of granularity could be achieved in subsequent years, or not. But focus on a goal that seems achievable and then build on it as opportunities arise in future years.
  36. Lack of fine granularity of phase and probability amplitude limits quantum Fourier transform to 20 to 32 qubits.
  37. Lack of fine granularity of phase may be the ultimate limit which makes 48 qubits the largest configuration which can effectively use a quantum Fourier transform. This paper is an expansion of a section from my preceding paper: “48 fully-connected near-perfect qubits may be the sweet spot goal for near-term quantum computing” in Is Lack of Fine Granularity of Phase and Probability Amplitude the Fatal Achilles Heel Which Dooms Quantum Computing to Severely Limited Utility?
  38. Why 48 qubits and not 56, 64, 72, 80, 96, 128, 160, or 256? More than 48 qubits doesn’t help if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits.
  39. 48 qubits is likely about as high as we can go with full state vector simulation.
  40. Fantasizing about a 72-qubit quantum computer. Pure fantasy, but computational leverage of four, eight, or sixteen billion X is rather appealing.
  41. Need for a quantum state bus or dynamically-routable resonators for enhanced connectivity for transmon qubits.
  42. Is extensive classical IP a severe impediment to pursuing a quantum state bus or dynamically-routable resonators?
  43. Where are all of the 40-qubit algorithms? This proposal should enable them.
  44. Where are all of the scalable algorithms? Ditto. But basic algorithm research is also required.
  45. Quantum Volume (QV) may be limited. May be limited by maximum simulator capacity or maximum circuit size. Or maybe these limiting factors can be resolved so that a full QV 2⁴⁸ can be achieved. But QV 2⁴⁰, QV 2³⁶ or even QV 2³² may be the best that can be hoped for in the 48-qubit quantum computer proposed by this paper.
  46. How best to prepare for this 48-qubit future? Focus on simulation of 24 to 40-qubit quantum algorithms. And scalable algorithms.
  47. Ramp up efforts for more powerful simulators. Really should aim for 48 qubits. As well as better performance to handle deeper circuit depth and greater circuit size.
  48. A clear path to avert a potential Quantum Winter. If the proposed quantum computer is available, people should see some interesting and even impressive results.
  49. This proposal doesn’t rely on the distant fantasy of full quantum error correction. Near-perfect qubits will generally be good enough. And maybe some modest manual error mitigation.
  50. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity. Well short of the 48-qubit goal, but could be made available much sooner. Achieve 1,000 X minimal quantum advantage with a 10-qubit quantum Fourier transform.
  51. Maybe even a 36-qubit stepping stone. Not all that the 48-qubit goal would offer, but maybe available much sooner. Support 15 or 16-qubit quantum Fourier transform to achieve 32K to 64K X quantum advantage.
  52. Will this even be feasible? Maybe not, but I do think it is feasible based on a fair extrapolation from the current technology and recent trends, even if I can’t actually prove it at this time.
  53. When? Two to three years, or so, seems like a solid goal.
  54. Will this be enough? For who? Maybe. It will vary.
  55. Will this be enough for quantum computational chemistry with quantum phase estimation? Maybe. It will vary.
  56. What about the impact on other quantum application categories? Impact will vary.
  57. What’s next after this proposal has been fully implemented? Unclear. Uncharted territory. Maybe a few incremental improvements are likely.
  58. No clear roadmap to outline the path for quantum algorithms that can effectively exploit more than about 48 qubits for real quantum computers. Not even in theory, particularly given the inherent limitation on fine granularity of phase and probability amplitude.

Quantum computing desperately needs to show some realistic results addressing more than toy-like problems within the next two to three years if a Quantum Winter is to be averted

Tremendous progress has been made with quantum computers in the past six years, but we’re still only working with toy-like solutions to toy-like problems. We need to advance to something resembling realistic solutions to realistic real-world problems, otherwise we run the risk of spiraling down into a Quantum Winter of unfulfilled expectations.

This paper proposes a hardware configuration for a quantum computer which may be the best we can hope for within the next two to three years and has some hope of enabling reasonably impressive realistic results for realistic real-world problems. Certainly nothing even remotely close to fulfilling the sky-high unrealistic promises which have been made for quantum computing, but at least a reasonably credible stepping stone of progress.

What’s wrong with current quantum computers?

Overall, the problem is that current quantum algorithms and applications are:

  • Unable to use enough qubits to perform quantum computations of sufficient complexity.

Back in 2016, five qubits was a big deal and people were anxious to get more qubits, ASAP. Now, we have quantum computers with 27, 40, 53, 65, 80, and 127 qubits and more to come.

But we don’t have much in the way of quantum algorithms that use even 20 qubits. I saw a couple of papers from Google using 21 and 23 qubits, but that’s about it.

We also have this powerful algorithmic building block, quantum Fourier transform (QFT), at least on paper, and we have enough qubits to use it, but beyond the raw qubit count, the available quantum hardware is insufficient to support this powerful algorithmic capability.

So, we need the hardware to catch up with its own qubit counts.

In short:

  • Qubits need to be usable. There are still too many impediments to effectively using the qubits we already have.

Specific insufficiencies which need to be addressed:

  1. Qubit fidelity.
  2. Measurement fidelity. Generally lumped in with qubit fidelity.
  3. Qubit connectivity.
  4. Fine granularity of phase for nontrivial quantum Fourier transform.
  5. Total circuit size and maximum circuit depth.
  6. Scalable algorithms.

Background

Too many people and quantum computer vendors are focusing on and obsessing over pursuing hundreds, thousands, millions, and even billions of qubits, but in my view that is a silly waste of effort and resources, and a distraction if qubit quality and connectivity are not sufficient to effectively utilize those qubits in practical quantum algorithms which address practical real-world problems.

If you accept the basic conjecture of this paper, 20 qubits is a reasonable expectation of what might be achievable over the next two to three years for a quantum Fourier transform. Two 20-qubit registers (one for input and one for output) come to 40 qubits; add a handful of ancillary qubits and you get to roughly 48 qubits.

Or maybe two 22-qubit registers could be achieved, boosting the quantum computational leverage (quantum advantage) by a factor of four, from 1,000,000 to 4,000,000. But that’s not a slam dunk.

So a 48-qubit quantum computer might indeed be a much more optimal goal to pursue over the next two to three years. Something that is achievable. Something that is practical. And something that offers some interesting level of computational advantage over a classical computer.

The downside is that this may be at or near the limit of what can be achieved with a quantum computer. Maybe a few more bits of granularity of phase can be achieved eventually, or maybe not. But at least it would be something that actually works and has some advantage and some benefit.

Essence of the proposal

Key points of the proposal:

  1. Priority is quality over quantity. 48 qubits should be enough, but they need to be high quality.
  2. Full connectivity. But that’s a big leap from where we are today.
  3. 3.25 to 4 nines of qubit fidelity. Near-perfect qubits.
  4. Likely will still require some degree of error mitigation. But not full, transparent, and automatic quantum error correction (QEC), which is still years in the future, over the horizon.
  5. 20 bits of granularity for phase. One million gradations. A 20-bit DAC.
  6. Support a 20-qubit quantum Fourier transform.
  7. Quantum phase estimation (QPE) will finally be enabled. A consequence of supporting quantum Fourier transform. Enables quantum computational chemistry for greater accuracy and performance. Also enables quantum amplitude estimation (QAE).
  8. Quantum amplitude estimation (QAE) will finally be enabled.
  9. Quantum computational leverage of one million to one — 2²⁰. One parallel quantum circuit execution can simultaneously evaluate as many alternative solutions as a million classical code executions. This is the quantum advantage.
  10. Quantum advantage in the range of 1,000 to 1,000,000 X. Assuming a 10 to 20-qubit quantum Fourier transform.
  11. Sorry, but dramatic quantum advantage will remain out of reach. Not one quadrillion X (2⁵⁰) or more advantage over best classical solutions.
  12. Greater total circuit size and maximum circuit depth. There is no firm specific goal here yet, but some significant improvement is clearly needed. For the sake of completeness here, we presume 2,000 gates may be executed.
  13. Scalable algorithms are needed. Even with better hardware, basic algorithm research is still needed.
  14. Of course we want to get way beyond this ASAP, but we’re not even close to this yet.
  15. A solid goal for two to three years.
  16. A credible and palpable goal to aim at.
  17. Doesn’t rely on the distant fantasy of full quantum error correction. Near-perfect qubits will generally be good enough. And maybe some modest manual error mitigation.
  18. Should be able to stretch classical simulation to 48 qubits as well, which will enable debugging of 48-qubit quantum algorithms.
  19. Maybe a few more bits of granularity could be achieved in subsequent years, or not. But focus on a goal that seems achievable and then build on it as opportunities arise in future years.
  20. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity. Well short of the 48-qubit goal, but could be made available much sooner. Achieve 1,000 X minimal quantum advantage with a 10-qubit quantum Fourier transform.
  21. Maybe even a 36-qubit stepping stone. Not all that the 48-qubit goal would offer, but maybe available much sooner. Support 15 or 16-qubit quantum Fourier transform to achieve 32K to 64K X quantum advantage.

Finally enable the use of quantum phase estimation to support quantum computational chemistry and finally enable at least a hint of actual quantum advantage

This configuration should be sufficient to finally enable the use of quantum phase estimation to support quantum computational chemistry, at least for relatively modest molecules, and finally enable at least a hint of actual quantum advantage.

The configuration is not focused exclusively on quantum computational chemistry, but that’s a good place to start since it is so computationally intensive.

And if this configuration works well for quantum computational chemistry, then it will likely work wonders for other application categories as well.

Quantum amplitude estimation (QAE) will also be enabled since it also relies on quantum Fourier transform.

Quantum Fourier transform (QFT) as the critical algorithmic building block

Quantum Fourier transform (QFT) is a quantum algorithm in its own right, but it’s really only an algorithmic building block which enables other algorithms, including quantum phase estimation and quantum algorithms for quantum computational chemistry, as well as for more sophisticated algorithms such as order finding and Shor’s factoring algorithm.

The point is that the 48-qubit quantum computer proposed by this paper is intended to be sufficient to handle 20-qubit quantum Fourier transforms which should enable applications such as quantum computational chemistry.
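
To make that concrete, here is a rough sketch of the textbook quantum Fourier transform construction, written in Python with Qiskit (my choice of framework for illustration, not a requirement of the proposal). Note how the controlled-phase rotation angles shrink by half at each step of the construction.

```python
# Minimal sketch (not anyone's production code): the textbook quantum Fourier
# transform built gate by gate with Qiskit, to show why phase granularity matters.
from math import pi
from qiskit import QuantumCircuit

def qft(n: int) -> QuantumCircuit:
    """Build an n-qubit QFT from Hadamards and controlled-phase rotations."""
    qc = QuantumCircuit(n)
    for target in range(n):
        qc.h(target)
        for control in range(target + 1, n):
            # Rotation angle halves with each step: pi/2, pi/4, ..., pi/2^(n-1)
            qc.cp(pi / 2 ** (control - target), control, target)
    # Reverse qubit order to match the standard QFT output convention
    for i in range(n // 2):
        qc.swap(i, n - 1 - i)
    return qc

n = 20
circuit = qft(n)
print(circuit.count_ops())                       # gate counts: h, cp, swap
print(f"smallest rotation angle: pi/2^{n - 1}")  # ~6e-6 radians for n = 20
```

The point of the sketch is just that a 20-qubit transform already demands rotations on the order of six millionths of a radian, which is why the rest of this paper keeps coming back to fine granularity of phase.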

Little data with a big solution space — the sweet spot for quantum computing

48 qubits may not seem like a lot or seem to support quantum applications of any significant sophistication, but the best way to look at quantum computing is what I call little data with a big solution space. The general idea of quantum parallelism is that k qubits of input define a solution space of 2^k quantum states. So 20 qubits may seem like a very small amount of data, but 20 qubits define a solution space of 2²⁰ or one million quantum states. This means that a million possible solutions can be evaluated in parallel, simultaneously.

20 qubits is roughly the limit for a quantum Fourier transform on this configuration since it requires an output register of the same size: two 20-qubit registers plus a few ancillary qubits fill the 48 qubits.

In theory, a quantum algorithm using 48 qubits could evaluate a solution space of 2⁴⁸ or over 250 trillion distinct quantum states.

In short, 48 qubits can actually support applications of significant sophistication.

This certainly doesn’t fulfill the wildest promises of quantum computing for dramatic quantum advantage (a quadrillion X or more over the best classical solution) but is still a decent showing for the relatively near term (two to three years.)

For more on little data with a big solution space, see my paper:

Near-term: The next two to three years

By near-term this paper is referring to the next two to three years. And this is referring to the goal or target, not what we actually have right now or in the very near future.

Quantum parallelism — simultaneously evaluate many alternative solutions in parallel

Quantum advantage basically comes down to the ability of a quantum computer to evaluate a large number of alternative solutions in parallel, which is known as quantum parallelism.

Quantum computational leverage — the degree of quantum parallelism — how many alternative solutions are simultaneously evaluated in parallel

The number of parallel evaluations performed using quantum parallelism is the quantum computational leverage of the quantum circuit or algorithm.

The miracle called a Hadamard transform enables n qubits to simultaneously represent 2^n discrete values. A quantum computation on those n qubits hence evaluates that computation on all 2^n discrete values, in parallel.

2^n is the quantum computational leverage of evaluating 2^n possible solutions in parallel using n qubits.

For example, with 20 qubits the quantum computational leverage would be one million to one — 2²⁰. One parallel quantum circuit execution can simultaneously evaluate as many alternative solutions as a million classical code executions.
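
Purely as an illustration (my own sketch, assuming Qiskit), here is what that leverage looks like at the circuit level: a single layer of Hadamard gates puts n qubits into an equal superposition over all 2^n basis states.

```python
# Minimal sketch: a Hadamard on each of n qubits creates an equal superposition
# over all 2^n basis states -- the source of the 2^n computational leverage.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

n = 4                      # small n for illustration; leverage scales as 2^n
qc = QuantumCircuit(n)
qc.h(range(n))             # Hadamard on every qubit

state = Statevector.from_instruction(qc)
nonzero = sum(1 for amp in state.data if abs(amp) > 1e-12)
print(nonzero)             # 16 = 2^4 basis states, each with amplitude 1/4
print(f"leverage at n = 20: {2 ** 20:,} parallel evaluations")
```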

Quantum advantage in the range of 1,000 to 1,000,000 X

The quantum computational leverage of a quantum algorithm is essentially its quantum advantage.

The expectation with the proposal of this paper is that minimum to substantial or significant quantum advantage (referred to as fractional quantum advantage) can be achieved with quantum computational leverage in the range of 1,000 to 1,000,000 X over the best solution on a classical computer, assuming a 10 to 20-qubit quantum Fourier transform.

But true dramatic quantum advantage, which would come from using 50 qubits to achieve quantum computational leverage of 2⁵⁰ or one quadrillion X, appears to be out of the question, both now and… forever. For more on this limitation, see the paper cited in the next section.

For more on fractional quantum advantage (minimum quantum advantage or substantial or significant quantum advantage), see my paper:

For more on dramatic quantum advantage, see my paper:

Sorry, but dramatic quantum advantage will remain out of reach

The reasons why dramatic quantum advantage — computational advantage of one quadrillion X and more over a classical computer — will remain out of reach will be discussed shortly, or can be found in my paper:

Quantum advantage depends on the algorithm

Technically, it is not the quantum computer itself which has a particular quantum advantage, but the quantum algorithm running on that quantum computer since it depends on how many qubits are used by the quantum algorithm and how they are used. This paper is presuming that a 20-qubit quantum Fourier transform is used, but that may not be the case for a particular quantum algorithm, which may use a larger or smaller quantum Fourier transform.

For more on this topic, see my paper:

Raw quantum computational leverage must be discounted by shot count to get net quantum advantage

Even if a quantum algorithm has a raw quantum computational leverage of 2^k by using a quantum Fourier transform on k qubits, that raw advantage needs to be discounted (divided) by the shot count (circuit repetitions) to get the net quantum advantage.

Circuit repetitions are used for two purposes:

  1. To compensate for hardware errors.
  2. To compensate for the inherent probabilistic nature of quantum computing. Even if the hardware was ideal and had no errors.

For example, for a quantum algorithm using a 20-qubit quantum Fourier transform:

  1. A shot count of 1,000 would yield a net quantum advantage of 1,000. A million divided by 1,000.
  2. A shot count of 100 would yield a net quantum advantage of 10,000. A million divided by 100.
  3. A shot count of 20,000 would yield a net quantum advantage of 50. A million divided by 20,000.
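
The arithmetic behind those examples is just a division; a trivial sketch, using the exact value of 2²⁰ rather than the rounded one million:

```python
# Trivial sketch: net quantum advantage = raw leverage (2^k) / shot count,
# for the 20-qubit quantum Fourier transform example.
def net_quantum_advantage(qft_qubits: int, shots: int) -> float:
    return 2 ** qft_qubits / shots

for shots in (100, 1_000, 20_000):
    print(f"{shots:>6} shots -> net advantage ~{net_quantum_advantage(20, shots):,.0f}")
# 100 -> ~10,486, 1,000 -> ~1,049, 20,000 -> ~52 (the text rounds 2^20 to one million)
```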

The two purposes are very distinct, but get lumped in together (simple addition) for total shot count. Two considerations to keep in mind about these two purposes as we transition from current quantum computers to the proposed 48-qubit quantum computer:

  1. Hardware errors will decline dramatically. Fewer circuit repetitions will be required to compensate for hardware errors. The exact change is unknown at this time.
  2. The probabilistic nature of quantum computing will be unchanged. The exact same number of circuit repetitions will be required for the purpose of accounting for the probabilistic nature of quantum computing. The total shot count will decline, but the probabilistic contribution will not change.

So, the total circuit repetitions will decline, but not to zero or even close to zero.

For more on shot counts (circuit repetitions), see my paper:

Start with an upgraded 27-qubit quantum computer

A quantum computer with 48 fully-connected near-perfect qubits is the clear goal to have, but even it may be a bit too ambitious. It may be better to start with an upgrade to current 27-qubit designs (e.g., IBM).

Get the qubit fidelity and connectivity up to the level where fine granularity of phase is sufficient for a ten to twelve-qubit quantum Fourier transform.

This would provide only a 1K X to 4K X quantum advantage, but would prove the concepts and be a decent stepping stone towards the ultimate 48-qubit goal.

Maybe even a 36-qubit stepping stone

Even once a 27-qubit design has been upgraded and proved to support a ten to twelve-qubit quantum Fourier transform, 48 qubits may still seem too ambitious, and the underlying technology just not quite ready. Maybe 36 fully-connected near-perfect qubits would be a reasonable additional stepping stone on the way to 48 qubits.

36 qubits would enable fifteen or sixteen-qubit quantum Fourier transform, upping the quantum advantage to 32K X or 64K X, which is fairly respectable relative to current toy-like algorithms or even the 1K X to 4K X for an upgraded 27-qubit quantum computer.
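
The sizing rule behind the 27, 36, and 48-qubit milestones can be written down in a few lines. This is my own rough sketch, assuming two equal-size quantum Fourier transform registers plus roughly four ancillary qubits, and assuming phase granularity caps the transform at 20 qubits:

```python
# Rough sketch of the sizing rule used in this paper: a k-qubit QFT needs two
# k-qubit registers plus a few ancillary qubits, and phase granularity (assumed
# to be 20 bits) caps k regardless of how many qubits are available.
def max_qft_width(total_qubits: int, ancillas: int = 4, phase_bits: int = 20) -> int:
    by_qubit_count = (total_qubits - ancillas) // 2
    return min(by_qubit_count, phase_bits)

for total in (27, 36, 48):
    k = max_qft_width(total)
    print(f"{total} qubits -> up to a {k}-qubit QFT -> leverage {2 ** k:,} X")
# 27 -> 11-qubit QFT (2,048 X), 36 -> 16-qubit QFT (65,536 X), 48 -> 20-qubit QFT (1,048,576 X)
```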

Need near-perfect qubits with 3.25 to 4 nines of qubit fidelity

The starting point for the ideal goal for two to three years is the need for near-perfect qubits with 3.25 to 4 nines of qubit fidelity. Without qubit fidelity, you have nothing.
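
As a back-of-envelope sanity check (my own rough model, not a claim about any particular hardware): treat a qubit fidelity of n nines as a per-gate error rate of 10^-n and ask how often a 2,000-gate circuit, the working circuit size used later in this paper, would run with no gate errors at all.

```python
# Back-of-envelope sketch: nines of fidelity -> per-gate error rate -> rough
# probability that a 2,000-gate circuit runs with no gate errors at all.
gates = 2_000
for nines in (2.0, 3.0, 3.25, 3.5, 4.0):
    error_rate = 10 ** (-nines)                 # e.g. 3 nines -> 99.9% -> 1e-3
    survival = (1 - error_rate) ** gates        # crude independent-error model
    print(f"{nines} nines: p = {error_rate:.2e}, circuit survival ~ {survival:.1%}")
# 2 nines ~ 0.0%, 3 nines ~ 13.5%, 3.5 nines ~ 53%, 4 nines ~ 82%
```

The point is simply that two nines is hopeless at this circuit size, while 3.25 to 4 nines gives shot repetition and modest error mitigation something to work with.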

For more on the concept of near-perfect qubits, see my paper:

Lack of fine granularity of phase and probability amplitude limits quantum Fourier transform to 20 to 32 qubits

The precision of a quantum Fourier transform may be the primary limiting factor which limits the number of qubits which can effectively be used. A resolution of 24 bits may be at or beyond the practical limit for a digital to analog converter (DAC) and associated analog circuitry in a quantum computer. Maybe 28, 30, or even 32 bits can eventually be reached, but for now 20 or maybe 22 bits may be the practical limit for the digital and analog circuitry of a quantum computer.

And a quantum Fourier transform requires two registers (one for input and one for output), so 20 or 22 becomes 40 or 44 qubits, plus a few ancillary qubits.

Although a DAC could operate on 32 bits, it seems unlikely that the rest of the hardware would support that. And beyond 32 is out of the question, although maybe a few more bits of resolution might be theoretically possible, eventually, somehow, maybe.

But for now, 24 bits seems to be the practical limit and 20 may be the maximum that we can rely upon with some degree of certainty.
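
The connection between bits of granularity and quantum Fourier transform width is direct; here is the arithmetic as a tiny sketch (my own framing of the argument):

```python
# Minimal sketch: the smallest controlled-phase rotation in an n-qubit QFT is
# 2*pi / 2^n, so roughly n bits of phase granularity (a step size of 2*pi / 2^n)
# is the minimum needed -- matching the pairing of 20 bits with a 20-qubit QFT.
from math import pi

def smallest_qft_angle(n: int) -> float:
    return 2 * pi / 2 ** n

for n in (10, 20, 24, 32):
    angle = smallest_qft_angle(n)
    print(f"{n}-qubit QFT: smallest angle ~{angle:.1e} rad, "
          f"needs ~{n} bits of phase granularity ({2 ** n:,} gradations)")
```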

For more detail on this issue of lack of fine granularity of phase and probability amplitude, see my paper:

In fact, this paper is an expansion of a section in that paper:

  • 48 fully-connected near-perfect qubits may be the sweet spot goal for near-term quantum computing

20 bits of granularity for phase will support a 20-qubit quantum Fourier transform for quantum computational leverage of one million to one

Achieving 20 bits of granularity for phase will enable support for 20-qubit quantum Fourier transform which will provide a quantum computational leverage of 2²⁰ or one million to one. A quantum algorithm could evaluate a million possible solutions simultaneously.

24 or 32 bits of granularity would be much more desirable, but are unlikely to be within reach.

Maybe 36 or 40 bits of granularity could theoretically be achieved, but certainly not in the next few years.

And 44 and more bits of granularity for phase are flat out not achievable, not now and not ever.

Quantum phase estimation (QPE) will finally be enabled

A consequence of supporting quantum Fourier transform is that it enables quantum phase estimation (QPE).

And quantum phase estimation (QPE) in turn enables quantum computational chemistry for greater accuracy and performance.

Quantum amplitude estimation (QAE) will finally be enabled

Another consequence of supporting quantum Fourier transform is that it enables quantum amplitude estimation (QAE).

Why 48 qubits and not 56, 64, 72, 80, 96, 128, 160, or 256?

Quantum Fourier transform (QFT) is the key limiting factor here. It’s the most powerful algorithmic building block we have, but appears to be limited by lack of fine granularity of phase. 20 to 24 qubits may be the limit of what can be processed using a quantum Fourier transform. Or more specifically, 24 may be at or beyond the practical limit for bits which can be processed by a digital to analog converter (DAC). 24 qubits is too much of a stretch and not so likely, but 20 to 22 qubits may be within reach.

A quantum Fourier transform has two registers, an input register and an output register, each with k qubits, so k = 20 requires 40 qubits, k = 21 requires 42 qubits, and k = 22 requires 44 qubits.

Generally, a few ancillary qubits are also needed. Figure at least three or four ancillary qubits. 40 plus 4 = 44, and 44 plus 4 = 48.

So 44 or 48 qubits would seem to be the limit for a quantum Fourier transform.
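
The qubit budget arithmetic, as a quick sketch (assuming four ancillary qubits, per the figures above):

```python
# Quick sketch of the qubit budget: a k-qubit QFT needs an input register, an
# output register, and roughly four ancillary qubits, so total = 2 * k + 4.
def qubits_needed(qft_width: int, ancillas: int = 4) -> int:
    return 2 * qft_width + ancillas

for k in (20, 22, 24, 26):
    print(f"{k}-qubit QFT -> {qubits_needed(k)} total qubits")
# 20 -> 44, 22 -> 48, 24 -> 52, 26 -> 56; beyond roughly 24 bits the DAC, not the
# qubit count, is the blocker
```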

56 qubits or more? As I indicated, 24 qubits may be too much of a stretch for a DAC and quantum Fourier transform. 56 qubits would have the raw capacity for a 26-qubit quantum Fourier transform. 2 times 26 = 52 plus 4 ancillary qubits = 56, but a 26-bit DAC and a 26-qubit quantum Fourier transform are unlikely to be practical.

And beyond 56 qubits, the odds of success would be even further diminished.

So, sure, it would be nice, great, and very advantageous to leap to 64, 80, 96 or more qubits, but the limitations of the DAC on quantum Fourier transform would seem to suggest that 48 qubits is about as far as we can go. Or maybe just to 52 qubits or so (2 times 24 = 48, plus 4 = 52.)

Sure, it’s vaguely possible that we can go a bit further, but I wouldn’t hold it out as likely or a reasonable target. Once we succeed at getting to 48 qubits, some enterprising researchers and engineers can attempt to push further, but as a follow-on product rather than as a delay to the proposed 48-qubit quantum computer.

Another reason for stopping at 48 qubits is that it is likely about as high as we can go with full state vector simulation.

More than 48 qubits doesn’t help if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits

There is no advantage to having more than 48 qubits if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits.

There may be some specialized algorithms which can utilize more qubits, but as a general proposition, 48 qubits will be good enough.

Lack of fine granularity of phase may be the ultimate limit which makes 48 qubits the largest configuration which can effectively use a quantum Fourier transform

Qubit fidelity can continue to improve and eventually we are likely to achieve full connectivity, leaving lack of fine granularity of phase as possibly the ultimate limit which makes 48 qubits the largest configuration which can effectively support a nontrivial quantum Fourier transform.

For more detail on this issue of lack of fine granularity of phase and probability amplitude, see my paper:

In fact, this paper is an expansion of a section in that paper:

  • 48 fully-connected near-perfect qubits may be the sweet spot goal for near-term quantum computing

48 qubits is likely about as high as we can go with full state vector simulation

Another reason for stopping at 48 qubits is that it is likely about as high as we can go with full state vector simulation.

Sure, it’s still possible that we could squeeze out a few more qubits, to 50 or maybe even 52 qubits for full state vector simulation, but I would not hold out those as practical goals for the next two to three years. Let’s focus on hitting 48 qubits as a goal and then future work can have its own goals depending on how things do or don’t work out over the next two to three years.

There are specialized simulators which simulate only a subset of the full capabilities of a quantum computer, but full state vector simulation is generally what’s needed and desired.

Classical quantum simulation is a powerful debugging technique which we really need at this stage of quantum computing with algorithm design being in such a state of flux and rapid evolution.

Until we get a lot more experience and comfort with quantum algorithms, the support of full state vector simulation will be a very powerful aid and confidence builder.

56 or 64 or 72 or 80 qubits would be more powerful, but would all be well beyond the limits of classical quantum simulation.

Fantasizing about a 72-qubit quantum computer

Yes, 48 qubits is probably the limit of what we can expect to be practical, but at least it’s worth contemplating the fantasy theoretical possibility of a 72-qubit configuration.

72 qubits would allow for a 32-bit DAC supporting a 32-qubit quantum Fourier transform which in turn would support 2³² quantum states for a computational leverage of four billion X over a classical system. 32 times two is 64, leaving 8 qubits as ancillary qubits.

Maybe another bit or two could be squeezed out of the DAC and quantum Fourier transform. 33 times two is 66 leaving six ancillary qubits, for a computational leverage of 2³³ or eight billion X. 34 times two is 68 leaving four ancillary qubits, for a computational leverage of sixteen billion X. That’s as far as you could go with 72 qubits. And expecting a DAC beyond 34 bits is an absolute fantasy.

Again, such a configuration is virtually out of the question in terms of practicality, but this is what it would look like.

Need for a quantum state bus or dynamically-routable resonators for enhanced connectivity for transmon qubits

Trapped-ion and neutral-atom qubits implicitly support full qubit connectivity — any qubit directly to any other qubit, but superconducting transmon qubits are currently architected for no better than nearest-neighbor connectivity, forcing a reliance on inefficient and unreliable SWAP networks to achieve full connectivity.

SWAP networks can be avoided if full any to any connectivity is achieved using some sort of quantum state bus, dynamically-routable resonators, or one or more shared buses between qubits.
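
To put a rough number on the cost of SWAP networks (my own estimate, assuming a simple nearest-neighbor line of qubits and the standard decomposition of a SWAP into three CNOT gates):

```python
# Rough sketch: cost of one two-qubit gate between qubits d positions apart on a
# nearest-neighbor line, using SWAPs (3 CNOTs each) to bring them adjacent and
# then move them back. With full any-to-any connectivity this overhead is zero.
def swap_overhead_cnots(distance: int, restore: bool = True) -> int:
    swaps = (distance - 1) * (2 if restore else 1)
    return 3 * swaps

for d in (1, 5, 10, 47):          # 47 = worst case on a 48-qubit line
    print(f"distance {d:>2}: ~{swap_overhead_cnots(d)} extra CNOTs")
# distance 47 -> ~276 extra CNOTs for a single logical two-qubit gate
```

That overhead is pure waste, gates spent moving quantum state around rather than computing, and it is exactly what full any to any connectivity eliminates.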

The ability to shuttle quantum state without shuttling the entire qubit would be beneficial.

Various topologies for the qubits should be investigated — square, rectangular, circular, elliptical, star, staggered circular, straight line, parallel straight lines, lattices (grids), three dimensional, etc.

Are there other methods for carrying or transporting quantum state (quantum state carriers) relatively short distances (within the chip as opposed to between chips)?

Also for transporting quantum state between modules or independent processors for multi-processor systems.

Fiber optic transport for longer distances — within a rack, local area network, or even wide area network.

But for the purposes of 48-qubit quantum computers as proposed by this paper, full connectivity within a single 48-qubit quantum processor is the only real priority, for now.

Quantum state bus or dynamically-routable resonators

In my own thinking I refer to two terms:

  1. Quantum state bus.
  2. Dynamically-routable resonator.

In a traditional superconducting transmon qubit quantum computer there is a resonator connecting each pair of qubits which can be operated on by a two-qubit quantum logic gate.

It would be impractical to provide such a resonator for every pair of qubits when the number of qubits is large:

  1. 27 qubits would require 27 * 26 / 2 = 351 resonators.
  2. 36 qubits would require 36 * 35 / 2 = 630 resonators.
  3. 48 qubits would require 48 * 47 / 2 = 1,128 resonators.
  4. 65 qubits would require 65 * 64 / 2 = 2,080 resonators.
  5. 127 qubits would require 127 * 126 / 2 = 8,001 resonators.
  6. 433 qubits would require 433 * 432 / 2 = 93,528 resonators.
  7. 1,121 qubits would require 1,121 * 1,120 / 2 = 627,760 resonators.
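
Those counts are just the number of qubit pairs, n times (n minus 1) divided by two; a one-line sketch reproduces them:

```python
# Minimal sketch: number of dedicated resonators needed for direct all-pairs
# connectivity among n qubits is the number of qubit pairs, n * (n - 1) / 2.
def resonators_for_full_connectivity(n: int) -> int:
    return n * (n - 1) // 2

for n in (27, 36, 48, 65, 127, 433, 1_121):
    print(f"{n:>5} qubits -> {resonators_for_full_connectivity(n):,} resonators")
# matches the list above: 351, 630, 1,128, 2,080, 8,001, 93,528, 627,760
```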

With a quantum state bus or dynamically-routable resonator each qubit would have an entry ramp and exit ramp to a single, shared resonator and a hardware device to dynamically enable the entry ramp and exit ramp for the pair of qubits to be used in a two-qubit quantum logic gate.

This hardware mechanism would provide full any-to-any qubit connectivity.

This is a purely-speculative hardware mechanism on my part. But it does seem plausible.

Is extensive classical IP a severe impediment to pursuing a quantum state bus or dynamically-routable resonators?

Buses are standard fare on classical computer systems. And there is plenty of proprietary intellectual property (IP) as well. In other words, patents. I just wonder if that classical IP is getting in the way of incorporating bus technology into quantum computers. Maybe. Or maybe not.

It’s worth noting that two of the major players in quantum computing are also major players in classical computing and no strangers to bus technology or protection of intellectual property — IBM and Intel.

It is worth noting that a lot of these big companies have extensive cross-licensing of patents as well, so protected IP is not an absolute obstacle per se.

So, it would seem that at least for these two vendors IP for bus technology should not be a problem. Unless some smaller startup firm or patent troll firm has obtained patent protection and is unwilling to license it as cheaply as IBM or Intel would prefer.

So, protected IP may or may not be an issue. It’s not an absolute slam dunk either way.

I really only have this section here because it is the most plausible explanation for why vendors are not pursuing quantum state buses and dynamically-routable resonators with more rigor. It’s an obvious path to pursue, but they’re not pursuing it, so there must be some good reason or rationale to explain their lack of pursuit of the obvious.

Greater total circuit size and maximum circuit depth

Quantum Fourier transform alone will require a significant improvement in total quantum circuit size and maximum quantum circuit depth. There is no firm specific goal here yet, but some significant improvement is clearly needed. For the sake of completeness here, we presume 2,000 gates may be executed. 5,000 gates would be a more desirable target, especially to enable achieving Quantum Volume (QV) of 2⁴⁸, but even 2,000 seems quite a stretch at this stage.

Improvement in total quantum circuit size and maximum quantum circuit depth comes from two distinct sources:

  1. Longer qubit coherence time. This enables more quantum logic gates to be executed.
  2. Shorter gate execution time. Shorter time to execute each quantum logic gate means that more gates can be executed in the available qubit coherence time.

A third factor is that support for full qubit connectivity means that the gate count doesn’t need to be wasted on SWAP networks needed to simulate full qubit connectivity.

We really do need to establish a goal or minimum requirement for total quantum circuit size and maximum quantum circuit depth, based nominally on the gates needed to perform a 20-qubit quantum Fourier transform, but right now that’s a more manageable engineering challenge compared to qubit fidelity and qubit connectivity.

For the sake of completeness here, we presume 2,000 gates may be executed. That could be accomplished in a coherence time of 250 microseconds with a gate execution time of 125 nanoseconds or eight gates per microsecond — 8 times 250 equals 2,000.
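
That back-of-envelope calculation, as a sketch (the 250 microseconds and 125 nanoseconds are the working assumptions from the paragraph above, not measured hardware numbers):

```python
# Sketch: maximum sequential gate count ~ coherence time / gate execution time.
def max_gates(coherence_us: float, gate_ns: float) -> int:
    return int(coherence_us * 1_000 / gate_ns)

print(max_gates(250, 125))   # 2,000 gates, the working assumption in this paper
print(max_gates(625, 125))   # ~5,000 gates would need ~625 microseconds at this gate speed
```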

Where are all of the 40-qubit algorithms?

It’s rare to find quantum algorithms using even 16 or 18 qubits, and very few using more than 20 qubits. Part of the problem is the limitations of current real quantum computers, but I would note that we have classical quantum simulators which can handle 32 qubits and even 40 qubits, so one would expect that we should see some reasonable body of 32 and 40-qubit algorithms, but… we don’t. So, where are all of the 32 or 40-qubit algorithms?

Someday in the not too distant future, maybe two or three years, we should have quantum computer hardware capable of reliably executing 32 and 40-qubit algorithms, so it is reasonable to expect that researchers should be designing and testing those algorithms on today’s simulators so that they can be fully prepared to actually run such algorithms on real quantum computers with that capacity as soon as they become available. But, nobody seems to be preparing for such a state of affairs.

Read more on my notion of 40-qubit algorithms in my informal paper:

Where are all of the scalable algorithms?

We wouldn’t have to ask where all of the 40-qubit algorithms are if most quantum algorithms were scalable in the first place. More specifically, dynamically scalable — the size of the algorithm is automatically adjusted based on the size of the input data and input parameters.

For more detail on scalable algorithms, see my paper:

Quantum Volume (QV) may be limited

Ideally, the Quantum Volume (QV) metric value for a 48-qubit quantum computer would be 2⁴⁸, but there are factors which may conspire to cause that ideal to fail to be achieved. It may be limited by maximum simulator capacity or maximum circuit size. And qubit fidelity could be a factor as well for deeper circuits. Qubit connectivity shouldn’t be an issue if full any to any qubit connectivity is supported, as required by the proposal of this paper.

Measurement of the Quantum Volume metric requires running a simulation of a complex quantum circuit to determine the proper, ideal results for the quantum circuit. It is the largest square circuit: k qubits with k layers. In the rough model used here, that works out to two times k gates on each qubit, a one-qubit gate and a two-qubit gate for each layer. Simulation of a quantum circuit is a resource intensive process which grows exponentially for each added qubit.

Classical quantum simulators are currently limited to 40 qubits, or 32 qubits in some cases. Each added qubit doubles the resources required for the simulation (exponential growth.) So 48 qubits is well beyond the capacity of current simulators. Still, at least theoretically, research and clever engineering could produce a simulator capable of simulating a 48-qubit quantum computer.
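
To put rough numbers on that exponential growth (my own estimate, assuming 16 bytes per complex amplitude, which is ordinary double precision):

```python
# Rough sketch: a full state vector of n qubits holds 2^n complex amplitudes,
# at an assumed 16 bytes each (double-precision real and imaginary parts).
def statevector_bytes(n_qubits: int) -> int:
    return 2 ** n_qubits * 16

for n in (32, 40, 44, 48):
    print(f"{n} qubits: ~{statevector_bytes(n) / 2 ** 40:,.2f} TiB")
# 32 -> 0.06 TiB (64 GiB), 40 -> 16 TiB, 44 -> 256 TiB, 48 -> 4,096 TiB (4 PiB)
```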

Achieving a Quantum Volume of 2⁴⁸ would require a quantum circuit with 48 layers, each layer operating on 48 qubits. Each layer has two gates per qubit. 2 times 48 times 48 is 4,608 gates, which is well above the target of supporting quantum circuits up to 2,000 gates. So just to accommodate all of those gates would require a larger total circuit size. We could up the goal to 5,000 gates, but even 2,000 gates was a significant stretch.

If we really are limited to 2,000 gates, that would imply a maximum Quantum Volume of 2³¹. Bump that up to 2,048 gates and we can get to QV 2³².
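
Using that rough model of two times k times k gates for a QV 2^k circuit, the gate budget directly caps the achievable Quantum Volume; a small sketch:

```python
# Small sketch of the rough model above: a Quantum Volume 2^k circuit uses about
# 2 * k * k gates (two gates per qubit per layer, k qubits, k layers), so the
# gate budget caps the largest k that can actually be run.
from math import isqrt

def max_qv_exponent(gate_budget: int, qubits: int = 48) -> int:
    return min(isqrt(gate_budget // 2), qubits)   # largest k with 2*k*k <= budget

for budget in (2_000, 2_048, 5_000):
    k = max_qv_exponent(budget)
    print(f"{budget:,}-gate budget -> up to QV 2^{k} ({2 * k * k} gates)")
# 2,000 -> QV 2^31, 2,048 -> QV 2^32, 5,000 -> QV 2^48 (4,608 gates)
```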

Or maybe these limiting factors can be resolved so that a full QV 2⁴⁸ can be achieved. It’s not out of the question to support execution of quantum circuits with 5,000 gates, and not out of the question to support simulation of quantum circuits using 48 qubits. But it’s not a slam dunk either.

But QV 2⁴⁰, QV 2³⁶ or even QV 2³² may be the best that can be hoped for in the 48-qubit quantum computer proposed by this paper.

Other technical risks

This paper has endeavored to cover all known technical risks. There may be some more, but at present no additional technical risks are known to the author.

Still, we should plan for the unexpected: unforeseen contingencies may require our attention even after we have taken care of all of the known and expected technical factors.

But for now, there’s nothing else known or looming over the horizon.

Will this even be feasible?

That’s a great and fair question. This paper presumes that the technical advances needed to achieve the proposed 48-qubit quantum computer are indeed feasible, but it is very possible that they are not. I do believe that they are feasible, but that’s more of a belief, extrapolation, and speculation on my part than demonstrable fact or provable knowledge.

In short, I do sincerely believe that the proposal of this paper is a fair extrapolation from the current state of the technology and recent trends.

When? Two to three years, or so, seems like a solid goal

There are so many factors which have to come together or could conspire to subvert the goal of this paper, but for now, two to three years seems like a credible goal.

Sure, it could take a bit longer.

And maybe it ends up happening a little sooner.

Short of the full capabilities of the proposed 48-qubit quantum computer, there are two additional possibilities:

  1. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity. Well short of the 48-qubit goal, but could be made available much sooner.
  2. Maybe even a 36-qubit stepping stone. Not all that the 48-qubit goal would offer, but maybe available much sooner.

Earliest availability?

Even though I would be loath to set expectations for availability any sooner than two to three years, there are many factors which if they occurred together or in rapid succession could result in earlier availability — or delayed availability if any of the factors fall short of requirements and expectations. Some factors:

  1. Roadmap of qubit fidelity. Each vendor will have their own roadmap towards greater qubit fidelity, each proceeding at their own pace, hopefully not too leisurely.
  2. Technical risk for connectivity. Improving connectivity is a huge question mark for most vendors (other than trapped-ion and neutral-atom vendors). The notion of a quantum state bus or dynamically-routable resonators may seem obvious to some of us, but devilishly difficult for vendors themselves who have to grapple with science, engineering, software, and business challenges.
  3. Trapped-ion and neutral-atom have an advantage for connectivity. At least at present, vendors of trapped-ion and neutral-atom qubit technologies have an implicit advantage since they have full any to any connectivity by the nature of their design.
  4. Lack of fine granularity of phase and probability amplitude. Too little transparency to get any reliable sense of where we are. No sense of what progress might occur and when.

And as mentioned in the preceding section, earlier goals can include:

  1. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity.
  2. Maybe even a 36-qubit stepping stone.

Will this be enough? For who? Maybe… it will vary

The 48-qubit quantum computer proposed by this paper would be a major advance over current offerings, but will it be enough? It will depend and vary. It may be enough for some but not enough for others. It’s hard to say.

All we can do right now is put this proposal out there and let quantum algorithm designers and quantum application developers compare it with their own particular requirements.

Will this be enough for quantum computational chemistry with quantum phase estimation? Maybe… it will vary

Of particular interest is whether this proposal will be sufficient to support quantum phase estimation (QPE) to enable more sophisticated quantum computational chemistry than can be supported with variational methods as is done at present. The answer is unknown. A 20-qubit quantum Fourier transform may be enough for some quantum computational chemistry applications but not for others.

All we can do right now is put this proposal out there and let the designers and developers of quantum algorithms and applications for quantum computational chemistry compare it with their own particular requirements.

I suspect that 20 bits of precision will be enough for some simpler molecules even if not for larger molecules.

I do recall somebody suggesting that they needed 110 and 125 qubits for some interesting quantum computational chemistry problems, but I don’t recall whether that was total qubits or the precision of the quantum phase estimation (quantum Fourier transform). Either way, that would be beyond what appears to be the limit of granularity of phase and probability amplitude, which would mean that such quantum computations would not be practical — either on the proposed 48-qubit quantum computer or on any quantum computer, ever.

What about the impact on other quantum application categories? Impact will vary

I explicitly call out the quantum application category of quantum computational chemistry because it is computationally intensive and close enough to physics to be amenable to a quantum computing solution.

If we can make progress on quantum computational chemistry, the other application categories should be more of a slam dunk, although the impact and requirements will vary. Some applications will be easy and some will be hard.

Some applications will achieve decent computational advantage and some will not achieve as much.

For more on quantum application categories, see my paper:

How best to prepare for this 48-qubit future? Focus on simulation of 24 to 40-qubit quantum algorithms, and scalable algorithms

Since the 48-qubit quantum computer hardware proposed by this paper is unlikely to be available for two years or so, the question is what to do before then to prepare for that eventuality. The answer is simulation — using classical quantum simulators to simulate quantum algorithms using 24 to 40 qubits and configured to emulate the proposal of this paper as closely as possible in terms of limits and error rates.
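
As one hedged example of what "configured to emulate the proposal" could look like in practice, here is a sketch using Qiskit Aer (my choice of simulator for illustration; the 3e-4 depolarizing error rate is a stand-in for roughly 3.5 nines of qubit fidelity, not a vendor number):

```python
# Hedged sketch: emulate a near-perfect-qubit machine on a classical simulator
# (Qiskit Aer) by attaching a simple depolarizing noise model at roughly
# "3.5 nines" of gate fidelity (~3e-4 error per gate). Illustrative only.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator
from qiskit_aer.noise import NoiseModel, depolarizing_error

noise = NoiseModel()
noise.add_all_qubit_quantum_error(depolarizing_error(3e-4, 1), ["h", "rz", "sx", "x"])
noise.add_all_qubit_quantum_error(depolarizing_error(3e-4, 2), ["cx"])
backend = AerSimulator(noise_model=noise)

# Toy 16-qubit circuit standing in for a real algorithm under test; scale the
# qubit count up toward 24 to 40 as simulator capacity allows.
qc = QuantumCircuit(16)
qc.h(range(16))
for i in range(15):
    qc.cx(i, i + 1)
qc.measure_all()

counts = backend.run(transpile(qc, backend), shots=1_000).result().get_counts()
print(len(counts))   # number of distinct measured bitstrings across 1,000 shots
```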

And focus on scalable algorithms as well. Algorithms which work well for 20 to 32 qubits and can reliably be scaled up to work for 36, 40, 44, and 48 qubits and even beyond 48 qubits, should such hardware become available.

24 to 28 qubits may be the practical limit for algorithms of any significant sophistication.

32 qubits should be in reach as well.

40 qubits is nominally the limit for what can practically be simulated today, but there are questions of how sophisticated the algorithms can be in terms of circuit depth and total circuit size.

We can hope that simulators for up to 48 qubits will be developed, but that’s not a slam dunk.

Ramp up efforts for more powerful simulators — shoot for 48 qubits

Theoretically, an enterprising engineering team should be able to put together a simulator for 42 to 44 qubits, but of relatively shallow circuit depth.

Getting simulators up to 45 to 48 qubits would be a significant challenge, but is still a theoretical possibility.

The good news is that pushing simulator capacity and performance up beyond 40 qubits and even to 48 qubits does not depend on advances in quantum physics or even theoretical computer science. A talented engineering team could handle the task, given sufficient resources.

Some research may indeed be required as well. Sometimes the engineers can handle that, but sometimes computer scientists and mathematicians might be required as well.

And all of this work on simulators doesn’t depend on any of the work on the quantum hardware. Both can be done in parallel. The only thing that the simulator team will require is a full specification of the capabilities and limitations for the quantum computer to be simulated.

Of course we want to get way beyond this ASAP, but we’re not even close to this yet

The proposal of this paper is not intended as the ultimate ideal for quantum computing, but simply where we need to be in two to three years or so to have a sense that we really are on track to progress towards the larger promise of quantum computing.

A credible and palpable goal to aim at

Ultimately, all that is really necessary here is to set a credible goal that gives people the feeling that we really are making decent progress.

Without that palpable sense of progress in two to three years, we really do run the risk of spiraling down into a Quantum Winter.

A clear path to avert a potential Quantum Winter

Reiterating what was mentioned in the preceding section, following the proposal of this paper may well be the best we can do to establish and follow a clear path which averts the very real prospect of a Quantum Winter which could easily occur if we don’t have capable hardware such as proposed by this paper.

For more on the concept of a Quantum Winter, see my paper:

What’s next after this proposal has been fully implemented? Unclear and uncharted territory

A large part of the motivation for this paper was the fact that 20 to 24 qubits may be the practical limit for quantum Fourier transform, so this proposal is based on support for a 20-qubit quantum Fourier transform. Beyond that, maybe a few incremental improvements are likely.

Maybe eventually the 20-qubit quantum Fourier transform can be expanded to a full 24 qubits, and maybe ultimately even to 32 qubits, but that's just speculation at this stage.

And as I wrote in another recent paper, we simply may not be able to handle quantum Fourier transform beyond 40 qubits at all, ever. More research is needed to gain more insight into what the ultimate limiting factors really are.
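
To see why phase granularity caps the usable width of a quantum Fourier transform, here is a small sketch, my own illustration rather than anything from the paper. It relies on the fact that an n-qubit QFT requires controlled-phase rotations as fine as 2π/2^n, and on the assumption that a control system with k bits of phase granularity can only realize phase steps of about 2π/2^k.

```python
import math

def smallest_qft_rotation(num_qubits: int) -> float:
    """Finest controlled-phase angle appearing in an n-qubit QFT: 2*pi / 2^n radians."""
    return 2.0 * math.pi / (2 ** num_qubits)

def phase_step(granularity_bits: int) -> float:
    """Smallest phase step a control system with k bits of granularity can realize."""
    return 2.0 * math.pi / (2 ** granularity_bits)

granularity_bits = 20  # the 20-bit (one million gradation) assumption of this paper

for n in (20, 24, 32, 40):
    needed = smallest_qft_rotation(n)
    available = phase_step(granularity_bits)
    ok = "resolvable" if needed >= available else "below the hardware's resolution"
    print(f"{n}-qubit QFT: finest rotation 2*pi/2^{n} = {needed:.2e} rad -> {ok}")
```

With 20 bits of granularity, only the 20-qubit QFT in this list stays within the resolvable range, which is exactly why the proposal caps the QFT at 20 qubits.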

No clear roadmap to outline the path for quantum algorithms that can effectively exploit more than about 48 qubits for real quantum computers

There may be relatively clear roadmaps to get to hundreds, thousands, millions, and even billions of qubits, but the path to algorithms which can effectively exploit more than about 48 qubits (especially given the limitation on fine granularity of phase and probability amplitude) is certainly unknown at this time.

And more than simply unknown, there isn’t even a cognizable theoretical clue as to the potential existence of such a path.

Quantum error correction (QEC)? Not yet required

The proposal of this paper does not rely on the distant promise of full-blown automatic and transparent quantum error correction (QEC). Near-perfect qubits with 3.25 to 4 nines of qubit fidelity should be good enough for many or maybe even most quantum algorithms and applications. Some manual (or compiler-generated) error mitigation may be appropriate in some situations, but near-perfect qubits should do much or most if not all of the heavy lifting, at least for modest to moderate-sized quantum algorithms.

Larger and more sophisticated quantum algorithms will have to wait for future quantum computer technologies beyond the proposal of this paper — or exploit extensive tricks and clever approaches, as well as a heavy reliance on tedious manual error mitigation.
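
As a rough illustration of why near-perfect qubits matter at this scale, here is a sketch under my own simplifying assumptions (not the paper's model): that "n nines" of qubit fidelity means a per-gate error rate of 10^-n, and that gate errors compound independently and multiplicatively across a circuit approaching the 2,000-gate size discussed elsewhere in this paper.

```python
# Rough illustration of why near-perfect qubits matter for a ~2,000-gate circuit.
# Assumptions (mine, not the paper's): "n nines" of fidelity means a per-gate
# error rate of 10**-n, and gate errors compound independently and multiplicatively.

def circuit_success_probability(nines: float, gate_count: int) -> float:
    per_gate_fidelity = 1.0 - 10.0 ** (-nines)
    return per_gate_fidelity ** gate_count

for nines in (2.0, 3.0, 3.25, 4.0):
    p = circuit_success_probability(nines, gate_count=2000)
    print(f"{nines} nines per gate -> ~{p:.0%} chance a 2,000-gate circuit runs error-free")

# Expected output (approximate):
# 2.0 nines  -> ~0%   (today's typical two-qubit gates: hopeless at this depth)
# 3.0 nines  -> ~14%
# 3.25 nines -> ~32%
# 4.0 nines  -> ~82%  (workable with modest shot counts and some error mitigation)
```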

For more on quantum error correction, see my paper:

This proposal doesn’t rely on the distant fantasy of full quantum error correction

Just to be clear, the proposal of this paper does not rely on the distant promise of full-blown automatic and transparent quantum error correction (QEC). But it does rely on near-perfect qubits, with 3.25 to 4 nines of qubit fidelity.

Rigetti?

Generally I’m not calling out specific vendors, but there are a few points about Rigetti worth noting relative to this paper:

  1. 40 qubits vs. 80 qubits. They have a new modular architecture based on 40-qubit modules. 40 qubits leaves them short of the 48 qubits proposed by this paper, while 80 qubits is overkill, and may add overhead and reduce qubit fidelity and connectivity. It would be ideal if they redid their 40-qubit module as a 48-qubit module.
  2. Weak connectivity. All transmon qubit devices have this issue.
  3. Weak qubit fidelity. All vendors have this issue. They need to publish a roadmap for getting to near-perfect qubits.
  4. Granularity of phase. All transmon qubit devices have this issue. Transparency is needed. And a roadmap for getting to fine granularity.

Qubit counts for IBM

There is nothing special about my specific suggestions of 36 and 48 qubits. A few qubits more wouldn't impact the proposal of this paper. That said, I wouldn't revise the qubit counts of my proposal, except possibly to accommodate IBM, which has its own subtle method for deciding how many qubits appear in a given quantum computer. Some IBM-specific qubit count issues related to the proposal of this paper:

  1. 27 qubits. The existing IBM configuration. Simply with upgraded capabilities for qubit fidelity, qubit connectivity, and fine granularity of phase and probability amplitude.
  2. 36 qubits. Maybe that would be 38 under the IBM scheme — based on using a subset of the 65-qubit Hummingbird qubit topology.
  3. 48 qubits. Maybe this would be 49 under the IBM scheme, again based on the Hummingbird qubit topology.

What about Shor’s factoring algorithm?

Shor’s factoring algorithm has been hyped as the greatest capability of quantum computing and the greatest threat to encryption, but so far it hasn't been a factor at all. It's not clear when it might become a factor, or even whether it will ever be feasible for factoring very large semiprimes such as 2048 and 4096-bit public encryption keys. To date, the tiny number 21 is the largest semiprime known to have been factored on a real quantum computer using Shor’s algorithm.

But the question arises as to whether particular advances in quantum computing might result in advances for Shor’s factoring algorithm.

Shor’s factoring algorithm as originally designed requires four times as many qubits as there are bits in the input number to be factored, plus a few ancilla qubits. This is so that it can perform a quantum Fourier transform over a register large enough to hold the square of the input number. So if the input number has k bits, the input register requires 2k qubits and the output register requires roughly as many, for about 4k qubits in total.

So if the proposed 48-qubit quantum computer can handle a quantum Fourier transform for input of 20 qubits, that means that the input number for Shor’s algorithm can be only 10 bits, up to 1023.

Or if we can achieve a quantum Fourier transform on 22 qubits, that means that the input number for Shor’s algorithm can be only 11 bits, up to 2047.
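
Here is a minimal sketch of that register arithmetic, following the paper's assumption of roughly 4k qubits for a k-bit input with the QFT performed over a 2k-qubit register; the 3-qubit ancilla count is purely a placeholder of mine.

```python
# Register-size arithmetic for Shor's algorithm as described above.
# Assumption (following the paper): a k-bit input number needs roughly a
# 2k-qubit input register (where the QFT is performed), a comparable output
# register, plus a handful of ancilla qubits (3 used here as a placeholder).

def shor_requirements(input_bits: int, ancilla: int = 3) -> dict:
    qft_width = 2 * input_bits            # QFT over a register holding the square of the input
    total_qubits = 4 * input_bits + ancilla
    return {
        "largest_input": 2 ** input_bits - 1,
        "qft_width": qft_width,
        "total_qubits": total_qubits,
    }

# What fits on the proposed machine if the QFT tops out at 20 or 22 qubits?
for qft_limit in (20, 22):
    k = qft_limit // 2
    req = shor_requirements(k)
    print(f"QFT limit {qft_limit} qubits -> input up to {k} bits "
          f"(numbers up to {req['largest_input']}), ~{req['total_qubits']} qubits total")

# Expected output:
# QFT limit 20 qubits -> input up to 10 bits (numbers up to 1023), ~43 qubits total
# QFT limit 22 qubits -> input up to 11 bits (numbers up to 2047), ~47 qubits total
```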

Whether the factoring process can be completed within the 2,000-gate limit on maximum circuit size proposed by this paper is not clear, but it is possible.

In any case, a 10 or 11-bit number would be the largest semiprime that Shor’s algorithm could factor on the proposed 48-qubit quantum computer.

Whether it could actually work is unclear, but possible.

My original proposal for this topic

For reference, here is the original proposal I had for this topic. It may be of value to readers who want a more concise summary of this paper.

  • 48 fully-connected near-perfect qubits as the sweet spot goal for near-term quantum computing. Priority is quality over quantity. Full connectivity. Near-perfect qubits with 3.25 to 4 nines of qubit fidelity. 20 bits of granularity for phase. Support a 20-qubit quantum Fourier transform for computational leverage of one million to one. Of course we want to get way beyond this ASAP, but we’re not even close to this yet. A solid goal for two to three years. A credible and palpable goal to aim at. Doesn’t rely on the distant fantasy of full quantum error correction. Should be able to stretch classical simulation to 48 qubits as well, which will enable debugging of 48-qubit quantum algorithms. Start with an upgraded 27-qubit processor supporting 10 to 12-qubit quantum Fourier transform. Then a 36-qubit processor supporting 15 to 16-qubit quantum Fourier transform before moving up to the full 48-qubit processor.

Summary and conclusions

  1. Quantum computing desperately needs to show some realistic results addressing more than toy-like problems within the next two to three years if a Quantum Winter is to be averted.
  2. This configuration should be sufficient to finally enable the use of quantum phase estimation to support quantum computational chemistry, at least for relatively modest molecules, and finally enable at least a hint of actual quantum advantage.
  3. Quantum Fourier transform (QFT) as the critical algorithmic building block. It enables other algorithms, including quantum phase estimation and quantum computational chemistry.
  4. Little data with a big solution space — the sweet spot for quantum computing. k qubits of input define a solution space of 2^k quantum states. So 20 qubits may seem like a very small amount of data, but 20 qubits define a solution space of 2²⁰ or one million quantum states. This means that a million possible solutions can be evaluated in parallel, simultaneously.
  5. Near-term in this paper refers to the next two to three years.
  6. Unable to use enough qubits to perform quantum computations of sufficient complexity. Plenty of qubits, but unable to use them effectively. Unable to marshal them to their full capacity.
  7. Qubits need to be usable. There are still too many impediments to effectively using the qubits we already have.
  8. Specific insufficiencies…
  9. Insufficient qubit fidelity. Mostly insufficient gate fidelity.
  10. Insufficient measurement fidelity. Generally lumped in with qubit fidelity.
  11. Insufficient qubit connectivity. Need full any-to-any connectivity.
  12. Need fine granularity of phase for nontrivial quantum Fourier transform.
  13. Support larger total circuit size and maximum circuit depth. Coherence time comes into play here.
  14. Need scalable algorithms. Don’t blame the hardware alone.
  15. Key points…
  16. Priority is quality over quantity. 48 qubits should be enough, but they need to be high quality.
  17. Full connectivity. But that’s a big leap from where we are today.
  18. 3.25 to 4 nines of qubit fidelity. Near-perfect qubits.
  19. Likely will still require some degree of error mitigation.
  20. 20 bits of granularity for phase. One million gradations. A 20-bit DAC.
  21. Support a 20-qubit quantum Fourier transform.
  22. Quantum phase estimation (QPE) will finally be enabled. A consequence of supporting quantum Fourier transform. Enables quantum computational chemistry for greater accuracy and performance. Also enables quantum amplitude estimation (QAE).
  23. Quantum amplitude estimation (QAE) will finally be enabled.
  24. Quantum computational leverage of one million to one — 2²⁰. One parallel quantum circuit execution can simultaneously evaluate as many alternative solutions as a million classical code executions. This is the quantum advantage.
  25. Quantum advantage in the range of 1,000 to 1,000,000 X. Assuming a 10 to 20-qubit quantum Fourier transform.
  26. Sorry, but dramatic quantum advantage will remain out of reach. Not one quadrillion X (2⁵⁰) or more advantage over best classical solutions.
  27. Quantum advantage depends on the algorithm. Not the hardware itself, which simply enables the quantum algorithm.
  28. Raw quantum computational leverage must be discounted by shot count to get net quantum advantage. Even if the raw quantum advantage (quantum computational leverage) with a 20-qubit quantum Fourier transform were 1,000,000 X, a shot count of 500 would mean a net quantum advantage of only 2,000. See the worked example after this list.
  29. Greater total circuit size and maximum circuit depth. No specific goal here, but some significant improvement is needed. For the sake of completeness here, we presume 2,000 gates may be executed.
  30. Scalable algorithms are needed. Even with better hardware, basic algorithm research is still needed.
  31. Of course we want to get way beyond this ASAP. But we’re not even close to this yet.
  32. A solid goal for two to three years.
  33. A credible and palpable goal to aim at.
  34. Should be able to stretch classical simulation to 48 qubits as well, which will enable debugging of 48-qubit quantum algorithms.
  35. Maybe a few more bits of granularity could be achieved in subsequent years, or not. But focus on a goal that seems achievable and then build on it as opportunities arise in future years.
  36. Lack of fine granularity of phase and probability amplitude limit quantum Fourier transform to 20 to 32 qubits.
  37. Lack of fine granularity of phase may be the ultimate limit which makes 48 qubits the largest configuration which can effectively use a quantum Fourier transform. This paper is an expansion of a section from my preceding paper: “48 fully-connected near-perfect qubits may be the sweet spot goal for near-term quantum computing” in Is Lack of Fine Granularity of Phase and Probability Amplitude the Fatal Achilles Heel Which Dooms Quantum Computing to Severely Limited Utility?
  38. Why 48 qubits and not 56, 64, 72, 80, 96, 128, 160, or 256? More than 48 qubits doesn’t help if qubit fidelity and connectivity aren’t there and if quantum Fourier transform precision is less than 24 qubits.
  39. 48 qubits is likely about as high as we can go with full state vector simulation.
  40. Fantasizing about a 72-qubit quantum computer. Pure fantasy, but computational leverage of four, eight, or sixteen billion X is rather appealing.
  41. Need for a quantum state bus or dynamically-routable resonators for enhanced connectivity for transmon qubits.
  42. Is extensive classical IP a severe impediment to pursuing a quantum state bus or dynamically-routable resonators?
  43. Where are all of the 40-qubit algorithms? This proposal should enable them.
  44. Where are all of the scalable algorithms? Ditto. But basic algorithm research is also required.
  45. Quantum Volume (QV) may be limited, whether by maximum simulator capacity or by maximum circuit size. Or maybe these limiting factors can be resolved so that a full QV 2⁴⁸ can be achieved. But QV 2⁴⁰, QV 2³⁶ or even QV 2³² may be the best that can be hoped for in the 48-qubit quantum computer proposed by this paper.
  46. How best to prepare for this 48-qubit future? Focus on simulation of 24 to 40-qubit quantum algorithms. And scalable algorithms.
  47. Ramp up efforts for more powerful simulators. Really should aim for 48 qubits. As well as better performance to handle deeper circuit depth and greater circuit size.
  48. A clear path to avert a potential Quantum Winter. If the proposed quantum computer is available, people should see some interesting and even impressive results.
  49. This proposal doesn’t rely on the distant fantasy of full quantum error correction. Near-perfect qubits will generally be good enough. And maybe some modest manual error mitigation.
  50. Start with an upgraded 27-qubit quantum computer. Enhance qubit fidelity. Add full connectivity. Enhance phase granularity. Well short of the 48-qubit goal, but could be made available much sooner. Achieve 1,000 X minimal quantum advantage with a 10-qubit quantum Fourier transform.
  51. Maybe even a 36-qubit stepping stone. Not all that the 48-qubit goal would offer, but maybe available much sooner. Support 15 or 16-qubit quantum Fourier transform to achieve 32K to 64K X quantum advantage.
  52. Will this even be feasible? Maybe not, but I do think it is feasible based on a fair extrapolation from the current technology and recent trends, even if I can’t actually prove it at this time.
  53. When? Two to three years, or so, seems like a solid goal.
  54. Will this be enough? For whom? Maybe. It will vary.
  55. Will this be enough for quantum computational chemistry with quantum phase estimation? Maybe. It will vary.
  56. What about the impact on other quantum application categories? Impact will vary.
  57. What’s next after this proposal has been fully implemented? Unclear. Uncharted territory. Maybe a few incremental improvements are likely.
  58. No clear roadmap to outline the path for quantum algorithms that can effectively exploit more than about 48 qubits for real quantum computers. Not even in theory, particularly given the inherent limitation on fine granularity of phase and probability amplitude.
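
As a small worked example of items 24 and 28 above, here is the leverage and shot-count arithmetic, under the simple assumption that raw leverage is 2^n for an n-qubit quantum Fourier transform:

```python
# Worked example of quantum computational leverage and the shot-count discount
# (items 24 and 28 above). Assumes raw leverage is simply 2**n for an n-qubit QFT.

def raw_leverage(qft_qubits: int) -> int:
    return 2 ** qft_qubits

def net_quantum_advantage(qft_qubits: int, shots: int) -> float:
    return raw_leverage(qft_qubits) / shots

print(raw_leverage(20))                      # 1,048,576 -- about one million to one
print(net_quantum_advantage(20, shots=500))  # ~2,097 -- roughly the 2,000 X of item 28
```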
