Where Are All of the 40-qubit Quantum Algorithms?

  1. Motivation for this paper
  2. My goal: Encourage development of scalable 40-qubit quantum algorithms
  3. This is my challenge — I’m throwing down the gauntlet — I want to see a lot of 40-qubit quantum algorithms within two years
  4. Toy quantum algorithms
  5. Industrial-strength quantum algorithms
  6. Production-scale quantum algorithms
  7. Are any real quantum algorithms using more than 23 qubits?
  8. Late breaking news: Google has simulated up to 30 qubits for Quantum Machine Learning
  9. Google quantum supremacy experiment used 53 qubits, but not to solve a practical problem
  10. Why limit to 40 qubits?
  11. What’s the big benefit of 40 qubits?
  12. Scaling is essential — 40 qubits is merely a stepping stone
  13. What is dramatic quantum advantage?
  14. Don’t we already have real quantum computers with more than 40 qubits?
  15. Quantum Volume of at least one trillion is needed to support 40-qubit quantum algorithms
  16. NISQ quantum computers can’t handle 40-qubit quantum algorithms
  17. Technically, 43 qubits may be needed for 40-qubit quantum parallelism
  18. There just aren’t many 40-qubit quantum algorithms published
  19. It’s a chicken and egg problem
  20. My suggested solution: Simulate 40-qubit quantum algorithms now, be ready for real quantum computers when they become available
  21. Simulation is the best near-term path to supporting 40-qubit quantum algorithms
  22. Target qubit fidelity of 2–5 years from now
  23. So, why isn’t this happening? Where are all of the 40-qubit quantum algorithms?
  24. Why aren’t there any 40-qubit quantum algorithms?
  25. A sampler of excuses for why people aren’t focused on 40-qubit quantum algorithms
  26. Significant resources needed to simulate quantum circuits with a large number of product states
  27. Qubits, circuit depth, and product states
  28. Deep circuits and product states
  29. Is circuit depth a big issue for 40-qubit quantum algorithms?
  30. But in theory, simulating 40 qubits should be practical
  31. In short, there’s no technical excuse for lack of 40-qubit quantum algorithms which can run on simulators
  32. Mindset obstacles to 40-qubit quantum algorithms
  33. Is a focus on variational methods holding back larger algorithms?
  34. Need for fine granularity of phase and probability amplitude
  35. Shouldn’t the promise of support for quantum phase estimation (QPE) and quantum Fourier transform (QFT) be sufficient to draw a focus on 40-qubit quantum algorithms?
  36. Future demand for quantum phase estimation (QPE) and quantum Fourier transform (QFT) will eventually drive 40-qubit quantum algorithms
  37. No real need for 40-qubit quantum algorithms until quantum phase estimation (QPE) and quantum Fourier transform (QFT) create a need
  38. Why little publication of 40-qubit quantum algorithms even if simulation is limited?
  39. Any 40-qubit quantum algorithms in the Quantum Algorithm Zoo?
  40. My full solution: A model for scalable quantum algorithms
  41. Tackling practical, production-scale real-world problems
  42. Will 40-qubits get us to The ENIAC Moment for quantum computing?
  43. Will 40-qubits get us to The FORTRAN Moment for quantum computing?
  44. What would resource requirements be for various application categories?
  45. 40-qubit quantum algorithms which I may not be aware of
  46. Dramatic need for additional research
  47. Open source
  48. What’s a PhD candidate or thesis advisor to do?
  49. Timeframe
  50. Who’s on first?
  51. When will all quantum algorithms be 40-qubit?
  52. Take the 40-qubit challenge!
  53. Inspire the hardware vendors to support 40-qubit algorithms
  54. Summary and conclusions

Motivation for this paper

While contemplating issues with scaling quantum algorithms, I started by considering how far we can go using classical quantum simulators before simulators run out of steam (capacity). At present, the limit for simulation appears to be in the 38 to 42-qubit range — call it 40 qubits.

My goal: Encourage development of scalable 40-qubit quantum algorithms

To be clear, I am not focusing on 40 qubits as a specific target algorithm size, but on scalable quantum algorithms tested in stages from 4 to 40 qubits — or whatever the size limit is for the available classical quantum simulators. The expectation is that those algorithms will also easily scale to 50, 64, 72, 80, and even 96 or more qubits as real quantum computers become available with sufficient qubits, qubit fidelity, connectivity, and coherence.
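
To make "tested in stages" concrete, here is a minimal sketch in plain Python of a circuit generator parameterized by qubit count. The gate names (h, cx) and the GHZ-style structure are purely illustrative assumptions, not tied to any particular framework or to any specific 40-qubit algorithm:

```python
def ghz_circuit(n):
    """Return a gate list preparing an n-qubit GHZ state:
    one Hadamard on qubit 0, then a chain of n-1 CNOTs.
    The same function works unchanged for n = 4 or n = 40."""
    if n < 2:
        raise ValueError("need at least 2 qubits")
    gates = [("h", 0)]
    gates += [("cx", i, i + 1) for i in range(n - 1)]
    return gates

# Exercise the same generator in stages, from toy size up to the simulator limit.
for n in (4, 10, 20, 40):
    print(n, "qubits ->", len(ghz_circuit(n)), "gates")
```

The point is only that qubit count is an ordinary parameter: a genuinely scalable algorithm is written once and exercised at 4, 10, 20, and 40 qubits without structural changes.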

This is my challenge — I’m throwing down the gauntlet — I want to see a lot of 40-qubit quantum algorithms within two years

Seriously, I find it profoundly depressing that we don’t have a wealth of scalable algorithms capable of utilizing 40 qubits, even on a classical quantum simulator.

Toy quantum algorithms

The term toy quantum algorithm is not intended to be derogatory; it simply highlights the very limited size and capacity of an algorithm that is not designed to handle production-scale computations, and not designed to be readily scaled to handle production-scale problems once a real quantum computer with sufficient capacity and fidelity becomes available.

Industrial-strength quantum algorithms

An industrial-strength quantum algorithm has been designed for the capacity and performance required to address the needs of production-scale real-world practical problems, and to deliver dramatic quantum advantage. This is in contrast to a toy quantum algorithm with only very limited capacity, insufficient to handle any significant amount of data and very limited performance.

Production-scale quantum algorithms

The terms production-scale quantum algorithm and industrial-strength quantum algorithm are essentially synonyms.

Are any real quantum algorithms using more than 23 qubits?

I haven’t done an exhaustive literature search, but the largest quantum algorithm I could find was from Google AI using 23 qubits:

Late breaking news: Google has simulated up to 30 qubits for Quantum Machine Learning

Just as I was finalizing this paper, Google announced a paper on Quantum Machine Learning where they simulated quantum circuits up to 30 qubits with a depth of 40 gates:

  • TensorFlow Quantum: A Software Framework for Quantum Machine Learning
  • Michael Broughton, et al
  • August 26, 2021
  • https://arxiv.org/abs/2003.02989
  • In the numerical experiments conducted in [22], we consider the simulation of these quantum machine learning models up to 30 qubits. The large-scale simulation allows us to gauge the potential and limitations of different quantum machine learning models better. We utilize the qsim software package in TFQ to perform large scale quantum simulations using Google Cloud Platform. The simulation reaches a peak throughput of up to 1.1 quadrillion floating-point operations per second (petaflop/s). Trends of approximately 300 teraflop/s for quantum simulation and 800 teraflop/s for classical analysis were observed up to the maximum experiment size with the overall floating-point operations across all experiments totaling approximately two quintillions (exaflop).
  • From the caption for Figure 9: “Circuits were of depth 40.”

Google quantum supremacy experiment used 53 qubits, but not to solve a practical problem

Granted, the Google quantum supremacy experiment did use 53 qubits on their 53-qubit Sycamore processor back in 2019, but that was a contrived computer science experiment (XEB — cross-entropy benchmarking), not a solution to a practical, real-world problem.

Why limit to 40 qubits?

The only reason for the 40-qubit limit in this informal paper is that it’s roughly the current limit for classical quantum simulators. Maybe you could push 41 or 42 qubits on some simulators.

What’s the big benefit of 40 qubits?

Besides the fact that bigger is always better and 40 qubits happens to be the current limit for simulation, the value and benefit of 40-qubit quantum algorithms is the ability to perform a computation on 2⁴⁰ quantum states — roughly a trillion quantum states — which is on the threshold of dramatic quantum advantage.

Scaling is essential — 40 qubits is merely a stepping stone

As previously stated, the goal here is not 40 qubits per se, but that’s simply the approximate limit for current classical quantum simulators.

What is dramatic quantum advantage?

I personally define dramatic quantum advantage as a factor of one quadrillion (2⁵⁰) improvement over the performance of a classical computer.

Don’t we already have real quantum computers with more than 40 qubits?

Yes, we already do have real quantum computers with more than 40 qubits. Google and IBM both have 53-qubit quantum computers, and IBM has a 65-qubit quantum computer. But…

  1. Low qubit fidelity. Noise and error rates that discourage the design of complex algorithms.
  2. Low coherence time. Severely limits circuit depth.
  3. Limited connectivity. Only a few qubits to which any qubit can be directly connected. Forces a reliance on SWAP networks which increases the impact of #1 and #2 — more gates with lower fidelity.

Quantum Volume of at least one trillion is needed to support 40-qubit quantum algorithms

For 40-qubit quantum algorithms to be generally supported, a real quantum computer would need a Quantum Volume of at least 2⁴⁰ — roughly one trillion — while IBM has so far achieved a QV of only 128 and Honeywell a QV of 512. Google doesn’t report Quantum Volume.
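
The arithmetic behind that gap is simple. Quantum Volume is conventionally reported as 2^n, so expressing each vendor's figure as a power of two shows how far current machines are from the target (the QV figures are the ones quoted above):

```python
import math

required_qv = 2 ** 40    # QV needed to support 40-qubit algorithms
ibm_qv = 128             # IBM's best reported QV at the time of writing
honeywell_qv = 512       # Honeywell's reported QV

# log2(QV) is roughly the largest square circuit (n qubits by n layers)
# that the machine can run reliably.
print(math.log2(required_qv))    # 40.0
print(math.log2(ibm_qv))         # 7.0
print(math.log2(honeywell_qv))   # 9.0
```

So even the best reported machines support roughly 7-by-7 or 9-by-9 square circuits, a long way from 40-by-40.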

NISQ quantum computers can’t handle 40-qubit quantum algorithms

Current NISQ quantum computers simply can’t handle 40-qubit quantum algorithms, as discussed in the previous section.

Technically, 43 qubits may be needed for 40-qubit quantum parallelism

The real goal is to support quantum parallelism on 40 qubits, achieving a quantum speedup of one trillion, but in practice a few additional qubits — ancilla qubits — may be needed to support those 40 qubits. So, 43 qubits is probably the better target.

There just aren’t many 40-qubit quantum algorithms published

But the overwhelming limitation to fully exploiting 40 qubits on even a real quantum computer with 40 or more qubits is simply:

  • There just aren’t many published algorithms ready to exploit 40 qubits.

It’s a chicken and egg problem

So, we have a classical chicken and egg problem:

  1. Why bother writing a 40-qubit quantum algorithm if there isn’t sufficient hardware to run it on?
  2. Why bother building 40-qubit quantum computers with sufficient capabilities to run complex 40-qubit quantum algorithms if no such algorithms exist?

My suggested solution: Simulate 40-qubit quantum algorithms now, be ready for real quantum computers when they become available

To me, the solution is simple:

  • Algorithm designers: Design, develop, and test complex 40-qubit quantum algorithms using advanced classical quantum simulators. Prove that they work. Publish them. Let the hardware engineers know that you are ready for real quantum computers with robust support for 40 qubits.
  • 40-qubit quantum algorithms will then be ready to run on advanced hardware as soon as it becomes available. Presuming that the hardware has enough qubits with sufficient qubit fidelity and connectivity.

Simulation is the best near-term path to supporting 40-qubit quantum algorithms

Just to emphasize this key point more strongly, simulation is the best path currently available for pursuing 40-qubit quantum algorithms.

Target qubit fidelity of 2–5 years from now

Since current real quantum computers cannot properly run 40-qubit algorithms, by definition we’re talking about running on real quantum computers as they might exist years in the future. I’m speculating that we will see real quantum computers of sufficient qubit fidelity to run 40-qubit algorithms in two to five years. Classical quantum simulators should be configured to match the likely qubit fidelities of two to five years from now.

So, why isn’t this happening? Where are all of the 40-qubit quantum algorithms?

But… clearly that isn’t happening. Why not? That’s the question to be explored by this informal paper.

Why aren’t there any 40-qubit quantum algorithms?

An alternative phrasing of the headline question is to ask why there aren’t any 40-qubit quantum algorithms.

A sampler of excuses for why people aren’t focused on 40-qubit quantum algorithms

Just off the top of my head, mostly from my personal observations and speculation, here is a sample of the excuses for not producing 40-qubit quantum algorithms today:

  1. It’s just technically too challenging. But I’m skeptical about that, although it’s at least partially true.
  2. Smaller algorithms are sufficient for the publication needs of most academic researchers.
  3. Smaller algorithms are easier to diagram — for publication.
  4. Smaller algorithms are easier to debug and test.
  5. Larger algorithms are very difficult to debug and test.
  6. Limited coherence time limits circuit depth, which limits how many qubits can be manipulated and entangled in a single large quantum computation.
  7. Large algorithms and NISQ quantum computers just don’t mix.
  8. The resource requirements for simulating 40 qubits could be enormous for more than fairly shallow circuits — 2⁴⁰ quantum product states (one trillion) times the circuit depth. Okay, so shoot for 30–32 qubits then.
  9. Large simulations require great patience.
  10. Simulators do exist, but simulation is still a bit immature and doesn’t inspire great confidence. It’s not easy to design, implement, and configure a noise model that does a realistic simulation of a realistic quantum computer.
  11. It’s far more impressive to run on a real quantum computer than a simulator, but current real quantum computers don’t offer robust support for 40 or even 32-qubit algorithms.
  12. People are currently impressed and focused on simply solving problems at all using a quantum computer, even if for only 23, 17, 11, or even 7 qubits. Even just saying the name of the problem being addressed is a reward in itself.
  13. Eventually 40-qubit quantum algorithms will happen, but for now they aren’t a priority.
  14. Easier to go after low-hanging fruit — diminishing returns for solving harder problems. Fewer qubits get the job done — results to be published. Breaking your back to do even just a few more qubits just isn’t worth it.
  15. Hand-coding of quantum algorithms is very tedious. Programmatic generation of quantum circuits is too big a challenge for many.
  16. Generalizing quantum algorithms to enable them to be fully parameterized and scalable is a difficult task, beyond the abilities of many. It’s easy to parameterize and scale based on an integer, but not so easy if the parameter is an atom or molecule or protein or drug.
  17. 24 qubits may be more than sufficient for variational methods. More qubits may offer diminishing returns, and 2⁴⁰ quantum states may be gross overkill for variational methods. And people are stuck with variational methods until qubit fidelity is eventually sufficient to support full quantum phase estimation (QPE) and quantum Fourier transform (QFT).
  18. Need for fine granularity of phase and probability amplitude. Not a problem for simulators, but very problematic for real quantum computers, with no certainty of a resolution in the next few years. If real quantum computers won’t support fine granularity of phase angles and probability amplitudes for another five years or more, why bother even simulating such algorithms? This is needed for quantum Fourier transform (QFT) and quantum phase estimation (QPE).

Significant resources needed to simulate quantum circuits with a large number of product states

Quantum circuits using a modest to moderate number of qubits and a modest to moderate number of quantum logic gates can be simulated with modest to moderate classical computing resources. No problem there.
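
Where it stops being "no problem" is easy to quantify. Assuming a full statevector simulation with double-precision complex amplitudes (16 bytes each, a common but not universal choice), memory alone grows as 2^n:

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory for a full statevector: 2^n complex amplitudes,
    each stored as complex128 (8 bytes real + 8 bytes imaginary)."""
    return bytes_per_amplitude * 2 ** n_qubits

for n in (30, 40, 44):
    print(n, "qubits:", statevector_bytes(n) / 2 ** 40, "TiB")
```

At 30 qubits the statevector is a manageable 16 GiB; at 40 qubits it is 16 TiB; at 44 qubits, 256 TiB. That exponential wall is why the simulation ceiling sits right around 40 qubits.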

Qubits, circuit depth, and product states

Just to reemphasize and highlight that it is these three factors combined which impact resource requirements for simulation of quantum circuits:

  1. Qubit count.
  2. Circuit depth.
  3. Product states. Each qubit has a quantum state, but entangled qubits have a combined quantum state, which is in general a superposition of multiple product states. A product state is simply a quantum state involving multiple qubits.
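
As a minimal illustration of "combined quantum state", consider the simplest case, two qubits: the four amplitudes factor into a product of two single-qubit states exactly when a00*a11 equals a01*a10 (the standard two-qubit separability condition), and a Bell state fails that test:

```python
def is_product_state(amps, tol=1e-12):
    """Two-qubit separability check: (a00, a01, a10, a11) factors into
    two independent single-qubit states iff a00*a11 - a01*a10 == 0."""
    a00, a01, a10, a11 = amps
    return abs(a00 * a11 - a01 * a10) < tol

s = 2 ** -0.5
print(is_product_state((1, 0, 0, 0)))  # |00>: True, no entanglement
print(is_product_state((s, 0, 0, s)))  # Bell state: False, entangled
```

Once qubits entangle, the simulator must track the combined state as a whole rather than one small state per qubit, which is where the resource costs come from.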

Deep circuits and product states

Just to reemphasize and highlight a point from the previous section: the data storage requirements for simulation are driven most strongly not by qubit count or circuit depth alone, but by the combination of product states and deep circuits.

Is circuit depth a big issue for 40-qubit quantum algorithms?

Shallow circuits should be very feasible at 40 qubits, but how deep can a 40-qubit circuit get before it just takes too long to simulate or consumes too much storage for quantum states?
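
A crude cost model suggests an answer, with the caveat that every constant here is an assumption: suppose each of d gate layers updates all 2^n amplitudes at, say, 8 floating-point operations per amplitude, and take the 1.1 petaflop/s peak throughput quoted from the TensorFlow Quantum paper above as the available compute:

```python
def rough_sim_flops(n_qubits, depth, flops_per_amp_per_layer=8):
    """Assumed cost model: every gate layer touches all 2^n amplitudes."""
    return depth * flops_per_amp_per_layer * 2 ** n_qubits

flops = rough_sim_flops(40, 40)
seconds_at_peak = flops / 1.1e15   # peak throughput from the TFQ quote
print(flops, "flops, about", round(seconds_at_peak, 2), "seconds at peak")
```

In practice a 40-qubit statevector is memory-bound rather than compute-bound, so real wall-clock times will be far longer than this model suggests. Still, the shape of the model is the real lesson: depth enters only linearly, while qubit count enters exponentially.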

But in theory, simulating 40 qubits should be practical

Despite the significant resource requirements needed for 2⁴⁰ product states and a moderately deep quantum algorithm, this is all still considered practical and within the realm of reason, so resource requirements alone for a 40-qubit simulation can’t be the obstacle precluding the existence of 40-qubit quantum algorithms.

In short, there’s no technical excuse for lack of 40-qubit quantum algorithms which can run on simulators

As far as I can tell, there’s no technical reason that precludes running 40-qubit quantum algorithms on today’s 40-qubit classical quantum simulators.

Mindset obstacles to 40-qubit quantum algorithms

There may indeed be technical mindset obstacles preventing algorithm designers from thinking about 40-qubit quantum algorithms.

Is a focus on variational methods holding back larger algorithms?

There are applications which could take advantage of quantum phase estimation (QPE) and quantum Fourier transform (QFT) — if those techniques were viable on today’s quantum computers. But since current quantum computers lack the qubit fidelity, connectivity, and coherence time to support full QPE and QFT, there is a strong incentive for algorithm designers to stay away from the use of QPE and QFT. Variational methods are currently the technique of choice as an alternative to QPE and QFT.

Need for fine granularity of phase and probability amplitude

Advanced quantum algorithm techniques such as quantum Fourier transform (QFT), quantum phase estimation (QPE), and amplitude amplification require very fine granularity of phase angles and probability amplitudes. This is not a problem for simulators with their arbitrary precision, but very problematic for real quantum computers, with no certainty of a resolution in the next few years.
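
To see how fine that granularity gets, consider the textbook quantum Fourier transform: an n-qubit QFT uses controlled-phase rotations down to an angle of 2*pi / 2^n. A simulator's double-precision arithmetic represents this easily; no near-term hardware can calibrate a gate to it:

```python
import math

def smallest_qft_angle(n_qubits):
    """Smallest controlled-phase rotation in the standard n-qubit QFT."""
    return 2 * math.pi / 2 ** n_qubits

print(smallest_qft_angle(40))   # about 5.7e-12 radians
```

A rotation of a few trillionths of a radian is many orders of magnitude below any plausible near-term gate calibration precision, which is exactly the author's point about simulators versus real hardware.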

Shouldn’t the promise of support for quantum phase estimation (QPE) and quantum Fourier transform (QFT) be sufficient to draw a focus on 40-qubit quantum algorithms?

One would think so! Variational methods are popular — for now — even though they are a technological dead end for quantum computing since they offer no prospect for dramatic quantum advantage.

Future demand for quantum phase estimation (QPE) and quantum Fourier transform (QFT) will eventually drive 40-qubit quantum algorithms

Eventually, I expect that people will wake up to the need for quantum phase estimation (QPE) and quantum Fourier transform (QFT) in order to achieve dramatic quantum advantage. Demand for QPE and QFT will finally drive the creation of 40-qubit quantum algorithms, in my view.

No real need for 40-qubit quantum algorithms until quantum phase estimation (QPE) and quantum Fourier transform (QFT) create a need

For now, until people move towards quantum phase estimation (QPE) and quantum Fourier transform (QFT), there will likely be no real need for 40-qubit quantum algorithms.

Why little publication of 40-qubit quantum algorithms even if simulation is limited?

Technically, there is probably a bias in favor of quantum algorithms which can actually be run — either on a real quantum computer or on a simulator — but I’d still expect to see more interest in at least publishing 32 to 40 and 50 to 80-qubit algorithms. But, I’m not seeing any such publication interest.

Any 40-qubit quantum algorithms in the Quantum Algorithm Zoo?

Actually, there were plenty of ambitious algorithms published in the early days of quantum computing, before we had even 5-qubit real quantum computers to run on. Simulators were less powerful and ran on significantly less-powerful classical computers — 10 to 25 years ago. Back then, paper algorithms were all that we had. The Quantum Algorithm Zoo website catalogs all of those early algorithms, including Shor’s factoring algorithm which uses thousands of qubits — in theory. You can find it here:

My full solution: A model for scalable quantum algorithms

My suggested solution earlier in this paper is to encourage people to focus on simulation of 40-qubit quantum algorithms to get results now and be ready for future real quantum computers when they finally have adequate support for 40-qubit quantum algorithms.

Tackling practical, production-scale real-world problems

40-qubit quantum algorithms would be a significant advance over the current state of the art, but not even close to supporting solutions for practical, production-scale real-world problems.

Will 40-qubits get us to The ENIAC Moment for quantum computing?

It’s hard to say for sure, but I suspect that a 40-qubit quantum algorithm, especially running only on a simulator, will not be enough to get us to The ENIAC Moment for quantum computing, where a practical real-world application is run at something approximating production-scale.

Will 40-qubits get us to The FORTRAN Moment for quantum computing?

I don’t expect that The FORTRAN Moment for quantum computing will be achieved in the next couple of years since significant research and experimentation will be needed to get there, and by then we will probably have 44 to 48-qubit simulators anyway.

What would resource requirements be for various application categories?

It would be interesting for those working in specific quantum application areas to catalog their estimated quantum resource requirements, either for algorithms they already know about or prospective algorithms they would like to have. Again, the focus here is the 32 to 40-qubit range, but whatever their actual or perceived requirements may be. Key factors to catalog:

  1. Qubit count.
  2. Qubit fidelity.
  3. Circuit depth.
  4. Qubit connectivity.
  5. Fineness of granularity required for phase and probability amplitude.
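
A catalog entry along those lines could be as simple as a record with one field per factor. The field names and the example values below are entirely hypothetical, included only to show the shape of the data:

```python
def resource_profile(application, qubits, fidelity, depth, connectivity, granularity_bits):
    """Hypothetical record format for an application's quantum resource needs."""
    return {
        "application": application,
        "qubit_count": qubits,
        "qubit_fidelity": fidelity,        # e.g. per-two-qubit-gate fidelity
        "circuit_depth": depth,
        "connectivity": connectivity,      # e.g. "nearest-neighbor", "all-to-all"
        "phase_granularity_bits": granularity_bits,
    }

# Illustrative entry only; these numbers are made up for the example.
entry = resource_profile("small-molecule chemistry", 40, "99.99%", 1000, "all-to-all", 20)
print(entry["qubit_count"], entry["circuit_depth"])
```

Collecting even rough profiles like this across application categories would let hardware vendors see which requirements cluster together, and where 40-qubit algorithms would actually land.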

40-qubit quantum algorithms which I may not be aware of

In truth, I don’t read or even scan every single research paper which comes along, so it is very possible that at least a few 40-qubit quantum algorithms exist that have escaped my attention.

Dramatic need for additional research

A large part of the reason that we don’t see any 40-qubit quantum algorithms is that there is not a sufficiently robust fundamental research foundation on which such algorithms can be based. The only solution is to proceed with such fundamental research. Areas ripe for research include:

  1. Metaphors.
  2. Design patterns.
  3. Algorithmic building blocks.
  4. Libraries.
  5. Frameworks.
  6. Performance characterization and measurement.
  7. Scaling in general, automated scaling, automating more situations currently requiring manual scaling.
  8. Automated algorithm and circuit analysis tools to detect design issues, such as for scaling.

Open source

It would be most advantageous for quantum algorithms, code, sample data, examples, and supporting documentation to be posted online as open source.

What’s a PhD candidate or thesis advisor to do?

I can sympathize with the plight of PhD candidates and their supervisors who have an incredibly hard choice to make:

  1. Focus on smaller algorithms (under 24 qubits) which can run on existing real quantum computers to deliver actual real results. Granted, they can’t handle production-scale practical real-world problems, but they are results sufficient for a PhD project.
  2. Focus on theoretical work with algorithms using 100 to 1,000 qubits — or even thousands of qubits — which not only can’t run on any existing real quantum computers, but are too large for even the best classical quantum simulators, even if they may be what are really needed for the long run to support production-scale practical real-world quantum applications. They can publish their theoretical findings, but nobody can use them in practice now or any time in the relatively near future.

Timeframe

So, when might we begin to see some significant, nontrivial 40-qubit quantum algorithms?

  1. Exhaustion of interest in variational methods. Desire to move significantly beyond what can be done with 12 to 23-qubit algorithms.
  2. Availability of real quantum computers with higher fidelity qubits and Quantum Volume of 2⁴⁰.
  3. Stronger interest in preparing for the real quantum computers which will be available in three to five years rather than on current and near-term quantum computers.
  4. Government research programs place fresh emphasis on 40-qubit quantum algorithms.

Who’s on first?

It’s anybody’s guess what quantum algorithm and quantum application will be the first to exploit 40 qubits for quantum parallelism and approach some fractional degree of quantum advantage.

When will all quantum algorithms be 40-qubit?

The whole point of quantum computing is to do massive calculations which aren’t possible or would take too long on a classical computer, so the true, grand promise of quantum computing won’t be achieved until essentially all quantum algorithms are capable of massive computations.

Take the 40-qubit challenge!

I don’t have the energy or resources to pursue it, but seriously, it would be great if some enterprising individual or organization would run a 40-qubit challenge to encourage — and reward — the design and development of practical and scalable 40-qubit algorithms and applications addressing realistic real-world problems.

  • Take the 40-qubit challenge!
  • We’re taking the 40-qubit challenge!
  • We took the 40-qubit challenge!

Inspire the hardware vendors to support 40-qubit algorithms

I seriously believe that if the quantum computer hardware vendors could see that algorithm designers are focused on simulation of 40-qubit quantum algorithms, then they would have a strong incentive to get their act together and design and build real quantum computers which can run those algorithms sooner rather than later.

  • Take the 40-qubit challenge!

Summary and conclusions

  1. Scalable quantum algorithms are needed. 40 qubits may be the near-term limit for simulation, but scalable algorithms will extend beyond 40 qubits once quantum computers with enough qubits and sufficient qubit fidelity and connectivity become available.
  2. There’s no technical obstacle to 40-qubit quantum algorithms. Real quantum computers are not yet available with sufficient qubit fidelity and connectivity, but classical quantum simulators are available.
  3. A desire to run variational methods on a real quantum computer may be distracting people from the greater potential of simulating a larger number of qubits.
  4. There may be proprietary or secret algorithms which utilize 40 qubits.
  5. Research programs for larger algorithms are desperately needed.
  6. Research programs should explicitly call for simulation of 28, 32, 36, and 40-qubit quantum algorithms.
  7. Research is also needed for larger simulators, targeting 44, 48, and 50 qubits. Maybe even 54 to 56 qubits. And maybe even 60 qubits. Or at least call for research into whether there are good reasons not to try to simulate more than 50 qubits.
  8. I encourage people to Take the 40-qubit challenge! — design and develop practical and scalable 40-qubit algorithms and applications addressing realistic real-world problems.
  9. A significant collection of proven 40-qubit quantum algorithms, sitting on the shelf, and ready to go could be what’s needed to inspire the quantum computer hardware vendors to support 40-qubit algorithms. Only then can we realistically expect the hardware vendors to… Take the 40-qubit challenge!
  10. It may take another three or four years before we see the first 40-qubit algorithm and application running on a real quantum computer.
  11. And it could take five to seven years before 40-qubit algorithms become the norm.
