Proposed Moore’s Law for Quantum Computing

Jack Krupansky
8 min read · Nov 15, 2019


There’s not yet a recognized equivalent of Moore’s Law for quantum computing, but I propose that a doubling of qubit count every one to two years is a workable rule of thumb for now.

The Wikipedia article for Moore’s Law captures the conventional wisdom that “Moore’s law is the observation that the number of transistors in a dense integrated circuit doubles about every two years.”

I have not worked out a careful graph, but it feels as if a one-to-two-year cycle makes sense at this stage for quantum computing. So, my proposal is:

  • Qubit count of general-purpose quantum computers will double roughly every one to two years, averaging about every 18 months.

Moore’s Law itself has a fairly stable two-year cycle, but progress in quantum computing has been rather uneven, even though it has been fairly rapid in recent years.

And to be clear, this is indeed a rule of thumb, a rough ballpark, and not designed to have any great precision.
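
In formula form, the proposed rule is just exponential growth. Here is a minimal sketch (my own illustration, using the 53-qubit 2019 baseline discussed later in this article, with the doubling period as an assumed parameter rather than part of the proposal itself):

```python
# A minimal sketch of the proposed rule: exponential growth with a
# doubling period assumed to be one to two years (~1.5 on average).
def projected_qubits(year, base_year=2019, baseline=53, doubling_period=1.5):
    """Qubit count projected by the proposed rule of thumb."""
    return baseline * 2 ** ((year - base_year) / doubling_period)

print(round(projected_qubits(2025)))  # ~848 qubits at an 18-month cadence
```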

Some issues and caveats:

  1. Qubit count alone does not tell us anything about equally important metrics such as coherence time and connectivity, but I expect that coherence time will generally progress at a comparable pace.
  2. Qubit count is not proportional to performance in terms of quantum parallelism — adding k qubits increases parallelism by a factor of up to 2^k, so the performance improvement from doubling qubits can be far greater than the increase in qubit count alone (see the sketch just after this list). That would constitute a distinct law (Neven’s Law), since Moore’s Law is about raw hardware devices, not performance per se.
  3. It is unclear whether the date of announcement or the date of delivery should be used. In some cases, for research systems, it may be date of publication. In some cases, announcements have been made but the systems have not been delivered or even acknowledged as being functional.
  4. I hate to cater to hype and would prefer to date based on hard delivery, but the reality right now is that sometimes an announcement is the best and most readily available data.
  5. And sometimes rumors are the best information we have. Generally, rumors should be discounted, but sometimes they are actually fairly reasonable and do make sense, especially when parties have a penchant for excessive secrecy. Good, solid judgment should be exercised when dealing with rumors or sketchy reports.
  6. Maybe announcements (or rumors) can be included, but with an asterisk until delivered.
  7. The proposal excludes quantum error correction (QEC), which creates logical qubits from some large number of physical qubits. Ultimately, all that developers and users care about is the qubits seen by the application. So, for systems which distinguish logical qubits from physical qubits, use the count of logical qubits.
  8. I haven’t included D-Wave Systems in my proposed rule since their machine is rather atypical of what is commonly referred to as a general-purpose quantum computer, a gate-based quantum computer, or a universal quantum computer.
  9. The pace of technological change may change the proposed rule at any time — progress may accelerate or decelerate or even oscillate between the two.
  10. Some particular technological changes might not change the rule per se, but may shift the baseline. That shift might be a reduction, such as a new technology which may be better for the long run but takes a few years to catch up, or an increase, such as a new technology that enables a dramatic leap in qubit count, more than twice the previous record, in its very first year, but then settles back down into the proposed rule in subsequent years.
  11. The transition from physical qubits to logical qubits with quantum error correction (QEC) will likely shift the baseline dramatically, or not, depending on both the ratio of logical to physical qubits as well as the delta of raw physical qubits, which may be modest or may leap if there is also a dramatic improvement in the scalability of the technology for physical qubits.
  12. And sometimes the baseline and the rule may change in a single cycle, either to increase or to decrease.
  13. T1, T2, and T2* are very specific technical measures of decoherence. The discussion here is not down at that level of precision. Generally, developers are more concerned with maximum circuit depth or probability of coherence after execution of a certain number of gates rather than precise timing. And as noted earlier, the focus here is on device count, not performance, coherence, or connectivity.
  14. Connectivity is more difficult to measure and extrapolate. Some architectures may provide full any-to-any connectivity, while others support only nearest-neighbor connectivity, or even more constrained connectivity, relying on extra SWAP gates to bring the quantum states of two qubits close enough that nearest-neighbor coupling can be used, which in turn depends on the fidelity of the SWAP gates.
  15. IBM has introduced a new single metric which they call quantum volume, which “quantifies the largest random circuit of equal width and depth that the computer successfully implements.” This essentially combines qubit count and coherence time into a single metric. But so far this new metric is not commonly used, and nobody is suggesting a law based on it, yet.
  16. Finally, it may in fact not be wise to extrapolate the proposed rule all the way out to infinity — the rule may work for an extended period of time, until it doesn’t. And maybe there will be occasional gaps or compressions due to one-time flukes in the underlying technology.
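
To make item 2 concrete, here is the sketch it refers to: a quick illustration of how the state space behind quantum parallelism grows with qubit count (the specific counts are just examples drawn from this article):

```python
# The state vector of n qubits holds 2**n amplitudes, so adding k qubits
# multiplies the size of the state space by up to 2**k. Doubling from
# 53 to 106 qubits grows the state space by a factor of 2**53, not 2.
for n in (5, 20, 53, 106):
    print(f"{n:>3} qubits -> 2**{n} = {2**n:.3e} amplitudes")

k = 106 - 53
print(f"adding {k} qubits multiplies parallelism by up to 2**{k} = {2**k:.3e}")
```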

So, where are we today?

  1. IBM claimed to have a 5-qubit system in May 2016.
  2. Rigetti claimed to have a 5-qubit system in September 2016.
  3. IBM listed a 14-qubit system as of September 2019, though it’s not clear when it first became available.
  4. IBM claimed to have a 16-qubit system in May 2017.
  5. IBM claimed to have a 17-qubit system in May 2017.
  6. IBM claimed to have a 20-qubit system in November 2017.
  7. IBM claimed to be working on a 50-qubit system in November 2017.
  8. Rigetti claimed to have a 19-qubit system in December 2017.
  9. IonQ had an 11-qubit device in 2018.
  10. Rigetti had an 8-qubit system available in 2018.
  11. Rigetti had a 16-qubit system available in 2018.
  12. Rigetti announced a 128-qubit chip in 2018 and claimed it would be available within a year, but no sign of it yet — as of November 15, 2019.
  13. Intel reported that a 49-qubit chip was being developed in 2018, but no sign of it yet.
  14. IBM reported working on a 50-qubit device in 2017, but no further news on that, although they have confirmed a 53-qubit chip in September 2019. Unclear if these were two distinct projects or an evolution of a single project.
  15. IBM claimed to have a 53-qubit system in September 2019.
  16. Google reported working on a 72-qubit chip in 2018, but the only recent reference to it was a media report that they were having problems controlling the qubits. They did announce a working 53-qubit system in 2019, although it’s not clear whether it is a research-only system or intended to become available for general use.
  17. Microsoft has reported working on a quantum computer, but not even a qubit count has been reported.
  18. In December 2018, IonQ reported creating a system that stored 160 total qubits and operated on 79 of them, although the more intensive benchmarking was done only on a 13-qubit configuration. That system is not generally available, and it is unclear how the first production machine will be configured, or how reliable 79 qubits are. It’s an intriguing report, but some follow-through is needed to confirm it.

So, we have 16, 19, and 20 qubits in 2017 and 2018, and 53 qubits in 2019. That’s a rather skimpy and uneven dataset, but it is at least consistent with a doubling trend over one to two years.
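
As a rough sanity check, one can back out the doubling time implied by two of those data points. A minimal sketch, using approximate announcement dates with all the caveats noted earlier:

```python
import math

# Doubling time implied by growth from q0 to q1 qubits over `years` years:
# T = years * ln(2) / ln(q1 / q0)
def implied_doubling_time(q0, q1, years):
    return years * math.log(2) / math.log(q1 / q0)

# IBM: ~20 qubits (November 2017) to 53 qubits (September 2019), ~1.8 years.
print(f"{implied_doubling_time(20, 53, 1.8):.2f} years")  # roughly 1.3 years
```

That lands comfortably inside the proposed one-to-two-year window, though with so few data points it is suggestive rather than conclusive.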

The questions for the next year are:

  1. Will Rigetti fulfill their 128-qubit announcement?
  2. Will Google fulfill their 72-qubit announcement? Or have some other advance beyond 53 qubits?
  3. Will IonQ fulfill their 79-qubit announcement?
  4. What will IBM do next?
  5. Will Honeywell announce a system?
  6. Will some other, unknown player announce a system?

And then we have a whole year after that to double the current benchmark of 53 qubits.

  1. Will it be Rigetti and 128 qubits?
  2. Will Google do it?
  3. Will IonQ do it?
  4. Will IBM do it?
  5. Or some other player?

Longer term, one also has to wonder what advances might be percolating in the labs of academia. But even as academic advances get attention, it could take years for such research to find its way into practical commercial products. That said, much of the qubit doubling five years or more out in the future is very likely to depend on academic research in the next few years.

In any case, here’s my extrapolation from where we are now, in 2019, doubling every two years:

  1. 53 qubits in 2019.
  2. 106 qubits in 2021 (2 years).
  3. 212 qubits in 2023 (4 years).
  4. 424 qubits in 2025 (6 years).
  5. 848 qubits in 2027 (8 years).
  6. 1,696 qubits in 2029 (10 years).
  7. 3,392 qubits in 2031 (12 years).
  8. 6,784 qubits in 2033 (14 years).
  9. 13,568 qubits in 2035 (16 years). Not quite enough for Shor’s algorithm to crack strong public key encryption (4096-bit keys need four times 4096, or 16,384, qubits), but may be enough for some modified versions of the algorithm, although coherence time may be a limiting factor.
  10. 27,136 qubits in 2037 (18 years). Enough for Shor’s algorithm (4 times 4K qubits required).

A caveat to that extrapolation is that we still have six more weeks in 2019 for Rigetti et al. to establish a higher bar than 53 qubits.

Note that this is a worst-case extrapolation, doubling every two years; sometimes doubling will take just a single year.

If instead I presume a doubling every 18 months on average, or a factor of four every 3 years, this gives us:

  1. 53 qubits in 2019.
  2. 212 qubits in 2022 (3 years).
  3. 848 qubits in 2025 (6 years).
  4. 3,392 qubits in 2028 (9 years).
  5. 13,568 qubits in 2031 (12 years). Not quite enough for Shor’s algorithm to crack strong public key encryption (4096-bit keys need four times 4096, or 16,384, qubits), but may be enough for some modified versions of the algorithm, although coherence time may be a limiting factor.
  6. 54,272 qubits in 2034 (15 years).
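
Both tables follow the same compound-doubling arithmetic, so here is a minimal sketch that reproduces them; the 16,384-qubit threshold is the four-times-4096 Shor’s algorithm estimate mentioned above:

```python
# Reproduce the extrapolation tables: repeated doubling from a baseline.
SHOR_THRESHOLD = 4 * 4096  # 16,384 qubits for 4096-bit keys, per the estimate above

def extrapolate(baseline=53, base_year=2019, period_years=2.0, steps=9):
    qubits, year = baseline, base_year
    for _ in range(steps):
        qubits *= 2
        year += period_years
        note = "  <- enough for Shor's algorithm" if qubits >= SHOR_THRESHOLD else ""
        print(f"{qubits:>6,} qubits in {year:.0f}{note}")

extrapolate(period_years=2.0)    # the two-year table above
# extrapolate(period_years=1.5)  # the 18-month variant
```

Changing period_years (or the baseline) is all it takes to explore the other cadences discussed here.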

And if Rigetti does in fact come out with a 128-qubit machine next year, and presuming doubling every 18 months:

  1. 128 qubits in 2020.
  2. 256 qubits in 1.5 years — 2021.
  3. 512 qubits in 3 years — 2023.
  4. 1K qubits in 4.5 years — 2024.
  5. 2K qubits in 6 years — 2026.
  6. 4K qubits in 7.5 years — 2027.
  7. 8K qubits in 9 years — 2029.
  8. 16K qubits in 10.5 years — 2030.
  9. 32K qubits in 12 years — 2032.
  10. 64K qubits in 13.5 years — 2033.

And these are all physical qubits. When logical qubits do become available, the qubit count baseline will have to be reset to reflect logical qubits rather than physical qubits. For example, if each logical qubit required on the order of a thousand physical qubits, a machine with tens of thousands of physical qubits would offer only a few dozen logical qubits.

We’ll have to revisit all of this over the next few years to get a firmer handle on the actual rate of progress.
