When Will Quantum Computing Have Its ENIAC Moment?

When can we expect quantum computing to advance to a stage comparable to the public unveiling of ENIAC in 1946, when the future has finally arrived and become now, when a quantum computer is at last capable of solving a substantial, nontrivial, real-world computing problem with a nontrivial amount of data rather than offering merely another promise and hint of a future to come, some day, but not real soon? This informal paper explores the key issues currently standing between quantum computing and its ENIAC moment and suggests a timeframe for resolving those issues, leading to the ENIAC moment.

ENIAC (Electronic Numerical Integrator and Computer) was the first big, highly publicized marvel of electronic digital computing which was actually capable of solving real, substantial, nontrivial computing problems with nontrivial amounts of data rather than being merely yet another promise and hint of a future to come, some day, but not real soon.

Quantum computing has seen quite a few notable fits and starts over the past few years, numerous tantalizing tastes of the (promised) future to come, but so far nothing as impressive as ENIAC was in 1946.

First, a basic definition:

  • ENIAC moment. The stage at which a nascent technology is finally able to demonstrate that it is capable of solving a significant real-world problem — actually solving a problem and delivering substantial real-world value, in a manner which is a significant improvement over existing technologies. The moment when promises have been fulfilled.

See also a subsequent paper, When Will Quantum Computing Have Its FORTRAN Moment?, which focuses on the subsequent moment when the availability of an easy to use high-level programming language enables much wider adoption of quantum computing.

To be clear, the ENIAC moment may require a monumental and heroic effort by an elite staff, while the FORTRAN moment marks the transition to a more modest, average level of effort by average staff to solve problems.

TL;DR

Summary of main issues

  1. Need a lot more qubits — 500 to 1,000 rather than the current 50 to 100. And they must support a universal gate set enabling arbitrary quantum circuits on those qubits — special-purpose, single-function quantum computers won’t constitute an ENIAC moment.
  2. Need significantly longer coherence time — milliseconds rather than microseconds.
  3. Quantum error correction? May be needed eventually, but not for an ENIAC moment. A focus on reasonably stable qubits, longer coherence time, and reasonable environmental shielding is probably sufficient for an ENIAC moment.
  4. Need greater connectivity between more combinations of qubits — for entanglement.
  5. Need more and richer hardware features.
  6. Need more and richer firmware features. Including basic numeric math features and some uniquely quantum features such as quantum Fourier transform and phase estimation.
  7. Need better methods and tools for transforming classical-style algorithms to exploit the oddities of quantum computers and to exploit quantum parallelism.
  8. Need at least a healthy subset of the features of classical Turing machines, if not the full set of Turing machine features, merged and blended with the unique features of quantum computers. Maybe not a true, fully hybrid machine, but reasonably close.
  9. Need a few killer applications which really do show a distinctive quantum advantage over classical computing — and are not relatively trivial toy applications with very trivial amounts of data. And… they must be applications which the general public can relate to and appreciate as compelling. Unless the general public is stunningly captivated, there will be no ENIAC moment.
  10. A distinctive quantum advantage means that the application would not be practical on even a relatively large classical supercomputer or even a relatively large distributed network of classical computers. Or, at a minimum, the quantum program runs at least ten to a hundred if not a thousand times faster than a comparable algorithm on a reasonably fast classical computer.

Not all of those capabilities will be absolutely necessary to achieve the ENIAC moment for quantum computing, but certainly well more than a bare majority of them. More than a minimal deficit in any area would delay the ENIAC moment.

Note: There are likely quite a few additional issues to be resolved, but that’s beyond the scope of this paper.

ENIAC contemporaries

ENIAC very limited, but an essential critical mass

General purpose even if an initial target application

General purpose is key

Minimal capacity

Those twenty 10-digit numbers would require 20 x 35 = 700 qubits in a quantum computer just to represent the raw numbers, let alone intermediate calculations. On top of that, the arithmetic operations were hardwired in ENIAC, while current general-purpose quantum computers lack even such basic arithmetic operations, requiring arithmetic to be simulated by discrete operations on individual qubits.

The 700 qubit number is just for a basic comparison to ENIAC. An actual ENIAC moment for quantum computing may require substantially more qubits (1,000 or more), or maybe not even that many (200 to 500?), depending on the specific application.
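For the curious, here is the arithmetic behind that 700-qubit figure, as a quick Python sketch (the 34-bits-plus-sign accounting is my reading of the 35-bit figure, not anything ENIAC-official):

```python
from math import ceil, log2

# Bits needed to represent one signed 10-digit decimal number:
# 10 decimal digits carry 10 * log2(10), about 33.2 bits of information,
# rounded up to 34 bits, plus one sign bit, for 35 bits in all.
digits = 10
magnitude_bits = ceil(digits * log2(10))  # 34
bits_per_number = magnitude_bits + 1      # 35, including the sign

# ENIAC's twenty 10-digit numbers, mapped naively onto one qubit per bit:
numbers = 20
print(numbers * bits_per_number)  # 700 qubits, before any scratch space
```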

General-purpose programming

Even current general-purpose quantum computers do have rudimentary programming capabilities, called quantum circuits: sequences of quantum logic gates which can be composed or generated by sophisticated software running on classical computers. But these so-called gates are extremely primitive, each operating on only one, two, or maybe three qubits at a time and performing only basic quantum mechanical operations, called unitary operators.
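To make that concrete, here is a minimal sketch of such a circuit, using the open-source Qiskit toolkit purely for illustration (this paper doesn't presume any particular toolkit):

```python
# A minimal quantum circuit sketch using Qiskit's standard API.
# Each gate touches only one, two, or three qubits and performs a single
# unitary operation -- nothing remotely as high-level as "add" or "multiply".
from qiskit import QuantumCircuit

qc = QuantumCircuit(3)
qc.h(0)           # one-qubit Hadamard: puts qubit 0 into superposition
qc.cx(0, 1)       # two-qubit CNOT: entangles qubit 1 with qubit 0
qc.ccx(0, 1, 2)   # three-qubit Toffoli: about as rich as gates get
qc.measure_all()  # read the qubits out as classical bits
print(qc.draw())
```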

Simulating multiplication of two 10-digit decimal numbers (35 bits each in binary) would require hundreds of quantum logic gates, far too tedious to code by hand, so the gate sequence must be generated by software on a classical computer.

Worse, because of quantum decoherence, the quantum state of the qubits is likely to have significantly decayed before execution of that gate sequence could be completed. Thus, the urgency of significantly improved coherence time.

Need Turing machine capabilities

No loops. No conditional branching. No functions or subroutines. No basic arithmetic. No rich data types. No high-level programming languages or their semantic richness.

ENIAC didn’t have all of those features either, but did have enough of them to be extremely useful, and the rest quickly followed within a few years.

Radical redesign of classical algorithms needed

Granted, you do gain the power of superposition, entanglement, interference, and quantum parallelism, but at the very steep cost of losing the intellectual power of classical Turing machines.

Probabilistic vs. deterministic

For sure, there are plenty of applications where probabilistic and statistical results are both acceptable and even desirable, but that leaves many (most) deterministic applications out in the cold.

Special purpose quantum computers don’t cut it

This is indeed a powerful and useful machine, but personally I consider it to be more of a quantum coprocessor. Granted, it is a very sophisticated and powerful — and very useful — coprocessor, but a coprocessor nonetheless rather than a full-fledged, general purpose computer.

Also, in my view, it has more in common with analog computers than a digital computer.

Coprocessors

Need more qubits

Quantum computers will have to grow by a factor of 7 to 14 (700 qubits versus today's 50 to 100) just to catch up to ENIAC. And as mentioned, that's just the raw ability to represent the numbers, not the ability to perform even basic arithmetic or more complex programming logic.

Exploiting commercially-available components

Earlier relay-based digital computers similarly exploited existing commercially-available electromechanical relays, such as used in telephone switching systems.

Even transistors, invented in 1947 (a year after ENIAC was operational and public), went through years of commercial development before they were commonly used in digital computers a full decade later.

Rolling your own qubits

So, the quantum hardware folks are still stuck in the quantum equivalent of the early 1900’s of electronics — commercial radios were able to use off-the-shelf commercially-available vacuum tubes in 1920.

To be fair, the quantum hardware guys are still pioneers fighting through the frontier, a very long way from the promised land of quantum computing.

Perfecting basic qubits

We need to go from 50 qubits which decohere in 100 microseconds to 5,000 qubits which can last 10 milliseconds. That's two orders of magnitude on both counts.

Okay, maybe that's too much to ask for at the ENIAC stage of quantum computing, but a single, full order of magnitude — 500 qubits with a full millisecond of coherence — is probably a good and doable target. Still, even that significantly reduced target is not within imminent reach.
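To put those coherence numbers in perspective, here is a rough back-of-the-envelope sketch; the 100-nanosecond gate time is my own ballpark assumption for superconducting qubits, not a measured spec for any particular machine:

```python
# How many sequential gates fit within a coherence window?
# The ~100 ns per-gate time is an assumed ballpark figure, not a spec.
gate_time_s = 100e-9  # assumed time per gate, in seconds

windows = [("today, ~100 microseconds", 100e-6),
           ("ENIAC-moment target, ~1 millisecond", 1e-3),
           ("stretch goal, ~10 milliseconds", 10e-3)]
for label, coherence_s in windows:
    print(f"{label}: ~{int(coherence_s / gate_time_s):,} sequential gates")
```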

Difficulty of designing quantum algorithms

But that’s all still way down the road compared to having significant numbers of working qubits.

I explore this algorithmic deficit in my paper The Greatest Challenges for Quantum Computing Are Hardware and Algorithms.

Lack of basic math

At some point, numeric calculation will be seen as important enough that special hardware or firmware analogous to that of ENIAC will be provided, so that quantum algorithms can perform basic math operations as single steps in an algorithm rather than as dozens, hundreds, or thousands of discrete quantum logic gates.

Granted, the quantum firmware would need to translate each of those single steps into many dozens, hundreds, thousands, or even millions of discrete qubit-level quantum logic gates to be executed, much as is done under the hood today in a classical computer based on transistors and digital logic gates. But at least then the hardware or firmware can be optimized for such operations, much as ENIAC was, and a great burden will be lifted off the shoulders of the overwhelmed quantum program developer.
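To illustrate the scale of that burden, here is a textbook one-bit quantum full adder, sketched in Qiskit (again, an illustrative toolkit choice, not a prescription), showing how even a single bit of addition expands into multiple primitive gates:

```python
# Textbook one-bit quantum full adder. Qubits: a, b, carry-in, and a
# scratch qubit (initially |0>) that ends up holding the carry-out.
from qiskit import QuantumCircuit

adder = QuantumCircuit(4)  # q0 = a, q1 = b, q2 = carry-in, q3 = |0>
adder.ccx(0, 1, 3)         # q3 = a AND b
adder.cx(0, 1)             # q1 = a XOR b
adder.ccx(1, 2, 3)         # q3 = carry-out of a + b + carry-in
adder.cx(1, 2)             # q2 = sum = a XOR b XOR carry-in
print(adder.draw())
```

A 35-bit ripple-carry addition would chain roughly 35 such stages, and each Toffoli itself decomposes into a half dozen or so CNOTs plus single-qubit gates on real hardware, which is how a single "add" balloons into hundreds of gates.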

Lack of special math functions

A comparable level of function on a quantum computer would be a quantum Fourier transform as a discrete operation which a programmer could invoke, even though under the hood the quantum firmware would need to translate that single, discrete operation into many dozens, hundreds, thousands, or even millions of hardware-level quantum logic gates to be executed on individual qubits.

Today, the quantum program developer must use classical code to generate the very complex sequence of quantum logic gates needed to implement a quantum Fourier transform. There may be code libraries on the classical computer to facilitate this expansion, but that’s still a rather inefficient approach compared to having optimized firmware which handles the transform as a single operation rather than having to transmit a very large quantum circuit between the classical and quantum computers.
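For a flavor of that expansion, here is the standard textbook QFT construction, generated by a classical Python function (sketched with Qiskit for illustration):

```python
# Standard textbook quantum Fourier transform, built gate by gate.
from math import pi
from qiskit import QuantumCircuit

def qft_circuit(n: int) -> QuantumCircuit:
    qc = QuantumCircuit(n, name=f"qft_{n}")
    for j in range(n):
        qc.h(j)                             # Hadamard on qubit j
        for k in range(j + 1, n):
            qc.cp(pi / 2 ** (k - j), k, j)  # controlled phase rotation
    for j in range(n // 2):
        qc.swap(j, n - 1 - j)               # reverse the qubit order
    return qc

# n Hadamards plus n(n-1)/2 controlled-phase gates: quadratic growth,
# which is why the sequence is generated by software, not coded by hand.
print(qft_circuit(5).size())
```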

The point is that current and near-term quantum computers are nowhere near as sophisticated as ENIAC was in 1946 for even basic math.

Universal quantum computers

A merger of at least a relatively significant subset of the combined capabilities would probably be warranted, in my view, but I won’t draw any bright lines at this stage.

Maybe a true universal quantum computer will have to wait three, five, or even ten years after the ENIAC moment, but at least a palpable subset of such universal capabilities is likely to be required to make the ENIAC moment happen.

Critical mass of features needed

Applications ripe for quantum computing

People can relate at least a little to artillery trajectories and nuclear weapons, but mere optimization of delivery trucks, design of yet another drug, or simulation of a relatively simple molecule like water just seems like a task that a classical supercomputer could easily handle already. Such applications will continue to fail to capture the imagination of the general public the way ENIAC did in 1946.

Or comparable to the successful prediction of the 1952 presidential election by the UNIVAC I computer, the first significant commercial electronic digital computer, created by the original designers of ENIAC itself.

Killer app for quantum computing

Maybe it will indeed be some particularly elaborate form of optimization, drug discovery, or simulation of a nontrivial molecule.

Or maybe it will simply be a big surprise, some application which we can't even dream of today, given our limited conception of the true capabilities of quantum computers which don't yet exist.

Classical computers are a tough act to follow

Replacing human “computers” and calculating at much higher speeds and using much greater volumes of data was a truly amazing feat in the 1940’s.

Today, even amazing supercomputers and massive data centers are simply taken for granted.

I mean, how much computing power is there in a common handheld smartphone, a million to a billion times more power than ENIAC?!

Seriously, the designers of quantum computers and quantum algorithms will have to deliver something incredibly awesome to capture the attention and enthusiasm of this crowd.

Quantum parallelism is difficult to exploit

Progress has been made. Significant progress. But much difficult work remains.

And there is no question that full-blown support for quantum parallelism across a reasonably wide range of applications is absolutely essential — it's practically the only reason to even want a quantum computer.

And then the ENIAC moment arrives

The ENIAC moment for quantum computing will finally have arrived.

Shor’s factoring algorithm as an ENIAC moment

But I have my doubts about Shor's algorithm as currently envisioned, and 8K-qubit machines capable of executing the millions of quantum logic gates required by Shor's algorithm are not on any realistic horizon at this stage anyway.

No, the IBM Q System One was not a candidate for The ENIAC Moment

Sure, IBM did package those 20 qubits in a very sleek physical package, but… it was more smoke and mirrors than capable of delivering substantial business value for production-scale applications.

The IBM press release:

  • IBM Unveils World’s First Integrated Quantum Computing System for Commercial Use
  • IBM to Open Quantum Computation Center for Commercial Clients in Poughkeepsie, NY
  • YORKTOWN HEIGHTS, N.Y., Jan. 8, 2019 /PRNewswire/ — At the 2019 Consumer Electronics Show (CES), IBM (NYSE: IBM) today unveiled IBM Q System One™, the world’s first integrated universal approximate quantum computing system designed for scientific and commercial use. IBM also announced plans to open its first IBM Q Quantum Computation Center for commercial clients in Poughkeepsie, New York in 2019.
  • https://newsroom.ibm.com/2019-01-08-IBM-Unveils-Worlds-First-Integrated-Quantum-Computing-System-for-Commercial-Use

ENIAC moment in five to seven years

Five years? Maybe, possibly. I’d be willing to bet on the five to seven year timeframe.

Maybe even four years. And if it does happen in three years it won’t be a big shock to me at all.

But two years? Very unlikely.

Might it take ten years, or more? Possibly, but I hope not, and I don’t think there is a good reason to be that pessimistic — again, I’m only referring to the ENIAC moment here, vintage 1946, not the advanced computers of the late 1950’s and 1960’s.

Best qubit technology?

There is no great clarity as to which qubit technology will win and be used for the ENIAC moment of quantum computing.

Even then, there will be no great clarity as to whether that technology is really the best for the future, in much the same way as the basic technologies used by the original ENIAC were fairly quickly eclipsed within just a few years.

Just to be clear, the ideal qubit technology is not necessarily a requirement for the ENIAC moment.

"Any of the above" might well be the right answer for which qubit technology should or could be bet on for the ENIAC moment.

Secret labs?

After all, an ENIAC moment is by definition a moment of public disclosure. ENIAC was indeed running in 1945, unbeknownst to the general public, until its public unveiling in 1946.

Quantum Advantage and Quantum Supremacy

For some people, quantum advantage and quantum supremacy are essentially synonyms, and for many practical applications that may well be the case. But I would suggest that quantum supremacy also implies a quantum advantage across a fairly broad range of applications — the quantum computer would have to outperform classical computers on a lot more than the single application required for the ENIAC moment alone. See a more detailed treatment in the What Is Quantum Advantage and What Is Quantum Supremacy? paper.

The ENIAC moment will not mean that a quantum advantage or quantum supremacy has necessarily been achieved across a wide range of applications. It will simply be a milestone on the path to such a broader advantage.

And the ENIAC moment will certainly not mean that classical computers are now all obsolete. That will take a lot more development, particularly the advent of a true, full, universal quantum computer, combining the features of both a pure quantum computer and the Turing machine and other advanced capabilities of a classical computer, as well as a lot of evolution to shrink the form factor to match that of classical computers.

Quantum computing landscape continues to evolve

I’ll continue to update this informal paper, or post a successor, as progress is made on the relevant issues.

See also a subsequent paper, When Will Quantum Computing Have Its FORTRAN Moment?, which focuses on the subsequent moment when the availability of an easy to use high-level programming language enables much wider adoption of quantum computing.
