When Will Quantum Computing Have Its ENIAC Moment?
When can we expect quantum computing to have advanced to a stage comparable to the public unveiling of ENIAC in 1946, when the future has finally arrived and become now, when a quantum computer is finally capable of solving a substantial, nontrivial, real-world computing problem with nontrivial amounts of data rather than being merely yet another promise and mere hint of a future to come, some day, but not real soon? This informal paper explores the key issues currently standing between quantum computing and its ENIAC moment and suggests a timeframe for resolution of those issues, leading to the ENIAC moment.
ENIAC (Electronic Numerical Integrator and Computer) was the first big, highly publicized marvel of electronic digital computing which was actually capable of solving real, substantial, nontrivial computing problems with nontrivial amounts of data rather than being merely yet another promise and mere hint of a future to come, some day, but not real soon.
Quantum computing has seen quite a few notable fits and starts over the past few years, numerous tantalizing tastes of the (promised) future to come, but so far nothing as impressive as ENIAC was in 1946.
First, a basic definition:
- ENIAC moment. The stage at which a nascent technology is finally able to demonstrate that it is capable of solving a significant real-world problem — actually solving a problem and delivering substantial real-world value, in a manner which is a significant improvement over existing technologies. The moment when promises have been fulfilled.
See also a subsequent paper, When Will Quantum Computing Have Its FORTRAN Moment?, which focuses on the subsequent moment when the availability of an easy to use high-level programming language enables much wider adoption of quantum computing.
To be clear, the ENIAC moment may have required monumental and heroic effort of elite staff, while the FORTRAN moment is the transition to a more modest and average level of effort by average staff to solve problems.
Quantum computing will not have its ENIAC moment any time soon, but it is likely, in my own personal view, within five to seven years, and maybe even in four, or possibly even three years if we’re really (really!) lucky.
Summary of main issues
To summarize the main issues that are currently precluding an ENIAC moment for quantum computing any time soon:
- Need a lot more qubits — 500 to 1,000 rather than the current 50 to 100. And with support for a universal gate set supporting arbitrary quantum circuits on those qubits — special-purpose, single-function quantum computers won’t constitute an ENIAC moment.
- Need significantly longer coherence time — milliseconds rather than microseconds.
- Quantum error correction? May be needed eventually, but not for an ENIAC moment. A focus on reasonably stable qubits, longer coherence time, and reasonable environmental shielding is probably sufficient for an ENIAC moment.
- Need greater connectivity between more combinations of qubits — for entanglement.
- Need more and richer hardware features.
- Need more and richer firmware features. Including basic numeric math features and some uniquely quantum features such as quantum Fourier transform and phase estimation.
- Need better methods and tools for transforming classical-style algorithms to exploit the oddities of quantum computers and to exploit quantum parallelism.
- Need at least a healthy subset of the features of classical Turing machines, if not the full set of Turing machine features, merged and blended with the unique features of quantum computers. Maybe not a true, fully hybrid machine, but reasonably close.
- Need a few killer applications which really do show a distinctive quantum advantage over classical computing — and are not relatively trivial toy applications with very trivial amounts of data. And… they must be applications which the general public can relate to and appreciate as compelling. Unless the general public is stunningly captivated, there will be no ENIAC moment.
- A distinctive quantum advantage means that the application would not be practical on even a relatively large classical supercomputer or even a relatively large distributed network of classical computers. Or, at a minimum, the quantum program runs at least ten to a hundred if not a thousand times faster than a comparable algorithm on a reasonably fast classical computer.
Not all of those capabilities will be absolutely necessary to achieve the ENIAC moment for quantum computing, but certainly well more than a bare majority of them. More than a minimal deficit in any area would delay the ENIAC moment.
Note: There are likely quite a few additional issues to be resolved, but that’s beyond the scope of this paper.
There were some other, more specialized computers over the five years preceding ENIAC, including electromechanical relay-based computers from Bell Labs, IBM, and Harvard, the German Z3 developed by Konrad Zuse in the 1940 to 1945 period during World War II, and the Atanasoff-Berry Computer (ABC, 1942) and British Colossus code-breaking computers (1943 to 1945), which, like ENIAC, were based on vacuum tubes. But ENIAC was a more general-purpose digital computer and made quite an impressive splash.
ENIAC very limited, but an essential critical mass
ENIAC actually had rather limited capabilities, but helped to lay the groundwork for a number of more powerful and even more general-purpose computers in the subsequent five years, including the MIT Whirlwind, EDVAC, and EDSAC in 1949. The rest is history, as they say — see more of the history in my paper Criteria for Judging Progress of the Development of Quantum Computing. But ENIAC was the key milestone which clearly signaled that electronic digital computing had finally arrived and was no longer merely a vague promise of some distant future.
General purpose even if an initial target application
ENIAC was designed initially for calculation of artillery firing tables, and later used to help design nuclear weapons. It did indeed have a specialized purpose at the very beginning, but its actual capabilities were far more general than that singular initial purpose.
General purpose is key
The goal of an ENIAC moment is not to replicate the specific computational capabilities of ENIAC per se, but to replicate its general-purpose nature as being applicable to a relatively wide and diverse range of applications — and to have sufficient computational resources to be able to handle a nontrivial amount of data.
ENIAC could perform hundreds of multiplications or thousands of additions and subtractions per second for up to twenty 10-digit numbers. That’s nothing by today’s standards but could easily replace an entire room full of human “computers” back in those days.
Those twenty 10-digit numbers would require 20 x 35 = 700 qubits in a quantum computer just to represent the raw numbers, let alone intermediate calculations, plus the fact that the arithmetic operations were hardwired in ENIAC while current general-purpose quantum computers lack even such basic arithmetic operations, requiring arithmetic to be simulated by discrete operations on individual qubits.
The 700 qubit number is just for a basic comparison to ENIAC. An actual ENIAC moment for quantum computing may require substantially more qubits (1,000 or more), or maybe not even that many (200 to 500?), depending on the specific application.
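As a sanity check on the arithmetic above, here is a minimal Python sketch (my own back-of-envelope, not tied to any particular machine's spec sheet) reproducing the 35-bit and 700-qubit figures:

```python
# Bits to represent a signed 10-digit decimal number:
# 34 bits for the magnitude (10^10 - 1 fits in 34 bits) plus a sign bit.
magnitude_bits = (10**10 - 1).bit_length()  # 34
bits_per_number = magnitude_bits + 1        # 35, matching the figure above

# ENIAC held twenty such numbers at once.
qubits_for_raw_data = 20 * bits_per_number  # 700

print(bits_per_number, qubits_for_raw_data)  # 35 700
```

Note that this counts only the qubits to hold the raw data; intermediate results and any workspace for arithmetic would push the total higher still.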
ENIAC required manual programming with wires and plugboards, and function tables with selector switches, but the machines which immediately followed over the subsequent five years did indeed have true, stored program capabilities. Still, programming was possible in ENIAC.
Even current general-purpose quantum computers do have rudimentary programming capabilities, called quantum circuits, consisting of sequences of quantum logic gates, which can be composed or generated by sophisticated software running on classical computers, but these so-called gates are extremely primitive, each operating on only one or two or maybe three qubits at a time, performing only basic quantum mechanics physics operations, called unitary operators.
Simulating multiplication of two 10-digit numbers (decimal, or 35 binary bits each) would require hundreds of quantum logic gates, too tedious to code by hand, so they must be generated by software on a classical computer.
Worse, because of quantum decoherence, the quantum state of the qubits is likely to have significantly decayed before execution of that gate sequence could be completed. Thus, the urgency of significantly improved coherence time.
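To see why coherence time is so urgent, consider a rough decoherence-budget calculation. The gate time and coherence figures below are illustrative assumptions in the general ballpark of current hardware, not measurements from any specific machine:

```python
# Rough decoherence-budget sketch (illustrative figures, not a spec).
coherence_time = 100e-6   # assume ~100 microseconds of qubit coherence
gate_time = 200e-9        # assume ~200 ns per quantum logic gate

# How many sequential gates fit within the coherence window?
max_gates_within_coherence = coherence_time / gate_time

print(int(max_gates_within_coherence))  # 500
```

A single simulated multiplication built from hundreds of primitive gates is already pressing up against that budget, before any surrounding algorithm even runs.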
Need Turing machine capabilities
Even worse, quantum computers lack the sophisticated Turing machine computing capabilities of even the simplest classical computers.
No loops. No conditional branching. No functions or subroutines. No basic arithmetic. No rich data types. No high-level programming languages or their semantic richness.
ENIAC didn’t have all of those features either, but did have enough of them to be extremely useful, and the rest quickly followed within a few years.
Radical redesign of classical algorithms needed
The unfortunate and very ugly truth is that traditional or modern algorithms must be radically redesigned to cast them in terms of the basic physics operations of a quantum computer.
Granted, you do gain the power of superposition, entanglement, interference, and quantum parallelism, but at the very steep cost of losing the intellectual power of classical Turing machines.
Probabilistic vs. deterministic
Even worse, all of us have gotten spoiled by the reliable determinism of classical computers, while even the best quantum computers promise only to deliver probabilistic computational results.
For sure, there are plenty of applications where probabilistic and statistical results are both acceptable and even desirable, but that leaves many (most) deterministic applications out in the cold.
Special purpose quantum computers don’t cut it
Although there is a special-purpose quantum computer from D-Wave Systems which has 2048 qubits, it is functionally limited to quantum annealing using Ising and QUBO (Quadratic Unconstrained Binary Optimization) models for discrete optimization, and it doesn’t have even the very limited programming features of ENIAC, let alone the features of a universal gate set, common on virtually all other quantum computers.
This is indeed a powerful and useful machine for sure, but personally I consider it to be more of a quantum coprocessor. Granted, it is a very sophisticated and powerful — and very useful — coprocessor, but a coprocessor nonetheless rather than a full-fledged, general purpose computer.
Also, in my view, it has more in common with analog computers than a digital computer.
Okay, all existing (and promised near-term) quantum computers operate essentially as coprocessors, requiring any data storage, data preparation, and result post-processing to be handled exclusively by classical software running on a classical computer, over a network connection, but at least the features of a universal gate set provide some degree of logic which isn’t available on the special-purpose, single-function D-Wave systems.
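To make the annealing model concrete, here is a minimal sketch of the kind of problem a QUBO machine targets: minimize a quadratic function over binary variables. The coefficients below are a made-up toy instance, solved here by classical brute force, which is exactly why toy-sized instances cannot demonstrate any quantum advantage:

```python
from itertools import product

# Toy QUBO instance (made-up coefficients): minimize sum of Q[i,j]*x[i]*x[j]
# over x in {0,1}^3. This is the model D-Wave's annealers target; a real
# instance would have thousands of variables, far beyond brute force.
Q = {
    (0, 0): -1, (1, 1): -1, (2, 2): -1,  # linear terms on the diagonal
    (0, 1): 2, (1, 2): 2,                # quadratic coupling terms
}

def qubo_energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Exhaustively check all 2^3 assignments and keep the lowest-energy one.
best = min(product([0, 1], repeat=3), key=qubo_energy)
print(best, qubo_energy(best))  # (1, 0, 1) -2
```

The annealer's job is to find that minimizing assignment when the variable count makes exhaustive search hopeless; note that nothing in the model involves a universal gate set or general-purpose programming logic.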
Need more qubits
The largest announced general-purpose quantum computers here in early 2019 have 49 to 160 qubits. That’s a truly impressive advance for quantum computing over the past five years, but still falls far short of the 700 qubits needed to represent the twenty 10-digit numbers which ENIAC could handle with ease — back in 1946.
Quantum computers will have to grow by a factor of roughly 4 to 14 just to catch up to ENIAC. And as mentioned, that’s just the raw ability to represent the numbers, not the ability to perform even basic arithmetic or more complex programming logic.
Exploiting commercially-available components
To be fair, the electronic digital computers of the 1940’s had a head start which quantum computers currently don’t have — they were able to take advantage of the commercially-available vacuum tubes which had been developed over the preceding two decades for commercial radio and military applications.
Earlier relay-based digital computers similarly exploited existing commercially-available electromechanical relays, such as used in telephone switching systems.
Even transistors, invented in 1947, a year after ENIAC was operational and public, went through years of commercial development before they were commonly used in digital computers a full decade later.
Rolling your own qubits
Meanwhile, today, every quantum computer creator must develop their own basic qubits from scratch — no qubits are commercially available off the shelf.
So, the quantum hardware folks are still stuck in the quantum equivalent of the early 1900’s of electronics — commercial radios were able to use off-the-shelf commercially-available vacuum tubes in 1920.
To be fair, the quantum hardware guys are still pioneers fighting through the frontier, a very long way from the promised land of quantum computing.
Perfecting basic qubits
The really big task in front of the quantum hardware guys right now is inventing and perfecting basic qubits which have far greater coherence and can be produced and controlled in much more substantial volumes.
We need to go from 50 qubits which decohere in 100 microseconds to 5,000 which can last 10 milliseconds. That’s two orders of magnitude.
Okay, maybe that’s too much to ask for at the ENIAC stage of quantum computing, but a single, full order of magnitude — 500 qubits with a full millisecond of coherence is probably a good and doable target. Still, even that significantly reduced target is not within imminent reach.
Difficulty of designing quantum algorithms
Even then, with 500 to 1,000 robust qubits, the biggest deficits will be on the algorithmic front, with the twin deficits of the difficulty of redesigning algorithms to exploit quantum parallelism as well as the lack of the basic features of a Turing machine.
But that’s all still way down the road compared to having significant numbers of working qubits.
I explore this algorithmic deficit in my paper The Greatest Challenges for Quantum Computing Are Hardware and Algorithms.
Lack of basic math
The extreme difficulty of performing even basic numerical calculations on a quantum computer poses a significant obstacle to their use for other than the most extreme niche applications where pure numeric calculations are either unneeded or worth the extreme effort.
At some point, numeric calculation will be seen as important enough that special hardware or firmware analogous to that provided in ENIAC will be provided so that quantum algorithms can perform basic math operations as single steps in an algorithm rather than dozens, hundreds, or thousands of discrete quantum logic gates.
Granted, the quantum firmware would need to translate each of those single steps into many dozens, hundreds, thousands, or even millions of discrete qubit quantum logic gates to be executed — much as is done today in a classical computer based on transistors and digital logic gates under the hood, but at least then there is the opportunity that the hardware or firmware can be optimized for such operations, much as ENIAC was, and then a great burden will be lifted off the shoulders of the overwhelmed quantum program developer.
Lack of special math functions
As another example, the ability to calculate square roots was built into the hardware of ENIAC.
A comparable level of function on a quantum computer would be a quantum Fourier transform as a discrete operation which a programmer could invoke, even though under the hood the quantum firmware would need to translate that single, discrete operation into many dozens, hundreds, thousands, or even millions of hardware-level quantum logic gates to be executed on individual qubits.
Today, the quantum program developer must use classical code to generate the very complex sequence of quantum logic gates needed to implement a quantum Fourier transform. There may be code libraries on the classical computer to facilitate this expansion, but that’s still a rather inefficient approach compared to having optimized firmware which handles the transform as a single operation rather than having to transmit a very large quantum circuit between the classical and quantum computers.
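As an illustration of that expansion, here is a minimal, framework-agnostic sketch (real code would typically use a library such as Qiskit) that generates the textbook QFT gate sequence and counts the gates, showing how one logical "QFT" operation fans out quadratically in the number of qubits:

```python
from math import pi

def qft_gates(n):
    """Generate the textbook QFT gate sequence for n qubits as
    (gate_name, qubits, parameter) tuples -- a sketch of what classical
    software must produce before shipping the circuit to the hardware."""
    gates = []
    for target in range(n):
        gates.append(("H", (target,), None))
        for k, control in enumerate(range(target + 1, n), start=2):
            # Controlled phase rotation by pi / 2^(k-1)
            gates.append(("CPHASE", (control, target), pi / 2 ** (k - 1)))
    # Final qubit-order reversal via swaps
    for i in range(n // 2):
        gates.append(("SWAP", (i, n - 1 - i), None))
    return gates

# Gate count grows as O(n^2): n Hadamards + n(n-1)/2 controlled phases + swaps.
print(len(qft_gates(10)))  # 10 + 45 + 5 = 60
```

Even this modest 10-qubit transform is 60 gates; at hundreds of qubits, transmitting such circuits to the quantum computer one primitive gate at a time is exactly the inefficiency that optimized firmware would eliminate.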
The point is that current and near-term quantum computers are nowhere near as sophisticated as ENIAC was in 1946 for even basic math.
Universal quantum computers
Whether we actually have to achieve a true, full, universal quantum computer, combining the features of both a pure quantum computer and the Turing machine and other advanced capabilities of a classical computer, to claim the status of a quantum ENIAC is debatable — after all, ENIAC lacked quite a few of the features of future machines which we would consider essential today, or even in 1956, a mere ten years after ENIAC was unveiled.
A merger of at least a relatively significant subset of the combined capabilities would probably be warranted, in my view, but I won’t draw any bright lines at this stage.
Maybe a true universal quantum computer will have to wait for three to five to ten years after the ENIAC moment, but at least a palpable subset of such universal capabilities are likely to be required to make the ENIAC moment happen.
Critical mass of features needed
I wouldn’t insist that an ENIAC-class quantum computer must have all of these features and be the end-all of quantum computing, but I would insist that it have at least a minimal critical mass of such features which is at least indicative of the level of features to come shortly, in much the same way that ENIAC in 1946 was the precursor of Whirlwind, EDVAC, and EDSAC of 1949.
Applications ripe for quantum computing
There is still the unresolved issue of exactly what applications would make sense for an ENIAC moment for quantum computing.
People can relate at least a little to artillery trajectories and nuclear weapons, but mere optimization of delivery trucks, design of yet another drug, or simulating a relatively simple molecule like water just seem like tasks that a classical supercomputer could easily handle already, so such applications will continue to be unable to capture the imagination of the general public the way ENIAC did in 1946.
Or comparable to the successful prediction of the 1952 presidential election by the UNIVAC I computer, the first significant commercial electronic digital computer, created by the original designers of ENIAC itself.
Killer app for quantum computing
In short, the question remains what exactly is the killer app for quantum computing?
Maybe it will indeed be some particularly elaborate form of optimization or drug discovery, simulation of a nontrivial molecule.
Or maybe it will simply be a big surprise, some application which we can’t even dream about today with our limited conception of the true capabilities of the quantum computers which don’t even exist today.
Classical computers are a tough act to follow
Maybe the ultimate issue here is that classical computers are a really tough act to follow.
Replacing human “computers” and calculating at much higher speeds and using much greater volumes of data was a truly amazing feat in the 1940’s.
Today, even amazing supercomputers and massive data centers are simply taken for granted.
I mean, what computing power is there in common handheld smart phones, a million to a billion times more power than ENIAC?!!
Seriously, the designers of quantum computers and quantum algorithms will have to deliver something incredibly awesome to capture the attention and enthusiasm of this crowd.
Quantum parallelism is difficult to exploit
Quantum parallelism has great promise, but at present that promise is simply too far beyond both our reach and our grasp.
Progress has been made. Significant progress. But much difficult work remains.
And there is absolutely no question that full-blown support for quantum parallelism across a reasonably wide range of applications is absolutely essential — it’s like the only reason to even want a quantum computer.
And then the ENIAC moment arrives
Only then, when all (or most) of the aforementioned issues have been addressed, will we have the quantum equivalent of ENIAC.
The ENIAC moment for quantum computing will finally have arrived.
Shor’s factoring algorithm as an ENIAC moment
If somebody did manage to develop an 8K-qubit quantum computer and get Shor’s factoring algorithm running on it, cracking 2048-bit public key encryption would indeed be a true ENIAC moment, capturing everybody’s attention.
But, I have my doubts about Shor’s algorithm as currently envisioned, and 8K-qubit machines capable of executing the millions of quantum logic gates required by Shor’s algorithm are not on any realistic horizon at this stage anyway.
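For a sense of where the 8K-qubit figure comes from, here is a back-of-envelope sketch using commonly cited textbook scalings for the logical qubits needed by Shor's algorithm on an n-bit modulus. These are rough assumptions, and they ignore error-correction overhead, which would multiply the physical qubit counts enormously:

```python
# Rough logical-qubit estimates for Shor's algorithm on an n-bit modulus.
n = 2048  # RSA-2048 modulus size in bits

low_estimate = 2 * n + 3   # Beauregard-style compact construction
high_estimate = 4 * n      # a looser, more conventional construction (~8K)

print(low_estimate, high_estimate)  # 4099 8192
```

Either way, the required machine is an order of magnitude or two beyond anything announced today, before even counting the gate depth the algorithm demands.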
No, the IBM Q System One was not a candidate for The ENIAC Moment
IBM unveiled the IBM Q System One in January 2019, billing it as “the world’s first integrated universal approximate quantum computing system designed for scientific and commercial use.” That certainly sounds impressive. But… with only 20 qubits, and noisy qubits at that, it doesn’t even come remotely close to the level of capabilities one would properly expect for The ENIAC Moment of quantum computing — when, as I stated at the outset, a quantum computer is finally capable of solving a substantial, nontrivial, real-world computing problem with nontrivial amounts of data rather than being merely yet another promise and mere hint of a future to come, some day, but not real soon.
Sure, IBM did package those 20 qubits in a very sleek physical package, but… it was more smoke and mirrors than capable of delivering substantial business value for production-scale applications.
The IBM press release:
- IBM Unveils World’s First Integrated Quantum Computing System for Commercial Use
- IBM to Open Quantum Computation Center for Commercial Clients in Poughkeepsie, NY
- YORKTOWN HEIGHTS, N.Y., Jan. 8, 2019 /PRNewswire/ — At the 2019 Consumer Electronics Show (CES), IBM (NYSE: IBM) today unveiled IBM Q System One™, the world’s first integrated universal approximate quantum computing system designed for scientific and commercial use. IBM also announced plans to open its first IBM Q Quantum Computation Center for commercial clients in Poughkeepsie, New York in 2019.
ENIAC moment in five to seven years
I’m not holding my breath, but I remain hopeful that we might see a quantum-equivalent of ENIAC within the coming decade. Just not in the next few years.
Five years? Maybe, possibly. I’d be willing to bet on the five to seven year timeframe.
Maybe even four years. And if it does happen in three years it won’t be a big shock to me at all.
But two years? Very unlikely.
Might it take ten years, or more? Possibly, but I hope not, and I don’t think there is a good reason to be that pessimistic — again, I’m only referring to the ENIAC moment here, vintage 1946, not the advanced computers of the late 1950’s and 1960’s.
Best qubit technology?
There are a number of technologies for implementation of qubits currently used, under development, or contemplated, including superconducting quantum interference devices (SQUIDs), trapped ions, photonics and squeezed light, topological qubits, etc.
There is no great clarity as to which qubit technology will win and be used for the ENIAC moment of quantum computing.
Even then, there will be no great clarity as to whether that technology is really the best for the future, in much the same way as the basic technologies used by the original ENIAC were fairly quickly eclipsed within just a few years.
Just to be clear, the ideal qubit technology is not necessarily a requirement for the ENIAC moment.
“Any of the above” might well be the right answer for which qubit technology should or could be bet on for the ENIAC moment.
Granted, somebody may be working on exactly such a machine in a secret lab right at this moment, possibly under contract to NSA or DOE — or the Chinese government, but the best I can do is to stick to publicly-disclosed details and my own experience, expertise, knowledge, intuition, and judgment.
After all, an ENIAC moment is by definition a moment of public disclosure. ENIAC was indeed running in 1945, unbeknownst to the general public, until its public unveiling in 1946.
Quantum Advantage and Quantum Supremacy
Arrival at the ENIAC moment will imply that quantum computing has also achieved so-called quantum advantage. The terms quantum advantage and quantum supremacy are still a bit vague and unsettled, but for our purposes here, the mere fact that the quantum computer outperforms the best we can do for a comparable classical algorithm is the telltale sign of quantum advantage.
For some people, quantum advantage and quantum supremacy are essentially synonyms, and for many practical applications that may well be the case, but I would suggest that quantum supremacy also implies a quantum advantage across a fairly broad range of applications — the quantum computer would have to outperform classical computers on a lot more than the single application required for the ENIAC moment alone. See a more detailed treatment in the What Is Quantum Advantage and What Is Quantum Supremacy? paper.
The ENIAC moment will not mean that a quantum advantage or quantum supremacy has necessarily been achieved across a wide range of applications. It will simply be a milestone on the path to such a broader advantage.
And the ENIAC moment will certainly not mean that classical computers are now all obsolete. That will take a lot more development, particularly the advent of a true, full, universal quantum computer, combining the features of both a pure quantum computer and the Turing machine and other advanced capabilities of a classical computer, as well as a lot of evolution to shrink the form factor to match that of classical computers.
Quantum computing landscape continues to evolve
But stay tuned! … But don’t hold your breath.
I’ll continue to update this informal paper, or post a successor, as progress is made on the relevant issues.