Criteria for Judging Progress of the Development of Quantum Computing
When will quantum computing be ready for prime time? Research in quantum computing has been brewing for several decades now, with reasonably interesting and tantalizing results, but it’s still too much of a fringe specialty rather than a mainstream technology ready for the masses. This informal paper proposes some reasonably objective criteria for judging how much progress has been made and how close quantum computers are to something which most people would consider a mainstream technology.
For the most part, I’ll outline key aspects of the history of traditional digital computing in a reasonably quantitative manner so that quantitative achievements in quantum computing can be compared to historical developments in traditional digital computing.
Traditional digital computers, then called mainframes, were quite common for many large corporations and virtually all scientific facilities by the mid-1960’s. Cheaper minicomputers were appearing in the mid to late 1960’s and 1970’s. Microcomputers and personal computers were appearing in the late 1970’s and early 1980’s.
The digital computers of the 1940’s, 1950’s and even the early 1960’s were rather monstrous beasts, tameable only by the largest and most sophisticated of organizations and the most elite of technical staff.
The years from 1963 to 1967 were a true watershed, as mainframes became plentiful, and minicomputers as well.
Needless to say, quantum computing is nowhere near those 1963 to 1967 watershed years.
In fact, I think it’s safe to say that even the most advanced quantum computers are still only barely where traditional digital computing was in the 1940’s or maybe the early 1950’s at best.
We have a long way, a very long way, to go.
Granted, the performance of even relatively simple quantum computing circuits is light years beyond the performance of even the best supercomputers of 1965, but the capacity, sophistication, and certainly ease of use are still way back in the 1940’s. Or so it seems to me, at this stage.
That said, it seems almost a foregone conclusion that today’s limited quantum computing capabilities are most likely on the cusp of a dramatic breakout, so that quantum computers could very easily and likely zoom from being comparable to the beasts of the 1940’s and 1950’s to the manageable mainframes and sporty minicomputers of the mid-1960’s in just a few years, maybe five years, or ten years max.
Vocabulary for quantum computing
We need to have a common and rich vocabulary of shared meaning for discussing quantum computing. I’ve compiled an initial draft for a glossary of terms related to quantum computing: Quantum Computing Glossary — Introduction.
Learning from history
I hew to the dictum that those who fail to learn from history are condemned to repeat its mistakes.
But I also hew to the notion of being an iconoclast: breaking out to new heights can require, and frequently does require, a complete and clean break from the past, regardless of its lessons.
Those two notions conflict. Great judgment is needed to discern at every moment whether lessons from the past are relevant or whether breaking fresh ground is required.
Nothing in this paper presumes that one must learn any particular lessons from the past. It does presume that there are positive lessons from the past that can enable innovators to make quantum leaps forward that would otherwise have required much more intensive slogging.
This paper will endeavor to present lessons from the past, but with no bias towards whether any particular lessons might have any relevance at all to the future of quantum computing.
Categories of criteria
Some of these criteria will remain rather subjective and qualitative, but the goal is to make them as objective and quantitative as possible.
- Historic machines. General capabilities, capacity, and performance.
- Generations of processing technologies.
- Memory technology. Main memory, mass storage.
- Memory size.
- Memory organization.
- Number of qubits.
- Program size.
- Processing performance.
- Unit of work.
- Parallel processing.
- Granularity of modeling.
- Precision of calculations.
- Richness of data types.
- Operating temperature.
- Abstraction of function from device.
- Supercomputer status.
- Fault tolerance.
- I/O performance.
- Operating systems and front end computers.
- I/O device, server, coprocessor, or functional unit?
- Processing modes.
- Programming language sophistication.
- Availability of programming languages and development tools.
- Rich collection of design patterns and programming metaphors.
- Shared code libraries.
- Open source vs. proprietary technology.
- Transition from ultrasimple “toy” applications.
- Transition from experimental novelties to development of production applications.
- Wide availability of proven applications.
- IoT applications.
- Applications easily implemented.
- General purpose quantum computing.
- Number of research projects.
- Number of commercial vendors.
- Number of government organizations with quantum computing in active use, beyond research.
- Transparency for technical claims.
- Development of standards.
- Key technical breakthroughs required.
- Key non-technical hurdles.
- Formalized information theory.
Historic machines
Just about every one of the machines on this list was a significant milestone of sorts in the annals of computing. The point here for quantum computers is to identify how capable a given quantum computer is compared to any of these historic machines.
- 1890 — Hollerith punched card tabulating machines.
- 1920’s — punched card tabulating machines with plugboards — unit record equipment.
- 1930’s — punched card accounting machines.
- 1940’s — peak of tabulating and accounting machines.
- 1940 — Bell Labs Model I Relay Calculator (Complex Number Calculator). Electromechanical relays.
- 1941 — Zuse Z3 (electromechanical relays).
- 1942 — Atanasoff-Berry Computer (ABC) (first vacuum tube computer).
- 1943 — Colossus Mark 1 (UK) (vacuum tubes).
- 1943 — Bell Labs Model II Relay Interpolator (electromechanical relays).
- 1944 — Harvard Mark I — IBM Automatic Sequence Controlled Calculator (ASCC) (electromechanical relays).
- 1944 — Bell Labs Model III Ballistic Computer (electromechanical relays).
- 1944 — Colossus Mark 2 (UK) (vacuum tubes).
- 1946 — ENIAC (vacuum tubes).
- 1947 — Harvard Mark II (electromechanical relays).
- 1947 — transistor invented at Bell Labs. Not used in a computer until 1953.
- 1949 — Whirlwind (MIT) (vacuum tubes).
- 1949 — Manchester Mark 1 (vacuum tubes).
- 1949 — EDVAC (vacuum tubes).
- 1949 — EDSAC (vacuum tubes).
- 1950 — MESM — first computer in Soviet Union — Kiev, Ukraine (vacuum tubes).
- 1950 — SEAC (vacuum tubes).
- 1950 — SWAC (vacuum tubes).
- 1951 — Ferranti Mark 1 first commercially available general purpose electronic computer (vacuum tubes). Based on the Manchester Mark 1.
- 1951 — UNIVAC I first commercially available general purpose electronic computer in North America, second in world after Ferranti Mark 1 (vacuum tubes).
- 1952 — IAS (Princeton Institute for Advanced Study, von Neumann) (vacuum tubes).
- 1952 — ILLIAC I (vacuum tubes).
- 1952 — MANIAC I (vacuum tubes).
- 1953 — RAYDAC (vacuum tubes).
- 1953 — Manchester TC (first transistor computer).
- 1953 — IBM 701 (vacuum tubes).
- 1953 — JOHNNIAC (vacuum tubes).
- 1953 — Whirlwind core memory (vacuum tubes).
- 1954 — IBM 650 (vacuum tubes).
- 1954 — Bell Labs TRADIC (first U.S. transistor computer).
- 1955 — IBM 704 with core memory (vacuum tubes).
- 1955 — SAGE/AN/FSQ-7 (vacuum tubes).
- 1956 — MINAC/LGP-30 (vacuum tubes).
- 1955 — Harwell CADET (first fully transistorized computer in Europe).
- 1956 — TX-0 (MIT) (transistor).
- 1956 — IBM RAMAC 305 (vacuum tubes).
- 1958 — last year for introduction of any new computer designs based on vacuum tubes. Eleven years after invention of the transistor.
- 1958 — UNIVAC II (vacuum tubes).
- 1958 — IBM 709 (vacuum tubes, last for IBM).
- 1958 — Philco S-2000 TRANSAC used much faster surface barrier transistors.
- 1958 — RCA 501 (transistors).
- 1958 — SAGE goes online (vacuum tubes).
- 1959 — virtually all new computer designs based on transistors. Twelve years after invention of the transistor.
- 1959 — IBM 1401 (transistors).
- 1959 — IBM 1620 (transistors).
- 1959 — IBM 7090 (transistors).
- 1960 — DEC PDP-1 first minicomputer (transistors, logic modules).
- 1960 — CDC 160 (transistors).
- 1961 — IBM 7030 Stretch (transistors).
- 1962 — IBM 7094 (transistors).
- 1962 — Atlas early supercomputer (Univ. of Manchester).
- 1962 — CDC 1604 early commercial success for transistors.
- 1962 — DEC PDP-4.
- 1962 — ILLIAC II supercomputer.
- 1963 — IBM 7040/7044 scaled down 7090.
- 1963 — CDC 3600 48-bit scientific computing.
- 1963 — PDP-5 first successful minicomputer.
- 1964 — DEC PDP-6 precursor to PDP-10, low-cost mainframe-class power.
- 1965 — CDC 6600 one of first real, successful supercomputers.
- 1965 — DEC PDP-8 cheap, widely popular minicomputer.
- 1965 — IBM System/360 relatively cheap corporate machine.
- 1965 — IBM System/360 Model 75 high performance with advanced parallel and overlapped hardware for real-time computing — supported the Apollo space program.
- 1966 — IBM System/360 Model 91 high-speed data processing for scientific applications.
- 1966 — ILLIAC III specialized SIMD pattern recognition computer.
- 1967 — DEC PDP-10 cheap but powerful mainframe popular with research labs and universities, timesharing and ARPANET.
- 1967 — GE 645 hardware protected memory system designed specifically to support the Multics operating system, precursor to UNIX.
- 1969 — CDC 7600 ten times the performance of the CDC 6600.
- 1969 — Data General Nova (medium-scale integrated circuits).
- 1970 — DEC PDP-11 major line of cheap but powerful minicomputers.
- 1971 — IBM System/370 (integrated circuits).
- 1971 — CDC STAR-100 vector supercomputer, very limited success.
- 1972 — Goodyear STARAN associative memory parallel processor.
- 1972 — ILLIAC IV first massively parallel supercomputer (256 processors).
- 1976 — Cray-1 very successful vector processor supercomputer.
And that’s just the start.
This list can be extended once quantum computers have become as capable and widely available as the computers of the mid-1960’s. That said, such an update would be somewhat pointless: by then, quantum computing will by definition have been considered to have arrived, and no further comparison to traditional digital computing will be needed.
Maybe the only really significant advances beyond this list would be the advent of cheap microprocessors, personal computers, and cheap network servers.
For reference, it might be helpful to consider the stages that microprocessors and personal computers have advanced through:
- 4-bit microprocessors.
- 8-bit microprocessors.
- 8/16-bit microprocessors.
- 16-bit microprocessors.
- 16/32-bit microprocessors.
- 32-bit microprocessors.
- Hyperthreaded 32-bit microprocessors.
- Dual-core 32-bit microprocessors.
- Multi-core 32-bit microprocessors.
- Multi-chip motherboards.
- 32/64-bit microprocessors.
- 64-bit microprocessors.
- Servers based on microprocessors.
- Servers based on multiple microprocessors.
- Supercomputers based on large numbers of microprocessors.
Generations of processing technologies
- Tabulating cards and plugboards — unit record equipment
- Electromechanical relays
- Vacuum tubes
- Discrete transistors
- Small scale integrated circuits (SSI)
- Medium scale integrated circuits (MSI)
- Large scale integration (LSI)
- Full microprocessor on a single chip
- Very large scale integration (VLSI)
- Large microprocessor on a chip
Memory technology
Both main memory and mass storage.
- Punched cards, paper tape
- Vacuum tubes (registers)
- Regenerative capacitor memory
- Williams tube (CRT)
- Mercury delay lines
- Magnetic drum memory
- Magnetic core memory
- Tape and disk mass storage
- ROM read-only memory chips
- Semiconductor memory
- CD-ROM and DVD for write-once storage
- Flash memory for mass storage
Memory size
- 1942 — ABC — 60 50-bit numbers regenerative capacitor memory.
- 1949 — EDVAC — 1,024 44-bit words mercury delay line.
- 1949 — EDSAC — 1,024 18-bit words mercury delay line.
- 1951 — UNIVAC I — 1,000 10-digit words.
- 1951 — Ferranti Mark 1 — 512 20-bit words Williams tubes.
- 1953 — ENIAC — 100 word core memory.
- 1953 — IBM 701 — 2K/4K 36-bit words Williams tubes.
- 1955 — IBM 704 — 4K 36-bit words core memory.
- 1958 — UNIVAC II — 10,000 11-digit words/numbers.
- 1962 — Atlas (Univ. of Manchester) 16K 48-bit words core memory.
- 1959 — IBM 1401 — 16KB core memory.
- 1958 — IBM 709 32K 36-bit words core memory.
- 1960 — DEC PDP-1 64K 18-bit words core memory.
- 1955 — SAGE/AN/FSQ-7 68K 32-bit words core memory.
- 1965 — IBM System/360 Model 40 — 256KB core memory.
- 1961 — IBM 7030 Stretch — 256K 64-bit words core memory.
- 1965 — CDC 6600 — 256K 60-bit words core memory.
- 1965 — IBM System/360 Model 50 512KB core memory.
- 1969 — CDC 7600 — 576K 60-bit words.
- 1965 — IBM System/360 Model 65 — 1MB core memory.
- 1966 — IBM System/360 Model 67 — 2MB core memory.
- 1966 — IBM System/360 Model 91 — 4MB core memory.
- 1969 — IBM System/360 Model 195 — 4MB core memory.
Memory organization
How is memory organized in a quantum computer compared to a traditional digital computer?
- Is there a flat, linear address space?
- Are segments supported?
- Are pages supported?
- Is virtual memory supported?
- Is there a single address size or multiple sizes?
- Is relative addressing supported?
- Can memory be shared?
- Can memory be shared with a traditional digital computer so that data can be accessed or transferred directly between the two machines? Not all memory need be shared, but enough to cover performance critical needs.
- Does memory have parity error detection?
- Is ECC error correction supported?
- How fast is memory, compared to both traditional digital computers, and compared to the performance of quantum computing processing?
As with traditional digital computers, early machines are likely to have very simple memory models, with sophistication growing over time.
Number of qubits
How to compare memory requirements for a traditional digital computer with quantum computing qubits is not so clear at this stage.
In any case, more qubits means more processing power and more capacity.
- Single qubit.
- 2 qubits.
- 3 qubits.
- 4 qubits.
- 8 qubits.
- 10 qubits.
- 12 qubits.
- 15 qubits.
- 16 qubits.
- 20 qubits.
- 25 qubits.
- 30 qubits.
- 40 qubits.
- 48 qubits.
- 50 qubits.
- 60 qubits.
- 75 qubits.
- 100 qubits.
- 200 qubits.
- 250 qubits.
- 375 qubits.
- 500 qubits.
- 750 qubits.
- 1,000 qubits.
- 1,024 qubits.
- 1,500 qubits.
- 2,000 qubits.
- 2,048 qubits.
- 3,000 qubits.
- 4,000 qubits.
- 4,096 qubits.
How quantum computing will evolve after hitting the 4K milestone is anybody’s guess at this stage.
It would be interesting to assign dates to when those milestones might be expected to be achieved.
Some milestones purported to have been achieved so far:
- 49 qubits — Intel.
- 50 qubits — IBM.
- 72 qubits — Google.
- 2,048 qubits — D-Wave Systems.
It is not quite clear whether the D-Wave chip is a general purpose quantum computer or is specialized only for quantum annealing, though annealing does have a variety of applications.
To be sure, some of the early traditional digital computers were also specialized for particular applications or algorithms, but achieving status as general purpose was a key quality for newer machines.
This criterion should be split to distinguish the capacity of a single chip from that of a single computer with multiple chips, provided that those chips can work in concert on a single quantum algorithm, as opposed to distinct algorithms operating completely independently.
That would be a distinct criterion for advancement of quantum computing — coordinating computation between quantum processing chips.
Program size
How large can a program be on a quantum computer?
How complex can an algorithm be?
Early traditional digital computers had very limited program size.
Even 4,000 instructions was too much for early machines.
Some milestones for program size for quantum computers:
- 10 instructions or operations.
- 100 operations.
- 1,000 operations.
- 2K operations.
- 4K operations.
Of course, that raises the questions of what a single instruction or operation does and how a single higher-level language construct is compiled into lower-level quantum computing operations.
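As a concrete illustration of how a single higher-level construct can expand into many low-level operations, here is a minimal Python sketch. A program is treated as a list of gate names, and a compound gate such as a Toffoli is expanded using the standard textbook count of 15 one- and two-qubit gates. The gate table and program are illustrative assumptions, not any real quantum compiler.

```python
# Illustrative sketch: counting low-level operations after decomposing
# higher-level gates. The Toffoli expansion uses the standard textbook
# count (6 CNOTs plus 9 single-qubit gates = 15); this is not modeled
# on any real quantum compiler.

DECOMPOSED_SIZE = {
    "H": 1,        # primitive single-qubit gate
    "X": 1,        # primitive single-qubit gate
    "CNOT": 1,     # primitive two-qubit gate
    "TOFFOLI": 15  # textbook decomposition into 15 primitive gates
}

def compiled_size(program):
    # program is a list of gate names; sum the decomposed sizes
    return sum(DECOMPOSED_SIZE[g] for g in program)

source_program = ["H", "CNOT", "TOFFOLI", "TOFFOLI", "X"]
print(len(source_program), "source operations ->",
      compiled_size(source_program), "compiled operations")
```

So a program that looks small at the source level may cross the operation-count thresholds above once compiled.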
Processing performance
Some thresholds of processing performance for traditional digital computers:
- Operations per second. Unit record equipment and relay computers.
- Thousands of operations per second. Vacuum tubes.
- Millions of operations per second. Transistors.
- Billions of operations per second. Integrated circuits.
- Trillions of operations per second. Multiprocessors.
- Quadrillions of operations per second. PetaFLOPS. Large number of processors.
Types of operations:
- Basic instructions. MIPS — Millions of instructions per second.
- Integer arithmetic and boolean logic. Covered by MIPS.
- Floating point arithmetic. FLOPS — Floating-point operations per second.
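The thresholds above can be captured in a small Python sketch that maps a raw operations-per-second figure to the technology generation that first reached it. The tier boundaries come straight from the list; the function and label names are mine.

```python
# Map an operations-per-second rate to the technology generation that
# first achieved it, per the thresholds listed above.

TIERS = [
    (1e15, "quadrillions (petaFLOPS, large numbers of processors)"),
    (1e12, "trillions (multiprocessors)"),
    (1e9,  "billions (integrated circuits)"),
    (1e6,  "millions (transistors)"),
    (1e3,  "thousands (vacuum tubes)"),
    (1,    "ones (unit record equipment and relay computers)"),
]

def generation(ops_per_second: float) -> str:
    # Walk the tiers from fastest to slowest and return the first match.
    for threshold, label in TIERS:
        if ops_per_second >= threshold:
            return label
    return "below one operation per second"

print(generation(5e6))   # a mid-1960s transistor mainframe's ballpark
```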
Unit of work
It isn’t quite clear what the unit of work or processing is for a quantum computer, compared to a traditional digital computer.
As mentioned in the preceding section, a traditional digital computer works in terms of instructions, operations, integer arithmetic, boolean operations, and floating point operations.
The unit of discourse for both performance and capacity of a quantum computer is the qubit, such that n qubits can simultaneously represent 2^n superposed states. A 50-qubit machine would have roughly one quadrillion parallel states.
But how to map states into units of work like integers and floating point operations is unclear at this stage.
Is a quantum state fairly directly equivalent to one bit of traditional information, or is it equivalent to a traditional digital computing value such as a 32-bit or 64-bit integer (or 10 decimal digits, for that matter) or a single or double precision floating point number? I suspect the former, but it remains unclear.
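To make the exponential scaling concrete, here is a minimal Python sketch of how many superposed states n qubits can represent, and how much classical memory a brute-force simulation of that state vector would need. The 16 bytes per state (one double precision complex amplitude) is an assumption, a common simulator convention.

```python
# Number of superposed states for n qubits, and the classical memory
# a brute-force simulator would need to hold the full state vector
# (assuming 16 bytes per complex amplitude).

def states(n_qubits: int) -> int:
    # n qubits can simultaneously represent 2**n superposed states
    return 2 ** n_qubits

def simulation_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    # One complex amplitude stored per state
    return states(n_qubits) * bytes_per_amplitude

for n in (10, 20, 30, 50):
    print(n, "qubits:", states(n), "states,",
          simulation_bytes(n) / 2**30, "GiB to simulate classically")
```

At 50 qubits the simulation memory alone exceeds what any conventional machine holds, which is one reason 50 qubits is often cited as a rough classical-simulation limit.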
And how quantum states would relate to processing of text is very unclear. With billions of documents, each with many thousands of words, and with some characters requiring only a single 8-bit code while others require 16-bit codes, all with a fixed ordering, it is unclear how such text would be represented or processed on a quantum computer.
Progress will be when these and related questions get answered.
Parallel processing
A single computer or processor can only perform so many calculations or processing steps per unit of time. Parallelism permits more calculations or other forms of processing to be performed per unit of time.
Quantum computing changes this picture dramatically, but there is still only so much that a single processor can perform.
Various strategies or approaches to parallel processing have been employed by traditional digital computing over the decades:
- Dual processors.
- Multiple functional processing units within a single computer.
- Tightly coupled multiprocessor configurations.
- Loosely-coupled multiprocessor configurations.
- Single-instruction multiple data stream (SIMD) processing.
- Multiple-instruction multiple data stream (MIMD) processing.
- Local computing grids.
- Distributed computing. Separate computers connected by a network.
- Large supercomputers based on many discrete processors.
Quantum computers, by their very nature, promise much finer-grained parallel processing.
But parallelism is only as useful as the magnitude of the amount of data that can be processed. Even 100% parallelism for relatively small amounts of data won’t achieve a great advantage over parallel traditional digital computers.
The question is how quickly quantum computers will be able to handle larger amounts of data, in parallel.
Such as for large simulations, weather forecasting, finite element analysis, fluid dynamics.
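The limit that serial work places on parallel speedup is captured by Amdahl’s law: if a fraction p of a workload can be parallelized across n processors, the overall speedup is 1 / ((1 - p) + p / n). A quick Python sketch:

```python
# Amdahl's law: speedup from parallelizing fraction p of a workload
# across n processors. The serial fraction (1 - p) caps the speedup
# no matter how much parallel hardware (or qubit parallelism) exists.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even with million-way parallelism, 5% serial work caps speedup near 20x.
print(amdahl_speedup(0.95, 1_000_000))
```

The same logic applies to a quantum computer: the classical setup, I/O, and post-processing around a quantum algorithm are the serial fraction.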
Some thresholds for parallel processing:
- Dozens of data elements in parallel.
- 1,000 data elements in parallel.
- Low thousands.
- 10,000 data elements in parallel.
- Tens of thousands.
- 100,000 data elements in parallel.
- Hundreds of thousands.
- 1 million data elements in parallel.
- Low millions.
- 10 million data elements in parallel.
- Tens of millions.
- 100 million data elements in parallel.
- Hundreds of millions.
- 1 billion data elements in parallel.
- Low billions.
- 10 billion data elements in parallel.
- Tens of billions.
- 100 billion data elements in parallel.
- Hundreds of billions.
- 1 trillion data elements in parallel.
- Low trillions.
- 10 trillion data elements in parallel.
- 100 trillion data elements in parallel.
- 1 quadrillion data elements in parallel.
Ah, one caveat — by data element, I mean a 32-bit or 64-bit integer or a single or double precision floating point number.
Granularity of modeling
In traditional digital computing, performance for modeling is improved by accepting a weaker approximation of the correct result via coarser-grained modeling.
Finer-grained modeling would give more accurate results, but at a cost of degraded performance.
Quantum computers offer the potential for much finer-grained modeling and more accurate results due to their much greater performance and ability to perform more calculations in parallel.
The main issue is not the size of granularity per se, but the dramatic increase in the number of data elements as granularity shrinks. Capacity becomes the issue.
This is really the same problem as parallel processing, but it is the desire to dramatically shrink granularity that dramatically drives up the number of data elements which could or must be processed in parallel.
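To see how shrinking granularity explodes the data volume, consider a 3-D simulation grid: halving the cell size multiplies the number of cells by eight. A small Python sketch, where the cube-shaped domain is an assumption for illustration:

```python
# Number of cells in a cubic 3-D simulation domain as cell size shrinks.
# Halving the cell size in each dimension multiplies the cell count by 8.

def cell_count(domain_size: float, cell_size: float) -> int:
    cells_per_axis = int(domain_size / cell_size)
    return cells_per_axis ** 3

# A 100-unit domain at successively finer granularity
for cell in (1.0, 0.5, 0.25, 0.125):
    print("cell size", cell, "->", cell_count(100.0, cell), "cells")
```

Three halvings of the cell size already push one million cells past half a billion, which is why capacity, not granularity itself, becomes the issue.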
Some thresholds for parallel processing due to dramatic rise of data elements due to finer granularity of models:
- Dozens of data elements in parallel.
- Low thousands.
- Tens of thousands.
- Hundreds of thousands.
- Low millions.
- Tens of millions.
- Hundreds of millions.
- Low billions.
- Tens of billions.
- Hundreds of billions.
- Low trillions.
- A quadrillion or more.
Precision of calculations
Digits of precision for calculation is an ongoing trade-off for computing.
More precision is highly desirable, but higher precision comes at the cost of longer processing time, greater storage requirements, and greater energy consumption and greater heating issues and cooling requirements.
Some traditional thresholds for precision.
- Small integers. 8-bit, 16-bit
- Moderate integers. 24-bit, 32-bit.
- Larger integers. 48-bit, 60-bit, 64-bit.
- Very large integers. 128-bit, 256-bit.
- Decimal integers. Varying length, but slow processing.
- Fixed decimal numbers. 10-digits, 11-digits, other.
- Single precision floating-point numbers. 32-bit, 36-bit.
- Double precision floating-point numbers. 60-bit 64-bit, 72-bit, 80-bit.
- Decimal numbers. Varying length, varying precision, but slow processing.
What will quantum computers of the future offer and what will the performance, storage, and energy and heating characteristics be? Good questions. Awaiting the answers.
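The single versus double precision trade-off above is easy to demonstrate with the Python standard library’s struct module: round-tripping 0.1 through a 32-bit single precision float loses digits that the native 64-bit double retains. A minimal sketch:

```python
import struct

# Round-trip 0.1 through 32-bit single precision and compare with the
# native 64-bit double. Single precision keeps only about 7 significant
# decimal digits; double keeps about 15-16.

def to_single(x: float) -> float:
    # Pack as a 32-bit float, then unpack back to a Python double
    return struct.unpack("f", struct.pack("f", x))[0]

single = to_single(0.1)
double = 0.1
print(f"single: {single:.17f}")
print(f"double: {double:.17f}")
print("error introduced by single precision:", abs(single - double))
```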
Richness of data types
- Simple integers. 16-bit, 32-bit.
- Large integers. 64-bit, 128-bit.
- Simple floating point numbers.
- Large floating point numbers.
- Simple decimal numbers.
- Large decimal numbers.
- Complex numbers.
- Fuzzy math. Probabilities.
- Boolean values. True or false.
- Fuzzy Boolean values. Probabilities.
- Bit values. Any number of binary bits, each of which can be set, cleared, toggled, AND-ed, OR-ed, XOR-ed, or tested. Shifting, rotation, and masking as well.
- Characters. 8-bit, Unicode — 8-bit, 16-bit, 24-bit, 32-bit, UTF-8, UTF-16, UTF-32.
- Simple structures.
- Complex structures.
- Small arrays. Dozens, hundreds of elements.
- Large arrays. Thousands of elements.
- Very large arrays. Millions, even billions of elements.
Coherence
Coherence is the length of time that a quantum computer or qubit can maintain its state before it begins to decay, losing information or state to the operating environment (or having its state degraded by noise or energy from the operating environment.)
Currently, the coherence of quantum computers is rather short, with the recent 50-qubit IBM machine having an average coherence of 90 microseconds (a little less than 1/10,000th of a second.)
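Coherence times matter because they bound how many operations fit in a single run. Assuming, as a rough model, exponential decay with a coherence time constant and a fixed per-gate time, a Python sketch of how much coherence survives a circuit of a given depth (the gate time is an illustrative assumption; the 90 microsecond figure is the one cited above):

```python
import math

# Rough model: coherence decays exponentially with time constant t2_us.
# Estimate the surviving coherence after a circuit of a given depth.
# The 0.1 microsecond gate time is an illustrative assumption; the
# 90 microsecond coherence time is the IBM figure cited above.

def remaining_coherence(depth: int, gate_time_us: float, t2_us: float) -> float:
    elapsed = depth * gate_time_us
    return math.exp(-elapsed / t2_us)

# 1,000 gates at 0.1 microseconds each, against 90 microseconds coherence
print(remaining_coherence(1000, 0.1, 90.0))
```

Under this model, longer coherence milestones translate directly into deeper circuits, hence the list that follows.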
Some milestones to watch:
- 1 microsecond.
- 10 microseconds.
- 25 microseconds.
- 50 microseconds.
- 75 microseconds.
- 100 microseconds
- 150 microseconds.
- 200 microseconds.
- 250 microseconds.
- 500 microseconds.
- 750 microseconds.
- 1 millisecond (1,000 microseconds.)
- 2 milliseconds.
- 5 milliseconds.
- 10 milliseconds.
- 25 milliseconds.
- 50 milliseconds.
- 75 milliseconds.
- 100 milliseconds (1/10th second.)
- 250 milliseconds (1/4 second.)
- 500 milliseconds (1/2 second.)
- 750 milliseconds (3/4 second.)
- 1 second (1,000 milliseconds.)
- 2 seconds.
- 5 seconds.
- 10 seconds.
- 25 seconds.
- 1 minute (60 seconds.)
- 2 minutes.
- 5 minutes.
- 10 minutes.
- 25 minutes.
- 1 hour (60 minutes.)
- 2 hours.
- 4 hours.
- 8 hours.
- 12 hours.
- 1 day (24 hours.)
- 2 days.
Whether coherence can ultimately be maintained beyond a full day is a wide-open question.
Abstraction of function from device
A key concept enabling the rapid advance of traditional digital computing has been the abstraction of functions away from the underlying devices and technologies that implement those functions.
A user or developer can think of a traditional digital bit regardless of whether it is implemented as:
- A hole in a punched card.
- A hole in a paper tape.
- A relay.
- A charged spot on a Williams CRT tube.
- A sonic wave in a mercury delay line.
- A vacuum tube.
- A discrete transistor.
- A capacitor.
- An integrated circuit.
- An electron.
- A photon in an optical cable.
- An electromagnetic radio wave (still a photon) for wireless communication.
- A magnetized spot on a drum, disk, or tape.
- A hole burned by a laser or read by a laser on a DVD.
A logical bit is the same regardless of its physical manifestation, the physical bit.
Another abstraction in traditional digital computing is the number, integer or real (float), regardless of how it is represented in terms of bits. For many or most applications the distinction between a 32-bit integer and a 64-bit integer is irrelevant. Or 8-bit, 12-bit, 16-bit, 18-bit, 24-bit, 36-bit, 48-bit, or 60-bit integers, provided the range of values fits in that size. And most applications don’t care whether single or double precision is used for representing real numbers in floating point. Or whether a Boolean true or false value is represented as a single physical bit or as a small integer for more efficient memory access.
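Modern high-level languages carry this abstraction further. In Python, for example, the integer type has no fixed width at all, so identical code works whether a value fits in 32 bits or needs hundreds. A small demonstration of the abstraction of number from representation:

```python
# Python integers abstract away word size entirely: the same arithmetic
# works for values that fit in 32 bits and for values needing hundreds
# of bits, with no change to the code.

def factorial(n: int) -> int:
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

small = factorial(5)     # fits easily in 32 bits
large = factorial(100)   # needs over 500 bits
print(small)
print(large.bit_length(), "bits needed for 100!")
```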
Similarly, operations like add, subtract, multiply, divide, square root, and transcendental functions are abstracted in traditional digital computing so that the application developer has no sense of which specific machine operations are being performed.
Granted, for some truly high-performance applications it is still necessary to care very much that the most efficient machine operations are being performed. That was frequently true in the early days of traditional digital computing as well. But over time, the basic device functions have gotten fast enough that clean abstractions — and efficient compilers — are considered more important than forcing developers to think and express themselves in terms of very low level device operations.
The question is how quantum computing will evolve for these abstraction criteria. The goal is clean abstraction unless there is some compelling rationale for a more specific, less abstract representation or operation.
Operating temperature
Quantum computers of today require cooling to almost absolute zero (0 kelvin, minus 273.15 degrees Celsius.) There are some systems that claim to operate at room temperature, at least under some conditions, but most require a super-cold operating environment. At least today they do.
Achieving super-cold temperatures is a technical challenge, very inconvenient, and expensive. Higher operating temperatures mean less technical challenge, greater convenience, and cheaper cost.
- 10 mK (10 thousandths of a kelvin)
- 50 mK (50 thousandths of a kelvin)
- 100 mK
- 250 mK
- 500 mK
- 1 K
- 4 K (liquid helium boils at 4.2 K)
- 10 K
- 50 K
- 75 K (liquid nitrogen boils at 77K, -196 C)
- 100 K (liquid oxygen boils at 90K, -183 C)
- 200 K (dry ice sublimes at 195K, -78 C)
- 250 K (255 K is 0 F)
- 275 K (273 K is freezing of water)
- 300 K (294 K is room temperature, 70 F)
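The kelvin figures above convert to Celsius and Fahrenheit with simple arithmetic (C = K - 273.15, F = C * 9/5 + 32). A small Python helper, using the reference points from the list:

```python
# Convert kelvin to Celsius and Fahrenheit, matching the reference
# points listed above (e.g., liquid nitrogen at 77 K is -196 C).

def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def kelvin_to_fahrenheit(k: float) -> float:
    return kelvin_to_celsius(k) * 9.0 / 5.0 + 32.0

for k in (4.0, 77.0, 195.0, 273.15, 294.0):
    print(k, "K =", round(kelvin_to_celsius(k), 2), "C =",
          round(kelvin_to_fahrenheit(k), 2), "F")
```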
A room temperature quantum computer is of course the holy grail for an operating environment.
Although it is worth noting that very low operating temperatures are readily available in space.
Supercomputer status
The concept of a supercomputer only takes on meaning when computers are generally available. Then, a supercomputer is a computer which has substantially greater performance and capacity than the computers which are generally available.
In the 1950’s one could have been excused for considering all of the existing computers as being supercomputers. They were certainly physically very large in size.
Only by the early 1960’s when quite a few computers were available did particular machines such as the IBM 7030 Stretch, Manchester Atlas, and finally the CDC 6600 take on the mantle of true supercomputers.
Similarly, one could assert that the very quantum, parallel nature of quantum computing renders every quantum computer a supercomputer almost by definition.
Ignoring, for the moment, that even the most powerful quantum computers can only handle a tiny fraction of the data that common personal computers can handle. That will change quickly enough in the coming years.
In fact, handheld computers common today do indeed possess more computing power than supercomputers of the 1960’s.
Five years from now (okay, maybe ten), the average quantum computer will indeed eclipse the traditional digital computers that we call supercomputers today.
Sometime further out into the future, maybe ten to fifteen years, when quantum computers are much more common, only some of them will be considered true supercomputers — even though all of them will easily eclipse all of the supercomputers of today.
I’ll suggest these thresholds for supercomputing for quantum computers:
- Can handle dozens of data elements in parallel.
- Can handle hundreds of data elements in parallel.
- Can handle thousands of data elements in parallel.
- Can handle 10,000 data elements in parallel.
- Can handle 50,000 data elements in parallel.
- Can handle 100,000 data elements in parallel.
- Can handle 500,000 data elements in parallel.
- Can handle one million data elements in parallel.
- Can handle 10,000,000 data elements in parallel.
- Can handle 100,000,000 data elements in parallel.
- Can handle one billion data elements in parallel.
- Can handle 10,000,000,000 data elements in parallel.
- Can handle 100,000,000,000 data elements in parallel.
- Can handle one trillion data elements in parallel.
- Can handle one quadrillion data elements in parallel.
Where a data element would be a 32 or 64-bit integer or a single or double precision floating point number.
At a more easily grasped level, consider these thresholds for a single quantum computer on the road to supercomputer status, relative to outperforming traditional digital computers:
- Outperform a single 256GB server.
- Outperform two 256GB servers.
- Outperform four 256GB servers.
- Outperform eight 256GB servers.
- Outperform 16 256GB servers.
- Outperform 32 256GB servers.
- Outperform 64 256GB servers.
- Outperform 128 256GB servers.
- Outperform 256 256GB servers.
- Outperform 1,024 256GB servers.
- Outperform 2,048 256GB servers.
- Outperform 4K 256GB servers.
- Outperform 8K 256GB servers.
- Outperform 16K 256GB servers.
- Outperform 32K 256GB servers.
- Outperform 64K 256GB servers.
- Outperform 128K 256GB servers.
- Outperform 256K 256GB servers.
- Outperform 1M 256GB servers.
- Outperform 10M 256GB servers.
- Outperform 100M 256GB servers.
Component and even subsystem failure is a fact of life in technology. Systems must be able to detect and recover from as many failures as possible.
Fragility is not an option for modern computing systems.
Okay, technically, even an entire system could fail, but the point there would be that the larger computing infrastructure would be able to detect and recover from even failure of an entire system (say, a network node or a distributed system.)
Approaches to fault tolerance in traditional digital computing have included:
- Automatic detection of component failures, with a hard stop so that the integrity of data values in a computation is never compromised.
- Individual components designed for much higher reliability.
- Automatic hardware or low level software detection of individual component failures, with automatic failover to redundant components.
- Same redundancy and automatic failover, but within the component, such as ECC (error-correcting code) memory.
- Multiply redundant subsystems with software that can parcel out work to available subsystems, and then switch a subset of a task to a redundant subsystem when a subsystem failure is detected.
- Fully replicated parallel execution. Multiple components, subsystems, or even whole systems are computing exactly the same results such that any of them can fail and success is achieved as long as one of them succeeds.
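To make the within-component redundancy idea concrete, here is a minimal sketch of the classic Hamming(7,4) code, the conceptual ancestor of the ECC memory mentioned above: four data bits are stored with three parity bits, and any single flipped bit can be located and corrected.

```python
def hamming_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_correct(c):
    """Locate and fix a single-bit error; return the corrected codeword."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 0 means no error; else it names the bad position
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the bad bit back
    return c

word = hamming_encode([1, 0, 1, 1])
word[4] ^= 1  # simulate a single memory bit flip
assert hamming_correct(word) == hamming_encode([1, 0, 1, 1])
```

Real ECC memory uses wider codes (typically SECDED, correcting one bit and detecting two per 64-bit word), but the parity-and-syndrome principle is the same.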
Other forms of fault tolerance include:
- Shielding from cosmic radiation.
- Shielding from background radiation.
- Shielding from electromagnetic radiation in general.
- Filtering of power source to remove line noise.
- Battery backup for power failures and brownouts.
Processing performance is the main objective for computing, but how fast you can feed data into the computer and how fast you can get results out — input/output or I/O — are important as well.
At least with traditional digital computing there is also the issue of I/O performance for intermediate results when main memory does not have sufficient capacity to accommodate all of the data to be processed. Some sort of mass storage, whether a disk, drum, tape, or flash memory may be required, all of which operate much more slowly than main memory.
How this picture changes or will evolve for quantum computing is very unclear at this stage.
As with the earliest days of traditional digital computing when high-value research and national defense needs rendered cost irrelevant, current quantum computing efforts are proceeding despite their very high cost.
Last year, a D-Wave 2000Q with 2,048 qubits cost about $15 million.
The initial, leading edge, deployments of quantum computing in large institutions may also be free of cost as an issue, but as the promise and capabilities of quantum computing become more practical and desirable for more mainstream institutions and even businesses, cost will rear its ugly head.
The question is how quickly quantum computing will move down the cost curve compared to the comparable trend for traditional digital computing. Some cost thresholds to watch for:
- Tens of millions of dollars.
- $20 million.
- $15 million.
- $10 million.
- Low millions of dollars.
- $1 million.
- Hundreds of thousands of dollars.
- Tens of thousands of dollars.
- Thousands of dollars.
- Hundreds of dollars.
- Tens of dollars.
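Purely as a toy model (the halving period is my assumption; nobody knows the real curve), one can sketch how long walking down that list would take if quantum hardware costs declined on a Moore's-law-like schedule:

```python
import math

def years_to_reach(start_cost, target_cost, halving_years=2.0):
    """Years for cost to fall from start_cost to target_cost,
    assuming cost halves every halving_years (a pure assumption)."""
    if target_cost >= start_cost:
        return 0.0
    return math.log2(start_cost / target_cost) * halving_years

# Starting from a $15 million machine and walking down the thresholds above:
for target in (10e6, 1e6, 100e3, 10e3, 1e3):
    print(f"${target:>12,.0f}: {years_to_reach(15e6, target):5.1f} years")
```

Even under this optimistic assumption, reaching the thousands-of-dollars threshold takes a couple of decades; a slower halving period pushes it out proportionally.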
Operating systems and front end computers
The so-called operating system is the software which controls the basic operation of the computer. Its most important function is to enable the execution of applications, the programs, the actual code that implements the algorithms and logic to be performed.
The goal is to permit the application code to be as simple as possible and to offload any and all work that is required to coordinate the various functions of the hardware of the computer, including and especially input and output, and mass storage, including file systems.
Operating systems have evolved from being very simple and even nonexistent in early computers to extremely sophisticated in present day machines.
How much of that complexity is needed for quantum computers is unclear.
A lot of the work performed by a traditional operating system can be offloaded from the quantum machine itself and implemented in an attached support computer, so that the quantum computer can focus closer to 100% of its resources on actual quantum computing rather than the bookkeeping and logistics of a traditional operating system.
By the 1960’s and 1970’s it became common to have smaller computers act as front ends to larger computers. For example, the ILLIAC IV massive parallel supercomputer used a PDP-10 computer connected to the ARPANET as its front end.
It remains to be seen how operating systems and front end computers will evolve for quantum computers.
For example, it would be nice to see standardized front end computers that look and act the same regardless of what brand or model of quantum computer is under the hood.
One open issue is whether quantum computers will have anything resembling the file system and mass storage that is common in most traditional digital computers, or whether those two concepts will be offloaded entirely to the front end computer.
I/O device, server, coprocessor, or functional unit?
Beyond the question of whether quantum computers will ever evolve into general purpose computers, there are four questions related to how they will fit into the computing landscape:
- Will quantum computers simply be connected as I/O devices to traditional digital computers? This is where we are today.
- Will quantum computers be specialized servers on a network?
- Will they be coprocessors attached to a traditional digital computer processor?
- Will they be functional units within a traditional digital computer?
The first option is the most likely near-term arrangement and the current reality. Specialized interface hardware knows how to talk to both the quantum computer and the traditional digital computer, translating between the two.
One could in theory directly use the traditional digital computer as if it were a glorified desktop computer. It wouldn’t necessarily need to be connected to a network.
Of course it is very tempting and obvious that the traditional digital computer could in turn be connected to a network, either a local area network (LAN) or to the Internet, either via a network cable or a wireless connection.
Such a network connection by itself wouldn’t necessarily mean that the computer is a server on the network available to service requests from external users. The connection might simply be used for file transfer and administrative tasks.
But once a network connection is in place, that’s all of the hardware needed to allow the computer to operate as a true server, able to process service requests from clients on the network. A little bit of software is required — available off the shelf, but it’s more of an architectural and management decision rather than a technical hurdle.
At some stage, quantum computers may evolve to the point where high-end users might have their own local quantum computer, attached as an I/O device, for highest performance and ease of development and ease of use. But for the foreseeable near-term future, quantum computers are too bulky and expensive for individual users to have their own dedicated quantum computer. But someday this may change.
A coprocessor is a device or chip which is separate from the main computer, but tightly connected to it so that data and commands can be transferred very rapidly, somewhat more rapidly than a traditional I/O device.
As an example, floating point arithmetic has been implemented as a coprocessor in the past, although it has always been a priority to fully integrate floating point arithmetic into the main processor as quickly as possible, as a true functional unit, to further streamline the exchange of data and commands.
Graphical processing units or GPUs are a form of coprocessor. They are quite popular today, both for high-performance graphics as well as offloading particularly complex computations, such as bitcoin mining.
Coprocessors and GPUs are generally separate chips from the main processor of a traditional digital computer, interacting with the main processor via a so-called bus, which acts as a highway permitting data and commands to be transported around within a computer fairly rapidly.
Traditional digital computers generally have two buses outside of the main processor, one for high speed memory access and one for I/O devices, which are generally much slower than main memory. A coprocessor generally runs at a speed comparable to main memory, at least in terms of interfacing to other system components, even though internally it will tend to run a lot faster.
A coprocessor could be a single chip, multiple chips, a separate module or daughterboard, or an external subsystem connected via cabling. Obviously a single chip is superior, but the coprocessor may be too complex or require a special operating environment, such as cryogenic cooling.
A functional unit is a circuit within the main processor itself. It tends to be fairly self-contained, which is why it is called a unit, but has very high-speed connections to other functional units within the processor, much faster than even the high-speed bus that a coprocessor or GPU might use.
A functional unit will deliver the highest performance for transfer of data and commands, but is also the most complex and technically challenging.
And since the functional unit is fully integrated within the main processor, it cannot be replaced or enhanced without replacing the entire main processor.
A functional unit for quantum computing would effectively allow hybrid computers, part quantum and part traditional digital computer.
Such a hybrid would be the final stage of evolution short of a true general purpose quantum computer, which would not need a separate functional unit for quantum computing, although a sophisticated machine would probably have more than one quantum functional unit.
Whether or how rapidly quantum computing evolves in the direction of cheaper I/O devices, coprocessors, and ultimately functional units remains to be seen and is merely speculation at this stage.
Processing modes for traditional digital computers have evolved greatly:
- Standalone, single user.
- Batch (commonly using punched cards), attended by an operator.
- Multitasking. A single computer capable of running multiple jobs simultaneously.
- Remote job entry (RJE). Users not required to be near the computer.
- Timesharing, with many users simultaneously connected via terminals.
- Wide area network (WAN) computers.
- Embedded computers. Microprocessors.
- Personal computing.
- Local area networking (LAN).
- Server computing.
- Internet. Services.
- On-demand utility or cloud computing.
- Internet of Things (IoT).
Quantum computing is presently at a hybrid of standalone single user mode and networking. True networking would support multiple simultaneous jobs or requests.
Programming language sophistication
Hardware is a key constraint, but ability to develop software to run on the hardware can be just as big a constraint.
Programming languages are a key constraint for developing applications.
- 1940’s — raw machine language (binary, toggle switches, holes in cards and paper tape)
- 1950’s — Assembly languages
- 1956 — Information Processing Language (IPL)
- 1957 — FORTRAN
- 1957 — COMTRAN
- 1958 — ALGOL 58
- 1958 — FLOW-MATIC
- 1958 — LISP
- 1959 — AIMACO
- 1960 — COBOL
- 1960 — ALGOL 60
- 1964 — PL/I
- 1964 — BASIC
- 1970 — Pascal
- 1972 — C
And more modern languages since then.
A key feature of programming languages is their ability to hide the details of the underlying machine language so that the developer can code in higher-level terms that make more sense to subject-matter experts, with the programming tools handling translation to the machine level features.
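The same layering exists in everyday languages today. As an illustration (using Python's standard `dis` module, nothing quantum-specific), a single readable line of high-level source expands into several machine-level instructions the developer never has to see:

```python
import dis

def area(r):
    # One line of "subject-matter" code...
    return 3.14159 * r * r

# ...which the toolchain translates into lower-level instructions:
instructions = [ins.opname for ins in dis.Bytecode(area)]
print(instructions)  # several load/multiply/return steps
```

A mature quantum programming language would offer the same bargain: the developer writes in terms of the problem, and the compiler handles the translation down to gate-level operations.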
Availability of programming languages and development tools
Primitive development tools hold back serious development of serious applications for quantum computers.
A comprehensive suite of sophisticated development tools will be needed to assure that quantum computing will be ready for prime time.
These would include, among many others:
- Programming languages. Both existing languages and new languages or extensions that are optimal for quantum computing.
- Testing tools.
- Performance monitoring and planning tools.
- Capacity monitoring and planning tools.
- Visualization tools for all phases of development and deployment.
Rich collection of design patterns and programming metaphors
Reinventing the wheel for each and every quantum computing application is a tedious and unnecessarily expensive proposition. A rich collection of design patterns and programming metaphors can greatly accelerate the process of developing applications. And reduce errors and bugs as well.
- Plain English descriptions.
- Clear specification of intentions.
- Clear specification of use cases.
- Detailed notes on applicability.
- Detailed notes on limitations or restrictions.
- Clear specification of inputs.
- Clear specification of outputs.
- Clear specification of processing.
- Fully documented sample code.
- Fully functional libraries when possible.
- Full documentation overall.
- Comprehensive suites of test cases.
Shared code libraries
Design patterns and programming metaphors are essential, but fully self-contained code libraries are equally valuable, and even more valuable in many situations.
Code libraries must meet all of the criteria given for design patterns and programming metaphors.
Open source code libraries are best, permitting developers to more deeply understand what’s going on under the hood, as well as enabling customization and enhancement.
But since many developers will opt to use the libraries exactly as they are distributed, it is urgent that the underlying code be of the highest possible quality.
Application frameworks are another approach to shared code.
Code repositories such as GitHub will facilitate the process of sharing code.
In any case, a wide variety of shared code libraries and application frameworks will be a decent metric of how much progress has been made for quantum computing.
The ultimate measure of progress will be the extent to which any given application development task can be performed by existing shared code from a code repository rather than having to develop fresh custom code.
Open source vs. proprietary technology
Early computing technology tended to be very proprietary, with each new computer being built from scratch according to the proprietary designs of its builder.
That said, there was a fair amount of sharing of technology, knowledge, and expertise, including publication of academic papers. Not every new computer was designed 100% from scratch.
Over time, every new computer tended to have more in common with other computers than ways in which they were different.
It does seem as if each new quantum computer is designed from scratch, although there are plenty of elements of existing technology in most new quantum computers.
In some cases proprietary technology is used to gain a commercial advantage, to make money and exclude competition, but very commonly the goal is simply to gain some technical advantage over existing technology, to simply build a better mousetrap.
Open source is a newer approach to technology than was used in the early days of technology.
It is most common for software, but the open source concept can work for hardware as well. Open source may be used at the component, module, and subsystem levels even if the overall system design is somewhat proprietary; or, conversely, components or modules may be proprietary while larger portions of the system are open source or off the shelf.
Just about every vendor and organization now uses open source technology in traditional digital computing to some degree even if proprietary technology is still common to some extent.
The question and challenge for quantum computing is the extent to which open source technology will become the norm, and exactly what role proprietary technology will play.
The dream will be the day when a complete quantum computing system can be based 100% on open source technology, hardware, software, and applications.
The question is how quickly useful applications will appear on quantum computers compared to the pace they appeared on traditional digital computers.
Some historical precedents:
- 1940’s — relay computers — atomic bomb calculations.
- 1942 — ABC — solving linear equations.
- 1945–1952 — ENIAC, ILLIAC 1 — Navier-Stokes fluid dynamics.
- 1946 — ENIAC — artillery firing tables, hydrogen bomb calculations.
- 1950 — ENIAC — weather forecast.
- 1950 — Whirlwind — proof of concept for aircraft early-warning radar.
- 1951 — Checkers.
- 1952 — Tic-tac-toe.
- 1952 — Predict results of a presidential election.
- 1955 — Theorem proof.
- 1957 — Chess.
- 1958 — Air defense (SAGE).
- 1961 — General Purpose Systems Simulator (GPSS).
- 1964 — Solve algebra word problems.
- 1964 — ELIZA natural language Q&A.
- 1964 — Sabre — airline reservation system.
It’s not that these particular applications have significance or would be relevant to quantum computing, but simply that quantum computers will need to prove their utility for applications of comparable complexity in terms of both code and data complexity.
Transition from ultrasimple “toy” applications
Presently, most quantum computers are mere research projects, so there is little in the way of real applications.
The question is how quickly or when we will start seeing real applications rather than mere “toy” or proof of concept applications.
In other words, when quantum computers are finally solving real problems and earning their keep, whether by earning money, reducing costs, or providing real-world services that cannot be more readily, efficiently, and cheaply provided by existing traditional digital computers.
Two specific criteria here for having completed the transition from ultra-simple “toy” applications:
- Use of quantum computers for applications requiring more performance and capacity than even the simplest of modern traditional digital computers.
- Use of quantum computers for applications previously requiring high-end traditional digital computers.
That’s not asking a lot at all, and not even close to proving the great promise of quantum computing, but at least it is the key step to advance out of the “nursery” phase of incubation of quantum computing.
Transition from experimental novelties to development of production applications
Once organizations have transitioned out of the ultrasimple “toy” application phase, they move on to the proof of concept and experimentation phase, where they endeavor to implement something comparable to a real-world application, but are still not prepared to commit to a full-scale application directly exposed to the real world. That’s still at the experimental stage since they don’t know for sure the actual outcome of their efforts.
At some point they have enough experimentation and experience in hand to actually design and develop credible real-world applications with the full intention to deploy them for real-world use.
This transition from experimental novelties to development of production applications is a key milestone for any new technology.
Wide availability of proven applications
Beyond simply advancing from ultrasimple “toy” applications and larger-scale experimental applications, the next big baby step for quantum computing is the accumulation of more than a very few real-world, proven applications.
Just a few examples to wait for:
- Weather modeling.
- Optimization and scheduling.
- Integrated circuit layout, routing, and validation.
- Protein folding.
- Computational chemistry.
- General purpose simulation.
- Artificial intelligence, machine intelligence, machine learning.
- Business intelligence.
Will quantum computing be relevant to the Internet of Things (IoT)?
One would think so.
There are three areas of potential:
- Handling a very large volume of devices. Lots of data. In parallel. High demands on I/O performance of the quantum computer.
- Handling a large stream of rapidly and continuously changing data, within a single IoT device. Assuming a quantum computer could be built that small.
- Spanning and connecting a significant group of IoT devices. A blend of lots of data streams and lots of devices. Maybe not that many devices, but with data that needs to be coordinated very closely between separate devices, all in real-time.
Applications easily implemented
I don’t have much information to offer here yet, but maybe that is simply a sign of how immature the field of quantum computing is.
But that will be a good test of maturity of the field, when a substantial list of categories and specific applications can be listed that have been easily implemented.
General purpose quantum computing
Quite a few of the early big successes for traditional digital computing in the 1940’s and 1950’s were fairly specialized computers, even tailored to specific applications or specific types of algorithms or operations. To be sure, these were significant advances and specialization may have been the only way to make such big leaps so quickly in those days.
But it was only when a critical mass of technical breakthroughs had occurred that permitted computers to be truly general purpose that they could be broadly and widely adopted.
Quantum computing may face this same hurdle. Or I should say that it is highly likely to face this hurdle. Or maybe we should simply call it an opportunity to accelerate progress rather than a problem per se.
In fact, attempting to directly make the big leap to truly general purpose quantum computers first may delay their adoption for more critical applications that don’t require full generality.
It may indeed be much better to more quickly leap to very high value specialized applications, and then, later, figure out how to build on and exploit those specialized technologies to produce more general purpose machines.
In any case, general purpose quantum computing is a key criterion for judging the progress of quantum computing. Some stages along the way:
- Very limited proof of concept machines.
- More general, but still limited proof of concept machines.
- Prematurely generalized machines. They work, sort of, but not so well.
- Very specialized machines for very narrow niches of very high value applications that do extremely well.
- Better but still prematurely generalized machines. They work better, but not as well as specialized machines.
- Broader but still very specialized machines for specific classes of applications. Real progress.
- Somewhat more general machines for broader classes of applications. Further progress.
- Early fully general purpose machines, but may not be as well suited as specialized machines for specific niche use cases.
- Mid-maturity general purpose machines that handle a fair fraction of applications that previously required specialized machines.
- Full maturity general purpose quantum computers which handle most applications quite well. Although some niche applications may still be better served by specialized machines.
- General purpose quantum machines that are finally good enough that the extra effort and expense of specialized machines is not worth it.
Is general purpose quantum computing unlikely?
Personally, I think it’s likely that eventually quantum computers will be general purpose, capable of any application that a traditional digital computer is capable of, and a lot more, of course.
But, curiously, esteemed research firm Gartner doesn’t think so. They have these two dismissive statements in their report entitled The CIO’s Guide to Quantum Computing (November 29, 2017):
- However, it’s important to note that quantum computers will never replace classic computers for general-purpose computing.
- Current applications and, according to Gartner, future applications for quantum computing will be narrow and focused. General-purpose quantum computing will never be realized.
Those are really strong words. The opposite of hype.
Personally, I’m more of a never say never kind of guy. And even if you’re thinking never, say unlikely instead.
This reminds me of an article in Business Week magazine back when the Intel 486 processor chip came out. It was initially focused on high-end servers. The writer asserted a little too boldly that the 486 would never be used in desktop computers. We know how that movie ended.
In any case, I believe that it is safe to say that general purpose quantum computing will not be happening in the near future, the next few years.
Maybe five to ten years.
Although even then I would have to hedge and distinguish two distinct outcomes:
- Quantum computers technically capable of supporting all applications of traditional digital computers.
- Quantum computers cheap and easy enough to use to be a viable alternative to most traditional digital computing tasks.
The former being more likely sooner, while the latter would more likely be even further out in the timeline. Maybe twenty years?
Number of research projects
There are indeed plenty of research projects underway for quantum computing.
Whether these are sufficient is hard to say. I’d say more would be better.
Number of commercial vendors
This includes vendors at all stages:
- Vendors with research projects.
- Vendors with experimental products.
- Vendors with evaluation and alpha and beta test-level products.
- Vendors with off-the-shelf products, ready to plug and play.
A semi-comprehensive list of companies at least involved in quantum computing can be synthesized from these lists:
Categories of vendors include (or will eventually include):
- Full-system hardware suppliers.
- Component suppliers. Chips, modules, subsystems, peripheral products.
- Programming and development tool suppliers.
- Application framework and library and software subsystem suppliers.
- Application suppliers.
- Large consulting firms.
- Systems integrators.
- Defense contractors.
Number of government organizations with quantum computing in active use, beyond research
A list of some of the government labs and agencies using quantum computers is available from D-Wave Systems:
But I suspect that many of those uses are for research rather than active use in government operations: “Let’s see what we can do with this new technology” rather than routine production of predictable results. A rough rule of thumb: count only applications without “experimental”, “testbed”, or “prototype” in their titles.
Transparency for technical claims
Commercial vendors in all fields are prone to making exaggerated claims. It is important to have full transparency so that all technical claims for products and services can be independently verified.
This includes aspects such as:
- Technical specifications.
- Application availability.
Education and training
Students need access to high quality courses in quantum computing:
- Basics of quantum mechanics.
- Advanced quantum mechanics.
- Quantum computing basics.
- Advanced quantum computing.
- Quantum algorithm basics.
- Advanced quantum algorithms.
- Performance of quantum algorithms.
- Software engineering for quantum computing.
- Quantum circuit design.
- Quantum computer engineering.
- Applied quantum computing.
- Business use cases for quantum computing.
Technical professionals and staff need access to high quality technical training for quantum computing:
- Quantum computing basics.
- Advanced quantum computing.
- Application development.
- Application use.
Development of standards
One measure of the maturity of a field or industry is the development of standards.
Some areas where standards could be developed for quantum computing:
- Standardized chips, components, modules, and subsystems. And connectors.
- Interface standards.
- Mechanical engineering standards.
- Cooling standards.
- Performance measurement, testing, and modeling standards.
- Standards for measuring key factors. Performance, capacity, features.
- Benchmarks. For comparing performance between machines.
- Validation suites. For confirming support for standardized features.
- Operations and instructions.
- Data formats.
- Data structures.
- Programming languages.
- Design patterns and programming metaphors.
- Shared code libraries.
It may be premature to see development of standards in any or many of these areas, but that could simply be a measure of how nascent and undeveloped the field remains.
Key technical breakthroughs required
Some of the key technical hurdles that are severely restraining the widespread adoption of quantum computing:
- Super-cold operating temperature. Room temperature may be too much to ask for yet, but requiring operating temperatures near absolute zero is a major impediment.
- Programming languages and tools.
- Design patterns and programming metaphors.
- Clear definitions of what problems, applications, and algorithms are best for quantum computing.
- Fast and accurate simulators of quantum computing on traditional digital computers. Allow developers to observe state changes that the actual quantum computer cannot allow to be observed due to the very nature of quantum computation.
I’m sure that there are plenty of additional hurdles, but those are enough to put a distinct damper on any significant forward progress.
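To illustrate the simulator point, here is a minimal, unoptimized statevector simulator of my own devising (a sketch, not a real tool). Because the quantum state lives in ordinary classical memory, a developer can print every amplitude between gates, which is precisely the intermediate visibility that physical quantum hardware can never provide:

```python
import math

def apply_gate(state, gate, target):
    """Apply a 2x2 single-qubit gate to the target qubit of a statevector."""
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        in_bit = (i >> target) & 1
        for out_bit in (0, 1):
            j = (i & ~(1 << target)) | (out_bit << target)
            new[j] += gate[out_bit][in_bit] * amp
    return new

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]  # Hadamard gate

state = [1, 0, 0, 0]             # two qubits in |00>
state = apply_gate(state, H, 0)  # equal superposition of |00> and |01>
print(state)                     # inspect intermediate amplitudes freely
state = apply_gate(state, H, 0)  # a second H undoes the first
print(state)                     # back to |00>
```

A real simulator adds two-qubit gates, measurement, noise models, and heavy optimization, and memory use grows as 2^n amplitudes, which is exactly why fast, accurate simulation of larger machines is itself a hard research problem.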
Key non-technical hurdles
Beyond technical hurdles there are plenty of other, non-technical hurdles, including:
- Cost. The raw, total cost of the machines.
- Cumbersome operating environments. As with the original big computers and even more modern mainframe computers, very specialized and expensive operating environments are required.
- Organizational knowledge and expertise. Managers and executives understand traditional digital computers (well… sort of), but understanding how quantum computing fits into the mission of an organization is another matter. Understanding the limitations and appropriate applications of quantum computing are key issues.
- Cost and difficulty of finding, attracting, hiring, managing, and retaining appropriate technical staff for quantum computing.
- Kick-starting, budgeting, and maintaining momentum for standards efforts. A world in which each new machine is special and unique is a very unmanageable situation.
- Auditing and audit trails. The magic of quantum computing is that magic happens, but the downside is that quantum mechanics says that you can’t observe and track progress as the magic is happening. This could present significant management and legal problems. Generally, you need to be able to demonstrate how a result was achieved, not simply ask for the result to be accepted on nothing more than faith that the algorithm must have been right. Management must be able to validate that a given quantum computation is what people say it is.
I’m sure that there are plenty of additional non-technical hurdles, but those are enough to put a distinct damper on any significant forward progress.
Generations of quantum computers
Traditional digital computers have gone through quite a few generations.
Quantum computers are still in their first generation, or since they are still mostly research projects, they are not technically in their first generation yet.
There are no clear criteria for what should define a generation of quantum computers, but the simple model could be a significant step function in any of the criteria proposed by this paper.
Although I’m not sure raw number of qubits should define a generation per se.
But many of the other criteria could be used to define generations.
At this stage, some of the more significant criteria that might represent generations might include:
- Size of application. Lines of code plus overall complexity.
- Amount of data that can be processed.
- Real applications rather than toy applications.
- Production applications rather than experimental and proof of concept projects.
- Size of application compared to size of traditional digital computer needed to support the application.
- Availability of sophisticated development tools.
- Availability of programming languages and compilers that can fully exploit the unique capabilities of quantum computers.
- Performance scaleup factor relative to the same application on a high-end traditional digital computer. 10x, 25x, 50x, 100x, 250x, 500x, 1,000x, etc.
- Reduction in cost. By factors of two to ten.
- Reduction in physical size. By factors of two to ten.
- Increase in operating temperature. Big leaps.
Those would not be specific generations, but any big step up in any of those criteria would qualify for at least a 0.1 bump up in the generation. A bunch of such bumps could qualify for a full 1.0 bump up in the generation.
Right now, I would say that we are dealing with 0.01 bumps, or an occasional 0.1 bump, with most announcements.
I’ll go out on a limb and suggest that we are currently around generation 0.25, likely stepping up by 0.25 each year until we make some strong quantum leaps.
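To make the bump model above concrete, here is a minimal sketch of how incremental bumps might be tallied into a generation number. The example announcements and their bump weights are entirely made up for illustration; nothing in this paper assigns official weights to specific milestones.

```python
# Illustrative sketch of the generation "bump" model described above.
# The announcements and bump weights below are hypothetical.

def current_generation(bumps):
    """Sum incremental bumps (e.g. 0.01, 0.1) into a generation number."""
    return round(sum(bumps), 2)

# Hypothetical announcements and the bump each might merit:
announcements = [
    ("modest qubit-count increase", 0.01),
    ("noticeably better error rates", 0.1),
    ("first small production pilot", 0.1),
    ("minor tooling improvement", 0.01),
    ("useful compiler release", 0.03),
]

gen = current_generation(bump for _, bump in announcements)
print(f"estimated generation: {gen}")  # 0.25 with these made-up numbers
```

Under this sketch, roughly ten 0.1-scale bumps, or a mix of many smaller ones, would add up to a full 1.0 generation step.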
Formalized information theory
Digital communications is based on information theory, as formalized initially by Claude Shannon. Communications and computation are somewhat distinct domains but clearly overlap. Quantum communication is an area of intensive current research interest.
The issue is how ad hoc and informal, or how formalized, information theory is as applied to the quantum computing domain, again presumably overlapping with quantum communication.
As more of a placeholder for further thought in this area, I’ll suggest these preliminary levels of progress in formalization of information theory for the domain of quantum computing and quantum communication:
- Rather ad hoc and informal.
- Rudiments of formalization.
- Minimal formalization.
- Moderate formalization.
- Extensive formalization.
- Very formalized.
- Fully formalized.
- Full formalization of information theory for both quantum computing and quantum communication, integrated with formalized information theory of traditional digital computing and digital communications.
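One small, concrete point of contact between the classical and quantum sides of information theory is the relationship between Shannon entropy (for a classical probability distribution) and von Neumann entropy (for a quantum state): the von Neumann entropy of a density matrix is simply the Shannon entropy of its eigenvalue spectrum. The sketch below illustrates this standard textbook relationship; it is an aside, not part of the levels of formalization proposed above.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H(p) = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def von_neumann_entropy(eigenvalues):
    """Von Neumann entropy S(rho) = -Tr(rho * log2(rho)), in bits.
    Given the eigenvalues of the density matrix rho, this reduces to
    the Shannon entropy of the eigenvalue spectrum."""
    return shannon_entropy(eigenvalues)

# A fair classical coin carries 1 bit of Shannon entropy:
print(shannon_entropy([0.5, 0.5]))      # 1.0

# The maximally mixed qubit (eigenvalues 1/2, 1/2) also carries 1 bit:
print(von_neumann_entropy([0.5, 0.5]))  # 1.0

# A pure quantum state (eigenvalues 1, 0) carries zero entropy:
print(von_neumann_entropy([1.0, 0.0]))  # 0.0
```

The shared formula hints at why a fully integrated formalization, the last level above, is plausible: the quantum theory contains the classical theory as the special case of diagonal (classical) states.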
Work in progress
This informal paper is a work in progress. It will be updated as new information or new thinking develops.