The Greatest Challenges for Quantum Computing Are Hardware and Algorithms

Jack Krupansky
Aug 20, 2018

First the good news: quantum simulators, hybrid mode of operation with the quantum computer as a coprocessor, and quantum-inspired algorithms will enable the quantum approach to computation to advance much more rapidly than pure quantum hardware and pure quantum algorithms are currently advancing. The bad news is that quantum computers are advancing at too slow a pace and strictly quantum algorithms are extremely daunting, especially in the face of very limited hardware.

This informal paper will focus mostly on algorithms rather than hardware, but without the needed hardware, strictly quantum algorithms are at risk of being left high and dry — if not for simulators and quantum-inspired algorithms running on classical computers which continue to advance at a brisk pace.

To put the hardware challenges in perspective, we need:

  1. Moderately more qubits. 64, 128, 192, 256, 512, 1024. As a start.
  2. Much larger numbers of qubits — tens of thousands, hundreds of thousands, even millions. A 1,000 by 1,000 lattice (grid) is one million qubits, but is still a rather modest amount of data by today’s standards.
  3. Much greater connectivity (entanglement) with far fewer, if any, restrictions.
  4. Much lower error rate.
  5. Much longer coherence.
  6. Much greater circuit depth.
  7. True fault tolerance — error correction, which requires significant redundancy for each qubit.
  8. Much lower cost for the full system.
  9. Non-cryogenic operating temperatures.

Granted, selective niche problems can achieve adequate solutions without a good fraction of those needed advances, but true, general-purpose, widely-usable, practical quantum computers will require most of those advances.

We are not even close to being on the cusp of replacing all or most classical computers with quantum computers, so we are looking at a medium-term future with hybrid solutions that mix and match quantum computing and classical computing as needed and as practical. In essence, quantum computers will be coprocessors for the foreseeable future.

If you went back to the 1940’s or 1950’s in a time machine and asked experts about the future of their computers, few would have predicted the degree of advances that have since been made in processor speed, memory size, data storage size, physical size, and ease of use. If you had suggested trying to fit a room-size mainframe in a shoebox, let alone a machine a hundred times more capable, they would have laughed at you, but that’s the scope of the challenge ahead of us for quantum computers.

I expect that progress on the hardware front will continue to plod along, at a pace not so different from what we experienced with classical computers from the 1940’s through the 1990’s. One would hope for at least a moderately faster pace, possibly advancing in 10 to 20 years what took 20 to 50 years back then, but we also face more daunting challenges that sometimes simply take elapsed time for creative thinking rather than yielding to the number of dollars thrown at the problem.

To be clear, yes, I personally do expect that we will see room temperature quantum computers the size of a shoebox, or a server blade, and even in a handheld device, but not in the near or not so near future.

All of that said, and said mostly to establish some context, the primary thesis of this paper is that the key limiting factor for the advance of quantum computing is not the hardware, as challenging as that is, but software, or more specifically, the algorithms which make software work.

It is tempting to presume that all we have to do is moderately rework all our classical algorithms and then, presto, we have working quantum algorithms. But, that’s not a viable approach. Quantum computation is radically different from classical computation. For the most part, we have to start over, from scratch.

The difference between quantum computation and classical computation is less like the difference between a propeller-driven plane and a jet aircraft and more like the distinction between air travel and space travel: there is very little knowledge that transfers in any clean manner. Rather than encountering significant familiarity and similarity, one is faced on all fronts with disorientation and new concepts that are unrelated to the old concepts.

Very few of us today have had the experience of programming on a bare machine with no supporting software and no wealth of literature to consult for algorithms and approaches to solving problems. Granted, we do have embedded computers and microcontrollers, such as the Arduino, where you are running on bare hardware with minimal software, but even then we are still able to exploit the deep knowledge of traditional digital computing, including data types, data structures, control structures, high-level programming languages, and a wealth of existing algorithms, methods, and techniques, none of which exist at present for quantum computers.

For classical computing, working at the level of assembly language or machine language, with discrete machine instructions, bits and bytes, and constant worry about strict limits on memory and how data is arranged, is a thing of the past. For quantum computing, these and other related factors remain front and center, and they are severely slowing, if not outright blocking, any rapid progress.

The remainder of this paper will go into greater detail, but as just one example, software developers for classical computers can rely on integer and floating point numbers, algebraic expressions with arithmetic operators and trigonometric functions, conditional branching, looping, and nested functions — but none of these constructs even exists in a quantum computer.

To put it most simply, a classical computer provides a rich computation model, closely matching human conceptions of mathematics and logic, while with a quantum computer you’re relying only on the raw physics of quantum mechanics, with none of the rich abstractions and semantics that classical computing provides.

Quantum computing revolves around the qubit and its quantum states, which can be superimposed and entangled with another qubit, while much of classical computing revolves around numbers, integer and decimal, and text. There is more than a little mismatch between these two worlds.

Quantum computing is not simply like a foreign language where you can line up corresponding words, concepts, and meanings between different languages; it is outright alien, with no familiar landmarks.

Intuitive? Even classical computing is not exactly intuitive, but quantum computing is simply off the charts when it comes to lining up with human intuition. Although, to be honest, classical computing has always been a bit too deterministic compared to the uncertainty, ambiguity, and confusion that surrounds most human endeavors. But the point remains that real people and software developers can more easily relate to the determinism of classical computers than to the uncertain probabilistic nature of quantum computing.

Classical computing revolves around the abstract concept of a Turing machine, whose semantics are not rooted in either physics or quantum mechanics. A Turing machine represents computable functions, in a traditional mathematical sense. At present, there is no such abstraction on any current or even proposed quantum computer. Again, the software developer is forced to think and operate on the level of the raw physics, with no simplifying abstractions to relate to normal mathematics and logic, like evaluating algebraic expressions, calling functions, or conditionally executing operations.

A critical issue for algorithms and hardware as well is scalability. It is easy to demonstrate a solution to a problem on a relatively small scale, but that leaves open the question of how that solution scales to a much larger amount of data and processing power. Quantum computing does have the prospect of scaling better than classical computing, but realizing that potential is another matter. Limited connectivity of qubits in current quantum computers has an impact on scalability. And that says nothing about whether or how any particular algorithm scales. We have a lot of tricks and techniques for scaling algorithms on classical computers, but very little of that is relevant to current quantum computers.

The point here is not that quantum computing is impossible for mere mortals to relate to, but simply that we have a great distance to travel before we arrive at a state of affairs where moderately skilled software developers can swiftly proceed through complex quantum computing tasks.

In short, we need a far richer algorithmic infrastructure — and hardware which supports it — before we can tackle more than just a few, high-value niche applications.

In the meantime, as noted at the outset, quantum simulators, hybrid mode of operation, and quantum-inspired algorithms will allow us to make much more rapid progress than if we relied solely upon the sluggish pace of quantum hardware development and raw, physics-based algorithm design.

I think it is still way too early for most organizations to be Quantum Ready as IBM asserts. Instead, I would simply urge organizations to be Quantum Aware and Quantum Curious. Every two to three years you can re-review the status of the field to assess whether to stay in that Aware/Curious (and wait) posture, or whether it actually is close enough to leap into Quantum Ready mode.

Topics

Topics to be covered in the remainder of this paper include:

  1. Glossary
  2. Quantum computers are probabilistic rather than strictly deterministic
  3. Probability engine
  4. A qubit is more like a flip flop than a bit
  5. Quantum information
  6. Quantum states
  7. Quantum logic gates are instructions (software), unlike classical logic gates which are hardware
  8. Quantum logic circuits are instruction sequences, not electronic circuits
  9. Bits, gates, and instructions
  10. Not all qubits can be entangled together
  11. Need for fully-connected qubits
  12. Ride the wave function, until it collapses
  13. Major debugging challenges for a quantum program
  14. Quantum simulators to the rescue
  15. High-performance quantum simulators
  16. Trick for debugging on a real quantum computer
  17. Standard mode of operation
  18. Preprocessing and postprocessing
  19. Preparation and measurement
  20. Hybrid mode of operation
  21. Hybrid mode of operation over a network
  22. Quantum decoherence
  23. Quantum computer as a coprocessor
  24. Hybrid mode of operation with the quantum computer as a coprocessor
  25. Life without logging and tracing of events
  26. Lack of symmetry in quantum computers
  27. Need for linear memory
  28. Hybrid memory
  29. Hybrid memory for initialization of quantum state for big data
  30. Life no longer well above the bit, byte, word, instruction, address, register, and memory level
  31. Life without a stack
  32. Life without a memory heap
  33. Databases, files, and I/O of data
  34. Data modeling for the quantum world
  35. Raw physics vs. intellectual power of Turing machine, mathematics, logic, and rich data types
  36. Life without data types
  37. Life without object-oriented programming
  38. Life without algebraic expressions
  39. Life without hex numbers
  40. Life without rich mathematics
  41. What do you mean I can’t copy a qubit?!!
  42. The tedium of assembly language and machine language
  43. Richness of classic computing vs. extreme austerity of quantum computing
  44. Back to square one for algorithm design
  45. Need for new building blocks for quantum algorithms
  46. What are the essential Lego blocks for quantum computing?
  47. Need for co-design
  48. Need for new metaphors for quantum algorithms
  49. Need for new design patterns for quantum algorithms and code
  50. What’s the algorithm for designing an algorithm?
  51. Evolution vs. iteration
  52. Need for a quantum Turing machine
  53. Intellectual leverage is everything in computing
  54. Need for learning quantum intuition
  55. Need for libraries of quantum code
  56. Frameworks for interacting with quantum computers
  57. Open source software
  58. Need for global catalogs for algorithms, design patterns, and code
  59. Need for taxonomies for algorithms, design patterns, and problems
  60. Algorithms and code are not the same thing
  61. Mixing code and algorithms is not a good idea
  62. Algorithm as an outline for code
  63. Need for a reference implementation
  64. Commenting code — and algorithms
  65. Derivative algorithms
  66. Algorithms may need to be optimized for the specific hardware
  67. Plan for portability
  68. Hybrid algorithms
  69. Publication of algorithms
  70. Rich high-level programming languages
  71. Need for a true quantum programming language
  72. Need for rich higher-level programming languages
  73. Need for a natural language vocabulary for specifying real-world problems to be solved
  74. Need for quantum computer science
  75. Rerun, a number of times
  76. Need to partition larger problems into chunks
  77. Need for solid documentation
  78. Need for standardized convention for preset initialization of quantum state of qubits
  79. Need for formalized Principles of Operation documentation
  80. Need for formalized Programming Model documentation
  81. Issues of scalability
  82. What does networking mean in the quantum world?
  83. Need for quantum-inspired algorithms and quantum-inspired computing
  84. Media processing
  85. Artificial intelligence
  86. Internet of Things
  87. How much knowledge of quantum mechanics and linear algebra do you need?
  88. Need for multipartite entanglement?
  89. What exactly is entanglement good for?
  90. Consider qutrits and qudits?
  91. Security
  92. Cryptography
  93. Cryptocurrency and blockchain
  94. Applications vs. code and algorithms
  95. Need for a robust software architecture for quantum computing
  96. Need for a robust algorithmic infrastructure
  97. Need for trained quantum designers
  98. Sluggish pace of quantum hardware development and raw, physics-based algorithm design
  99. Multiprocessor quantum computers?
  100. Challenges of fixed-function and special-purpose quantum computers
  101. Role of the lunatic fringe
  102. Not even bleeding edge technology let alone leading edge
  103. Not yet Quantum Ready
  104. Quantum Research Ready
  105. Quantum Aware and Quantum Curious
  106. Too much hype and folklore
  107. Famous quantum algorithms — which haven’t panned out, so far
  108. When is quantum computing projected to be able to do anything interesting?
  109. Should students focus on quantum computing?
  110. Awaiting further advances and breakthroughs
  111. Pace of progress for quantum simulators
  112. Need for a universal quantum computer
  113. ENIAC moment for quantum computing
  114. FORTRAN moment for quantum computing
  115. What’s next — for me

Glossary

Quantum computing introduces a lot of new jargon. I’ve endeavored to compile a reasonably comprehensive glossary of any and all terminology related to quantum computing. It’s long — over 2,700 entries and growing — so it’s broken into six parts plus an introduction.

Quantum computers are probabilistic rather than strictly deterministic

One of the great qualities of a classical computer, based on the concept of a Turing machine, is that it is strictly deterministic. You can mimic nondeterminism, such as generating random numbers, but that’s the exception rather than the norm.

Quantum computers on the other hand are inherently probabilistic rather than deterministic, just as with the quantum mechanics upon which quantum computing is based.

This distinction requires a significant, radical change in mindset for the design of algorithms and code for a quantum computer.

Rather than calculating the answer to a problem as a classical computation would do, a quantum computation generates the probabilities for any number of possible solutions.

Just as a classical Turing machine can mimic nondeterminism, a quantum program can mimic determinism, but doing so fails to exploit the power of the quantum computer.
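
To make this concrete, here is a minimal sketch in plain Python and NumPy (no quantum hardware or vendor SDK involved) of what it means for the answer to be a distribution over outcomes rather than a single value:

```python
# A minimal sketch, in plain Python/NumPy, of what "probabilistic rather than
# deterministic" means in practice. A qubit in an equal superposition has two
# probability amplitudes; their squared magnitudes give the probabilities of
# measuring 0 or 1.
import numpy as np

amplitudes = np.array([1, 1], dtype=complex) / np.sqrt(2)  # equal superposition
probabilities = np.abs(amplitudes) ** 2                    # [0.5, 0.5]

# Each "run" of the quantum program collapses to a single 0 or 1, so the
# answer is a distribution over outcomes, not a single value.
rng = np.random.default_rng()
runs = rng.choice([0, 1], size=1000, p=probabilities)
print("measured 0:", int(np.sum(runs == 0)), "times;",
      "measured 1:", int(np.sum(runs == 1)), "times")
```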

Probability engine

Rather than trying so hard to directly compare quantum computers and classical computers, it may make more sense to simply think of a quantum computer as a probability engine, since that’s what it really does, in much the same way that a GPU can be thought of as a graphics engine for a classical computer.

But even that analogy breaks down as we realize that a GPU can do much more than only graphics. So too can a quantum computer compute much more than simply probabilities. It’s simply that probabilities are an essential quality of the underlying foundation of quantum computing (quantum mechanics).

Many of the initial applications for quantum computing are heavily focused on exploiting this probabilistic nature of quantum computing. This is a good thing, although it may distract attention from exploiting the full potential of quantum computing for more general applications.

A qubit is more like a flip flop than a bit

People are always comparing and contrasting qubits and classical bits, but even setting aside the fact that a qubit supports superposition and entanglement (that is, even for a qubit which is not in superposition and not entangled), a qubit is still not comparable or contrastable with a classical bit. A much more apt comparison is between a qubit and a classical flip flop.

The main point is that a classical bit is simply abstract information, independent of any physical realization or storage.

In other words, a qubit is storage plus information while a bit is purely information.

The realization of a classical bit comes in the form of an electrical signal, a memory cell, a magnetic spot on a storage medium, or the state of a flip flop or even the state of a transistor or classical logic gate.

A transistor is the basic unit of classical digital logic, providing the ability to switch a circuit on or off.

Two or more transistors can be combined to form a classical logic gate, such as an AND, OR, NOT, NAND, NOR, or XOR gate. These gates do not store data, but they can manipulate data, performing logical operations on the electronic signals flowing through them.

Classical logic gates can then be combined to form flip flops, which are the smallest units of classical digital logic which can both store and manipulate data. This is essentially comparable to the purpose of a qubit.

A register is a collection of flip flops on a classical computer which stores and operates on a sequence of bits as a unit, commonly a 32-bit or 64-bit integer. There is no comparable unit of storage on a quantum computer, although some programming tools may provide features to define a sequence of consecutive qubits as if they were a logical register, to make it easier to initialize or retrieve those consecutive bit values as if they were a contiguous value rather than distinct qubits. But even then, a quantum computer has no notion that consecutive qubits represent a number.

A memory cell is a single storage location in a memory chip. It’s capable of holding a bit, but it’s not the bit itself. After all, a bit can be moved around between storage locations and even transmitted over a network, so it makes sense to distinguish a bit from a container which can hold a bit. A memory cell is the smallest unit for storing data electronically on a classical computer. But it cannot manipulate the data as a flip flop can.

In short, the realization of a classical bit remains distinct from the logical, abstract, information value of the bit.

A qubit, in contrast, is the realization of either a bit, a superposition of the two complementary bit values, or an entanglement of the states of two qubits.

Note that magical word — state — a qubit has state, and a flip flop or memory cell has state, but a bit does not have state in any physical sense. All a bit has is its logical value, separate from the transient realization of that bit as it is moved from medium to medium, while a qubit and its information value stays right where it always is.

In quantum mechanics and quantum computing we speak of a wave function which represents the complete state of a quantum system, either a single qubit or two qubits which are entangled.

A quantum wave function is a mathematical formulation of the states of a quantum system.

The individual states of a quantum system are known as basis states or basis vectors.

For qubits there are two basis states — |0> and |1>, which are indeed each comparable to the classical binary values of 0 and 1.

But basis states are not quite the same as binary values. They are the actual states of the underlying quantum system, the physical qubit. There are a variety of techniques for realizing qubits, quantum spin being one of the most popular these days.

Spin has physical values of up and down, which correspond to the voltage levels of transistors, classical logic gates, and classical flip flops, but the point is that those voltage levels or magnetic intensities of storage media correspond to bits rather than being the bits themselves. And spin differs since it can be superimposed and entangled.

In short, the most sensible comparison is between a qubit and a flip flop or a memory cell.

We can speak of a qubit as being able to hold and operate on a 0, a 1, a superposition of both a 0 and a 1, or an entanglement of such a value between two qubits.

Granted, a classical flip flop or memory cell doesn’t support superposition or entanglement, but other than that, they are the closest analogy between a concept from quantum computing and classical computing.

If we were starting anew, it would make more sense to refer to a quantum flip flop or a quantum memory cell or a quantum processing element, focusing on what the hardware device does rather than the information that passes through the hardware device.

But, we’re now stuck with this historical decision, so we have to live with it. Still, we at least have to acknowledge and accept what these twin concepts — qubit and classical bit — really mean, both in terms of definitions and operationally.

Quantum information

At some level, quantum information is similar to classical data and information, but at other levels it is very different.

You can still place a classical binary 0 or 1 in a qubit. At that level the quantum information is the same as the classical information.

But if you superimpose a 0 and a 1 in a qubit, now you have something very different.

Although, we do have the ability to simulate a quantum computer on a classical computer, so at that level the concept of information is the same between quantum computers and classical computers.

And we can always define a composite data value or object on a classical computer which can indeed represent the state of two separate values, 0 and 1, as well as indications of superposition and entanglement and the probability amplitudes, so that we can indeed represent the same information as a quantum computer, only less efficiently.
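
Here is a rough sketch of what such a composite object might look like for a single, unentangled qubit. The class and its methods are purely illustrative, not any standard library:

```python
# An illustrative sketch of such a composite object for a single, unentangled
# qubit: two complex probability amplitudes, one per basis state. The class
# and its methods are hypothetical, not any standard library.
import numpy as np

class ClassicalQubitState:
    def __init__(self, amp0: complex, amp1: complex):
        # Normalize so |amp0|^2 + |amp1|^2 == 1, as quantum mechanics requires.
        norm = np.sqrt(abs(amp0) ** 2 + abs(amp1) ** 2)
        self.amp0, self.amp1 = amp0 / norm, amp1 / norm

    def probabilities(self):
        # Squared magnitudes of the amplitudes are the measurement probabilities.
        return abs(self.amp0) ** 2, abs(self.amp1) ** 2

    def measure(self, rng=None):
        # Measurement collapses the state to a single classical bit.
        rng = rng or np.random.default_rng()
        p0, _ = self.probabilities()
        return 0 if rng.random() < p0 else 1

state = ClassicalQubitState(0.5, np.sqrt(0.75))  # a 25%/75% superposition
print(state.probabilities())                     # approximately (0.25, 0.75)
```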

Beyond individual bits and qubits, classical computers have a very rich vocabulary for information, with integer and decimal numbers, and text, none of which have a direct analog in the world of quantum computing.

The bottom line is that algorithms for quantum computers must be couched in terms of the minimalist forms of information on a quantum computer, which means that classical algorithms tend not to transfer very well to the world of quantum computers.

Quantum states

Much is made of the concept that a quantum computer with n qubits can represent and operate on 2 to the n distinct and superimposed quantum states, all at the same time. A 4-qubit machine has 16 states, an 8-qubit machine has 256 states, a 20-qubit machine has over one million states, and a machine with more than 32 qubits has more states than you can even count in a 32-bit integer on a classical computer.

This is indeed true, to a large extent, but doing anything with all of these simultaneous states can be quite tedious and even problematic.

And each of those quantum states is a single bit, 0 or 1, not some traditional, classical value such as a 32-bit integer, 64-bit floating-point number, a name from a database or text from a book or email message, a high-resolution photograph, song, or video — just one, single bit.

In short, conceptualizing, designing, and realizing quantum algorithms which utilize superimposed quantum states is a very real challenge. Not impossible, but not exactly easy either.
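
A quick illustration of the arithmetic, in plain Python, showing why simulating even modest qubit counts strains classical memory:

```python
# A quick illustration of the 2**n arithmetic: a full classical representation
# needs one complex amplitude per quantum state, so simulation memory doubles
# with every added qubit.
for n in (4, 8, 20, 32):
    num_states = 2 ** n
    mebibytes = num_states * 16 / 2 ** 20  # 16 bytes per complex amplitude
    print(f"{n} qubits -> {num_states:,} states, ~{mebibytes:,.1f} MiB to simulate")

# 4 qubits -> 16 states; 20 qubits -> 1,048,576 states (~16 MiB);
# 32 qubits -> over 4 billion states (~64 GiB just for the state vector).
```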

Quantum logic gates are instructions (software), unlike classical logic gates which are hardware

Another annoying, historical accident of quantum computing is that instead of focusing on quantum instructions or quantum operations, they decided to call them logic gates, even though a quantum logic gate is not a discrete hardware device like an AND, OR, NOT, XOR, or flip flop in the classical digital logic world of electrical engineers.

Some of the visual tools and diagramming methods for composing quantum programs make it seem as if you are wiring logic devices, when you are simply listing out quantum instructions or operations to be executed sequentially in the order in which they are specified. The “wires” are simply a shorthand for referring to the numbering of the qubits (0, 1, 2, …, n).

Quantum logic circuits are instruction sequences, not electronic circuits

When quantum people refer to circuits, once again, they are simply referring to a sequence of instructions or operations to be executed in the sequence written, rather than being “wired” in the same manner as a classical digital logic circuit.

Actually, the execution of a quantum logic circuit is not quite that simple since a significant fraction of the gates (instructions or operations) may be executed in parallel if and when they don’t depend on the results of other gates. Gate execution needs to be scheduled to reflect any dependencies which are implied by the references to qubits and the “wiring” in the individual gates.

The quantum programming language or firmware of the quantum computer will do all the required scheduling and sequencing, driven by the qubit references and “wiring” of the quantum logic circuit.
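
As a concrete illustration, here is a tiny circuit in IBM’s Qiskit, one of several quantum programming toolkits (the API shown is Qiskit’s as of this writing and may evolve). Note that it reads as a sequential list of instructions operating on numbered qubits, not as a wiring diagram:

```python
# A tiny quantum "circuit" in IBM's Qiskit (API as of this writing; it may
# evolve). Despite the circuit vocabulary, this is a sequential list of
# instructions applied to numbered qubits, not wiring of hardware devices.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)   # two qubits, two classical bits for the results
qc.h(0)                     # instruction 1: Hadamard on qubit 0 (superposition)
qc.cx(0, 1)                 # instruction 2: CNOT entangling qubits 0 and 1
qc.measure([0, 1], [0, 1])  # instruction 3: measure both qubits

print(qc.draw())            # the "wires" in the drawing are just qubit numbers
```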

Bits, gates, and instructions

Just to recap and summarize some of the potentially confusing distinctions drawn in the preceding sections:

  1. bit. Pure information, a 0 or a 1, not associated with hardware per se, although a representation of a bit can be stored, transmitted, and manipulated by hardware.
  2. classical bit. A bit on a classical electronic device or in a transmission medium.
  3. classical logic gate. Hardware device on an electronic circuit board or integrated circuit (chip) which can process and transmit bits, classical bits.
  4. flip flop. A combination of classical logic gates which can store, manipulate, and transmit a single bit, a classical bit.
  5. register. A parallel arrangement of flip flops on a classical computer which together constitute a single value, commonly a 32-bit or 64-bit integer. Some programming tools on quantum computers may simulate a register as a sequence of contiguous qubits, but only for initialization and measurement and not for full-fledged bit and arithmetic operations as on a classical computer. Otherwise, a qubit is simply a 1-bit register, granted, with the capabilities of superposition and entanglement.
  6. memory cell. An addressable location in a memory chip or storage medium which is capable of storing a single bit, a classical bit.
  7. quantum information. Information on a quantum computer. Unlike a classical bit which is either a 0 or a 1, quantum information can be a 0 or a 1, or a superposition of both a 0 and a 1, or an entanglement with the quantum information of another qubit.
  8. qubit. Nominally a quantum bit, which represents quantum information, but also and primarily a hardware device capable of storing that quantum information. It is the quantum equivalent of both a bit and a flip flop or memory cell which stores that information. But first and foremost, a qubit is a hardware device, independent of what quantum information may be placed in that device.
  9. information. Either a bit (classical bit) or quantum information, which can be stored in a flip flop or memory cell on a classical computer or in a qubit on a quantum computer.
  10. instruction. A single operation to be performed in a computer. Applies to both classical computers and quantum computers.
  11. quantum logic gate. An instruction on a quantum computer. In no way comparable to a classical logic gate.
  12. logic gate. Ambiguous term whose meaning depends on context. On a classical computer it refers to hardware — a classical logic gate, while on a quantum computer it refers to software — an instruction.

Not all qubits can be entangled together

Entanglement of qubits in a quantum computer requires a direct connection between the qubits, which is a significant piece of hardware. These connections are not just simple wires, but devices such as resonators or cavities which are activated using microwaves, at least in some cases, although different technologies for implementing qubits use different entanglement approaches.

Because of the expense and difficulty of these inter-qubit connections, it is most common that qubits can only connect with or be entangled with a limited number of other qubits, commonly only their nearest neighbors. There also tend to be restrictions such that only some qubits can be the source for entanglement and only some can be the target.

In some cases, there may be only a single qubit with which a given qubit can be entangled, and sometimes the entanglement can be created in only a single direction.

In other cases there may be two or even three choices, but those choices may be in only a single direction.

Even in the best of cases, there may be only four choices.

In any case, variability of what qubits can be entangled adds a whole new level of challenge to algorithm design for quantum computers. Qubit selection must be made with care.

Need for fully-connected qubits

As just noted, significant restrictions on the connectivity of qubits are a major obstacle to rapid adoption and design of advanced quantum algorithms.

Put simply, we need fully-connected qubits as soon as possible. Or at least a lot more degrees of connectivity.

Granted, some simple algorithms do not require full connectivity, and that may indeed be the norm in the very near future, but much greater connectivity will be needed for quantum computing to break out of the novelty and niche stage in which it is currently deeply entrenched.

Coincidentally, while writing this paper I saw in the tech news that NSF (the National Science Foundation) had just awarded a 5-year $15 million grant to a consortium of universities to work on a Practical Fully-Connected Quantum Computer. As the NSF press release notes, the top two goals are:

  • Develop a quantum computer with a sufficiently large number of quantum bits (qubits) to solve a challenging calculation.
  • Ensure that every qubit interacts with all other qubits in the system, critical for solving fundamental problems in physics.

Now, whether this project achieves these goals remains to be seen, and it might not come to fruition for another three to five years, or more, but at least they deeply understand the critical need.

The good news about this effort is that it will all be 100% open and transparent, nothing proprietary, hidden, or secret.

Ride the wave function, until it collapses

As noted in the introduction, the basis of quantum computing is the quantum wave function. It represents the complete state of the quantum computer, in terms of probabilities for each of the quantum states.

The wave function is what enables superposition and even entanglement.

Actually, there is a separate wave function for each qubit or pair of entangled qubits.

This is all great.

The only real downside is that you cannot examine the wave function of a quantum system (each qubit or pair of entangled qubits) without causing it to collapse.

Collapse of the wave function is not a complete loss, since it collapses into a specific value of 0 or 1 for each qubit which you attempt to observe or measure, but you do lose the probability values (called probability amplitudes) for each of the superimposed or entangled quantum states of the qubit.

Basically, you can observe or measure the state of each qubit once you are done computing with that qubit. Yes, the full quantum state of that qubit, its wave function, is then gone, but if you are really done with the computation, that’s not a problem.

Again, all of that is fine, just as long as you take care not to attempt to observe or measure the state of the computation before it has completed.

As we will see, this severe downside to observation of quantum state presents issues for debugging, logging, tracing, and capturing intermediate results, but hopefully this will be a small price to pay for the raw speed and flexibility of quantum computing.

Major debugging challenges for a quantum program

No serious programmer on a classical computer would even imagine giving up their debugger, but such a tool isn’t even possible on a quantum computer. Since examination of quantum state (the wave function) of a qubit is not even theoretically possible, quantum programs present these major challenges:

  1. You can’t examine the wave function of a quantum system or qubit — you can only measure or observe a single observable, which then causes the rest of the wave function to collapse.
  2. You can’t examine the probability amplitudes for the basis vectors of a qubit.
  3. You can’t single step or set breakpoints in a quantum program, examine quantum state, possibly even change the quantum state, and then continue execution.
  4. Quantum decoherence means that even if you could stop and examine state, you could not pause and think about it since the quantum state will tend to decay within a very tiny fraction of a second — less than one 10,000th of a second, 90 microseconds, on the 50-qubit IBM quantum computer.

You simply have to run the quantum program to completion and examine the results once the program is complete.

This essentially means that quantum programmers must be much, much, much more careful about writing correct, bug-free code since debugging simply isn’t an option.

But, as discussed in the next section, quantum simulators present at least a partial solution.

And a subsequent section mentions a trick to get around these limitations.

Quantum simulators to the rescue

Smaller quantum computers and algorithms of comparable complexity can be simulated fine on existing classical computers. Quantum simulators run a lot slower, for sure, but for smaller algorithms that’s not a problem at all.

The biggest benefit of running a quantum algorithm on a simulator is that it doesn’t require access to an expensive, real quantum computer.

Granted, there are now cloud-based quantum computing services, so you can queue up a program to be automatically executed when a quantum computer becomes available, but that can be tedious, cumbersome, and ultimately expensive if your algorithm is relatively small and you need to do a lot of experimentation with many variations.

Where’s the crossover point, where a simulator is no longer a viable and preferred choice? Hard to say.

It will depend in large part on how long your simulation runs take.

Quantum programs which execute within seconds or minutes on a quantum simulator are a slam dunk for not needing a real quantum computer.

If your simulated algorithm takes an hour or more, it depends on what the turnaround time and cost is for the cloud-based quantum computing service.

And if your algorithm takes several hours to run on a simulator but completes within seconds on a cloud-based service for a real quantum computer, then it’s clear what the right choice is — unless you have a limited budget.

There’s a second crossover point: when your algorithm requires so much time or so many iterations that a cloud-based service might start to become prohibitively expensive, you need to consider a dedicated quantum computer, but most users are not going to have such demanding requirements.

There is one other reason why a simulator might not be appropriate or at least have more limited value, and that comes from the probabilistic nature of quantum computation. Although good simulators can be configured with sophisticated noise models to enable them to more closely approximate the non-deterministic nature of a real quantum computer, these models are not perfect. If your final algorithm needs a real quantum computer for performance reasons, it won’t be terribly helpful if the simulator delivers significantly different results than the real quantum computer.

The hybrid mode of operation is another good reason why a simulator may make more sense. If the hybrid algorithm is rapidly bouncing between classical code and quantum code, that may eliminate much or all of the performance advantage of running directly on a real quantum computer.

Not all cloud-based quantum computing service providers will provide the same level of support for the hybrid mode of operation.

A huge advantage of quantum simulators is that they remove three of the major challenges of development of quantum programs:

  1. You can’t examine the wave function of a quantum system or qubit — you can only measure or observe a single observable, which then causes the rest of the wave function to collapse.
  2. You can’t examine the probability amplitudes for the basis vectors of a qubit.
  3. You can’t single step or set breakpoints in a quantum program, examine quantum state, and then continue execution.

You can indeed accomplish all three on a quantum simulator since the simulator is simulating the physics of a quantum computer and thus maintains its full state, or at least an approximation of its full state, including wave functions for qubits and probability amplitudes for individual basis states of qubits.

That’s in theory, but whether a given quantum simulator supports such debugging features will vary. Check the documentation.
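
For example, Qiskit’s statevector simulator exposes the full wave function, including probability amplitudes, which no real quantum computer can do. A minimal sketch (backend name and API as of this writing; check your simulator’s documentation):

```python
# One concrete example: inspecting the full state vector (wave function) on a
# simulator, which no real quantum computer permits. Uses Qiskit's Aer
# statevector backend; names and API are as of this writing.
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)  # Bell state: (|00> + |11>) / sqrt(2)

backend = Aer.get_backend('statevector_simulator')
state = execute(qc, backend).result().get_statevector()

# Probability amplitudes for |00>, |01>, |10>, |11> -- exactly the information
# that measurement on real hardware would destroy.
for index, amplitude in enumerate(state):
    print(f"|{index:02b}>: amplitude {amplitude}, probability {abs(amplitude)**2:.3f}")
```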

The ultimate downside of a quantum simulator is that it is not an exact replica of the activity of a quantum computer, but it can be close enough for many applications.

At the time of this writing, I am not aware of quantum simulators which provide features to address all of the quantum limitations, but at least in theory such features could be developed.

High-performance quantum simulators

Not all quantum simulators are created equal. Any of the following would still be considered a quantum simulator:

  1. A quantum simulator running on your low-end laptop computer.
  2. A quantum simulator running on your high-end laptop computer.
  3. A quantum simulator running on your high-end desktop computer.
  4. A quantum simulator running on a low-end cloud server.
  5. A quantum simulator running on a quantum cloud service provider’s server of unspecified performance.
  6. A quantum simulator running on a high-end, but still general-purpose cloud server.
  7. A quantum simulator running on a cloud server which has been optimized for quantum simulation.
  8. An unoptimized quantum simulator.
  9. A modestly optimized quantum simulator.
  10. A moderately optimized quantum simulator.
  11. A heavily optimized quantum simulator.
  12. Specialized classical hardware, such as GPUs, FPGAs, and even custom logic, all optimized around quantum simulation. Still not even close to competing with a real quantum computer in terms of raw physics, but still boosting simulation significantly.
  13. Simulation partitioned over multiple processors, even a large number of processors, even to the point of a single processor per qubit.

Different users will have different needs, so even a very low-end quantum simulator may be just fine for some applications.

Quantum simulators are commonly thought of as simply a convenience or a cost savings compared to running on a real quantum computer, but the ongoing limitations of real quantum computers open the prospect of quantum simulators which actually have a higher level of capability, even though performance may not be competitive with real quantum hardware.

For example, a high-performance simulator can offer:

  1. Much better performance than other quantum simulators.
  2. Greater qubit count than readily available quantum computers.
  3. Better qubit connectivity (entanglement) than real quantum computers.
  4. Hybrid memory for easier initialization and measurement of qubit state, especially for higher qubit counts.
  5. Specialized quantum logic gates for higher performance and simpler code.
  6. Greatly facilitated debugging of more complex quantum programs, including single-stepping and full access to all details of quantum state (wave function), including probability amplitudes.

Any combination of these capabilities can provide a dramatic benefit over a garden-variety quantum simulator running on the cheapest commodity hardware.

Trick for debugging on a real quantum computer

As previously noted, you can’t just stop a quantum program in the middle of execution on a real quantum computer, examine the quantum state, and then resume execution as if it had never stopped.

Because of the severe limitations caused by quantum decoherence, a quantum program must, by definition, complete execution before the quantum state begins to encounter errors. For the 50-qubit IBM quantum computer that window is 90 microseconds, less than one 10,000th of a second.

But there’s an upside to that short duration — you can afford to rerun the program many times in a very short amount of time.

The trick for simulating single stepping on a real quantum computer is to do a bunch of runs, each with only a subset of the full program: a prefix, starting with the first logic gate of preparation for the program, so that you can measure each qubit upon completion of that subset (program prefix). Each subsequent subset will have only one more logic gate than the previous subset. Then run each subset multiple times to get a handle on the statistical distribution for each qubit at the end of that subset, since a quantum computer is probabilistic rather than deterministic.

To be clear, you can’t continue execution after measuring the qubits — you have to start over with a fresh program execution for each subset (prefix of the full program), but that’s okay since execution is so fast.

For example, if a quantum program had 10 logic gates, you would have nine subsets in addition to the full program, the first subset with only the first logic gate, the second subset with only the first and second logic gates, the third subset with only the first three logic gates, and so on, up to the ninth subset which has all but the last logic gate of the original program.

You might run each of the ten programs (nine subsets plus the full program) 100 times to get a clear statistical account of quantum state after each gate of the execution.

In classical computing we call this brute force. It may not be pretty and elegant or the optimal or preferred approach, but it will work.
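
Here is a sketch of the trick in Qiskit (the three-gate program is hypothetical, and the API shown is Qiskit’s as of this writing):

```python
# A sketch of the prefix trick in Qiskit (API as of this writing). The
# three-gate program is hypothetical; any gate sequence works the same way.
from qiskit import QuantumCircuit, Aer, execute

gates = [("h", (0,)), ("cx", (0, 1)), ("h", (1,))]  # the "full program"
backend = Aer.get_backend('qasm_simulator')

for k in range(1, len(gates) + 1):
    qc = QuantumCircuit(2, 2)
    for name, qubits in gates[:k]:      # prefix: only the first k gates
        getattr(qc, name)(*qubits)
    qc.measure([0, 1], [0, 1])          # measuring ends this run for good
    counts = execute(qc, backend, shots=100).result().get_counts()
    print(f"after gate {k}: {counts}")  # statistical snapshot per prefix
```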

One downside or limitation is that you wouldn’t see all combinations of all intermediate states, just the snapshots at each interval of time. Again, this is due to the probabilistic rather than deterministic nature of quantum computing.

It would be interesting to compare the results of such a simulation of single-stepping with actual single-stepping on a quantum simulator. But even if everything is done absolutely correctly, there is no guarantee of identical results.

Standard mode of operation

Any realistic quantum program is going to need real data from somewhere, which will need to be used to initialize the state of the qubits of a quantum computer, a process called preparation.

A classical computer program will need to execute all required logic to fetch the real data from wherever it is normally stored in the real, non-quantum world.

It will then need to massage the data to organize it into initial qubit values.

It will then need to request the execution of the steps for preparation of the initial qubit values, followed by the quantum logic circuit which represents the quantum program which does the actual quantum computation.

The last stage of the quantum program, called measurement, is the sequence of instructions to read any number of qubits to observe their final values, the results, which will be a string of 0’s and 1’s — bits.

Once the classical computer program has retrieved those measured results from the quantum computer, they need to be post-processed. The string of 0’s and 1’s — classical bits — will need to be interpreted in some way.

Sequences of bits might represent values, like numbers, or as is commonly the case, the positions in the overall sequence may define the meaning of each bit.

In fact, it may be that only one bit is supposed to be 1, to indicate by its position what the actual answer is. Or some number of 1 bits indicate multiple possible results.

It may also be the case that the quantum algorithm merely selects a subset of the total data for subsequent classical processing.

In any case, post-processing is the stage or step of actually doing something with the results of the quantum computation.
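
A minimal sketch of such post-processing, with made-up counts in the bit-string-to-frequency form that quantum SDKs typically return:

```python
# A minimal sketch of post-processing, with made-up counts in the
# bit-string-to-frequency form that quantum SDKs typically return. Here the
# position of the single 1 bit encodes the answer.
counts = {"0010": 912, "0100": 61, "0001": 27}  # hypothetical measured results

best = max(counts, key=counts.get)  # most frequently measured bit string

# Reverse before indexing if, as in some SDKs, bit 0 is printed rightmost.
answer = best[::-1].index("1")
print(f"most frequent result: {best}; the 1 bit is at position {answer}")
```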

Preprocessing and post-processing

Quantum computers don’t process data in the same form as a classical computer. This means that before processing by the quantum computer, the application must fetch and transform the classical data into the form which the quantum computer can handle. This stage is called preprocessing.

The exact nature of the steps of preprocessing will vary from application to application.

Similarly, the results of a quantum computation are rarely in a form that an application can directly process. This means that before continuing processing on the classical computer, the application must transform the quantum results into a form which the classical computer can handle. This stage is called post-processing.

The exact nature of the steps of post-processing will vary from application to application.

Note that preprocessing occurs before quantum preparation, and that post-processing occurs after quantum measurement, and that both occur on the classical computer, in contrast to preparation and measurement which both occur on the quantum computer.

Preparation and measurement

The qubits of a quantum computer must be initialized to their initial state before the actual quantum computation can commence. This stage is called preparation or more formally, quantum preparation.

Once the quantum computation has finished, the results must be captured and sent back to the classical computer. This stage is called measurement or, more formally, quantum measurement. It is also called readout, which means the same thing.

Note that preparation occurs after preprocessing, and that measurement occurs before post-processing, and that both occur on the quantum computer, in contrast to preprocessing and post-processing which both occur on the classical computer.
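
Here are the quantum-side stages labeled in a trivial Qiskit sketch (the preparation and computation gates are arbitrary placeholders, not a real algorithm):

```python
# The stages labeled in a trivial Qiskit sketch (the preparation and
# computation gates are arbitrary placeholders, not a real algorithm).
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)

qc.x(0)                     # preparation: initialize qubit 0 to |1>

qc.h(0)                     # computation: the actual quantum algorithm
qc.cx(0, 1)

qc.measure([0, 1], [0, 1])  # measurement (readout): capture the results
```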

Hybrid mode of operation

Not all problems can be completely solved by a single, discrete quantum program.

Instead, the original problem may need to be broken into a sequence of smaller problems — partitioned into chunks, each of which can be processed by its own quantum circuit or quantum program, and then a classical computer program can integrate the individual results from each of those smaller problems.

That’s the simplest form of the hybrid mode of operation — a simple, linear, multi-step sequence.

The next level of complexity is conditional execution, which quantum computers do not support.

Instead, the classical program sends shorter quantum circuits to calculate intermediate results, which the classical program retrieves and then makes a decision about what alternative quantum circuits should be conditionally executed.

Iteration can be added as well in a similar manner, with the classical program examining the intermediate quantum results and then deciding whether and how to re-execute that same quantum circuit.

In fact, conditional and iterative execution of quantum circuits can be combined in any arrangement, such as nested iterations or conditional execution within an iteration.
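
Here is a sketch of the iterative flavor, assuming Qiskit and its bundled simulator (the convergence target and step size are arbitrary illustrative choices). A classical loop runs a short parameterized circuit, inspects the measured statistics, and decides whether to adjust a parameter and re-execute:

```python
# A sketch of hybrid iteration, assuming Qiskit and its bundled simulator
# (API as of this writing). The target probability and step size are
# arbitrary illustrative choices.
import math
from qiskit import QuantumCircuit, Aer, execute

backend = Aer.get_backend('qasm_simulator')
theta = 0.1  # classical parameter controlling the quantum circuit

for iteration in range(20):
    qc = QuantumCircuit(1, 1)
    qc.ry(theta, 0)             # short quantum circuit, parameterized classically
    qc.measure(0, 0)
    counts = execute(qc, backend, shots=500).result().get_counts()
    p1 = counts.get('1', 0) / 500        # observed probability of measuring 1
    if abs(p1 - 0.5) < 0.02:             # classical decision: close enough?
        break
    theta += 0.2 * (0.5 - p1) * math.pi  # classical decision: adjust and re-run

print(f"stopped at theta = {theta:.3f} with p1 = {p1:.3f}")
```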

The only downside is that the measurement process which is needed to obtain intermediate results triggers what is called collapse of the wave function, where any quantum state of superposition or entanglement of a qubit vanishes when that qubit is measured. It doesn’t completely vanish, but it collapses to one of the two basis states, |0> or |1>, based on the probability computed and maintained for each basis state as the qubit evolves with each successive quantum logic gate of the logic circuit. The qubit is then 100% in the measured state, the one into which it collapsed.

It will vary from program to program whether the loss of quantum state for superposition and entanglement is a big deal, a major annoyance, a minor annoyance, or not a problem at all.

It could be that the quantum state for that particular circuit is no longer needed.

In an iteration, it could be that each iteration needs to do a different form of preparation anyway.

In conditional execution, it will vary whether the previous quantum state is needed for the conditional logic circuit. It may not be needed at all, or if it is needed, the previous circuit could simply be re-executed, but without the state-collapsing measurement, followed by the conditional logic circuit, all as one contiguous circuit.

Even if there are a number of conditional branches in a sequence, each branch can simply begin with as much of the preceding circuits as needed to reconstruct any needed quantum state, if it is needed at all.

The intermediate results may literally be intermediate and thrown away after being examined by the classical program, or they may be incremental final results, to be accumulated and saved or processed when the program finishes all of its quantum processing. Or, that saving and processing may be done incrementally as well.

The hybrid mode of operation works for quantum simulators as well as real quantum computers.

Hybrid mode of operation over a network

The hybrid mode of operation is facilitated if the quantum circuits are being executed locally in a quantum simulator, but conceptually this model still works if the quantum computer is remote and must be accessed over a network.

Whether network latency is a serious problem for a given program depends on a lot of factors. In some or even many cases it may be a non-problem — a few network accesses a second for a few minutes might be bearable, but if many thousands of conditional branches and iterations are needed, it could be problematic.

Not all cloud-based quantum computing services will support this type of hybrid mode of operation.

Quantum decoherence

As appealing as the hybrid mode of operation may be, one must take care to be aware of the prospect of quantum decoherence: the decay of quantum state, due either to environmental noise or simply the nature of quantum mechanics and the particular technology used to implement the qubits.

What this means is that a program cannot expect the quantum state of the qubits of a quantum computer to remain correct for more than a relatively short period of time, on the order of less than one ten-thousandth of a second.

This won’t tend to be a problem when relatively short quantum logic circuits are being executed — which is the common case, at least today.

And if conditional and iterative execution in the classical program takes care to reset the quantum state anyway as part of each incremental circuit execution, it shouldn’t be a real problem.

Still, this is a critical aspect of algorithm design for quantum computing.

Quantum computer as a coprocessor

The hybrid mode of operation essentially treats the quantum computer as if it were a coprocessor, not unlike the way a GPU or graphics processing unit is used for graphics, or the 8087 floating-point coprocessor for older PCs, which provided functions such as square root, exponentials, logarithms, and trigonometric functions, as well as support for arithmetic on larger integers, in addition to basic and extended floating-point calculations.

Whether the quantum computing is done remotely on a real quantum computer, remotely on a high-performance quantum simulator, or locally on the user or developer’s personal computer, the concept is the same.

The only real difference from a GPU or floating-point coprocessor comes back to the algorithmic differences between quantum and classical computers: all of the operations performed by a floating-point coprocessor were integrated cleanly into existing algebraic expressions, with no visible algorithmic differences, whereas the use of quantum computing requires explicit algorithmic changes.

A GPU is closer to the coprocessor model for quantum computing in that a software developer can develop custom code to be executed by the GPU, in much the way a quantum logic circuit can be developed for a quantum computer, although I wouldn’t stretch that analogy too far since a GPU is still modeled on classical computing, with support for integers, floating point, algebraic expressions and classical control flow and data structures.

Hybrid mode of operation with the quantum computer as a coprocessor

To recap, the hybrid mode of operation, with the quantum computer as a coprocessor executing relatively short snippets of quantum code interspersed with classical logic and data processing, should be a very viable model for quantum computation in the relatively near future.

Life without logging and tracing of events

Log files and tracing are basic and essential tools for both debugging and monitoring of the operation of complex software. Alas, neither fits in with quantum computing, where any attempt to measure or otherwise capture the quantum state within a quantum program will cause a collapse of the wave function, disrupting execution and corrupting qubit values.

This is not an issue for small quantum programs, such as the toy-like demonstration programs which are common today, but it is a big deal for larger, more complex systems, such as the kind for which the performance and capacity of quantum computing is essential.

The only real workaround at this stage is the hybrid mode of operation, whereby the overall quantum program is broken down into smaller chunks or fragments, each executed separately, allowing intermediate quantum state to be measured and captured. The classical code can then log intermediate results before initiating the execution of the next chunk or fragment of the quantum program, taking care to reinitialize the data in the qubits and re-execute any quantum code needed to fully restore quantum state before continuing to the next step or stage of execution.

Lack of symmetry in quantum computers

One of the key factors which makes development of algorithms and code on a classical computer manageable is symmetry of resources, such as:

  1. A moderate number of registers — all alike.
  2. Large amount of memory — and all alike.
  3. Large amount of disk storage — and it’s all alike.
  4. A file system — and folders and files are all alike.
  5. Rows of a database table — they all behave the same.
  6. Elements of an array or list — they all behave the same.
  7. Lots of variables — and they all behave alike, or at least those which are of the same type.

Quantum computers? Not so much.

First, quantum computers don’t even have registers, memory, mass storage, or a file system.

Until you bring in entanglement, all qubits are the same, but once you introduce entanglement, which is essential for all but the most trivial of quantum algorithms, the symmetry evaporates.

Each model of quantum computer has its own peculiar rules for exactly which qubits can be entangled with which other qubits. There is usually a pattern of sorts, but the patterns will vary, so that if an algorithm depends on entanglement, it will have to be customized for the connectivity map of that particular model of computer.

Some qubits can be either a source or a target for entanglement. Some can be only a source. Some can only be a target. And with specific other qubits at that.

Commonly, entanglement is only supported with nearest neighbors, with the qubits arranged in a roughly rectangular grid or lattice.

The connectivity map will differ between 5, 8, 20, 49, 50, 64, 72, 128, and 2048-qubit quantum computers, and not in any clean, consistent, and predictable manner. Algorithms will have to be redesigned to exploit the particular connectivity patterns of each type of machine.

And if the particular type of quantum computer supports only nearest-neighbor entanglement, the algorithm will have to be designed around that limitation. Entanglement will only scale linearly (n) as the number of qubits grows, rather than exponentially (2^n) the way quantum states grow. That’s not a hard and fast rule, but a reasonable approximation at this stage in the evolution of quantum computing hardware.
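
As a concrete illustration of the kind of rewriting this forces on an algorithm, here is a small Python sketch that expands a CNOT between distant qubits on a nearest-neighbor line into a chain of SWAPs. The gate names and the routing strategy are illustrative, not tied to any vendor’s gate set.

```python
# Hypothetical routing sketch for a machine that only supports
# nearest-neighbor entanglement on a line of qubits; gate names are
# illustrative, not any particular vendor's gate set.
def route_cnot(control, target):
    """Expand CNOT(control, target) into nearest-neighbor operations."""
    ops = []
    pos = control
    step = 1 if target > control else -1
    # Swap the control toward the target until they are adjacent.
    while abs(target - pos) > 1:
        ops.append(f"SWAP q{pos}, q{pos + step}")
        pos += step
    ops.append(f"CNOT q{pos}, q{target}")
    # Undo the swaps to restore the original qubit placement.
    while pos != control:
        ops.append(f"SWAP q{pos}, q{pos - step}")
        pos -= step
    return ops

print(route_cnot(0, 3))  # a CNOT between q0 and q3 costs four extra SWAPs
```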

Need for linear memory

Classical computing is greatly facilitated by the availability of copious amounts of linear memory, as exemplified by the infinite tape of Turing machines.

More memory means more data can be processed.

A clean, symmetric, linear model for how memory is organized makes life a whole lot easier for the designers of algorithms and code.

In practice, under the hood, memory gets allocated in blocks, objects, or segments, but to the programmer a variable or object can be accessed symbolically without regard to how memory is organized on the bare machine.

Modern classical computers support virtual memory so that each program sees a vast, linear, symmetric address space for memory. That’s the critical requirement for supporting the more familiar concepts of variables, nested function calls, arrays, lists, tables, etc. which are critical to the rapid development of classical computer programs.

But, the state of the art for quantum computers is that they have no memory, and not even any registers other than the relatively few qubits — 5, 8, 20, 49, 72, or 128. Even the 2048 qubits of the largest quantum computers pale in comparison to the gigabytes of even relatively modest handheld computing devices.

But the real issue is not the limited number of qubits, as limiting as that is, but the fact that they are not organized in a clean, consistent, symmetric, linear manner which facilitates algorithm design and coding. Instead, the quantum programmer must think very hard and very carefully about how each and every qubit is used.

Sure, we can and indeed will manage to struggle with this scarcity and complexity for at least a few more years, but sooner or later, we need to see an organization of quantum resources comparable to linear memory. It doesn’t have to be gigabytes, but even a single measly megabit would be a huge leap forward and dramatically facilitate algorithm design, coding, and application development.

But for now, we have what we have, so algorithm design will remain extremely constrained for the foreseeable future.

Sure, the hybrid mode of operation will ameliorate some of the difficulty, but it will be more of a stopgap than a true solution.

Hybrid memory

It is highly advantageous to have some hardware method for sharing storage between a quantum computer and a classical computer.

Granted, it is not even theoretically possible for the quantum state (wave function) of a qubit to be directly read by a classical processor, but at least the data paths between measurement operations and the classical portion of a hybrid program should be optimized and as short as possible.

Especially as the qubit count for future quantum computers begins to “soar” — many dozens, hundreds, thousands, even millions — explicitly measuring qubits becomes problematic.

Rather, it would be beneficial to be able to measure all or a selective but large fraction of qubits in a bulk operation.

And to have that single operation essentially transfer all measurements to classical memory as rapidly as possible.

So, this is shared memory, but shared between a real quantum computer and a classical computer.

This same technique should of course be available when running with a quantum simulator, in which case the implementation should be much more direct.

Hybrid memory for initialization of quantum state for big data

That same method for access to hybrid memory should also be made available for initializing the quantum state of all or a selective fraction of the qubits of a real (or simulated) quantum computer.

This becomes much more important once the qubit count begins to “explode” — into the hundreds, thousands, and even millions of qubits — and it becomes desirable to tackle big data problems.

Once again, the prospect of explicitly coding and sending preparation instructions one by one seems particularly undesirable.
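
On a simulator, at least, something like this is already possible. Here is a hedged sketch using Cirq’s initial_state parameter to load a classical vector as qubit amplitudes in one bulk operation; real hardware offers no comparably direct mechanism today, which is exactly the gap this section describes.

```python
# Hedged sketch of bulk state initialization on a simulator, assuming
# Cirq's initial_state parameter; real hardware has no comparably
# direct mechanism today. The tiny "data" vector is illustrative.
import numpy as np
import cirq

data = np.array([3.0, 1.0, 4.0, 1.5])            # classical data (tiny here)
amplitudes = (data / np.linalg.norm(data)).astype(np.complex64)

qs = cirq.LineQubit.range(2)                     # 2 qubits hold 4 amplitudes
circuit = cirq.Circuit([cirq.I(q) for q in qs])  # no real gates yet
result = cirq.Simulator().simulate(circuit, initial_state=amplitudes)
print(result.final_state_vector)                 # the data, amplitude-encoded
```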

Life no longer well above the bit, byte, word, instruction, address, register, and memory level

Back in the 1940’s and 1950’s, programmers were accustomed to working at the level of bits, bytes, words, instructions, addresses, registers, and bare memory, but through the 1960’s and 1970’s software moved upwards in levels of abstraction so that programmers no longer needed to be steeped in those lower levels of machine operation.

But now, with the advent of quantum computing, all of those higher-order abstractions suddenly vanish, so that once again programmers are required to work in terms of the bare machine, except that now the bare machine is even more primitive than in the 1940’s, lacking even integers, floating point, branching, loops, and function calls. Or even registers, memory, data structures, and mass storage.

Of course this is another argument in favor of the hybrid mode of operation, offloading non-quantum tasks which are too difficult for a quantum computer to a classical computer.

I’m relatively confident that some future generations of quantum computers will restore at least some of these higher-order abstractions, but for now we have to live with the bare-bones austerity of a bare quantum machine.

Life without a stack

A pushdown stack is essential to classical computing, for:

  1. Function calls, including built-in math functions.
  2. Recursive functions.
  3. Function arguments.
  4. Local variables and local memory management on nested functions.
  5. Support for modules, subsystems, and object-oriented programming.

None of that is supported on a quantum machine.

On the flip side, if your algorithms and code can avoid the need for functions, then you don’t have a need for a stack. That’s the current model for quantum computing, which is great, if you can live without functions.

And of course this is another argument in favor of the hybrid mode of operation, offloading non-quantum tasks such as function calls to a classical computer.

Life without a memory heap

Dynamic memory allocation or heap allocation is essential for an application of any significant complexity, but this capability is not supported on quantum computers.

This is fine if your quantum programs are simple enough, which is the common case today, but that is not likely to remain the case as a few more years go by.

And of course this is another argument in favor of the hybrid mode of operation, offloading non-quantum tasks such as those which are memory-intensive to a classical computer.

Databases, files, and I/O of data

A quantum computer has no conception of memory, data storage, databases, files, or I/O of data.

All of that has to be handled on the classical computer side of the fence, using the hybrid mode of operation.

In essence, a quantum computer is the equivalent of only the core of the central processing unit of a classical computer, which itself knows nothing about where data comes from or where it is destined, although a classical CPU does have a notion of memory, so the analogy does break down there.

Data modeling for the quantum world

Data modeling is difficult enough in the classical computing world, doubly (exponentially?) so in the quantum world and in the hybrid mode of operation.

Superficially, data modeling is much simpler in the quantum world since all you have are a relatively few qubits, each of which has a potentially superposed quantum state, and with some qubits entangled. But this stark simplicity belies the complexity of modeling and mapping much more complex data models from the classical world into that simpler quantum world.

And data modeling does need to take connectivity (entanglement) into account as well. That adds significant complexity. Doubly so when there are restrictions and rules as to which qubits may be entangled and in which direction.

The hybrid mode of operation helps, but also adds complexity since some data will exist in both quantum form and hybrid or classical form.

Data modeling itself is beyond the scope of this paper.

The point here is that the design of quantum algorithms and quantum code must take into account the challenges of data modeling in the quantum and hybrid worlds.

Raw physics vs. intellectual power of Turing machine, mathematics, logic, and rich data types

The grand promises of quantum computers are all premised on the simple fact that quantum computing is essentially raw physics — quantum mechanics.

Quantum logic gates are simply triggering quantum state changes — again, at the level of raw physics.

Raw physics is great for performance, speed, very low power, and the qualities of superposition and entanglement.

But those raw, awesome capabilities come at the cost of completely giving up all of the grand luxuries that classical computers promised and delivered over the past 80 years, like:

  1. The intellectual power and intellectual leverage of Turing machines.
  2. Operations which closely mimic traditional mathematics, with basic arithmetic and mathematical functions like square root, exponentials, logarithms, and trigonometric functions.
  3. Logical operations and Boolean logic, such as AND, OR, NOT, exclusive OR, and complex parenthesized expressions.
  4. Rich data types — integers, real numbers, complex numbers, symbols, words, characters, arbitrary strings of characters, and even natural language text.
  5. Object-oriented programming with user-defined classes of objects.
  6. Algebraic expressions.
  7. Vast libraries of functions, classes, modules, and frameworks which can be used across many applications.
  8. Wealth of high-level programming languages, each adapted to particular needs and capabilities, with sophisticated control structures and data structures.
  9. Multiple processors and multiple processing cores for parallel processing.
  10. Parallel processes and interprocess communication.
  11. Shared memory for super-efficient sharing of information.
  12. Sophisticated file systems for storing and organizing information.
  13. Sophisticated databases for creating and querying complex data relationships.
  14. Distributed computing across diverse networks, including the Internet.
  15. Complex digital electronic circuits based on very few basic circuit elements wired up in a combinatorial manner to perform complex and sophisticated operations at the speed of switching a transistor.

We have made tremendous advances in software and applications over those 80 years with just these few areas of intellectual leverage.

But… all of that must be discarded or taken away from us in order to get down to the level of running just raw physics.

Of course, even traditional digital electronics — and the entire universe, for that matter — is also based on raw physics.

But… the difference is that the design of quantum computers provides only those raw operations of physics for a very limited number of qubits, while individual classical logic gates can exploit a large number of atoms, rather than merely one or two quantum states.

You can’t simply go out and buy a large number of qubits and design them into a complex logic circuit as we can easily do with classical digital logic.

Even the most powerful quantum computers available today cannot simulate or replicate even a tiny fraction of the features of even our simplest personal computers, tablets, and smartphones.

Sure, quantum computers will progress by leaps and bounds, and eventually may indeed fully supplant all imaginable uses of classical computers, but the architectures of quantum computers for the foreseeable future are still far too primitive to make even a tiny dent in the intellectual power we get from the Turing machines, mathematics, logic, and rich data types which classical computers readily provide today.

That said, none of this is meant to in any way denigrate the raw potential of raw physics, but simply to highlight that very little of our current intellectual toolset can be ported over to the realm of designing algorithms to productively and fully exploit the raw physics of quantum computers.

Granted, the hybrid mode of operation does indeed let us have our cake and eat it too, but with a rather extreme tradeoff between the classical and quantum worlds.

Life without data types

Modern computing would seem unimaginable without the richness of data types we’ve grown accustomed to.

Granted, in the trade-off we get the awesome power of superposition, entanglement, and probabilistic results, but in the process we have to give up:

  1. Bit strings
  2. Bytes
  3. Integers
  4. Enumerations
  5. Reals
  6. Complex numbers
  7. Characters and text
  8. Classes and objects
  9. Arrays, lists, trees
  10. Databases

Life without object-oriented programming

One of the great innovations in classical computing in the 1990’s was object-oriented programming (or OOP, invented earlier, but evolving by twists and turns before becoming dominant in the 1990’s), with classes, inheritance, encapsulation, and polymorphism. It not only simplified, streamlined, and rationalized data structures and code within programs, but dramatically facilitated sharing of code and the creation and widespread adoption of sophisticated shared libraries.

But… with quantum computing, none of that is supported.

Eventually OOP may become supported on future quantum computers, but it’s not a huge problem today since the size of quantum programs today is so limited that the power of OOP would not make a big difference.

Once again, code and data which can exploit the power of OOP can simply be offloaded to the classical computer in the hybrid mode of operation.

Life without algebraic expressions

Surprising as it might seem, there is no capability for evaluating classical algebraic expressions on a quantum computer.

Classically, a compiler for a high-level programming language translates algebraic expressions into sequences of machine language instructions to evaluate the terms and operators in the expression.

But none of those instructions is supported on a quantum computer:

  1. Add
  2. Subtract
  3. Multiply
  4. Divide

And without those most basic operations, there is no ability to evaluate classical math functions:

  1. Square root
  2. Exponentials
  3. Logarithms
  4. Trigonometric functions
  5. Statistical functions

Granted, all of this is still available on the classical side of the fence when using the hybrid mode of operation, but none of it can be used directly in the quantum world.

Life without hex numbers

One curious aspect of quantum computing is that there is no longer any need for hexadecimal notation.

There are no numbers, bit strings, character codes, or color codes in quantum computing, so there is no need to encode numbers as integers using hexadecimal notation.

Qubits are numbered with simple decimal numbers. And each qubit has two possible basis states. That’s it.

No great loss.

This is one place where austerity is a real and positive improvement over classical computing.

Okay, there may still be a need for hex numbers in the hybrid mode of operation, on the classical side of the fence, but there is no need for hex on the quantum side.

Life without rich mathematics

Most professionals come to computing with a fairly deep exposure to the richness of modern mathematics, much of which simply has no place in the very austere world of quantum computing.

There is certainly a lot of sophisticated mathematics used to master quantum mechanics, but the operations of quantum logic gates do not have the vocabulary to work with most of the concepts of modern mathematics — starting with simple evaluation of old-fashioned algebraic expressions, as well as more advanced mathematical functions.

Classical computing was based in large part on modern mathematics and logic, and Turing machines facilitate the creation of programs which embody much of modern mathematics.

At present, the austere landscape of quantum computing supports none of that, other than a handful of quantum logic gates.

Again, we must fall back to the hybrid mode of operation to utilize any of the intellectual power of mathematics in the context of quantum computing.

What do you mean I can’t copy a qubit?!!

A common code pattern in classical programming is copying a value from one variable to another — the second simplest assignment statement (after initializing a variable to a constant), but this action is not supported on qubits due to the no-cloning theorem.

Yes, you can copy a qubit which is neither superposed nor entangled (one in a pure |0> or a pure |1> state), but superposed or entangled quantum states cannot be copied.

In short, you cannot save the arbitrary state of a qubit, nor can you restore the arbitrary state of a qubit.

In practice, this restriction can be worked around and is not an insurmountable challenge, but it is one of the more annoying aspects of the quantum computing mindset that takes a bit of getting used to.
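
For the mathematically inclined, a small numpy sketch makes the restriction tangible: the obvious “copy” circuit, a CNOT with a fresh |0> target, copies basis states perfectly but turns a superposed state into an entangled Bell state rather than two independent copies. The state vectors and the CNOT matrix are standard; the example itself is purely illustrative.

```python
# Why CNOT fails as a qubit "copier" for superposed states; the state
# vectors and CNOT matrix are standard, the example is illustrative.
import numpy as np

ket0 = np.array([1.0, 0.0])
ket_plus = np.array([1.0, 1.0]) / np.sqrt(2)     # superposed state |+>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

attempted_copy = CNOT @ np.kron(ket_plus, ket0)  # CNOT |+>|0>
true_copies = np.kron(ket_plus, ket_plus)        # what a real copy would be

print(attempted_copy)                   # Bell state (|00> + |11>)/sqrt(2)
print(np.allclose(attempted_copy, true_copies))  # False: no cloning
```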

The tedium of assembly language and machine language

Quantum programs are still coded at essentially the same level as classical assembly language or machine language, one operation or instruction at a time, in the raw language of the computer hardware rather than in a high-level language which is closer to the level of algebraic expressions and logic.

Given the lack of symmetry of the hardware and lack of even registers and linear memory, this is necessary for the design of extremely concise programs and algorithms which can execute efficiently on the existing quantum hardware.

Some of this need for ultra-efficiency and hand-coding one logic gate at a time will dissipate as quantum computers get a lot more qubits, but the lack of mathematical and logical operations will continue to make it problematic to accommodate evaluation of algebraic expressions and logic for the foreseeable future.

Quantum algorithm designers will have to remain resigned to having to think at the level of raw physics rather than in terms of algebraic expressions, Boolean logic, conditional branching, iteration, and functions.

That’s the bad news. The good news is that the hybrid mode of operation will allow hybrid algorithm designers to mix and match raw physics and classical computing techniques.

But even then, the bad news is that the full promise of quantum computing will be significantly constrained, shackled.

Future generations of quantum computers may relax or eliminate such constraints, but that prospect will remain well beyond the horizon for some years to come.

Richness of classic computing vs. extreme austerity of quantum computing

Just to recap, anybody moving from classical computing to quantum computing is in for extreme culture shock, having all of the richness of modern computing and software ripped away from them and being faced with the extreme austerity of bare physics and discrete hardware instructions.

Back to square one for algorithm design

All of the many, many thousands of algorithms and many, many millions of lines of code which exploited the intellectual power of classical computers will have to be set aside. Discarded, almost literally. Okay, not fully discarded, since the hybrid mode of operation can still utilize traditional algorithms and code to at least some extent, but all the new quantum features will require a whole new approach.

It really is back to square one for the design of algorithms for quantum computers.

Need for new building blocks for quantum algorithms

Since quantum computers don’t even have support for simple numbers and text, let alone algebraic expressions, new basic building blocks for algorithms are needed for the world of quantum computing.

The starting point is raw physics, the quantum mechanical properties which control the behavior of qubits.

Basic quantum logic gates are extremely primitive building blocks. New forms of abstraction are needed. And new layers of abstraction.

Directly working with the basic quantum state of qubits is too primitive. Some more structured forms for data, operations, and control flow are needed.

What are the essential Lego blocks for quantum computing?

As things stand, the small, limited set of quantum logic gates is the software equivalent of Lego blocks for construction of large-scale (or any scale) applications for quantum computing, but the question is whether that really is the proper level of abstraction.

In the classical computing world we have a variety of levels of abstraction to choose from:

  1. Transistors.
  2. Boolean logic gates.
  3. Machine language instructions.
  4. Macro assembly language.
  5. Algebraic expressions and statements in high-level languages.
  6. Objects and classes in object-oriented programming languages.
  7. Modules.

What we need in the quantum world is greater clarity about what the Lego blocks should be for construction of quantum applications.

For now, we’re stuck with basic quantum logic gates.

But that will severely limit the rate of adoption for quantum computing.

There are two bottom lines here:

  1. Extremely limited hardware limits the degree of software abstraction which can be supported.
  2. Lack of imagination on the software front starves the hardware guys of any useful input about what is really needed on the hardware front to unshackle quantum software.

How to get around that? Co-design.

Need for co-design

There needs to be a much greater and tighter feedback loop between the hardware guys and the software guys. It’s a classic problem, and primarily an organizational and cultural one.

A recent NSF grant (PFCQC: STAQ: Software-Tailored Architecture for Quantum co-design) for trapped-ion quantum computing promotes the concept of co-design, which explicitly focuses on bringing together the full spectrum of fields — physicists, computer scientists, and engineers — needed to design the full stack, especially the software stack, so that quantum hardware and quantum software dovetail in a synergistic manner rather than appearing to be in endless conflict.

Software suffers from a paradox:

  1. Software is infinitely malleable.
  2. Limited hardware can severely limit the application of that otherwise unlimited malleability.

Much more extensive use of co-design is needed.

Until that happens, adoption of quantum computing will be needlessly stunted.

Need for new metaphors for quantum algorithms

So many of our classical algorithms are based on metaphors derived from our macroscopic world and the notions of classical mathematics, classical logic, and classical computer science, but they have two difficulties:

  1. They don’t have direct, efficient, powerful analogies in the quantum world.
  2. They don’t exploit the potential and capabilities of the quantum world.

So, we need to throw away or at least push aside most of our classical notions and approaches and take a de novo or tabula rasa approach to finding and creating metaphors for quantum algorithms.

Sure, we can adapt, twist, and distort many existing metaphors and design patterns to mash them into the quantum world, and that will work, in a fashion, but only to a limited degree, and it will leave much of the true promise of the quantum world untapped and unexploited.

Need for new design patterns for quantum algorithms and code

What’s the difference between a design pattern and a metaphor?

They do indeed have a lot in common, having the same purpose, to guide an approach to solving a problem.

The difference is somewhat superficial:

  • A metaphor draws a strong mental image or even a visual image which is an analogy to some real-world phenomenon which the designer will be intuitively familiar with. There are rules to be followed, but they parallel the real-world phenomenon.
  • A design pattern is a collection of rules which make sense once you learn them and gain experience using them, but initially have no visceral appeal or meaning per se. Sometimes there are indeed metaphors behind design patterns, but that may be more the exception than the rule. The point is that design patterns represent abstractions, unconstrained by any real-world phenomena or visual imagery.

As noted, as powerful as metaphors can be, there will be many situations in the quantum world for which no sensible metaphors seem to apply. Those will be the situations where design patterns will be needed.

What’s the algorithm for designing an algorithm?

Unfortunately, design of algorithms is more a craft than a science.

There is no clear roadmap for design of a new algorithm, but some of the components are:

  1. Clear understanding of the requirements. What are the expectations for what the algorithm is supposed to accomplish?
  2. Clear understanding of the raw materials at hand. The machine and its basic features.
  3. Review of the literature. What existing algorithms accomplish something similar or parts of what is needed?
  4. Intuition.
  5. Creative insight.
  6. Experience.
  7. Deep technical knowledge.
  8. Willingness to take risks.
  9. Willingness to think outside the box. Try something completely and fundamentally new.
  10. Diligence.
  11. Persistence.
  12. Perseverance.
  13. Trial and error.
  14. Willingness to “wing it” on occasion, as often as necessary.
  15. Testing.
  16. Feedback.
  17. Willingness and presence of mind to throw it all away when an approach is unworkable and start over, possibly even from scratch.
  18. Willingness to set ego and pride aside and switch to a competing approach when merit suggests better opportunities.

Does the process of designing algorithms differ between a classical and a quantum computer?

Not in any critical manner. They have a lot in common.

The key difference at this stage is that designers of quantum algorithms don’t have a deep and rich library of existing literature and experience to use as a foundation or launch pad. They need to be willing to break new ground and blaze new trails.

Evolution vs. iteration

Classical algorithms and classical code are based in large part on the concept of iteration — sequencing through lists of items and lists of alternatives.

That may be great for classical computers, but just doesn’t fit in with the quantum world.

The heart of the quantum world is evolution. All items are evolving in parallel.

There is still sequencing in the quantum world, but it is time sequencing or time evolution, with all items (qubits) evolving in parallel, at the same time.

This is a major change in mindset for people coming from the classical world.

Granted, it is not so different from the world of distributed computing, where each computer system operates completely in parallel with all other computer systems. That’s true, but people struggle immensely trying to get distributed systems to function properly.

In summary, step one in contemplating a quantum algorithm is deciding how exactly you expect all of the elements of the quantum computation to evolve in parallel, interacting (entangling) as needed.

Quantum computing still has iteration, but it is iterating through time, all items in parallel, rather than iterating item by item.
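
Cirq’s circuit diagrams make this mindset visible: a circuit is a sequence of moments, where each moment is one time step applying gates to all qubits at once. A minimal sketch, with an arbitrary four-qubit layout:

```python
# Minimal sketch of "iterating through time": each Cirq Moment is one
# time step in which all qubits evolve in parallel. Layout is arbitrary.
import cirq

qubits = cirq.LineQubit.range(4)
step1 = cirq.Moment(cirq.H(q) for q in qubits)        # all 4 evolve at once
step2 = cirq.Moment([cirq.CZ(qubits[0], qubits[1]),   # pairs interact
                     cirq.CZ(qubits[2], qubits[3])])  # in the same step
circuit = cirq.Circuit([step1, step2])
print(circuit)  # diagram columns are time steps; rows are qubits
```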

Need for a quantum Turing machine

The conceptual and intellectual power of classical Turing machines has been an incredible intellectual lever to make progress against a vast array of problems. Yes, quantum computing offers some exciting new horsepower leverage, namely superposition and entanglement, but it is still missing the kind of intellectual leverage which Turing machines have provided over the past 80 years.

What might a quantum equivalent of a Turing machine look like? Nobody knows. Nobody is even looking in that direction. At this stage, researchers are still struggling to master the basics of the quantum equivalents of transistors and Boolean logic gates, so higher levels of intellectual leverage haven’t been getting any significant attention.

And this raises the question of whether we will be forced to limp along with a hybrid mode of operation, with classical Turing machine concepts handling a large chunk of computing, or whether a whole new form of Turing machine will evolve that is much more suitable for the quantum world.

In any case, without any replacement for the old-fashioned workhorse Turing machine, quantum computing will fall well short of its true potential.

Intellectual leverage is everything in computing

Raw computational power, performance, and capacity are quite important, but it is intellectual leverage which dwarfs and dominates the computing world.

Hardware is the relatively easy part. It’s the software which is the real challenge.

Granted, we’re still in the very early stages of quantum computing, where the hardware is still a huge challenge, and may in fact be blocking any significant progress on the quantum software front, but any year now that will change.

Hardware vendors are talking about 49, 72, and now even 128-qubit quantum computers.

If that’s not enough, another one or two or three iterations of hardware, taking us to the 256, 512, and 1,024-qubit level, may finally be enough hardware so that software will finally emerge as the big stumbling block, much as it did back in the 1960’s when classical mainframes and minicomputers were finally powerful enough to enable relatively large software efforts.

The point here is not that you want a lot of software, but that you want a lot of intellectual leverage so that you can effectively exploit the raw power of the hardware.

And to re-make the central point of this paper, it is not raw size of software that is the issue, but the need for rich and powerful algorithms. So this thrust is twofold:

  1. Intellectual leverage to produce richer, better, and more powerful algorithms as basic building blocks.
  2. Intellectual leverage to use those algorithms as basic building blocks to build higher-levels of structure for even more powerful applications.

Need for learning quantum intuition

Design patterns may not initially seem intuitive, but as students of quantum mechanics are taught, one has to learn intuition to work effectively in the quantum world.

Design patterns provide viable and powerful shortcuts to better algorithms and better code.

Once a quantum designer has been exposed to a critical mass of quantum metaphors and quantum design patterns, they will be well on their way to developing the level of learned intuition which quantum computing demands.

A rich set of quantum metaphors and quantum design patterns will enable quantum designers to develop or learn an intuition for how to approach the development of solutions to real-world problems which are crying out for quantum solutions as a result of the limitations of classical computing.

Might there be a methodical approach to developing quantum intuition?

There might be some sort of road map which could be developed, one which walks an aspiring quantum designer through a number of use cases (problems) and matching metaphors and design patterns. Ultimately, nothing short of extensive and prolonged experience may do the trick, but a pre-programmed road map would at least be a good start.

No such roadmap exists today, but at least that’s a good goal to work towards.

Need for libraries of quantum code

Countless libraries of functions, classes, modules, frameworks, and sample programs abound for classical computing, so that starting a new project doesn’t mean having to start from scratch.

Granted, it will take some time to compete with the vast code libraries accumulated over 80 years of classical computing, but the need is clear.

Open source, together with project and source code repositories such as GitHub, will facilitate the process.

Even over 40 years ago, Microsoft was able to get a head start on their microcomputer tools using software readily available on magnetic tape from the Digital Equipment Corporation User Society (DECUS).

There already is a modest amount of quantum example code on GitHub today, but we’re still in only the very early days. And a lot of it is simply example code, very fragmentary, rather than ready to incorporate into realistic programs.

The fact that the quantum world doesn’t have rich code-structuring tools such as compiled functions, classes, modules, and formal libraries greatly impedes code sharing at any more than the level of standalone example programs.

Frameworks for interacting with quantum computers

There are two distinct kinds of libraries relevant to quantum computing:

  1. Libraries of code which runs on quantum computers.
  2. Libraries for frameworks or utility functions which make it easier to interface with a quantum computer from a classical computer.

The previous section was focused more on the former, actual quantum code.

Here we are concerned with coping with the complexity of a raw quantum computer and its raw machine language. Frameworks are one of the best ways for coping with that complexity. A true quantum programming language would be the best approach, but absent such a language, frameworks help to fill the gap.

Two examples of current frameworks are Forest/pyQuil from Rigetti Computing and Cirq from Google — both for using Python on a classical computer, and both supporting both real quantum computers and quantum simulators. Both are open source software.
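
As a taste of what these frameworks look like in practice, here is a tiny end-to-end example using Cirq: build a Bell-pair circuit in ordinary Python and run it on the bundled simulator. (The same general flow applies to Forest/pyQuil, with different APIs.)

```python
# Tiny end-to-end example using the Cirq framework: build a Bell-pair
# circuit in Python and run it on the bundled simulator.
import cirq

q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit([
    cirq.H(q0),                      # put q0 into superposition
    cirq.CNOT(q0, q1),               # entangle q0 with q1
    cirq.measure(q0, q1, key='m'),
])
print(circuit)
result = cirq.Simulator().run(circuit, repetitions=10)
print(result.histogram(key='m'))     # only outcomes 0 (00) and 3 (11)
```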

Open source software

Much of the progress of classical computing over the past 25 years has been greatly facilitated by open source software — software that is free and whose source code is freely and readily available, such as on GitHub.

Open source is a critical accelerator for any technology which has a software component.

And GitHub is the easiest way for even the smallest projects to rapidly come up to speed on both using and contributing to the vast base of open source software, both classical and quantum.

Proprietary software? Not so much.

Need for global catalogs for algorithms, design patterns, and code

We have source code repositories and general-purpose search engines, but we really need searchable and browsable catalogs for algorithms, design patterns, and code.

There is indeed a lot of great material and many great projects in GitHub, but finding things can be quite tedious. Google Search is a great tool, but we need something a lot better.

A key requirement is to have great, and very readable descriptions of what each algorithm, design pattern, and coding project accomplishes, in as plain language as possible so that keyword and phrase search can easily find matches.

As a starting point, NIST has a very interesting web page which provides descriptions and citations for almost 400 quantum algorithms. They call it the Quantum Algorithm Zoo. Unfortunately, a number of the algorithms are hidden behind academic journal paywalls.

Need for taxonomies for algorithms, design patterns, and problems

We need taxonomies for algorithms, design patterns, and real-world problems.

We have a wealth of information and are rapidly generating a lot more, but it desperately needs organization. Types, hierarchies, relationships, subsets, derivatives, extensions — all the metadata that taxonomies can provide.

Code doesn’t need its own taxonomy per se since it provides solutions to problems and implementations of algorithms and design patterns, so code can piggyback off the other taxonomies.

A user or quantum designer should be able to find all instances of code being applied to a particular problem area by an identifier for that problem area, as narrow or as broadly defined as desired.

Similarly, a user or quantum designer should be able to find all instances of code which implements a particular algorithm, design pattern, or class of algorithms or design patterns, by supplying the identifier which identifies the particular algorithm, design pattern, or class of algorithms or design patterns.

Algorithms may or may not be tied to particular problem areas, possibly allowing search for algorithms by specifying the identifier for a problem area, as narrowly or broadly defined as desired.

Generally, code will be fairly tightly associated with either an algorithm or a problem, so it wouldn’t need its own taxonomy. But there will also be libraries, frameworks, and subsystems which are not tightly associated with any particular algorithm or problem, so a taxonomy of utility code could be provided as well. Or maybe utility could be treated as a problem area in its own right, so that utility code could be associated with the utility problem it addresses.

Auxiliary keywords can be attached to entries to facilitate search beyond the taxonomies themselves.

Of course free text search must be supported, but the goal here is to narrow the search results with a minimum of effort on the part of the user or quantum designer.

Algorithms and code are not the same thing

Loosely speaking, a lot of people, including otherwise competent professionals, will treat algorithms and code as being the same thing, but it simply isn’t true.

An algorithm is a higher-level specification of how the desired function or purpose is achieved, rather than a specification at the level of raw code. It may have relatively detailed steps, but not as detailed as is required to be directly translated into machine language for a computer, classical or quantum.

Code, on the other hand, is detailed enough that it can be directly and completely translated into executable machine language instructions.

Code is always expressed in a programming language and then compiled into executable code or an intermediate form which can be interpreted as if it had been compiled into executable code.

Algorithms may sometimes be expressed in a form that superficially looks like code or at least a code-like language, but generally they are expressed in more expressive forms of language, such as:

  1. Natural language.
  2. Pseudo-code. A cross between code and natural language. It superficially looks code-like, but can be as loosely structured as natural language.
  3. Flow charts.
  4. Specialized design languages. They can certainly look as if they could be translated into executable code, but not quite or not completely.
  5. Mathematical equations.
  6. Informal or semi-formal rules, commonly in semi-structured natural language.
  7. State machines.
  8. Diagrams.
  9. Tables.
  10. Grammars.
  11. Patterns.
  12. Any of the above, as comments embedded in code.

There is frequently a significant effort to make algorithm specification as close to code as possible, but the true goal is to make the algorithm as complete, consistent, concise, and understandable as possible, even if that conflicts with the desire to make it as easy as possible to get from algorithm to executable code.

There are two other critical differences between algorithms and code:

  1. A single algorithm may have more than one code implementation.
  2. A single program or even a single function may utilize more than one algorithm.

In short, there is typically not a one-to-one correspondence between algorithms and code.

Mixing code and algorithms is not a good idea

As distinct as algorithms are from code, it is not uncommon for lazy individuals or those not well-trained in modern software development practices to skip the stage of specifying an algorithm explicitly and jump right into writing code.

Sometimes, especially for smaller programs, this is not too big a deal, especially if the algorithm is indeed very simple, but for any substantial algorithm it is only asking for trouble.

The first two questions which come to mind when reading code written by somebody else are:

  1. What is this code trying to do? What is its purpose? Besides stating this in natural language, it is very helpful to simply reference the name of the algorithm or algorithms being implemented by the code.
  2. What’s the algorithm which this code implements? Commonly, the details of the code needed to implement an algorithm obscure the algorithm itself.

The algorithm is the forest, seen at a high level; the code is the trees, at ground level. Yes, you may indeed need to examine individual trees, but first you want to comprehend the overall, high-level view of the forest as a whole. Blending the algorithm into the code risks not being able to see the forest for the trees.

Algorithm as an outline for code

One simplistic but useful notion is that an algorithm is an outline for the code which implements it. In that sense, the algorithm is more important than the code itself.

That’s simplistic, since there can be any number of variations of code which are each valid implementations of the algorithm.

The main point is that having a written algorithm before you start coding gives you both a head start and a clear map of how you might proceed.

Without that clear roadmap as a starting point it is far too easy to get lost when coding.

And this roadmap also helps readers of the code to understand what the code is really trying to accomplish.

Need for a reference implementation

The important aspect of an algorithm is not that it is executable code, but that it provides an outline for any number of implementations of that one algorithm, each specialized as needed for the particular application and particular machine. That said, it is equally urgent to provide a reference implementation which gives quantum programmers significant guidance and insight as to at least one possible implementation.

The point of a reference implementation is not that it is the definitive implementation, but that it is a solid reference which quantum programmers can compare against, at least as a starting point.

In short, a reference implementation for an algorithm gives the quantum programmer a head start.

And given the raw challenge of quantum programming, a head start is a critical need.

Commenting code — and algorithms

As explicit and detailed as code and algorithms may be, it is almost always helpful to have some natural language narrative which explains what’s happening in as simple and plain language as possible.

Some code and algorithm steps may be fairly obvious, especially for relatively simple code and relatively simple algorithms, but very commonly this is not the case.

Over-commenting can be problematic, but that’s far less a concern than the more common situation of under-commenting.

Derivative algorithms

It is not uncommon to encounter a problem where an existing algorithm is simply not good enough to solve the present problem to our satisfaction. This presents us with three choices:

  1. Discard the algorithm and design a fresh algorithm from scratch.
  2. Tinker with the existing algorithm, keeping as much as possible without change, but making minimal changes to satisfy our needs.
  3. Make major, maybe even radical, or at least significant refinements to the existing algorithm.

In any case, it is helpful to explain, in plain language, why the existing algorithm was insufficient, and why the refinements were needed, beneficial, and satisfactory.

Even if some refinements may seem beneficial, it is also commonly true that it is better to stick with the existing algorithm, 100% intact, to avoid introducing bugs and to facilitate comprehension by those who may read the code in the future.

It may also be necessary for the code to deviate from the algorithm for any number of reasons. In any case, judicious comments explaining the deviations can be very helpful.

Algorithms may need to be optimized for the specific hardware

It is not uncommon that a general-purpose, textbook algorithm is too general-purpose or that the hardware on which it is to be implemented, especially given the quirky nature of each type of quantum computer, has idiosyncrasies which the code must work around.

This is to be expected in computing, even classical computing, but especially for quantum computing.

There are two choices here:

  1. Let the code deviate from the algorithm, but keep the algorithm intact.
  2. First create a derivative algorithm which can be more readily implemented on the target machine.

There may also be situations where a new model of quantum computer has improved or specialized features which permit a refined algorithm or code to exploit this hardware improvement.

There may also be situations where a smaller, cheaper computer is available, but it has fewer or less efficient features, such that the algorithm or code needs to be adapted to account for these deficiencies.

In any case, there should be sufficient commentary to elaborate what adaptations were needed.

Plan for portability

Portability refers to how easy it is to move code from one type of machine to another. It can be very easy, very hard, or anywhere along that spectrum.

Moving code from one type of machine to another type of machine is known as porting.

Some degrees or levels of portability:

  1. No change required. The code runs without modification on at least some other types of machines.
  2. Some configuration changes required. Change some symbolic parameters but the code itself runs without change.
  3. Very minor changes.
  4. Modest changes.
  5. Moderate changes.
  6. Extensive changes.
  7. Radical changes.
  8. Complete rewrite — but basic algorithm(s) remain intact.
  9. Complete redesign, including adaptation of algorithm(s).
  10. Start from scratch, including the need for new algorithm(s).

Another distinction is the difference between source code portability and binary and machine language compatibility. The source code could be 100% portable, but require that it be recompiled since other machines may have different binary representations for compiled quantum logic gates.

Not all types of quantum computers will necessarily have the same set of quantum logic gates (instructions). Some machines may have more, specialized gates, and some machines may have fewer, more universal gates.

There may be a subset of gates which are in common for some selected set of types of machines, so that if you plan ahead and limit your code to that common subset of gates, then you can more easily port your code between the different types of machines within that selected group of types of machines.
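
A hedged sketch of what such planning might look like in practice; the gate names and the common subset here are entirely hypothetical:

```python
# Entirely hypothetical portability check: keep a program restricted to
# a gate subset agreed to be common across the target machines.
COMMON_GATES = {"H", "X", "CNOT", "RZ"}      # hypothetical common subset

def is_portable(gate_names):
    """True if every gate in the program is in the common subset."""
    return all(name in COMMON_GATES for name in gate_names)

print(is_portable(["H", "CNOT", "RZ"]))      # True: ports within the group
print(is_portable(["H", "CCX"]))             # False: CCX is machine-specific
```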

Another nuance of portability is that code which works fine on a quantum computer with a smaller qubit count can more readily be ported to a quantum computer which has a higher qubit count, but porting the other way, from a high qubit count to a low qubit count may be significantly more difficult or even impossible without a significant or even complete redesign.

This also emphasizes the distinction between algorithms and code — that an algorithm is an abstraction that doesn’t depend on many of the particular details of the type of machine, while code can be much more dependent on the type of machine.

Ultimately, the goal is for algorithms and code to be more portable between types of machines, but this is more of a distant aspiration than a reality at this stage of the evolution of quantum computing.

Hybrid algorithms

The hybrid mode of operation does not necessarily mean a hybrid algorithm — it may simply be a hybrid implementation of a simple algorithm.

But there may indeed be algorithms which are themselves hybrid.

Either way, it will become important to identify, document, and categorize algorithms in a way which makes clear their hybrid nature, so that quantum application developers can select the type of algorithm which makes the most sense for their particular type of quantum hardware, data, and classical computing environment.
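
For illustration, here is a minimal sketch of a hybrid (variational-style) algorithm, assuming Cirq and its simulator; the one-parameter cost function and the crude grid search stand in for a real classical optimizer.

```python
# Minimal sketch of a hybrid, variational-style algorithm, assuming
# Cirq; the cost function and the crude grid search are illustrative.
import numpy as np
import cirq

q = cirq.LineQubit(0)
sim = cirq.Simulator()

def cost(theta):
    """Quantum step: estimate P(measure 1) after an Rx(theta) rotation."""
    circuit = cirq.Circuit([cirq.rx(theta)(q), cirq.measure(q, key='m')])
    result = sim.run(circuit, repetitions=200)
    return float(np.mean(result.measurements['m']))

# Classical step: a crude parameter sweep standing in for an optimizer.
thetas = np.linspace(0, np.pi, 21)
best = min(thetas, key=cost)
print(f"best theta = {best:.2f}, cost = {cost(best):.2f}")
```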

Publication of algorithms

Algorithms are only useful if programmers know about them. Knowledge of algorithms comes from publication.

Publication of algorithms can range from formal, as in a professional, peer-reviewed journal or textbook, to informal such as a blog post.

Some other forms of publication include:

  1. A text file in a code repository, such as GitHub.
  2. A formatted document in GitHub.
  3. An unreviewed paper submission to ArXiv.org.
  4. A white paper on a technology website.
  5. Comments in code.
  6. Less formal book, such as a cookbook.
  7. Presentation slides for a technology conference.

And algorithms need to be cataloged to make them easier to find.

Rich high-level programming languages

As noted previously, quantum programming is currently limited to a mix of quantum machine language (quantum logic gates) coupled with classical code for the hybrid mode of operation, but this very primitive level of operation for quantum code dramatically limits its pace of adoption and growth in application complexity.

I have no doubt that as qubit count rises and qubit control hardware and firmware gets more sophisticated, then high-level programming languages will begin to appear.

My real hope is that we will have exotic new languages which facilitate equally exotic and new quantum computing metaphors and design patterns, as opposed to simply trying to shoehorn old-fashioned classic computing languages and metaphors into the radical world of quantum computing.

In any case, something needs to happen, otherwise quantum computing will not be able to achieve escape velocity from the realm of novelty and niche applications and an extreme technical elite working in a rarefied atmosphere separated from the bulk of the real world.

Need for a true quantum programming language

As just mentioned, existing models for programming are simply grossly inadequate for the quantum world.

Significant research is needed to achieve the goal of a language which can connect real world problems with quantum world solutions, a true quantum programming language.

Need for rich higher-level programming languages

High-level programming languages are only the starting point — the real goal is much richer higher-level programming languages which are closer to the semantics of application problem domains themselves, to reduce the need to transform from the language of the application domains to the jargon of computing itself.

Need for a natural language vocabulary for specifying real-world problems to be solved

Even before we get to designing a programming language for quantum computing, we need a natural language vocabulary for couching real world problems in terms which can then be relatively easily translated into the language of quantum computers: quantum logic gates and qubit quantum states.

In the case of classical computing we had the intermediate levels of the language of mathematics and logic, which we currently do not have for the quantum world.

Need for quantum computer science

Computer science is now a fairly rich field, but is focused intensely on the world of classical computing, based on mathematics, logic, Turing machines, Boolean logic, and classical electrical engineering, much of which simply isn’t very relevant to the quantum world.

We need a whole new field of quantum computer science. Whatever that might be.

There will of course be an intersection between classical computer science and quantum computer science, mostly focused on the hybrid mode of operation, but there will still be a fairly sizeable portion of this new field which has very little in common with classical computer science.

Rerun, a number of times

Being strictly probabilistic is not a problem per se. Flipping a coin is probabilistic, but repeat the flip a number of times and you quickly get a true sense of its nature, its distribution. This same technique can be applied to quantum algorithms and the execution of quantum programs — run the same code repeatedly and see what the probability distribution of the results is.

How many times? At present there is no sharp, crisp, bright-line guidance for how many runs are sufficient to determine the nature of the distribution. Five to ten runs may be enough. Or maybe twenty. It will likely also depend on the nature of the algorithm itself. In some cases it may take 100 or even 1,000 runs, but I wouldn’t think that would be likely, unless the precision of the distribution is especially significant, such as whether 49.999 is to be considered distinct from 50.001.

The hybrid mode of operation should take this into account, performing the extra runs and evaluating the distribution.
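
A minimal sketch of this rerun-and-tally pattern, assuming Cirq; the choice of 1,000 repetitions is arbitrary, purely for illustration:

```python
# Minimal sketch of the rerun technique, assuming Cirq; the 1,000
# repetitions are an arbitrary, illustrative choice.
import numpy as np
import cirq

q = cirq.LineQubit(0)
circuit = cirq.Circuit([cirq.H(q), cirq.measure(q, key='m')])
result = cirq.Simulator().run(circuit, repetitions=1000)
ones = int(np.sum(result.measurements['m']))
print(f"estimated P(1) over 1000 runs: {ones / 1000:.3f}")   # close to 0.5
```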

Note that the results may be more than one qubit, and each qubit will have its own distribution.

There may also be correlations between various qubit results to be examined.

Need to partition larger problems into chunks

As if quantum programming wasn’t already difficult enough, it will not be uncommon that the number of qubits on a quantum computer and their supported connectivity is insufficient to directly handle the problem to be solved.

This is not unlike having more data than the available memory on a classical computer.

The solution is approximately the same: partition the data and process it in chunks.

As the old saying goes, easier said than done.

Some problems can be partitioned into sequential chunks or a clear grid of chunks. If so, that is great and the solution can be rather straightforward — capture intermediate results and then combine or merge them once all chunks have been processed.
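
A hedged classical-side sketch of that straightforward sequential case; run_chunk is a hypothetical stand-in for whatever executes one quantum fragment and returns its measured result:

```python
# Hypothetical sketch of the sequential-chunk pattern; run_chunk stands
# in for executing one quantum fragment and returning a measured result.
def run_chunk(chunk):
    # Placeholder: a real version would build and run a quantum circuit
    # sized to the available qubits, then measure a partial result.
    return max(chunk)

def solve_in_chunks(data, qubits_available):
    partials = [run_chunk(data[i:i + qubits_available])
                for i in range(0, len(data), qubits_available)]
    return max(partials)                 # merge step: combine the partials

print(solve_in_chunks([3, 1, 4, 1, 5, 9, 2, 6], qubits_available=4))  # 9
```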

The problem is that many of the hardest problems, precisely the cases where quantum computers are needed most, are not amenable to such simple linear-sequence or grid-like processing. In such cases, the algorithm designer will have to get creative, which is fine, but frequently not such an easy task.

Some extreme problems may not even be amenable to partitioning.

Or there may be any number of possible partitions, each of which has significant downsides making each not so appealing. Tradeoffs will have to be made.

Or maybe the problem has to be tackled at a higher level of granularity, so that the larger granules can indeed all be accommodated in the available qubits, or at least be processed with dramatically fewer chunks.

Or, maybe the problem simply has to be deferred until more powerful quantum computers with more qubits and more connectivity become available.

In short, partitioning can require a significant level of effort, and even then success is not guaranteed.

Need for solid documentation

Documentation is generally problematic for computing in general, so quantum computing is not unusual in that regard.

Documentation of quantum logic gates needs to be a lot more complete. Put simply, there should be no unanswered questions after reading the doc.

This raises the question of standardized logic gates, which could be documented completely and independently of any particular machine; the doc for each machine could then reference the standardized doc and simply note any deviations, exceptions, or extensions.

Need for standardized convention for preset initialization of quantum state of qubits

Generally, I believe it is the case that most quantum computers will arrange to initialize all qubits to the |0> state before beginning execution of the user’s quantum program, so that it is not necessary for the user program to explicitly initialize any qubit to the |0> state.

I believe that, but… I don’t have any firm knowledge that this is absolutely the case and guaranteed to always be the case.

It may indeed happen to be the case, but that’s not the same as guaranteed to be the case.

And as previously noted, documentation tends to be weak, and this is in fact one of those weak areas.
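
Until the convention is documented and guaranteed, a defensive sketch like the following, assuming Cirq’s reset operation, avoids relying on it:

```python
# Defensive sketch, assuming Cirq's reset operation: explicitly force
# each qubit to |0> instead of relying on an undocumented convention.
import cirq

qubits = cirq.LineQubit.range(3)
circuit = cirq.Circuit(
    [cirq.reset(q) for q in qubits] +       # don't assume preset |0>
    [cirq.H(qubits[0]), cirq.measure(*qubits, key='m')])
print(circuit)
```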

Need for formalized Principles of Operation documentation

Back in the old days of classical computing, it was standard practice that each type of computer would have a very complete document, generally called something like Principles of Operation, which completely detailed the operation of a computer at the level of knowledge that a programmer would need to develop software which ran on the bare machine, such as an operating system or a compiler generating machine language code, or an adventurous programmer coding only in assembly language.

We need such a document for each type of quantum computer.

It is not uncommon for nuances and edge cases to go undocumented.

Again, the criterion is that there should be no questions which go unanswered in such a document.

Questions such as those listed in the introduction section of my glossary are a good starting point.

An example of this tradition is the IBM System/360 Principles of Operation from the 1960’s (1964), over 50 years ago.

That same architecture, as upgraded over 40 years (2004), lives on in the Principles of Operation for IBM’s z/Architecture.

I intend to expand my thoughts on a framework for a quantum computer Principles of Operation document in a future paper.

Update: My initial cut at a Framework for Principles of Operation for a Quantum Computer.

Need for formalized Programming Model documentation

Closely related to the Principles of Operation documentation for a quantum computer would be complete, clear, and consistent documentation for the programming model (or models, plural) for the quantum computer.

The Principles of Operation simply describes what the machine itself does and how the programmer can control it, but doesn’t rise to the level of guiding the programmer in how to transform a problem, an application, into algorithms and code. That’s the job of the programming model.

An example of at least a brief description of a programming model for a quantum computer appears in a whitepaper from D-Wave Systems.

D-Wave also provides more extensive documentation that gives further depth on their programming model.

Neither of these two examples is as clear, complete, and focused as I would like to see, but they at least attack parts of the problem.

A crisper outline for a programming model is needed.

And there needs to be a standardized approach so that quantum programmers can easily compare different machines. That is not to say that all machines must have the same programming model, but that programmers need to be able to clearly see the differences in the programming models of the different machines.

In summary, quantum programmers need solid guidance for how to transform their problem into a solution involving algorithms and code on a given quantum computer.

Issues of scalability

In theory, a solution to a problem on a quantum computer should be much more scalable than a comparable solution on a classical computer. That’s the theory. But in practice, scalability is more of an art and craft than a true science.

Realizing the potential for scalability of quantum computers is a major challenge, and in many cases not even achievable, at least in the relatively near future.

Even if the hardware is indeed scalable, that does not guarantee the same for algorithms. There are likely to be lots of practical issues, beyond the basic theory.

Connectivity of qubits is likely to be a real issue, or rather, the significant restrictions on connectivity in current quantum computers are the issue. Connectivity may not grow at the same rate as qubit count grows. And even when connectivity does increase, it doesn’t keep up with the raw number of quantum states which those qubits support. A 10-qubit quantum computer supports 2^10 = 1,024 quantum states, but only five concurrent entangled pairs of qubits, and not all combinations of pairs. A 20-qubit quantum computer supports 2^20, over a million, quantum states, but only ten concurrent entangled pairs of qubits, and not all combinations of pairs. So, algorithms must use entanglement very sparingly, which limits scalability.

Shor’s algorithm was billed as killing public-key encryption, but once again, an algorithm which works very well with a small amount of data simply doesn’t scale.

One unexplored area of scalability is that current quantum logic gates are individual operations on individual qubits, pairs of qubits, or occasionally three qubits, but as the number of qubits used by an algorithm rises dramatically, many quantum logic gates must be executed, some sequentially, but potentially many in parallel. The qubits may operate in parallel, but the firmware and digital logic that sequence through the gates are classical, not quantum, so they do not operate fully in parallel, which raises the prospect that there may be a limit to how many qubits can be operated on in parallel before the coherence limit is reached. The documentation for each machine needs to specify how many gates can operate in parallel and whether there are any limits. This is not an issue with small numbers of qubits, but once we get to dozens, then hundreds, and eventually thousands, this matter must be addressed more directly.

The hybrid mode of operation also presents scalability challenges if the classical code needs to execute for longer than the coherence time of the qubits. That raises the question of the degree to which the quantum computer permits the classical code to interrupt the flow of quantum logic gates, measure some qubits while leaving others live, go off and do some classical computation, and then resume gate execution while keeping the unmeasured qubits alive. That may not be so practical today on some machines, but will probably be important in the longer run, especially as qubit and gate counts rise dramatically. Otherwise, the classical code will be forced to completely restart the quantum program, which isn’t a problem for smaller qubit and gate counts, but could be problematic for larger ones.
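
Current toolkits offer at best a limited form of this: a later gate conditioned on the classical result of an earlier measurement, rather than arbitrary classical computation mid-circuit. A Qiskit-style sketch of that limited form, using the c_if conditional (API details vary by version):

```python
from qiskit import ClassicalRegister, QuantumCircuit, QuantumRegister

qr = QuantumRegister(2)
cr = ClassicalRegister(1)
qc = QuantumCircuit(qr, cr)

qc.h(qr[0])
qc.measure(qr[0], cr[0])   # measure one qubit mid-circuit...
qc.x(qr[1]).c_if(cr, 1)    # ...then condition a later gate on the result
```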

Scalability is nothing new in computing; a wide variety of tricks and techniques have been developed over the decades to allow an interesting level of scaling on classical computers. Unfortunately, most of those efforts, such as the numerous approaches to indexing data for databases and search engines, are simply not relevant to the much more austere programming model of a quantum computer.

Much more research is needed on scalability of both hardware and algorithms.

Meanwhile, designers of quantum algorithms will need to invest an extraordinary level of effort to achieve even a modest degree of scaling.

What does networking mean in the quantum world?

Although there is intense interest in quantum communication, there is absolutely no conception as to how the concept of networking would mesh with quantum computing.

In fact, quantum computing has no conception of either storage or I/O (input/output).

Literally, all we have is a collection of qubits, which we can prepare and measure, but there is no conception of a qubit or a quantum logic gate accessing the outside world.

Networking is truly a foreign, even alien, concept to quantum computing.

Sure, networking is used to access a quantum computer remotely, but quantum programs submitted remotely for execution have no conception of networking.

The hybrid mode of operation does incorporate networking to some degree in the sense that a remote classical program can send a quantum program across the network for execution and then use the results in the classical program, but the code of the quantum program itself still has no conception of where it came from or where the results might get shipped.

The point is that true quantum networking would allow qubits to flow seamlessly across the network and to support entanglement between distant quantum machines. Clearly we are not there yet, and not even close. Lots of research is needed.

Need for quantum-inspired algorithms and quantum-inspired computing

We can simulate quantum programs in a quantum simulator running on a classical computer, but that is rather inefficient and limited to the primitive capabilities of quantum computers.

What we really need is quantum-inspired computing based on quantum-inspired algorithms.

Today, there is no immediate benefit from designing sophisticated quantum algorithms which cannot be run on a real quantum computer and which run too slowly on a quantum simulator.

What we need are tools and building blocks which are inspired by and modeled on the basic capabilities of a quantum computer, but which integrate effectively and efficiently with classical computing, enabling us to do the kinds of operations supported by a real quantum computer, but at the speed of machine language on a classical computer, without the excessive overhead of a quantum simulator.

Exactly what quantum-inspired computing would look like is unclear. There are any number of directions it could go. Some of the issues:

  1. Should we use existing classical programming languages, or is there some particular intellectual advantage and leverage to be gained by designing a whole new language focused on quantum computing?
  2. What to do about data types and units of value other than simple bits, even though those bits can be superposed and entangled.
  3. How to assure a bias towards quantum-style features rather than purely classical-style features, so that as much of the algorithms and code as possible can be shifted over to a real quantum computer in the hybrid mode of operation.

There are two equally valid outcomes:

  1. A reasonable bulk of the quantum-inspired algorithms and code can be seamlessly shifted over to a real quantum computer in the hybrid mode of operation.
  2. Even without access to a real quantum computer, the quantum-inspired algorithms and code run as fast, almost as fast, or even faster than traditional classical algorithms and code which were not quantum-inspired.

What might a quantum-inspired algorithm or code look like? That’s very unclear. What is clear is that lots of research is needed. And that there’s lots of potential.
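
Purely as an illustration of one plausible direction: classical simulated annealing is often cited as a quantum-inspired cousin of quantum annealing, a classical loop that mimics escaping local minima with a falling temperature. A minimal sketch:

```python
import math
import random

def simulated_annealing(cost, neighbor, x, t0=10.0, cooling=0.999, steps=20000):
    # Classical analogue of quantum annealing: accept uphill moves with a
    # probability that shrinks as the "temperature" falls, to escape local minima.
    t = t0
    for _ in range(steps):
        x_new = neighbor(x)
        delta = cost(x_new) - cost(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = x_new
        t *= cooling
    return x

# Toy usage: minimize a bumpy one-dimensional function.
def bumpy(x):
    return x * x + 10 * math.sin(3 * x)

best = simulated_annealing(bumpy, lambda x: x + random.uniform(-0.5, 0.5), x=5.0)
print(best, bumpy(best))
```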

And I do strongly suspect that the effort will have the collateral effect of raising the bar for what features we expect to be supported on a real quantum computer.

Media processing

Although current quantum computers are too limited in capacity, it seems reasonable to hypothesize that future quantum computers should excel at parallel processing of media data, such as:

  1. Image processing.
  2. Graphics processing.
  3. Audio processing.
  4. Video processing.

Grid or lattice-like arrangements of data would seem ideal for quantum computing — assuming you have enough qubits and enough connectivity.

And the hybrid mode of operation would seem appropriate.

In addition to batch-processing of existing media, there is also the opportunity to process media in real-time, direct from sensors.

Artificial intelligence

On the one hand, one might superficially expect quantum computers to be a natural for artificial intelligence. Although there are likely to be niches with a good fit, such as data-oriented machine learning, many of the functional areas of AI don’t seem like such a natural fit, at least for the architectures of current quantum computers:

  1. Perception — recognizing objects and complex scenes.
  2. Knowledge.
  3. Machine learning (complex data and visual patterns).
  4. Learning (concepts, not just visual or numerical patterns).
  5. Tasks.
  6. Goals.
  7. Reasoning.
  8. Creativity and imagination.
  9. Interaction in the real world — robotics — manipulation of objects, motion.

I wouldn’t want to prejudge the ultimate trajectory of quantum computing, but if we want to make a bigger dent in AI than relatively simplistic machine learning, a lot more research will be needed.

Internet of Things

This is more of a placeholder. Processing the vast flow of data from Internet-based sensors, the Internet of Things (IoT), would also seem to be a great application for a sufficiently large quantum computer.

Better hardware is needed, but the design of algorithms is likely to be the greater challenge.

And the hybrid mode of operation would seem appropriate.

How much knowledge of quantum mechanics and linear algebra do you need?

Quantum mechanics, the underlying foundation for quantum computing, leans heavily on linear algebra to describe quantum states: wave functions are linear combinations of basis states with probability amplitudes, such that the probabilities across all states sum to exactly 1.0 by definition.
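
To make that concrete in a few lines of numpy: a single-qubit state is just a two-element vector of complex amplitudes over the basis states |0> and |1>, and the squared magnitudes of the amplitudes are probabilities which sum to exactly 1.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)     # basis state |0>
ket1 = np.array([0, 1], dtype=complex)     # basis state |1>

# A superposition psi = a|0> + b|1>, normalized so |a|^2 + |b|^2 = 1.
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = a * ket0 + b * ket1

probabilities = np.abs(psi) ** 2
print(probabilities, probabilities.sum())  # [0.5 0.5] 1.0
```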

The question is whether or to what degree the designers of quantum algorithms need to know any or much of quantum mechanics and linear algebra to design great quantum algorithms.

The jury is still out as to how much knowledge of quantum mechanics and linear algebra is needed by quantum computing practitioners.

The answer is likely to be some, so the question will be how to make a small subset of quantum mechanics and linear algebra approachable, comprehensible, and palatable to the broad audience of designers of quantum algorithms and quantum programs, without requiring them to rise to the knowledge level of a theoretical physicist.

Need for multipartite entanglement?

Current and near-term quantum computers support only bipartite entanglement of qubits — pairs of qubits, but not multipartite entanglement — three or more qubits in a single entanglement.

Is this a big deal and critical limiting factor or not?

There is no clear answer at this time.

The operational answer is that quantum algorithm designers have to be happy with what they get and focus on what they have rather than fantasize about hypothetical future quantum computers which are not yet even on the drawing boards.

Besides, the critical limiting factor right now is that even with bipartite entanglement, there are severe limits on which pairs of qubits can be entangled and in what direction (which qubit must be the control for the CNOT quantum logic gate that creates the entanglement).
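
For reference, this is the textbook two-gate sequence that entangles a pair of qubits: a Hadamard on the control followed by a CNOT, producing the Bell state (|00> + |11>)/sqrt(2). A Qiskit-style sketch:

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                     # put the control qubit into superposition
qc.cx(0, 1)                 # CNOT: qubit 0 is the control, qubit 1 the target
qc.measure([0, 1], [0, 1])  # results are perfectly correlated: 00 or 11
```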

Still it is worth a research effort to at least contemplate whether multipartite entanglement might provide significantly more intellectual leverage for quantum algorithm designers.

And to answer the research question of whether multipartite entanglement would merely be a convenience, a significant performance boost, or enable solutions which are not even possible with only bipartite entanglement.

And then there is the relatively minor research question as to whether tripartite entanglement (three qubits in a single entanglement) might give the lion’s share of the benefit over bipartite entanglement, without the need for the extravagance of much higher-order multipartite entanglement.

What exactly is entanglement good for?

We need a much more detailed narrative as to when and how quantum entanglement can and should be used. Questions to be answered include:

  1. Should it be used sparingly?
  2. Should it be used widely and as often as possible?
  3. What intellectual leverage does it provide?
  4. What larger operations can be constructed using it?
  5. Given the very limited connectivity of existing quantum computers, what tradeoffs should be made when it cannot be used as widely as desired?
  6. What are the best example algorithms and programs which use it?
  7. How much more connectivity is needed than on current quantum computers?
  8. How specifically can a quantum algorithm designer argue for the need for greater connectivity?
  9. How specifically can a quantum algorithm designer argue for the need for a quantum computer with a larger number of qubits, such as the need for greater connectivity even if a current machine does indeed have more than enough raw qubits for a given algorithm?
  10. How much connectivity is a typical quantum program likely to need?

In any case, the bottom line is that we need clear, complete, and consistent guidance for quantum algorithm designers on the use of quantum entanglement.

Consider qutrits and qudits?

Quantum algorithm designers and quantum programmers will be limited to two-state qubits for the foreseeable future, but there is the potential for more than two states in future quantum computers.

A qutrit would have three quantum basis states.

A qudit would have some larger number d of quantum basis states, say ten.

Some questions arise:

  1. What are some great examples of how realistic algorithms and code could exploit three or ten quantum states?
  2. Would this provide greater intellectual leverage?
  3. How feasible is this?
  4. How much more difficult (or easier) would this make algorithm design and coding?
  5. Is a qutrit (three states) enough?
  6. Is a qudit (ten states) really that much better?
  7. How much compatibility would there be between algorithms and code for qubits, qutrits, and qudits?

This section is more of a placeholder for future research since there is little prospect for qutrits or qudits in the near future.

That said, even if the hardware is too big a challenge, simulators could be developed so that algorithms to exploit qutrits and qudits could be explored.
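
The bookkeeping for such a simulator generalizes directly: where an n-qubit state vector has 2^n complex amplitudes, an n-qutrit state has 3^n. A minimal numpy sketch:

```python
import numpy as np

def initial_state(n, d=3):
    # n qudits with d basis states each: d**n complex amplitudes,
    # starting in |00...0>. d=2 gives qubits, d=3 qutrits, and so on.
    state = np.zeros(d ** n, dtype=complex)
    state[0] = 1.0
    return state

# A 5-qutrit register needs 3**5 = 243 amplitudes (vs. 32 for 5 qubits).
print(initial_state(5).size)
```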

And quantum-inspired algorithms could have some advantage on classical machines even if there is no comparable quantum hardware available, yet. This could be an opportunity to let algorithms drive interest in hardware.

Security

This is more of a placeholder. Quantum computers are very isolated and they store no data, so they have a minimal security footprint.

They do have network access for submitting programs for execution remotely, but that’s a classical computer connected to the network as a front end, not the quantum computer itself.

The hybrid mode of operation does have a security component, but once again this is actually a classical computer front end which then interfaces directly to the quantum computer.

Cryptography

Although the emerging field of quantum cryptography is quite exciting, quantum computers as currently envisioned don’t have any real role in encryption. Quantum communication is another story, with flying qubits, but that has nothing to do with operating on stationary qubits using quantum logic gates.

Although there is plenty of chatter about post-quantum cryptography and the prospect of a quantum computer being able to break even strong public cryptography keys, that’s not a reality today, nor in the near future.

Prime factorization (technically integer factorization of a bi-prime or semiprime — factoring a large integer into exactly two factors which are each a large prime number), the calculation required to break public encryption keys, is still a young and emerging area of research for quantum computing. The largest integer factored so far on a quantum computer is roughly between 18 and 19 bits, well short of the 4096 bits for strong public key encryption, or even the weaker 2048 and 1024 bit lengths.
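
For context, the quantum part of Shor’s algorithm only finds the period r of a^x mod N; recovering the two prime factors from r is simple classical number theory, sketched here on the toy semiprime 15:

```python
from math import gcd

def factors_from_period(N, a, r):
    # Classical post-processing for Shor's algorithm: given the period r
    # of a^x mod N (the part the quantum computer computes), derive factors.
    if r % 2 != 0:
        return None               # unlucky choice of a: odd period, retry
    half = pow(a, r // 2, N)
    if half == N - 1:
        return None               # another unlucky case, retry
    return gcd(half - 1, N), gcd(half + 1, N)

# Toy example: 7**4 = 2401 = 160*15 + 1, so a=7 has period r=4 mod 15.
print(factors_from_period(15, 7, 4))   # (3, 5)
```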

Sure, better prime factorization is coming, but not so rapidly as to be worthy of any great alarm.

Cryptocurrency and blockchain

This is another placeholder for a larger topic. Cryptocurrency mining and blockchain are certainly potential applications for quantum computing, but the future is rather murky at this stage.

It may well be that quantum computing will be able to break both sometime in the future, but definitely not the near future. See the section on cryptography.

Applications vs. code and algorithms

Code is of course essential for software and applications, and algorithms are essential for code, but algorithms alone and even code alone do not constitute an application, and it is applications which justify the expense and effort of advanced computers.

The missing pieces needed for applications which are not found on quantum computers include:

  1. Data. It lives somewhere and comes from somewhere, but not on the quantum computer itself.
  2. User interaction and user experience. Again, this happens on classical computers, not the quantum computer.
  3. Algebraic computation. Maybe someday quantum computers will support this, but not soon.
  4. Complex control flow. Requires the hybrid mode of operation.
  5. Complex data structures. Ditto.
  6. Transforming real data to and from the form of data that exists on a quantum computer — qubits in the |0> and |1> quantum states. All of this must be performed on a classical computer (see the sketch after this list).
  7. Networking and distributed computing. Outside the realm of quantum computing. Someday there may be forms of quantum networking, but not soon.
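
On point 6, the simplest such transformation is basis encoding: write the classical value in binary and flip the corresponding qubits. A Qiskit-style sketch of one of several possible encodings:

```python
from qiskit import QuantumCircuit

def encode_integer(value, n_qubits):
    # Basis encoding: prepare |value> by applying X wherever the binary
    # representation of the classical integer has a 1 bit.
    qc = QuantumCircuit(n_qubits)
    for i in range(n_qubits):
        if (value >> i) & 1:
            qc.x(i)
    return qc

qc = encode_integer(13, 4)   # 13 = 0b1101 -> X on qubits 0, 2, and 3
```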

In short, a quantum application is a lot more than quantum computation on a quantum computer.

Need for a robust software architecture for quantum computing

As the rest of this paper has noted, there are many aspects to software which remain woefully inadequate on existing and near-term quantum computers.

Even as we do begin to address at least some of these many aspects, the real need is to integrate all of these aspects into a robust software architecture.

The primary criterion for a robust software architecture is that all of the many aspects work well together. Too often in classical computing efforts, the many aspects of software were not co-designed, so it comes as no surprise that they do not work so well together.

Much of classical software has been an afterthought, rather than each aspect being designed to work well with all other aspects.

One would hope that we would have learned some tough lessons from those many experiences, but that remains to be seen.

Need for a robust algorithmic infrastructure

A key portion of the robust software architecture which is needed for quantum computing will be a robust algorithmic infrastructure which supports the design and sharing of quantum algorithms, and hybrid algorithms.

As outlined in preceding sections, we need deep support for:

  1. Better building blocks for algorithms.
  2. More basic algorithms as building blocks.
  3. Better hardware functions.
  4. Better hardware architectures.
  5. Greater hardware capacity. Both qubits and connectivity.
  6. Better metaphors.
  7. Better design patterns.
  8. Reference implementations.
  9. Taxonomies.
  10. Catalogs.
  11. Documentation.
  12. Education.
  13. Training.

Need for trained quantum designers

It is unreasonable to expect that we will make a lot of progress on adoption of quantum computing until we have a sufficiently qualified and trained workforce.

That’s part of the motivation for Quantum Ready efforts, to get more people up to speed at least on the concepts of quantum computing even if we still do not have adequate practical quantum computer hardware.

It’s way too soon to plan on a massive training effort, especially since the software architecture and algorithmic infrastructure for quantum computing remain woefully inadequate.

This is simply a placeholder to note that significant effort will still be required once a sufficient software architecture and algorithmic infrastructure is indeed in place, which isn’t soon in any case.

Sluggish pace of quantum hardware development and raw, physics-based algorithm design

Although the headlines make it seem as though rapid progress is being made in quantum computing, the reality is that although advances are being made on the hardware front, the net effect still falls woefully far short of being sufficient to support a robust and vibrant software ecosystem.

Maybe, and hopefully, this sluggish pace will accelerate in the coming years, but as of August 2018, it counts as positive news when hardware projects merely report intentions to produce 64-qubit and 128-qubit quantum computers, even though those advances would still fall woefully short of what is needed to accelerate the design of quantum algorithms and the development of rich quantum programs.

The bottom line is that while we are indeed getting closer, practical quantum computing for wide audiences is still well beyond our reach.

It’s not clear whether additional money is the key constraint, or whether we simply need a lot more elapsed time while we wait for a long stream of unpredictable breakthroughs.

This is a key part of why this paper argues for promoting use of quantum simulators and focus on algorithm development, so that software does not need to wait for the hardware, and so that advances in software can help to drive priorities and functions for future hardware development.

Multiprocessor quantum computers?

Quantum computer designers are struggling mightily just to deliver a single basic processor, but it’s only a matter of time before they begin to consider multiple, parallel quantum processors, with some way of interconnecting them at the quantum level.

Initially it may be a simple classical computer that connects them and shuttles the data between them, but at some stage some sort of hybrid memory and specialized quantum logic gates could enable very fast shuttling of data between parallel quantum processors.

The real goal is to have true quantum connectivity between processors.

This would be a great advance, but could add a whole new level of complexity to the quantum algorithm design process unless the hardware design is driven by algorithm development.

This concept is not even being talked about and is certainly not on the near horizon, but it could be one of those areas where some forward-thinking algorithm designers might be able to influence future hardware design.

The key here will be whether the parallel hardware is symmetric enough, and the interconnections transparent enough, that the algorithms remain inherently simple rather than becoming inherently complex.

Challenges of fixed-function and special-purpose quantum computers

Although general-purpose quantum computers are the holy grail for quantum computing, there are plenty of opportunities for machines which focus on niche applications.

So-called fixed-function quantum computers or special-purpose quantum computers can indeed provide significant computational and intellectual leverage for niche applications.

The challenge is threefold:

  1. Assuring that the problem niche matches the machine niche.
  2. The complexity of transforming problem data into machine data and transforming results back into a form that people or classical machines can handle.
  3. The risk that solutions formulated in terms of a specialized machine will not be readily ported to newer machines which may not have the same machine architecture.

None of this is to argue against utilizing specialized machines, but simply to be aware of the challenges.

Role of the lunatic fringe

Back in the old minicomputer days there was this notion of the lunatic fringe — the guys who were crazy enough to try any new technology which came along, regardless of whether there was good reason to believe in its potential.

Sometimes things wouldn’t pan out and some new technology would fall by the wayside, but sometimes a new technology would turn out to be even better than expected. It was the job of the lunatic fringe to sort this distinction out.

Once the lunatic fringe had proven the technology, then and only then would more mainstream professionals be willing to use it in production-ready systems.

There’s no reason not to trust this same model for all things quantum.

The catch is to avoid exposing mainstream professionals to a new technology until after the lunatic fringe have had ample opportunity to prove that the new technology actually works, in a real-world setting.

The problem we have right now is that too many people are trying to jump the gun and push quantum computing into the mainstream (Quantum Ready efforts) before even the lunatic fringe have had a chance to prove that it works.

Quantum computing is far, far from being ready for prime-time production use.

We need to be a lot more patient and wait for the lunatic fringe to finish doing what they do best. In fact, for the most part they haven’t even started to do their thing with quantum computers, since sufficiently powerful machines don’t yet exist, nor do sufficient simulators.

Not even bleeding edge technology let alone leading edge

Mainstream developers are accustomed to working with leading edge technologies, the newest technologies which are indeed production ready. They’re new, but have been developed and used enough to have proven themselves capable of meeting the needs of mainstream organizations.

No, quantum computing is not yet at the stage of development and maturity characteristic of a leading edge technology.

Out in front of leading edge technologies are bleeding edge technologies, the really newest technologies which unfortunately have not yet been proven to be production ready. They’re so new that they haven’t yet been able to prove themselves capable of meeting the needs of mainstream organizations.

Adventurous, hardy, and patient professionals are able to experiment with bleeding edge technologies, gradually working through issues to incrementally prove the value and readiness of the technologies. And to work with vendors, giving them feedback to improve their products and services, helping evolve them to the stage of being actually usable and at least approximating production ready, and eventually advancing to become true leading edge technologies.

The key point is that bleeding edge technologies are indeed close enough that only a moderate level of effort and modest risk are needed to push them over the finish line to being production ready. A substantial effort indeed, but with a relatively minimal level of risk, and a reasonably clear path to follow to get there. That may be a couple of years, but not five years or a decade or more.

It’s not that quantum computing is at the bleeding edge stage, but that quantum computing is not yet even at the bleeding edge stage. A lot more research is needed to even get to the bleeding edge.

Sure, truly adventurous and extremely hardy professionals can indeed experiment with quantum computing as it exists today, but they should be under no illusion that they will be successful at anything more than showing that the technologies have some future promise — and that that future is not immediately around the corner.

Maybe in another two to four years quantum computing will indeed be solidly at the bleeding edge stage. And then another two to three years to advance to the leading edge stage. And I did say maybe, and I am being optimistic. Okay, it might be five years, but that’s a measure of how distant the finish line is from where we are today.

Sure, there will be niche applications which may be able to exploit quantum computing in a shorter timeframe, but the comments in this section are focused on the mainstream.

Not yet Quantum Ready

IBM feels confident that their quantum computing technology is Quantum Ready, but I am not persuaded. As I just noted, I don’t feel that quantum computing is even yet at the stage of being a bleeding edge technology.

Technically, IBM is not saying that the technology is ready, but that organizations should begin training and experimenting so that they, the organizations, will be ready in a few years when the technology itself is (hopefully) ready.

Give it another couple of years, and maybe then we might be able to assert that quantum computing is a bleeding edge technology, worthy of the attention of the more adventurous, hardy, and patient professionals.

But for now and for the next couple of years, the technology is not quantum ready and the majority of organizations don’t need to be Quantum Ready.

Quantum Research Ready

Rather than being Quantum Ready, I would suggest that quantum computing is Quantum Research Ready — proven enough to be worthy of a lot more research money, which is more likely to pay a dividend in the next five years than it was in the past twenty.

And some organizations, the more technically adept and adventurous, can and should become Quantum Research Ready as well, pursuing research in development of algorithms which can exploit quantum computer hardware when it becomes available.

Organizations can experiment with algorithms on quantum simulators, but this is research and will not lead to a product or service which is production ready in the next couple of years.

Advanced organizations can also do research into quantum-inspired algorithms which can blend quantum and classical approaches in a way that does not depend on availability of real quantum computers in the next couple of years, and may actually become production ready even in advance of comparable quantum computer hardware.

Quantum Aware and Quantum Curious

As noted, I don’t concur with the IBM approach of Quantum Ready. Besides the technology of quantum computing not being even remotely close to ready for real-world applications, I don’t concur that real-world organizations should be ramping up on the nascent technology to any significant degree at all.

Rather, I would instead encourage larger and more advanced organizations to simply be Quantum Aware and Quantum Curious — to be aware that quantum computing is coming and look into it a little, but accept that it won’t have any significant real-world application in the organization any time in the next few years.

Every two to three years, they should check in for an updated forecast of when the technology will be within two to three years of actually being Quantum Ready; only then should organizations actively and significantly begin ramping up to be Quantum Ready.

Some activities to engage in to be Quantum Aware and Quantum Curious:

  1. Read a few white papers.
  2. View a few slide presentations.
  3. Become slightly familiar with both its promises and its pitfalls.
  4. Optionally, do some small experiments; this one requires some real effort.
  5. Send a few individuals — but not entire teams or large groups — to technical conferences to absorb some technical detail on the potential and status of quantum computing.
  6. Occasionally invite a guest technical expert to give a talk about quantum computing.
  7. Chat with colleagues at other organizations to get a sense of how they are looking at quantum computing.
  8. Have some discussions about what applications might benefit from quantum computing.
  9. Write up a report about how quantum computing might or could impact a particular organization.
  10. Schedule a re-review of the state of quantum computing another two to three years hence.
  11. Designate a few individuals to have responsibility for occasionally checking up on progress in the field of quantum computing — not as full-time jobs or to qualify as experts, but maybe at the level of a very few hours a month. Not to be cheerleaders for every minor advance in the field, but simply to be watching in case a true breakthrough actually does occur.
  12. Annually ask the question: has any truly dramatic development occurred this year to warrant changing the watch and wait posture?

Again, this should be a very minimal, light-touch level of effort at this stage, not a significant organizational commitment of any resources. This is not the stage when anybody in the organization should have quantum computing as their full-time job.

Granted, there will be some exceptions. Some more advanced organizations may indeed have some very narrow but very high-value niche applications which could benefit from quantum computing in the near future. But if that’s the case, their staff is probably sophisticated enough to already be aware of quantum computing and both its promises and its pitfalls.

In short, definitely be Quantum Aware and Quantum Curious right now, but hold off for at least two to three years before contemplating an aggressive effort to be Quantum Ready.

Too much hype and folklore

Call it hype if you want, but there is way too much folklore building up about the promise and threat of quantum computing.

Part of the reason for the prevalence of hype and folklore is a dearth of solid information coupled with wild speculation about an uncertain future.

It is not helpful when the media makes wild, speculative claims. And it’s equally unhelpful when managers, executives, and promoters help to fuel those wild, speculative claims.

The only real solution is to patiently await real progress on both actual hardware and production-quality algorithms and code.

And of course we need to do more to accelerate progress on both fronts.

Meanwhile, we should be doing more to pump the brakes on the unprecedented levels of hype and folklore.

Famous quantum algorithms — which haven’t panned out, so far

Some examples of past and current wild claims:

  1. Quantum computers break current encryption schemes. Not even close.
  2. Shor’s algorithm is able to factor the large semiprimes used in public-key encryption, cracking the encryption schemes. Not even close. Researchers are still struggling to scale integer factorization.
  3. Grover’s algorithm can search databases. No, not really. Most databases are a lot more complex than simple, linear bit strings.

When is quantum computing projected to be able to do anything interesting?

There have certainly been quite a few demonstrations of quantum computing, but so far there has not been anything even remotely close to a production-quality demonstration. These demonstrations have been mostly toy-like.

So, how much longer will we have to wait for at least one major application that is indeed production-ready?

Sound of crickets. No idea. None.

A few years? Sounds like a fair bet, but every few years it’s another few years.

But as this paper highlights, we clearly need a significant improvement in the algorithmic infrastructure coupled with greatly improved hardware.

Even 256 qubits may not be enough. Connectivity is lagging as well.

Will 1024 qubits be enough — a 32 x 32 grid (lattice)? D-Wave has 2048 qubits, but it’s a very specialized architecture, not suitable for general-purpose computing.

Will a 64 x 64 grid (lattice) finally begin opening the algorithm floodgates? Possibly. That’s 4096 qubits, well beyond even a couple of years at the current pace. And, again, connectivity is a key problem rather than simply raw qubit count.

We may need to see bi-grid algorithms, where there are two symmetric but overlapped grids and the focus is on correlations between the grids. Say, two 32 x 32 grids, which is two times 1024 or 2048 qubits. Or two 64 x 64 grids, which is 8192 qubits.

It may very well be that we need to see dramatic innovations in algorithms on simulators before we can get enough attention on dramatic improvements in hardware.

To answer the headline question, at least five years seems like a good bet.

But I’ll hedge and say that maybe in two to three years we might see at least some more credible demonstrations rather than vague promises and very toy-like demonstrations.

Should students focus on quantum computing?

Has quantum computing technology advanced to the stage where students should begin focusing on it and expect to be getting jobs in the field? Uhhh… a real mixed bag — yes and no.

In some cases, yes, but in most cases students should remain focused on majoring in classical computer science and hardware engineering, with no more than a side interest in quantum computing.

Yes, there are research positions for elite professional researchers, such as those with relevant PhDs in areas like physics, computer science, and materials science. But that’s for the elite.

Master’s degrees are treacherous territory. Focus on quantum if and only if your intention is to go on to an elite PhD and postdoctoral path with research as your focus. Otherwise, remain focused on classical computer science and hardware engineering.

Organizations working on bleeding edge quantum programs will have a brisk demand for sharp master’s graduates, but there won’t be many positions over the next few years, except for the sharpest of the sharp minds.

Bachelor’s degree? Uh, dream on. If you don’t have the intellect and drive to go for that PhD or master’s degree, you just won’t have much to offer to elite research organizations. In any case, remain focused on classical computer science.

Community college? Sorry, it’s way too early for quantum computing to be applicable to the kinds of jobs available to individuals with an associate’s degree.

In short, there’s no harm in getting exposed to quantum computing in college, but unless you’re at the more elite levels of PhD and master’s programs, job prospects will be minimal. Sure, a few hardy souls will indeed get jobs in quantum computing over the next few years, but they will be the exceptions, not the rule.

To reiterate, exposure and awareness are both good, but focus is not so good at this moment.

This will change as each year goes by, but for this year quantum computing remains a novelty rather than a credible career for all but the most elite of students and professionals.

If you want to go for that PhD, go for it, but otherwise stick to classical computing as your career focus.

Continuing education is another matter. If you are solidly established in a classical computing career, there’s no harm, and every advantage, in reading up and taking a few classes, a master’s degree, or a PhD program in quantum computing, just to be ready when jobs do appear, but this remains a sidelight rather than a primary focus. And, again, the market is not yet large enough to support jobs for everybody who might want to enter the field, even if you do get the required education. It’s a real shame to spend so much time, energy, and money on courses and degrees but not have the opportunity to apply that effort in a real job.

One last time, exposure and awareness are both good, but a focus on quantum computing is not so good at this moment, unless you really are one of the elite of the elite. Look very carefully before you leap: make very sure there’s somebody waiting there to catch you and soften your landing.

Awaiting further advances and breakthroughs

Some of the hardware advances we need:

  1. Moderately more qubits. 64, 128, 192, 256, 512, 1024. As a start.
  2. Much larger numbers of qubits — tens of thousands, hundreds of thousands, even millions. A 1,000 by 1,000 lattice (grid) is one million qubits, but is still a rather modest amount of data by today’s standards.
  3. Much greater connectivity (entanglement) with far fewer, if any, restrictions.
  4. Much lower error rate.
  5. Much longer coherence.
  6. Much greater circuit depth.
  7. True fault tolerance — error correction, which requires significant redundancy for each qubit.
  8. Much lower cost for the full system.
  9. Non-cryogenic operating temperatures.

And beyond specific advances in hardware, we could use some dramatic breakthroughs for both hardware and software:

  1. New discoveries for technologies for qubits.
  2. Breakthroughs for fabricating qubits.
  3. Photonic computing?
  4. Room temperature quantum computing?
  5. A quantum computer in a shoebox?
  6. New architectures for qubits and qubit operations.
  7. Radical advances in classical computing which can spill over into quantum computing.
  8. Metaphors, design patterns, and programming languages for quantum computing.
  9. New discoveries in physics and quantum mechanics.
  10. Opportunities for tighter integration between classical and quantum computing.
  11. Opportunities for quantum-inspired algorithms which work well on classical computers.

Meanwhile, we have to get by the best we can with what we have.

Need for a universal quantum computer

This paper was originally written on the presumption that we would not have universal quantum computers as envisioned by David Deutsch in his classic paper Quantum theory, the Church-Turing principle and the universal quantum computer. Current and near-term quantum computers are not on a path towards a universal quantum computer which fully realizes Deutsch’s vision of a quantum computer integrated with a classical Turing machine.

For more on what exactly a universal quantum computer might be, see my paper What Is a Universal Quantum Computer?.

The current focus of quantum computer development is on the purely quantum operations, not on any of the classical operations associated with a classical Turing machine.

If we did have universal quantum computers, much of this paper would be moot. Quantum algorithms would still be a significant challenge, but somewhat less so than without the conceptual power of a Turing machine.

Pace of progress for quantum simulators

One interesting question is how quantum simulators and the classical computers they run on will advance over the next few years, in contrast to the pace of advances in real quantum computers.

Parallelism, multiprocessors, multiple cores, GPUs, and FPGAs could enable simulators to give real quantum computers a run for their money.

Granted, classical computers can’t simulate the full quantum state space for more than 40 to 50 qubits, but most algorithms won’t need the full state space anyway, even if they do use more than 50 qubits. Sometimes an algorithm needs more qubits simply because connectivity (entanglement) between pairs of qubits is severely restricted to selective pairings. And every time you use entanglement, that reduces the total state space by another factor of two. It is also worth noting that a lot of current simulators aren’t yet exploiting massive parallelism, which would significantly increase the memory available for storing the state space.
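
To make the memory wall concrete: a brute-force state-vector simulator stores 2^n complex amplitudes, which at 16 bytes apiece is about 16 TiB for n = 40. A deliberately naive numpy sketch of the approach (real simulators update amplitudes in place rather than forming the full matrix):

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n):
    # Naive approach: build the full 2**n x 2**n operator via Kronecker
    # products. Fine for tiny n; real simulators update amplitudes in place.
    op = np.array([[1]])
    for q in range(n):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

n = 3                                  # 2**3 = 8 amplitudes, 16 bytes each
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                         # |000>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = apply_single_qubit_gate(state, H, 0, n)
print(np.abs(state) ** 2)              # qubit 0 now measures 0 or 1 evenly
```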

Connectivity (entanglement) will be an interesting area of competition, where it can be much easier for a simulator to support full connectivity of all pairs of qubits than for a real quantum computer, especially as the qubit count rises.

ENIAC moment for quantum computing

The ENIAC computer was unveiled in 1946 as the first successful general-purpose electronic digital computer, focused initially on a specific application: computing artillery firing tables. It wasn’t simply a bare piece of technology, but demonstrated a real and compelling application.

When will quantum computing achieve its own ENIAC moment, when both the hardware and a compelling application are here together for the first time? I explored this topic in my paper When Will Quantum Computing Have Its ENIAC Moment? The short answer is no time soon, but maybe in four to seven years.

The point here is that eventually one significant application will manage to come together at the same time that the hardware comes together, and then, only then, will quantum computing have its ENIAC moment.

We need to see both algorithms and hardware advance together, not necessarily in absolute lockstep, but it does no good to have hardware without applications or applications without hardware.

FORTRAN moment for quantum computing

The FORTRAN programming language was the first widely successful high-level programming language and programming model for classical computing. There were some earlier attempts, but FORTRAN made the big splash and opened up computing to a much wider market. Before FORTRAN, programmers had no choice but to code in assembly or raw machine language — the world of bits, words, registers, memory, machine instructions, and raw hardware I/O. It was very, very, VERY tedious. But FORTRAN allowed programmers to write and think in the higher-level terms of variables, integers, real numbers, arrays, control structures (conditionals, loops, and function calls), and even formatted I/O. Programmers became MUCH more productive. Other high-level languages followed, such as COBOL, LISP, and BASIC, but it was FORTRAN which threw the doors (or floodgates!) wide open in the first place.

The point here is that the high-level programming model and features of FORTRAN ushered in a new age for application developers.

Quantum computing does not yet have a sophisticated high-level programming model and features that deliver the kind of productivity boost that FORTRAN delivered for classical computing.

Until quantum computing does gain such a sophisticated high-level programming model and programming language comparable to FORTRAN (what I am calling the FORTRAN moment for quantum computing), application development will proceed at a very sluggish pace and be restricted to the most elite of software developers.

I explore this topic at much greater length in my paper When Will Quantum Computing Have Its FORTRAN Moment?

What’s next — for me

That’s it for now, for me.

This is the best I can do for now, to collect all of these thoughts, express them as clearly as possible, and hope that individuals involved in the advancement of quantum computing will pick up on some of them and do something about at least some of the issues which I have raised.

I’ll continue to monitor the news of advances and breakthroughs, including reading published papers. Progress seems to have slowed of late, but real progress in any truly important field is always uneven.

It will likely be at least another two years before the contents of this paper need any significant revision, other than updating a few numbers here and there.

I’ll be tracking progress on prime factorization since that’s the gating factor for whether or when quantum computers can break strong encryption and may be a good indicator of how sophisticated algorithms are getting.

I’ll also be posting more informal papers on various aspects of quantum computing as I collect more of my own thoughts on these matters.
