What Is a Universal Quantum Computer?
First the bad news: The term Universal Quantum Computer has been reduced to mere marketing hype. What was it supposed to mean before the marketing hypesters entered the scene? This informal paper will briefly explore some of the historic intentions, current realities, and future potential. And I will offer my own definition of the term.
The essence of a universal quantum computer is that it combines the full power of a classical computer with the power of a quantum computer, and enables simulation of physics, including and especially quantum mechanics.
Current and near-term quantum computers tackle only the quantum portion of the full vision, leaving out all of the conceptual power of a Turing machine.
How to get there from where we are today is the subject of the rest of this paper.
Topics
This paper covers a range of topics:
- Glossary
- Origins of quantum computing
- Later developments
- Quantum computing in the maw of modern marketing
- Where we are today on the technical front
- Key elements of a universal quantum computer
- Deutsch’s vision — integration of classical and quantum operations
- Current quantum computers are only half of Deutsch’s vision
- Can we achieve parity with pure quantum?
- Can a quantum-only computer do everything a classical computer can do?
- Universal quantum gate sets alone don’t make a universal quantum computer
- Levels or types of universal quantum computers
- Some details
- Beyond the qubit?
- How much physics can be simulated with only two states?
- How many degrees of freedom are needed to simulate physics?
- How large a physical system can be simulated?
- QISC — Quantum Instruction Set Computer?
- Photonic computing
- QRAM — Quantum Random Access Memory
- Virtual memory?
- Balancing and exploiting parallel and sequential operation
- Data types for quantum computation?
- Can we simply graft a QPU onto a classical CPU?
- Hybrid mode of operation as a stopgap measure?
- Is a quantum coprocessor good enough?
- Poor man’s universal quantum computer
- Direct integration of classical and quantum operations
- Quantum registers
- What size integer is most appropriate for a universal quantum computer?
- What size floating point operations make sense for a universal quantum computer?
- What precision is needed for simulation of physics?
- Planck constant
- Simulation of chemistry
- Simulation of cosmology
- Exact simulation versus approximation
- Support for character, string, and text operations
- Any impact from running a CPU at colder temperatures?
- What to do about coherence time
- Multipartite entanglement?
- Mixing simulated and real quantum operations
- Cryptography
- Artificial intelligence?
- Turing a, c, o, and u-machines
- Analog?
- Horrendous mountains of code
- Database operations
- Time-series data operations
- Taylor series expansion
- Challenges of algorithms
- Data modeling
- Achieving statistical stability from probabilistic computation
- PL/Q — Need for a programming language to match the full potential of a universal quantum computer
- Great barriers to architectural progress: qubit count, coherence time, and connectivity
- What can a classical computer do?
- What should we expect a quantum computer to do?
- Criteria for judging progress
- How soon before we see the first real universal quantum computers?
- No, a Noisy Intermediate-Scale Quantum (NISQ) computer is not a universal quantum computer
- Simulate a universal quantum computer
- My definition
- What’s next?
Glossary
If any of the terminology of quantum computing seems unfamiliar, please consult my own Quantum Computing Glossary, with over 2,700 terms.
Origins of quantum computing
Even historically, before the marketeers got ahold of it, the term universal quantum computer was still a bit vague, as could be expected at the early dawn of an entirely new era.
Feynman didn’t appear to use this specific term, including in his 1982 paper, Simulating Physics with Computers, although he did make reference to universal computer and universal quantum simulator. His focus then was primarily on the capacity to simulate physics, notably quantum mechanics, with particular attention to the quantum aspects of the simulation. You can certainly read between the lines and reasonably infer that he was indeed referring to a single computer which could do it all, both classical and quantum computation, but he never reaches the point of a crisp definition for the term.
Deutsch does explicitly refer to the term in his 1985 paper, Quantum theory, the Church-Turing principle and the universal quantum computer, but like Feynman he doesn’t offer a crisp definition for the term.
My Internet searches were unable to locate any attempt to crisply or even casually define the term in those earlier years, although that era predated the general practice of routinely posting papers and other documents online for consumption by the general public. A lot of papers from that era are still available only behind paywalls.
The simple fact is that although Feynman and Deutsch were looking ahead to the ultimate future when a quantum computer could do it all (quantum operations and classical operations alike), researchers quickly got bogged down simply trying to do anything at all with even a single qubit or pair of qubits, so the ultimate vision got filed away for the more distant future rather than remaining a central focus. It’s only within the past couple of years that general-purpose quantum computers got past the number of qubits you can count on your fingers.
Later developments
My Internet searches did run across an occasional reference which used the term to refer not to the capability of a computer to tackle all computing problems, including everything a classical computer can accomplish, but simply to being functionally complete for all of the operations expected of a quantum computer on even a single qubit or pair of qubits, as in the paper 16-qubit IBM universal quantum computer can be fully entangled by Wang, Li, Yin, and Zeng. Today, this is referred to as a universal quantum logic gate set.
Leaping forward to 21st-century marketing, no crisp definition for the term arises. The general sense I get is that it is simply used as a rough synonym for general purpose quantum computer, itself an undefined term, loosely implying a quantum computer with enough capacity (qubits) to actually address nontrivial real-world problems, in contrast to the special-purpose and fixed-function quantum computers which focus on narrow niches rather than a much wider range of problems.
Currently, quantum computers are able to demonstrate concepts but simply don’t have the qubit capacity to handle more than very trivial problems.
The current consensus seems to be that it may take another five to ten years to achieve the level of capacity at which quantum computers can deliver real solutions to real problems rather than mere technology demonstrations. Performing all of the operations expected of a quantum computer on a single qubit or pair of qubits is a separate matter; the proper technical term for that capability is a universal quantum logic gate set.
Now that several vendors are actually able to make their quantum computers available online for remote execution of quantum programs, people are getting much more excited. Especially the marketing people.
But excitement frequently does not align strictly with reason and technical correctness.
Quantum computing in the maw of modern marketing
Hope (and hype) and reality are not the same thing.
And it’s not strictly only the true marketing professionals engaging in the hype, but a variety of amateurish promoters touting wild and unsubstantiated claims. Some of these are overly-enthusiastic media types, some of whom are legitimate journalists and some whose credentials are little sturdier than a blog.
And in fact some of the hypesters are otherwise competent technical professionals who should know better. Normally, they would know better, but the maw of modern marketing still manages to suck people in despite their best intentions.
For example, a post/article on the Scientific American website in 2017 proclaims in its title that IBM Will Unleash Commercial “Universal” Quantum Computers This Year. That was actually a republication of an earlier news article published in Nature under the less-provocative title of IBM’s quantum cloud computer goes commercial and a reasonably realistic subtitle of “Company plans a bigger, better system aimed at creating a market for the still-immature technology.”
The exaggeration in the title of the Scientific American rendition of the same article may be more the fault of an overly-enthusiastic editor than the handiwork of the reporter, but the reporter did write “IBM plans to roll out the world’s first commercial ‘universal’ quantum-computing service some time this year” in the lead paragraph, which is presumably the reporter’s own words. Maybe we should give the reporter a little slack since they did put “Universal” in quotes in both the title and the first paragraph, suggesting that they were not completely comfortable with the term as literally written. Maybe. Who knows.
Two other references from that article:
- getting enough qubits to work together to run any such algorithm — in what is known as a universal quantum computer — has proved extremely challenging
- D-Wave’s machines are not ‘universal’ computers, and can only run a limited range of quantum algorithms
The article also references a paper, Experimental Comparison of Two Quantum Computing Architectures, by Linke, Maslov, Roetteler, Debnath, Figgatt, Landsman, Wright, and Monroe, which opens with “Inspired by the vast computing power a universal quantum computer could offer, several candidate systems are being explored”, but offers neither a definition nor a citation for universal quantum computer. And that’s a paper by technical professionals rather than mere marketeers.
It turns out, the article was based in large part on a press release from IBM itself, entitled IBM Building First Universal Quantum Computers for Business and Science, which has several references to universal:
- IBM (NYSE: IBM) announced today an industry-first initiative to build commercially available universal quantum computing systems.
- On Monday, March 6, IBM announced that it will build commercially available universal quantum computing systems. IBM Q quantum systems and services will be delivered via the IBM Cloud platform and will be designed to tackle problems that are too complex and exponential in nature for classical computing systems to handle.
- For more information on IBM’s universal quantum computing efforts, visit www.ibm.com/ibmq.
Press releases are of course the explicit and exclusive handiwork of true marketing professionals.
That said, did the marketeers themselves come up with this phrasing or did well-meaning but misguided managers, executives, and other professionals lead them down this path?
In any case, the point is that the use of the term universal in a non-technical context can be rather misleading and essentially useless.
Again, my generous personal sense is that well-meaning individuals were simply trying to highlight that these computers were more general-purpose than existing special-purpose and fixed function quantum machines which had been produced previously.
Still, I personally would have expected a much, much higher level of technical acumen and accuracy from publications of the caliber of Scientific American and Nature. I mean, if those two can’t muster technical acumen and accuracy, who can?
Enough on the marketing hype, for now. The bottom line there is to simply refrain from taking media claims too literally without deeper investigation.
Where we are today on the technical front
Back to the science and actual technology, my feeling is that we’re currently in a bit of a lull on the universal quantum computer front, not because of technical issues per se, but simply because people are super-focused on solving some preliminary but critical technical issues:
- Building more than just a few qubits. How do we scale to practical numbers, like hundreds and thousands, and eventually millions?
- Dramatically longer coherence times. 100 microseconds is an amazing achievement, but too toy-like to base a practical quantum computer on. Seconds, minutes, hours, and days are still far off in the distance.
- Error correction. Even with longer quantum coherence, interference from environmental noise is unavoidable, so correction schemes are needed, which means more qubits are needed.
- Alternative technologies and approaches to constructing individual qubits. There are a number of competing approaches today, but it wouldn’t hurt to have many more. Innovation in this area might well be the gating factor for improvements in the other areas listed above.
And that’s the super-critical stuff.
After that we have smaller issues (relatively) such as:
- Cost.
- Difficulty of operation. Sophisticated staff needed.
- Cryogenic cooling.
- Room temperature operation. Not with the current technology. Maybe we can’t achieve room temperature practically anytime soon, but something much closer to freezing is sorely needed. Or, maybe that’s the limit of the current technology, and something like a photonic computer (Xanadu) is a better horse to bet on for true practicality, with current cryogenic temperatures simply being a necessary but transient stepping stone.
- Size. A standard data center rack, small filing cabinet, desktop, laptop, tablet, smartphone, wristwatch, and embedded Internet of Things are all well beyond the horizon, but still conceivable. For now, a standard data center rack should be the practical goal and benchmark to pursue.
And none of that really gets to the nature of achieving a quantum computer which is truly universal.
Key elements of a universal quantum computer
Reading Feynman and Deutsch as closely as I possibly can, I sense three key elements for a universal quantum computer:
- An assembly of independent qubits, each of which functions according to the probabilistic nature of quantum mechanics, including superposition and entanglement of quantum states, as well as a universal quantum logic gate set which supports all of the operations possible at the quantum mechanical level for each qubit.
- Can simulate or implement any and all operations of a classical computer. A classic Turing machine.
- Capable of simulating physics, especially quantum mechanics, but the rest of physics as well.
Do all of that and then you have yourself a truly universal quantum computer.
Although Deutsch doesn’t refer to it by name per se, his notion of a universal quantum computer is more properly called a quantum Turing machine. Whether we should be more explicit by referring to it as a universal quantum Turing machine is debatable. Either way works. I’ll try to stick with the longer form, but the shorter form does imply the longer form.
What is a universal quantum Turing machine? I’ll defer the fine technical details to others for now, although I may well at least discuss it a bit more at a higher level in a subsequent paper. Essentially it is simply the formalization of the three key elements listed above.
There is also the issue of two independent factors:
- Functionally capable of those three qualities.
- Capacity to handle real-size problems. Up to and including Big Data.
Conceptually, one could implement a quantum computer which simulated a classical Turing machine, but didn’t have enough qubits to handle more than the smallest of programs.
Technically, capacity is not an absolute requirement for qualifying as a universal quantum computer, but I will assert that it is indeed essential. Competent minds can quibble as to where the line should be drawn for minimum capacity, and that will change over time, but I will assert that if the goal is that a quantum computer can do what a classical computer cannot do, then the quantum computer must have the capacity to exceed the capacity of a comparable classical computer.
If we are fully integrating classical and quantum computing, it may be sufficient to continue to do most classical-style operations just as they are done today unless there are comparable quantum operations which are more powerful or more efficient. So, we won’t necessarily have to build machines with billions of qubits to get to parity. There might be gigabytes of classical memory and mere millions or thousands of qubits.
Deutsch’s vision — integration of classical and quantum operations
Deutsch’s paper speaks of “All programs for Q can be expressed in terms of the ordinary Turing operations and just eight further operations.” Not a purely quantum computer, but quantum operations in addition to classical (ordinary) Turing operations. Both: not one versus the other, but an integration of the two models.
Current quantum computers are only half of Deutsch’s vision
None of the current or near-term quantum computers deliver on Deutsch’s vision of an integration of classical and quantum computing.
In fact, it’s as if the designers of these machines failed to read or simply discarded half of the essence of his paper.
Literally, each of the current quantum computers is focused exclusively on the quantum operations alone.
Sure, we do have the hybrid mode of operation, where a classical computer is controlling a quantum computer, but not at the level of integration envisioned by Deutsch’s paper. They are two separate machines rather than an integrated universal quantum machine.
Granted, the quantum folks have had their hands full just getting the basic quantum operations working at all or in small numbers, so it is excusable that they have not yet reached the stage of focusing on the integration required to fulfill Deutsch’s full vision.
But the bottom line is that current quantum computers only provide half of Deutsch’s vision of a universal quantum computer.
Can we achieve parity with pure quantum?
This is an interesting theoretical and practical question — for a universal quantum computer to perform all of the operations of a classical computer, would that imply all of the hardware of a classical computer or could quantum-only hardware do the trick?
I’ll just leave the question hanging, with the presumption that maybe ultimately all classical digital logic falls by the wayside, but until then we may be stuck with a hybrid hardware design mix of classical and quantum.
One intriguing prospect is optical computing. The logic for even basic operations will be radically different from classical digital logic anyway, so it may be much more reasonable to build hardware that simulates a classical Turing machine using the same components as the quantum computations require. Interesting possibilities.
Can a quantum-only computer do everything a classical computer can do?
This is essentially the same question as the previous section, just with a slightly different emphasis. Given only the logic operations of a quantum computer, with no classical digital hardware logic gates, can the function of a Turing machine and all of the functions of a classical computer be fully replicated?
Sure, maybe the functions of AND, OR, and NOT gates and flip flops can indeed be replicated faithfully using only quantum logic gates at the hardware level, but the pragmatics of many millions of such gates is a great barrier to doing so, at least currently.
Today, a qubit and the hardware to implement each of the current quantum logic gates (which are software instructions, in contrast to classical logic gates which are hardware) are prohibitively expensive and problematic. That’s part of why quantum computers have so few qubits today. Not to mention practical issues such as superconductivity, cryogenic temperatures, and shielding from environmental electromagnetic radiation.
This all begs the question of whether there might be some or any benefit of fully replicating classical logic using quantum logic.
Now and for the foreseeable future, replicating the full features of a classical Turing machine is well beyond the capabilities of purely-quantum technology.
Rather than imagining that maybe someday pure-quantum might overtake pure-classical, it is probably more advantageous to contemplate how to blend the two, exploiting the functional and performance power of both.
It is very conceivable that we can evolve both worlds to the point where classical logic and hardware can take on quantum-like features and quantum logic and hardware can take on classical-like features.
Universal quantum gate sets alone don’t make a universal quantum computer
This is the same point as in the preceding two sections. The concept of a universal quantum gate set sure sounds as if it would define a universal quantum computer, but it simply defines a set of quantum logic gates (software instructions, not hardware gates as with digital logic) which are sufficient to implement all of Deutsch’s eight further operations to extend beyond the operations of a classical Turing machine.
Sure, some people will naively or misguidedly assert that a given quantum computer is a universal quantum computer solely on the basis that it supports a universal quantum gate set.
That’s where we are today.
But where we happen to be today cannot be allowed to define or control where we need to be to unleash the full power of universal quantum computing.
Levels or types of universal quantum computers
For the purpose of discussing the nature of the scale of the problem, I will define or propose four levels or four types of universal quantum computers:
- Level 1 or Type A. A functional universal quantum Turing machine, capable of simulation of all operations of a classical Turing machine, plus Deutsch’s eight quantum operations, but only just barely. Not capable of handling a program of any significant size.
- Level 2 or Type B. Capable of handling programs of some significant size, large enough to be useful running as a coprocessor, comparable to a GPU, but well short of running a full-featured, complex operating system.
- Level 3 or Type C. Fully capable of running anything which can be run on any classical computer. Not necessarily better, but roughly comparable. This doesn’t meet the goal of exceeding classical computers, but at least achieves parity, the minimum. There might need to be separate metrics for performance (speed) and capacity (size) since they are not directly linked.
- Level 4 or Type D. Significantly exceeds the capacity and performance of a classical computer. This is it, the real goal for a universal quantum computer.
And as we do get to or at least begin approaching Level 3/Type C parity, we have some sublevels or sub-types as milestones:
- Level 3.0 or Type C.0. 1940’s style full-room size machine requiring the attention of one or more elite staff to keep the machine running properly. All the cryogenic equipment.
- Level 3.1 or Type C.1. Still room-size, but productized so that a trained non-elite operator can supervise the machine, as in the 1950’s. Still probably cryogenic.
- Level 3.2 or Type C.2. More compact, less operational challenge. Some smaller models which don’t dominate a large room, as in the 1960’s, especially minicomputers. May still use a large room, but to house multiple machines. May or may not still be cryogenic.
- Level 3.3 or Type C.3. Size of a standard data center rack with comparable operating requirements. No longer cryogenic.
- Level 3.4 or Type C.4. Size of a small file cabinet and suitable for an office environment, or a unit which fits into a standard data center rack, with multiple units per rack. Room temperature.
- Level 3.5 or Type C.5. Desktop form factor, comparable to a desktop PC or a workstation. Definitely room temperature.
- Level 3.6 or Type C.6. Laptop form factor.
- Level 3.7 or Type C.7. Tablet form factor.
- Level 3.8 or Type C.8. Handheld form factor.
- Level 3.9 or Type C.9. Wearable form factor, such as a smart watch.
- Level 3.10 or Type C.10. Embeddable, such as Internet of Things.
Obviously that’s a dramatic degree of evolution from today, and won’t occur overnight, but this is the kind of roadmap that is likely with any computing technology.
And there would be comparable sublevels and sub-types for Level 4 or Type D as quantum performance begins to significantly exceed classical machines.
Some details
This paper is not intended to detail all aspects of what is needed to achieve a universal quantum computer, but simply to provide enough of a framework so that discussions of this topic can be firmly grounded.
In particular, this paper offers no details on how specifically to blend quantum and classical operations — probabilistic and deterministic values. That’s an essential topic but any details are well beyond the scope of this paper.
That said, here are some of the areas that will need attention to achieve Deutsch’s vision of “All programs for Q can be expressed in terms of the ordinary Turing operations and just eight further operations.”
- Formalize the notion of a universal quantum Turing machine, incorporating the classical Turing machine and Deutsch’s notion of eight further quantum operations.
- Elaborate on Feynman’s notion of simulating physics, and whether Deutsch’s eight further operations are indeed sufficient, or at least clarify how much of physics those eight operations cover.
- Hybrid memory. Both classical and quantum states.
- Rapid and high-volume movement of data between classical and quantum states.
- Pseudo-quantum data. Expression of an intended quantum superposed and possibly even entangled state in a structured form of classical data, which can be trivially and rapidly transformed into a sequence of quantum operations to achieve that quantum state. See the sketch after this list.
- Transparent switch between true, real quantum and simulated quantum mode. For example, be able to run quantum code at full speed, or enable tracing and breakpoints, causing execution to drop into simulation mode.
- Significantly greater number of qubits.
- Significantly greater degree of connecting (entangling) qubits.
- Quantum coherence time for qubits on the order of seconds.
- Quantum error correction as needed to maintain quantum coherence.
- Multiple quantum processors in a single computer system.
- Quantum communication to transmit quantum data and entanglement between quantum processors, at least in nearby systems, but eventually over long distances.
- Programming languages with very expressive high-level constructs for both Deutsch’s eight further operations and methods for integrating and coordinating those eight operations and classic data and classical operations.
- Debugging tools for working with hybrid, universal quantum programs.
- What are the quantum equivalents of classical objects, classes, functions, and basic data structures? What level should average quantum developers be working at compared to raw qubits?
- Some initial libraries of building blocks of algorithms, code, and data structures for both working with Deutsch’s eight further operations and for integrating and coordinating those eight operations and classic data and classical operations.
- Much richer libraries comparable to and rivaling the libraries for classical computing.
- Even richer libraries which significantly exceed those available for classical computing.
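To make the pseudo-quantum data item above concrete, here is a minimal sketch in Python, assuming an entirely hypothetical description format and gate vocabulary; a real design would need a far richer vocabulary than Hadamard and CNOT:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PseudoQuantumState:
    """Hypothetical classical description of an intended quantum state."""
    num_qubits: int
    hadamards: List[int] = field(default_factory=list)          # qubits to place in superposition
    cnots: List[Tuple[int, int]] = field(default_factory=list)  # (control, target) pairs to entangle

def compile_to_gates(desc: PseudoQuantumState) -> List[tuple]:
    """Trivially and rapidly translate the description into a gate sequence."""
    gates = [("H", q) for q in desc.hadamards]
    gates += [("CNOT", c, t) for c, t in desc.cnots]
    return gates

# A Bell pair expressed as ordinary classical data, then compiled to gates.
bell = PseudoQuantumState(num_qubits=2, hadamards=[0], cnots=[(0, 1)])
print(compile_to_gates(bell))  # [('H', 0), ('CNOT', 0, 1)]
```

The point is that the description itself is ordinary classical data which can be stored, copied, and transmitted freely, unlike actual quantum state.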
Some other issues that may not be as essential per se, at least for the coprocessor model, but ultimately are needed for a truly, fully universal quantum computer:
- I/O. In general.
- Mass storage.
- Databases. Including access to distributed databases.
- Full-featured operating system.
- Virtual machines. One physical machine, but any number of processes, each of which appears to be a virtual machine of its own.
- User experience. Remote access is sufficient for many uses.
- Networking. Must be accessible over the network, but that can be accomplished with a front-end computer.
- Quantum networking. Direct quantum communication between quantum processors without transforming to and from non-quantum binary format, fully maintaining all quantum state.
- Quantum distributed processing. More than simply disparate systems interacting, tightly integrated distributed systems, each working on a designated subset of a larger problem. Key to tackling Big Data problems beyond the size of even a large quantum computer.
- Direct quantum sensor access. Such as video, audio, and Internet of Things, without traditional digital processing which distorts continuous values.
- Real-time processing. In a traditional sense — prompt attention to input, time-sensitive response. Interrupts.
- Robotic activity. Includes real-time processing, and attention to spatial position, orientation, and velocity. Includes perception and manipulation of objects in the real-world.
There’s probably a lot more, but that’s a good start, and probably sufficient to justify a claim for a universal quantum computer.
Beyond the qubit?
There has been some research into quantum values beyond simply the two-state qubit, including the three-state qutrit and the more general d-state qudit (ten or more states).
Or qumodes on a photonic quantum computer.
It’s not completely clear when or why additional states would necessarily be needed, but the issue is out there.
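For reference, the generalization is straightforward to state. A qudit with d basis states holds the superposition

$$ |\psi\rangle = \sum_{k=0}^{d-1} c_k\,|k\rangle, \qquad \sum_{k=0}^{d-1} |c_k|^2 = 1 $$

where d = 2 gives the familiar qubit and d = 3 the qutrit.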
How much physics can be simulated with only two states?
A qubit is based on using the ground state and a single excited state of a quantum observable, which is fine for simulating many observables, but what about observables in real physical systems which have more than one excited state?
Sure, you can use more complex structures, such as assigning a distinct qubit for each excitation state, but that seems contrary to the whole point of using a quantum computer to simulate a quantum system using direct physics alone.
This is a good argument in favor of a qudit or qumodes.
How many degrees of freedom are needed to simulate physics?
Once again, are two-state qubits sufficient to adequately simulate physics?
Granted, a quantum computer can accommodate an arbitrary number of qubits, subject to practical limits.
But simulating a single physical quantity which has more than two degrees of freedom with multiple qubits will get very tedious very quickly. And, it partially defeats the whole purpose of quantum computing to simulate physics as directly as possible using physics itself.
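The arithmetic behind that tedium is simple. Encoding a single d-level quantity in two-state qubits requires

$$ n = \lceil \log_2 d \rceil \ \text{qubits} $$

so a 10-level system needs 4 qubits, which waste 6 of the 16 available basis states, and m such quantities need m times that many qubits, plus the bookkeeping to keep the unused basis states from being populated.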
The physicists will have to be explicit about how many degrees of freedom they require for comprehensive simulation of physics on a universal quantum computer.
How large a physical system can be simulated?
Today, physicists and computational chemists are ecstatic to simulate even a single atom or molecule, but for how long will they be sated?
Before long, once physicists and chemists can simulate nontrivial molecules, their sights will jump much higher, to more complex molecules, including proteins and drugs, and more macroscopic objects with dozens or even hundreds (or more) atoms and molecules. And even to gases and liquids.
Current notions of simulating physics at a very small scale will have to be radically reconsidered and radically revised.
Where does this leave the design of universal quantum computers?
Multiple, connected processors come to mind. Massive parallel processing as well.
So, the issue will not be to make a super-massive universal quantum computer, but to make it possible to create a large, multidimensional lattice of reasonably powerful universal quantum computers.
And to permit very efficient quantum-level interactions between these many systems.
There would still be a need and desire to make individual universal quantum computers as powerful as possible, but ambitious physicists and chemists will want something orders of magnitude larger no matter how big we manage to make a single computer.
QISC — Quantum Instruction Set Computer?
Rather than starting with the core hardware of an existing classical computer, such as the Intel or PowerPC architecture, we should use this moment to derive a wholly new architecture which takes the quantum side of the house into account. À la the Reduced Instruction Set Computer (RISC), we should define a Quantum Instruction Set Computer (QISC).
What would it look like? I have no idea, this is simply a placeholder to stimulate thinking.
Photonic computing
Photonic computing is another potential direction for a universal quantum computer which might provide interesting opportunities for blending quantum and classical data and operations.
If the photonic equivalents of transistors and qubits are small and simple enough, it might well be possible to construct hybrid memories where classical bits are simply qubits without a superposed or entangled state.
QRAM — Quantum Random Access Memory
It is an open question whether qubits have to be physically distinct from classical memory.
This paper asserts that the logical needs of classical memory, two basis states, 0 and 1, with no superposition or entanglement, are a clear subset of the logical needs of a qubit.
That means that if we design an array or lattice of qubits, there is no functional reason this array or lattice cannot be used for classical memory or random access memory.
I’ll conjecture a quantum random access memory or QRAM, combining the best of functional needs of qubits and classical RAM such as DRAM.
Interestingly, while people complain about quantum decoherence and the rapid decay of quantum state, the DRAM people have dealt with a similar problem from the get-go: the capacitors which make up the memory cells of a dynamic random access memory also decay fairly rapidly. Not as ultra-rapidly as a qubit, but rapidly nonetheless, on the order of one-tenth of a second or even less (64 milliseconds for one recent DRAM memory chip).
The DRAM people address this problem using refresh: special hardware (or sometimes software) that forces a read of all memory cells frequently enough to stay ahead of that 0.1-second decay.
Unfortunately, there doesn’t appear to be any technical means around the quantum mechanics theoretical restriction that you can’t read the full quantum state of a qubit. Still, maybe somebody will come up with some sort of trick, eventually.
Even if QRAM memory cells still decay at the quantum level, their non-quantum classical state could be refreshed.
Although technically a single particle would serve to contain both quantum and classical state, it may be more practical to have two distinct particles, one for full quantum state and one for a collapsed classical state. The classical state could be refreshed, even if the quantum state cannot.
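As a minimal sketch of that two-particle cell, assuming an entirely hypothetical design, the classical side can be refreshed DRAM-style while the quantum side cannot:

```python
class HybridMemoryCell:
    """Sketch of a hypothetical QRAM cell: one classical bit plus one qubit."""

    def __init__(self):
        self.bit = 0        # collapsed, classical state
        self.charge = 1.0   # DRAM-style stored charge, which leaks over time
        self.qubit = None   # opaque handle to full quantum state (not copyable)

    def tick(self, leak: float = 0.1) -> None:
        """Charge leaks away each cycle, as in a DRAM capacitor."""
        self.charge = max(0.0, self.charge - leak)

    def refresh(self) -> None:
        """Re-drive the classical bit before its charge decays below threshold.

        Legal for the classical particle; impossible for the qubit, since
        the no-cloning theorem forbids reading and rewriting quantum state.
        """
        self.charge = 1.0
```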
The downside of that extra, redundant circuitry would be more than compensated by the significant reduction in size and increase in speed needed to accommodate the qubit side of the memory cell.
Superposition should not be a technical difficulty for a QRAM, beyond the decoherence issue.
Open question: What are the decoherence modes for a qubit? What does a superposed 0 and 1 decay to? Always to 0, always to 1, randomly to 0 or 1, or what?
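For what it’s worth, for qubits dominated by energy relaxation (the T1 process, amplitude damping) the answer is known: the excited component decays toward the ground state |0⟩, while dephasing (the T2 process) scrambles the relative phase without changing the 0/1 populations. Amplitude damping is conventionally written with the Kraus operators

$$ K_0 = \begin{pmatrix} 1 & 0 \\ 0 & \sqrt{1-\gamma} \end{pmatrix}, \qquad K_1 = \begin{pmatrix} 0 & \sqrt{\gamma} \\ 0 & 0 \end{pmatrix} $$

where γ grows toward 1 over time, driving any superposition toward |0⟩.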
Entanglement would be more problematic for a QRAM design. Currently, a complex resonator arrangement is needed to accomplish entanglement. A fully-connected lattice of qubits is a very daunting challenge. And the resonators could be prohibitively expensive for a memory that serves both quantum and classical modes.
Many more details would need to be worked out, but the basic concept seems sound enough to pursue at the theoretical and research level even if it is not practical for another five years or more.
Virtual memory?
Virtually all classical computers larger than embedded computers use virtual memory to keep the data and code for each process separate. This would seem to make sense for a universal quantum computer.
One side effect of classical virtual memory is that it provides each process with a linear address space with no gaps between pages of memory.
A feature of this linear address space is that the physical pages of memory that back that virtual address space do not need to be contiguous in physical memory.
Another feature of classical virtual memory is the ability to swap pages to mass storage so that more physical pages of memory are available for other processes. This is common when a process or significant portions of a process are idle and not being actively accessed. Any of those swapped pages can be reloaded if the process does try to access them.
Both of these two features could be problematic for qubits.
The linear virtual address space is fine until we get to entanglement. Entanglement involves a physical connection and typically close physical proximity. This means that a qubit at the end of one virtual page which wants to entangle with the next qubit in the virtual address space, a qubit which happens to sit on the next page, will not necessarily be physically adjacent to it in physical memory.
There could be some technical fix for that issue, but nonetheless it is an issue.
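A toy illustration of the problem, assuming a hypothetical page size of 256 qubits:

```python
# Virtually adjacent qubits need not be physically adjacent.
QUBITS_PER_PAGE = 256
page_table = {0: 7, 1: 2}  # virtual page number -> physical page number

def physical_index(virtual_qubit: int) -> int:
    """Translate a virtual qubit index to its physical location."""
    vpage, offset = divmod(virtual_qubit, QUBITS_PER_PAGE)
    return page_table[vpage] * QUBITS_PER_PAGE + offset

# Virtual qubits 255 and 256 are neighbors in the virtual address space,
# but land on physical qubits 2047 and 512, nowhere near each other.
print(physical_index(255), physical_index(256))
```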
A second issue for qubits is swapping, which is effectively the copying of the contents of a memory page to mass storage, but qubit state cannot be copied due to the no-cloning theorem.
I don’t think there could be any technical fix for swapping and reloading of qubit state, other than to disallow it and only support swapping of purely classical memory pages. But for many applications, disabling swapping would be a non-issue, especially if it is a performance-sensitive application which is constantly active anyway.
Balancing and exploiting parallel and sequential operation
There is an eternal tension between the respective benefits of parallel operation and sequential operation.
Sequential operation (sequential execution) is the hallmark of classical computing and classical Turing machines. One step after the other, coupled with branching, looping, and function calls.
The great hope of quantum computing is inherent parallel operation for maximum performance.
How do you combine or blend these divergent approaches to computation?
In truth, even classical computing is actually based on hardware which is inherently parallel. Each AND, OR, and NOT gate and flip flop is operating completely in parallel and it actually takes special logic to force digital hardware to behave in an apparently sequential manner. This is done with clocks, timing circuits, and simple AND operations so that the function of a subcircuit only becomes final when it is ANDed with a signal, pulse, or electrical strobe to cause synchronization. The logic gates within a flip flop are continuously recomputing their outputs based on their inputs, until the synchronization pulse causes a subtle change which causes the desired state change.
A simple way to put it is that hardware tends to be inherently parallel unless you go to great effort to make it sequential, while software tends to be inherently sequential unless you go to great effort to make it parallel.
Part of the reason for this asymmetry is that software more closely parallels human information processing, where there are some things we can do in parallel, but a lot of our more ambitious efforts require a large amount of sequential processing, one step after another.
It is not uncommon for classical computers to have multiple processors, multiple CPU cores, and multiple processing elements within a single CPU core. Lots of parallelism even for a classical computer.
Some classical computers even support instructions which operate on multiple data values in parallel. This was popular with traditional supercomputers, although it is now more popular to have multiple or even massive numbers of processing units instead.
Synchronizing parallel processing units is a significant challenge, largely because software is inherently sequential by its very nature.
So, even with classical computers we have this ongoing tension between parallel and sequential operation.
Quantum computing is an opportunity to refocus attention on maximizing parallelism, but there will still be needs for sequential processing as well. After all, the concept of time evolution of quantum systems is fundamental to quantum mechanics and is inherently sequential. That is only part of quantum mechanics, but is a key part.
In any case, the design of universal quantum computers and universal quantum Turing machines needs to take both the opportunities for parallelism and the inherent tension between parallelism and sequential operation into account, exploiting both for maximum function and maximum performance.
Data types for quantum computation?
The classical computing features of a universal quantum computer would have a full complement of data types (integers, reals, characters, strings, booleans, bytes, bits), but would the quantum operations support only individual qubits? It would seem so, but this matter bears further consideration.
At this stage I don’t imagine n-bit integers, n-bit floating point, or byte-oriented string operations on raw qubits. I don’t rule out the possibility, but it’s too novel to have obvious implications.
For now, all the non-binary data types will remain focused on classical computing operations.
The one exception is preparation and measurement of qubits, where a set of adjacent qubits could be initialized or read as a sequence of bits, either as a bit string or the bits of an integer — or any other data type. But even then, the interpretation of the sequence of bits is strictly on the classical side of the operation — quantum operations know nothing about the sequence, just the raw qubits.
Can we simply graft a QPU onto a classical CPU?
From a hardware perspective, what would a universal quantum computer look like?
Would it simply be little more than a quantum processing unit (QPU) merged with a classical central processing unit (CPU)?
Would it basically be two distinct processors which just happen to be housed on the same chip? Grafted, so to speak.
Technically, it could be implemented this way, at least at first, but we should hope for a lot better integration than that.
In the relatively near term, this route may have some significant potential and possibly be an incremental stepping-stone, even if the evolution is ultimately to a much more integrated solution.
Hybrid mode of operation as a stopgap measure?
It may take more than a few years to realize something even vaguely similar to the kind of architecture discussed in this paper. What do we do in the meantime?
Sticking with the hybrid mode of operation seems to be the most sensible route for at least the next few years. This means that the quantum processor is clearly separated from the classical processor, with each doing its own parts of any larger computation, with a tedious process for moving data between the two.
They are two distinct computer systems. Two distinct Turing machines.
It can and does work, but it is a lot more effort on the part of the programmer and algorithm designer.
This is better than nothing if an integrated universal quantum computer is not available or feasible, but it’s more of a temporary stopgap measure and a stepping stone on the longer path to a true universal quantum computer.
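To give a flavor of the tedium, here is a toy Python sketch of the hybrid mode, with run_quantum_subroutine standing in for the whole round trip of serializing a circuit, shipping it to a physically separate quantum processor, and waiting for measurement counts; everything here is hypothetical:

```python
import random

def run_quantum_subroutine(params, shots=1000):
    """Stand-in for a slow round trip to a physically separate QPU."""
    p_one = min(max(params[0], 0.0), 1.0)  # fake: probability of measuring |1>
    return sum(random.random() < p_one for _ in range(shots)) / shots

# Classical outer loop: nudge the circuit parameter until the observed
# frequency of |1> hits a target. Each iteration is one full round trip.
params, target = [0.2], 0.7
for step in range(25):
    observed = run_quantum_subroutine(params)
    params[0] += 0.5 * (target - observed)  # crude, gradient-free update

print(round(params[0], 2))  # settles near 0.7
```

Every iteration pays the full cost of moving data between the two machines, which is exactly why this is a stopgap rather than the destination.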
Is a quantum coprocessor good enough?
We also have the model of a floating-point coprocessor or a graphics processing unit (GPU) to at least loosely combine a QPU and CPU.
This model can and does work, but should be seen as more of an intermediate and transient stepping stone than an ultimate alternative to full integration of classical and quantum computing.
The downside is that we’re still keeping the two worlds apart rather than looking for ways for both to leverage the capabilities of the other.
In other words, a coprocessor arrangement is more of a sideshow than the big show.
Poor man’s universal quantum computer
That’s what I’d call the combination of a classical computer with a quantum computer as a coprocessor — a poor man’s universal quantum computer.
It’s more of an approximation of a true, integrated universal quantum computer.
A quantum coprocessor is certainly better than nothing, but only a stepping stone on a longer path to a true universal quantum computer.
The hybrid mode of operation is also a poor man’s universal quantum computer.
Direct integration of classical and quantum operations
Directly integrating classical and quantum operations is a new and unexplored area, but there’s some real promise.
It should be possible to directly measure the binary values of one or more adjacent qubits as if they were bits in memory. This would indeed collapse their full quantum state, but at least the process would be as fast as possible — a very low-latency operation rather than a much slower communication between two disparate processors.
It should be possible to directly prepare the initial values of one or more adjacent qubits as if they were adjacent bits in memory.
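In code, the classical side of such an operation is trivial; the hypothetical prepare_qubits and measure_qubits primitives sketched below are the part that requires the low-latency hardware integration:

```python
from typing import List

def int_to_bits(value: int, width: int) -> List[int]:
    """Classical-side interpretation: an integer becomes a bit sequence."""
    return [(value >> i) & 1 for i in range(width)]  # little-endian

def bits_to_int(bits: List[int]) -> int:
    """And a measured bit sequence becomes an integer again."""
    return sum(bit << i for i, bit in enumerate(bits))

# Hypothetical hardware primitives a universal quantum computer might offer:
#   prepare_qubits(base_address, int_to_bits(42, 8))  # set 8 adjacent qubits to |0>/|1>
#   bits = measure_qubits(base_address, 8)            # collapse and read 8 adjacent qubits
assert bits_to_int(int_to_bits(42, 8)) == 42
```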
Quantum registers
It might also be advantageous to have a relatively small number of qubits which are organized as if they were classical registers, so they can be quickly accessed as classical registers are, without the overhead of sending addresses and data over a bus and memory controller.
Quantum registers could be organized both as individual numbered qubits, such as 16 one-qubit registers, or as groups comparable to n-bit integers, such as 16 64-bit registers. Or, maybe 1024 qubits could be addressed as either 1024 individual one-qubit registers, 128 eight-qubit registers, 64 16-qubit registers, 32 32-qubit registers, or 16 64-qubit registers — or any combination of those multiples — using classical operations on single bits, 8-bit bytes, 16-bit words (integer), 32-bit words, or 64-bit words. And maybe eventually even 128-bit words.
What the upper bound for these non-memory registers should be is completely up in the air. It could range from 64 to 128, 256, 512, 1024, 2048, or even 4096. More is possible, but the primary constraint is to be able to very rapidly address and access these registers, even faster than larger-scale memory which requires a bus and memory controller.
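The addressing arithmetic for such overlapping register views is straightforward; a sketch, assuming the 1024-qubit bank mentioned above:

```python
# View one bank of 1024 qubits as registers of assorted widths.
TOTAL_QUBITS = 1024

def register_slice(width: int, index: int) -> range:
    """Qubit indices backing register `index` of the given width."""
    count = TOTAL_QUBITS // width
    if not 0 <= index < count:
        raise IndexError(f"only {count} registers of width {width}")
    return range(index * width, (index + 1) * width)

print(register_slice(64, 15))  # range(960, 1024): the last of 16 64-qubit registers
print(register_slice(8, 127))  # range(1016, 1024): the last of 128 8-qubit registers
```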
One thing to be careful about with quantum registers is that a classical computer requires process state or thread state to be saved whenever execution switches to another processor thread, and to restore state when the process or thread resumes execution. Process and thread state includes registers. But quantum state cannot be saved and restored due to the no-cloning theorem. The simplest solution is to support multiple banks of registers, so that a process or thread is assigned a bank and keeps it for the duration of its execution. This may limit the number of processes or threads which can perform quantum operations, but that’s life in the quantum world.
There are plenty of other possibilities for integration. These are intended simply to give a flavor and to suggest a minimum level of integration.
So far, I haven’t suggested qubit operations which operate on qubits as a group in a quantum sense. I’m not sure what that would even mean, other than performing the same quantum operation on each qubit in parallel, or the two special cases of preparing and measuring n qubits in parallel.
What size integer is most appropriate for a universal quantum computer?
So far, the quantum operations envisioned for a universal quantum computer don’t have any multi-qubit data types comparable to integers, reals, bytes, characters, and strings, but the classical side of the house would presumably have a full complement of operations for these data types. This does still beg the question of how big an integer should be on a universal quantum computer.
The initial motivation for a quantum computer is to handle large, computationally-intensive problems, so 32-bit integers and floating point would seem rather lame as the starting point.
64-bit integers and floating point would seem to be a reasonable starting point.
But if there is one lesson which we’ve constantly had to relearn in the past 80-year history of digital computers, it’s that the technology and demand move a lot faster than we expect. 64 bits may seem like more than enough, especially since so many applications still run fine on what are basically 32-bit architectures today.
It’s at least worth contemplating 128-bit integers five to ten years from now on a universal quantum computer. Even if a lot of classical math would still be fine with even 32-bit integers, integers are commonly used as blocks of bits, and it seems like preparing and measuring a significant number of qubits in parallel would be very advantageous.
64-bit, 32-bit, and even 16-bit integers can still be supported and handled more efficiently (or at least equally efficiently) anyway.
Integer size surfaces in multiple ways in a computer:
- Size of individual integer data values. A single number.
- Most efficient way to handle a bunch of bits. As a bundle of bits.
- Size of registers. Each holding a single integer value.
- Complexity of hardware for basic math operations. Add, subtract, multiply, and divide.
- Size of a fixed-width or typical machine language instruction.
- Size of a memory address.
- Width of the data path to move data between the CPU and memory. For both data values and addresses.
- Implies a size for basic floating point operations. Single-precision.
- Number of bits which can be manipulated in parallel in a single operation. AND, OR, NOT, and XOR.
- The number of memory chips needed. Since each chip traditionally supplies only a single bit, n-bit values imply n chips operating in parallel.
Some of those factors would by themselves argue for larger integers or possibly against larger integers. A wider data path requires more connections, more chip real estate and more power.
Even 64-bit integers might seem like overkill, and 128-bit integers a wild extravagance, in a world where 50 qubits is considered a large machine, but the ability to prepare and measure 128 qubits at a time seems quite appropriate as a target for machines to be widely used in ten to fifteen years.
128 bits may seem completely unnecessary for memory addresses, especially since even 64-bit memory addresses are well beyond the memory capacity of current classical computers, but there may be some more exotic potential, with sparse address spaces and mapping portions of address spaces to other processors, for example. I’ll leave this as an open issue, but the real point of this section is that it may be best if there is a symmetry between integers and memory addresses.
What size floating point operations make sense for a universal quantum computer?
Classical double-precision floating point (64-bit) seems like a good starting point for a next-generation computer. Or possibly even extended precision (80-bit).
Support for classical single-precision floating point would seem warranted only if it can be implemented to perform much faster, and for applications where the storage size is especially significant.
There is an IEEE quadruple-precision floating point (128 bits) standard, which would seem more appropriate if not ideal for a machine focused on 128-bit integers and 128-bit registers and data paths.
Classical real numbers are not relevant to quantum operations, at present. Maybe that will change or evolve as qubit count rises.
What precision is needed for simulation of physics?
Most practical uses of computers for human-level real-world applications don’t need a lot of precision. One hundred trillion requires 14 digits. A nanosecond or nanometer requires 9 digits. A picosecond requires 12 digits. Combining the two worst cases gives 26 digits. A quadruple-precision floating-point number offers at least 33 digits, sufficient for that worst case — for practical applications. And it is atypical to require both such a large size and such a small size in the same calculation or data values.
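As a quick check of that arithmetic:

$$ 14 + 12 = 26 \ \text{digits needed}, \qquad \lfloor 113 \cdot \log_{10} 2 \rfloor = 34 \ \text{digits available} $$

since IEEE binary128 carries a 113-bit significand, good for roughly 34 (at least 33 guaranteed) significant decimal digits.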
That’s for practical, human-level applications. What about physics?
We have the large scale of the size of the universe, the number of stars, the number of atoms in the universe, and the total mass and energy of the universe.
And we have the small scale of subatomic particle size, mass, charge, and energy.
I’ll leave it to the physicists to inform us of what precision they need.
Oh, and some constants such as pi and e are irrational and have an infinite number of digits, so physicists will have to make a judgment call on how many digits of infinity they feel are necessary.
Planck constant
One important factor in what ultimate precision simulation of physics might require is the Planck constant, with a nominal value of approximately 6.62606 times 10 to the minus 34 joule-seconds.
There is also the reduced Planck constant which is the Planck constant divided by 2 times pi, with a nominal value of approximately 1.05457 times 10 to the minus 34.
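In symbols:

$$ \hbar = \frac{h}{2\pi} \approx \frac{6.62606 \times 10^{-34}\ \text{J·s}}{6.28319} \approx 1.05457 \times 10^{-34}\ \text{J·s} $$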
It’s not clear how many digits are required to accurately represent the Planck constant.
And since pi is irrational with an infinite number of digits, one or both of these two constants has an infinite number of digits, so once again the physicists will have to render a judgment call as to how many digits of precision they require.
There is also Planck time which is approximately 5.39 times 10 to the minus 44 seconds, and the Planck length which is approximately 1.616 times 10 to the minus 35 meters. And there are some other Planck units as well, including mass and charge.
A fundamental question is whether Planck units should have some special, central treatment by a universal quantum computer, rather than being odd-looking binary-decimal floating-point numbers.
A second fundamental question is whether exact precision of macroscopic physical measurements in terms of Planck units is required.
Or, whether approximations are good enough for physicists.
In any case, physicists need to be explicit as to what precision they require for their simulations of physics.
Simulation of chemistry
Is simulation of chemistry an easier problem than simulation of physics or a much more difficult problem?
It probably depends on the type of simulation and the complexity of the molecules being simulated.
Protein folding? Biological impact of drugs? They’re in this space as well, although there is no clarity as to whether or when quantum computing will fully simulate such systems.
So the question is whether the basic Deutsch operations plus classical computing operations are really sufficient to effectively simulate chemistry.
Simulation of cosmology
Cosmology is technically a subfield of physics, but the scale is rather different from the rest of physics.
There is no clarity as to whether the Deutsch operations of a quantum computer necessarily offer much to efforts to simulate cosmology.
In any case, the computational needs of cosmology simulation need to be addressed by a universal quantum computer.
Exact simulation versus approximation
Classically, simulation has been an attempt to approximate real systems.
More powerful computers have meant more accurate simulations, but they still remain approximations.
Unfortunately, the amount of computing power to achieve each increment of additional accuracy keeps rising at a faster rate than the improvement in computer technology.
In theory, quantum computing should provide much greater performance.
And less raw computing power should be required for each increment of increased accuracy, compared to the incremental power needed on a classical computer, especially for problems whose classical solutions have exponential cost growth while quantum solutions have polynomial cost growth.
The advent of quantum simulation is supposed to radically change this picture: since the quantum operations are supposed to be the same as those which occur in physics itself, a simulation at the quantum level should be an exact simulation, albeit in a probabilistic sense, rather than a much rougher approximation.
How much simulation can be done at the purely quantum level, especially for larger and more complex systems, remains to be seen, but we at least have a focus and goal to aim for as we design and build universal quantum computers integrating classical and quantum operations.
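A worked example of why exact classical simulation loses: a direct classical simulation of n qubits must track 2^n complex amplitudes. At 16 bytes per double-precision complex amplitude, a mere 50 qubits require

$$ 2^{50} \times 16\ \text{bytes} \approx 1.8 \times 10^{16}\ \text{bytes} \approx 18\ \text{petabytes} $$

while the quantum hardware represents the same state with 50 physical qubits.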
Support for character, string, and text operations
Individual characters are relatively trivial to process since they are essentially simply integers, and relatively small integers at that.
Ignoring, for the moment, the fact that quantum operations don’t apply to integers.
Strings of characters have been a longstanding source of problems for classical computers.
First, there is no standard format for strings.
A string is basically simply a linear array of character codes.
But how to handle length and termination has never been standardized.
Back when characters fit in 8-bit bytes, life was so much easier. But now, with multi-byte Unicode, life gets complicated. There is no single standardized format for encoding of string values; there are standardized formats (UTF-8 and UTF-16, among others), but no single standard.
Individual applications, programming languages, or subsystems make their own choices for which encoding formats to use.
Once again, this is ignoring, for the moment, the simple fact that quantum operations don’t apply to either individual character codes or strings of characters.
Text is simply an interpretation of strings of characters, recognizing words, numbers, names, punctuation, and symbols. Once again, there has never been a single standard. And, once again, quantum operations don’t apply to text.
That said, it seems only a matter of time before people actually do want to apply quantum operations, or quantum-like operations, to characters, strings, and text.
It is easy to say that we’ll cross that bridge when we come to it, but we should endeavor to at least strategize some options to have ready.
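One option to have ready, purely as an illustration (the encoding is a naive assumption, not a proposal): treat a character code as a computational basis state over a small qubit register.

```python
import numpy as np

def char_to_basis_state(ch, num_qubits=8):
    # Encode a character code as the computational basis state |code>
    code = ord(ch)
    if code >= 2 ** num_qubits:
        raise ValueError("character code does not fit in the register")
    state = np.zeros(2 ** num_qubits, dtype=complex)
    state[code] = 1.0
    return state

state = char_to_basis_state("A")       # 'A' is code 65
print(int(np.argmax(np.abs(state))))   # 65: measurement recovers the code
```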
Any impact from running a CPU at colder temperatures?
One issue with more tightly integrating a quantum processor and a classical processor is the question of how a much lower operating temperature might impact a classical CPU. Might it interfere? If so, significant effort might need to be expended to isolate the two temperature regimes.
Or, might much colder temperatures enable even higher performance classical processors?
Today’s classical processors embody design tradeoffs made for room-temperature operation with electronics which generates a lot of heat, so maybe a much colder operating temperature might open some interesting opportunities.
Current quantum hardware has no choice but to operate in a challenging and expensive physical environment. We don’t want to force classical computers to operate in that same environment, but if the opportunity arises, might it actually be a net positive?
Could classical logic circuits be made smaller and faster if designed to run near absolute zero?
What to do about coherence time
There will undoubtedly be improvements in quantum coherence time for qubits as newer machines get built, but the central question is whether or when qubits will have advanced to the stage where coherence is no longer a major concern and a blocker for many applications.
Maybe the only answer here is that much more research is needed, and that we must be careful not to design a machine which can’t be built or which fails to meet realistic expectations.
Quantum error correction (QEC) may or may not be required to compensate for quantum decoherence. More resilient qubits and better shielding may reduce or even eliminate the need for expensive QEC.
Multipartite entanglement?
Current and near-term quantum computers directly support only two-qubit entangling gates, so larger entangled states must be composed from pairwise operations. Some thought will have to be given to direct support for tripartite entanglement or even higher degrees of entanglement.
There are two independent questions:
- How difficult is it to implement multipartite entanglement?
- What applications require or are significantly advantaged by multipartite entanglement?
Part of the problem is that quantum computers and their applications are currently operating at such a small scale that potential applications are masked, so they are neither visible nor obvious.
I’m tempted to say bipartite is sufficient for the foreseeable future, but ten to fifteen years from now our successors might well criticize us for our lack of foresight.
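To make the pairwise composition concrete, here is a minimal NumPy sketch (no quantum SDK assumed) which builds a three-qubit GHZ state, a canonical tripartite entangled state, from one Hadamard and two pairwise CNOT gates:

```python
import numpy as np

# Single-qubit Hadamard, identity, and two-qubit CNOT as matrices
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Three qubits start in |000>
state = np.zeros(8)
state[0] = 1.0

# H on qubit 0, then CNOT(0 -> 1), then CNOT(1 -> 2)
state = np.kron(np.kron(H, I), I) @ state
state = np.kron(CNOT, I) @ state
state = np.kron(I, CNOT) @ state

# Result: (|000> + |111>) / sqrt(2), a tripartite GHZ state
print(np.round(state, 3))
```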
Mixing simulated and real quantum operations
There might be some significant advantages to having all three modes of computing, simultaneously, in the same machine, transparently:
- Classical computing.
- Quantum computing.
- Simulated quantum computing.
At a minimum, it should be possible to perform all mixed classical and quantum operations using a quantum simulator. This is not unlike older classical computers in which floating-point operations were simulated in software, while the machine language remained identical to that of a machine with real floating-point hardware. So everything worked, transparently.
Debugging is generally easier on a quantum simulator, and this transparent mixed mode would make it even more convenient.
Simulation can also be used to test out new and novel changes to the quantum operations.
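As an architectural sketch of that transparency (all names here are invented for illustration), the same instruction stream could be routed to real qubits or to a simulator, with nothing but a configuration choice distinguishing the two:

```python
from enum import Enum, auto

class Engine(Enum):
    HARDWARE = auto()
    SIMULATOR = auto()

class QuantumUnit:
    """Hypothetical execution unit: one instruction stream, two engines."""

    def __init__(self, engine=Engine.SIMULATOR):
        self.engine = engine

    def execute(self, instruction):
        # Same "machine language" either way; only the engine differs
        if self.engine is Engine.HARDWARE:
            raise NotImplementedError("no QPU attached in this sketch")
        print(f"simulating: {instruction}")

# Identical program text, transparently simulated
qpu = QuantumUnit(Engine.SIMULATOR)
qpu.execute("H q0")
qpu.execute("CNOT q0 q1")
```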
Cryptography
This section is simply a placeholder. A lot has been written about cryptography, the application of quantum computers to cryptography, and the potential of quantum computers to break existing cryptography.
I leave it as an open question as to whether additional operations are needed beyond the traditional classical operations and Deutsch’s further quantum operations.
Artificial intelligence?
Is artificial intelligence (AI) just another application needing high performance, or is it truly special needing fundamental computing power distinct from anything classical computing has to offer? Whether quantum computing, namely Deutsch’s eight further operations, is enough to fully meet the special needs of AI remains to be seen.
A classical Turing machine supplemented with Deutsch’s eight further operations may be sufficient for the basic features of so-called machine learning (ML), but ML falls far short of human-level intelligence.
What exactly is the computation power needed to sustain human intelligence? Nobody really knows, yet.
Penrose speculates in his book The Emperor’s New Mind that a classical Turing machine is insufficient to model human intelligence (consciousness) and that quantum mechanics is needed. This is a controversial proposition, but still worth considering. And if Deutsch’s eight further operations are sufficient to simulate quantum mechanics, this might have some promise in that direction.
Turing himself hypothesized a B-type u-machine (unorganized machine) which was intended to more closely model the human mind. The concept is referenced in the Evolutionary Turing in the Context of Evolutionary Machines paper by Burgin and Eberbach. This might be a good starting point for considering how to blend classical and quantum operations in a universal quantum machine.
Turing a, c, o, and u-machines
Although people commonly speak of the universal Turing machine, Turing had several different conceptions of machines:
- a-machine. “a” for automatic. His classic universal Turing machine. Given its original input it runs to completion and halts, having produced the results of the computation.
- c-machine. “c” for choice. It runs automatically as a pure a-machine until it reaches points at which an external operator (or another machine) has to make a choice which determines the next state. Technically, it could be a human operator (user), but more likely it is another machine, such as another Turing machine. This enables interactive computing, networked computing, and operating systems controlling multiple processes (virtual machines) which may interact through their choices. Whether or when it halts is less significant than with an a-machine.
- o-machine. “o” for oracle. Similar to a c-machine, but the choice is made by an entity which is not a Turing-style machine. This could be a human operator or user, or a machine such as an analog computer or a sensor.
- u-machine. “u” for unorganized. Intending to permit modeling of human intelligence and evolutionary computation. See the Evolutionary Turing in the Context of Evolutionary Machines paper by Burgin and Eberbach.
The classic a-machine has the issue of the halting problem since it is fully automatic. If you request a calculation, you expect an answer in a finite amount of time.
The c-machine and o-machine conceptually do not need to halt unless the external input indicates that computation should halt. An operating system, word processor, or network server are expected to run forever, or at least until an external operator requests a shutdown or some special condition arises.
The first two types of machine and possibly the third are mandatory to be functionally competitive with existing classical computers.
Virtual machines are a conception which Turing does not seem to have considered. The concept of processes, with each process appearing to have the state of an independent machine, and with the potential to communicate between processes, is standard fare on modern classical computers. Each of those processes runs as a c-machine or an o-machine. Background processes are c-machines. Interactive applications are a hybrid of a c-machine and an o-machine.
A network server process is a c-machine, but individual network requests function as a strict a-machine — the code is expected to halt and return a response to the network request within some relatively finite amount of time.
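A toy Python sketch of that layering (the request handling is a trivial stand-in): the outer loop behaves like a c-machine, running until an external choice stops it, while each request is handled a-machine-style, running to completion and halting with a result.

```python
def handle_request(payload):
    # a-machine: given its input, runs to completion and halts
    return payload.upper()   # trivial stand-in for real request handling

def server_loop(requests):
    # c-machine: runs until an external choice ("stop") halts it
    for req in requests:     # each request is an external "choice"
        if req == "stop":
            break
        yield handle_request(req)

print(list(server_loop(["hello", "world", "stop", "ignored"])))
# ['HELLO', 'WORLD']
```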
The fourth type, the u-machine, is thought to be needed to be competitive with human intelligence.
Somehow, it has become popular to speak only of a single universal Turing machine without being specific as to type. A pure, strict a-machine is not sufficient to support much of modern classical computing.
But as we contemplate formalization of a universal quantum Turing machine, the unique capabilities of the various types need to be made explicit.
Maybe Turing’s taxonomy, supplemented by a q-machine implementing Deutsch’s quantum operations, might be sufficient. Technically, that’s not correct — the whole point of Deutsch’s vision is to integrate his quantum operations into a Turing machine. That would give us an aq-machine, a cq-machine, an oq-machine, and a uq-machine. Or something like that. Details to be considered further.
Analog?
In the earlier days of classical computing there was still competition from analog computers. For some problems analog computers were sufficient, easier to set up, and even more accurate, but they were less general and not programmable, requiring very careful and tedious wiring or mechanical connection of components. Programmable digital computers quickly left them in the dust.
Quantum computing is a bit more like analog computing. In fact, the D-Wave quantum annealing system has much more the flavor of an analog computer than a digital computer, albeit with the ability to be dynamically configured (“programmed”) using software.
Digital is another name for discrete value, in contrast to continuous value or continuous variable (CV). There is some research in continuous-variable quantum computing, such as Xanadu using optical computing.
In any case, the point is that we should take this moment to consider what sorts of analog-like operations are suitable for a universal quantum computer.
With images, music, video, audio, and other forms of signal processing, and even signal generation, we are currently using digital computers to simulate the analog world, but we’re really only approximating the real-world signals. This approach seems to be working reasonably well, but that is always true for chosen niches. The question is whether there are other areas, or additional levels within the current niches, where we can and maybe should consider more analog-like operations to better facilitate our simulations of the analog world.
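For a trivial taste of the approximation involved, digital processing reduces a continuous signal to discrete samples; a coarse Python sketch (the sample rate is arbitrary):

```python
import math

# Eight samples per period: a coarse digital approximation of one
# period of a continuous sine wave; finer sampling costs more storage
sample_rate = 8
samples = [math.sin(2 * math.pi * k / sample_rate)
           for k in range(sample_rate)]
print([round(s, 3) for s in samples])
```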
Horrendous mountains of code
It is truly amazing how much we are able to accomplish with today’s classical computers.
The downside is the horrific complexity of the resulting code. Thousands of lines of code, tens of thousands of lines of code, hundreds of thousands of lines of code, millions of lines of code, tens of millions of lines of code. Where will it end? How much do we really know about what issues may lurk in that code?
And this raises the question of how many machine language instructions are generated for each line of source code.
Not to mention how many dozens, hundreds, or thousands of lines of code or machine language instructions are executed for even a single function call.
Not to mention the intellectual challenges of developing and maintaining a mental model of what’s going on in all of that code.
Can’t a decent quantum computer reduce the size of this monstrous mountain? If not down to the size of an ant hill or a mole hill, at least down to the size of a garden-variety hill?
To be fair, instructions and lines of code are not unlike atoms, molecules, and particles, and we don’t complain too much about the complexity of a grain of sand or salt, or a simple teaspoon, despite the truly immense number of atoms and molecules in even a single cubic millimeter of matter.
The real, fundamental issue of concern is all of the intermediate levels of structuring and complexity. With matter, there tends to be a reasonably clear lattice-like crystalline structure which is relatively easy to comprehend. And gases and liquids have a reasonably clear and consistent statistical structure.
Meanwhile so much of our software systems are like giant, chaotic flea markets, with very little apparent structure or order, even though there tends to be at least some degree of method to the madness within individual software components. Sometimes. Sort of.
To be fair, most biological systems (creatures) and DNA are quite complex and at least moderately chaotic, but the point is that our software systems seem to be far more disorganized and chaotic than your typical biological system.
It’s not clear if anything can be done about this, at least at the level of individual machine instructions, but there is at least some opportunity for machine architectures which encourage more rational, structured, and organized systems.
At a minimum, the problem warrants some intensive thought.
Database operations
Big Data is a big thing these days and likely to only get even more data-intensive.
There are two distinct forms of Big Data processing:
- Compute-intensive sequential scanning and sequential processing of an entire data stream.
- Query-intensive selective processing, using complex Boolean expressions and optimizing access using indices.
In both cases, graph-like relationships, such as the SQL JOIN operator, can cause the intensity of data and compute processing to rise combinatorially, which is not a good match for a classical computer.
Both of these forms of processing are handled reasonably well with current classical computing technology, but performance and capacity are issues which can always use more optimal solutions. In addition, some problems are simply too data or compute-intensive to even consider tackling with a classical computer.
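To make the two forms concrete, a toy Python sketch: a sequential scan touches every record, while a prebuilt index answers a selective query directly.

```python
# Toy contrast of the two forms of Big Data processing
records = [{"id": i, "value": i * i} for i in range(1_000_000)]

total = sum(r["value"] for r in records)   # scan: O(n) per pass
index = {r["id"]: r for r in records}      # built once, O(n) space
hit = index[123_456]                       # query: O(1) per lookup

print(total, hit["value"])
```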
There are some areas where a universal quantum computer could help:
- Faster processing.
- More efficient forms of indexing.
- More efficient forms of queries.
- More efficient forms of storing and organizing data.
The question is whether Deutsch’s operators are sufficient or whether even more dramatic capabilities are needed.
And a core issue is the horrendous mountains of code which are required to accomplish even relatively simple database operations under the hood, even when the user’s queries and operations seem rather simple. Maybe the total number of operations performed cannot be substantially reduced (although hopefully it can), but anything that adds some structure and order to the mountain would be beneficial.
Time-series data operations
Most database data is organized logically around identifiers such as names and symbols, and has very little sequential quality other than alphabetic and numeric sorting of those identifiers and symbols.
Time-series data is very different. It lacks any abstract, logical structure, and is instead ordered strictly chronologically, by time and date, in the sequence events occurred.
Once again, as with common database data, horrendous mountains of code are required to accomplish even relatively simple time series operations.
Anything that adds some structure and order to the mountain would be beneficial.
Software can tediously build and maintain sophisticated indexing and summarization structures so that analytic operations are relatively more efficient, but once again, this requires even more horrendous mountains of code.
Taylor series expansion
Most of the common mathematical functions, such as the trigonometric, exponential, and logarithmic functions, are commonly approximated using Taylor series expansion.
How to do such expansions efficiently is an ongoing technical challenge, which gets even more challenging with higher-precision floating point number formats such as 128-bit quadruple precision.
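For reference, a minimal Python sketch of the standard term-by-term expansion for exp(x); the number of terms retained directly determines the precision achieved:

```python
import math

def exp_taylor(x, terms=20):
    # exp(x) ~ sum of x**k / k! for k = 0 .. terms-1
    total, term = 0.0, 1.0
    for k in range(terms):
        total += term
        term *= x / (k + 1)   # next term: x**(k+1) / (k+1)!
    return total

print(exp_taylor(1.0))  # converges toward e = 2.718281828...
print(math.exp(1.0))    # library value for comparison
```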
Physicists will need to make explicitly clear how much precision they require for Taylor series expansion of functions.
People have simply grown accustomed to the performance intensity of such calculations, which helps to drive the demand for ever-more powerful computers.
It sure would be nice if a hybrid of classical and quantum computing could perform such calculations a lot more efficiently.
Challenges of algorithms
The architecture of a universal quantum computer will provide the basic building blocks with which algorithms and code are constructed.
There are many challenges confronting developers and algorithm designers for quantum computers, which are reviewed in my companion paper, The Greatest Challenges for Quantum Computing Are Hardware and Algorithms.
That paper was written before this paper on the presumption that algorithm developers would not be able to exploit a tightly integrated universal quantum computer.
With only a separate coprocessor or only the hybrid mode of operation, algorithms for quantum computing have many more challenges to face, as detailed in that paper.
Data modeling
Whether for basic data structures, databases, or time-series data, data modeling is a real challenge, and always has been for classical computing. Quantum computing simply makes it worse, or at least more challenging.
Trying to represent any significant degree of complex data in the simple two-state data model of qubits can be a monumental challenge.
The question is whether there are architectural features which can be considered for a universal quantum computer which could dramatically ease the burden of data modeling.
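As a trivial illustration of the burden (the record layout is hypothetical), even a naive basis-state encoding consumes one qubit per classical bit:

```python
def qubits_for_record(field_bits):
    # Naive basis-state encoding: one qubit per classical bit of data
    return sum(field_bits)

# Hypothetical record: 64-bit id, 32-bit timestamp, eight 8-bit characters
print(qubits_for_record([64, 32, 8 * 8]))   # 160 qubits for one record
```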
Achieving statistical stability from probabilistic computation
Classical computers are inherently deterministic while quantum computers are inherently probabilistic. Probabilities are great, for what they are, but ultimately we want solid answers.
The question is how to get solid answers from probabilistic data.
It is worth noting that all matter, including and especially solid objects, is based on the same probabilistic nature as the qubits of a quantum computer. The ground and floors under our feet, the roofs over our heads, the chairs we sit on, the tables we work on, and the pens, paper, computers, tools, and equipment we work with have all managed to achieve a remarkable level of stability despite the underlying probabilistic nature of the quantum mechanics which underlies… everything in existence.
In other words, statistical stability can be achieved using only probabilistic data as the input.
As with safety in numbers, we see stability in numbers.
From a simple practical perspective, this simply means we have to perform any given quantum computation more than once, some number of times, to get a sense of its statistical distribution and its stability or lack thereof.
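A toy sketch of that practice (the 0.3 probability is an arbitrary stand-in for a real circuit’s output distribution): more shots yield a more stable estimate.

```python
import random

def run_circuit_once(p_one=0.3):
    # Stand-in for a single shot of a probabilistic quantum measurement
    return 1 if random.random() < p_one else 0

for shots in (10, 100, 10_000):
    estimate = sum(run_circuit_once() for _ in range(shots)) / shots
    print(f"{shots:>6} shots: estimated P(1) = {estimate:.3f}")
```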
As stable as solid matter can be, we also have liquids, gases, plasmas, and combinations of states of matter.
And we also have earthquakes, fires, explosions, weather, climate, biological evolution, and decay over time, or changes in state and stability due to the introduction of external forces.
So, we can have stability, but it may not be guaranteed forever even when we do find it.
In any case, a universal quantum computer needs to help us with the full range from hard determinism and fuzzy probability to statistical stability and risks to stability.
Ultimately, which is the bigger challenge — coping with the probabilistic nature of quantum computation, or accepting the unreality and risks of the hard determinism of our classical computations?
PL/Q — Need for a programming language to match the full potential of a universal quantum computer
Computer programming languages have evolved in a rather chaotic manner over the past 80 years, never really fully and adequately capturing the full potential for expression of algorithms and code for the underlying machine. Never really facilitating much more than a subset of the true potential, varying the subset from one language to another.
Now more than ever, we need much more expressive programming languages, which capture and express the quantum operations and their integration with classical operations.
I’ll call it PL/Q, for programming language — quantum. Or maybe it should be PL/U for universal.
No such language exists, nor, to my knowledge, has one been proposed.
This is merely a placeholder for such a language.
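Purely to stir the imagination, here is what the flavor of such a language might be, rendered as runnable Python with invented stubs standing in for quantum primitives (none of these names are real APIs): classical control flow and quantum operations mixing freely.

```python
import random

# Invented stand-ins for quantum primitives; nothing here is a real API.
def qalloc(n):                 # allocate an n-qubit register
    return [0] * n

def hadamard(qubits):          # put every qubit into superposition
    return ["superposed"] * len(qubits)

def measure(qubits):           # collapse to classical bits (random stub)
    return [random.randint(0, 1) for _ in qubits]

# Classical control flow freely driving quantum steps, with measured
# results flowing back into ordinary classical arithmetic.
ones = 0
for trial in range(100):
    bits = measure(hadamard(qalloc(4)))
    ones += sum(bits)
print("average ones per 4-qubit trial:", ones / 100)
```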
Great barriers to architectural progress: qubit count, coherence time, and connectivity
Grand ideas and ideals for ultimate universal quantum computers are all for naught if we don’t have the hardware required to support the architecture of universal quantum Turing machines.
There are many challenges on the hardware front, but the three tallest peaks of the mountain range are:
- Greater qubit count. A handful or even a few dozen qubits are not even table stakes to get started. We need hundreds of qubits, with the promise of thousands just to get the juices flowing.
- Greater qubit coherence time. There is no clarity about what the threshold should be, but one ten-thousandth of a second is again not even enough for table stakes. Hours and minutes may be too much to ask for at this stage, but a healthier fraction of a second (quarter of a second? Tenth of a second?) is needed before we can get serious, with serious algorithms and serious hybrids of quantum and classical operations. This also includes error correction to assure coherence.
- Greater qubit connectivity. It must be possible to connect any qubit with any other qubit before we can enable wide-open algorithms. Current systems have so many restrictions and special rules that algorithms are seriously constrained — we need unconstrained algorithms.
None of those three hardware challenges is the focus of this paper, but until they are overcome, all three fatally preclude any real and substantial progress on the real meat of this paper.
There is further discussion of hardware issues in the companion paper, The Greatest Challenges for Quantum Computing Are Hardware and Algorithms.
What can a classical computer do?
A companion paper, Knowledge Needed to Deeply Comprehend Digital Computing, lists many of the specific capabilities that we have come to expect from a classical computer.
It provides a guide for the kinds of operations which need to be implemented on a universal quantum computer, from a classical computing perspective.
What should we expect a quantum computer to do?
A companion paper, What Knowledge Is Needed to Deeply Comprehend Quantum Computing? explores what topics may be relevant to understanding what to expect from a quantum computer.
It provides a guide for the kinds of operations which need to be implemented on a universal quantum computer, from a quantum computing perspective.
Criteria for judging progress
How do we judge progress towards a more ideal quantum computer?
I have outlined quite a number of criteria in my paper, Criteria for Judging Progress of the Development of Quantum Computing.
That paper was written before I began work on this paper, with its focus on integrating classical and quantum operations, so the section on levels and types of universal quantum computers in this paper should supplement that paper.
How soon before we see the first real universal quantum computers?
Who knows, we could see some early renditions of a basic, Level 1 / Type A universal quantum computer in five years. Or within 10 years. Or, it could take longer.
First we need progress on core qubit technology so that getting to 64, 128, or 256 qubits is no longer a primary focus of hardware development.
Once the hardware folks break through the wall on qubit count, qubit coherence time, and qubit connectivity, they will suddenly have some breathing room to consider how to pursue architectural issues such as integration with classical operations and how people might actually use these qubits.
But it may take another few years to formalize and evaluate the concept of a universal quantum Turing machine. Or maybe I should say a few years from when serious work commences, but we have no idea when the work might commence, even on a halfhearted scale.
So, five years seems very possible if people get really focused, but 7–10 years seems like more of a safer bet.
Maybe in five or six years we see a 1940s-style, room-sized universal quantum computer, and maybe in 7–10 years we see 1960s-style, rack-sized minicomputer-class universal quantum computers.
Of course all of this is very speculative. And optimistic!
No, a Noisy Intermediate-Scale Quantum (NISQ) computer is not a universal quantum computer
Preskill suggests both the promises and the pitfalls of a Noisy Intermediate-Scale Quantum (NISQ) computer in his Quantum Computing in the NISQ era and beyond.
On the plus side, intermediate-scale suggests having enough qubits to actually handle a real-world problem in terms of capacity.
On the minus side, noisy suggests that a NISQ computer is too error-prone to be reliable enough for many applications.
That said, the great hope there is that for many applications approximate solutions are both sufficient and all that is really expected, so a few errors here and there will not be fatal.
And since quantum computers run really fast, you can always run a program multiple times, or many times, to try to get the errors to wash out.
Sure, a moderate error rate will be fatal for financial transaction applications, or anywhere exact, deterministic answers are expected and required, but the probabilistic nature of quantum computing is not a fit for such applications anyway.
Even so, it doesn’t feel appropriate to nominate a noisy technology for a universal quantum computer.
But the real fatal nail in the coffin is the fact that NISQ computers won’t satisfy Deutsch’s vision of integration of classical and quantum computing in the same universal quantum Turing machine.
So, a NISQ computer will still be just a basic quantum computer. As Preskill puts it, even a decent 100-qubit quantum computer will merely be “a significant step toward the more powerful quantum technologies of the future.”
To be clear, Preskill does not discuss Deutsch’s vision or the concept of a universal quantum Turing machine.
The bottom line is that NISQ itself and alone doesn’t promise more function than is found in current quantum computers. Mostly it’s about more capacity — more qubits.
Simulate a universal quantum computer
Given the sluggish pace of hardware advances for quantum computers and the continued improvement of classical computers, it would seem both desirable and practical to attempt to simulate a universal quantum computer even if the hardware for a full-blown Level 2 / Type B universal quantum computer remains years away.
In fact, a simulator for a universal quantum computer might come close to achieving the function and capacity of a Level 1 / Type A universal quantum computer — it demonstrates all of the function and operations, and even some reasonable degree of capacity, even if the performance is only minimal or marginal.
At least it would permit testing of all or at least many of the concepts discussed in this paper.
My definition
In my own Glossary for Quantum Computing I focus on defining universal quantum computation and then define universal quantum Turing machine and universal quantum computer in comparable terms. Here are those three definitions excerpted from the glossary:
- universal quantum computation. Subject to practical capacity limitations, the ability of a quantum computer to do anything a classical computer can do, in addition to anything any quantum computer can do beyond the capabilities of a classical computer, in particular, supporting a universal quantum logic gate set. It must be able to compute all mathematically computable functions as well as be able to simulate any real physical quantum system, subject to practical capacity limits. By definition, this includes all classical computation, including any function which is computable by a classical Turing machine. In theory, universal quantum computation is any computation possible by a universal quantum Turing machine. Even if a quantum computer is universal, it may not be able to practically do everything which a classical computer can do — it may have too few qubits, too short a coherence time, or lack sufficient quantum error correction. See the classic Quantum theory, the Church-Turing principle and the universal quantum computer paper by David Deutsch. Alternatively, in practice, the ability to compute the lion’s share of computations which have practical application in the real world. Alternatively, the computations possible using a universal quantum logic gate set, in contrast to the limited set of functions supported by a special-purpose or fixed-function quantum computer, and in contrast to the full functions of a classical Turing machine.
- universal quantum computer. A quantum computer capable of universal quantum computation. A quantum computer which can be applied to all classes of problems and capable of computing whatever a classical computer can compute, in addition to anything a quantum computer can compute, including simulation of real physical quantum systems, in contrast to a fixed-function quantum computer which is tailored to only one relatively narrow niche class of problems. Note that universal refers to each of the discrete operations which can be performed, not necessarily the amount of data which can be processed or its structure. Alternatively, a quantum computer which supports a universal quantum logic gate set, in contrast to a special-purpose or fixed-function quantum computer, and in contrast to a computer supporting the full functions of a classical Turing machine. Alternatively, a vague marketing term asserting that a quantum computer or some purported future quantum computer is somehow all-powerful, or at least much more impressive than current and envisioned classical computers. Alternatively, again as a vague marketing term, simply a crude synonym for a general purpose quantum computer, in contrast to a fixed-function quantum computer. Note that a universal quantum computer may not have the resources and capabilities to make it suitable as a general purpose computer.
- universal quantum Turing machine. A hypothetical equivalent of the concept of a universal Turing machine of classical computing applied to quantum computing. This is the ultimate, ideal conception of a universal quantum computer, in theory even if not in practice. Such a conception has been neither formalized in theory nor realized in the lab in any current quantum computer or near-term quantum computer as of July 2018. See the Wikipedia Quantum Turing machine article. See the Quantum Turing Machines Computations and Measurements paper by Guerrini, Martini, and Masini. See the classic Quantum theory, the Church-Turing principle and the universal quantum computer paper by David Deutsch. Shortened as QTM. See also: quantum Turing machine.
I was tempted to push the full definition into universal quantum Turing machine, but since it may be some time before that concept is fully formalized, a focus on universal quantum computation seemed more appropriate. And a universal quantum computer is simply a machine which supports universal quantum computation.
Note: I did not include artificial intelligence as a special type of application. The jury is out on that. For now, I presume that the combination of classical and quantum operations is fully sufficient to model all aspects of human-level intelligence, but this presumption may need to be revisited eventually as AI research advances sluggishly from its current, primitive state.
What’s next?
Unfortunately, the hardware guys will continue to be laser-focused on simply ramping up qubit count for the next few years, maybe five, before anybody really notices that their machines are completely missing the notion of a Turing machine.
The coprocessor approach and the hybrid mode of operation will likely be appealing intermediate stepping-stones on the path towards full integration into a truly universal quantum computer.
The immediate next steps are for researchers and hardware engineers to perfect qubit implementations so we can break through the walls on the qubit count, qubit coherence time, and qubit connectivity fronts. Without dramatic advances on all three fronts, any talk of grander schemes for universal quantum computers is a glorious waste of energy.
Meanwhile, it would be good to start considering how to formalize a universal quantum Turing machine, especially how to integrate the disparate quantum and classical worlds.
And I would definitely like to see people start envisioning what a PL/Q programming language for a universal quantum computer might look like.