Quantum computers are not yet ready for prime time, but this informal paper summarizes some of the categories of applications which will likely be appropriate for quantum computers in the coming years. Most generally, any application which is too computationally intensive for a classical computer is likely a good candidate for a quantum computer. Granted, it’s a little more complicated than that, since some applications are too complex even for a quantum computer, or are structured in a way which is not readily transformed into a form which a quantum computer can process.
A simpler formulation would be applications for which we understand how to structure code on a classical computer but for which the computational requirements — time, memory, and parallel processing — are much too intensive for even readily available supercomputers.
In addition, there are applications for which we have no ready methods to structure on even the most powerful supercomputers, but which can be structured in terms of the raw physics operations supported by quantum computers. Simulating quantum mechanics and chemical reactions are two such applications.
The goal here is not to precisely characterize what makes an application suitable for a quantum computer, but to summarize categories of applications which vendors and other proponents of quantum computers are touting as the more fertile applications for quantum computing.
Whether these categories turn out to be accurate over the longer term remains to be seen.
And in the short to medium term, actual quantum computers may turn out to be too limited to fully support the claimed applications. The basic idea is that eventually we will see sufficiently powerful quantum computers capable of supporting these applications.
The short summary of applications includes:
- Optimization, planning, and logistics
- Financial modeling
- Drug design and discovery
- Cybersecurity and cryptography
- Molecular modeling
- Chemistry modeling, computational chemistry
- Material design and modeling
- Aerospace physics
- Quantum simulation — simulation of physical systems at the quantum mechanical level
- Random number generation
The rest of this paper will list applications as they are indicated from a variety of sources, including:
- BBVA (Banco Bilbao Vizcaya Argentaria, S.A — Spain)
- Department of Energy (DOE)
- D-Wave Systems
- National Science and Technology Council (NSTC)
- National Strategic Overview for Quantum Information Science
- Quantum Circuits, Inc. (QCI)
- Rigetti Computing
- Singularity Hub
- Zapata Computing
Note: This paper does not cover the broader field of quantum information science (QIS), which includes quantum communication and quantum networking in addition to quantum computing — only the latter is the focus of this paper.
The information presented in this informal paper is not intended to be absolutely exhaustive, but it should be reasonably comprehensive. The information will be updated as the field evolves and progresses.
But before we dive directly into the applications, some preliminary issues must be discussed:
- Big Data
- Partitioning large datasets
- Quantum computer as a coprocessor
- Complex calculations
- Text processing
- Image and media processing
- Unstructured data
- Application structure to exploit quantum computing
- Need for a compelling quantum advantage
- Lack of clarity and specificity for net effect on performance
- Presumption of applicability vs. reality
- ENIAC moment for quantum computing
- FORTRAN moment for quantum computing
Big Data
Sorry, but despite the hype, quantum computing is not appropriate for directly processing large quantities of data — so-called Big Data. Rather, classical software must preprocess data and spoon-feed it to a quantum computer in very small and manageable chunks.
In particular, a quantum computer has no ability to directly access data itself:
- No equivalent of a classical computer terminal to read and display data.
- No ability to read and write text or data files.
- No ability to access a database to read, write, update, or query data.
- No ability to access data over a network.
- No ability to send or receive data from web services over a network.
- No real-time sensor access.
Instead, classical software must be used to preprocess and postprocess any data required by or produced by a quantum program.
Any data needed by the quantum program must be embedded within the program itself.
How much data can a quantum computer handle? No more than the number of qubits. N qubits means no more than N classical bits of input data and no more than N classical bits of result data.
Sure, a quantum computer with N qubits can work with 2^N quantum states, but that’s within the quantum program, separate from the maximum of N input and N result classical bits of data.
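To make the N-versus-2^N distinction concrete, here is a tiny illustrative Python sketch (not a real quantum simulator; the numbers and the sampled `measured` string are purely illustrative):

```python
import random

# Illustrative sketch only (not a real quantum simulator): with N qubits,
# the internal quantum state spans 2^N amplitudes, but only N classical
# bits can be loaded in, and a single measurement yields only N bits out.
N = 8

num_amplitudes = 2 ** N   # size of the internal quantum state
num_input_bits = N        # classical input capacity
num_result_bits = N       # classical result capacity per measurement

print(f"{N} qubits -> {num_amplitudes} quantum amplitudes internally,")
print(f"but only {num_input_bits} classical bits in and {num_result_bits} bits out")

# A measurement is like sampling a single N-bit string:
measured = "".join(random.choice("01") for _ in range(N))
print(f"example measurement: {measured} ({len(measured)} bits, not {num_amplitudes})")
```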
Partitioning large datasets
Since quantum computers generally cannot handle Big Data directly, it is necessary to use classical software to break (partition) large problems and large datasets into smaller problems and smaller datasets, each of which can be handled by a sufficiently powerful quantum computer.
Partitioning is not a panacea. The quantum computer will not see all of the data at the same time, so global optimization across the full dataset will not be possible — solutions can only be optimized locally for the limited partition of data which is currently being processed. Classical software can then be used to attempt to piece or stitch the partitioned results into a single global result, but the final results may or may not be a reasonable approximation of the optimal results which the quantum computer could have produced if it had all of the input data at the same time.
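The partition-and-stitch pattern can be sketched in classical terms. The following Python sketch is purely illustrative: `solve_on_quantum_coprocessor` is a hypothetical stand-in for a real quantum call, and here it simply returns a locally optimal value per chunk:

```python
# Hypothetical sketch of the partition-and-stitch pattern described above.
QUBIT_BUDGET = 4  # pretend the quantum computer can only encode 4 items at a time

def solve_on_quantum_coprocessor(chunk):
    """Stand-in for a quantum subroutine: locally optimal answer for one chunk."""
    return min(chunk)  # e.g., lowest-cost candidate within this partition

def partition(data, size):
    """Classical code breaks the large dataset into qubit-sized chunks."""
    for i in range(0, len(data), size):
        yield data[i:i + size]

dataset = [17, 3, 42, 8, 25, 1, 99, 12, 6]

# Classical code partitions the data and invokes the "quantum" solver per chunk...
local_results = [solve_on_quantum_coprocessor(c)
                 for c in partition(dataset, QUBIT_BUDGET)]

# ...then stitches the local results into a single global result.
global_result = min(local_results)
print(local_results, "->", global_result)
```

Stitching local minima happens to recover the true global minimum in this toy case, but for harder optimization problems the stitched result may be only an approximation of the global optimum, exactly as described above.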
Despite the hype, centered on Grover’s algorithm, quantum computers are particularly ill-suited for large-scale search applications, such as an internet search engine or querying of databases. In particular, this is due to the fact that a quantum computer has no access to external or classical data, having direct access to only the relatively few bits of information which can be represented in the limited qubits of a quantum computer.
The fact that N qubits can represent 2^N quantum states doesn’t get around the fact that there is no practical or even conceptual method to load a large classical database into even a large quantum computer. Not now. Not soon. In fact, not ever.
Besides, modern search engines and distributed databases running on large networks of classical computers already have very sophisticated indexing algorithms to find keywords or data records very quickly. Moreover, even Grover’s algorithm offers only a sqrt(N) (quadratic) speedup, not the kind of exponential speedup quantum computing promises in many application areas.
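A little arithmetic makes the quadratic (rather than exponential) nature of Grover's speedup clear. This sketch uses the standard rough estimates of about N/2 classical queries on average versus about (pi/4)·sqrt(N) Grover oracle calls:

```python
import math

# Rough query-count comparison for unstructured search over N items:
# classical search needs ~N/2 queries on average, while Grover's algorithm
# needs ~(pi/4) * sqrt(N) oracle calls -- a quadratic, not exponential, speedup.
for N in (10**6, 10**9, 10**12):
    classical = N / 2
    grover = (math.pi / 4) * math.sqrt(N)
    print(f"N={N:>15,}: classical ~{classical:,.0f} queries, Grover ~{grover:,.0f}")
```

Even for a trillion items, the gain is a factor of sqrt(N), which is substantial but nothing like the exponential gains touted for other quantum applications.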
There may be some very specialized forms of search, such as images and media, where quantum computing may facilitate advanced search methods, but even there, the very limited classical data capacity of a relatively small number of qubits precludes any real quantum advantage.
There may also be some very specialized smaller portions of the overall search problem which can be processed on a quantum computer with a modest number of qubits — using the quantum computer as a coprocessor as described in the following section.
Quantum computer as a coprocessor
A quantum computer is generally not a full, standalone, general purpose computer in the same sense as any classical computer. Rather, a quantum computer is more of a coprocessor: classical software must perform any data retrieval, preprocessing and data preparation, and invocation of the actual quantum program, and then post-processing and storage of the results of the quantum program must also be performed by classical software.
A quantum program operates more as a subroutine or function call within a larger classical program.
Complex calculations
Despite the focus on massive parallelism, quantum computers are ill-suited for very complex calculations — and better suited for large numbers of relatively simple calculations.
The trick for quantum parallelism is that a relatively simple calculation, called an oracle function, is executed once, in parallel, for all possible integer values in the range 0 to 2^N - 1, where N is the number of qubits needed to represent the range of values to be evaluated.
Complex calculations fail on a quantum computer because quantum computers can execute only a relatively small number of operations, on the order of dozens to hundreds of so-called quantum logic gates, before the quantum state of the qubits begins to decay. Quantum logic gates are more like the instructions (operations) of a classical computer than the hardware logic gates familiar to digital electrical engineers. Quantum computers work well when the quantum program consists of a relatively small number of quantum logic gates, coupled with quantum parallelism to apply the logic to a wide range of possible values.
The trick for complex calculations is to factor them so that the quantum computer is used to quickly reduce the full range of possible values to a much more limited range which can then be evaluated at a much more leisurely pace on a classical computer.
In short, do quantum parallelism on the quantum computer but complex calculations on a classical computer.
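This quantum-filter-then-classical-refine pattern can be sketched as follows. Everything here is classical Python for illustration only: the `oracle` function plays the role of the simple test a quantum computer could apply in parallel across the whole range 0 to 2^N - 1, and the hypothetical `expensive_classical_evaluation` represents the complex calculation left to the classical computer:

```python
# Sketch of the hybrid pattern described above (all classical, for illustration):
# a cheap "oracle" test narrows the full range to a few candidates, and the
# complex, expensive evaluation then runs classically on just those survivors.

N = 10  # qubits -> search space of 2^N = 1024 values

def oracle(x):
    """Simple test a quantum computer could apply in parallel to all values."""
    return x % 97 == 0

def expensive_classical_evaluation(x):
    """Complex calculation, far too deep for today's short quantum circuits."""
    return sum(d * d for d in map(int, str(x)))

# "Quantum" step: reduce 1024 possible values to a handful of candidates.
candidates = [x for x in range(2 ** N) if oracle(x)]

# Classical step: evaluate the few survivors at a leisurely pace.
best = max(candidates, key=expensive_classical_evaluation)
print(f"{len(candidates)} candidates out of {2 ** N}; best = {best}")
```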
Text processing
Text processing is rather ill-suited for quantum computing at present. Generally, a problem needs to be reduced to a linear range of integers or energy states — a structure that very closely mirrors the physics embodied in the qubits of a quantum computer.
Image and media processing
Large-scale image and media processing is also rather ill-suited for quantum computing at present. Again, generally, a problem needs to be reduced to a linear range of integers or energy states — a structure that very closely mirrors the physics embodied in the qubits of a quantum computer.
Relatively small images and very limited media can be processed, but that severely limits the utility of quantum computing for image and media data.
Unstructured data
Unstructured data is also rather ill-suited for quantum computing at present. Again, generally, a problem needs to be reduced to a linear range of integers or energy states — a structure that very closely mirrors the physics embodied in the qubits of a quantum computer.
Generally, data needs to have a very simple and very regular structure to be amenable to quantum algorithms.
Application structure to exploit quantum computing
Much of the code for an application, any application, won’t be able to exploit the power of a quantum computer and must be executed on a classical computer. It is an open question as to how to best structure a particular application so that a significant fraction of the application components can be executed on a quantum computer.
There is no single, uniform, universal application structure, but generally at least the following components will be present:
- Data retrieval. Fetch input data from data sources. Pure classical code.
- Preprocessing and data preparation. Pure classical code. Input data ready to be processed by quantum code.
- Quantum state preparation. Pure quantum code. Mapping of classical bits of input data to qubits to initialize the quantum state of the quantum computer.
- Invocation of the actual quantum program. Classical code which transfers control to the quantum code.
- Quantum code to do the quantum processing. Pure quantum code. Where all the real quantum action is, the quantum algorithm, exploiting quantum parallelism.
- Measurement. Capturing the results of the quantum program. Mapping quantum state of qubits to binary classical bits. Pure quantum code. Note: Measuring the quantum state of a qubit causes the quantum state to collapse into either a classical binary 0 or a classical binary 1. Any superposition, entanglement, or phase of that qubit will be lost.
- Post-processing. Additional processing of the quantum results. Not all processing can always be done or is best done on the quantum computer. Pure classical code.
- Storage of the post-processed results of the quantum program. After post-processing. The final results of the application.
There may be any number of instances of these steps, possibly in an iterative loop or with conditional processing in the classical code, invoking quantum code as subroutines or functions of the quantum computer as a coprocessor as needed.
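The steps above can be sketched as a minimal hybrid pipeline. This is an illustrative skeleton only: `run_quantum_program` is a hypothetical stand-in for state preparation, execution, and measurement on a real quantum coprocessor (here it just returns random bits):

```python
import random

# Minimal sketch of the hybrid application structure listed above.

def retrieve_data():                  # 1. Data retrieval (classical)
    return [5, 2, 9, 4]

def preprocess(raw):                  # 2. Preprocessing (classical)
    return [x % 2 for x in raw]       #    reduce data to bits the qubits can encode

def run_quantum_program(input_bits):  # 3-6. Prepare, invoke, execute, measure
    # Measurement collapses each qubit to a classical 0 or 1; any
    # superposition, entanglement, or phase is lost at this point.
    return [random.choice((0, 1)) for _ in input_bits]

def postprocess(result_bits):         # 7. Post-processing (classical)
    return sum(result_bits)

def store(result):                    # 8. Storage of final results (classical)
    print("final result:", result)

input_bits = preprocess(retrieve_data())
result_bits = run_quantum_program(input_bits)
store(postprocess(result_bits))
```

In a real application the middle call would dispatch a quantum circuit to actual hardware, possibly inside an iterative classical loop as described above.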
Need for a compelling quantum advantage
As big a challenge as it will be to get an application running on a quantum computer at all, all is for naught if the net result is not a quantum advantage, namely that the quantum application is able to achieve a result that is either impossible to achieve with a classical computer or a mind-boggling performance improvement, like 1,000 to a million times faster or even much more, as a result of the exponential speedup promised by the proponents of quantum computing.
A mere improvement — 10–50% or even 2x or 4x — is not sufficient to justify a quantum solution. A true quantum leap is required.
Sure, in the early days (next few years), even parity with classical computing will be welcomed in the face of the many technological challenges which must be overcome to make the leap to a quantum computer, but parity is a mere steppingstone on the path to the true goal, a clear, distinctive, and compelling quantum advantage.
Lack of clarity and specificity for net effect on performance
Most of the applications listed in this paper have a fair degree of complexity, much of which is not appropriate for a quantum computer, so much of the application must be executed on a classical computer while only key portions are executed on the quantum computer as a coprocessor.
But there is no clarity or specificity as to which portions of each of these applications actually can be executed on a quantum computer and can indeed exploit the key feature of quantum computing, quantum parallelism.
The net effect is that it won’t be known in advance how big a net performance gain quantum computing will deliver for any given application.
Some applications may gain a huge benefit, other applications only a moderate benefit, and some applications may not benefit at all. We just don’t know and won’t know until sufficiently powerful quantum computers become available and quantum algorithm developers take a shot at optimizing these applications.
Presumption of applicability vs. reality
In all honesty, virtually all of the applications listed in this paper are presumed to be reasonable potential applications for quantum computing, but this is mostly speculation and inference rather than reality, since none of the presumed applications has been proven in realistic, nontrivial, real-world deployments.
Sure, we can expect that quite a few of these potential applications will indeed bear fruit, but exactly which ones and exactly how much fruit remains to be seen, and the timeframe is a complete unknown.
We need to be more than a little circumspect when speaking of the more distant future as if it had already arrived, when clearly it has not and is not likely to arrive imminently.
ENIAC moment for quantum computing
The ENIAC computer was unveiled in 1946 as the first successful general-purpose electronic digital computer, focused on a specific application — computing artillery firing tables. It wasn’t simply a bare piece of technology, but demonstrated a real and compelling application.
When will quantum computing achieve its own ENIAC moment, when both the hardware and a compelling application are here together for the first time? I explored this topic in my paper When Will Quantum Computing Have Its ENIAC Moment? The short answer is no time soon, but maybe in four to seven years.
The point here is that with all the applications listed in this paper, eventually one of them will manage to come together at the same time that the hardware comes together, and then, only then, will quantum computing have its ENIAC moment.
We need to see both algorithms and hardware advance together, not necessarily in absolute lockstep, but it does no good to have hardware without applications or applications without hardware. I explored this topic at length in my paper The Greatest Challenges for Quantum Computing Are Hardware and Algorithms.
FORTRAN moment for quantum computing
The FORTRAN programming language was the first widely successful high-level programming language and programming model for classical computing. There were some earlier attempts, but FORTRAN made the big splash and opened up computing to a much wider market. Before FORTRAN, programmers had no choice but to code in assembly or raw machine language — the world of bits, words, registers, memory, machine instructions, and raw hardware I/O. It was very, very, VERY tedious. But FORTRAN allowed programmers to write and think in the higher-level terms of variables, integers, real numbers, arrays, control structures (conditionals, loops, and function calls), and even formatted I/O. Programmers became MUCH more productive. Other high-level languages followed, such as COBOL, LISP, and BASIC, but it was FORTRAN which threw the doors (or floodgates!) wide open in the first place.
The point here is that the high-level programming model and features of FORTRAN ushered in a new age for application developers.
Quantum computing does not yet have a sophisticated high-level programming model and features that deliver the kind of productivity boost that FORTRAN delivered for classical computing.
Until quantum computing does gain such a sophisticated high-level programming model and programming language, comparable to FORTRAN — what I am calling the FORTRAN moment for quantum computing — application development will proceed at a very sluggish pace and be restricted to the most elite of software developers.
I explore this topic at much greater length in my paper When Will Quantum Computing Have Its FORTRAN Moment?
Enough with the preliminaries and caveats. On with listing actual (oops — potential) applications for quantum computing, as indicated from specific sources…
- Machine learning.
- Currently intractable problems.
Airbus Quantum Computing Challenge — Bringing flight physics into the Quantum Era.
This is more a statement of the desirability of quantum solutions to these application areas than an indication that quantum solutions are indeed possible. But that same distinction can be drawn for many of the applications mentioned in this paper. It is, quite literally, a Request for Solutions solicitation.
- Aircraft Climb Optimisation.
- Computational Fluid Dynamics.
- Quantum Neural Networks for Solving Partial Differential Equations.
- Wingbox Design Optimisation.
- Aircraft Loading Optimisation.
BBVA (Banco Bilbao Vizcaya Argentaria, S.A — Spain)
BBVA is “following six lines of research, working hand in hand with Spain’s Senior Council for Scientific Research (CSIC), Accenture, Fujitsu, Zapata Computing, and Multiverse.”
- Development of quantum algorithms (CSIC).
- Static Portfolio Optimization (Fujitsu).
- Dynamic portfolio optimization (Accenture, Multiverse).
- Credit scoring process optimization (Accenture).
- Currency arbitrage optimization (Accenture).
- Derivative valuations and adjustments (Zapata Computing).
- Improve machine learning algorithms.
- Improve energy efficiency.
- Help advance toward a more sustainable society.
- Selection of new materials based on quantum chemistry — e.g., for the development of battery cells.
- Efficient and convenient provision of individual mobility — e.g., traffic management for autonomous vehicles in urban environments and megacities.
- Logistics planning for delivery vans, where routes need to be planned and updated in real time on the basis of numerous variables.
- Optimization of production planning and production processes.
- Machine learning to advance the development of artificial intelligence.
- Understanding complex physical systems.
- Hard science modeling problems.
- Improving artificial intelligence (AI) and machine learning (ML) and deep learning (DL).
- Enhancing distributed sensing.
Department of Energy (DOE)
- Solve large, extremely complex problems that lie entirely beyond the capacity of even today’s most powerful supercomputers.
- Exquisitely sensitive sensors, with a variety of possible medical, national security, and scientific applications.
- Cybersecurity and encryption.
- Provide insights into such cosmic phenomena as Dark Matter and black holes.
- Optimization. Too many combinations of options to evaluate exhaustively on classical computers.
- Machine learning. Detecting recurring patterns in huge amounts of data with an immense number of potential combinations of data elements.
- Materials simulation.
- Monte Carlo simulation. Complex models, with many different variables.
- Artificial intelligence.
- Machine learning. Classification and clustering.
- Generative and discriminative quantum neural networks.
- Discrete optimization.
- Simulation of new materials.
- Elucidation of complex physics.
- Simulation of chemistry.
- Simulation of condensed matter models.
- Random number generation — generating certifiable random numbers.
- Materials science.
- Pharmaceuticals. Improve the efficiency of early-phase drug design and discovery.
- Chemicals. Accelerate development of new chemicals.
- Finance. Reduce risk through improved portfolio insight.
- Aerospace/Defense. Develop new aircraft materials and advanced military technology.
- Oil & Gas. Optimize production and expedite exploration.
- Data center. Accelerate machine learning and analysis of large data sets.
- Manufacturing. Gain visibility into design and production limitations.
- Telecommunication. Optimize antenna efficiency and bandwidth utilization.
- Medicine & Materials. Untangling the complexity of molecular and chemical interactions leading to the discovery of new medicines and materials.
- Supply Chain & Logistics. Finding the best solutions for ultra-efficient logistics and global supply chains, such as optimizing fleet operations for deliveries during the holiday season.
- Financial Services. Finding new ways to model financial data and isolating key global risk factors to make better investments.
- Artificial Intelligence. Making facets of artificial intelligence such as machine learning much more powerful when data sets are very large, such as in searching images or video.
- Massive parallelism.
- Simulate and analyze natural phenomena.
- Individualized genetic medicine.
- Solving environmental challenges.
- Hard optimization problems.
- Chemistry. Computational chemistry. Chemicals.
- Chemistry. Molecular interactions.
- Material science.
- Optimization problems.
- Machine learning.
Source: https://arxiv.org/abs/1905.02860 — From Ansätze to Z-gates: a NASA View of Quantum Computing
- Fault diagnosis.
- Machine learning.
- Robustness of communication networks.
- Simulation of many-body systems for material science and chemistry.
National Science and Technology Council (NSTC)
Source: http://calyptus.caltech.edu/qis2009/documents/FederalVisionQIS.pdf — A Federal Vision for Quantum Information Science (2008)
- Impossible problems.
- Greatly improved sensors. Mineral exploration and medical imaging.
- Exotic new and emergent states of matter that emerge from collective quantum systems. Including fractional quantum Hall states, topological insulators, new superconducting materials, and new states of matter that arise through quantum phase transitions.
- Enable long-lived quantum mechanical states.
- Quantum chemistry.
- Drug design.
- Design and development of new and exotic materials. Including materials for energy systems.
- Simulate quantum mechanical systems.
- Development of an analog quantum simulator.
- Accurate predictions of chemical properties.
- Measurements on individual quantum systems.
- Implications for national security.
- Implications for future economic competitiveness.
- Improvements in the global positioning system.
- Health care.
National Strategic Overview for Quantum Information Science
- Materials development.
- New approaches to understanding materials.
- New approaches to understanding chemistry.
- Chemical calculations.
- Modeling of chemical reactions to enhance corrosion-resistant materials.
- New approaches to understanding gravity.
- Novel algorithms for machine learning.
- Novel algorithms for optimization.
- Transformative cyber security systems including quantum-resistant cryptography.
- Improvements in effective drug discovery.
- Optimizing logistics solutions.
- Machine learning.
- Quantum sensing.
- Quantum sensors.
- Secure data transmission.
- Random number generation.
- Complex materials.
- Molecular dynamics.
- QCD. [Modeling quantum chromodynamics?]
- Cryptanalysis. Including post-quantum cryptography.
- Quantum chemistry.
- Quantum field theory.
- Quantum networks.
Quantum Circuits, Inc. (QCI)
- Drug design for biotech.
- Materials science.
- Improved processes for industrial chemicals.
- Machine learning.
No applications explicitly listed.
- Solve previously unsolvable problems.
- Address fundamental challenges in medicine, energy, business, and science.
- Chemistry. Predicting the properties of complex molecules and materials. Design more effective medicines, energy technologies and more resilient crops.
- Machine Learning. Training advanced AI on quantum computers.
- Computer vision, pattern recognition, voice recognition, and machine translation.
- Optimization. Solve complex optimizations such as ‘job shop’ scheduling and traveling salesperson problems.
- Drive critical efficiencies in businesses, military and public sector logistics, scheduling, shipping, and resource allocation.
- Artificial Intelligence.
- Molecular Modeling.
- Financial Modeling.
- Weather Forecasting.
- Particle Physics.
- Simulate the chemical structure of batteries.
- Traffic optimization.
- Smart traffic management.
- Passenger number prediction. [Not completely clear if this is a quantum component or a classical component which works in conjunction with the quantum route optimization.]
- Route optimization, congestion-free route optimization.
- Quantum search.
- Quantum simulation.
- Quantum annealing and adiabatic optimization.
- Solving linear equations.
- Machine learning.
- Chemistry. Drug discovery and material design.
- Finance. Accelerating complex pricing models that factor in numerous outcomes and variables changing over time. Portfolio optimization, algorithmic trading, and quantum machine learning for fraud detection.
- Sensors. Quantum sensing, using quantum photonics, combined with quantum machine learning techniques, has the potential to make autonomous cars safer and scanning precious biological or chemical samples more accurate.
- Chemistry Simulation.
- Logistics optimization.
- Machine Learning.
- Financial Tech.
- Materials Design.
- Pharma Lead Gen.
Military and intelligence?
There are certainly plenty of applications of quantum computing for defense and intelligence for national security, but other than what is briefly mentioned for DARPA, little is publicly known. That said, most of the applications from other areas generally apply equally well for the military and intelligence agencies, including:
- Cybersecurity and cryptography.
- Optimization, planning, and logistics.
- Material science.
- Modeling chemical reactions.
- Quantum system simulation.
- Artificial intelligence.
And presumably quantum computers can have a role in design and testing of weapon systems, including nuclear weapons.
The vague category defense includes both the military services and the many private sector contractors and vendors who design, develop, manufacture, maintain, and service the myriad of products and services used by the military services.
As mentioned at the outset, the list of applications enumerated in this paper is not intended to be exhaustive — there are undoubtedly many other potential applications, as yet undiscovered, or maybe just not yet known to me.
This paper will be updated as additional applications and sources of applications become apparent, especially as quantum computers evolve to the stage where such applications actually become feasible.
Beyond the unknown
As we contemplate what applications can be handled by a quantum computer, we are unfortunately limited and biased by the lens and blinders of classical computing, so we are not even ready to begin to deeply contemplate alternative ways to conceptualize problems using a quantum mindset, let alone to contemplate solutions using a quantum mindset.
We’re too busy conceptualizing solutions in classical terms. We need to come up with whole new categories of models for conceptualizing solutions in terms of how quantum computers operate.
And that raises the question of not even knowing what capabilities quantum computers will have in five, ten, and twenty years.
The simple answer is that all of this will evolve in a variety of dimensions and stages:
- The relentless march of technology, sometimes in giant leaps, mostly in incremental progress, and sometimes with long gaps when little progress seems to be made.
- Evolution of how we analyze problems.
- Evolution in how we conceptualize solutions.
- Evolution of quantum hardware, software, and tools.
- Evolution of building blocks for quantum algorithms.
- Learning from the work of others.
- Waiting for technological advances, or divine inspiration for clever ways to do better with what technology we already have.
- Feedback loops between any and all of the above. Co-design of hardware and algorithms, for example.
- Rinse and repeat. [Hmmm… that’s a classical concept; what’s the quantum version of rinse and repeat, other than massively parallel rinsing and repeating!]
For now, we’re focused on relatively simple problems and solutions, but primarily we’re waiting for much more capable quantum hardware so that we can begin seriously iterating that evolution.
For more of my writing: List of My Papers on Quantum Computing.