What Is Quantum Computing?

This informal paper is designed to provide a brief but meaningful introduction to quantum computing — the process and significance of using quantum computers to address compute-intensive application problems, without getting bogged down in detailed technical jargon. It should be suitable for technical and non-technical audiences alike, including managers and executives, as well as technical staff who are not yet familiar with quantum computing.

This is intended to be a light, high-level view of the overall process of using quantum computers, rather than a tutorial or hands-on guide on how to design algorithms or develop applications which use quantum computers. It stops short of teaching you about all of the programming model details, software, tools, and techniques needed to engage in quantum computing. But it does offer a reasonably complete panoramic view of what quantum computing can do for you, when it might and might not be an appropriate choice for solving your application problems, and what types of issues might arise and need to be addressed during the process.

The good news is that this paper will be almost exclusively simple, plain language:

  1. No math. No equations or formulas.
  2. No physics. Okay, a little, but just a very little, and only in plain language.
  3. No math symbols.
  4. No Greek letters or symbols.
  5. No complex, complicated, or confusing diagrams, tables, or charts. Just plain text, albeit with lots of bullet points and numbered lists.
  6. No pretty but distracting and relatively useless photos, images, or graphics.
  7. No jargon. Okay, a little, but minimal.
  8. No PhD required. Not even a STEM degree is required.

It doesn’t tell you what a quantum computer is in any great detail, since that was done in the preceding paper:

Caveat: This paper focuses on general-purpose quantum computing, using so-called universal gate-based quantum computers. Special-purpose quantum computing devices such as quantum annealing, boson sampling, other forms of so-called photonic quantum computing, and specialized physics simulators are explicitly excluded from this paper, but are mentioned to a limited degree in that preceding paper on quantum computers.

This is a very long paper. To get a briefer view on quantum computing, check out these two sections:

  1. The elevator pitch on quantum computing
  2. Quantum computing in a nutshell

Topics discussed in this paper:

  1. The target of quantum computing: production-scale practical real-world problems
  2. The goal of quantum computing: production deployment of production-scale practical real-world quantum applications
  3. The elevator pitch on quantum computing
  4. What is a quantum computer?
  5. What is a practical quantum computer?
  6. The essence of quantum computing: exploiting the inherent parallelism of a quantum computer
  7. What is practical quantum computing?
  8. The quantum computing sector
  9. Scope
  10. Quantum computing in a nutshell
  11. The grand challenge of quantum computing
  12. The twin goals of quantum computing: achieve dramatic quantum advantage and deliver extraordinary business value
  13. Achieve dramatic quantum advantage
  14. Deliver extraordinary business value
  15. Quantum computing is the process of utilizing quantum computers to address production-scale practical real-world problems
  16. The components of quantum computing
  17. Quantum computer, quantum computation, and quantum computing
  18. Quantum computing is the use of a quantum computer to perform quantum computation
  19. Quantum algorithms and quantum applications
  20. Approaches to quantum computing
  21. Three stages for development of quantum algorithms and quantum applications — prototyping, pilot projects, and production projects
  22. Prototyping, experimentation, and evaluation
  23. Proof of concept experiments
  24. Prototyping vs. pilot project
  25. Production projects and production deployment
  26. Production deployment of a production-scale practical real-world quantum application
  27. Quantum effects and how they enable quantum computing
  28. Quantum information
  29. Quantum information science (QIS) as the umbrella field over quantum computing
  30. Quantum information science and technology (QIST) — the science and engineering of quantum systems
  31. Quantum mechanics — can be ignored at this stage
  32. Quantum physics — can also be ignored at this stage
  33. Quantum state
  34. What is a qubit? Don’t worry about it at this stage!
  35. No, a qubit isn’t comparable to a classical bit
  36. A qubit is a hardware device comparable to a classical flip flop
  37. Superposition, entanglement, and product states enable quantum parallelism
  38. Quantum system — in physics
  39. Quantum system — a quantum computer
  40. Computational leverage
  41. k qubits enable a solution space of 2^k quantum states
  42. Product states are the quantum states of quantum parallelism
  43. Product states are the quantum states of entangled qubits
  44. k qubits enable a solution space of 2^k product states
  45. Qubit fidelity
  46. Nines of qubit fidelity
  47. Qubit connectivity
  48. Any-to-any qubit connectivity is best
  49. Full qubit connectivity
  50. SWAP networks to achieve full qubit connectivity — works, but slower and lower fidelity
  51. May not be able to use all of the qubits in a quantum computer
  52. Quantum Volume (QV) measures how many of the qubits you can use in a single quantum computation
  53. Why is Quantum Volume (QV) valid only up to about 50 qubits?
  54. Programming model — the essence of programming a quantum computer
  55. Ideal quantum computer programming model not yet discovered
  56. Future programming model evolution
  57. Algorithmic building blocks, design patterns, and application frameworks
  58. Algorithmic building blocks
  59. Design patterns
  60. Application frameworks
  61. Quantum applications and quantum algorithms
  62. Quantum applications are a hybrid of quantum computing and classical computing
  63. Basic model for a quantum application
  64. Post-processing of the results from a quantum algorithm
  65. Quantum algorithm vs. quantum circuit
  66. Quantum circuits and quantum logic gates — the code for a quantum computer
  67. Generative coding of quantum circuits rather than hand-coding of circuits
  68. Algorithmic building blocks, design patterns, and application frameworks are critical to successful use of a quantum computer
  69. Quantum Fourier transform (QFT) and quantum phase estimation (QPE) are critical to successful use of a quantum computer
  70. Measurement — getting classical results from a quantum computer
  71. Measurement — collapse of the wave function
  72. Extracting useful results from a massively parallel quantum computation
  73. Components of a quantum computer
  74. Access to a quantum computer
  75. Quantum service providers
  76. Having your own in-house quantum computer is not a viable option at this stage
  77. Job scheduling and management
  78. Local runtime for tighter integration of classical and quantum processing
  79. Where are we at with quantum computing?
  80. Current state of quantum computing
  81. Getting to commercialization — pre-commercialization, premature commercialization, and finally commercialization — research, prototyping, and experimentation
  82. Quantum computing is still in the pre-commercialization phase
  83. More suited for the lunatic fringe who will use anything than for normal, average technical staff
  84. Still a mere laboratory curiosity
  85. Much research is still needed
  86. How much more research is required?
  87. No 40-qubit algorithms to speak of
  88. Where are all of the 40-qubit algorithms?
  89. Beware of premature commercialization
  90. Doubling down on pre-commercialization is the best path forwards
  91. The ENIAC Moment — proof that quantum computer hardware is finally up to the task of real-world use
  92. The time to start is not now unless you’re the elite, the lunatic fringe
  93. At least another two to three years before quantum computing is ready to begin commercialization
  94. Little of current quantum computing technology will end up as the foundation for practical quantum computers and practical quantum computing
  95. Current quantum computing technology is the precursor for practical quantum computers and practical quantum computing
  96. Little data with a big solution space
  97. Very good at searching for needles in haystacks
  98. Quantum parallelism
  99. The secret sauce of quantum computing is quantum parallelism
  100. Combinatorial explosion — moderate number of parameters but very many combinations
  101. But not good for Big Data
  102. What production-scale practical real-world problems can a quantum computer solve — and deliver so-called quantum advantage, and deliver real business value?
  103. Heuristics for applicability of quantum computing to a particular application problem
  104. What applications are suitable for a quantum computer?
  105. Quantum machine learning (QML)
  106. Quantum AI
  107. Quantum AGI
  108. What can’t a quantum computer compute?
  109. Not all applications will benefit from quantum computing
  110. Quantum-resistant problems and applications
  111. Post-quantum cryptography (PQC)
  112. Post-quantum cryptography (PQC) is likely unnecessary since Shor’s factoring algorithm likely won’t work for large numbers the size of encryption keys
  113. How do you send input data to a quantum computer? You don’t…
  114. Any input data must be encoded in the quantum circuit for a quantum algorithm
  115. Classical solution vs. quantum solution
  116. Quantum advantage
  117. Dramatic quantum advantage is the real goal
  118. Quantum advantage — make the impossible possible, make the impractical practical
  119. Fractional quantum advantage
  120. Three levels of quantum advantage — minimal, substantial or significant, and dramatic quantum advantage
  121. Net quantum advantage — discount by repetition needed to get accuracy
  122. What is the quantum advantage of your quantum algorithm or application?
  123. Be careful not to compare the work of a great quantum team to the work of a mediocre classical team
  124. To be clear, quantum parallelism and quantum advantage are a function of the algorithm
  125. Quantum supremacy
  126. Random number-based quantum algorithms and quantum applications are actually commercially viable today
  127. Quantum supremacy now: Generation of true random numbers
  128. Need to summarize capability requirements for quantum algorithms and applications
  129. Matching the capability requirements for quantum algorithms and applications to the capabilities of particular quantum computers
  130. A variety of quantum computer types and technologies
  131. Quantum computer types
  132. General-purpose quantum computers
  133. Universal general-purpose gate-based quantum computer
  134. Vendors of general-purpose quantum computers
  135. Special-purpose quantum computing devices
  136. Special-purpose quantum computers or special-purpose quantum computing devices?
  137. General-purpose vs. special-purpose quantum computers
  138. Qubit technologies
  139. Qubit modalities
  140. Quantum computer technologies
  141. Different types and technologies of quantum computers may require distinctive programming models
  142. Don’t get confused by special-purpose quantum computing devices that promise much more than they actually can deliver
  143. Coherence, decoherence, and coherence time
  144. Gate execution time — determines how many gates can be executed within the coherence time
  145. Maximum quantum circuit size — limits size of quantum algorithms
  146. Another quantum computer technology: the simulator for a quantum computer
  147. Simulators for quantum computers
  148. Classical quantum simulator
  149. Quantum computers — real and simulated are both needed
  150. Two types of simulator and simulation
  151. Classical quantum simulator — simulate a quantum computer on a classical computer
  152. Simulation — simulating the execution of a quantum algorithm using a classical computer
  153. Context may dictate that simulation implies simulation of science
  154. Focus on using simulators rather than real quantum computers until much better hardware becomes available
  155. Capabilities, limitations, and issues for quantum computing
  156. Quantum computers are inherently probabilistic rather than absolutely deterministic
  157. Quantum computers are a good choice when approximate answers are acceptable
  158. Statistical processing can approximate determinism, to some degree, even when results are probabilistic
  159. As challenging as quantum computer hardware is, quantum algorithms are just as big a challenge
  160. The process of designing quantum algorithms is extremely difficult and challenging
  161. Algorithmic complexity, computational complexity, and Big-O notation
  162. Quantum speedup
  163. Key trick: Reduction in computational complexity
  164. We need real quantum algorithms on real machines (or real simulators) — not hypothetical or idealized
  165. Might a Quantum Winter be coming?
  166. Risk of a Quantum IP Winter
  167. No, Grover’s search algorithm can’t search or query a database
  168. No, Shor’s factoring algorithm probably can’t crack a large encryption key
  169. No, variational methods don’t show any promise of delivering any dramatic quantum advantage
  170. Quantum computer vs. quantum computing vs. quantum computation
  171. Quantum computer vs. quantum computer system vs. quantum processor vs. quantum processing unit vs. QPU
  172. Quantum computer as a coprocessor rather than a full-function computer
  173. Quantum computers cannot fully replace classical computers
  174. Quantum applications are mostly classical code with only selected portions which run on a quantum computer
  175. No quantum operating system
  176. Noise, errors, error mitigation, error correction, logical qubits, and fault tolerant quantum computing
  177. Perfect logical qubits
  178. NISQ — Noisy Intermediate-Scale Quantum computers
  179. Near-perfect qubits as a stepping stone to fault-tolerant quantum computing
  180. Quantum error correction (QEC) remains a distant promise, but not critical if we have near-perfect qubits
  181. Circuit repetitions as a poor man’s approximation of quantum error correction
  182. Beyond NISQ — not so noisy or not intermediate-scale
  183. When will the NISQ era end and when will the post-NISQ era begin?
  184. Three stages of adoption for quantum computing — The ENIAC Moment, Configurable packaged quantum solutions, and The FORTRAN Moment
  185. Configurable packaged quantum solutions are the greatest opportunity for widespread adoption of quantum computing
  186. The FORTRAN Moment — It is finally easy for most organizations to develop their own quantum applications
  187. Quantum networking and quantum Internet — research topics, not a near-term reality
  188. Distributed quantum computing — longer-term research topic
  189. Distributed quantum applications
  190. Distributed quantum algorithms — longer-term research topic
  191. Quantum application approaches
  192. Quantum network services vs. quantum applications
  193. What could you do with 1,000 qubits?
  194. 48 fully-connected near-perfect qubits may be the sweet spot for achieving a practical quantum computer
  195. 48 fully-connected near-perfect qubits may be the minimum configuration for achieving a practical quantum computer
  196. 48 fully-connected near-perfect qubits may be the limit for practical quantum computers
  197. The basic flow for quantum computing
  198. Quantum programming
  199. Quantum software development kits (SDK)
  200. Platforms
  201. Quantum computing software development platforms
  202. Quantum workflow orchestration
  203. Programming languages for quantum applications
  204. Programming languages for quantum algorithms
  205. Quantum-native high-level programming language
  206. Python for quantum programming
  207. Python and Jupyter Notebooks
  208. Support software and tools
  209. Audiences for support software and tools
  210. Tools
  211. Developer tools
  212. Audiences for developer tools
  213. Compilers, transpilers, and quantum circuit optimization
  214. Interactive and online tools
  215. Beware of tools to mask severe underlying technical deficiencies or difficulty of use
  216. Agile vs. structured development methodology
  217. Need for an Association for Quantum Computing Machinery — dedicated to the advancement of practical quantum computing
  218. Need to advance quantum information theory
  219. Need to advance quantum computer engineering
  220. Need to advance quantum computer science
  221. Need to advance quantum software engineering
  222. Need to advance quantum algorithms and applications
  223. Need to advance quantum infrastructure and support software
  224. Need to advance quantum application engineering
  225. Need for support for research
  226. Need for quantum computing education and training
  227. Need for quantum certification
  228. Need for quantum computing standards
  229. Need for quantum publications
  230. Need for quantum computing community
  231. Quantum computing community
  232. Need for quantum computing ecosystem
  233. Quantum computing ecosystem
  234. Distinction between the quantum computing community and the quantum computing ecosystem
  235. Call for Intel to focus on components for others to easily build their own quantum computers
  236. Intel could single-handedly do for the quantum computing ecosystem what IBM, Intel, and Microsoft did for the PC ecosystem
  237. Need to assist students
  238. Need for recognition and awards
  239. Need for code of ethics and professional conduct
  240. Quantum computing as a profession
  241. Benchmarking
  242. Education
  243. Training
  244. What is the best training for quantum computing?
  245. Getting started with IBM Qiskit Textbook
  246. Workforce development
  247. How to get started with quantum computing
  248. Investors and venture capital for quantum computing
  249. Commitment to a service level agreement (SLA)
  250. Diversity of sourcing
  251. Dedicated access vs. shared access
  252. Deployment and production
  253. Shipment, release, and deployment criteria
  254. How much might a quantum computer system cost?
  255. Pricing for service — leasing and usage
  256. Pricing for software, tools, algorithms, applications, and services
  257. Open source is essential
  258. Intellectual property (IP) — boon or bane?
  259. Shared knowledge — opportunities and obstacles
  260. Secret projects — sometimes they can’t be avoided
  261. Secret government projects — they’re the worst
  262. Transparency is essential — secrecy sucks!
  263. An open source quantum computer would be of great value
  264. Computational diversity
  265. Quantum-inspired algorithms and quantum-inspired computing
  266. Collaboration — strategic alliances, partnerships, joint ventures, and programs
  267. Classical computing still has a lot of runway ahead of it and quantum computing still has a long way to go to catch up
  268. Science fiction
  269. Universal quantum computer is an ambiguous term
  270. Both hardware and algorithms are limiting quantum computing
  271. A practical quantum computer would be ready for production deployment to address production-scale practical real-world problems
  272. Practical quantum computing isn’t really near, like within one, two, or three years
  273. Why aren’t quantum computers able to address production-scale practical real-world problems today?
  274. Limitations of quantum computing
  275. Quantum computation must be relatively modest
  276. No, you can’t easily migrate your classical algorithms to run on a quantum computer
  277. Lack of fine granularity for phase angles may dramatically limit quantum computing
  278. Limitations of current quantum computers
  279. Two most urgent needs in quantum computing: higher qubit fidelity and full qubit connectivity
  280. Premature commercialization is a really bad idea
  281. Minimum viable product (MVP)
  282. Initial commercialization stage — C1.0
  283. Subsequent commercialization stages
  284. But commercialization is not imminent — still years away from even commencing
  285. Quantum Ready
  286. Quantum Ready — but for who and when?
  287. Quantum Aware — what quantum computing can do, not how it does it
  288. The technical elite and lunatic fringe will always be ready, for anything
  289. But even with the technical elite and lunatic fringe, be prepared for the potential for several false starts
  290. Quantum Ready — on your own terms and on your own timeline
  291. Timing is everything — when will it be time for your organization to dive deep on quantum computing?
  292. Your organizational posture towards technological advances
  293. Each organization must create its own quantum roadmap
  294. Role of academia and the private sector
  295. Role of national governments
  296. When will quantum computing finally be practical?
  297. Names of the common traditional quantum algorithms
  298. Quantum teleportation — not relevant to quantum computing
  299. IBM model for quantum computing
  300. Circuit knitting
  301. Dynamic circuits
  302. Future prospects of quantum computing
  303. Post-quantum computing — beyond the future
  304. Should we speak of quantum computing as an industry, a field, a discipline, a sector, a domain, a realm, or what? My choice: sector (of the overall computing industry)
  305. Why am I still unable to write a brief introduction to quantum computing?
  306. List of my papers on quantum computing
  307. Definitions from my Quantum Computing Glossary
  308. Jargon and acronyms — most of it can be ignored unless you really need it
  309. My glossary of quantum computing terms
  310. arXiv — the definitive source for preprints of research papers
  311. Free online books
  312. Resources for quantum computing
  313. Personas, use cases, and access patterns for quantum computing
  314. Beware of oddball, contrived computer science experiments
  315. Beware of research projects masquerading as commercial companies
  316. Technology transfer
  317. Research spinoffs
  318. Don’t confuse passion for pragmatism
  319. Quantum hype — enough said
  320. Should this paper be turned into a book?
  321. My original proposal for this topic
  322. Summary and conclusions

The target of quantum computing: production-scale practical real-world problems

A lot of published material on quantum computing addresses hypothetical, theoretical, contrived, or rather small problems. Much of that material is simply divorced from real-world concerns. So, the central focus needs to be:

  • Real-world problems

And since a lot of material approaches topics from overly abstract, hypothetical, and theoretical perspectives, the focus needs to be on the practical:

  • Practical real-world problems

A lot of existing efforts at using quantum computing have been very limited, at a toy scale, or, to be charitable, prototypes or proof of concept experiments. The great opportunity with quantum computing is supposed to be impressive performance, so the emphasis needs to be on problems at a production scale. That is the essential target of quantum computing:

  • Production-scale practical real-world problems

So, applications for quantum computers need to be:

  • Production-scale practical real-world quantum applications

The goal of quantum computing: production deployment of production-scale practical real-world quantum applications

That’s the target of quantum computing, addressing production-scale practical real-world problems.

And the goal is to produce production-scale practical real-world quantum applications.

Actually, the goal is not simply to produce such quantum applications, but to deploy them in a production environment so that they can begin fulfilling their promise. So the goal of quantum computing is:

  • Production deployment of production-scale practical real-world quantum applications

That is the promised land that we seek to journey to. Anything less than that won’t be worth our attention, resources, and effort.

The elevator pitch on quantum computing

What is quantum computing?

Unlike classical computing, in which alternatives are evaluated one at a time, quantum computing is able to evaluate a vast number of alternatives all at the same time, in a single computation rather than a vast number of computations. This parallel evaluation of a large number of alternatives is known as quantum parallelism.

A quantum computer is a specialized electronic device which exploits the seemingly magical qualities of quantum effects governed by the principles of quantum mechanics to enable it to perform a fairly small calculation on a very large number of possible solution values all at once, producing a solution value in a very short amount of time. This parallel computation on a large number of values is the quantum parallelism mentioned above.

A quantum computer is not a full-blown computer capable of the full range of computation of a classical computer, but is more of a coprocessor which can perform carefully chosen and crafted calculations at a rate far exceeding the performance of even the best classical supercomputers.

Quantum applications are a hybrid of classical code and quantum algorithms. Most of the application, including any complex logic and handling of data, is classical, while small but critical and time-consuming calculations can be extracted and transformed into quantum algorithms which can be executed in a much more efficient manner than is possible with even the most powerful classical supercomputers.
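To make this hybrid model a little more concrete, here is a minimal sketch in plain Python. The names used here (quantum_kernel, shots) are illustrative assumptions, not a real quantum programming interface, and the quantum step is faked with seeded random sampling to mimic the probabilistic measurement counts a real quantum computer would return. In a real application, that one step would submit a quantum circuit to a quantum computer or simulator.

```python
import random

# Hypothetical stand-in for the quantum portion of a hybrid application.
# A real quantum kernel would build and execute a quantum circuit; here we
# merely sample bitstrings to mimic the probabilistic measurement results
# that a quantum computer returns after repeated runs (shots).
def quantum_kernel(num_qubits, shots, rng):
    counts = {}
    for _ in range(shots):
        bitstring = "".join(rng.choice("01") for _ in range(num_qubits))
        counts[bitstring] = counts.get(bitstring, 0) + 1
    return counts

def hybrid_application(raw_data):
    # 1. Classical pre-processing: reduce the problem to a small kernel.
    num_qubits = len(raw_data)
    rng = random.Random(42)  # seeded for reproducibility
    # 2. Quantum step: run the kernel many times and collect result counts.
    counts = quantum_kernel(num_qubits, shots=1000, rng=rng)
    # 3. Classical post-processing: pick the most frequent measurement.
    best = max(counts, key=counts.get)
    return best, counts

best, counts = hybrid_application([0, 1, 1])
print(best, sum(counts.values()))
```

The pattern to notice is that everything except the kernel call is ordinary classical code, which is exactly how quantum applications are structured: mostly classical, with a small but critical quantum calculation in the middle.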

Quantum computers are still in their infancy, demonstrating rudimentary capabilities. They are not even close to being able to address production-scale practical real-world problems or to being ready for production deployment, but they are available in limited capacities for prototyping and experimentation, even as significant research continues.

It may be three to five, seven, or even ten years before practical quantum computers are generally available which support production-scale practical real-world quantum applications and achieve dramatic quantum advantage over classical computing and deliver extraordinary business value which is well beyond the reach of classical computing.

What is a quantum computer?

A quantum computer is a specialized electronic device which exploits the seemingly magical qualities of quantum effects governed by the principles of quantum mechanics to enable it to perform a fairly small calculation on a very large number of possible solution values all at once using a process known as quantum parallelism, producing a solution value in a very short amount of time.

This paper focuses on quantum computing: the use of quantum computers to address real-world application problems.

For more detail on quantum computers themselves, see my previous paper:

What is a practical quantum computer?

As already mentioned, quantum computers are still in their infancy and not ready for practical use. So, what is a practical quantum computer? By definition:

  • A practical quantum computer supports production-scale practical real-world quantum applications and achieves dramatic quantum advantage over classical computing and delivers extraordinary business value which is well beyond the reach of classical computing.

The essence of quantum computing: exploiting the inherent parallelism of a quantum computer

At its essence, quantum computing enables a quantum application to exploit the inherent parallelism of a quantum computer to evaluate a very large number of potential solutions to a relatively small computation all at once, delivering a result in a very short amount of time.
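As a purely illustrative sketch (not a real quantum programming interface, and freely skippable), the following plain Python hints at why this parallelism scales so dramatically: k qubits are described by 2^k amplitudes, and an equal superposition gives every one of those 2^k candidate values a nonzero amplitude at once.

```python
from math import sqrt

# Toy statevector illustration: k qubits are described by 2^k complex
# amplitudes. An equal superposition assigns the same amplitude to every
# one of the 2^k basis states, which is what lets a quantum computation
# "touch" every candidate solution in a single computation.
def uniform_superposition(k):
    n = 2 ** k                   # size of the solution space
    amplitude = 1 / sqrt(n)      # equal amplitude for every basis state
    return [amplitude] * n

state = uniform_superposition(3)  # 3 qubits -> 8 candidate values at once
print(len(state))                 # 8 basis states held simultaneously
# Probabilities (amplitude squared) sum to 1, as quantum mechanics requires.
print(round(sum(a * a for a in state), 6))
```

Note how quickly the solution space grows: 10 qubits give 1,024 simultaneous values, 20 qubits over a million, and 50 qubits over a quadrillion, all from a modest number of qubits.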

What is practical quantum computing?

Practical quantum computing is the fruition of quantum computing when practical quantum computers themselves come to fruition, coupled with all of the software, algorithms, and other components of quantum computing.

Practical quantum computers are the hardware.

Practical quantum computing is the hardware plus all of the software.

The software includes quantum algorithms, quantum applications, support software, and tools, especially developer tools.

Also included are:

  1. Education.
  2. Training.
  3. Ongoing research.
  4. Consulting.
  5. Conferences.
  6. Publications.
  7. Community.
  8. Ecosystem.

The quantum computing sector

Some may consider quantum computing to be an industry or field, but for the purposes of this informal paper I consider quantum computing to be a sector of the overall computing industry — the quantum computing sector.

See a fuller discussion of this later in this paper.

Scope

This paper endeavors to provide a light, high-level view of what quantum computing is and has to offer and its general benefits, and what issues might arise and need to be addressed.

A previous paper, What Is a Quantum Computer?, covered the nature of the quantum computer itself.

The details of how to program or directly use a quantum computer are also beyond the scope of this paper.

Some of the details of quantum computing that are not included in this paper:

  1. Math. In general.
  2. Physics. In general.
  3. Linear algebra.
  4. Quantum mechanics.
  5. Quantum physics.
  6. Unitary transformation matrices.
  7. Density matrices.
  8. Details of particular quantum algorithms.

Quite a few details which appear in many typical, traditional introductions to quantum computing are not included here, simply because the focus here is a lighter, higher-level view of quantum computing, rather than all of the detail normally hidden under the hood.

Quite a few details are mentioned in passing, but not covered in fine detail in this paper. In general, citations will be given when details can be found elsewhere.

And as mentioned in the caveat in the introduction, the focus of this paper is general-purpose quantum computers, not special-purpose quantum computing devices.

Quantum computing in a nutshell

Here we try to summarize as much of quantum computing as possible, as succinctly as possible. It’s a real challenge to be succinct, since a lot of important points must be made about quantum computing to capture the big picture in any truly meaningful manner.

  1. The target of quantum computing: production-scale practical real-world problems. Addressing real-world problems, at scale.
  2. The goal of quantum computing: production deployment of production-scale practical real-world quantum applications. The goal is met when these production-scale quantum applications are actually in production, at scale, and delivering their promised results.
  3. What is quantum computing? Quantum computing permits the evaluation of a vast number of alternatives all at the same time, in a single computation rather than a vast number of computations. This parallel evaluation of a large number of alternatives is known as quantum parallelism.
  4. What is a quantum computer? A quantum computer is a specialized electronic device which exploits the seemingly magical qualities of quantum effects governed by the principles of quantum mechanics to enable it to perform a fairly small calculation on a very large number of possible solution values all at once using a process known as quantum parallelism, producing a solution value in a very short amount of time.
  5. What is a practical quantum computer? A practical quantum computer supports production-scale practical real-world quantum applications and achieves dramatic quantum advantage over classical computing and delivers extraordinary business value which is well beyond the reach of classical computing.
  6. What is practical quantum computing? Practical quantum computing is the fruition of quantum computing when practical quantum computers themselves come to fruition, coupled with all of the software, algorithms, and other components of quantum computing.
  7. The quantum computing sector. Some may consider quantum computing to be an industry or field, but for the purposes of this informal paper I consider quantum computing to be a sector of the overall computing industry.
  8. The grand challenge of quantum computing. Figure out how to exploit the capabilities of a quantum computer most effectively for a wide range of applications and for particular application problems to enable production-scale practical real-world quantum applications that achieve dramatic quantum advantage and deliver extraordinary business value.
  9. The twin goals of quantum computing: achieve dramatic quantum advantage and deliver extraordinary business value.
  10. Achieve dramatic quantum advantage. Not merely a relatively modest computational advantage, but a truly mind-boggling performance advantage.
  11. Deliver extraordinary business value. Even more important than raw computational performance advantage, a quantum computer needs to deliver actual business value, and in fact extraordinary business value.
  12. Quantum computing is the process of utilizing quantum computers to address production-scale practical real-world problems.
  13. The components of quantum computing. The major components, functions, areas, and topics of quantum computing.
  14. Quantum computer, quantum computation, and quantum computing. The machine, the work to accomplish, and the overall process of achieving it.
  15. Quantum computing is the use of a quantum computer to perform quantum computation. Putting it all together.
  16. Quantum algorithms and quantum applications. Besides the quantum computer itself, these are the two central types of objects that we focus on in quantum computing.
  17. Approaches to quantum computing. There are a variety.
  18. Three stages for development of quantum algorithms and quantum applications — prototyping, pilot projects, and production projects. Basic proof of concept for a subset of functions, proof of scaling for full functions, and full-scale project.
  19. Prototyping, experimentation, and evaluation. Quantum computing is new, untried, and unproven, so it needs to be tried and proven. And feedback given to researchers and vendors. And decisions must be made as to whether to pursue the technology or pass on it. Or maybe simply defer until a later date when the technology has matured some more.
  20. Proof of concept experiments. It’s probably a fielder’s choice whether to consider a project to be a prototype or a proof of concept. The intent is roughly the same — to demonstrate a subset of basic capabilities. Although there is a narrower additional meaning for proof of concept, namely, to prove or demonstrate that some full feature or full capability can be successfully implemented.
  21. Prototyping vs. pilot project. Prototyping and pilot projects can have a lot in common, but they have very distinct purposes. Prototyping seeks to prove the feasibility of a small but significant subset of functions. A pilot project seeks to prove that a prototype can be scaled up to the full set of required functions and at a realistic subset of production scale.
  22. Production projects and production deployment. When a product or service is ready to be made operational, to begin fulfilling its promised features, we call that production deployment.
  23. Production deployment of a production-scale practical real-world quantum application. That’s the ultimate goal for a quantum computing project. It’s ready to be made operational to begin fulfilling its promise.
  24. In simplest terms, quantum computing is computation using a quantum computer.
  25. The secret sauce of a quantum computer is quantum parallelism — evaluating a vast number of alternative solutions all at once. A single modest computation executed once but operating on all possible values at the same time.
  26. Focus is on specialized calculations which are not easily performed on a classical computer, not because the calculation itself is hard, but because it must be performed a very large number of times.
  27. Perform calculations in mere seconds or minutes which classical computers might take many years, even centuries, or even millennia. Again, not because the individual calculation takes long, but because it must be performed a very large number of times.
  28. Essentially, a quantum computer is very good at searching for needles in haystacks. Very large haystacks, much bigger than even the largest classical computers could search. And doing it much faster.
  29. The optimal use of a quantum computer is for little data with a big solution space. A small amount of input data with a fairly simple calculation which will be applied to a very large solution space, producing a small amount of output. See little data with a big solution space below.
  30. But quantum parallelism is not automatic — the algorithm designer must cleverly deduce what aspect of the application can be evaluated in such a massively parallel manner. Not all applications or algorithms can exploit quantum parallelism or exploit it fully.
  31. The degree of quantum parallelism is not guaranteed and will vary greatly between applications and algorithms and even depending on the input data.
  32. Combinatorial explosion — moderate number of parameters but very many combinations. These are the applications which should be a good fit for quantum computing. Matching combinatorial explosion with quantum parallelism.
  33. What production-scale practical real-world problems can a quantum computer solve — and deliver so-called quantum advantage, and deliver real business value? No clear answer or clearly-defined path to get an answer, other than trial and error, cleverness, and hard work. If you’re clever enough, you may find a solution. If you’re not clever enough, you’ll be unlikely to find a solution that delivers quantum advantage on a quantum computer.
  34. Heuristics for applicability of quantum computing to a particular application problem. Well, there really aren’t any yet. Much more research is needed. Prototyping and experimentation may yield some clues on heuristics and rules of thumb for what application problems can be matched to quantum algorithm solutions.
  35. What applications are suitable for a quantum computer? No guarantees, but at least there’s some potential.
  36. Quantum machine learning (QML). Just to highlight one area of high interest for applications of quantum computing, but details are beyond the scope of this paper.
  37. Quantum AI. Beyond quantum machine learning (QML) in particular, there are many open questions about the role of quantum computing in AI in general. This is all beyond the scope of this paper.
  38. Quantum AGI. Beyond quantum machine learning (QML) in particular and AI generally, there are many open questions about the role of quantum computing in artificial general intelligence (AGI) in particular. This is all beyond the scope of this paper. Is the current model for general-purpose quantum computing (gate-based) sufficiently powerful to compute all that the human brain and mind can compute, or is a more powerful architecture needed? I suspect the latter. This is mostly a speculative research topic at this stage.
  39. What can’t a quantum computer compute? There are plenty of complex math and logic problems which simply aren’t computable on even the best and most-capable quantum computers.
  40. Not all applications will benefit from quantum computing. Sad but true.
  41. Quantum-resistant problems and applications. Some problems or applications are not readily or even theoretically solvable using a quantum computer. These are sometimes referred to as quantum-resistant mathematical problems or quantum-resistant problems. Actually, there is one very useful and appealing sub-category of quantum-resistant mathematical problems, namely post-quantum cryptography (PQC), which includes the newest forms of cryptography which are explicitly designed so that they cannot be cracked using a quantum computer.
  42. Post-quantum cryptography (PQC). It is commonly believed that Shor’s factoring algorithm will be able to crack even 2048 and 4096-bit public encryption keys once we get sufficiently capable quantum computers. As a result, a search began for a new form of cryptography which wouldn’t be vulnerable to attacks using quantum computers. The basic idea is that all you need is a quantum-resistant mathematical problem, which, by definition, cannot be attacked using quantum computation. NIST (National Institute of Standards and Technology) is currently formalizing a decision in this area.
  43. How do you send input data to a quantum computer? You don’t…
  44. Any input data must be encoded in the quantum circuit for a quantum algorithm.
  45. Classical solution vs. quantum solution. An algorithm or application is a solution to a problem. A key task in quantum computing is comparing quantum solutions to classical solutions. This is comparing a quantum algorithm to a classical algorithm, or comparing a quantum application to a classical application.
  46. Quantum advantage expresses how much more powerful a quantum solution is compared to a classical solution. Typically expressed as either how many times faster the quantum computer is, or how many years, decades, centuries, millennia, or even millions or billions or trillions of years a classical computer would have to run to do what a quantum computer can do in mere seconds, minutes, or hours.
  47. Dramatic quantum advantage is the real goal. A truly mind-boggling performance advantage. One quadrillion or more times a classical solution.
  48. Quantum advantage — make the impossible possible, make the impractical practical. Just to emphasize that point more clearly.
  49. Fractional quantum advantage. A more modest advantage. Substantial or significant would be one million or more times a classical solution. Minimal would be 1,000 times a classical solution.
  50. Three levels of quantum advantage — minimal, substantial or significant, and dramatic quantum advantage. Minimal — 1,000 X, substantial or significant — 1,000,000 X, and dramatic quantum advantage — one quadrillion X the best classical solution.
  51. Net quantum advantage — discount by repetition needed to get accuracy. Quantum advantage can be intoxicating — just k qubits gives you a computational leverage of 2^k, but… there’s a tax to be paid on that. Since quantum computing is inherently probabilistic by nature, you can’t generally do a computation once and have an accurate answer. Rather, you have to repeat the calculation multiple or even many times, called circuit repetitions, shot count, or just shots, and do some statistical analysis to determine the likely answer. Those repetitions are effectively a tax or discount on the raw computational leverage which gives you a net computational leverage, the net quantum advantage.
  52. What is the quantum advantage of your quantum algorithm or application? It’s easy to talk about quantum advantage in the abstract, but what’s really needed is for quantum algorithm designers to explicitly state and fully characterize the quantum advantage of their quantum algorithms. It’s also important for applications using quantum algorithms to understand their own actual input size since the actual quantum advantage will depend on the actual input size.
  53. Be careful not to compare the work of a great quantum team to the work of a mediocre classical team. If you redesigned and reimplemented a classical solution using a technical team as elite as that required for a quantum solution, the apparent quantum advantage of the quantum solution over the new classical solution might be significantly less, or even negligible. Not necessarily, but very possible.
  54. To be clear, quantum parallelism and quantum advantage are a function of the algorithm. A quantum computer does indeed enable quantum parallelism and quantum advantage, but the actual and net quantum parallelism and quantum advantage are a function of the particular algorithm rather than the quantum computer itself.
  55. Quantum supremacy. Sometimes simply a synonym for quantum advantage, or it expresses the fact that a quantum computer can accomplish a task that is impossible on a classical computer, or that could take so long, like many, many years, that it is outright impractical on a classical computer.
  56. Quantum computers already excel at generating true random numbers. Clear supremacy over classical computing.
  57. Random number-based quantum algorithms and quantum applications are actually commercially viable today.
  58. Quantum supremacy now: Generation of true random numbers.
  59. The goal is not simply to do things faster, but to make the impossible possible. To make the impractical practical. Many computations are too expensive today to be practical on a classical computer.
  60. Quantum effects and how they enable quantum computing. If you want to get down into the quantum mechanics which enable quantum computers and quantum computing.
  61. Quantum information. What it is. How it is represented. How it is stored. How it is manipulated in quantum computing. It’s a lot more than just the quantum equivalent of the classical binary 0 and 1.
  62. Quantum information science (QIS) as the umbrella field over quantum computing. Also covers quantum communication, quantum networking, quantum metrology (measurement), quantum sensing, and quantum information.
  63. Quantum information science and technology (QIST) — the science and engineering of quantum systems.
  64. Quantum state. How quantum information is represented, stored, and manipulated.
  65. Measurement — getting classical results from a quantum computer. Quantum state can be complicated, but measurement of qubits always returns a classical binary 0 or 1 for each measured qubit.
  66. Measurement — collapse of the wave function. The loss of the additional information of quantum state and product state beyond the simple binary 0 or 1 on measurement of qubits is known as the collapse of the wave function.
  67. Extracting useful results from a massively parallel quantum computation.
  68. Quantum computers do exist today, but only in fairly primitive and simplistic form.
  69. They are evolving rapidly, but it will still be a few more years before they can be useful for practical applications.
  70. A practical quantum computer would be ready for production deployment to address production-scale practical real-world problems. Just to clarify the terminology — we do have quantum computers today, but they haven’t achieved the status of being practical quantum computers since they are not yet ready for production deployment to address production-scale practical real-world problems.
  71. Practical quantum computing isn’t really near, like within one, two, or three years. Although quantum computers do exist today, they are not ready for production deployment to address production-scale practical real-world problems. And it’s unlikely that they will be in the next one, two, or three years. Sure, there may be some smaller niches where they can actually be used productively, but those would be the exception rather than the rule. Four to seven years is a better bet, and even then only for moderate benefits.
  72. Why aren’t quantum computers able to address production-scale practical real-world problems today? Quite a long list of gating factors.
  73. Limitations of quantum computing. Quantum computing has great potential from a raw performance perspective, but does have its limits. And these are not mere limitations of current quantum computers, but of the overall architecture of quantum computing. Limited coprocessor function. None of the richness of classical computing. Complex applications must be couched in terms of physics problems. Etc.
  74. Limitations of current quantum computers. These are real limitations, but only of current quantum computers. Future quantum computers will be able to advance beyond these limitations. Limited qubit fidelity. Limited qubit connectivity. Limited circuit size.
  75. Two most urgent needs in quantum computing: higher qubit fidelity and full qubit connectivity. Overall, the most urgent need is to support more-sophisticated quantum algorithms. Supporting larger quantum circuits is a runner-up, but won’t matter until qubit fidelity and connectivity are addressed.
  76. Quantum computation must be relatively modest. Must be simple. Must be short. No significant complexity. The computational complexity comes from quantum parallelism — performing this relatively modest computation a very large number of times.
  77. No, you can’t easily migrate your classical algorithms to run on a quantum computer. You need a radically different approach to exploit the radically different capabilities of quantum computers.
  78. Lack of fine granularity for phase angles may dramatically limit quantum computing. For example, Shor’s factoring algorithm may work fine for factoring smaller numbers, but not for much larger numbers such as 1024, 2048, and 4096-bit encryption keys.
  79. Getting to commercialization — pre-commercialization, premature commercialization, and finally commercialization — research, prototyping, and experimentation.
  80. Premature commercialization is a really bad idea. The temptation will be strong. Many will resist. But some will succumb. A recipe for disaster. Expectations will likely be set (or merely presumed) too high and fail to be met.
  81. Minimum viable product (MVP). What might qualify for an initial practical quantum computer. The minimal requirements.
  82. Initial commercialization stage — C1.0. Where we end up for the initial commercial practical quantum computer.
  83. Subsequent commercialization stages. Progress once the initial commercialization stage, C1.0, is completed.
  84. But commercialization is not imminent — still years away from even commencing. The preceding sections were intended to highlight how commercialization might or should unfold, but should not be construed as implying that commercialization was in any way imminent — it’s still years away. In fact commercialization is still years away from even commencing. The next few years, at a minimum, will still be the pre-commercialization stage for quantum computing.
  85. Quantum Ready. Prepare in advance for the eventual arrival of practical quantum computing.
  86. Quantum Ready — but for whom and when? Different individuals and different organizations have different needs and operate on different time scales. Not everybody needs to be Quantum Ready at the same time.
  87. Quantum Aware — what quantum computing can do, not how it does it. A broader audience needs to understand quantum computing at a high level — its general benefits — but doesn’t need the technical details of how quantum computers actually work or how to program them.
  88. The technical elite and lunatic fringe will always be ready, for anything.
  89. But even with the technical elite and lunatic fringe, be prepared for the potential for several false starts. Too often, the trick with any significant new technology is that it may take several false starts before the team hits on the right mix of technology, resources, and timing. This is why it’s best not to get the whole rest of the organization Quantum Ready until the technical elite and lunatic fringe have gotten all of the kinks out of the new and evolving quantum computing technology.
  90. Quantum Ready — on your own terms and on your own timeline. Corporate investment and deployments of personal computers made sense in 1992, but not in 1980. Don’t jump the gun. Wait until your needs and the technology are in sync.
  91. Timing is everything — when will it be time for your organization to dive deep on quantum computing? Each organization will be different. Different needs. Different goals. Different resources.
  92. Your organizational posture towards technological advances. Quantum computing may be different, but the context is the organization’s general posture towards technological advances in general.
  93. Each organization must create its own quantum roadmap. Each organization should define its own roadmap of milestones for preparation for initial adoption of quantum computing. This is a set of milestones which must be achieved before the organization can achieve practical quantum computing.
  94. Components of a quantum computer. At a high level. See the previous paper for greater detail.
  95. Access to a quantum computer. Generally over a network connection. Commonly a cloud-based service.
  96. Quantum service providers. A number of cloud service providers are quantum service providers, providing cloud access to quantum computers. Including IBM, Amazon, Microsoft, and Google.
  97. Having your own in-house quantum computer is not a viable option at this stage. In theory, customers could have their own dedicated quantum computers in their own data centers, but that’s not appropriate at this stage since quantum computers are still under research and evolving too rapidly for most organizations to make an investment in dedicated hardware. And customers are focused on prototyping and experimentation rather than production deployment. There are some very low-end quantum computers, such as 2 and 3-qubit machines from SpinQ which can be used for in-house experimentation, but those are very limited configurations.
  98. Job scheduling and management. One of the critical functions for the use of networked quantum computers is job scheduling and management. Requests are coming into a quantum computer system from all over the network, and must be scheduled, run, and managed until their results can be returned to the sender of the job.
  99. Local runtime for tighter integration of classical and quantum processing. Classical application code can be packaged and sent to the quantum computer system where the classical application code can run on the classical computer that is embedded inside of the overall quantum computer system, permitting the classical application code to rapidly invoke quantum algorithms locally without any of the overhead of a network connection. Final results can then be returned to the remote application.
  100. Where are we at with quantum computing? In a more abstract sense. Still at the early stages. Lots of research needed. Still in the pre-commercialization stage.
  101. Current state of quantum computing. I decided that would be too much distracting detail for this paper. The abstract state of affairs should be sufficient for this paper.
  102. Much research is still needed. Theoretical, experimental, and applied. Basic science, algorithms, applications, and tools.
  103. How much more research is required? A lot more research is still needed before quantum computing is ready to exit from the pre-commercialization stage and begin the commercialization stage. There’s no great clarity as to how much more research is needed. Most answers to this question are more abstract than concrete, although there are a number of concrete milestones.
  104. Quantum computing is still in the pre-commercialization phase. Focus is on research, prototyping, and experimentation.
  105. Not ready for production-scale practical real-world quantum applications. More capabilities and more refinement are needed.
  106. Production deployment is not appropriate at this time. Not for a few more years, at least.
  107. More suited for the lunatic fringe who will use anything than for normal, average technical staff. It’s still the wild west out there. Great and exciting for some, but not for most.
  108. Still a mere laboratory curiosity. Not ready for production-scale practical real-world quantum applications or production deployment.
  109. No 40-qubit algorithms to speak of. Where are all of the 40-qubit algorithms?
  110. Beware of premature commercialization. The technology just isn’t ready yet. Much more research is needed. Much more prototyping and experimentation is needed.
  111. Doubling down on pre-commercialization is the best path forwards. Research, prototyping, and experimentation should be the priorities, not premature commercialization, not production deployment.
  112. The ENIAC Moment — proof that quantum computer hardware is finally up to the task of real-world use. The first time a production-scale practical real-world quantum application can be run in something resembling a production environment. Proves that quantum computer hardware is finally up to the task of real-world use. We’re not there yet, not even close.
  113. The time to start is not now unless you’re the elite, the lunatic fringe. Most normal technical teams and management planners should wait a few years, specifically until the ENIAC Moment has occurred. Everything learned before then will need to be discarded — the ENIAC moment will be the moment when we can finally see the other side of the looking glass, where the real action will be and where the real learning needs to occur.
  114. At least another two to three years before quantum computing is ready to begin commercialization. Could even be longer. Even four to five years. Not likely to be less.
  115. Little of current quantum computing technology will end up as the foundation for practical quantum computers and practical quantum computing. Rapid evolution and radical change. Fueled by ongoing research. Wait a few years and everything will have changed.
  116. Current quantum computing technology is the precursor for practical quantum computers and practical quantum computing. That doesn’t mean that the current technology is a waste or a mistake — it does provide a foundation for further research, prototyping, and experimentation, which provides feedback into further research and ideas for future development.
  117. Quantum computers cannot handle Big Data. They can handle only a small amount of input data and generate only a small amount of output data. None of the Three V’s of Big Data are appropriate for quantum computing — volume, velocity, variety.
  118. Quantum computers cannot handle complex logic. Only very simple calculations, but applied to a very large number of possible solutions.
  119. Quantum computers cannot handle rich data types. Only very simple numeric calculations.
  120. Quantum computers are inherently probabilistic rather than absolutely deterministic. Not suitable for calculations requiring absolute precision, although statistical processing can approximate determinism.
  121. Quantum computers are a good choice when approximate answers are acceptable. Especially in the natural sciences or where statistical approximations are used, probabilistic results can be quite acceptable. But not good for financial transactions where every penny counts.
  122. Statistical processing can approximate determinism, to some degree, even when results are probabilistic. Run the quantum algorithm a bunch of times and statistically analyze the results to identify the more likely, deterministic result.
  123. As challenging as quantum computer hardware is, quantum algorithms are just as big a challenge. The rules for using a quantum computer are rather distinct from the rules governing classical computers, so entirely new approaches are needed for quantum algorithms.
  124. The process of designing quantum algorithms is extremely difficult and challenging. The final algorithm may be very simple, but getting to that result is a great challenge. And testing is really difficult as well.
  125. Algorithmic complexity, computational complexity, and Big-O notation. Algorithmic complexity and computational complexity are roughly synonymous and refer to the calculation of the amount of work that an algorithm will need to perform to process an input of a given size. Generally, the amount of work will vary based on the size of the input data. Big-O notation is used to summarize the calculation of the amount of required work.
  126. Quantum speedup. Moving from a classical algorithm with a high degree of algorithmic complexity to a quantum algorithm with a low or lower degree of algorithmic complexity achieves what is referred to as a speedup or quantum speedup.
  127. Key trick: Reduction in computational complexity. One of the key secret tricks for designing quantum algorithms is to come up with clever techniques for reduction in computational complexity — turn a hard problem into an easier problem. Yes, it is indeed harder than it sounds, but the benefits can be well worth the effort.
  128. We need real quantum algorithms on real machines (or real simulators) — not hypothetical or idealized. We’ve had quite a few years of papers published based on quantum algorithms for hypothetical or idealized quantum computers rather than real machines (or real simulators configured to reflect realistic expectations for real machines in the next few years). That has led to unrealistic expectations for what to expect from quantum computers and quantum algorithms.
  129. What is a qubit? Don’t worry about it at this stage! At this stage it isn’t necessary to get into esoteric details such as qubits and how they might be related to the classical bits of a classical computer. All that really matters is quantum parallelism. Qubits are just an element of the technology needed to achieve quantum parallelism.
  130. No, a qubit isn’t comparable to a classical bit. Any more than a car is comparable to a bicycle or a rocket is comparable to an airplane.
  131. A qubit is a hardware device comparable to a classical flip-flop. It is used to store and manipulate a unit of quantum information represented as a unit of quantum state.
  132. Superposition, entanglement, and product states enable quantum parallelism. The details are beyond the scope of this paper, but the concepts of superposition, entanglement, and product states are what combine to enable quantum parallelism, the ability to operate on 2^k distinct values in parallel with only k qubits.
  133. Quantum system — in physics. In physics, an isolated quantum system is any collection of particles which have a quantum state which is distinct from the quantum state of other isolated quantum systems. An unentangled qubit is a quantum system. A collection of entangled qubits is also a quantum system.
  134. Quantum system — a quantum computer. In addition to the meaning of quantum system in physics, a quantum system in quantum computing can also simply refer to a quantum computer or a quantum computer system.
  135. Computational leverage. How many times faster a quantum algorithm would be compared to a comparable classical algorithm. Such as 1,000 X, one million X, or even one quadrillion X a classical solution.
  136. k qubits enable a solution space of 2^k quantum states. The superposition, entanglement, and product states of k qubits combine to enable a solution space of 2^k quantum states.
  137. Product states are the quantum states of quantum parallelism. When superposition and entanglement of k qubits enables 2^k quantum states, each of those unique 2^k quantum states is known as a product state. These 2^k product states are the unique values used by quantum parallelism.
  138. Product states are the quantum states of entangled qubits. Simply stating it more explicitly, product state is only meaningful when qubits are entangled. An unentangled (isolated) qubit will not be in a product state.
  139. k qubits enable a solution space of 2^k product states. More properly, a solution space for k qubits is composed of 2^k product states. Each product state is a unique value in the solution space.
  140. Qubit fidelity. How reliable the qubits are. Can they maintain their quantum state for a sufficiently long period of time, and can operations be performed on them reliably?
  141. Nines of qubit fidelity. Express the qubit reliability as a percentage and then count the number of leading nines in the percentage, as well as the fraction of a nine after the last nine. More nines is better — higher fidelity.
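The counting of nines described above can be sketched in a few lines of Python. This is just the conventional "nines of reliability" arithmetic, not an official metric, and the fractional nine is computed here as the base-10 logarithm of the error rate:

```python
# Illustrative sketch: counting "nines" of qubit fidelity.
# A fidelity of 99.9% has three nines; 99.97% has about three and a half.
# The fractional nine is computed as -log10(1 - fidelity).

import math

def nines_of_fidelity(fidelity: float) -> float:
    """Return the number of nines (with a fractional part) for a
    fidelity expressed as a fraction, e.g. 0.999 for 99.9%."""
    if fidelity >= 1.0:
        return float("inf")
    return -math.log10(1.0 - fidelity)

print(round(nines_of_fidelity(0.999), 2))   # three nines
print(round(nines_of_fidelity(0.9997), 2))  # about three and a half nines
```

More nines means a smaller error rate, which is why each additional nine is roughly a tenfold improvement in reliability.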
  142. Qubit connectivity. How easily two qubits can operate on each other. Usually it has to do with the physical distance between the two qubits. If they are adjacent, they generally can operate on each other most efficiently, but if they are separated by some distance, it may be necessary to move the quantum state of one or both of them so that they are physically adjacent, using so-called SWAP networks. Such movement takes time and can introduce further errors which impact qubit fidelity. Some qubit technologies, such as trapped-ion qubits, support full any-to-any qubit connectivity, which avoids these problems.
  143. Any-to-any qubit connectivity is best. The best performance and best qubit fidelity (fewest errors) comes with true, full, any-to-any qubit connectivity. Any two qubits can interact, regardless of the distance between them. Not all qubit technologies support it — notably, superconducting transmon qubits do not, but some do — such as trapped-ion qubits.
  144. Full qubit connectivity. Generally a synonym for any-to-any qubit connectivity.
  145. SWAP networks to achieve full qubit connectivity — works, but slower and lower fidelity. SWAP networks are sometimes needed to achieve full qubit connectivity. This approach does work, but is slower and causes lower qubit fidelity. If two qubits are not physically adjacent and any-to-any qubit connectivity is not supported, a SWAP network will be needed to shuffle the quantum states of the two qubits to a pair of qubits which are physically adjacent.
  146. May not be able to use all of the qubits in a quantum computer. Qubit fidelity and issues with qubit connectivity can limit how many qubits can be used in a single quantum computation.
  147. Quantum Volume (QV) measures how many of the qubits you can use in a single quantum computation. Technically it tells you how many quantum states can be used in a single quantum computation without excessive errors, but that is 2^k for k qubits, so k or log2(QV) is a measure of how many qubits you can use in a single quantum computation.
  148. Why is Quantum Volume (QV) valid only up to about 50 qubits? Measuring the Quantum Volume (QV) metric for a quantum computer requires simulating quantum circuits on a classical quantum simulator, which is constrained by memory capacity. Since simulation of quantum circuits larger than about 50 qubits is not feasible, obtaining a Quantum Volume (QV) metric greater than 2^50 is not feasible either.
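The relationship between Quantum Volume and usable qubits described above is just a base-2 logarithm, as this small Python sketch illustrates:

```python
# Sketch of the arithmetic relating Quantum Volume (QV) to usable qubits:
# QV = 2^k for k usable qubits, so k = log2(QV).

import math

def usable_qubits(quantum_volume: int) -> int:
    """Number of qubits usable in a single quantum computation,
    given a Quantum Volume of 2^k."""
    return int(math.log2(quantum_volume))

print(usable_qubits(64))       # a QV of 64 means 6 usable qubits
print(usable_qubits(2 ** 50))  # 50, roughly the ceiling for measuring QV
```

A QV of 64 thus means that about six qubits can be used in a single computation without excessive errors, regardless of how many qubits the machine physically has.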
  149. Programming model — the essence of programming a quantum computer. The details are beyond the scope of this paper, but the rules for how to program a quantum computer are very technical and require great care.
  150. Ideal quantum computer programming model not yet discovered. Research and innovation are ongoing. Don’t get too invested in what we have today since it will likely be obsolete within a few years, or five to ten at the outside.
  151. Quantum applications and quantum algorithms. How a quantum computer is used.
  152. Quantum applications are a hybrid of quantum computing and classical computing. Most of the application, especially handling large volumes of data and complex logic, is classical code, while select compute-intensive functions can be implemented as quantum algorithms.
  153. Basic model for a quantum application. The major steps in the process.
  154. Post-processing of the results from a quantum algorithm. Put the results of the quantum algorithm in a form that classical application code can handle.
  155. Quantum algorithm vs. quantum circuit. In computer science we would say that the algorithm is the specification of the logic, while the circuit is the implementation of that specification. A quantum circuit is the exact sequence of quantum logic gates which will be sent to the quantum processing unit (QPU) for execution. The algorithm focuses on the logic, while the circuit focuses on the execution.
  156. Quantum circuits and quantum logic gates — the code for a quantum computer.
  157. Generative coding of quantum circuits rather than hand-coding of circuits. Fully hand-coding the quantum circuits for algorithms is absolutely out of the question. What is needed is a more abstract representation of an algorithm. Generative coding of quantum circuits provides this level of abstraction. Any algorithm designed to be scalable must be generated dynamically using classical code which is parameterized with the input size or size of the simulation, so that as the input size grows or the simulation size grows, a larger number of quantum logic gates will be generated to accommodate the expanded input size.
  158. Algorithmic building blocks, design patterns, and application frameworks are critical to successful use of a quantum computer. The level immediately above the programming model. But beyond the scope of this paper, which focuses on quantum computing overall.
  159. Quantum Fourier transform (QFT) and quantum phase estimation (QPE) are critical to successful use of a quantum computer. These algorithmic building blocks are profoundly critical to effectively exploiting the computational power of a quantum computer, such as for quantum computational chemistry, but beyond the scope of this paper, which focuses on quantum computing overall.
  160. Quantum computer, quantum processor, quantum processing unit, QPU, and quantum computer system are roughly synonyms. But a few distinctions.
  161. Quantum computer as a coprocessor rather than a full-function computer. Most quantum application code will run on a classical computer with only selected functions offloaded to a quantum processor.
  162. Quantum computers cannot fully replace classical computers. Eventually they will be merged with classical computers as a universal quantum computer, but that’s far in the future.
  163. Quantum applications are mostly classical code with only selected portions which run on a quantum computer. Most application logic either cannot be performed on a quantum computer at all, or wouldn’t achieve any meaningful quantum advantage over performing it on a classical computer. Only selected portions of the quantum application would be coded as quantum algorithms and execute on a quantum computer.
  164. No quantum operating system. As previously mentioned, a quantum computer is not a full-function computer as a classical computer is. Rather, it acts as a coprocessor, similar to how a graphics processing unit (GPU) operates. As such, there is no quantum operating system per se.
  165. Coherence, decoherence, and coherence time. Coherence is the ability of a quantum computer to remain in its magical quantum state where quantum effects can be maintained in a coherent manner which enables the quantum parallelism needed to fully exploit quantum computation. Decoherence is simply the loss of coherence. Coherence time is generally fairly short for many quantum computer technologies, limiting the size and complexity of quantum algorithms. Some technologies are more coherent than others, meaning they have a longer coherence time, which enables greater size and complexity of quantum algorithms.
  166. Gate execution time — determines how many gates can be executed within the coherence time.
  167. Maximum quantum circuit size — limits size of quantum algorithms.
  168. Need to summarize capability requirements for quantum algorithms and applications. Clearly document the capability requirements for quantum algorithms and applications. What capabilities does a quantum computer need to possess to support a quantum algorithm or quantum application.
  169. Matching the capability requirements for quantum algorithms and applications to the capabilities of particular quantum computers. Both must be clearly documented. Can’t run all quantum algorithms and applications on all quantum computers. Need to match the requirements for quantum algorithms and applications with the capabilities for particular quantum computers.
  170. A variety of quantum computer types and technologies. General-purpose and special-purpose types. A variety of qubit technologies.
  171. General-purpose quantum computers. Can be applied to many different types of applications.
  172. Special-purpose quantum computers and special-purpose quantum computing devices. Beyond the scope of this paper, which focuses on general-purpose quantum computers, also referred to as universal gate-based quantum computers.
  173. Different types and technologies of quantum computers may require distinctive programming models. General-purpose quantum computers and special-purpose quantum computers tend to have different programming models — and their quantum algorithms won’t be compatible. Different types of special-purpose quantum computers will tend to have different programming models. Some qubit technologies may indeed have compatible programming models, but some may not.
  174. Don’t get confused by special-purpose quantum computing devices that promise much more than they actually can deliver.
  175. Simulators for quantum computers. You don’t need a real quantum computer to run relatively simple quantum algorithms — you can simulate a quantum computer on a classical computer, using what is known as a classical quantum simulator. But complex quantum algorithms will run very slowly or not at all, since the whole point of a quantum computer is to greatly outperform even the best classical supercomputers. Simulators are also good for debugging, and for experimenting with improved hardware before it is even available.
  176. Quantum computers — real and simulated are both needed. Both are important for quantum computing. Simulated quantum computers are needed for development and debugging. Real quantum computers are great for production, but less useful for development and debugging.
  177. Focus on using simulators rather than real quantum computers until much better hardware becomes available. Current quantum computers have too many shortcomings to be very useful or productive in the near term. It would be more productive for most people to use classical quantum simulators rather than real quantum computers for most of their work.
  178. Noise, errors, error mitigation, error correction, logical qubits, and fault tolerant quantum computing. Noise can cause errors. Sometimes errors can be mitigated and corrected. Sometimes they just have to be tolerated. But noise is just a fact of life for quantum computing, for the foreseeable future. Eventually we will get to true fault tolerance, but not soon. Near-perfect qubits will help sooner.
  179. Perfect logical qubits. The holy grail of quantum computing. Regular qubits are noisy and error-prone, but the theory, the generally-accepted belief, is that quantum error correction will fully overcome that limitation.
  180. NISQ — Noisy Intermediate-Scale Quantum devices (computers). Official acknowledgement that noise is a fact of life for quantum computing, for the foreseeable future.
  181. Near-perfect qubits as a stepping stone to fault-tolerant quantum computing. Dramatically better than current NISQ devices even if still well-short of true fault-tolerant quantum computing.
  182. Quantum error correction (QEC) remains a distant promise, but not critical if we have near-perfect qubits. Not within the next few years. But not critical as long as we achieve near-perfect qubits within a year or two.
  183. Circuit repetitions as a poor man’s approximation of quantum error correction. By executing the same quantum circuit a bunch of times and then examining the statistical distribution of the results, it is generally possible to determine which of the various results is the more likely result — which result occurs more frequently.
  184. Beyond NISQ — not so noisy or not intermediate-scale. NISQ is technically inaccurate for many current (and future) quantum computers. I’ve proposed some alternative terms to supplement NISQ.
  185. When will the NISQ era end and when will the post-NISQ era begin? Maybe just a year or two, maybe three. Near-perfect qubits would happen first. Perfect logical qubits would be several years after that. Hundreds of qubits are coming later this year (IBM Osprey), but the combination of hundreds of near-perfect qubits might take two or three years, maybe four.
  186. Three stages of adoption for quantum computing — The ENIAC Moment, Configurable packaged quantum solutions, and The FORTRAN Moment. The first production-scale quantum application. Widespread use of quantum applications. It is finally easy for most organizations to develop their own quantum applications.
  187. Configurable packaged quantum solutions are the greatest opportunity for widespread adoption of quantum computing. Combine prewritten algorithms and code with the ability to dynamically customize both using high-level configuration features rather than needing to dive deep into actual quantum algorithms or application code. This will likely be the main method by which most organizations exploit quantum computers.
  188. The FORTRAN Moment — It is finally easy for most organizations to develop their own quantum applications. Advent of a truly high-level programming model, rich collection of high-level algorithmic building blocks, plethora of design patterns, numerous rich application frameworks, and many examples of working and deployable quantum algorithms and applications.
  189. Quantum networking and quantum Internet — research topics, not a near-term reality.
  190. Distributed quantum computing — longer-term research topic. Just as quantum networking is a longer-term research project rather than a current reality, so is distributed quantum computing — a longer-term research topic rather than a current reality.
  191. Distributed quantum applications. Even though distributed quantum computing is not a near-term reality, distributed quantum applications are quite feasible since they’re mostly classical code, which can be distributed today. But, each quantum algorithm works independently, since there is no quantum networking, yet.
  192. Distributed quantum algorithms — longer-term research topic. Although quantum applications can be distributed, today, quantum algorithms cannot be distributed, at present, since there is no quantum networking to connect them and their quantum state.
  193. Quantum application approaches. Three choices: Redesign the entire application, isolate changes to just a few modules, or implement the quantum portions of the application as network services which could have parallel classical and quantum implementations.
  194. Quantum network services vs. quantum applications. A network service which invokes quantum algorithms. The application developer can decide whether to package a quantum application which accesses quantum algorithms as a quantum application or a quantum network service. Is the application designed for use by users (a quantum application) or to provide an API for a network service (a quantum network service)?
  195. What could you do with 1,000 qubits? Nobody really knows for sure. An open question. Some possibilities can be mentioned.
  196. 48 fully-connected near-perfect qubits may be the sweet spot for achieving a practical quantum computer. This would support a 20-qubit quantum Fourier transform (QFT), which could achieve a computational leverage of one million over a classical solution.
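The arithmetic behind the one-million figure above is just a power of two: a 20-qubit quantum Fourier transform operates over 2^20 quantum states.

```python
# The arithmetic behind the claim: a 20-qubit quantum Fourier transform
# spans 2^20 quantum states, for a computational leverage of roughly
# one million over a sequential classical evaluation.

k = 20
leverage = 2 ** k
print(leverage)  # 1048576 -- about one million
```

Each additional qubit doubles the leverage, which is why modest increases in usable qubits can have outsized impact.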
  197. 48 fully-connected near-perfect qubits may be the minimum configuration for achieving a practical quantum computer. More than being the sweet spot, anything less just may not cut it. More qubits is easy, but sufficient qubit fidelity and sufficient qubit connectivity may be the critical gating factors.
  198. 48 fully-connected near-perfect qubits may be the limit for practical quantum computers. There are practical limits to real hardware, primarily at the analog interface level.
  199. The basic flow for quantum computing.
  200. Quantum programming.
  201. Quantum software development kits (SDK). A library or collection of tools used to create and execute quantum circuits.
  202. Platforms. A vague, general term which could refer simply to an API or to an interactive online tool.
  203. Quantum computing software development platforms. Generally an interactive online environment which provides an integrated collection of tools designed to make it easy to design and develop quantum algorithms and/or quantum applications.
  204. Quantum workflow orchestration. Automation of tasks needed to accomplish quantum computing.
  205. Programming languages for quantum applications.
  206. Programming languages for quantum algorithms.
  207. Quantum-native high-level programming language. Designed for quantum algorithms. Make it very easy to design quantum algorithms. Eventually, but not soon.
  208. Python for quantum programming.
  209. Python and Jupyter Notebooks.
  210. Support software and tools. Much software is needed. All classical. Distinct from quantum algorithms.
  211. Tools. A wide variety. Especially developer tools.
  212. Developer tools. Full spectrum from support for design of algorithms to development of quantum applications.
  213. Audiences for developer tools. Quantum algorithm designers and quantum application developers have different needs. Network services differ from classical applications.
  214. Compilers, transpilers, and quantum circuit optimization. Automate tedious aspects of transforming quantum algorithms to efficient quantum circuits.
  215. Interactive and online tools. Many tools for quantum computing are file-oriented, transforming information from one file format to another, but some tools are interactive or online, allowing the user to interact directly with the tool and see an immediate response.
  216. Beware of tools to mask severe underlying technical deficiencies or difficulty of use. Tools are great, but sometimes they are an extra burden and don’t actually fully solve the problems they purportedly address. Granted, a lot of the need for tools may simply be to compensate for deficiencies in quantum hardware, which the tool developers can’t fix, but that shouldn’t excuse the obligation of the hardware vendors to correct deficiencies in their hardware designs.
  217. Agile vs. structured development methodology. Of course it’s up to the quantum algorithm designer or the quantum application developer whether they wish to use an agile methodology or a more traditional structured methodology. Both have their places. Agile is more appropriate during the pre-commercialization stage, where the focus is on research, prototyping, experimentation, and rapid and radical changes. But during commercialization, a more methodical structured approach may make more sense.
  218. Need for an Association for Quantum Computing Machinery — dedicated to the advancement of practical quantum computing. To advance the science, technology, and applications of quantum computing, including research and the development and deployment of quantum applications. And to advance quantum computing as a profession, including education and training, professional development, and networking of professionals and students. It would also develop and promote standards, benchmarking, and codes of ethics and professional conduct.
  219. Need to advance quantum information theory. Information at the quantum level. From basic concepts to advanced theory.
  220. Need to advance quantum computer engineering. The hardware, particularly the programming model, architecture, and qubit technology and qubit control. Including fault-tolerant quantum computing — full, automatic, and transparent error detection and correction.
  221. Need to advance quantum computer science. Quantum algorithms operating on quantum information.
  222. Need to advance quantum software engineering. Design, development, deployment, and operation of quantum applications which utilize quantum algorithms.
  223. Need to advance quantum algorithms and applications. Application domain-specific quantum algorithms and software.
  224. Need to advance quantum infrastructure and support software. Including software tools.
  225. Need to advance quantum application engineering. Applications and their deployment are too ad hoc. An engineering approach is needed for the analysis, specification, design, implementation, testing, configuration, deployment, maintenance, and enhancement of quantum applications.
  226. Need for support for research. Administrative and institutional support for research. Separate from the specific technical content and funding of the research.
  227. Need for quantum computing education and training. Both academic and commercial. Seminars, workshops, and conferences as well. Professional growth. Life-long learning. Career development.
  228. Need for quantum certification. Play a role in the development and promotion of certification programs for all aspects of quantum computing skills. Credentials which bear witness to the skills of a professional.
  229. Need for quantum computing standards. To produce and promote the use of formal (or even informal) standards in the quantum computing community and ecosystem. Most importantly, to take an active role in keeping attention focused on standards.
  230. Need for quantum publications. Books and journals. Print, electronic, and online. Email newsletters. Emphasis on research, products, and practice.
  231. Need for quantum computing community. Conferences. In-person and online networking and support forums. Hackathons. Local and student chapters. Employment and academic opportunities. Funding opportunities — academic and commercial, private sector, and government. Emphasis on research, products, and practice. Part of the larger quantum computing ecosystem, which includes vendors, customers, users, and investors and venture capital.
  232. Need for quantum computing ecosystem. The quantum computing community plus vendors, customers, users, and investors and venture capital.
  233. Call for Intel to focus on components for others to easily build their own quantum computers. We would suddenly have a more diverse but compatible universe of hardware vendors. This would produce a much more vibrant quantum computing ecosystem.
  234. Intel could single-handedly do for the quantum computing ecosystem what IBM, Intel, and Microsoft did for the PC ecosystem. Just to emphasize the potential impact of my proposal from the preceding section.
  235. Need to assist students. Outreach. Community. Education. Internship opportunities. Mentoring. Research opportunities. Recognition and awards. Job placement in industry, government, and academia.
  236. Need for recognition and awards. Acknowledge and reward notable technical and professional contributions to the field.
  237. Need for code of ethics and professional conduct.
  238. Quantum computing as a profession. This is a new and open area — what does it mean to be a professional in quantum computing? Is it just that the subject matter is somewhat different? It may be true that the common subject matter binds the disparate professionals together even if their specific professional disciplines differ.
  239. Benchmarking. Measuring the capabilities of a particular quantum computer, and comparison with other quantum computers.
  240. Education. Both academic and professional education in quantum computing.
  241. Training. For employees and users in quantum computing.
  242. Workforce development. Create, maintain, and enhance the potential for individuals to be productive employees in quantum computing.
  243. How to get started with quantum computing. There are any number of alternative approaches and paths to get started in quantum computing. It will depend on the nature of an organization and its objectives, as well as the skills and interests of its existing technical teams.
  244. Investors and venture capital for quantum computing.
  245. Commitment to a service level agreement (SLA). Production deployment requires a solid commitment for availability, performance, capacity, support, redundancy, etc. Be sure to have contractual commitments to all of the above, nominally in the form of a service level agreement (SLA). Be sure to read all of the fine print.
  246. Diversity of sourcing. Don’t be reliant on a single provider for any service, equipment, software, or tool. Companies can go out of business during a technological winter, or change their terms of service in an unacceptable manner at any time.
  247. Dedicated access vs. shared access. When we get into the commercialization stage of adoption of quantum computing and people begin focusing on production deployment, they will begin worrying about possibly needing dedicated access to quantum computing hardware rather than occasional shared access for short periods of time.
  248. Deployment and production. Once an application has been developed and tested, it’s ready to be deployed and put into production use.
  249. Shipment, release, and deployment criteria. It’s hard enough to get a handle on what an algorithm or application must do, but then there are all of the other factors that go into engineering an industrial-grade and production-scale product or service. This comes down to clearly defining and evaluating criteria for shipment, release, and deployment.
  250. How much might a quantum computer system cost? Generally this question is beyond the scope of this paper at this time, but it’s worth raising the question on principle.
  251. Pricing for service — leasing and usage. Not everyone will wish to purchase a quantum computer system to own it all for themselves. There are alternatives.
  252. Pricing for software, tools, algorithms, applications, and services. Beyond the scope of this paper, but worth highlighting on principle.
  253. Open source is essential. Everything should be open source. All of the software for sure. Generally firmware as well. Any diagnostics, configuration tools, and support software. And operating systems. And even hardware designs when possible. Facilitate customization and extension by customers and academic and government researchers.
  254. Intellectual property (IP) — boon or bane? Intellectual property (IP) such as patents can cut both ways. The prospect of proprietary advantage is a fantastic incentive. But open source can be a huge advantage as well. If too much of the key technologies of quantum computing are locked up due to IP protections, innovation and adoption can be stifled or delayed. See also Risk of a Quantum IP Winter below.
  255. Shared knowledge — opportunities and obstacles. Knowledge is essential and critical for any thriving technical sector. Quantum computing is no exception. There are plenty of opportunities for freely and openly shared knowledge, but also potential for proprietary hiding of knowledge, and fee-based or licensed knowledge as well. And also sensitive, secret, confidential, or classified knowledge which cannot be shared.
  256. Secret projects — sometimes they can’t be avoided. They happen. It’s unavoidable. But not helpful to the health of the quantum computing sector.
  257. Secret government projects — they’re the worst. Unlike private sector secrecy, secret government projects are likely to remain secret indefinitely, at least until their service life has ended and it no longer matters. We can all see what’s going on with publicized quantum computing efforts, but who knows what is going on in secretive government laboratories.
  258. Transparency is essential — secrecy sucks! A thriving, healthy, sustainable, and vibrant quantum computing sector requires a maximal degree of transparency. This ties the preceding sections together into the overarching concept of transparency: open source maximizes transparency; intellectual property minimizes transparency, or at least constrains it to some degree; sharing of knowledge maximizes transparency most of the time, although in some cases it is constrained; and secret projects sometimes can’t be avoided, but secrecy is not helpful to the health of the quantum computing sector.
  259. An open source quantum computer would be of great value. An open source project for an entire quantum computer would be quite useful. This would enable a greater population of researchers to quickly start up new quantum computing research projects without any need to reinvent the basic wheel just to get started. All of the science and engineering details would be worked out in full detail and posted online and free, with a thriving online support and contribution community. All that would be needed would be to order the parts and put it together. And maybe a startup could offer kits to do so even more easily. And another startup to put the kit together for you.
  260. Computational diversity. Refers to the use of a variety of types of computing hardware, not only classical computers, but GPU, FPGA, and others as well. Quantum computers are added to the mix.
  261. Quantum-inspired algorithms and quantum-inspired computing. The radically different mindset of quantum computing which focuses on quantum parallelism can lead to novel re-thinking of how problems can be solved on classical computers, particularly to exploit multiple processors, parallel processing, and large distributed clusters. Sampling approaches such as Monte Carlo simulation can also be used to approximate solutions on a classical computer, but still modeled on quantum parallelism.
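As a purely classical illustration of the sampling style mentioned above, here is a toy Monte Carlo estimate of pi from random points in the unit square. This is only a sketch of the sampling mindset on a classical computer, not any quantum-inspired library or method:

```python
# Toy illustration of classical sampling: a Monte Carlo estimate of pi
# from random points in the unit square. Purely a classical sketch of
# the sampling style, not a quantum-inspired technique per se.

import random

random.seed(42)  # fixed seed so the run is repeatable
samples = 100_000
inside = sum(
    1 for _ in range(samples)
    if random.random() ** 2 + random.random() ** 2 <= 1.0
)
estimate = 4 * inside / samples
print(estimate)  # roughly 3.14
```

The idea is that many cheap random samples can approximate an answer that would be expensive to compute exactly, loosely analogous to how repeated measurements sample a quantum solution space.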
  262. Collaboration — strategic alliances, partnerships, joint ventures, and programs. Organizations leverage the resources of each other for shared benefits.
  263. Classical computing still has a lot of runway ahead of it and quantum computing still has a long way to go to catch up. It is likely premature to give up on classical computing and put all (or most, or even many) of your eggs in the quantum computing basket. There is also the potential for GPUs and other forms of computational diversity before quantum computing becomes the slam-dunk best choice for many applications.
  264. Science fiction. Real science is frequently predicted or anticipated much earlier in fiction or science fiction. It’s always interesting to see how science reality turns out compared to its fictional predecessors. There have been a handful of cameo appearances or minor references to quantum computers in some fictional works, but I’m surprised that there haven’t been more. In truth, there has been a vast wealth of fictional treatment of quantum computing — in the form of hype and anticipation of what we think or imagine quantum computing will be or be like when we finally do get to practical quantum computing in another five to ten years. It actually might be quite helpful to see greater fictional treatments of quantum computing. Sometimes examination of fiction can provide us with insight about where the true boundaries are between fact and fiction. And who’s to say what fact really is when we are speculating about the future.
  265. Universal quantum computer is an ambiguous term. Two valid meanings. First, universal gate set, which means that the machine is capable of all possible functional manipulations of quantum information and implies general purpose, in contrast to special purpose. Second, a universal quantum computer, which is a proposal for a merger of quantum computing and classical computing into a single integrated machine with classical and quantum data coexisting.
  266. No, quantum computers cannot crack encryption keys, today, in the next few years, and possibly ever.
  267. Might a Quantum Winter be coming? Not in the next couple of years. But if promised advances don’t materialize over two to three years, then a Quantum Winter might be possible.
  268. Risk of a Quantum IP Winter. Separate from the possibility of a Quantum Winter based on the inability of technology to fulfill expectations, there is also the possibility of a Quantum Winter based on restrictions on use of available technology as a result of intellectual property protections. The technology needed to fulfill expectations may indeed be available, but restricted due to IP legal protections, onerous licensing fees, contract disputes, or protracted lawsuits.
  269. No, Grover’s search algorithm can’t search or query a database.
  270. No, Shor’s factoring algorithm probably can’t crack a large encryption key.
  271. No, variational methods don’t show any promise of delivering any dramatic quantum advantage. If quantum Fourier transform cannot be used, one category of alternatives is variational methods. Unfortunately, they are not anywhere near as powerful as quantum Fourier transform. They work, in a fashion, but don’t offer much computational power or opportunity for truly dramatic quantum advantage. So far, only mediocre results, at best. And they have difficulties, such as so-called barren plateaus, which make them difficult and problematic to work with. Mostly such an approach simply confirms that a solution can be implemented on a quantum computer, not that such a solution has any great advantage over classical solutions.
  272. The hardware is relatively simple compared to even a typical personal computer, tablet, or smartphone, but packaged in a very large system to accommodate cryogenic refrigeration or a vacuum chamber or lots of laser devices, plus a lot of specialized electronics which have not yet been miniaturized since the technology is so new and still under active research.
  273. Jargon and acronyms — most of it can be ignored unless you really need it.
  274. My glossary of quantum computing terms. Over 3,000 entries, including terms from quantum mechanics and classical computing.
  275. arXiv — the definitive source for preprints of research papers. Most of the academic and industrial research papers I read can be found on arXiv.org, which specializes in preprints of research papers. If you do a Google search on some topic of quantum computing, results that come from arXiv.org can usually be trusted to be fairly definitive. You can also just add the “arxiv” keyword to your Google search to force Google to find search results from arXiv.org.
  276. Free online books. Many are available.
  277. Resources for quantum computing. Just do a Google search on any questions you have. As well as resources I’ve linked throughout this paper.
  278. Personas, use cases, and access patterns for quantum computing. In order to put quantum computing in context, we need to be able to discuss who is trying to do what and how they are going to do it. Personas are the who, use cases are the what, and access patterns are the how. Personas, use cases, and access patterns lie at the intersection of workforce and applications, with personas corresponding to jobs, positions, or roles of the workforce, and use cases corresponding to applications.
  279. Beware of oddball, contrived computer science experiments. It is relatively easy to contrive artificial computer science experiments that seem rather impressive, but don’t represent any practical real-world problems or their solutions. Google’s quantum supremacy experiment is one example. Boson sampling is another. Interesting experiments for computer scientists, but no practical applications.
  280. Beware of research projects masquerading as commercial companies. It can be tempting to move a promising research project out of the research laboratory and form a commercial company around it to finish the research and commercialize it, long before the research is likely to be completed and actually ready for commercialization. There is a high risk of either outright failure, or a very extended timeframe before the research really is ready to be turned into a viable commercial product.
  281. Technology transfer. It is not uncommon for a research-oriented organization to develop some new technology and sense that it has potential commercial value, but recognize that realizing commercial value is beyond their charter, capabilities, and interests. Organizations finding themselves in such a situation might then consider technology transfer to a commercial organization who is more ideally positioned and equipped to realize the perceived commercial potential.
  282. Research spinoffs. A research spinoff, such as a university spinoff or academic spinoff, is a variation of technology transfer where the research organization creates a commercial entity to pursue commercialization of the technology developed by the researchers.
  283. Don’t confuse passion for pragmatism. Enthusiasm is a good thing, but exuberant passion is not a valid substitute for pragmatism.
  284. Quantum hype — enough said. There’s a lot of it out there. I do address a fair amount of it directly. But a lot of it doesn’t deserve our attention. Just ignore all of the noise.
  285. Role of academia and the private sector. Research and development of commercial products. And of course education of students and preparing the workforce.
  286. Role of national governments. Funding of a lot of research. Demand for quantum applications.
  287. When will quantum computing finally be practical? Practical quantum computing isn’t really near, like within one, two, or three years. Although quantum computers do exist today, they are not ready for production deployment to address production-scale practical real-world problems. Four to seven years is a fair bet, and even then only for moderate benefits.
  288. Names of the common traditional quantum algorithms. This list is presented here simply to illustrate where we are today in quantum computing. It also highlights that we have a long way to go before we have a decent foundation for average application developers to begin exploiting quantum computing.
  289. Quantum teleportation — not relevant to quantum computing. I only mention quantum teleportation here because it does get mentioned a lot in introductions to quantum computing even though it has nothing to do with quantum computing. Rather, it is the foundation for quantum communication, the use of quantum state to securely transmit classical information, which is rather distinct from quantum computing.
  290. IBM model for quantum computing. IBM has a somewhat distinctive model for quantum computing. You can’t blame them for wanting to distinguish themselves from the rest of the pack. I won’t dive into it too deeply here, but you can get a better sense of it from their latest (2022) Quantum roadmap.
  291. Circuit knitting. Circuit knitting breaks a large quantum circuit into smaller pieces to run on multiple quantum processors, and then combines the results together on a classical computer. This is a fairly recent development that is not yet widely available. IBM has announced it and targeted it for 2025 in their Development Roadmap.
  292. Dynamic circuits. A technique for interspersing classical processing in the middle of quantum circuits, allowing classical code to detect conditions in intermediate quantum results, and dynamically selecting portions of quantum circuits to be executed based on classical logic. This is a fairly recent development that is not yet widely available. IBM has announced it and targeted it for later in 2022 in their Development Roadmap.
  293. Future prospects of quantum computing. Very speculative. Too much is unknown.
  294. Post-quantum computing — beyond the future. What might come after quantum computing? It may seem way too soon to even contemplate what might come after quantum computing, especially since we aren’t even close to achieving practical quantum computing yet, but still it’s an intriguing thought.
  295. Should we speak of quantum computing as an industry, a field, a discipline, a sector, a domain, a realm, or what? My choice: sector (of the overall computing industry).

The grand challenge of quantum computing

Quantum computers provide some truly amazing capabilities to achieve dramatic quantum advantage over classical solutions. The grand challenge is to figure out how to exploit those capabilities most effectively, both for a wide range of applications and for particular application problems, to enable production-scale practical real-world quantum applications that achieve dramatic quantum advantage and deliver extraordinary business value.

That is no easy task.

Designing quantum algorithms is a great challenge.

Developing quantum applications which can effectively utilize these quantum algorithms is also a great challenge.

The twin goals of quantum computing: achieve dramatic quantum advantage and deliver extraordinary business value

The only point of using a quantum computer is to achieve twin goals:

  1. Achieve dramatic quantum advantage.
  2. Deliver extraordinary business value.

The first is needed to achieve the second, but it is really the second that is the primary goal — deliver extraordinary business value.

Without delivering extraordinary business value, there’s no real point of going quantum.

And the implication is that only by achieving dramatic quantum advantage can we get to delivering extraordinary business value.

Achieve dramatic quantum advantage

The goal of using a quantum computer is not simply to get some performance advantage, or even a significant advantage, but to achieve a dramatic quantum advantage, a mind-boggling performance advantage.

Not just a 25% or 50% improvement, or even a 2X, 10X, or 100X improvement, but more like a million billion times advantage over classical computing, and that’s just for starters.

The goal is to achieve results that simply aren’t within the realm of practicality using classical computers.

Deliver extraordinary business value

But even more important than raw computational performance advantage, a quantum computer needs to deliver actual business value, and in fact extraordinary business value.

Classical computers can deliver business value, so quantum computing needs to deliver business value which simply isn’t within the realm of practicality using classical computers.

Quantum computing is the process of utilizing quantum computers to address production-scale practical real-world problems

Quantum computers are machines — hardware and software. The question is how to most effectively utilize those machines to address production-scale practical real-world problems. This is a process. Quantum computing is this process.

It’s not an easy process.

And in fact it’s a work in progress. Nobody has it all figured out yet.

The components of quantum computing

These are the major components, functions, areas, and topics which need to be discussed to cover quantum computing:

  1. Quantum computers. The machines themselves.
  2. Classical quantum simulators. Using classical software to simulate quantum algorithms as if running on actual, real quantum computers.
  3. Quantum algorithms.
  4. Quantum applications.
  5. Tools.
  6. Compilers, transpilers, and quantum circuit optimization.
  7. Support software.
  8. System management software. Including job scheduling and management.
  9. Programming models.
  10. Programming languages.
  11. Algorithmic building blocks.
  12. Design patterns.
  13. Application frameworks.
  14. Scalability of quantum algorithms.
  15. Debugging.
  16. Testing.
  17. Deployment.
  18. Monitoring applications.
  19. Performance.
  20. Capacity.
  21. Availability.
  22. Service level agreements (SLA).
  23. Benchmarking.
  24. Education.
  25. Training.
  26. Workforce development.
  27. The ENIAC Moment. First production-scale quantum application.
  28. Configurable packaged quantum solutions. The greatest opportunity for widespread adoption of quantum computing.
  29. The FORTRAN Moment. Introduction of a high-level programming model and language which enables application development without needing to be a quantum expert.
  30. Moving beyond the lunatic fringe.
  31. Moving beyond being a mere laboratory curiosity.
  32. Quantum advantage.
  33. Collaboration. Alliances, partnerships, joint ventures, programs.
  34. Need for an Association for Quantum Computing Machinery — dedicated to the advancement of practical quantum computing.
  35. Quantum computing as a profession.

Quantum computer, quantum computation, and quantum computing

These three terms are very similar but still distinct:

  1. Quantum computation. The work that is to be accomplished by a quantum algorithm. The actual work that is accomplished using a quantum algorithm on a quantum computer.
  2. Quantum computer. The machine, the device which actually performs the work for a quantum computation. It actually executes the quantum algorithm.
  3. Quantum computing. The process of using a quantum computer to perform quantum computation. The process of designing and developing quantum algorithms and quantum applications. The process of integrating quantum computation with classical computation to collectively form a complete quantum application. The process of using quantum applications.

Quantum computing is the use of a quantum computer to perform quantum computation

Or putting it all together:

  • Quantum computing is the use of a quantum computer to perform quantum computation.

Quantum algorithms and quantum applications

Besides the quantum computer itself, there are two central types of objects that we focus on in quantum computing: quantum algorithms and quantum applications.

The quantum algorithms perform the actual computation on the quantum computer. They represent the quantum computation to be performed.

The quantum application is classical code which invokes the quantum algorithms to be run on the quantum computer, and then processes the results from the quantum computation in the classical code of the application.

Approaches to quantum computing

Quantum computing is not a one-size-fits-all process. There are a variety of approaches, and it’s not clear which will work best. To each his (or her) own. It’s all a work in progress.

Some tentative approaches:

  1. Research. Focus on research alone. Any commercial product development is left to others.
  2. Commercial product development. Design and develop commercial products based on research results completed by others, preferably using off the shelf technology.
  3. Combined research and commercial product development. Both research and commercial product development. Risky and not recommended. Research efforts may fail, fall short of expectations, or take much longer than expected.
  4. Top down. Start with critical applications and then consider how quantum computation can be utilized.
  5. Bottom up. Design a broad range of critical quantum algorithms and then consider how applications can use them.
  6. Configurable packaged quantum solutions. Hand-crafted generalized applications which can then be easily configured to address specific application situations.
  7. Prototyping. Focus on stripped-down subsets of applications chosen to be more easily solved with simpler approaches to quantum algorithms. If and when that works, look for opportunities to expand to make the prototype algorithms and applications more full-featured and incrementally less stripped down.
  8. Pilot projects. Full-featured, but limited size. Shows how everything would work. Large enough size to get a handle on capacity issues, but less than full production scale.
  9. Proof of concept projects. Test out particular ideas, but not necessarily for an actual, specific application. Will the idea work as expected? Test out performance and capacity issues.
  10. Production application development. Design, implement, test, and deploy full and complete applications.
  11. Major consulting firms — assistance. Accenture, et al. Help give major corporate clients a boost. Review and advise.
  12. Major consulting firms — application development. Actual design and development of quantum algorithms and applications for major corporate clients. Initially prototyping. Then pilot projects — near production scale, but no actual deployment. Eventually production application development — and deployment.
  13. Major consulting firms — license quantum IP. Develop and license quantum intellectual property to major corporate clients.
  14. Hardware vendors. Focus on the actual quantum processing units.
  15. Full stack quantum computing. Hardware and software.
  16. Software and tools. Software and tools that facilitate use of the hardware or design and implementation of quantum algorithms and quantum applications.
  17. Quantum startups. Anything goes. The full spectrum of possibilities. One or more of the above. May tend to specialize in niches for maximum impact, but target multiple niches to gain advantage by integrating functions across the targeted niches.

Three stages for development of quantum algorithms and quantum applications — prototyping, pilot projects, and production projects

Since quantum computing is so new and undeveloped, a multi-stage process will generally be needed to develop quantum algorithms and quantum applications:

  1. Prototyping. Demonstrating a subset of basic capabilities. Are they even remotely feasible? The research stage. Basic proof of concept as well.
  2. Pilot projects. Near production-scale. Full-featured. But no deployment. Testing scalability for algorithms and applications. Larger-scale proof of concept.
  3. Production projects. Full production-scale. Full-featured. Robust quality. Fault-tolerant. Ready for production deployment. Production deployment, monitoring, problem recovery, maintenance, and enhancement.

Prototyping, experimentation, and evaluation

Quantum computing is new, untried, and unproven, so it needs to be tried and proven. And feedback needs to be given to researchers and vendors. And decisions must be made as to whether to pursue the technology as it is or pass on it. Or maybe simply defer until a later date when the technology has matured some more.

In the early stages, when little is known, the main approaches are:

  1. Prototyping. Demonstrating a subset of basic capabilities. Are they even remotely feasible? The research stage. Basic proof of concept.
  2. Experimentation. Just trying things out. See how well it works. Run tests. Make improvements. Iterate.
  3. Evaluation. Review the results of prototyping and experimentation. Make improvements as needed and iterate. Converge to an opinion as to how workable the technology really is. Make a go/no-go decision on moving forward with the technology.

And each of these stages or approaches can involve providing feedback to researchers and vendors. Then each of these stages or approaches can be iterated as fresh research results and iterations of the quantum computing hardware and software are made available.

Proof of concept experiments

It’s probably a fielder’s choice whether to consider a project to be a prototype or a proof of concept. The intent is roughly the same — to demonstrate a subset of basic capabilities. In this paper I bundle proof of concept experiments under prototyping.

Although there is a narrower additional meaning for proof of concept, namely, to prove or demonstrate that some full feature or full capability can be successfully implemented.

A prototype can be a simplified subset, as can a proof of concept, but a proof of concept can also be used to test the extremes for a capability or feature. To make sure that it will really work under more realistic, production conditions.

Even if a proof of concept does test or prove a capability or feature to its full extremes, the proof of concept would generally only cover a subset of the full capabilities or features of the full quantum application or full quantum algorithm.

Generally, a prototype would never be expected to demonstrate or prove anything in a full and complete sense.

Prototyping vs. pilot project

Prototyping and pilot projects can also have a lot in common, but they have very distinct purposes:

  1. Prototyping seeks to prove the feasibility of a small but significant subset of functions.
  2. A pilot project seeks to prove that a prototype can be scaled up to the full set of required functions and at a realistic subset of production scale capacity. And validate performance expectations.

Generally, a pilot project might be preceded by one or more prototypes, each testing different sets of features or adding features. Eventually the technical team will have built up enough confidence from their prototyping to proceed to the pilot project.

This is all occurring in the pre-commercialization stage of development. Too soon to even be considering commercialization.

Production projects and production deployment

In the pre-commercialization stage we are focused on trying things out and experimentation — proving concepts and gaining knowledge. But once pre-commercialization is complete, we move on to commercialization, with production projects where the goal is to design and build complete, full-featured products or services.

The culmination of a production project is a product or service which is ready for deployment.

When a product or service is ready to be made operational, to begin fulfilling its promised features, we call that production deployment.

Production deployment of a production-scale practical real-world quantum application

That’s the ultimate goal for a quantum computing project, when it’s ready to be made operational to begin fulfilling its promise:

  • Production deployment of a production-scale practical real-world quantum application

Of course, this event occurs during the commercialization stage, not during pre-commercialization.

Quantum effects and how they enable quantum computing

This paper is intended to be a relatively light and high level introduction to quantum computers, so it’s too much to delve deeply into the physics behind quantum computing. All you need to know at this stage is that quantum effects enable quantum parallelism.

But if you do wish to dive deep to understand the physics (quantum mechanics) which enables quantum computers, quantum parallelism, and quantum computing, check out my paper:

Quantum information

Information is more complicated in quantum computing. Classical information is based on the classical binary 0 and 1. Quantum information takes classical information to a whole new level, more than can be trivially summarized here. Once again, this paper is intended to be a relatively light and high level introduction to quantum computers, so it’s too much to delve deeply into the physics behind quantum computing. All you need to know at this stage is that quantum effects allow information to be represented, stored, and manipulated as quantum information, which in turn enables quantum parallelism.

Some of the details will be given in subsequent sections.

As a teaser, a mere k qubits can represent 2^k distinct values, all simultaneously. In contrast, k classical bits can also represent 2^k distinct values, but only one at a time. That’s a huge difference. Each of the 2^k distinct values on a quantum computer is known as a product state. Product states are what enable the quantum parallelism of quantum computing.
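For readers comfortable with a few lines of code, the counting can be illustrated classically (this is just counting combinations, not a simulation of quantum behavior):

```python
from itertools import product

# Each combination of k binary values corresponds to one of the
# 2**k product states. k qubits can hold all of them at once;
# k classical bits hold only one combination at a time.
k = 3
states = list(product([0, 1], repeat=k))
print(len(states))  # 8, i.e., 2**3
```

Loosely, the name product state echoes this: the 2**k values are exactly the combinations (the Cartesian product) of the individual qubit basis values.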

If you really want to dive deeply into quantum information, check out my paper:

Such as the section entitled “Unlike a classical bit, a qubit is a device”. Like I said, it gets complicated.

Quantum information science (QIS) as the umbrella field over quantum computing

People sometimes treat quantum computing and quantum information science as if they were exact synonyms, but there is a clear distinction. Quantum information science is the umbrella field which covers:

  1. Quantum computing.
  2. Quantum communication. Transferring classical information using quantum state for security.
  3. Quantum networking. Transferring quantum state between separate quantum computers.
  4. Quantum metrology. Measurement of discrete physical quantities.
  5. Quantum sensing. Observing entire quantum phenomena. Including imaging.
  6. Quantum information. What can be represented in the quantum state of quantum systems.

So, quantum computing is certainly part of quantum information science, but there’s more to quantum information science than quantum computing alone.

For more detail on quantum information science, see my paper:

Quantum information science and technology (QIST) — the science and engineering of quantum systems

Quantum information science and technology (QIST) is sometimes simply a synonym for quantum information science (QIS), but emphasizes both the science and the engineering of quantum systems.

Researchers may be more concerned with quantum information science (QIS) alone, but users and developers of products based on quantum information science are more concerned with quantum information science and technology (QIST), with emphasis on the engineering.

Quantum mechanics — can be ignored at this stage

Quantum mechanics along with quantum effects is the physics that enables quantum computers, but once again this paper is intended to be a relatively light and high level introduction to quantum computers, so it’s too much to delve deeply into the physics behind quantum computers.

But if you do wish to dive deep to understand the physics behind quantum computers, check out my paper on quantum effects:

Quantum physics — can also be ignored at this stage

Quantum physics is sometimes used simply as a synonym for quantum mechanics, but generally refers to physics at the molecular, atomic, and particle level. Quantum mechanics is a little more general and abstract than just the physics, the physical systems, but that’s well beyond the scope of a light, high-level view of quantum computing.

Quantum state

Quantum state is the essence behind how quantum information is represented, stored, and manipulated in a quantum computer in a form that enables quantum parallelism. But once again this paper is intended to be a relatively light and high level introduction to quantum computing, so it’s too much to delve deeply into the physics behind quantum computers and quantum state.

Some details will be covered in subsequent sections of this paper.

Quantum state is covered in greater depth in my paper on quantum information:

What is a qubit? Don’t worry about it at this stage!

At this stage it isn’t necessary to get into esoteric details such as qubits and how they might be related to the classical bits of a classical computer. All that really matters is quantum parallelism. Qubits are just an element of the technology needed to achieve quantum parallelism.

Some of the other esoteric details of quantum computers which we similarly don’t need to get into at this level include:

  1. Quantum state.
  2. Basis states.
  3. Rotation of quantum state about three axes.
  4. Bloch sphere.
  5. Quantum logic gates.
  6. Quantum circuits.
  7. Probability amplitudes.
  8. Phase angle.
  9. Superposition.
  10. Entanglement.
  11. Product states.
  12. Interference.
  13. Quantum mechanics.
  14. Vector spaces.
  15. Complex numbers.
  16. Wave function.
  17. Hamiltonian.
  18. Measurement.

No, a qubit isn’t comparable to a classical bit

Again, it isn’t necessary to get into esoteric details such as qubits at this stage — and you really don’t need to know anything about qubits to understand the essence of quantum computers, which is quantum parallelism.

Most puff pieces on quantum computing start by comparing quantum bits (qubits) to classical bits, but they simply aren’t comparable and it gets too complicated to unravel the differences.

Trying to compare quantum bits with classical bits is as useless as comparing:

  1. A car to a bicycle.
  2. A plane to a car.
  3. A boat to a car.
  4. A jet to a plane.
  5. A rocket to a jet.
  6. A rocket to a snail. Hmmm… maybe that’s not such a bad comparison after all! But it’s still more complicated than that.

See also the section above entitled Quantum information. Although if you’re just trying to get a high-level view of quantum computing, which really is the purpose and point of this paper, don’t waste your energy on it, at least at this stage.

Qubits and quantum information are covered in greater depth in my paper on quantum information:

A qubit is a hardware device comparable to a classical flip flop

The purpose of a qubit is to store and manipulate a unit of quantum state, in much the same way that a classical digital electronic flip flop is used to store and manipulate a classical bit.

Quantum state is used to represent a unit of quantum information.

The distinction here is that a classical bit is the actual classical information (0 or 1) while a qubit is a hardware device which stores and manipulates quantum information.

For more on quantum information, see my paper:

Superposition, entanglement, and product states enable quantum parallelism

The details of superposition, entanglement, and product states are beyond the scope of this paper, but they combine to enable quantum parallelism, which is the ability to operate on 2^k distinct values in parallel with only k qubits. Some examples:

  1. 10 qubits can operate on 2¹⁰ or approximately 1,000 distinct values simultaneously.
  2. 20 qubits can operate on 2²⁰ or approximately a million distinct values simultaneously.
  3. 30 qubits can operate on 2³⁰ or approximately a billion distinct values simultaneously.
  4. 40 qubits can operate on 2⁴⁰ or approximately a trillion distinct values simultaneously.
  5. 50 qubits can operate on 2⁵⁰ or approximately a quadrillion distinct values simultaneously.
  6. And so on.

Quantum system — in physics

In physics, an isolated quantum system is any collection of particles which has a quantum state distinct from the quantum state of other isolated quantum systems.

Actually, by definition, a quantum system is isolated — an isolated quantum system. Being isolated and having a distinct quantum state is what makes it a quantum system.

A single, isolated, unentangled qubit is an isolated quantum system. It has its own quantum state distinct from all other qubits.

A collection of entangled qubits is also an isolated quantum system. The collection of entangled qubits has its own quantum state distinct from all other qubits that are not in the collection.

Quantum system — a quantum computer

In addition to the meaning of quantum system in physics, a quantum system in quantum computing can also simply refer to a quantum computer or a quantum computer system.

Computational leverage

Computational leverage is how many times faster a quantum algorithm would be compared to a comparable classical algorithm.

Such as 1,000 X, one million X, or even one quadrillion X relative to a classical solution.

Generally, k qubits will provide a computational leverage of 2^k. Such as:

  1. 10 qubits = 2¹⁰ = ~ 1,000 X.
  2. 20 qubits = 2²⁰ = ~ one million X.
  3. 50 qubits = 2⁵⁰ = ~ one quadrillion X.
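If you want to check those magnitudes yourself, a few lines of code will do it (the “approximately” simply reflects that powers of two are close to powers of ten):

```python
# Computational leverage of k qubits, assuming an algorithm can
# actually exploit the full 2**k quantum parallelism.
for k in (10, 20, 50):
    print(f"{k} qubits = {2**k:,} X")
# 10 qubits = 1,024 X
# 20 qubits = 1,048,576 X
# 50 qubits = 1,125,899,906,842,624 X
```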

k qubits enable a solution space of 2^k quantum states

As indicated by the previous sections, the superposition, entanglement, and product states of k qubits combine to enable parallel computation on 2^k distinct values (quantum states) simultaneously. This is also referred to as a solution space of 2^k quantum states.

This isn’t quite the same as loading 2^k distinct values into memory, but does enable computations on a wide range of possible values simultaneously.

Product states are the quantum states of quantum parallelism

When superposition and entanglement of k qubits enables 2^k quantum states, each of those unique 2^k quantum states is known as a product state.

These 2^k product states are the unique values used by quantum parallelism.

Product states are the quantum states of entangled qubits

To state it more explicitly, a product state is only meaningful when qubits are entangled.

An unentangled (isolated) qubit will not be in a product state.

k qubits enable a solution space of 2^k product states

As we said earlier, a solution space for k qubits is composed of 2^k quantum states, but it is a little more proper and meaningful to say that a solution space for k qubits is composed of 2^k product states, to emphasize that the quantum state requires more than a single qubit.

Each of the 2^k product states is a unique value in the solution space.

Qubit fidelity

Qubit fidelity refers to how reliable the qubits are — whether they can maintain their quantum state for a sufficiently long period of time, and whether operations can be performed on them reliably without causing errors.

For more detail on qubit fidelity, see my paper:

Qubit fidelity is commonly expressed as nines of qubit fidelity.

Nines of qubit fidelity

Qubit fidelity can be expressed in a number of ways, but nines of qubit fidelity is one convenient and popular method. Simply express the qubit reliability as a percentage and then count the number of leading nines in the percentage, as well as the fraction of a nine after the last nine. More nines is better — higher fidelity.

Some examples:

  1. 90% = one nine.
  2. 95% = 1.5 nines.
  3. 98% = 1.8 nines.
  4. 99% = two nines.
  5. 99.5% = 2.5 nines.
  6. 99.8% = 2.8 nines.
  7. 99.9% = three nines. The minimum for reasonable quality results.
  8. 99.95% = 3.5 nines. A better goal than 3 nines.
  9. 99.975% = 3.75 nines. Even better.
  10. 99.99% = four nines. Really where we need to be, but quite a stretch at this stage.
  11. 99.999% = five nines.
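The digit-counting convention above (count the leading nines, then treat any remaining digits as a fraction of a nine) can be sketched in Python. The function name is just illustrative:

```python
def nines_of_fidelity(percent: str) -> float:
    """Nines of qubit fidelity, per the digit-counting convention above:
    count the leading 9 digits, then append any remaining digits as a
    fraction of a nine. E.g. "99.95" -> 3.5, "99.975" -> 3.75."""
    digits = percent.replace(".", "").rstrip("0")  # "99.95" -> "9995"
    leading = 0
    while leading < len(digits) and digits[leading] == "9":
        leading += 1
    rest = digits[leading:]
    return leading + (float("0." + rest) if rest else 0.0)
```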

For more on nines of qubit fidelity, see my paper:

Qubit connectivity

Qubit connectivity refers to how easily two qubits can operate on each other. Usually it has to do with the physical distance between the two qubits. If they are adjacent, they can generally operate on each other most efficiently, but if they are separated by some distance, it may be necessary to move the quantum states of one or both of them so that they are physically adjacent, using so-called SWAP networks. Such movement takes time and can introduce further errors, which impact qubit fidelity. Some qubit technologies, such as trapped-ion qubits, support full any-to-any qubit connectivity, which avoids these problems.

For more detail on qubit connectivity, see my paper on quantum computers:

Any-to-any qubit connectivity is best

The best performance and best qubit fidelity (fewest errors) comes with true, full, any-to-any qubit connectivity. Any two qubits can interact, regardless of the distance between them. Not all qubit technologies support it — notably, superconducting transmon qubits do not, but some do, such as trapped-ion qubits.

Full qubit connectivity

Generally a synonym for any-to-any qubit connectivity.

SWAP networks to achieve full qubit connectivity — works, but slower and lower fidelity

SWAP networks are sometimes needed to achieve full qubit connectivity. This approach does work, but is slower and causes lower qubit fidelity.

When two qubits are to be used together in the same quantum logic gate, one of three approaches will be used:

  1. If the qubits are physically adjacent, the gate can be immediately executed. This is fast and high fidelity. This is ideal.
  2. If full any-to-any qubit connectivity is supported by the qubit technology, the gate can be immediately executed. This is fast and high fidelity. This is also ideal.
  3. But if the qubits are not physically adjacent and any-to-any qubit connectivity is not supported, a SWAP network will be needed to shuffle the quantum states of the two qubits to a pair of qubits which are physically adjacent. This can be slow and can introduce errors.

There are compilers, transpilers, optimizers, and other tools which can automatically optimize a quantum circuit to minimize the need for and impact from SWAP networks, but even though they can automate the process of generating the SWAP networks, this can have the side effect of:

  1. Making the quantum circuit larger. And if it gets too large, it might not be able to be executed at all.
  2. Slowing the quantum circuit down. The circuit may then exceed the coherence time of the quantum computer.
  3. Introducing errors. More errors means lower qubit fidelity. Too much reliance on SWAP networks can render the results of the quantum circuit meaningless due to an excessive error rate.

A single SWAP network may consist of an arbitrary number of SWAP gates, each of which swaps the quantum state of two adjacent qubits. So if a considerable distance separates two qubits, a number of SWAP gates must be used, which compounds the performance and fidelity degradation.

Actually, a SWAP gate is really a composite operation of three CNOT gates, each operating on the same two qubits, but with the control and target qubits alternating. So, a SWAP network can generate a lot of extra CNOT quantum logic gates, and potentially a lot of errors.

Again, there are tools to automate this, but they have the negative side effect of impacting performance, circuit size, and overall circuit fidelity and errors.

In short, some quantum algorithms may seem practical even on today’s quantum computers, but may fail to operate properly once the negative impacts from SWAP networks are taken into account.
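The gate-count arithmetic above can be sketched in a few lines of Python. This assumes, purely for illustration, a simple linear nearest-neighbor qubit layout where bringing two qubits adjacent takes (distance minus one) SWAPs, and each SWAP decomposes into three CNOTs as just described:

```python
def swap_overhead(pos_a: int, pos_b: int):
    """Return (SWAP gates, extra CNOT gates) needed to bring two qubits
    adjacent on an assumed linear nearest-neighbor layout.
    Each SWAP decomposes into 3 CNOT gates."""
    swaps = max(abs(pos_a - pos_b) - 1, 0)
    return swaps, 3 * swaps

print(swap_overhead(0, 1))   # already adjacent: no overhead
print(swap_overhead(0, 10))  # distant qubits: many extra gates, more errors
```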

May not be able to use all of the qubits in a quantum computer

Qubit fidelity and qubit connectivity issues can limit how many qubits can be used in a single quantum computation.

If qubit fidelity and qubit connectivity are too low, errors can accumulate so that at some point it is no longer possible to accurately perform the desired computation. This implicitly limits the number of qubits of the quantum computer that can be used in a quantum computation.

Quantum Volume (QV) measures how many of the qubits you can use in a single quantum computation

IBM came up with a single performance metric, Quantum Volume (QV), which combines qubit fidelity and any limitations on qubit connectivity. Technically it tells you how many quantum states can be used in a single quantum computation without excessive errors, but that is 2^k for k qubits, so k or log2(QV) is a measure of how many qubits you can use in a single quantum computation.

For more detail on Quantum Volume, see my paper:

Why is Quantum Volume (QV) valid only up to about 50 qubits?

It’s complicated, but measuring the Quantum Volume (QV) metric for a quantum computer requires simulating quantum circuits on a classical quantum simulator, which is constrained by memory capacity. Since simulation of quantum circuits larger than about 50 qubits is not feasible, obtaining a Quantum Volume (QV) metric greater than 2⁵⁰ is not feasible either.

This does not mean that you can’t measure Quantum Volume (QV) for quantum computers with more than 50 qubits (53, 65, 80, 100, 127, etc.), since generally such quantum computers can only reliably execute quantum circuits significantly smaller than their total qubit capacity. For example, as of the time of this writing, the Quantum Volume (QV) metric for the 127-qubit IBM Eagle quantum computer was only 64, meaning that quantum circuits using more than six qubits (6 = log2(64)) would tend to get excessive errors.
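The relationship between a Quantum Volume measurement and usable qubits is a simple base-2 logarithm, as in this small Python sketch:

```python
import math

# Sketch: usable qubits implied by a Quantum Volume measurement, k = log2(QV).
def usable_qubits(quantum_volume: int) -> int:
    return int(math.log2(quantum_volume))

# The IBM Eagle example above: QV of 64 implies about 6 usable qubits,
# despite the machine having 127 physical qubits.
print(usable_qubits(64))
```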

At some stage it is likely that quantum circuits using more than 50 qubits will be executable without excessive errors, but that won’t happen in the near future, or even the next year or two. A new benchmark will likely be in place by the time quantum circuits with 50 or more qubits can be executed reliably by a real quantum computer.

For more detail on this issue with Quantum Volume, see my paper:

Programming model — the essence of programming a quantum computer

The programming model is the essence of programming a quantum computer. It’s the rules of the road for quantum computing.

Although this paper is billed as focusing on what quantum algorithm designers and quantum application developers can see and expect from a quantum computer, which is roughly known as the programming model, the details of the programming model are at a lower level than the level of this paper, which is intended to be a relatively light and high level introduction to quantum computing.

This paper is not intended to be an introduction to programming quantum computers or algorithm design.

Although there are many online resources which can provide an introduction to programming quantum computers and algorithm design, one that stands out is the IBM Qiskit Textbook:

  • Learn Quantum Computation using Qiskit
  • Greetings from the Qiskit Community team! This textbook is a university quantum algorithms/computation course supplement based on Qiskit to help learn:
  • The mathematics behind quantum algorithms
  • Details about today’s non-fault-tolerant quantum devices
  • Writing code in Qiskit to implement quantum algorithms on IBM’s cloud quantum systems
  • https://qiskit.org/textbook/preface.html

There is also a section on Training later in this paper which lists a few training resources.

It is also worth noting that special-purpose quantum computers tend to have different programming models than general-purpose quantum computers. This paper focuses on the latter, general-purpose quantum computers.

Ideal quantum computer programming model not yet discovered

Research and innovation for how to represent, store, and manipulate quantum information is ongoing. Don’t get too invested in what we have today since it will likely be obsolete within a few to ten years.

If nothing else, the current general-purpose quantum computing programming model is far too primitive. A high-level programming model is needed.

Future programming model evolution

The current programming model for quantum computing is very primitive. It’s roughly comparable to classical machine language. It really isn’t for mere mortals, for the normal and average members of technical staff. It’s really only suitable for the most elite technical staff, the lunatic fringe. A high-level programming model is needed.

The current programming model may be sufficient to make it through the first two of the next three major stages of the adoption of quantum computing:

  1. A few hand-crafted applications (The ENIAC Moment). Limited to super-elite technical teams.
  2. A few configurable packaged quantum solutions. Focus super-elite technical teams on generalized, flexible, configurable applications which can then be configured and deployed by non-elite technical teams. Each such solution can be acquired and deployed by a fairly wide audience of users and organizations without any quantum expertise required.
  3. Higher-level programming model (The FORTRAN Moment). Which can be used by more normal, average, non-elite technical teams to develop custom quantum applications. Also predicated on perfect logical qubits based on full, automatic, and transparent quantum error correction (QEC). It is finally easy for most organizations to develop their own quantum applications.

That third stage would clearly require a radically new programming model.

There is also plenty of room for incremental improvement of the current programming model, although most of the improvements might come in the form of:

  1. Algorithmic building blocks. Various collections of the operations from the programming model, which perform interesting and useful high-level operations.
  2. Design patterns. Sequences and arrangements of the algorithmic building blocks which are known to be very effective — and easy to understand and easy to use.
  3. Application frameworks. Structure for the overall application or modular portions of the application which are known to be very effective — and easy to understand and easy to use.

Algorithmic building blocks, design patterns, and application frameworks

Once the foundational programming model is established, then the question becomes how algorithm designers proceed to design quantum algorithms and how application developers proceed to design and develop quantum applications.

There are four levels to work with:

  1. Fundamental operations of the programming model. The atomic operations of the quantum computer. Quantum logic gates.
  2. Algorithmic building blocks. Various collections of the operations from the programming model, which perform interesting and useful high-level operations.
  3. Design patterns. Sequences and arrangements of the algorithmic building blocks which are known to be very effective — and easy to understand and easy to use.
  4. Application frameworks. Structure for the overall application or modular portions of the application which are known to be very effective — and easy to understand and easy to use.

The overall, overarching goal is to avoid reinventing the wheel. Those who blaze the trails of quantum computing in the early days, the elite and the lunatic fringe, will have learned how to do things the hard way. Those who come later and follow their blazed trails can reuse a lot of their work.

Algorithmic building blocks

It’s beyond the scope of this high-level paper to dive into details, but algorithmic building blocks are various collections of the operations from the programming model, which perform interesting and useful high-level operations. The design of quantum algorithms is greatly facilitated by the availability of a rich collection of high-level algorithmic building blocks.

Some examples:

  1. Quantum Fourier transform (QFT).
  2. Quantum phase estimation (QPE).
  3. Quantum amplitude estimation (QAE).
  4. Quantum amplitude amplification (QAA).
  5. SWAP gate and SWAP networks. SWAP is actually a composite gate — it translates into three CNOT gates. A SWAP network can transfer the quantum state of a qubit to any other qubit regardless of the distance between them.

Other than the handful of examples mentioned, there is little in the way of such high-level algorithmic building blocks available at present.

Design patterns

Specific design patterns for quantum algorithms are beyond the scope of this paper. The important point here is that the design of quantum algorithms is greatly facilitated by having a rich library of design patterns to choose from. And on the flip side, a dearth of design patterns makes the design of quantum algorithms that much more problematic.

Design patterns apply to quantum applications as well, a mix of classical code and quantum algorithms.

If everybody is designing the wheel from scratch for every quantum algorithm and quantum application, the pace of progress is greatly slowed.

At present, there is nothing available in terms of established design patterns, for either quantum algorithms or quantum applications.

Application frameworks

Specific application frameworks for quantum applications are beyond the scope of this paper. The important point here is that application frameworks would greatly facilitate the design and development of quantum applications. And on the flip side, a dearth of application frameworks makes the design and development of quantum applications that much more problematic.

If everybody is designing the wheel from scratch for every quantum application, the pace of progress is greatly slowed.

At present, there is nothing available in terms of established application frameworks for quantum applications.

Quantum applications and quantum algorithms

Quantum applications and quantum algorithms are how a quantum computer is used.

A quantum algorithm describes the functions to be performed by the quantum computer itself.

A quantum application is essentially a normal classical application which invokes quantum algorithms to perform some of its functions.

How much of the classical code is translated to quantum algorithms will vary from application to application. It might be a single function for some applications and many functions for other applications.

And to be clear, the translation process is strictly manual and very tedious, with little opportunity for automation since the programming models of classical computers and quantum computers are so radically different.

Quantum applications are a hybrid of quantum computing and classical computing

Quantum applications are a hybrid of classical code and quantum algorithms. Most of the application, especially handling large volumes of data and complex logic, is classical, while small but critical and time-consuming calculations can be extracted and transformed into quantum algorithms which can be executed in a much more efficient manner than is possible with even the most powerful classical supercomputers.

Basic model for a quantum application

As already mentioned, a quantum application is a hybrid of classical application code and quantum algorithms which are invoked from the classical code. The overall model has these components:

  1. Application performs most processing in classical code. Most application logic is not suitable for a quantum computer, typically because it is either complex logic or involves processing of large amounts of data (such as so-called Big Data.)
  2. Selected application functions are implemented as quantum algorithms. These are compute-intensive, but don’t require processing large volumes of data (Big Data.)
  3. Classical application code prepares data and invokes quantum algorithms as needed. Quantum algorithms cannot directly access application data.
  4. Each quantum algorithm will be repeatedly invoked a number of times (shots or circuit repetitions) so that the results can be statistically analyzed to get the likely correct result. Since quantum computing is inherently probabilistic. And noisy and error-prone as well. Even if it wasn’t noisy and error-prone, such as with quantum error correction, it would still be probabilistic, by nature.
  5. Classical post-processing code in the application transforms the results from the quantum algorithm into a form usable by the application’s classical code.
  6. Application continues processing with classical code. Again, most application logic is not suitable for a quantum computer.

Post-processing of the results from a quantum algorithm

A quantum algorithm returns results from a quantum computation which typically then have to be post-processed to put them into a form that the classical code can process as normal classical application data. This post-processing would be performed using classical code in the application.

Quantum algorithm vs. quantum circuit

Quantum algorithm and quantum circuit are frequently used as synonyms, but technically there is a difference.

In computer science we would say that the algorithm is the specification of the logic while the circuit is the implementation of the specification for the logic.

A quantum circuit is the exact sequence of quantum logic gates which will be sent to the quantum processing unit (QPU) for execution.

Some quantum algorithm designers may choose to directly compose actual quantum circuits, ready to be executed directly. Such quantum algorithms are not scalable — they are designed for specific input.

More sophisticated quantum algorithm designers write classical code in a high-level classical programming language such as Python which invokes a library function for each gate to be executed, and then sends that collection of gates to be executed on the quantum computer. Such algorithms may or may not be scalable.

Generative coding of quantum circuits, to be described in a subsequent section of this paper, uses classical logic to dynamically decide what gates to generate, depending on the particular input and any parameters. In such cases, the algorithm is abstract and can generate a variety of quantum circuits, while each quantum circuit is specific to a particular input value and particular parameters.

With generative coding the classical code is really the algorithm, the specification of the logic, capable of generating any number of specific quantum circuits.

Generative coding is essentially required, mandatory, for designing a scalable algorithm.

So, while people may casually treat the terms quantum algorithm and quantum circuit as exact synonyms, in this paper they are quite distinct although inextricably linked.

Quantum circuits and quantum logic gates — the code for a quantum computer

A quantum algorithm might be written in any notation that the quantum algorithm designer chooses to adequately express what the algorithm is trying to accomplish, but such an abstract notation won’t be directly executable by a quantum computer. Once an algorithm has been designed, it must be translated into the machine language of the quantum computer.

A quantum algorithm is translated to a quantum circuit, which is what a quantum computer directly executes.

A quantum circuit consists of a sequence or graph of quantum logic gates which are the elementary operations which a quantum computer can execute.

Quantum logic gates operate directly on qubits, revising their quantum state as dictated by the quantum algorithm.

Getting more technical, each quantum logic gate represents a unitary transformation matrix which specifies how the quantum state of the designated qubit should be rotated about the three axes. But most people can ignore this level of detail, at least at this stage.

Generative coding of quantum circuits rather than hand-coding of circuits

Fully hand-coding the quantum circuits for algorithms is absolutely out of the question. What is needed is a more abstract representation of an algorithm. Generative coding of quantum circuits provides this level of abstraction. Any algorithm designed to be scalable to handle input of any size must be generated dynamically using classical code which is parameterized with the input size or size of the simulation, so that as the input size grows or the simulation size grows, a larger number of quantum logic gates will be generated to accommodate the expanded input size.

Generally, any input value will have to be encoded in the gate structure of the generated quantum circuit — using quantum logic gates to initialize the quantum state of qubits to correspond to any input value and any parameters. That’s the only way to get input data or parameters into a quantum algorithm (circuit) since a quantum computer has no I/O capabilities at the circuit or QPU level.

Ideally, the rules for generating the quantum circuit for a given input will be identical regardless of input size. The rules should be identical, but parameterized by the input size so that the exact gates generated will be determined by the input size, in some predictable, mathematical manner.

The classical code would typically be developed using a programming language such as Python using looping and conditional statements to invoke a library to generate the individual gates.

Generative coding of quantum circuits uses classical logic to dynamically decide what gates to generate, depending on the particular input and any parameters. In such cases, the algorithm is abstract and can generate a variety of quantum circuits, while each quantum circuit is specific to a particular input value and particular input parameters.

With generative coding the classical code is really the algorithm, the specification of the logic, capable of generating any number of specific quantum circuits depending on the particular input value and input parameter values.

Generative coding of quantum circuits is essentially required, mandatory, for designing a scalable algorithm.
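To make the idea concrete, here is a minimal sketch of generative coding in plain Python. The gate tuples and gate names are illustrative stand-ins for a real framework such as Qiskit; the point is that the classical code is parameterized by the input size n, so the number of generated gates grows predictably with n:

```python
def build_circuit(n: int) -> list:
    """Generate a simple entangling circuit on n qubits.
    The gate tuples are illustrative; a real framework (e.g. Qiskit)
    would be used in practice."""
    gates = [("H", 0)]                                    # superpose qubit 0
    gates += [("CNOT", i, i + 1) for i in range(n - 1)]   # entangle the chain
    gates += [("MEASURE", i) for i in range(n)]           # read out all qubits
    return gates

# The same generation rules, applied to different input sizes,
# yield circuits of different sizes.
print(len(build_circuit(4)), len(build_circuit(8)))
```

The classical code here is the algorithm; each returned gate list is one specific quantum circuit.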

Algorithmic building blocks, design patterns, and application frameworks are critical to successful use of a quantum computer

Algorithmic building blocks, design patterns, and application frameworks are the level immediately above the programming model, but they are a software layer which is not technically part of the quantum computer itself.

They are critical, but beyond the scope of this paper, which focuses on quantum computing overall.

Quantum Fourier transform (QFT) and quantum phase estimation (QPE) are critical to successful use of a quantum computer

Quantum Fourier transform (QFT) and quantum phase estimation (QPE) are algorithmic building blocks which are profoundly critical to effectively exploiting the computational power of a quantum computer — quantum parallelism, especially for applications such as quantum computational chemistry, but beyond the scope of this paper, which focuses on quantum computing overall.

There’s more to quantum computing than the bare hardware of the quantum computer itself.

Measurement — getting classical results from a quantum computer

It won’t do any good to perform a calculation on a quantum computer unless we can get some results which we can then use in the classical code of a quantum application. That’s where measurement comes in.

As each step of a quantum computation proceeds, each qubit contains a fragment of the computation in the form of quantum state. And, entangled qubits share a more complicated form of quantum state known as product state.

Quantum states and product states cannot be observed or measured directly. They aren’t the simple 0 and 1 of classical bits, but more complex forms with probability amplitudes and phase angles. But that’s beyond the scope of this informal paper.

In any case, measurement operations are used to obtain classical binary approximations of the quantum states of qubits and product states.

However complex the quantum state of a qubit or product state may be, measurement returns an exact classical binary 0 or 1 for each qubit. The probability associated with the probability amplitude will bias the result for each qubit towards either 0 or 1. A probability below 0.5 biases towards 0 and a probability above 0.5 biases towards 1. A probability of exactly 0.5 will be a coin flip between 0 and 1.

But in the end, the quantum application will receive a binary 0 or 1 for each qubit which is measured.
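The biasing behavior just described can be sketched classically in Python. This is only an illustration of the statistics of measurement, not a simulation of quantum state:

```python
import random

# Sketch of measurement as described above: each measured qubit yields a
# classical 0 or 1, biased by the probability associated with its amplitude.
def measure(prob_of_one: float) -> int:
    return 1 if random.random() < prob_of_one else 0

# prob_of_one = 0.5 is a coin flip; higher values bias towards 1.
samples = [measure(0.9) for _ in range(1000)]
print(sum(samples), "ones out of 1000 shots")  # roughly 900
```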

Measurement — collapse of the wave function

As just described, however complex the quantum state of a qubit or product state may be, measurement returns an exact classical binary 0 or 1 for each qubit, biased by the probabilities of the quantum state.

The loss of the additional information of quantum state and product state beyond the simple binary 0 or 1 on measurement of qubits is known as the collapse of the wave function.

The wave function is what maintains the complex quantum state of a qubit — the probability amplitudes and phase angles. But that’s a topic beyond the scope of this informal paper.

Extracting useful results from a massively parallel quantum computation

Quantum parallelism is a great tool, but after performing that massive number of simple computations in parallel, the question is how you obtain useful results which can be shipped back to the classical code of a quantum application. Unfortunately, there is no simple and pat answer. Great cleverness and care are needed. The details are beyond the scope of this paper, but some of the techniques involve:

  1. Phase angle.
  2. Interference.
  3. Quantum Fourier transform (QFT).
  4. Quantum phase estimation (QPE).
  5. Quantum amplitude amplification (QAA).
  6. Quantum amplitude estimation (QAE).

Components of a quantum computer

The details are well beyond the scope of this paper, but at a very high level, the major components of a quantum computer system are:

  1. Classical computer. Overall system management. Including job scheduling and management. Oversee execution of quantum algorithms. Interface to the control electronics to execute the quantum circuits for quantum algorithms. Interface to the network to manage the system and to receive requests to run quantum algorithms. May also permit local execution of all or a portion of the classical code for a quantum application, such as using Qiskit Runtime.
  2. Control electronics. Analog electronics to control the quantum electronics, and digital electronics to control the analog electronics.
  3. Quantum processing unit (QPU). The actual qubits and the closely associated components which are directly involved with maintaining and manipulating the quantum states of the qubits.
  4. Environmental control. May involve cryogenics or a vacuum chamber and shielding to maintain isolation of quantum components from external and environmental interference.
  5. Network access. A quantum computer is generally accessed over a network connection. It may simply be a local network connection or a connection to the Internet to enable cloud access.
  6. Algorithm execution. Software running on the classical computer within the quantum computer system which sequences through the definition of quantum circuit for the quantum algorithm and transforms each step of the circuit from a high-level matrix operation into a stream of requests which are sent to the control electronics to actually perform each step at the quantum level. After sequencing through the entire circuit, which may only take a small fraction of a second, the quantum results are read and sent back to the quantum application for processing by the classical code of the quantum application.
  7. Calibration. Special software running periodically within the quantum computer system which tests and adjusts the quantum components, namely the qubits, since although they should be identical, in practice they tend to all operate a little differently.

See the previous paper for greater detail:

Access to a quantum computer

A quantum computer is generally accessed over a network connection.

It may simply be a local network connection or a connection to the Internet to enable cloud access via a quantum service provider.

Another option is local runtime for tighter integration of classical and quantum processing. See that section a few sections down from here.

Quantum service providers

A number of cloud service providers are quantum service providers, providing cloud access to quantum computers, such as:

  1. IBM Quantum. Provides cloud-based access to IBM’s in-house fleet of quantum computers.
  2. Amazon Braket. Provides a “managed quantum computing service” which permits you to access a wide variety of quantum computers from within AWS. Supports quantum computers from multiple vendors.
  3. Microsoft Azure Quantum. An open ecosystem to write and run code on a diverse selection of today’s quantum hardware.
  4. Google Quantum Computing Service. Provides chaperoned access to NISQ processors and a simulator for researchers who aim to advance the state-of-the-art in quantum computing and publicly share their results in algorithms, applications, tools, and processor characterizations.

Having your own in-house quantum computer is not a viable option at this stage

In theory, customers could have their own dedicated quantum computers in their own data centers, but that’s not appropriate at this stage since quantum computers are still under research and evolving too rapidly for most organizations to make an investment in dedicated hardware. And customers are focused on prototyping and experimentation rather than production deployment.

There are some very low-end quantum computers, such as 2- and 3-qubit machines from SpinQ, which can be used for in-house experimentation, but those are very limited configurations.

Job scheduling and management

One of the critical functions for the use of networked quantum computers is job scheduling and management. Requests are coming into a quantum computer system from all over the network, and must be scheduled, run, and managed until their results can be returned to the sender of the job. Details are beyond the scope of this paper, but the various quantum service providers listed in a preceding section will have details.

Local runtime for tighter integration of classical and quantum processing

The simplest method of integrating classical application logic with quantum algorithms is to have the classical code running on a classical system, and then accessing the quantum algorithms remotely in the cloud over a network connection, typically via a quantum service provider. This does work fine, but is very suboptimal when a very large number of invocations of quantum algorithms are needed for the quantum application.

IBM Qiskit Runtime addresses this issue by permitting classical application code to be packaged and sent to the quantum computer system where the classical application code can run on the classical computer that is embedded inside of the overall quantum computer system, permitting the classical application code to rapidly invoke quantum algorithms locally without any of the overhead of a network connection. Final results can then be returned to the remote application.

Consult the IBM summary for Qiskit Runtime:

Other quantum computer vendors may offer similar service, if not today, then likely in the future.

Where are we at with quantum computing?

At a very high level, quantum computing can be generally characterized as:

  1. Still in the early stages. Still in the pre-commercialization stage.
  2. Much research is still needed. Theoretical, experimental, and applied research. Basic science, algorithms, applications, and tools.
  3. Quantum computing is still in the pre-commercialization phase. Focus is on research, prototyping, and experimentation.
  4. Not ready for production-scale practical real-world quantum applications. More capabilities and more refinement are needed.
  5. Production deployment is not appropriate at this time. Not for a few more years, at least.
  6. More suited for the lunatic fringe who will use anything than for normal, average technical staff.
  7. Still a mere laboratory curiosity. Not ready for production-scale practical real-world quantum applications or production deployment.
  8. No 40-qubit algorithms to speak of. Where are all of the 40-qubit algorithms? We have 53, 65, 80, and 127-qubit quantum computers and 32 and 40-qubit classical quantum simulators, but no 40-qubit or even 32-qubit quantum algorithms, or even 28 or 24-qubit quantum algorithms. We need to up our algorithm game.
  9. Beware of premature commercialization. The technology just isn’t ready yet. Much more research is needed. Much more prototyping and experimentation is needed. Algorithms are still much too primitive.
  10. Doubling down on pre-commercialization is the best path forwards. Research, prototyping, and experimentation should be the priorities, not premature commercialization.
  11. What production-scale practical real-world problems can a quantum computer solve — and deliver so-called quantum advantage, and deliver real business value? No clear answer or clearly-defined path to get an answer, other than trial and error, cleverness, and hard work. If you’re clever enough, you may find a solution. If you’re not clever enough, you’ll be unlikely to find a solution that delivers any significant quantum advantage on a quantum computer.

Current state of quantum computing

The previous section addresses the current state of affairs of quantum computing in a more abstract sense.

I considered detailing the current state of affairs of quantum computing in a concrete sense, such as detailing what specific technologies and capabilities are available today, but I realized that could be a whole long paper on its own — and would need constant updating with each new announcement. And such details would be too much of a distraction since this paper is more focused on quantum computing as an abstraction anyway.

Some of the categories of details include:

  1. The various quantum computer architectures.
  2. The various qubit technologies.
  3. Vendors of the various qubit technologies.
  4. Number of qubits in various models of quantum computers from various vendors.
  5. Performance of the various models of quantum computers.
  6. Performance metrics of the various models of quantum computers.
  7. The many quantum algorithms.
  8. The performance of the many quantum algorithms.
  9. Consulting firms specializing in quantum computing.
  10. Various software development kits.
  11. Various software tools.
  12. Many books on quantum computing.
  13. Many research papers on quantum computing.
  14. Many web sites focused on quantum computing.
  15. Many Wikipedia articles focused on quantum computing.
  16. Detailed timeline for all of the many developments and advances in quantum computing.
  17. The many promises that have been made for quantum computing in general.
  18. The many promises that have been made by various vendors.

Getting to commercialization — pre-commercialization, premature commercialization, and finally commercialization — research, prototyping, and experimentation

Quantum computing is still in its infancy. Much work is needed before practical quantum computers will be commercially viable. The overall model:

  1. Pre-commercialization means research as well as prototyping and experimentation. This will continue until the research advances to the stage where sufficient technology is available to produce a viable product that solves production-scale practical real-world problems, with all significant technical issues resolved so that commercialization can proceed with minimal technical risk.
  2. Premature commercialization is bad. Attempting to commercialize before all of the hard research questions have been answered is a recipe for disaster. Pre-commercialization must be completed before commercialization can begin. But, history shows that people will be tempted to jump the gun and some won’t be able to resist that temptation.
  3. Commercialization means productization after research as well as prototyping and experimentation are complete. Productization means a shift in focus from research to a product engineering team — commercial product-oriented engineers and software developers rather than scientists and researchers.

For more discussion of this overall process, see my paper:

Quantum computing is still in the pre-commercialization phase

Focus is on research, prototyping, and experimentation. Definitely not ready for commercialization or production deployment.

For more details on pre-commercialization, see my paper:

More suited for the lunatic fringe who will use anything than for normal, average technical staff

Eventually quantum computing will enter the mainstream of computing, but for now and the foreseeable future it will be suitable only for the lunatic fringe, those elite experts and early adopters who are able to master and exploit the most obscure and difficult to use technology that is far from ready for use by normal, average technical staff — and far from ready for production deployment.

It’s still the wild west out there. It can be great and exciting, for some, but not for most.

For more on the lunatic fringe, see my paper:

Still a mere laboratory curiosity

Quantum computers are not ready for production-scale practical real-world quantum applications or production deployment and won’t be for the foreseeable future. Sure, they do exist today, but more for research, prototyping, experimentation, and demonstration than practical use. This makes them mere laboratory curiosities — we can look at them and imagine their future potential, but that potential does not yet exist for production-scale practical real-world quantum applications.

For more on quantum computing being a mere laboratory curiosity, see my paper:

So when will quantum computing advance beyond mere laboratory curiosity? That’s one of the great unknown questions at this stage. It could be a few years. It could be five to seven years. Maybe ten years. Or more. Nobody knows. And I mean nobody!

Much research is still needed

Much more research is needed for quantum computing in all areas at all levels. Theoretical, experimental, and applied. Basic science, algorithms, applications, and tools.

For more details, see my paper:

How much more research is required?

A lot more research is still needed before quantum computing is ready to exit from the pre-commercialization stage and begin the commercialization stage. How much research is still needed? There’s no great clarity as to how much more research is needed. Most answers to this question are more abstract than concrete, although there are a number of concrete milestones.

We will be done with research for pre-commercialization when:

  1. All open questions about the basics of quantum computing have been answered.
  2. We’re able to demonstrate all basic capabilities.
  3. Qubit fidelity is high enough.
  4. Full any-to-any qubit connectivity is achieved.
  5. Coherence time is increased and gate execution time is reduced so that relatively large and complex circuits can be accommodated.
  6. We’re able to reach The ENIAC Moment. First production-scale practical real-world quantum application.
  7. The technology feels ready for commercial-scale use. Feels ready to address production-scale practical real-world problems. Not just a distant future, but we are able to begin tackling these problems in the present.

Of course, research will be an ongoing long-term project, in such areas as:

  1. Full, automatic, and transparent quantum error correction (QEC).
  2. Technology is sufficient to develop configurable packaged quantum solutions.
  3. Development of a high-level programming model.
  4. Development of a quantum-native high-level programming language.
  5. Rich and high-level algorithmic building blocks.
  6. Rich and high-level design patterns.
  7. Ready for widespread use. Not just the elite and the lunatic fringe.
  8. Reach The FORTRAN Moment. Really ready for widespread use.
  9. A universal quantum computer merging in all classical computing capabilities.

No 40-qubit algorithms to speak of

We have 53, 65, 80, and 127-qubit quantum computers and 32 and 40-qubit simulators, but no 40-qubit or even 32-qubit quantum algorithms, or even 28 or 24-qubit quantum algorithms.

Oh, and oddball, contrived computer science experiments that use many qubits without regard to the quality of the results don’t count here. I’m interested in practical real-world applications.

Sure, there are some technical excuses, but the bottom line is that people are not making a big enough deal about the lack of more sophisticated quantum algorithms.

They’re making a big deal of having quantum computers with 53, 65, 80, and 127 qubits, but ignoring the fact that algorithms aren’t using close to all of those qubits.

We need to up our algorithm game.

Where are all of the 40-qubit algorithms?

I wrote an entire paper on this issue of the dearth of 40-qubit algorithms:

Beware of premature commercialization

The technology just isn’t ready yet for quantum computing at a production scale for practical real-world problems. Much more research is needed. Much more prototyping and experimentation is needed. Premature commercialization at this stage is a really bad idea.

For more detail on premature commercialization, see my paper:

Doubling down on pre-commercialization is the best path forwards

Research, prototyping, and experimentation should be the priorities, not premature commercialization. And definitely not production deployment.

For more detail on this recommendation, see my paper:

The ENIAC Moment — proof that quantum computer hardware is finally up to the task of real-world use

The ENIAC Moment is defined as the first time a production-scale practical real-world quantum application can be run in something resembling a production environment. Proves that quantum computer hardware is finally up to the task of real-world use. And quantum algorithms are up to the task as well.

We’re not there yet. Not even close.

For more detail, see my paper:

The time to start is not now unless you’re the elite, the lunatic fringe

Most normal technical teams and management planners should wait a few years, specifically until the ENIAC Moment has occurred. Everything learned before then will need to be discarded.

The ENIAC moment will be the moment when we can finally see to the other side of the looking glass, where the real action will be and where the real learning needs to occur.

At least another two to three years before quantum computing is ready to begin commercialization

We still have a mountain of work to complete before quantum computing is ready to exit from the pre-commercialization stage even to begin the commercialization stage.

We’re looking at probably at least another two to three years of pre-commercialization before quantum computing is ready to begin commercialization.

It could even be longer. Even four to five years. Or longer.

And it’s not likely to be less.

Little of current quantum computing technology will end up as the foundation for practical quantum computers and practical quantum computing

Rapid evolution and radical change will all but assure that virtually all of our current quantum computing technology will be replaced by the time practical quantum computers and practical quantum computing arrive. This process will be fueled by ongoing research. Wait a few years and everything will have changed.

This includes hardware and software. And algorithms.

Including the programming model. The current programming model is too primitive to be used by most people.

Sure, some elements of current quantum computing technology will likely remain, at least in the earliest stages of practical quantum computing, but that will be the exception rather than the rule.

Current quantum computing technology is the precursor for practical quantum computers and practical quantum computing

That doesn’t mean that the current technology is a waste or a mistake — it does provide a foundation for further research, prototyping, and experimentation, which provides feedback into further research and ideas for future development.

In essence, the current technology is simply the precursor to what will be the technology for practical quantum computing.

Little data with a big solution space

The ideal use case for a quantum computer has these qualities:

  1. A small amount of input data.
  2. A fairly simple calculation.
  3. Apply that simple calculation to a very large solution space. Many possible values. The calculation will be replicated for each value in that large solution space, in parallel. Very quickly.
  4. Producing a small amount of output.

Quantum computers have no ability to access files or databases, so they have no access to Big Data per se. The big solution space is the analog to Big Data in quantum computing.

Very good at searching for needles in haystacks

Essentially, a quantum computer is very good at searching for needles in haystacks. Very large haystacks, much bigger than even the largest classical computers could search. And doing it much faster.

Going back to the little data with a big solution space model, the huge solution space is the haystack to be searched and the solution is the needle to be found.

Quantum parallelism

As mentioned at the start, quantum parallelism is the secret sauce of quantum computing and involves evaluating a vast number of alternative solutions all at once. A single modest computation is executed once but operating on all possible values at the same time.

This is what makes a quantum computer so good at searching for needles in haystacks, as described in the preceding section.

And it was just mentioned that the optimal use of a quantum computer is for little data with a big solution space. A small amount of input data with a fairly simple calculation which will be applied to a very large solution space, producing a small amount of output.

But a difficulty with quantum parallelism is that it is not automatic — the algorithm designer must cleverly deduce what aspect of the application can be evaluated in such a massively parallel manner.

And the ugly truth is that not all algorithms can exploit quantum parallelism or exploit it fully.

Ultimately the degree of quantum parallelism is not guaranteed and will vary greatly between applications and algorithms and even depending on the input data.

Quantum advantage expresses how much more powerful a quantum solution is compared to a classical solution. It is typically expressed as either how many times faster the quantum computer is, or how many years, decades, centuries, millennia, or even millions or billions or trillions of years a classical computer would have to run to do what a quantum computer can do in mere seconds, minutes, or hours.

Ultimately, the goal is not simply to do things faster, but to make the impossible possible. To make the impractical practical. Many computations are too expensive today to be practical on a classical computer. Quantum parallelism is the means by which this is accomplished.

The secret sauce of quantum computing is quantum parallelism

Just to emphasize the point clearly, the secret sauce of a quantum computer is quantum parallelism, where a vast number of alternative solutions are evaluated simultaneously, all at once. A single modest computation executed once but operating on all possible values at the same time.

Combinatorial explosion — moderate number of parameters but very many combinations

Another way of characterizing the sweet spot for quantum computing is applications which have a moderate number of parameters but a huge number of combinations or permutations of those parameters. What is called a combinatorial explosion.

The combinations of parameters can be evaluated in parallel with one simple computation, far faster than any classical computer or even any large classical supercomputer or large distributed computing cluster.
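To make the arithmetic of a combinatorial explosion concrete, here is a rough back-of-the-envelope sketch in Python. It is purely illustrative (the function name is invented for this sketch, not taken from any quantum SDK), showing how a moderate number of parameters explodes into a huge number of combinations.

```python
# Back-of-the-envelope illustration: how a moderate number of parameters
# explodes into a huge number of combinations. With n independent binary
# choices there are 2**n combinations; a quantum computer with n qubits
# can, in principle, evaluate all of them in parallel.

def combinations_for(num_params: int, options_per_param: int) -> int:
    """Total combinations for num_params parameters, each with
    options_per_param possible values."""
    return options_per_param ** num_params

# 40 binary parameters is already over a trillion combinations.
print(combinations_for(40, 2))   # 1099511627776

# 20 parameters with 10 options each is far beyond classical brute force.
print(combinations_for(20, 10))  # 100000000000000000000
```

The point of the sketch is only the growth rate: each added parameter multiplies the work for a classical computer, while quantum parallelism evaluates the combinations with a single computation.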

But not good for Big Data

Curiously, quantum computers have no ability to access files, databases, network services, or any other source of Big Data, so Big Data is not the kind of haystack that a quantum algorithm can search. As just mentioned under little data with a big solution space, searching a huge solution space is the analog to Big Data for quantum computing. But it’s not Big Data in the classical sense.

None of the Three V’s of Big Data are appropriate for quantum computing — volume, velocity, variety.

The classical portion of a quantum application can indeed access files, databases, network services, and other sources of Big Data, but then only relatively small amounts of that data can be processed on any given invocation of a quantum algorithm, which greatly limits the value of quantum computing for processing that Big Data. Any quantum parallelism would be limited to the small amount of data contained in each chunk.

In fact, a quantum computer would only have value if the classical computing time to process a single element or chunk of Big Data was huge, which is generally not the case. Generally with Big Data the time cost comes from the large volume of data to be processed, not the time to process an individual item or modest chunk of data.

What production-scale practical real-world problems can a quantum computer solve — and deliver so-called quantum advantage, and deliver real business value?

There is no clear answer or clearly-defined path to get an answer as to what application problems can effectively be solved using a quantum computer, other than trial and error, cleverness, and hard work.

If you’re clever enough, you may find a solution.

If you’re not clever enough, you’ll be unlikely to find a solution that delivers significant quantum advantage on a quantum computer.

Heuristics for applicability of quantum computing to a particular application problem

Well, there really aren’t any robust heuristics yet for determining whether a quantum solution is appropriate for a particular application problem.

Much more research is needed.

Prototyping and experimentation may yield some clues on heuristics and rules of thumb for what application problems can be matched to quantum algorithm solutions.

What applications are suitable for a quantum computer?

There are a lot of tricky and obscure criteria for what applications are most appropriate for a quantum computer, but some general categories of applications for quantum computers include:

  1. Optimization, planning, and logistics.
  2. Forecasting.
  3. Financial modeling.
  4. Drug design and discovery.
  5. Genomics.
  6. Cybersecurity and cryptography.
  7. Molecular modeling.
  8. Chemistry modeling, computational chemistry.
  9. Material design and modeling.
  10. Aerospace physics.
  11. Quantum simulation. Simulation of physical systems at the quantum mechanical level.
  12. Artificial intelligence, machine learning.
  13. Random number generation. Such as for public encryption keys.

These are only generalizations, though — not all applications in any of these categories would necessarily be suitable for a quantum computer.

For more detail, see my paper:

Quantum machine learning (QML)

Just to highlight one area of high interest for applications of quantum computing: quantum machine learning (QML). Details are beyond the scope of this paper.

Quantum AI

Beyond quantum machine learning (QML) in particular, there are many open questions about the role of quantum computing in AI in general. This is all beyond the scope of this paper.

Quantum AGI

Beyond quantum machine learning (QML) in particular and AI generally, there are many open questions about the role of quantum computing in artificial general intelligence (AGI) in particular. This is all beyond the scope of this paper.

Ultimately there is the question of how to achieve higher-order human-level intelligence, commonly referred to as AGI (artificial general intelligence).

Are quantum effects needed to mimic the human brain and human mind? I suspect so.

Is the current model for general-purpose quantum computing (gate-based) sufficiently powerful to compute all that the human brain and mind can compute, or is a more powerful architecture needed? I suspect the latter.

This is mostly a speculative research topic at this stage. I don’t expect much progress over the next five years — we’re too busy getting a basic practical quantum computer to work first.

What can’t a quantum computer compute?

Quantum computers offer some awesome features, but they lack most of the features of a general-purpose Turing machine which are offered by all classical computers. A quantum computer is more of a specialized coprocessor than a general-purpose computer. Some of the basic features of a classical computer can be mimicked by a quantum computer, but unless your algorithm exploits quantum parallelism to a dramatic degree, it will generally not gain any significant advantage over a classical computer.

A few of the constraints:

  1. No support for control structures. No conditionals, looping, function calls, or recursion.
  2. No rich data types. Even integer and real numbers are not built in. No ability to process raw text.
  3. Very limited algorithm size. Nothing that might take tens of thousands of lines of code.
  4. No support for Big Data.
  5. No support for I/O.
  6. No support for database access.
  7. No support for accessing network services.
  8. No high-level programming model.
  9. No high-level programming languages. For the quantum algorithms themselves, that is.

For more detail, see my paper:

Not all applications will benefit from quantum computing

Just to drive the point home. It’s sad but true. Not all applications will benefit from quantum computing. In many cases, classical solutions will be good enough.

Worse, there really aren’t any robust heuristics yet for determining whether a quantum solution is appropriate for a particular application problem.

Much more research is needed.

Prototyping and experimentation may yield some clues on heuristics and rules of thumb for what application problems can be matched to quantum algorithm solutions. Lots more work to be done on this front.

Quantum-resistant problems and applications

Some problems or applications are not readily or even theoretically solvable using a quantum computer. These are sometimes referred to as quantum-resistant mathematical problems or quantum-resistant problems.

This category includes any application problems which are just too hard or too complex to be readily addressed using quantum algorithms.

For example, the traditional traveling salesman problem (TSP) is technically a quantum-resistant problem since it has factorial algorithmic complexity, which is significantly harder than even the exponential complexity which quantum computers are best for. There may be special cases or simplifications of TSP which are tractable on a quantum computer, but the general problem is not.
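The growth rates behind that claim can be sketched in a few lines of Python. This is a hypothetical illustration (not from the paper): it compares the factorial growth of distinct TSP tours — the standard count is (n−1)!/2 for n cities — against the exponential growth that quantum parallelism is best suited to attack.

```python
# Illustration: why TSP is characterized here as quantum-resistant.
# The number of distinct tours grows factorially, which quickly
# outpaces even exponential (2**n) growth.

from math import factorial

def distinct_tours(num_cities: int) -> int:
    """Standard count of distinct TSP tours: (n-1)!/2."""
    return factorial(num_cities - 1) // 2

for n in (10, 20, 30):
    print(n, distinct_tours(n), 2 ** n)
```

By 20 cities the tour count (about 6 x 10^16) already dwarfs 2^20 (about a million), and the gap only widens from there.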

Actually, there is one very useful and appealing sub-category of quantum-resistant mathematical problems, namely post-quantum cryptography (PQC), also known as post-quantum encryption, which includes the newest forms of cryptography which are explicitly designed so that they cannot be cracked using a quantum computer.

Post-quantum cryptography (PQC)

It is commonly believed that Shor’s factoring algorithm will be able to crack even 2048 and 4096-bit public encryption keys once we get sufficiently capable quantum computers. As a result, a search began for a new form of cryptography which wouldn’t be vulnerable to attacks using quantum computers.

The basic idea is that all you need is a quantum-resistant mathematical problem, which, by definition, cannot be attacked using quantum computation.

NIST (National Institute of Standards and Technology) is currently formalizing a decision in this area. They’ve made some tentative decisions, but there are further milestones in the process before official standards are formalized.

See the NIST efforts on post-quantum cryptography:

Post-quantum cryptography (PQC) is likely unnecessary since Shor’s factoring algorithm likely won’t work for large numbers the size of encryption keys

Shor’s factoring algorithm is very impressive, but technical limitations in the analog electronics of real quantum computers likely mean that it can only work for factoring relatively small numbers and not the very large numbers used for public encryption keys — 1024, 2048, and 4096 bits. Therefore, the whole premise of post-quantum cryptography (PQC) is likely bogus, and hence PQC is likely unnecessary.

Everybody is loudly proclaiming that all encrypted data is at risk of being decrypted by a quantum computer running Shor’s factoring algorithm to crack public encryption keys, but:

  1. This isn’t true today.
  2. This won’t be true in the near future.
  3. This won’t be true in the foreseeable future.
  4. It is very unlikely to be true… ever.

There are detailed technical reasons why Shor’s factoring algorithm may indeed eventually work for smaller numbers up to maybe 10 to 20 bits, but not work for very large numbers beyond that, especially 1024, 2048, and 4096-bit public encryption keys.

The technical details are described in my paper:

See the section entitled What about Shor’s factoring algorithm? Sorry, but it will only work for small numbers, not very large numbers.

How do you send input data to a quantum computer? You don’t…

It’s a good question, but the answer is simple — you can’t send any input data to a quantum computer, at least not as data.

Rather, any input data for a quantum algorithm must be encoded in the quantum circuit for the quantum algorithm. Tedious, but that’s how it’s done.

Any input data must be encoded in the quantum circuit for a quantum algorithm

As previously stated, you can’t send any input data to a quantum computer, at least not as data.

Rather, any input data for a quantum algorithm must be encoded in the quantum circuit for the quantum algorithm.

Yes, this is tedious, but necessary. That’s how it’s done.

Generally, this is all automated — the application developer writes classical code which generates extra quantum logic gates to incorporate the input data into the quantum circuit which is being generated for the quantum algorithm.

It’s still the responsibility of the application developer, but hopefully the algorithm designer provides the classical code to generate the gates for initializing any input data.

Classical solution vs. quantum solution

Just some minor terminology… an algorithm or application is a solution to a problem.

A classical algorithm or classical application is a classical solution to a problem.

A quantum algorithm or quantum application is a quantum solution to a problem.

A key task in quantum computing is comparing quantum solutions to classical solutions.

This is comparing a quantum algorithm to a classical algorithm, or comparing a quantum application to a classical application.

Quantum advantage

Quantum advantage expresses how much more powerful a quantum solution is compared to a classical solution. Typically expressed as either how many times faster the quantum computer is, or how many years, decades, centuries, millennia, or even millions or billions or trillions of years a classical computer would have to run to do what a quantum computer can do in mere seconds, minutes, or hours.

The ultimate goal of quantum computing is dramatic quantum advantage, a quantum advantage that is truly mind-boggling. Not just 50% or 2X or 4X or even 100X faster, but upwards of a quadrillion X and more faster than a classical computer.

But short of that ultimate goal of dramatic quantum advantage, we can expect a fraction of that goal, which I call fractional quantum advantage.

Both of those goals are discussed next.

For more on quantum advantage, see my paper:

Dramatic quantum advantage is the real goal

The ultimate goal of quantum computing is dramatic quantum advantage, a quantum advantage that is truly mind-boggling. How big is that?

  • Not just 10 times faster.
  • Not just 100 times faster.
  • Not just 1,000 times faster.
  • Not just a million times faster.
  • Not just a billion times faster.
  • Not just a trillion times faster.
  • But at least one quadrillion times faster — and that’s just the starting point.

Why that much faster? Who really needs that?

The goal is not simply to get your computer to run faster, but to get a computer to run fast enough that you can tackle application problems which are so much more complex than problems which can be tackled today.

The goal is not simply to do things faster, but to make the impossible possible. To make the impractical practical.

For more detail on dramatic quantum advantage, see my paper:

Quantum advantage — make the impossible possible, make the impractical practical

Just to emphasize that point more clearly:

  • The goal of quantum computing is not to make computing faster.
  • But to make the impossible possible.
  • To make impractical applications practical.

Fractional quantum advantage

As previously mentioned, although the ultimate goal of quantum computing is mind-boggling dramatic quantum advantage, achieving only a fraction of that — a so-called fractional quantum advantage — may still be quite worthwhile.

If dramatic quantum advantage of quantum computing starts at one quadrillion times a classical solution, fractional quantum advantage might be some not insignificant fraction of that and still have significant utility:

  1. 10 times.
  2. 100 times.
  3. 1,000 times.
  4. 10,000 times.
  5. 100,000 times.
  6. One million times.
  7. One billion times.
  8. One trillion times.

For more detail on fractional quantum advantage, see my paper:

Three levels of quantum advantage — minimal, substantial or significant, and dramatic quantum advantage

Just to keep things simple, overall there are three categorical levels of quantum advantage:

  1. Minimal quantum advantage. A 1,000X performance advantage over classical solutions. 2X, 10X, and 100X (among others) are reasonable stepping stones.
  2. Substantial or significant quantum advantage. A 1,000,000X performance advantage over classical solutions. 20,000X, 100,000X, and 500,000X (among others) are reasonable stepping stones.
  3. Dramatic quantum advantage. A one quadrillion X (one million billion times) performance advantage over classical solutions. 100,000,000X, a billion X, and a trillion X (among others) are reasonable stepping stones.

To put it most simply, anything less than a dramatic quantum advantage would be considered a fractional quantum advantage.
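Those three categorical levels can be sketched as a simple classifier. The function name and the treatment of the boundaries between levels are my own illustrative assumptions, not anything formally defined in this paper:

```python
# Categorize a measured speedup over a classical solution into the
# three categorical levels of quantum advantage described above.
# The thresholds follow the text; boundary handling is illustrative.

def quantum_advantage_level(speedup: float) -> str:
    if speedup >= 1_000_000_000_000_000:   # one quadrillion X
        return "dramatic"
    if speedup >= 1_000_000:               # one million X
        return "substantial or significant"
    if speedup >= 1_000:                   # one thousand X
        return "minimal"
    return "stepping stone"                # e.g., 2X, 10X, 100X

print(quantum_advantage_level(2_000))    # minimal
print(quantum_advantage_level(10 ** 15)) # dramatic
```

Recall that anything below the dramatic level, whatever its categorical label, still counts as a fractional quantum advantage.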


Net quantum advantage — discount by repetition needed to get accuracy

Quantum advantage can be intoxicating: just k qubits gives you a computational leverage of 2^k. But there's a tax to be paid on that. Since quantum computing is inherently probabilistic by nature, you generally can't do a computation once and get an accurate answer. Rather, you have to repeat the calculation multiple or even many times (called circuit repetitions, shot count, or just shots) and do some statistical analysis to determine the likely answer. Those repetitions are effectively a tax or discount on the raw computational leverage, which gives you a net computational leverage, the net quantum advantage.

Shot counts of 100, 1,000, 10,000, or more are not uncommon.

For example:

  1. 10 qubits and a shot count of 100 means a raw quantum advantage of 2¹⁰ = 1024, but divide by the shot count of 100 and you get a net quantum advantage of… 10.24 or 10 X net advantage over a classical solution.
  2. 20 qubits and a shot count of 1,000 means a raw quantum advantage of 2²⁰ = one million, but divide by the shot count of 1,000 and you get a net quantum advantage of… 1,000 or 1,000 X net advantage over a classical solution.
  3. 20 qubits and a shot count of 10,000 means a raw quantum advantage of 2²⁰ = one million, but divide by the shot count of 10,000 and you get a net quantum advantage of… 100 or 100 X net advantage over a classical solution.
  4. 10 qubits and a shot count of 1024 means a raw quantum advantage of 2¹⁰ = 1024, but divide by the shot count of 1024 and you get a net quantum advantage of… 1.0 or no net advantage over a classical solution.
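The arithmetic in the examples above can be sketched in a few lines. The function name is illustrative:

```python
# Net quantum advantage: the raw computational leverage of 2^k for
# k entangled qubits, discounted by the shot count (the number of
# circuit repetitions needed to get a statistically reliable answer).

def net_quantum_advantage(k_qubits: int, shots: int) -> float:
    return (2 ** k_qubits) / shots

print(net_quantum_advantage(10, 100))    # 10.24 -- example 1 above
print(net_quantum_advantage(20, 1000))   # ~1,000 -- example 2 above
print(net_quantum_advantage(10, 1024))   # 1.0 -- no net advantage
```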

For more detail on circuit repetitions and shots, see my paper:

What is the quantum advantage of your quantum algorithm or application?

It’s easy to talk about quantum advantage in the abstract, but what’s really needed is for quantum algorithm designers to explicitly state and fully characterize the quantum advantage of their quantum algorithms. Ditto for quantum application developers and their quantum applications.

It’s also important for quantum applications using quantum algorithms to understand their own actual input size since the actual quantum advantage will depend on the actual input size.

For smaller input sizes the actual quantum advantage or net quantum advantage might be small enough that there is negligible benefit to using a quantum computer at all, or maybe a relatively minor advantage over a classical solution.

It may be necessary for the input size to get fairly substantial before the net quantum advantage becomes reasonably impressive. So, it’s important to know what input sizes the quantum algorithm or quantum application is likely to encounter in real-world usage.

When posting or writing about your quantum algorithm or application, this kind of information should be clearly and fully disclosed.

For more detail and discussion, see my paper:

Be careful not to compare the work of a great quantum team to the work of a mediocre classical team

When judging the quantum advantage of a quantum solution over a classical solution to an application problem, one should be mindful that quantum computing is so new and so challenging that the technical team needed to design and implement a quantum solution needs to be very elite, while there’s a fair chance that the technical team (or nontechnical team of subject matter experts) which implemented the original classical solution might have been more mediocre or at least not as capable as the elite quantum team.

It could well be that if you simply redesigned and reimplemented the classical solution with an elite technical team of caliber comparable to the quantum team, the classical solution might be much improved, so that any quantum advantage might be significantly diminished — or even completely eliminated.

It’s just something to be aware of: is the apparent quantum advantage real, or just an illusion because the two technical teams are not of comparable caliber?

And in simple truth, even if you simply had the same, original classical technical team redesign and reimplement their own code — or simply do some performance analysis and some code optimization — they might produce a replacement classical solution which is much improved, possibly significantly reducing the actual quantum advantage — or possibly completely eliminating it. This isn’t a slam dunk, but a possibility. Again, just something to be aware of.

To be clear, quantum parallelism and quantum advantage are a function of the algorithm

A quantum computer does indeed enable quantum parallelism and quantum advantage, but the actual and net quantum parallelism and actual and net quantum advantage are a function of the particular algorithm rather than the quantum computer itself.

The quantum computer may have n qubits, but the particular algorithm running at a given moment may only be coded to use k of those n qubits.

For example, the quantum computer may have 50 qubits, but an algorithm might only use 20 qubits, giving it a quantum advantage of 2²⁰ rather than 2⁵⁰. Some other algorithm might use 10 qubits, giving it a quantum advantage of 2¹⁰, while yet another algorithm might use 30 qubits giving it a quantum advantage of 2³⁰.

Also, k is the number of qubits used in an entangled parallel computation, not the total qubits used by the entire computation of an algorithm. The algorithm which uses 20 qubits in a parallel computation might use a total of 23 or 43 qubits. It is only the qubits which are used in an entangled parallel computation which give the quantum advantage, the quantum parallelism.

Quantum supremacy

Quantum supremacy is sometimes simply a synonym for quantum advantage, or it expresses the fact that a quantum computer can accomplish a task that is impossible on a classical computer, or that could take so long, like many, many years, that it is outright impractical on a classical computer.

In 2019 Google claimed that it had achieved quantum supremacy, although it was more of a contrived computer science experiment than a practical real-world application. I reported on this in detail in my paper:

Random number-based quantum algorithms and quantum applications are actually commercially viable today

Although much of quantum computing is still subject to research and intensive pre-commercialization, there is one narrow niche that actually is commercially viable right now, today — generation of true random numbers. True random numbers are not mathematically computable, and hence cannot be computed using a Turing machine or classical computer. Special hardware is needed to access the entropy (i.e., randomness) required for true random numbers. Quantum effects can supply the necessary entropy. And quantum computers are able to access the necessary quantum effects. No further research is required to make use of this simple capability right now, today.

This ability to generate true random numbers has significant applications for cybersecurity, such as encryption key generation.

In short, quantum algorithms and quantum applications focused on the generation of true random numbers are commercially viable today.

That said, if true random number generation is only a part of the quantum algorithm or quantum application, there may still be other parts of the quantum algorithm or quantum application which are not commercially viable today.

For more detail on true random number generation and quantum computers, see my paper:

In fact, there is already at least one commercial product exploiting this specific feature of quantum computing:

Quantum supremacy now: Generation of true random numbers

Generation of true random numbers is one area where quantum computers have already achieved quantum supremacy — they are able to accomplish something that no classical computer can accomplish. Classical computers can generate pseudo-random numbers, but not true random numbers. That said, special hardware can be attached to a classical computer which effectively mimics a little bit of what a quantum computer can do by capturing entropy from the environment.
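The pseudo-random vs. entropy distinction can be illustrated classically. In this sketch (purely illustrative, and involving no quantum hardware), Python's random module is a deterministic pseudo-random generator, while the secrets module draws on operating-system entropy sources, playing the role of the special entropy hardware mentioned above:

```python
import random
import secrets

# A deterministic pseudo-random generator: the same seed always
# reproduces the same sequence, so nothing is "truly" random here.
prng = random.Random(42)
first_run = [prng.randint(0, 9) for _ in range(5)]
prng = random.Random(42)
second_run = [prng.randint(0, 9) for _ in range(5)]
print(first_run == second_run)  # True: fully reproducible

# OS-level entropy (the classical analog of the special entropy
# hardware mentioned above): not reproducible from any seed.
print(secrets.randbits(32))
```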

Need to summarize capability requirements for quantum algorithms and applications

Eventually quantum algorithms and applications will be run on actual quantum computers. Not every quantum algorithm or application will be able to run on every quantum computer. The capabilities required by each quantum algorithm and application must be clearly documented so that they can be matched to the capabilities of particular quantum computers. I have proposed an approach to labeling the capability requirements for quantum algorithms and applications to facilitate this matching process.

For details of my proposal, see my paper:

Matching the capability requirements for quantum algorithms and applications to the capabilities of particular quantum computers

Not every quantum algorithm or application will be able to run on every quantum computer. Each quantum algorithm and application has its own requirements for quantum computing capabilities. Careful attention must be given to assuring that the capability requirements of a quantum algorithm or application are matched by the capabilities of particular quantum computers. To this end, quantum computers must be properly labeled as to their capabilities, and quantum algorithms and applications must be properly labeled as to their capability requirements.
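As a purely hypothetical sketch of such matching, the field names and numbers below are invented for illustration and do not represent any proposed label format:

```python
# Match an algorithm's capability requirements against a machine's
# capabilities: every numeric requirement must be met or exceeded.
# All field names and values here are hypothetical illustrations.

def machine_satisfies(requirements: dict, capabilities: dict) -> bool:
    return all(capabilities.get(key, 0) >= needed
               for key, needed in requirements.items())

algorithm_needs = {"qubits": 24, "max_circuit_depth": 300}
machine_a = {"qubits": 27, "max_circuit_depth": 500}
machine_b = {"qubits": 16, "max_circuit_depth": 1000}

print(machine_satisfies(algorithm_needs, machine_a))  # True
print(machine_satisfies(algorithm_needs, machine_b))  # False: too few qubits
```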

This topic is discussed in detail in my paper which is a proposal for a label for capabilities for quantum computers, algorithms, and applications:

A variety of quantum computer types and technologies

Not all quantum computers are created equal — there’s a lot of diversity. There are two main dimensions of diversity among quantum computers:

  1. Type. What functions the quantum computer performs. The programming model. What you can do with it — which applications it can handle and which applications it handles best. This is an abstraction of function rather than the physical implementation.
  2. Technology. The specific technologies used to implement the quantum computer. Especially the qubits. Different quantum computers of the same type can be implemented using different technologies — they still function the same and address the same applications in the same way, but the technical details under the hood may be subtly or radically distinct.

Quantum computer types

There are really two very distinct categories of quantum computers:

  1. General-purpose quantum computer. Can be applied to many different types of applications. May be referred to more technically as a universal general-purpose gate-based quantum computer.
  2. Special-purpose quantum computing device. Only applies to a relatively narrow niche of applications which meet criteria of the device. Some may refer to it as a special-purpose quantum computer as well, although its lack of generality argues against this. May also be referred to as a single-function quantum computing device or a single-function quantum computer.

For the most part, this paper focuses on the former, general-purpose quantum computer.

General-purpose quantum computers

Just to reiterate the caveat from the beginning of this paper, this paper focuses on general-purpose quantum computing, using so-called universal gate-based quantum computers.

Special-purpose quantum computing devices such as quantum annealing, boson sampling, other forms of so-called photonic quantum computing, and specialized physics simulators are explicitly excluded from this paper, but are mentioned to a limited degree in my previous paper on quantum computers:

General-purpose quantum computers are the most general. They are the most flexible. And generally the most useful.

A general-purpose quantum computer may also be referred to more technically as a universal general-purpose gate-based quantum computer. The extra adjectives clarify the distinctions from special-purpose quantum computing devices.

For the purposes of this paper we are focused almost exclusively on general-purpose quantum computers.

Universal general-purpose gate-based quantum computer

Just to emphasize that universal general-purpose gate-based quantum computer is a technically specific term which distinguishes general-purpose quantum computers from special-purpose quantum computing devices (or special-purpose quantum computers).

Vendors of general-purpose quantum computers

Some of the vendors of general-purpose quantum computers include:

  1. IBM.
  2. Rigetti Computing.
  3. IonQ.
  4. Honeywell.
  5. Google.
  6. Intel.

That’s not intended to be an all-inclusive list, just some examples to help the reader navigate the field.

Special-purpose quantum computing devices

Some of the different types of special-purpose quantum computing devices (or special-purpose quantum computers) that exist today:

  1. Quantum annealers. Focused on quantum annealing (QA). A specialized niche of optimization problems. Not a general-purpose quantum computer for non-optimization applications. D-Wave Systems is the best example.
  2. Adiabatic quantum computers. Generalization of quantum annealers. Again, not general-purpose.
  3. Boson sampling devices. Another specialized form of quantum computing — a special-purpose quantum computer or special-purpose quantum computing device. Also known as Gaussian boson sampling (GBS).
  4. Photonic quantum computing. Sometimes simply a synonym for boson sampling device. Still too early in the research stage to judge whether it has potential as a general-purpose quantum computer or restricted to special purposes such as boson sampling devices.
  5. Physics simulators. Custom-designed, special-purpose quantum computing device designed to address a specific physics problem. Not a general-purpose quantum computer. Debatable whether it should even be called a quantum computer, but it is a specialized quantum computing device.

Special-purpose quantum computers or special-purpose quantum computing devices?

Whether to refer to special-purpose quantum computing devices as special-purpose quantum computers is an interesting semantic challenge and debate. There’s no clear, bright-line answer at this stage.

Some will prefer the latter and even simply refer to them as quantum computers despite their lack of being general purpose.

I will default to overall referring to them as special-purpose quantum computing devices, but I won’t object too strenuously if people refer to them as special-purpose quantum computers. I will object if people refer to them simply as quantum computers.

General-purpose vs. special-purpose quantum computers

Just to emphasize the point: when someone, especially in the media, but even in technical media, uses the term quantum computer, that alone doesn’t tell you whether they are talking about a general-purpose quantum computer or a special-purpose quantum computing device (or special-purpose quantum computer) as distinguished in the preceding sections.

You generally need to examine the context to determine the true nature of the hardware. Even then, it may still not be apparent.

Qubit technologies

See quantum computer technologies. The technology of the qubit is the primary characteristic of the technology of a quantum computer. It can be easier (briefer) to refer to qubit technology rather than quantum computer technology.

Qubit modalities

Some people use the term qubit modality to refer to what I refer to as qubit technology or quantum computer technology.

They may also use the terms quantum computer modalities or quantum computing modalities.

But qubit technology, qubit modality, quantum computer technology, quantum computer modality, and quantum computing modality all mean essentially the same thing — what technology and science is used to implement the qubits of a quantum computer.

Quantum computer technologies

As mentioned above, quantum computers of the same type function the same (or at least similarly) although they may be implemented with different technologies. There are a variety of approaches to storing and manipulating quantum information. There are only a few main technologies in use today:

  1. Superconducting transmon qubits. IBM, Rigetti, Google.
  2. Trapped-ion qubits. IonQ, Honeywell.
  3. Neutral-atom qubits. ColdQuanta, Atom Computing.
  4. Classical software simulators. Not a hardware technology per se, but certainly a quantum computer technology.

Some other quantum computer technologies either under current or past research:

  1. Nuclear Magnetic Resonance (NMR). The earliest quantum computing technology. Used in the late 1990s and early 2000s, until superconducting transmon qubits became available. More recently, SpinQ is offering two and three-qubit NMR-based quantum computers for economical experimentation.
  2. Nitrogen-vacancy Center (NVC). Primarily in diamond, at present.
  3. Photonics. Great hopes and potential, but other than boson sampling devices, it has remained an elusive research effort.
  4. Silicon spin qubits. Intel.
  5. Topological qubits. Microsoft, QuTech.

These lists are not intended to be exhaustive, but simply intended to help the reader navigate the field.

Different types and technologies of quantum computers may require distinctive programming models

There are a variety of types and technologies of quantum computers, each with its own nuances. The differences could range from relatively minor, so that quantum algorithms for one type or technology of quantum computer can run compatibly on other types or technologies, to relatively major, meaning that algorithms for one type or technology will generally be incompatible with other types and technologies.

Some general rules:

  1. Special-purpose quantum computers are generally not compatible with general-purpose quantum computers.
  2. Different types of special-purpose quantum computers are generally not compatible with other types of special-purpose quantum computers.
  3. General-purpose quantum computers tend to be relatively compatible with other technologies of general-purpose quantum computers.
  4. There will frequently be at least minor differences between different technologies of general-purpose quantum computers.

Don’t get confused by special-purpose quantum computing devices that promise much more than they actually can deliver

Special-purpose quantum computing devices can indeed deliver amazing capabilities, but only for a relatively narrow niche of functions. If your particular application problem can’t be easily mapped to that narrow niche, you’re out of luck. You’re probably better off with a general-purpose quantum computer.

The catch is that we’re still very early in the development and evolution of quantum computers, so that even general-purpose quantum computers aren’t yet up to solving many application problems.

Coherence, decoherence, and coherence time

Coherence is the ability of a quantum computer to remain in its magical quantum state where quantum effects can be maintained in a coherent manner which enables the quantum parallelism needed to fully exploit quantum computation.

Decoherence is when a quantum system (such as a single isolated qubit or an entangled collection of qubits) loses its coherence.

Generally a quantum system doesn’t lose its coherence suddenly, all at once; rather, it decays or deteriorates over time, and we can speak of what fraction of its coherence remains, or the probability that, if you measure the quantum system, you will get the expected quantum state.

Coherence time is the time which must elapse before it is likely that the quantum state of a quantum system has lost most of its coherence. It is generally fairly short for many quantum computer technologies, limiting the size and complexity of quantum algorithms.

But some technologies are more coherent than others, meaning they have a longer coherence time, which enables greater size and complexity of quantum algorithms.
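As a hedged illustration of that gradual decay, one common idealized model (an assumption on my part, not something defined in this paper) treats coherence as decaying exponentially, with the coherence time as the time constant:

```python
import math

# Idealized exponential-decay model of coherence (an illustrative
# assumption): after one coherence time, roughly 37% remains; the
# fraction keeps shrinking the longer the elapsed time.

def coherence_remaining(elapsed: float, coherence_time: float) -> float:
    return math.exp(-elapsed / coherence_time)

print(round(coherence_remaining(0.0, 100.0), 2))    # 1.0 -- fully coherent
print(round(coherence_remaining(100.0, 100.0), 2))  # 0.37
print(round(coherence_remaining(300.0, 100.0), 2))  # 0.05 -- mostly decohered
```

A longer coherence time simply stretches this curve out, leaving more room for useful computation before the quantum state deteriorates.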

Gate execution time — determines how many gates can be executed within the coherence time

Coherence time alone does not determine the absolute size of a quantum algorithm — it also depends on how long it takes to execute each quantum logic gate — gate execution time.

The faster a quantum logic gate can be executed, the more of them that can be executed in the coherence time.

Maximum quantum circuit size — limits size of quantum algorithms

Although coherence time does in fact limit quantum circuit size and hence the size of a quantum algorithm, the coherence time alone doesn’t tell you the limit — you have to divide the coherence time by the gate execution time to get the maximum number of quantum logic gates which can be executed in the coherence time interval.

For example, if the coherence time was 100 microseconds and the gate execution time was 250 nanoseconds, then a maximum of 400 quantum logic gates could be executed during the coherence time interval. So the maximum quantum circuit size would be 400 quantum logic gates.

The maximum quantum circuit size is the limit to the size of quantum algorithms.
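The division described above can be sketched as follows, using integer nanoseconds to keep the arithmetic exact. The function name is illustrative:

```python
# Maximum quantum circuit size = coherence time / gate execution time.
# Reproduces the worked example above: 100 microseconds of coherence
# at 250 nanoseconds per gate allows at most 400 gates.

def max_circuit_size(coherence_time_ns: int, gate_time_ns: int) -> int:
    return coherence_time_ns // gate_time_ns

print(max_circuit_size(100_000, 250))  # 400 quantum logic gates
```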

Another quantum computer technology: the simulator for a quantum computer

There is one particularly interesting quantum computer technology, a simulator for a quantum computer.

Simulators for quantum computers

In addition to having an actual real hardware implementation of a quantum computer, we can have simulators for quantum computers, which are software packages which run on classical computers to simulate the behavior of real quantum computer hardware.

You don’t need a real quantum computer to run simple quantum algorithms.

But complex quantum algorithms will run very slowly or not at all on a simulator, since the classical resources needed to simulate quantum state grow exponentially with the number of qubits. After all, the whole point of a quantum computer is to greatly outperform even the best classical supercomputers.

Simulators of quantum computers are also good for debugging and experimenting with improved hardware before it is even available.

In theory, a simulator for a quantum computer can be configured to very closely approximate the features, limits, and quirks of any real or proposed quantum computer, subject to limitations of performance and capacity of the classical computer on which the simulator is running.

There are several reasons to utilize a simulator for a quantum computer:

  1. The real quantum computer hardware does not exist yet.
  2. The real quantum computer hardware is not fully debugged yet. The simulator can function correctly even if the real hardware cannot.
  3. The real quantum computer hardware is too noisy. Researchers and evaluators want to know how the real hardware might perform if it was somewhat less noisy. Discover how much less noisy it needs to become before it can deliver useful results.
  4. To debug a quantum algorithm. By definition, the operations within a quantum circuit are not observable until the end of the circuit when the qubits are measured. A simulator will allow you to examine all aspects of quantum state at every step of the way.
  5. To test out ideas for proposed revisions to current quantum computers. Before the hardware is available.
  6. To test out ideas for entirely new quantum computers. Before the hardware is available.
  7. To perform algorithm research for ideal quantum computers. Even though such hardware cannot actually be created in reality. It is still useful to know what the limits of the capabilities of a quantum computer really are.
  8. For benchmarking. Such as calculating Quantum Volume (QV).

For additional information on uses of simulators for quantum computers, see my paper:

Classical quantum simulator

I also refer to a simulator for a quantum computer as a classical quantum simulator.

Quantum computers — real and simulated are both needed

Real quantum computers and simulated quantum computers are both important for quantum computing.

Simulated quantum computers are needed for development and debugging. Especially during the early stages, in pre-commercialization. But even during commercialization, when real quantum hardware may still be under development, or when debugging is needed.

Real quantum computers are great for production, but less useful for development and debugging.

Two types of simulator and simulation

The terms simulator and simulation are actually ambiguous in quantum computing. They can have two distinct meanings:

  1. Simulating science. Especially simulating physics or simulating chemistry.
  2. Simulating a quantum computer on a classical computer. Either because real quantum computing hardware is not available, to facilitate debugging, or to research quantum algorithms abstractly on an ideal quantum computer. In my writing I refer to this as a classical quantum simulator or a simulator for a quantum computer.

Generally it should be clear from context which meaning is intended.

Classical quantum simulator — simulate a quantum computer on a classical computer

Just to emphasize that I use the term classical quantum simulator to refer to using classical software to simulate a quantum computer on a classical computer.

Simulation — simulating the execution of a quantum algorithm using a classical computer

As a general proposition, when I use the term simulation, I’m referring to the use of a classical quantum simulator to simulate the execution of a quantum algorithm using a classical computer rather than a real quantum computer.

But a physicist or quantum chemist will commonly refer to their quantum algorithms (or their classical code if they aren’t yet using a quantum computer) as simulation, such as simulating physics or simulating chemistry.

Context may dictate that simulation implies simulation of science

My general rule that simulation implies simulation of a quantum circuit on a classical computer notwithstanding, if the context is applications and the simulation of science such as physics and chemistry, then the context may dictate that the naked term simulation implies simulation of science rather than simulation of a quantum computer.

Focus on using simulators rather than real quantum computers until much better hardware becomes available

Current quantum computers have too many shortcomings to be very useful or productive in the near term. It would be more productive for most people to use classical quantum simulators rather than real quantum computers for most of their work.

Simulators also make it easier to debug quantum algorithms.

The simulators should be configured to match the capabilities which are expected for practical quantum computers so that results from simulation will closely match results from practical quantum computers when they do become available in a few years.

A configuration such as my proposal for a 48-qubit quantum computer with fully-connected near-perfect qubits, or something fairly close to it, would be reasonable. See my proposal:

Capabilities, limitations, and issues for quantum computing

Much of quantum computing can be categorized as capabilities, limitations, or issues. Those are my own three areas of interest as a technologist.

For a high-level summary and some detail on these three aspects of quantum computing, see my paper:

Quantum computers are inherently probabilistic rather than absolutely deterministic

Quantum computers are inherently probabilistic by nature rather than being strictly and absolutely deterministic as classical computers are. This means that they are not suitable for calculations requiring absolute precision, although statistical processing can approximate determinism, to some degree.

And this is beyond any errors due to noise, so even with flawless quantum hardware and full quantum error correction, quantum computers will still produce probabilistic results.

A lot of calculations, especially in the natural sciences, are completely compatible with calculations being probabilistic. Or approximate in general.

In fact, a lot of quantum algorithms explicitly rely on quantum computations being probabilistic. So this is actually a feature, not a bug.

It is common for quantum applications to run a given quantum algorithm multiple times and then to examine the results to see which result has the highest probability of being correct. This is called circuit repetitions or shot count or just shots.

For more detail on circuit repetitions and shots, see my paper:

Quantum computers are a good choice when approximate answers are acceptable

The preceding section highlighted how quantum computers are inherently probabilistic in nature. Superficially that sounds like a negative, but for a lot of computations, especially in the natural sciences or where statistical approximations are used, probabilistic results can be quite acceptable. But approximate results are clearly not good for financial transactions where every penny counts.

And if achieving approximate answers is much faster than classical computing, that’s a very positive thing.

Of course, not all problems can be effectively solved with approximate answers, so the algorithm designer and application developer will have to do a careful analysis and use solid judgment to determine whether a probabilistic or approximate answer will be acceptable for a given problem or application.

Statistical processing can approximate determinism, to some degree, even when results are probabilistic

As noted earlier, even though quantum computers are probabilistic by nature rather than strictly deterministic as classical computers are, statistical processing can enable us to achieve a close approximation to a deterministic answer in many cases.

Generally this simply means that a given quantum computation (algorithm or circuit) is run a bunch of times, possibly even hundreds or thousands of times, enough to get a decent statistical distribution of the results. An examination of that distribution can tell us whether there is in fact a single result or narrow range of results which have proven to be statistically more likely. If so, this is a close approximation of a deterministic result. If not, then maybe no deterministic result can be achieved.
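A sketch of that statistical processing, using a classical stand-in for the quantum computation. The 60% bias, the bit strings, and all names here are illustrative assumptions, not a real quantum circuit:

```python
import random
from collections import Counter

# Simulate repeating a probabilistic computation many times (shots),
# then statistically process the distribution of results. This
# stand-in returns the "right" answer 60% of the time, noise otherwise.

def noisy_computation(rng: random.Random) -> str:
    if rng.random() < 0.6:
        return "101"
    return rng.choice(["000", "011", "110"])

rng = random.Random(7)  # fixed seed so this sketch is reproducible
shots = 1000
counts = Counter(noisy_computation(rng) for _ in range(shots))

most_likely, count = counts.most_common(1)[0]
print(most_likely)          # "101" dominates the distribution
print(count / shots > 0.5)  # True: a clear single mode emerged
```

If no single result (or narrow range) dominates the distribution, that is the multimodal or uniform case discussed below, and no approximately deterministic answer can be extracted.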

In some cases, it may turn out that the classical solution is flawed: it assumes that a deterministic result can be achieved and does in fact compute a single result, but that result may in fact be only one of many possible results had the assumptions been made correctly.

It’s also possible that the distribution of quantum results does not end up having a single mode (unimodal) representing a single best result. Some quantum computations might produce two or more modes (bimodal or multimodal). Some quantum computations might produce a uniform distribution of results, essentially truly random results. The quantum algorithm designer and quantum application developer will have to decide how they will deal with such possibilities. In some cases, it will be due to limitations of the quantum algorithm or the quantum computer, or just the nature of the real-world problem, or an issue with the input data.

For more on the process of repeating the execution of a quantum computation and statistically processing the results, see my paper:

As challenging as quantum computer hardware is, quantum algorithms are just as big a challenge

The rules for using a quantum computer are rather different from the rules governing classical computers, so entirely new approaches are needed for quantum algorithms.

There are quite a few areas where more research is needed in the design of quantum algorithms.

And hardware and algorithms are tied together, so advances in hardware are also needed to enable advances in algorithms.

For some detail, see my paper:

And for more on research needed, see my paper:

The process of designing quantum algorithms is extremely difficult and challenging

Even though the final quantum algorithm may be very simple, getting to that result is a great challenge. And testing is really difficult as well.

Unfortunately, there is no simple cookbook for the design of quantum algorithms. Or for testing.

It’s all art and craft. And too much trial and error as well.

Okay, a lot of the time it is simply a matter of finding algorithms for similar problems and seeing how much of them can be scavenged and repurposed for the new quantum algorithm.

Algorithmic complexity, computational complexity, and Big-O notation

Algorithmic complexity and computational complexity are roughly synonyms and refer to the calculation of the amount of work that an algorithm will need to perform to process an input of a given size. Generally, the amount of work will vary based on the size of the input data. Big-O notation is used as a shorthand to summarize the calculation of the amount of required work.

Algorithmic complexity can be thought of as how hard an algorithm will have to work to do its job.

Quantum computers are ideal for computing tasks which are too hard for classical computers. That means that the classical equivalent of a typical quantum computer algorithm will have a high degree of algorithmic complexity.

The purpose of a quantum computer is to reduce the algorithmic complexity of a solution to a problem.

A typical quantum algorithm will have a fairly low degree of algorithmic complexity.

At least in theory. And that is the goal.

Algorithmic complexity will generally be described using so-called Big-O notation, which uses a shorthand to describe a rough approximation of the computation of the amount of work required for the algorithm. The “O” is an abbreviation of “on the order of”. A few brief examples:

  1. O(c). Constant complexity. Won’t vary based on input data size. Always very fast.
  2. O(log n). Log complexity. Sub-linear complexity. Work increases more slowly than the input data size. Quite fast, better than linear. Faster than square root complexity.
  3. O(sqrt n). Square root complexity. Work increases as the square root of the input data size. Characteristic of the Grover search algorithm. Much faster than linear. Slower than log complexity.
  4. O(n). Linear complexity. Work increases in proportion to any increase in input data size. Quite fast — unless you’re processing a large amount of input.
  5. O(n²). Quadratic complexity, the simplest common case of polynomial complexity. Work increases as the square of the input data size. Generally what we’re shooting for in quantum computing — assuming that the classical solution has exponential complexity.
  6. O(2^n). Exponential complexity. Work doubles with every single-unit increase in input data size. Generally what we’re trying to get away from for quantum computing.
  7. O(n!). Factorial complexity. Work increases extraordinarily fast as input data size increases. The Traveling Salesman Problem (TSP) is the common example. Beyond the capability of even a quantum computer.
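To make those growth rates concrete, here is a small, purely illustrative sketch (not part of any quantum toolkit) that computes rough work estimates for each complexity class at a few input sizes:

```python
import math

# Rough work estimates for each complexity class at input size n,
# to make concrete how differently they grow (constant factors omitted).
def work(n):
    return {
        "O(c)":      1,
        "O(log n)":  math.log(n),
        "O(sqrt n)": math.sqrt(n),
        "O(n)":      n,
        "O(n^2)":    n ** 2,
        "O(2^n)":    2 ** n,
        "O(n!)":     math.factorial(n),
    }

for n in (10, 20, 30):
    print(n, {k: round(v) for k, v in work(n).items()})
```

Even at n of only 30, the exponential and factorial entries dwarf everything else — which is exactly the kind of classical workload a quantum algorithm aims to escape.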

Algorithmic complexity is a very difficult topic, generally beyond the scope of this paper.

For detail, see my paper:

Quantum speedup

Moving from a classical algorithm with a high degree of algorithmic complexity to a quantum algorithm with a low or lower degree of algorithmic complexity achieves what is referred to as a speedup or quantum speedup.

There are two speedups commonly referred to in quantum computing:

  1. Exponential speedup. When the classical algorithm has exponential complexity while the equivalent quantum algorithm has only polynomial complexity.
  2. Quadratic speedup. Commonly referring to the Grover search algorithm, where the classical algorithm has linear complexity while the quantum algorithm has square root complexity.
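A tiny numeric illustration of the quadratic speedup (purely illustrative, with constant factors omitted): classical unstructured search examines on the order of N items, while Grover’s algorithm needs on the order of the square root of N iterations.

```python
import math

# Classical unstructured search checks on the order of N items.
def classical_steps(n):
    return n

# Grover's algorithm needs on the order of sqrt(N) iterations.
def grover_steps(n):
    return math.isqrt(n)  # integer square root; constant factors omitted

for n in (100, 10_000, 1_000_000):
    print(n, classical_steps(n), grover_steps(n))
```

At a million items, that is roughly a thousand quantum iterations versus a million classical checks — a big win, but far short of an exponential speedup.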

Key trick: Reduction in computational complexity

One of the key secret tricks for designing quantum algorithms is to come up with clever techniques for reduction in computational complexity — turn a hard problem into an easier problem. Yes, it is indeed harder than it sounds, but the benefits can be well worth the effort.

A great example is Shor’s factoring algorithm, which transforms factoring into order finding, which is a much simpler problem to solve. Actually, Shor transformed factoring into a pair of problems, modular exponentiation and order finding. Those are still both fairly hard problems, but simpler than raw factoring itself.

We need real quantum algorithms on real machines (or real simulators) — not hypothetical or idealized

We’ve had quite a few years of papers published based on quantum algorithms for hypothetical or idealized quantum computers rather than real machines (or real simulators configured to reflect realistic expectations for real machines in the next few years.) That has led to unrealistic expectations for what to expect from quantum computers and quantum algorithms.

Alternatively, we’ve seen primitive quantum algorithms tailored to very limited current quantum computers, which once again don’t realistically represent the capabilities which will be available in the next few years, or even available in practical quantum computers a little later.

Granted, real quantum computers are still rather limited and constrained, but we also have simulators which can be configured to approximate realistic expectations of real quantum computers that should be available in the coming few years, or the practical quantum computers coming a little later.

We can’t simulate hardware with hundreds of qubits — or even one hundred — but we can simulate 32 to 40 qubits, and maybe up to 48 qubits within a year or two.

Might a Quantum Winter be coming?

The term Quantum Winter refers to a period of disenchantment with the technology which would occur if promised advances fail to materialize or fail to meet sky-high expectations.

Many grandiose promises have been made for quantum computing and expectations have been set very, very high, so a Quantum Winter is a real theoretical possibility.

That said, I don’t expect a Quantum Winter in the next couple of years. But if promised advances don’t materialize over the next two to three years, then the risk of a Quantum Winter will rise dramatically.

The risk of a Quantum Winter is currently fairly low, but it does in fact appear to be rising. It will be quite a horse race between rising expectations, rising risks, and actual delivery of technology that meets or exceeds those rising expectations.

For more on Quantum Winter, see my paper:

Risk of a Quantum IP Winter

Separate from the possibility of a Quantum Winter which is based on the inability of technology to fulfill expectations, there is also the possibility of a Quantum Winter based on restrictions on use of available technology as a result of intellectual property protections.

At some stage we may get to the point where the technology needed to fulfill expectations may indeed be feasible, workable, and readily available, but restricted due to IP legal protections or onerous licensing fees, or contract disputes or protracted lawsuits. A form of Quantum Winter could result from such a legalistic inability to use the available technology to its full potential.

This is not a high probability event, but is still a possibility.

No, Grover’s search algorithm can’t search or query a database

Some naive introductions to quantum computing suggest or explicitly state that the Grover search algorithm can be used to search a database, but that simply isn’t true.

In fact, Grover’s search algorithm does exactly the opposite — it searches unstructured data, whereas a database is by definition built on structured data.

So, don’t expect Google to replace its vast search engine infrastructure with a single 72-qubit quantum computer running the Grover search algorithm.

No, Shor’s factoring algorithm probably can’t crack a large encryption key

Everybody is loudly proclaiming that all encrypted data is at risk of being decrypted by a quantum computer running Shor’s factoring algorithm to crack public encryption keys, but:

  1. This isn’t true today.
  2. This won’t be true in the near future.
  3. This won’t be true in the foreseeable future.
  4. This is very unlikely to be true… ever.

There are detailed technical reasons why Shor’s factoring algorithm may indeed eventually work for smaller numbers up to maybe 10 to 20 bits, but not work for very large numbers beyond that, especially 1024, 2048, and 4096-bit public encryption keys.

The technical details are described in my paper:

See the section entitled What about Shor’s factoring algorithm? Sorry, but it will only work for small numbers, not very large numbers.

No, variational methods don’t show any promise of delivering any dramatic quantum advantage

If the quantum Fourier transform (QFT) cannot be used, one category of alternatives is variational methods. Unfortunately, they are nowhere near as powerful as the quantum Fourier transform.

They work, in a fashion, but don’t really offer a lot of computational power or opportunity for truly dramatic quantum advantage. Maybe they might work extremely well for some niche applications, but nobody has discovered any yet. So far, only mediocre results, at best.

And they have difficulties such as so-called barren plateaus which make them difficult to work with and problematic as well.

Mostly such an approach simply confirms that a solution can be implemented on a quantum computer, not that such a solution has any great advantage over classical solutions.

The incremental and iterative nature of a variational method eliminates the potential for any dramatic quantum advantage, even if some more modest fractional quantum advantage might still be possible.

While a quantum Fourier transform might evaluate 2^n possible solutions all at once, a variational method will only evaluate 2^k possible solutions at a time, where k is much smaller than n, and a sequence of attempts for different ranges of 2^k solutions must be attempted iteratively using classical code. So, there is far less than a 2^n computational advantage over classical methods. In fact, the advantage isn’t even 2^k since a sequence of attempts must be made, with a classical optimization step between each of them.
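The iterative classical-quantum loop just described can be caricatured as follows. Here quantum_cost() is a purely classical stand-in for evaluating a small parameterized circuit, so this shows only the shape of the loop — propose, evaluate, keep or discard, repeat — not real quantum execution:

```python
import random

# Caricature of a variational loop: a classical optimizer repeatedly
# proposes parameters, a small "quantum" evaluation scores each proposal,
# and only a small region of the solution space is explored per step.
def quantum_cost(params):
    # Purely classical stand-in for running a small parameterized circuit
    # and measuring a cost; the minimum is at params == 3.0.
    return (params - 3.0) ** 2

def variational_minimize(steps=2000, step_size=0.1):
    params = random.uniform(-10, 10)
    for _ in range(steps):
        # Classical optimization step between "quantum" evaluations: try a
        # small perturbation and keep it only if the cost improves.
        candidate = params + random.uniform(-step_size, step_size)
        if quantum_cost(candidate) < quantum_cost(params):
            params = candidate
    return params

random.seed(1)
print(variational_minimize())  # converges close to 3.0
```

Note how many evaluations the classical loop consumes — that iterative overhead is exactly why the advantage falls far short of 2^n.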

In short, reliance on variational methods will not deliver the full promise of quantum computing, no dramatic quantum advantage. Any quantum advantage of a variational method will be modest at best, a fractional quantum advantage.

For more on variational methods, see the Google tutorial:

Quantum computer vs. quantum computing vs. quantum computation

The three terms quantum computer, quantum computing, and quantum computation are very closely related but are different aspects of the same phenomenon:

  1. Quantum computer. The machine. A device. The hardware. It does the actual work. It has capabilities — and limitations. You feed it data and quantum code and it produces output. May be on the premises or accessed remotely in the cloud.
  2. Quantum computing. The overall process. Including the machine (the quantum computer), support software, tools, data, code, executing code, gathering results, using the results, and all of the issues involved in this process. Acquiring and maintaining the machine as well. And the people and organizations involved as well. And training and education. And conferences and publications. And collaborations between organizations.
  3. Quantum computation. The actual work done by the quantum computer. There are two sides to a quantum computation: the data and code that get sent to the machine — the description of the task to be performed — and the results that are returned. Also the application software that prepares the quantum computation and consumes its results, and the overall process of arranging and performing these parts.

Some usage:

  1. You design and develop a quantum computation. A quantum algorithm.
  2. You execute a quantum computation on a quantum computer. A quantum algorithm, although technically it is a quantum circuit that is executed.
  3. Somebody needs to decide to acquire or arrange for access to a quantum computer.
  4. You design and develop a quantum application. It uses one or more quantum algorithms. But mostly it is classical software.
  5. You execute your quantum application, which in turn performs one or more quantum computations, whose results are used by the quantum application.
  6. Quantum computing is both the development and the use of quantum algorithms and quantum applications, as well as acquiring and maintaining the quantum computer.

Quantum computer vs. quantum computer system vs. quantum processor vs. quantum processing unit vs. QPU

These terms are frequently used as synonyms, but there are distinctions:

  1. Quantum computer system. All of the hardware. All of it. Including the fancy enclosure box with the vendor and system name, the cabling, power supply, cooling system — dilution refrigerator or vacuum chamber, if any — console, network connection, a classical computer and classical hardware needed to control the quantum hardware, and support software that runs on it. Everything needed to support a quantum computer.
  2. Quantum processor. The subset of the hardware which actually performs the quantum computation. Includes classical digital and analog circuitry to control the quantum circuitry.
  3. Quantum processing unit. Generally a reference to the quantum processor. Alternatively, a reference to the physical hardware or chip that holds the qubits of the quantum processor. Abbreviated as QPU.
  4. QPU. Initialism for quantum processing unit.
  5. Quantum computer. Alternatively this could simply be a synonym for the overall quantum computer system, or the subset of the overall quantum computer system that includes both the quantum processor and any hardware and software needed to process a quantum computation once it has been received by the quantum computer system. This would include transforming quantum circuits from unitary transformation matrices into classical electronic signals that would control the quantum processor. This also includes minimal processing of the results of a quantum computation so that they can be returned to whoever requested the quantum computation to be performed, either a person or a quantum application.

Usage:

  1. Quantum computer system is the proper term to refer to the actual hardware and the overall system.
  2. Quantum computer is the proper term for what a quantum software developer is concerned with when designing and developing quantum algorithms and quantum applications.
  3. Quantum computer is also the proper term for the subset of the overall quantum computer system which processes a quantum computation once it has been received from outside of the quantum computer system and is to be executed. The quantum computer will do only minimal processing of the results of a quantum computation, merely packaging the results so that they are ready to be shipped back to whoever requested the computation to be executed, either a person or a quantum application.
  4. QPU or quantum processor are proper terms to refer to just the raw qubit hardware, although technically they include the classical digital and analog circuitry which controls the qubits.
  5. Quantum processor is the proper term to refer to the subset of the overall quantum computer system which is only concerned with the processing and execution of quantum circuits (algorithms.)

Quantum computer as a coprocessor rather than a full-function computer

We commonly refer to a quantum computer as if it were a full-function computer system, but it falls far short of that. There are many functions which can be performed with even the simplest classical computer which are beyond what is possible with even the best quantum computers. But the whole point of a quantum computer — or quantum processor, if you will — is that it performs only a small fraction of what a classical computer can do, but it performs that fraction extremely well, far better than even the best classical supercomputers.

Most of the function of a typical quantum application is performed on a classical processor, and only a small fraction of its function is offloaded to a quantum processor (quantum computer).

The quantum computer (quantum processor) acts as a coprocessor — the term coprocessor is a synonym for auxiliary processor.

This is similar to the way that graphical functions can be offloaded to a graphics processing unit (GPU).

So the bulk of the quantum application is processed by the classical computer as the main processor with a small fraction of the function offloaded to the quantum processor (quantum computer) as a coprocessor.
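That division of labor might be sketched like this, with a hypothetical run_on_qpu() standing in for the single offloaded quantum step (solved classically here purely for illustration):

```python
# Sketch of the coprocessor pattern: the bulk of the application runs
# classically, and only one small, hard kernel is offloaded to the
# (hypothetical) quantum processor.
def run_on_qpu(problem):
    # Stand-in for dispatching a quantum circuit to a QPU and gathering
    # the measured results; here, solved classically for illustration.
    return min(problem)

def quantum_application(data):
    # Classical pre-processing...
    prepared = [x * 2 for x in data]
    # ...one offloaded "quantum" step...
    best = run_on_qpu(prepared)
    # ...and classical post-processing of the result.
    return best + 1

print(quantum_application([3, 1, 2]))  # prints 3
```

Everything except the single run_on_qpu() call is ordinary classical code — which is representative of real quantum applications.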

In short, what we commonly refer to as a quantum computer or quantum processor or quantum processing unit or QPU is in fact an auxiliary processor or coprocessor.

Quantum computers cannot fully replace classical computers

Quantum computers may sound as if they have great potential to fully replace classical computers, but that just isn’t true. Although quantum computers can do some tasks much better than classical computers, there are many tasks that classical computers can do that quantum computers simply can’t do at all.

Eventually quantum computers will be merged with classical computers as a universal quantum computer, but that’s far in the future.

For my own proposal for a universal quantum computer, see my paper:

But for now and the foreseeable future, quantum computers will be an adjunct to classical computers rather than an outright replacement for them.

Quantum applications are mostly classical code with only selected portions which run on a quantum computer

Most application logic either cannot be performed on a quantum computer at all, or wouldn’t achieve any meaningful quantum advantage over performing it on a classical computer. Only selected portions of the quantum application would be coded as quantum algorithms and executed on a quantum computer.

No quantum operating system

As previously mentioned, a quantum computer is not a full-function computer as a classical computer is. Rather, it acts as a coprocessor, similar to how a graphics processing unit (GPU) operates. As such, there is no quantum operating system per se.

A quantum computer system does indeed have an operating system, but it is the classical operating system running on the classical computer system which controls the quantum processing unit (QPU). This could be any operating system, but is more likely to be a variant of Unix, such as Linux. In any case, the actual operating system is general purpose and generic, nothing specific to quantum computing per se.

A quantum application running on a classical computer sends a request to execute a quantum algorithm (quantum circuit.) The request is received by the system management software running on the classical computer which controls the overall quantum computer system and is listening for requests coming in on its network connection. Classical software then processes the request and processes the quantum circuit one quantum logic gate at a time, sending electronic signals to the quantum processing unit (QPU) to execute the gates on the qubits. The classical software sends signals to read the results of the quantum computation from the qubits at the end of the quantum circuit and ships the results back to the quantum application that originally requested the quantum algorithm to be run.
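That flow might be caricatured like this — every name here is a hypothetical illustration, not a real vendor API:

```python
# Caricature of the classical control flow described above: receive a
# circuit, apply it one gate at a time, then read out and return results.
def send_signal_for_gate(gate, state):
    # Stand-in for generating the electronic control signals for one gate;
    # here we merely record that the gate was applied.
    state.append(gate)
    return state

def read_out_qubits(state):
    # Stand-in for measuring the qubits at the end of the circuit.
    return {"gates_applied": len(state), "result": "010"}

def execute_request(circuit):
    state = []
    for gate in circuit:           # one quantum logic gate at a time
        state = send_signal_for_gate(gate, state)
    return read_out_qubits(state)  # shipped back to the application

print(execute_request(["H 0", "CNOT 0 1", "MEASURE"]))
```

Note that nothing here resembles an operating system — it is much closer in spirit to a device driver.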

So, there is software involved in executing quantum algorithms, but it’s more analogous to the driver for a GPU than a full-blown operating system.

Nothing in the operating system on a quantum computer system knows anything about quantum computing, quantum computers, or quantum computation.

Noise, errors, error mitigation, error correction, logical qubits, and fault tolerant quantum computing

There can be many sources of noise in a quantum computer. Classical computers can easily be designed so that most noise has no detectable effect on computation, but that’s not so easy to do with a quantum computer, which depends on just the kinds of quantum effects that are virtually indistinguishable from noise. Eventually we will get to true fault tolerance, but not soon. Noise is just a fact of life for quantum computing, for the foreseeable future.

Noise tends to cause errors.

Sometimes those errors can be mitigated and sometimes even corrected automatically.

But sometimes they just have to be tolerated.

Or sometimes you can simply run the same code multiple times and see which result is produced more frequently, and presume that this is likely the more correct result.

Research is ongoing for true quantum error correction (QEC) which will transparently and automatically detect and correct most errors. This will provide algorithm designers with perfect logical qubits. That will be great, but it will come at a significant cost — it may require substantially more hardware of substantially greater complexity. This will usher in the era of fault-tolerant quantum computing (FTQC), but no time soon.

Another shorter-term possibility is near-perfect qubits, which fall short of perfect logical qubits, but are close enough for most quantum algorithms and quantum applications.

This paper won’t go into the details here at this level, but for more detail see my paper:

Perfect logical qubits

Perfect logical qubits are the holy grail of quantum computing. Regular qubits are noisy and error-prone, but the theory — the generally-accepted belief — is that quantum error correction (QEC) will fully overcome that limitation, ushering in the era of fault-tolerant quantum computing with its perfect logical qubits.

Whether that presumption does indeed transpire remains to be seen. And it’s not likely to happen in the next few years.

Still, it’s a powerful metaphor that keeps the dream of quantum computing alive.

Even if the dream is not fulfilled, or at least not fulfilled in the next few years, near-perfect qubits should be good enough for most quantum algorithms and quantum applications.

NISQ — Noisy Intermediate-Scale Quantum computers

This informal paper won’t go into the details here beyond what was just described in the preceding section, but this whole era of quantum computers which are plagued by noise is referred to as noisy intermediate-scale quantum devices (NISQ). This is the reality of quantum computing, for the foreseeable future.

The “IS” of NISQ stands for intermediate-scale, which technically means 50 to hundreds of qubits. Since most current quantum computers have fewer than 50 qubits, they technically aren’t intermediate-scale, but people commonly refer to all current quantum computers as being NISQ anyway.

NISQ computers are sometimes referred to as NISQ devices.

For more on NISQ, see Prof. John Preskill’s paper which introduced the term:

Near-perfect qubits as a stepping stone to fault-tolerant quantum computing

Fault-tolerant quantum computing is the ultimate solution to the noise problems which plague current and near-term quantum computers — so-called NISQ devices — but it’s too far over the distant horizon to do anybody much good any time soon. Near-perfect qubits are an intermediate stepping stone, dramatically better than current NISQ devices even if still well short of true fault-tolerant quantum computing.

The essential point is that even if near-perfect qubits aren’t as good as the perfect logical qubits of fault-tolerant quantum computing which are enabled by full quantum error correction, they are good enough for most quantum algorithms and most quantum applications.

This paper won’t go into the details here at this level, but for more detail see my paper:

Quantum error correction (QEC) remains a distant promise, but not critical if we have near-perfect qubits

We’re not likely to see full, transparent, and automatic quantum error correction (QEC) within the next few years. Maybe five years, or even longer. But it’s not critical as long as we achieve near-perfect qubits within a year or two.

Circuit repetitions as a poor man’s approximation of quantum error correction

While we’re waiting for full, automatic, and transparent quantum error correction in the coming years — and near-perfect qubits as well — we have another approach to hold us over, namely, using circuit repetitions (shots) to approximate full quantum error correction. In truth, both approaches are accomplishing roughly the same task.

By executing the same quantum circuit a bunch of times — dozens, hundreds, or even thousands of times, and then examining the statistical distribution of the results, it is generally possible to determine which of the various results is the more likely result — which result occurs more frequently. It’s not guaranteed to achieve a perfect result, but it’s certainly better than nothing. A lot better.
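A minimal sketch of this poor man’s approach, assuming a hypothetical noisy circuit whose ideal result is known in advance so we can see the technique recover it:

```python
import random
from collections import Counter

def noisy_shot(ideal="1101", error_rate=0.1):
    # Hypothetical single execution: each bit of the ideal result may be
    # flipped by noise with the given probability.
    return "".join(
        b if random.random() > error_rate else str(1 - int(b))
        for b in ideal
    )

def repeated_shots(shots=500):
    # Run the circuit many times and keep the most frequent result — a
    # poor man's stand-in for real quantum error correction.
    counts = Counter(noisy_shot() for _ in range(shots))
    return counts.most_common(1)[0][0]

random.seed(7)
print(repeated_shots())  # very likely recovers "1101" despite the noise
```

With a 10% per-bit error rate, no single shot is trustworthy, but the correct result still dominates the distribution across hundreds of shots.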

For more details, see my paper:

Beyond NISQ — not so noisy or not intermediate-scale

The term NISQ (Noisy Intermediate-Scale Quantum device or computer) is technically inaccurate for many current (and future) quantum computers. I’ve proposed some alternative terms to supplement NISQ.

The “IS” of NISQ stands for intermediate-scale, which technically means 50 to hundreds of qubits (as per Preskill’s original paper.) Since most current quantum computers have fewer than 50 qubits, they technically aren’t intermediate-scale. So, I propose the term NSSQ for noisy small-scale quantum devices.

When we get to fault-tolerant quantum computing with full quantum error correction, then I propose we replace the “N” of NISQ with “FT”. So, NISQ would become FTISQ. But, given that a large number of physical qubits are needed to construct even a single logical qubit, I expect that the initial fault-tolerant quantum computers will have fewer than 50 logical qubits, so they would be FTSSQ — fault-tolerant small-scale quantum devices.

Hopefully in the not too distant future we will achieve near-perfect qubits, in which case the “N” of NISQ can be replaced with “NP” for near-perfect. So, we would then have NPSSQ and NPISQ devices, depending on whether they had fewer than 50 qubits or 50 or more qubits.

For details, see my paper:

When will the NISQ era end and when will the post-NISQ era begin?

Technically post-NISQ would imply both beyond noisy qubits and beyond an intermediate-scale count of qubits (beyond a few hundred). But, my opinion is that getting past noisy qubits is the most urgent priority. If we still don’t have more than a few hundred near-perfect qubits, I think that will be fine.

And when we get to full quantum error correction and perfect logical qubits, the logical qubit counts will be much lower, so that even 48 perfect logical qubits might still be considered good enough to no longer be considered NISQ. Besides, how could any number of non-noisy qubits still be considered NISQ?! That wouldn’t make any sense. With the terminology recommended in the preceding section, NISQ would become FTISQ for 50 to hundreds of perfect logical qubits or FTSSQ for fewer than 50 logical qubits.

In any case, when might any of this happen? There’s some chance that we might see near-perfect qubits in a year or two. Perfect logical qubits are years away.

We could easily see hundreds of near-perfect qubits in two to three years. Or four years, to be safe.

Three stages of adoption for quantum computing — The ENIAC Moment, Configurable packaged quantum solutions, and The FORTRAN Moment

Quantum computing has progressed dramatically in the past few years, but still has a very long way to go before we achieve practical quantum computing and production deployment of production-scale practical real-world quantum applications.

As quantum computing does begin to approach practical quantum computing, I see three stages for adoption:

  1. A few hand-crafted applications (The ENIAC Moment). Limited to super-elite technical teams.
  2. A few configurable packaged quantum solutions. Focus super-elite technical teams on generalized, flexible, configurable applications which can then be configured and deployed by non-elite technical teams. Each such solution can be acquired and deployed by a fairly wide audience of users and organizations without any quantum expertise required.
  3. Higher-level programming model (The FORTRAN Moment). Which can be used by more normal, average, non-elite technical teams to develop custom quantum applications. Also predicated on perfect logical qubits based on full, automatic, and transparent quantum error correction (QEC). It is finally easy for most organizations to develop their own quantum applications.

Each stage would open up a significantly wider market for adoption and deployment of quantum computing.

  1. In the initial stage, only a relatively few, elite organizations will be able to utilize quantum computing to any great effect. The elite, the lunatic fringe, will dominate production deployment of production-scale practical real-world quantum applications.
  2. In the second stage there will be relatively few configurable packaged quantum solutions, but a significant number of organizations will be able to deploy these few applications fairly widely. Few organizations will be able to develop their own applications.
  3. Not until the third stage will development of quantum computing applications be possible by the average organization, without the need for elite technical teams or the lunatic fringe. Configurable packaged quantum solutions will remain common and popular, but because they deliver substantial business value at an economical price, rather than simply because organizations had no other choice.

For more detail on these three stages, see my paper:

Configurable packaged quantum solutions are the greatest opportunity for widespread adoption of quantum computing

Designing quantum algorithms and developing quantum applications to exploit the capabilities of a quantum computer is a very challenging task and will remain beyond the reach of most organizations well after quantum applications become practical from a hardware perspective.

One promising approach is configurable packaged quantum solutions, which combine prewritten quantum algorithms and quantum application code with the ability to dynamically customize both using high-level configuration features rather than needing to dive deep into actual quantum algorithms or quantum application code.

This will likely be the main method by which most organizations exploit quantum computers. A relatively small number of these solutions would be developed by elite teams, but then many organizations could readily deploy them for immediate quantum advantage and quickly return significant business value.

For more details, see my paper:

The FORTRAN Moment — It is finally easy for most organizations to develop their own quantum applications

The FORTRAN Moment will mark the milestone of when it is finally easy for most organizations to develop their own quantum applications.

This will be the grand confluence of:

  1. A practical quantum computer. At least 48 fully-connected near-perfect qubits.
  2. A truly high-level programming model.
  3. A quantum-native high-level programming language.
  4. A rich collection of high-level algorithmic building blocks.
  5. A plethora of high-level design patterns.
  6. Numerous rich application frameworks.
  7. Many examples of working, deployable, and proven production-scale quantum algorithms and applications addressing practical real-world problems.

Once all of these capabilities come together, there will no longer be any notable impediments preventing most organizations from developing their own quantum applications.

For more details, see my paper:

Quantum networking and quantum Internet — research topics, not a near-term reality

Quantum communication uses quantum effects to securely transfer classical information. It’s not directly related to quantum computing per se. The endpoints would be classical computers, not quantum computers.

Quantum networking permits two or more quantum computers to share quantum information (quantum state). A larger quantum computation could be partitioned onto multiple quantum computers in a quantum network. Unlike quantum communication, which merely uses quantum effects to securely transmit classical information, a quantum network directly shares full quantum state.

Quantum networking is mostly at the conceptual stage, and definitely still at the research stage. There have been some prototype networks in the lab — and between labs, but it remains a pure research topic at this stage. Don’t let anybody fool you into believing that quantum networking is ready for commercialization.

In many cases, someone may say quantum networking but really they are only referring to quantum communication — secure communication links that use quantum effects to protect classical data, but not for connecting quantum computers to allow them to share quantum state for quantum computations.

Quantum Internet is an ill-defined concept that is also in the earliest stages of research and far from being realized except in esoteric lab experiments. People toss the term around to mean a variety of things, sometimes just a spruced-up version of quantum communication.

Quantum networking will become of interest in a few more years, but it may not become a practical reality for production-scale practical real-world quantum applications for five to seven years. People have enough trouble utilizing two dozen qubits on a single quantum computer, let alone coordinating multiple quantum computers.

Whether quantum Internet will be realized in five years (or longer) remains to be seen.

Distributed quantum computing — longer-term research topic

As just mentioned, quantum networking is a longer-term research project rather than a current reality. Similarly, distributed quantum computing is a longer-term research topic rather than a current reality.

But distributed quantum computing should not be confused with distributed quantum applications, since quantum applications are mostly classical code which can readily be distributed among any number of distributed servers on a network.

Distributed quantum applications

Since quantum applications are mostly classical code which can readily be distributed among any number of distributed servers on a network, it is very conceivable to develop and deploy distributed quantum applications.

Whether a distributed quantum application is accessing a single quantum computer or multiple quantum computers is up to the application. The application itself is still distributed either way.

But regardless of how many quantum computers might be used by a distributed quantum application, the quantum algorithms running on each of those quantum computers are completely independent — there is no quantum networking among those quantum algorithms.

Eventually, distributed quantum algorithms will be possible, but that’s a longer-term research topic, not a near-term reality.

Distributed quantum algorithms — longer-term research topic

Just to reiterate, eventually, distributed quantum algorithms will be possible, with interactions between the quantum algorithms running on distributed quantum computers, but that’s a longer-term research topic, not a near-term reality.

Quantum application approaches

Not all quantum applications are structured the same way. There are three distinct approaches to structuring quantum applications:

  1. Full application. Alternative to classical application. May be radical changes to the application code. May be a full redesign and a full reimplementation.
  2. Quantum replacement modules. Maintain bulk of original classical application. Replace only a few classical modules with replacement quantum modules which invoke quantum algorithms rather than classical implementations.
  3. Quantum network services. Maintain the bulk of original classical application, but use quantum replacement modules which merely invoke network services to perform critical functions, and the network services handle all of the quantum-related processing. Could even have a switch to permit the network services to select between classical and quantum implementations.
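To make the switch mentioned in approach 3 concrete, here is a minimal sketch in plain Python. All of the names are hypothetical, and the quantum path is just a stub standing in for a real network service call:

```python
# Hypothetical sketch of a quantum replacement module with a switch
# between classical and quantum implementations (approach 3 above).
# Function names are invented; the quantum path is a stub.

def solve_classically(data):
    # Placeholder for the original classical implementation.
    return sum(data)

def solve_on_quantum_backend(data):
    # Placeholder for a call to a quantum network service, which would
    # build a circuit, submit it, and statistically analyze the results.
    return sum(data)  # stand-in result

def solve(data, use_quantum=False):
    """Dispatch to the classical or quantum implementation."""
    if use_quantum:
        return solve_on_quantum_backend(data)
    return solve_classically(data)

print(solve([1, 2, 3]))                    # classical path: 6
print(solve([1, 2, 3], use_quantum=True))  # quantum path (stubbed): 6
```

The point is simply that the bulk of the classical application calls solve() and never needs to know which implementation actually ran.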

Quantum network services vs. quantum applications

A quantum network service is a network service which invokes quantum algorithms. It would be developed, deployed, and run as any other network service.

The application developer can decide whether to package an application which accesses quantum algorithms as a quantum application or a quantum network service.

It simply comes down to whether the application is designed for use by users (a quantum application) or to provide an application programming interface (API) for a network service (a quantum network service).

What could you do with 1,000 qubits?

This is an interesting question. Nobody really knows for sure what one could do with 1,000 qubits. It’s an open question.

Interesting applications exist for a couple of hundred qubits. I saw a reference for 110 and 125 qubits for quantum computational chemistry.

Quantum error correction (QEC) could use large numbers of physical qubits to implement a few perfect logical qubits. If it took 65 physical qubits to implement a single logical qubit, 1,000 physical qubits could implement 15 logical qubits.

People speculate that Shor’s factoring algorithm could use thousands of qubits, although I doubt the feasibility of the algorithm working for large numbers even if we had the qubits. See the section on that topic elsewhere in this paper — No, Shor’s factoring algorithm probably can’t crack a large encryption key.

Some oddball, contrived computer science experiments could conceivably use any number of qubits, but my interest is production-scale practical real-world quantum applications.

For some perspective on at least one of the issues that arises when trying to use a larger number of qubits — impact of fine granularity of phase on quantum Fourier transform (QFT), see my paper:

48 fully-connected near-perfect qubits may be the sweet spot for achieving a practical quantum computer

48 fully-connected near-perfect qubits would support a 20-qubit quantum Fourier transform (QFT), which could achieve a computational leverage of one million over a classical solution. This might be a reasonable configuration for a practical quantum computer.
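The arithmetic behind that claim is simple: an n-qubit quantum Fourier transform operates over two to the n states at once, so a 20-qubit QFT yields a computational leverage of two to the 20, roughly one million. A quick check in Python:

```python
# The paper's claim in numbers: a 20-qubit quantum Fourier transform
# explores 2**20 states at once, a computational leverage of roughly
# one million over a classical solution.
qft_qubits = 20
leverage = 2 ** qft_qubits
print(leverage)  # 1048576, roughly one million
```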

For details, see my paper:

48 fully-connected near-perfect qubits may be the minimum configuration for achieving a practical quantum computer

More than being the sweet spot, anything less than 48 fully-connected near-perfect qubits may not cut it for achieving a practical quantum computer. More qubits is easy, but sufficient qubit fidelity and sufficient qubit connectivity may be the critical gating factors.

48 fully-connected near-perfect qubits may be the limit for practical quantum computers

There are practical limits to real hardware, primarily at the analog interface level. So even though vendors have actually built quantum computers with 53, 65, 80, and 127 qubits, with more to come, the ability of algorithms to effectively utilize all of these qubits for production-scale practical real-world quantum applications has not yet materialized, and I surmise that it never will.

I suggest in another of my papers that 48 fully-connected near-perfect qubits may be the practical limit not for raw hardware, but for hardware that can meet the expectations for production-scale practical real-world quantum application algorithms using all of those qubits. See my paper:

And in yet another paper I highlight that analog hardware limitations on fine granularity of phase angles may limit the utility of higher qubit counts, so that 48 qubits may be the practical limit for a useful quantum computer:

The basic flow for quantum computing

In its simplest form, quantum computing requires these steps:

  1. Design quantum algorithms. All of the steps required to perform a function on a quantum computer.
  2. Design a quantum application. Mostly classical code. Invokes quantum algorithms.
  3. Run the quantum application. Mostly running classical code. Invokes quantum algorithms as needed.
  4. Create a quantum circuit. As needed, in the quantum application. From the quantum algorithm. A sequence or graph of quantum logic gates. Any input data must be encoded into the circuit itself.
  5. Select a quantum computer. Either real hardware or a simulator.
  6. Execute the quantum circuit. Will actually run the circuit a specified number of times (shots or circuit repetitions) since quantum computers are inherently probabilistic.
  7. Retrieve the results. Measurement of the qubits will collapse quantum states into classical binary bits. Each repetition will have its own results, which may vary due to the probabilistic nature of quantum computing.
  8. Statistically analyze the results to select the best result. In the quantum application.
  9. Post-process the results. In the quantum application. Perform any post-processing needed to convert the results into a form that can be used by the quantum application.
  10. Use the results in the quantum application. As if the result had been computed classically.
  11. Rinse and repeat until the quantum application completes.

Generally this sequence of steps (at least the first five steps) is performed using a quantum software development kit (SDK).
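As a rough illustration of steps 6 through 8, here is a toy sketch in plain Python. The fake backend below just draws noisy bitstrings at random; a real SDK would submit the circuit to hardware or a simulator, but the shot-counting and statistical analysis would look much the same:

```python
import random
from collections import Counter

# Toy mock of steps 6-8: "execute" a circuit for many shots, then
# statistically analyze the per-shot results. The fake backend below
# just draws noisy bitstrings; a real SDK would run the circuit on
# hardware or a simulator.

def run_shot():
    # Pretend the ideal answer is "101" but noise flips it sometimes.
    return "101" if random.random() < 0.8 else "010"

def execute(shots=1000):
    return Counter(run_shot() for _ in range(shots))

random.seed(42)  # deterministic, for illustration only
counts = execute(shots=1000)
best, n = counts.most_common(1)[0]
print(best)  # the statistically most frequent result: "101"
```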

Quantum programming

At present, there is no high-level programming model for quantum programming. A quantum program is simply a quantum circuit: a sequence or graph of quantum logic gates.

A quantum circuit is created as a specialized data structure which, when shipped off to a quantum computer, can be readily transformed into the sequence of operations needed to execute the circuit on the quantum computer itself. Each of the elements of this data structure is a representation of a quantum logic gate.

Quantum computers have no ability to perform I/O, so any input data must be encoded into the quantum circuit itself as additional quantum logic gates which initialize or operate on the quantum state of the qubits on the quantum computer.
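As a toy illustration (not any real SDK), input bits can be encoded by prepending X (bit-flip) gates on the qubits that should start as 1, with the circuit represented as a simple list of gate descriptions:

```python
# Toy illustration: encoding the classical input "101" into a circuit
# by prepending X (bit-flip) gates on the qubits that should start as 1.
# The "circuit" is just a list of gate records, loosely mimicking a
# real SDK's internal data structure.

def encode_input(bits):
    circuit = []
    for qubit, bit in enumerate(reversed(bits)):
        if bit == "1":
            circuit.append(("x", qubit))  # flip qubit from 0 to 1
    return circuit

circuit = encode_input("101")
circuit.append(("h", 0))  # the algorithm's own gates follow
print(circuit)  # [('x', 0), ('x', 2), ('h', 0)]
```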

Generally, a library is needed to easily construct and execute quantum circuits. These libraries are known as quantum software development kits (SDK).

Quantum software development kits (SDK)

A quantum software development kit (SDK) or quantum SDK is a library or collection of tools used to create and execute quantum circuits.

There are a variety of quantum software development kits available from various vendors.

Some are specific to the hardware or software of a particular vendor, but many of them can work with hardware from multiple vendors.

A representative sample of current quantum software development kits:

  1. Amazon Braket.
  2. Google Cirq.
  3. IBM Qiskit.
  4. Microsoft Azure Quantum Q# and the Quantum Development Kit.
  5. NVIDIA cuQuantum.
  6. Rigetti Forest/pyQuil.
  7. TKET by Quantinuum.
  8. PennyLane by Xanadu.
  9. IonQ. Rather than offering yet another quantum SDK, IonQ works with all of the most popular cloud providers, libraries, and tools.

Platforms

Platform is a rather generic term in computing in general. Typically, it has one of two distinct meanings:

  1. An API. An application or network service which offers an application programming interface (API) which can be invoked from applications to perform interesting high-level functions. Popular for network services.
  2. An integrated multi-function environment focused on some general class of activity. Typically an interactive, online application which allows a user to complete any of a collection of tasks and functions. Everything is integrated and designed to work well together. Much easier to use than separate tools.

The point is that if someone simply refers to something as a platform, you can’t know what they are really talking about without a little more context. Adding adjectives helps.

Quantum computing software development platforms

A quantum computing software development platform is generally an interactive online environment which provides an integrated collection of tools designed to make it easy to design and develop quantum algorithms and/or quantum applications.

Sometimes simple software development kits (SDK) for quantum computing qualify as quantum computing software development platforms, although generally, by default, an SDK is more of an API (application programming interface) than an interactive tool.

It’s a fielder’s choice whether you wish to consider each of the quantum software development kits (SDK) listed above as a quantum computing software development platform.

Some vendors will choose to refer to their tools as kits and others will choose to refer to their tools as platforms. But the general concept is that the various functions and features should be integrated to a greater degree, especially for platforms.

One example is Zapata’s Orquestra Studio:

Another example is Classiq’s Quantum Algorithm Design platform:

  • Introducing Quantum Algorithm Design
  • Quantum Algorithm Design (QAD) is the quantum version of computer-aided design (CAD). With QAD, quantum software engineers and scientists innovate and produce much faster than ever before.
  • The Classiq Quantum Algorithm Design platform is used by teams to design, analyze, and optimize quantum circuits. This software platform transforms high-level functional models into concrete quantum circuits optimized for the back-end of choice.
  • The Classiq platform asks designers to describe the circuit functionality by creating a high-level circuit model. The model is written using a Quantum Description Language (QDL) with a textual or a Python (SDK) interface.
  • The model is then ingested by the Classiq synthesis engine, which uses advanced constrained optimization solvers to choose the optimal circuit (or circuits) out of billions of possible options. The synthesis engine aims to find a circuit that matches a set of design constraints and rules that are also defined by the designer, as well as general rules embedded in the platform and can be overridden by the designer.
  • https://docs.classiq.io/latest/

Quantum workflow orchestration

Workflow refers to a sequence of tasks. Workflow orchestration refers to automation of the sequencing of those tasks. Quantum workflow refers to a sequence of tasks needing to be accomplished for quantum computing. Quantum workflow orchestration refers to automation of the sequencing of those quantum computing tasks.

Details of tasks are beyond the scope of this informal paper, but the point is that complicated workflows should be automated.
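As a minimal sketch of the idea, a quantum workflow can be modeled as a list of task functions run in sequence, each passing its output to the next. The task names here are hypothetical placeholders, and the execution step just returns canned results:

```python
# Minimal sketch of workflow orchestration: run a fixed sequence of
# tasks, passing each task's output to the next. Task names are
# hypothetical placeholders for real quantum-workflow steps.

def build_circuit(params):
    return {"circuit": f"circuit({params})"}

def execute_circuit(state):
    state["counts"] = {"00": 512, "11": 488}  # canned stand-in results
    return state

def analyze_results(state):
    state["best"] = max(state["counts"], key=state["counts"].get)
    return state

def orchestrate(tasks, initial):
    state = initial
    for task in tasks:
        state = task(state)  # automate the sequencing of tasks
    return state

result = orchestrate([execute_circuit, analyze_results],
                     build_circuit("theta=0.5"))
print(result["best"])  # "00"
```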

Programming languages for quantum applications

Virtually any programming language can be used to develop the classical code for a quantum application.

Python happens to be very popular.

Microsoft also offers Q#.

Programming languages for quantum algorithms

At present, there are no high-level programming languages for quantum algorithms.

All design of quantum algorithms is currently done at the gate level, in terms of individual quantum logic gates.

Quantum circuits can be generated using classical code, such as with Python, typically using a quantum SDK, but the individual gates must still be manually coded as API calls in the classical code of the application.
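Here is a toy sketch of what that looks like in practice: each gate is one API call which appends a gate record to the circuit's internal data structure. This mimics the general shape of real SDK APIs (e.g., Qiskit's qc.h(0) and qc.cx(0, 1)) without depending on any particular one:

```python
# Sketch of "manually coding gates as API calls": each method call
# appends one gate record to the circuit's internal data structure.
# ToyCircuit is invented for illustration, not a real SDK class.

class ToyCircuit:
    def __init__(self, num_qubits):
        self.num_qubits = num_qubits
        self.gates = []  # the circuit is just a list of gate records

    def h(self, qubit):
        self.gates.append(("h", qubit))

    def cx(self, control, target):
        self.gates.append(("cx", control, target))

qc = ToyCircuit(2)
qc.h(0)       # one API call per gate...
qc.cx(0, 1)   # ...even for a two-gate Bell-state circuit
print(qc.gates)  # [('h', 0), ('cx', 0, 1)]
```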

Quantum-native high-level programming language

I fully expect that eventually we will see a quantum-native high-level programming language for quantum algorithms which makes it very easy to design quantum algorithms, but not soon. Not likely over the next couple of years. When it does happen, it would enable the milestone that I call The FORTRAN Moment — design of algorithms will be much simpler and easier, and not require elite technical staff.

It’s unlikely that we will see a quantum-native high-level programming language until we first see a high-level programming model.

Just to be clear, this is a language for quantum algorithms, not for coding of the classical code of quantum applications.

Python for quantum programming

There is nothing special about the Python programming language that is specific to quantum computing. It’s simply an easy-to-use programming language, and the developers of various software development kits (SDK) for quantum computing chose to make their SDKs available in Python.

Microsoft Q# is a standalone quantum-focused programming language with syntax reminiscent of their C# programming language, but it’s not clear whether the differences are all that beneficial over C# or Python.

Python and Jupyter Notebooks

It is common to experiment with and learn quantum programming using the Python programming language in Jupyter notebooks. For details, see:

Support software and tools

Much software is needed to support quantum computing.

Generally this support software is classical software.

This is distinct from quantum algorithms themselves.

There is not always a clear distinction between support software and tools.

Generally, a tool is software that a user chooses to use for a given task.

Generally, support software is software that is always used, with little if any choice left to the user.

Details of specific support software and tools are beyond the scope of this high-level paper.

Two categories of tools which have already been discussed in this paper:

  1. Quantum software development kits (SDK).
  2. Quantum computing software development platforms.

One special category of support software and tools is simulators for quantum computers. They are all classical software, focused on the simulation of the execution of quantum algorithms.
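To give a feel for what such a simulator does, here is a toy one-qubit statevector simulator in plain Python: it tracks the two amplitudes of a single qubit and applies gates as two-by-two matrices. Real simulators generalize this to two to the n amplitudes for n qubits, which is why simulation gets exponentially expensive:

```python
import math

# Toy one-qubit statevector simulator: classical software that tracks
# the two amplitudes of a single qubit and applies gates as 2x2
# matrices. Real simulators generalize this to 2**n amplitudes and
# complex numbers; real amplitudes suffice for this illustration.

def apply_gate(gate, state):
    (a, b), (c, d) = gate
    s0, s1 = state
    return (a * s0 + b * s1, c * s0 + d * s1)

H = ((1 / math.sqrt(2), 1 / math.sqrt(2)),
     (1 / math.sqrt(2), -1 / math.sqrt(2)))  # Hadamard gate
X = ((0, 1), (1, 0))                          # bit-flip gate

state = (1.0, 0.0)            # qubit starts in state 0
state = apply_gate(H, state)  # put it into equal superposition
probs = (state[0] ** 2, state[1] ** 2)
print(probs)  # approximately (0.5, 0.5): equal chance of 0 or 1
```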

Audiences for support software and tools

Not everybody needs all of the support software and all of the tools. Each software package and tool will have its own audience or target users. The common audiences are:

  1. Hardware vendors. Software needed to design, build, and support quantum computer hardware systems.
  2. Software vendors. Software needed to facilitate the development and support of software products, such as tools and other support software.
  3. Quantum algorithm designers. Software needed to facilitate the design of quantum algorithms.
  4. Quantum application developers. Software needed to facilitate the development of quantum applications.
  5. IT staff. Software needed to facilitate the testing, deployment, monitoring, troubleshooting, and maintenance of quantum computing hardware and quantum applications in production use.

Tools

A wide variety of tools are needed for quantum computing. These are software tools.

Developer tools are the most common tools. Used by quantum algorithm designers and quantum application developers.

Details of specific tools are beyond the scope of this high-level paper.

Developer tools

A full spectrum of developer tools is needed, from support for design of algorithms to development of quantum applications.

Details of specific developer tools are beyond the scope of this high-level paper.

Two categories of developer tools which have already been discussed in this paper:

  1. Quantum software development kits (SDK).
  2. Quantum computing software development platforms.

An example of a developer tool:

  1. Compilers, transpilers, and quantum circuit optimization.

Audiences for developer tools

Not all quantum developers have the same needs. There are several distinct audiences for quantum developer tools:

  1. Quantum algorithm designers. Pure quantum. No need for classical code, except for classical code which generates quantum circuits.
  2. Quantum application developers. Classical, using quantum algorithms.
  3. Service providers. Networked services are somewhat distinct from classical applications.

Compilers, transpilers, and quantum circuit optimization

Classical software uses compilers to transform high-level language source code into executable machine code. At present, there are no high-level languages for quantum algorithms. This may change in the coming years, but is not presently on any near-term horizon.

In quantum computing, as presently envisioned and currently implemented, the concept of a compiler or a transpiler is a software tool which automates tedious aspects of transforming quantum algorithms to efficient quantum circuits. Quantum circuit optimization is often part of the process.

As an example, IBM Qiskit has a Transpiler:

  • Transpiler
  • Transpilation is the process of rewriting a given input circuit to match the topology of a specific quantum device, and/or to optimize the circuit for execution on present day noisy quantum systems.
  • Most circuits must undergo a series of transformations that make them compatible with a given target device, and optimize them to reduce the effects of noise on the resulting outcomes. Rewriting quantum circuits to match hardware constraints and optimizing for performance can be far from trivial. The flow of logic in the rewriting tool chain need not be linear, and can often have iterative sub-loops, conditional branches, and other complex behaviors.
  • https://qiskit.org/documentation/apidoc/transpiler.html
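As a toy example of one kind of quantum circuit optimization, here is a pass which removes adjacent pairs of the same self-inverse gate, since a gate followed immediately by itself is the identity. A real transpiler such as Qiskit's does far more (routing to device topology, basis translation, noise-aware optimization):

```python
# Toy circuit-optimization pass in the spirit of a transpiler:
# adjacent pairs of the same self-inverse gate (H, X, CX) cancel
# to the identity, so they can be removed. Gates are simple tuples,
# as in the earlier toy examples; this is not a real SDK pass.

SELF_INVERSE = {"h", "x", "cx"}

def cancel_adjacent_pairs(gates):
    out = []
    for gate in gates:
        if out and out[-1] == gate and gate[0] in SELF_INVERSE:
            out.pop()  # a gate followed by itself is the identity
        else:
            out.append(gate)
    return out

circuit = [("h", 0), ("x", 1), ("x", 1), ("cx", 0, 1)]
print(cancel_adjacent_pairs(circuit))  # [('h', 0), ('cx', 0, 1)]
```

Note that because cancellations can expose new adjacent pairs, the pop-based loop handles cascades such as X H H X collapsing all the way to an empty circuit.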

Interactive and online tools

Many tools for quantum computing are file-oriented, transforming information from one format to another, but some tools are interactive or online, allowing the user to interact directly with the tool and see an immediate response.

For example, from IBM:

And from Zapata:

Another example is that it is common to experiment with and learn quantum programming using the Python programming language in Jupyter notebooks. For details, see:

Beware of tools to mask severe underlying technical deficiencies or difficulty of use

Tools are great, but sometimes they are an extra burden and don’t actually fully solve the problems they purportedly address.

Tools should make life easier for developers, users, and technical staff in general.

But sometimes tools can take on a life of their own and actually make life more difficult for developers. If a tool only solves part of a problem, it adds complexity and maybe overall effort required to complete tasks, even if it does automate some tedious aspects of tasks.

It is a disappointment if the only reason for a tool is to mitigate some underlying technical deficiency. This happens a lot, but it’s not the most desirable use of tools. The underlying technical deficiencies should be corrected rather than adding more overall system and usage complexity.

If the underlying system is difficult to use, fix it — make it easier to use, for everyone, without adding on the need for yet another tool to attempt to compensate for the difficulty of use.

The overarching goal of tools is to make life simpler and easier. Adding complexity or difficulty is counter to this goal.

And every time you add a tool you have to fully document it, promote it, and train people to use it. Better to eliminate the need for the tool and eliminate all of that other extra effort.

Granted, a lot of the need for tools may simply be to compensate for deficiencies in quantum hardware, which the tool developers can’t fix, but that shouldn’t excuse hardware vendors from their obligation to correct the deficiencies in their hardware designs.

Agile vs. structured development methodology

Of course it’s up to the quantum algorithm designer or the quantum application developer whether they wish to use an agile methodology or a more traditional structured methodology. Both have their places.

Agile is more appropriate for the rapid and radical changes during the pre-commercialization stage, where the focus is on:

  1. Research.
  2. Prototyping.
  3. Experimentation.

With a rapidly changing environment requiring constant adaptation.

But once (eventually, we all hope) quantum computing and quantum applications development mature, and the focus is on the commercialization stage, algorithm design and application development will likely be much more methodical and structured and less likely to be amenable to agile methods.

That said, any time there are quantum algorithms or quantum applications which are breaking new ground and can’t be specified in detail in advance, agile methods will retain the appeal of their flexibility.

Need for an Association for Quantum Computing Machinery — dedicated to the advancement of practical quantum computing

Quantum computing is still a nascent sector of computing. A lot has been accomplished, but much remains to be done. Part of what is needed is a formal association to advance the sector. I call it an Association for Quantum Computing Machinery, to parallel the Association for Computing Machinery that exists for classical computing.

The core essential purpose of an association for quantum computing machinery would be twofold, to be detailed shortly:

  1. To advance the science, technology, and applications of quantum computing. Including research and the development and deployment of quantum applications.
  2. To advance quantum computing as a profession. Including education and training, professional development, and networking of professionals and students. Development and promotion of standards, benchmarking, and codes of ethics and professional conduct.

The primary goal of the association would be to achieve the ultimate goal of quantum computing:

  • Practical quantum computers capable of addressing production-scale practical real-world problems and professionals capable of exploiting them.

Or stated more explicitly:

  • An association dedicated to producing practical quantum computers capable of addressing production-scale practical real-world problems and professionals capable of exploiting them.

Or stated more succinctly:

  • An association dedicated to practical quantum computing.

Such an association would be a nonprofit organization, not a commercial venture. Its purpose is to serve its members, professionals, not investors or shareholders.

Although there are a wide range of areas covered by quantum computing, to be detailed shortly, the most essential technical areas of interest are:

  1. Quantum information theory. Information at the quantum level. From basic concepts to advanced theory.
  2. Quantum computer engineering. The hardware, particularly the programming model, architecture, and qubit technology and qubit control. Including fault-tolerant quantum computing — full, automatic, and transparent error detection and correction.
  3. Quantum computer science. Quantum algorithms operating on quantum information.
  4. Quantum software engineering. Design, development, deployment, and operation of quantum applications which utilize quantum algorithms.
  5. Quantum algorithms and applications. Application domain-specific quantum algorithms and software.
  6. Quantum infrastructure and support software. Including software tools.
  7. Quantum application engineering. Applications and their deployment are too ad hoc. An engineering approach is needed for the analysis, specification, design, implementation, testing, configuration, deployment, maintenance, and enhancement of quantum applications.

Activities of the association would include:

  1. Support for research. Administrative and institutional support for research. Separate from the specific technical content and funding of the research.
  2. Quantum computing education and training. Both academic and commercial. Seminars, workshops, and conferences as well. Professional growth. Life-long learning. Career development.
  3. Quantum certification. Play a role in the development and promotion of certification programs for all aspects of quantum computing skills. Credentials which bear witness to the skills of a professional.
  4. Quantum computing standards. To produce and promote the use of formal (or even informal) standards in the quantum computing community and ecosystem. Most importantly, to take an active role in keeping attention focused on standards.
  5. Quantum publications. Books and journals. Print, electronic, and online. Email newsletters. Emphasis on research, products, and practice.
  6. Quantum computing community. Conferences. In-person and online networking and support forums. Hackathons. Local and student chapters. Employment and academic opportunities. Funding opportunities — academic and commercial, private sector, and government. Emphasis on research, products, and practice. Part of the larger quantum computing ecosystem, which includes vendors, customers, users, and investors and venture capital.
  7. Quantum computing ecosystem. The quantum computing community plus vendors, customers, users, and investors and venture capital.
  8. Students. Outreach. Community. Education. Internship opportunities. Mentoring. Research opportunities. Recognition and awards. Job placement in industry, government, and academia.
  9. Recognition and awards. Acknowledge and reward notable technical and professional contributions to the field.
  10. Code of ethics and professional conduct.

There are existing organizations for computing which can cover quantum computing to a limited extent:

  1. Association for Computing Machinery. Primarily focused on software.
  2. IEEE. Primarily focused on hardware.
  3. IEEE Computer Society. Mix of hardware and software.

But overall, quantum computing deserves an organization which is dedicated with a laser focus to quantum computing and won’t be distracted by classical computing with all of its baggage.

Since the essential purpose of an association for quantum computing machinery is to serve its members, it makes the most sense for it to be a nonprofit organization, not a commercial venture. Its purpose is to serve its members, professionals and students, not investors or shareholders.

The association would be local, regional, national, international, and both offline and online in scope. Overall, the association can best be described as international or global.

There would be a roughly 50/50 split between theory and practice. Research and practical applications are of equal value and equal interest.

Students would warrant special attention since they are in fact the future of quantum computing.

The proposed Association for Quantum Computing Machinery is a QSTEM research, practice, and educational organization. Research, practice, and students are all of equal value and equal interest. I suggest the term QSTEM as an expansion of the traditional concept of STEM to broaden it to emphasize the role of quantum effects.

The role of government is vitally significant — not just academia and industry. Government funds much computing research and uses much computing technology — classical and quantum.

When should such an association be brought into existence? It could happen at any time, but there are any number of reasons to delay its formation for a more opportune time. The Association for Computing Machinery didn’t come into existence until 1947, shortly after the pioneering ENIAC computer entered service. It might be wise to wait for a similar moment for quantum computing. Or maybe wait until quantum computing has a more advanced programming model which is more usable by less-elite professionals. Ultimately, the association will happen when a motivated team of founding fathers (and mothers!) get together and put in the effort to make it happen — in a sustainable manner.

Some sponsors may be needed to get the association off the ground financially initially, but primarily it would be funded by memberships, although ongoing sponsorships may be appropriate, provided that there are no strings attached and that sponsors get no say in the operation or administration of the association.

How can someone get involved? Just do it! Do your own thing! Do something, anything, and see what develops!

For now, quantum computing remains a mere laboratory curiosity.

For now, quantum computing is still more appropriate for the lunatic fringe rather than mainstream application developers.

For now, quantum computing remains in the pre-commercialization stage, not ready for commercialization yet and at risk of premature commercialization. Much more research, prototyping, and experimentation are needed before commercial products and production-scale applications can even be conceptualized with enough detail and accuracy to avoid premature commercialization.

But we’re still in the early days, when an association dedicated to quantum computing could well make the difference in getting past both the laboratory curiosity stage and the lunatic fringe stage.

For more detail, see my paper:

Need to advance quantum information theory

Information at the quantum level. From basic concepts to advanced theory.

A new field.

Need to advance quantum computer engineering

The hardware, particularly the programming model, architecture, and qubit technology and qubit control. Including fault-tolerant quantum computing — full, automatic, and transparent error detection and correction.

A new field.

Need to advance quantum computer science

Quantum algorithms operating on quantum information.

A new field.

Need to advance quantum software engineering

Design, development, deployment, and operation of quantum applications which utilize quantum algorithms.

A new field.

Need to advance quantum algorithms and applications

Application domain-specific quantum algorithms and software.

We’re still in the pre-commercialization stage. Much more research is needed. And prototyping. And experimentation.

Eventually we can move on to commercialization, but we’re not there yet.

Need to advance quantum infrastructure and support software

Including software tools.

Need to advance quantum application engineering

Applications and their deployment are too ad hoc. An engineering approach is needed for the analysis, specification, design, implementation, testing, configuration, deployment, maintenance, and enhancement of quantum applications.

Need for support for research

Administrative and institutional support for research. Separate from the specific technical content and funding of the research.

Need for quantum computing education and training

Both academic and commercial. Seminars, workshops, and conferences as well. Professional growth. Life-long learning. Career development.

Need for quantum certification

Development and promotion of certification programs for all aspects of quantum computing skills. Credentials which bear witness to the skills of a professional.

Need for quantum computing standards

To produce and promote the use of formal (or even informal) standards in the quantum computing community and ecosystem. Most importantly, to take an active role in keeping attention focused on standards.

Need for quantum publications

Books and journals. Print, electronic, and online. Email newsletters. Emphasis on research, products, and practice.

Need for quantum computing community

Conferences. In-person and online networking and support forums. Hackathons. Local and student chapters. Employment and academic opportunities. Funding opportunities — academic and commercial, private sector, and government. Emphasis on research, products, and practice. Part of the larger quantum computing ecosystem, which includes vendors, customers, users, and investors and venture capital.

Quantum computing community

The quantum computing community encompasses all aspects of individuals interacting as individuals and as groups. This includes:

  1. Conferences.
  2. In-person and online networking and support forums.
  3. Local and student chapters.
  4. Employment and academic opportunities.
  5. Funding opportunities — academic and commercial, private sector and government.
  6. Emphasis on research, products, and practice.
  7. Part of the larger quantum computing ecosystem, which includes vendors, customers, users, and investors and venture capital.

Note: Quantum computing community is frequently abbreviated as quantum community.

Need for quantum computing ecosystem

The quantum computing community plus vendors, customers, users, and investors and venture capital.

Quantum computing ecosystem

The quantum computing ecosystem is the whole enchilada of quantum computing. It includes:

  1. The technology.
  2. The research.
  3. Academia.
  4. Community.
  5. Vendors.
  6. Customers.
  7. Users.
  8. Investors. Venture capital firms. Investors in startups.

The key emphasis of the quantum computing ecosystem is on complete and fully functional quantum computing systems and applications.

Note: Quantum computing ecosystem is frequently abbreviated as quantum ecosystem.

Distinction between the quantum computing community and the quantum computing ecosystem

As used in this paper, quantum computing community refers to the professionals responsible for designing and building quantum computing systems, at whose core are scientists and engineers. This notion of community is distinct from the operational and business aspects of quantum computing, which is nominally the world of vendors, customers, users, and investors — those individuals and groups working with complete and fully-functional quantum computing systems and applications, in contrast to the professionals who design and develop those systems and applications.

When in doubt, you can presume that someone is a member of the quantum computing community.

You only need to consider people as part of the quantum computing ecosystem rather than the quantum computing community when they are:

  1. A vendor. Selling and distributing complete quantum computing systems. Or producing components which other vendors are integrating to produce complete quantum computing systems.
  2. A customer. Buying, leasing, deploying, and operating complete quantum computing systems. Includes cloud access.
  3. A user. Using or operating quantum applications.
  4. An investor. Such as a venture capital firm.

Call for Intel to focus on components for others to easily build their own quantum computers

The quantum computing ecosystem would get a huge boost if Intel were to focus on producing components for others to easily build their own quantum computers. We would suddenly have a more diverse but compatible universe of hardware vendors.

For details, see my paper:

Intel could single-handedly do for the quantum computing ecosystem what IBM, Intel, and Microsoft did for the PC ecosystem

Just to emphasize the potential impact of my proposal from the preceding section.

Need to assist students

Outreach. Community. Education. Internship opportunities. Mentoring. Research opportunities. Recognition and awards. Job placement in industry, government, and academia.

Need for recognition and awards

Acknowledge and reward notable technical and professional contributions to the field.

Need for code of ethics and professional conduct

Need to be developed and promoted. And enforced.

Quantum computing as a profession

This is a new and open area — what does it mean to be a professional in quantum computing?

Is it any different than being a professional in the computing industry overall?

Is it any different than a physicist being a scientist?

Is it any different than a mathematician being… a mathematician?

Is it any different than a computer scientist being a… computer scientist?

Is it any different than a computer engineer being an… electrical engineer?

Is it any different than a software developer being a… software developer?

Is it any different than a quantum application developer being an… application developer?

What standards, ethics, and code of conduct should apply?

Is it just that the subject matter is somewhat different? It may be true that the common subject matter binds the disparate professionals together even if their specific professional disciplines differ.

Benchmarking

Need for measuring the capabilities of a particular quantum computer, and comparison with other quantum computers.

Details are beyond the scope of this informal paper.

Quantum Volume (QV) is a simple metric developed by IBM to summarize the overall capability and quality of a quantum computer. See the section elsewhere in this paper: Quantum Volume (QV) measures how many of the qubits you can use in a single quantum computation.
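As a rough illustration of how the metric works, here is a minimal Python sketch of deriving a QV number from test results, assuming IBM's published rule that a circuit size passes when its measured heavy-output probability exceeds 2/3. The test data and the function itself are hypothetical, not part of any real benchmarking harness:

```python
# Illustrative sketch: deriving a Quantum Volume (QV) number from
# heavy-output test results. Per IBM's definition, size n passes when
# random n-qubit, depth-n circuits yield heavy outputs with probability
# above 2/3; QV = 2^n for the largest passing n. Data here is made up.

def quantum_volume(heavy_output_probs):
    """heavy_output_probs maps circuit size n (width = depth = n)
    to the measured heavy-output probability at that size."""
    best = 0
    for n in sorted(heavy_output_probs):
        if heavy_output_probs[n] > 2 / 3:
            best = n
        else:
            break  # sizes must pass contiguously from the smallest up
    return 2 ** best
```

So a machine whose square circuits pass at sizes 1 through 3 but fail at 4 would report a Quantum Volume of 8.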

For more detail on Quantum Volume, see my paper:

For a summary of a variety of approaches to benchmarking of quantum computers, check out this paper from Boston Consulting Group (BCG), which covers both system and application benchmarks from IBM, Sandia National Laboratories, UC Berkeley / Berkeley Lab, Atos, QED-C, and Super.tech:

Education

Need for both academic and professional education in quantum computing.

Details are beyond the scope of this informal paper.

Training

Need to train employees and users in quantum computing.

Details are beyond the scope of this informal paper.

What is the best training for quantum computing?

Training is an important aspect of quantum computing, but beyond the scope of this informal paper.

There are so many training resources for quantum computing out there that it’s hard to say where to start or what’s best.

To a large extent it depends on your own background and what your objectives are.

There are plenty of paid training courses, but plenty of free online resources as well.

I’ll just mention a few professional resources here and you can use Google to find more:

  • QURECA
  • Now more than ever, we need a workforce that is ready for the quantum revolution. QURECA is dedicated to supporting the development of the quantum workforce. We are focused on educating the necessary skills and knowledge to individuals and businesses around the world.
  • https://qureca.com/

Another source is MIT’s xPRO program ($$$):

And

Another commercial offering is Black Opal from Q-CTRL ($):

Please note that these are merely examples of what is available for training for quantum computing rather than recommendations or endorsements per se.

Getting started with IBM Qiskit Textbook

I don’t want to get too involved with recommending how to dive deeper into quantum computing, but one free online resource is the IBM Qiskit Textbook:

  • Learn Quantum Computation using Qiskit
  • Greetings from the Qiskit Community team! This textbook is a university quantum algorithms/computation course supplement based on Qiskit to help learn:
  • The mathematics behind quantum algorithms
  • Details about today’s non-fault-tolerant quantum devices
  • Writing code in Qiskit to implement quantum algorithms on IBM’s cloud quantum systems
  • https://qiskit.org/textbook/preface.html
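To give a flavor of what such a course covers, here is a tiny state-vector simulation of the classic first circuit the textbook walks through (a Hadamard gate followed by a CNOT, producing a Bell state), written in plain Python rather than Qiskit so it runs with no installation:

```python
import math

# A minimal state-vector simulation of the textbook's first circuit:
# Hadamard on qubit 0, then CNOT (control qubit 0, target qubit 1),
# producing the Bell state (|00> + |11>)/sqrt(2). Plain Python only.

def bell_state():
    # Amplitudes over basis states, indexed by the bits (qubit0, qubit1):
    # index 0 = |00>, 1 = |01>, 2 = |10>, 3 = |11>. Start in |00>.
    s = [1.0, 0.0, 0.0, 0.0]
    # Hadamard on qubit 0 mixes each pair of states differing in that bit.
    h = 1 / math.sqrt(2)
    s = [h * (s[0] + s[2]), h * (s[1] + s[3]),
         h * (s[0] - s[2]), h * (s[1] - s[3])]
    # CNOT: when qubit 0 is 1, flip qubit 1 -- swap |10> and |11>.
    s[2], s[3] = s[3], s[2]
    return s

probs = [a * a for a in bell_state()]  # measurement probabilities
```

Measuring this state yields 00 or 11 with probability 1/2 each and never 01 or 10 — the entanglement signature that the textbook material builds on.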

Workforce development

Need to create, maintain, and enhance the potential for individuals to be productive employees in quantum computing.

Need to grow the talent pool for quantum computing.

Details are beyond the scope of this informal paper.

How to get started with quantum computing

There are any number of alternative approaches and paths to get started in quantum computing. It will depend on the nature of an organization and its objectives, as well as the skills and interests of its existing technical teams.

A little light training and study might be enough for some, while a deep-dive immersive approach might be ideal for others; still others might benefit more from hiring a consulting firm, or might opt to wait until commercial offerings of applications are available.

Some might opt to build a sizable in-house elite quantum technical team, possibly even hiring elite quantum PhDs, while others might choose to outsource all quantum technical efforts, or a hybrid of these two ends of the spectrum.

Some might simply opt to have selected members of current technical teams spend 20% of their time familiarizing themselves with quantum computing.

Overall, an organization’s approach to quantum computing should be consistent with how they approach any new technology. See the section Your organizational posture towards technological advances.

There are five key aspects of getting into quantum computing:

  1. Understanding the technology.
  2. Identifying use cases.
  3. Research, prototyping, and experimentation with the new technology for selected use cases.
  4. Deciding and initiating pilot projects.
  5. Deciding and initiating full-scale projects.

And of course workforce issues are a constraint for all of those aspects.

Whether an organization, team, or individual dives into the technology first or focuses on identifying use cases first and then dives into the technology to pursue prototypes for the use cases will again depend on the organization’s overall approach to new technologies, as discussed in that section just mentioned — Your organizational posture towards technological advances.

There is the notion of Quantum Ready — focusing on understanding the technology and what it can do. Again, it will depend on the particular organization how to approach and pursue quantum readiness. See the section Quantum Ready — on your own terms and on your own timeline.

Again, each organization is different, so these differences must be taken into account. See these sections:

  1. Your organizational posture towards technological advances.
  2. Timing is everything — when will it be time for your organization to dive deep on quantum computing?
  3. Quantum Ready — on your own terms and on your own timeline.
  4. Each organization must create its own quantum roadmap.

And if you’re reading this paper, you are already well on your way to getting started with quantum computing.

For some specific suggestions — besides finishing this paper, see these sections:

  1. What is the best training for quantum computing?
  2. Getting started with IBM Qiskit Textbook.

Investors and venture capital for quantum computing

In a real sense, money makes the world go round. Research needs to be funded. Education needs to be paid for. Products and services need to be purchased. Product development needs to be funded. Investment is critical for any technology.

Initial investment, especially in research, may be paid for by government, academia, or larger corporations.

Venture capital is a key source of investment to create and grow new technology companies, so-called startups. Venture capital firms are the source for venture capital investments.

As such, investors and venture capital firms are critical parts of the quantum computing ecosystem.

Whether investors and venture capital firms should also be considered part of the quantum computing community is an interesting question to debate. I tend to lean towards including them in the quantum computing ecosystem rather than the quantum computing community since they are focused more on business and finance than just the technology per se.

So, I can understand if some choose to consider investors and venture capital as part of the quantum computing community, even as I maintain my preference for treating them as part of the broader ecosystem rather than the technology-focused community.

Commitment to a service level agreement (SLA)

Production deployment requires a solid commitment for:

  1. Availability.
  2. Performance.
  3. Latency.
  4. Capacity.
  5. Support.
  6. Redundancy.
  7. Dedicated hardware.
  8. Etc.

Be sure to have contractual commitments to all of the above, which is nominally in the form of a service level agreement (SLA).

Be sure to read all of the fine print.
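As a trivial illustration of checking one such commitment, here is a sketch that tests measured downtime against a hypothetical 99.9% availability target. The target and the numbers are examples only, not figures from any real vendor's SLA:

```python
# Sketch: verifying one SLA commitment (availability) from measured
# downtime. The 99.9% ("three nines") target is a hypothetical example.

def meets_availability_sla(downtime_minutes, period_minutes, target=0.999):
    availability = 1.0 - downtime_minutes / period_minutes
    return availability >= target

# A 30-day month is 30 * 24 * 60 = 43200 minutes; three nines allows
# only about 43 minutes of downtime in that whole period.
```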

Diversity of sourcing

Don’t be reliant on a single provider for any service, equipment, software, or tool.

Companies can go out of business during a technological winter, or change their terms of service in an unacceptable manner at any time.

Dedicated access vs. shared access

At this stage of pre-commercialization people are not too worried about access to quantum computing resources. Most vendors are providing free access.

And most users only need occasional access for short periods of time.

But, eventually, when we get into the commercialization stage of adoption of quantum computing, when people begin worrying about production deployment, then people will begin worrying about possibly needing dedicated access to quantum computing hardware rather than occasional shared access for short periods of time.

Deployment and production

Once an application has been developed and tested, it’s ready to be deployed and put into production use. This paper won’t go into details for this process, but some of the aspects include:

  1. Negotiate service level agreements (SLA). Agree to all terms of service — availability, performance, capacity, redundancy, etc.
  2. Test deployment. The same as real deployment but for testing and evaluation purposes only.
  3. Pre-production testing. Testing as the software would be used in full production, but only using test deployment.
  4. Preserve current release. In case a need to revert to the previous release is required later.
  5. Deployment. Making the software available and ready to go live.
  6. Going live. Flip the switch. Now it’s live in production.
  7. Monitoring applications. Test to confirm that the software is live and performing as expected.
  8. Checkpointing. Periodically save all critical data so that it can be restored if lost or corrupted.
  9. Performance. Evaluate performance in production.
  10. Latency. Evaluate latency in production.
  11. Capacity. Evaluate capacity in production.
  12. Availability. Monitor and confirm availability in production.
  13. Confirm service level agreements (SLA). Confirm that all commitments are being met.
  14. Checkpoint restore. Restore all critical data from the most recent checkpoint if data has been lost or corrupted.
  15. Revert to previous release. If some serious issue does crop up and there is a need to drop back to the previous release.
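As one concrete illustration, items 8 and 14 (checkpointing and checkpoint restore) might look like the following sketch, which uses an atomic rename so that a crash mid-save never corrupts the last good checkpoint. The file format and data shape are hypothetical:

```python
import json
import os
import tempfile

# Sketch of checkpoint save/restore (deployment items 8 and 14 above).
# Writing to a temp file and renaming is atomic on POSIX filesystems,
# so a crash during save leaves the previous checkpoint intact.

def save_checkpoint(path, critical_data):
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=directory)
    with os.fdopen(fd, "w") as f:
        json.dump(critical_data, f)
    os.replace(tmp, path)  # atomic swap into place

def restore_checkpoint(path):
    with open(path) as f:
        return json.load(f)
```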

Shipment, release, and deployment criteria

It’s hard enough to get a handle on what an algorithm or application must do, but then there are all of the other factors that go into engineering an industrial-grade and production-scale product or service. This comes down to clearly defining and evaluating criteria for shipment, release, and deployment. The basic criteria are:

  1. Full function. All the major features, as well as all of the minor nuances and options.
  2. Performance. The whole point of choosing a quantum solution. Actually measure the quantum advantage.
  3. Capacity.
  4. Error conditions.
  5. QA. Full testing.
  6. Documentation.
  7. Training.
  8. Support.
  9. Improper use. Accidental misuse. Improper training. Malicious misuse.
  10. Hacking and cybersecurity.
  11. Privacy and data protection.

How much might a quantum computer system cost?

Generally this question is beyond the scope of this paper at this time, but it’s worth raising the question on principle. So, how much might a quantum computer system cost?

Some aspects of pricing:

  1. Retail price. Total cost to buy a single system.
  2. Volume pricing. Discount from retail price for multiple or many systems.
  3. Hardware component cost. Money to component suppliers and/or distributors. Cost before the components are assembled into a complete quantum computing system.
  4. Cost of dilution refrigerator and packaging. Beyond the cost of electronic components from Intel, what would a dilution refrigerator and its associated equipment cost, as well as the overall packaging for the quantum computing system?
  5. Cost to assemble and test. How much does it cost to assemble and test a full quantum computer system?

Here are some wild guesses for possible total quantum computing system costs:

  1. $100,000. Bare bones simple system. Possibly second-hand components.
  2. $250,000. A little better than a bare bones simple system. Possibly second-hand components.
  3. $500,000. Moderately better than a bare bones simple system. Possibly second-hand components.
  4. $1,000,000. Entry level system. More than bare bones.
  5. $2,500,000. Above entry level system.
  6. $5,000,000. Mid-range system.
  7. $7,500,000. Mid-range system. Larger configuration.
  8. $10,000,000. Super-mid-range system. More than mid-range.
  9. $15,000,000. Super-mid-range system. More than mid-range. Larger configuration.
  10. $20,000,000. Super-mid-range system. More than mid-range. Larger configuration.
  11. $25,000,000. High-end system. More than super-mid-range. Multiple processors.
  12. $50,000,000. Super-high-end system. Lots of processors.

Pricing for service — leasing and usage

Not everyone will wish to purchase a quantum computer system to own it all for themselves. Alternatives include:

  1. Limited or unlimited free access for light evaluation. Especially at this stage of pre-commercialization. No production usage.
  2. Leasing. Interesting questions about the useful service life of a computer system. How long before it becomes obsolete? How long before it is ready to be discarded in favor of newer systems? How long can a lease be before it is effectively ownership for the useful life of the system?
  3. Dedicated. May be similar to leasing, but focused on 100% availability and no term of lease per se.
  4. Metered usage. Pay as you go. Simply pay for whatever service you use on an hourly basis.
  5. Guaranteed availability. Beyond paying for actual usage, pay a premium to be guaranteed that the system will be available when needed. At the extreme, 24/7 availability. Or for x hours within a specified daily time interval — may not need real-time access, just guarantee that the system will be available somewhere between a specified start time of day and a specified end time of day.
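To make item 4 (metered usage) concrete, a billing calculation might look like the following sketch, with an optional premium for the guaranteed availability of item 5. The hourly rate and premium are invented numbers purely for illustration:

```python
# Hypothetical pay-as-you-go billing (item 4), with an optional premium
# for guaranteed availability (item 5). All rates are made-up examples.

def monthly_bill(hours_used, hourly_rate, guaranteed_hours=0, premium_per_hour=0.0):
    return hours_used * hourly_rate + guaranteed_hours * premium_per_hour

# e.g. 10 hours of metered use at $500/hour, plus a guaranteed two-hour
# daily window for 30 days at a $100/hour availability premium.
cost = monthly_bill(10, 500.0, guaranteed_hours=2 * 30, premium_per_hour=100.0)
```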

Pricing for software, tools, algorithms, applications, and services

Besides pricing for quantum computers themselves, there is also pricing for software, tools, developer tools, quantum algorithms, quantum applications, consulting, and other services. That’s all beyond the scope of this paper, other than to highlight the issue.

The preceding discussion of pricing for hardware service may apply here to some extent as well, particularly limited or unlimited free access for light evaluation at this stage of pre-commercialization, with no production usage.

Open source is essential

It is no exaggeration to suggest that open source is essential. It is absolutely vital.

Everything should be open source. This includes:

  1. All of the software for sure.
  2. All of the quantum algorithms.
  3. All of the quantum applications.
  4. All of the support software.
  5. All of the tools.
  6. All of the developer tools.
  7. All of the algorithmic building blocks.
  8. All of the design patterns.
  9. All of the application frameworks.
  10. All of the configurable packaged quantum solutions.
  11. Generally firmware as well.
  12. Any diagnostics, configuration tools, and support software.
  13. And even hardware designs when possible.

Open source facilitates customization and extension by customers and academic and government researchers.

Open source is the single best way to assure that quantum computing innovation occurs and that practical quantum computing comes to fruition.

Intellectual property (IP) — boon or bane?

Intellectual property (IP) such as patents can cut both ways. The prospect of proprietary advantage is a fantastic incentive. But open source can be a huge advantage as well.

If too much of the key technologies of quantum computing are locked up due to IP protections, innovation and adoption can be stifled or delayed.

Intellectual property pertains to both hardware and software, including algorithms.

Ultimately intellectual property comes down to licensing, or licensing fees. As long as intellectual property is licensed at reasonable fees, including free access for research, prototyping, experimentation, and low-volume commercial testing and pilot projects, everything should be fine.

And the ultimate cure for intellectual property licensing excesses is technical workarounds which skirt the issue.

The great risk is that some critical narrow patents or some overly broad patents might make technical workarounds more difficult, onerous, or near-impossible.

There’s no problem with IP for quantum computing at present, and no hint of an imminent problem, but it is a potential issue to keep an eye on.

Shared knowledge — opportunities and obstacles

Knowledge is essential and critical for any thriving technical sector. Quantum computing is no exception. There are plenty of opportunities for freely and openly shared knowledge, but also potential for proprietary hiding of knowledge, and fee-based or licensed knowledge as well. And also sensitive, secret, confidential, or classified knowledge which cannot be shared.

There are immense benefits to global access to freely and openly shared knowledge.

Generally it is best if all basic knowledge is freely and openly available.

Generally there is no inherent benefit to private knowledge.

Publication of academic papers is a prime source for sharing knowledge.

In addition to actual publications, posting of preprints of papers on arxiv.org is a great way to share knowledge.

And much knowledge is shared on private web sites, blogs, and a variety of online media, including Medium (where all of my own informal writing is posted — list of my informal papers on quantum computing.)

Open source projects are a great opportunity for generating and sharing knowledge. Including:

  1. Research.
  2. Algorithmic building blocks.
  3. Design patterns.
  4. Algorithms.
  5. Application frameworks.
  6. Applications.
  7. Examples of algorithms, applications, and data.
  8. Tests and test data.

Government can play a role in the sharing of knowledge as well. Government projects, government contracts, and government-funded projects can generate a wealth of knowledge. But sensitive, secretive, and classified government projects can hide knowledge as well.

Books are a mixed bag. They do tend to be fairly well curated and have high value, and at reasonably nominal cost. But sometimes they can be expensive and the information may not be readily found using standard Internet search engines. Some books are actually freely available online, but most are not.

Some advanced or specialized knowledge may be proprietary and fee-based. That could be great if you can afford it, but it’s a general disadvantage if too much critical knowledge is fee-based.

Some advanced or specialized knowledge may be licensed at a fairly expensive rate. In some cases the value of the knowledge could be worth the licensing fees, but everybody else is out of luck.

Finally, there is proprietary knowledge, including trade secrets, which is unavailable to anyone outside of the organization. There is no opportunity for (legally) sharing such knowledge.

There is also customer confidential knowledge which is created by a vendor working under contract for a customer, which belongs to the customer, so it won’t be generally usable by the vendor or anybody other than the customer. This is the customer’s right, and they may have legally valid and solid proprietary reasons for not sharing, but such knowledge has no value for the larger community and the quantum computing sector overall.

Secret projects — sometimes they can’t be avoided

Openness and sharing are highly desirable, but sometimes secrecy is required or at least mandated.

Secret projects happen. They’re not entirely avoidable.

But they’re not helpful to the health of the quantum computing sector.

Some examples:

  1. Stealth startups. Attempt to preclude competition. Or at least maintain an advantage over potential competitors.
  2. Trade secrets. Proprietary knowledge or technology. Generally to obtain or maintain a competitive advantage.
  3. Classified projects. Secretive government agencies. Such as the U.S. NSA, CIA, DOD, DHS, DOE, and national laboratories.

Most private sector secret projects end up eventually having a visible public effect.

Stealth startups eventually unveil themselves. Secrecy wasn’t a permanent state of affairs, just a temporary tactic.

Trade secrets may remain secret indefinitely, but they usually have some public, visible effect. A firm may hide the secret sauce behind a successful product, but competitors can study the public product and either reverse engineer or simply guess what the trade secret might be. They may not be able to exactly reproduce the trade secret, but they may be able to come close enough.

Secret government projects — they’re the worst

Unlike private sector secrecy, secret government projects are likely to remain secret indefinitely, at least until their service life has ended and it no longer matters. But it could be decades — unless there is an earlier leak.

We can all see what’s going on with publicized quantum computing efforts, but who knows what is going on in secretive government laboratories.

Well-funded classified government projects at the U.S. NSA, CIA, DOD, DHS, DOE, and national laboratories — or similar agencies in other countries — may have co-opted a number of the best researchers and engineers and may already have built, or be working on, quantum computing projects that far outstrip even the most promising publicly-known academic and private sector projects. It’s very possible. We just can’t tell.

If Shor’s factoring algorithm really could crack even strong public encryption keys, why wouldn’t NSA, CIA, et al already be using advanced quantum computers to be cracking encryption everywhere? And if they were able to do so, it would make sense that they would keep it a big secret — and probably seek to prevent others from gaining access to the same quantum computing technology.

That said, although we can’t tell or know, I’m not prepared to presume that such secret government projects really do exist at present.

And even if they did, their secret existence wouldn’t have any effect on the work that will be done in academia and the private sector.

Also, even if such secret government projects did exist, it’s quite likely that rumors and occasional details would leak out into public anyway, no matter how classified or secretive.

The bottom line is that secret government projects would be our worst nightmare in terms of secretive efforts in quantum computing.

Transparency is essential — secrecy sucks!

Tying together the preceding sections into the overarching concept of transparency:

  1. Open source. Maximize transparency.
  2. Intellectual property. Can minimize transparency. Or constrain it to some degree.
  3. Sharing of knowledge. Maximize transparency most of the time. But in some cases sharing of knowledge is constrained.
  4. Secret projects. Generally suck from a transparency perspective. But may be necessary or at least mandated.

A thriving, healthy, sustainable, and vibrant quantum computing sector requires a maximal degree of transparency.

Sure, there will be cases where total transparency is not possible, practical, easy, or as cheap as desired, but maximizing transparency is the goal.

An open source quantum computer would be of great value

Open source is great, especially for computer software and algorithms, but the same concept applies to hardware as well. An open source project for an entire quantum computer system would be quite useful.

This would enable a greater population of researchers to quickly start up new quantum computing research projects without any need to reinvent the basic wheel just to get started.

All of the science and engineering details would be fully worked out and posted online for free. Specifications, engineering drawings, parts lists. You name it.

A thriving online community would provide support and enable public contributions to the project.

All that would be needed would be to order the parts and put it together.

And maybe a startup could offer kits to do so even more easily.

And another startup to put the kit together for you.

Other startups might focus on the more difficult technical parts, such as chip fabrication.

There are endless possibilities. All build on a foundation of sharing knowledge.

Computational diversity

Computational diversity refers to the use of a variety of types of computing hardware, not only classical computers. Quantum computers are added to the mix. This includes:

  1. Classical digital processors. Both basic processors and high-end processors.
  2. Local computers. Laptops and desktops.
  3. Servers. Typically in a data center or cloud.
  4. Multiple classical processors. Multiple cores as well as multiple chips in a single classical computer system.
  5. Distributed processing. Network connections between classical computer systems.
  6. Supercomputers with a large number of classical digital processors. High-speed, high-capacity connections between classical processors.
  7. Analog signal processing. Not digital.
  8. Graphics processing units (GPUs). Actually somewhat more general purpose than simply graphics tasks.
  9. Field programmable gate arrays (FPGAs). Custom hardware without the upfront design cost of custom chips.
  10. Custom hardware. Full-custom integrated circuits and circuit boards.
  11. Quantum computers.

So, quantum computers are simply another tool in the toolkit for complex applications.

Quantum-inspired algorithms and quantum-inspired computing

The radically different mindset of quantum computing which focuses on quantum parallelism can lead to novel re-thinking of how problems can be solved on classical computers, particularly to exploit multiple processors, parallel processing, and large distributed clusters. Sampling approaches such as Monte Carlo simulation can also be used to approximate solutions on a classical computer, but still modeled on quantum parallelism.

This is still a very new and largely unexplored area, but well worth research, prototyping, and experimentation.

Personally, I think there is great potential for quantum-inspired computing — looking at quantum approaches to algorithms and applications and then attempting to approximate or simulate them using classical computing. In many situations, approximations are actually good enough, particularly when full quantum solutions are not yet feasible, as in the earlier stages of adoption of quantum computing.

One example is using Monte Carlo simulation to approximate (actually, sample) a full quantum parallel computation — say, for a traveling-salesman optimization problem. Again, it’s an approximation, not a full and accurate solution, but in many situations it may be good enough. Or at least better than a traditional classical solution when a full quantum solution is not yet feasible or otherwise not yet available.
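Breaking my own "no code" rule for just a moment, here is a minimal sketch of the Monte Carlo sampling idea in Python, for readers who want to see it concretely. The city coordinates and function names are purely illustrative, and this is ordinary classical random sampling — it only stands in for the "sample instead of enumerate" spirit of a quantum parallel computation, not any actual quantum algorithm.

```python
import math
import random

def tour_length(cities, order):
    """Total round-trip distance for visiting the cities in the given order."""
    return sum(
        math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def monte_carlo_tsp(cities, samples=10000, seed=0):
    """Sample random tours and keep the best one found.

    This is only an approximation: with enough samples it tends to find
    a good tour, but there is no guarantee of finding the optimal tour.
    """
    rng = random.Random(seed)
    n = len(cities)
    best_order = list(range(n))
    best_len = tour_length(cities, best_order)
    for _ in range(samples):
        order = list(range(n))
        rng.shuffle(order)  # one random sample from the space of all tours
        length = tour_length(cities, order)
        if length < best_len:
            best_order, best_len = order, length
    return best_order, best_len

# Hypothetical example: eight cities at made-up coordinates.
cities = [(0, 0), (1, 5), (2, 2), (5, 1), (6, 6), (3, 7), (8, 3), (7, 0)]
order, length = monte_carlo_tsp(cities)
```

A full quantum solution would, in effect, consider all tours at once via quantum parallelism; the classical sketch merely samples a tiny fraction of them and settles for the best it happens to see.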

Whether to consider quantum-inspired computing as under the umbrella of quantum computing proper or still part of classical computing is an interesting semantic problem. I can see it both ways. But that does not take away from its potential. And, it doesn’t take away from the starting point, which is research into quantum approaches to computation.

Develop a quantum algorithm first (possibly using a classical quantum simulator), and then evaluate possible classical approximations to arrive at the quantum-inspired algorithm that is inspired by the quantum approach.

It might well be that the quantum-inspired algorithm is a stopgap measure if near-term quantum computers simply don’t have the capabilities to support the actual quantum algorithm.

I wouldn’t necessarily recommend quantum-inspired algorithms as the preferred approach, but they are at least a possibility that can be considered.

And they are also an approach to recouping the investment in research into quantum algorithms should the pure quantum approach not be completely fruitful for some reason.

Collaboration — strategic alliances, partnerships, joint ventures, and programs

For industries and sectors of great complexity and rapid innovation and change, it can be rather difficult for a single organization to do everything it needs to do all by itself, with only its own resources. Collaboration between organizations permits them to leverage each other’s resources for mutual benefit.

In some cases the leveraging is primarily in one direction, but it can simply be that the benefits are asymmetric, so that all parties benefit in some way, even if not in exactly the same way.

Details of collaboration between organizations in the quantum computing sector are beyond the scope of this informal paper.

Some of the types of collaboration are:

  1. Strategic alliances. Both sides bring a lot to the table.
  2. Partnerships. Significant sharing, but not at the extreme intensity of a strategic alliance.
  3. Joint ventures. Relatively modest and isolated from the rest of the organization.
  4. Programs. These tend to be one-sided, such as vendors offering sophisticated programs and access to resources, and customers making use of those resources.
  5. Licensing. Intellectual property, technology, products, brands.

There are two distinct forms of collaboration:

  1. Carefully negotiated between a very small number of organizations. Generally two, or maybe three. Such as a joint venture or strategic partnership.
  2. Open to all who are interested. May be subject to approval and requires commitments to terms. Such as a program or licensing.

The content or subject matter of collaborations can be one or more of:

  1. Access to existing research.
  2. Engaging in new research.
  3. Product development.
  4. Marketing.
  5. Sales.
  6. Access to technology. License to use intellectual property.
  7. Technical information sharing. Specifications and technical data.
  8. Access to technical resources. Use of technical systems not available to others.

Classical computing still has a lot of runway ahead of it and quantum computing still has a long way to go to catch up

It is likely premature to give up on classical computing and put all (or most, or even many) of your eggs in the quantum computing basket. There is also the potential for GPUs and other forms of computational diversity before quantum computing becomes the slam-dunk best choice for many applications.

Advances in classical computing are still occurring at a reasonably rapid pace, with faster hardware, better hardware and system architectures, falling prices, and better software. And the cleverness of computer programmers keeps advancing as well.

What may have been an optimal classical solution just a few years ago may no longer be optimal, and what’s optimal today may no longer be optimal even a few years from now.

Even if a quantum solution appears to be somewhat better today, advances in classical computing may eliminate that advantage within a few years.

Of course, quantum and classical solutions may leapfrog each other for some time, at least until classical really does reach the end of its runway or quantum really does finally take off.

Science fiction

Real science is frequently predicted or anticipated much earlier in fiction or science fiction. It’s always interesting to see how science reality turns out compared to its fictional predecessors. What did they get right and what was off the mark?

Granted, there have been a handful of cameo appearances or minor references to quantum computers in some fictional works, but I’m surprised that there haven’t been more.

In truth, there has been a vast wealth of fictional treatment of quantum computing — in the form of hype and anticipation of what we think or imagine quantum computing will be or be like when we finally do get to practical quantum computing in another five to ten years. This is fiction by non-fictional characters.

Indeed, how much of what we currently believe about quantum computing will actually be realized just as we currently talk about it?

But on a more positive note, who knows what magical properties of quantum effects haven’t yet been studied well enough to even imagine how they might combine to form a quantum computing model far beyond current conceptions?

It actually might be quite helpful to see greater fictional treatments of quantum computing. Sometimes examination of fiction can provide us with insight about where the true boundaries are between fact and fiction.

And who’s to say what fact really is when we are speculating about the future?

Maybe we should just throw in the towel and admit that anybody who writes about what quantum computing might be in a year or two or five or ten or 25 is a… science fiction writer.

Universal quantum computer is an ambiguous term

Saying that a quantum computer is universal is ambiguous. There are two distinct meanings:

  1. Universal gate set. The machine is capable of all possible functional manipulations of quantum information. Implies general purpose, in contrast to special purpose. This fits most current quantum computers, those which are gate-based. It’s now a standard feature. An expected feature.
  2. Universal quantum computer. A proposal for a merger of quantum computing and classical computing into a single integrated machine. Not two separate machines interfacing, but fully integrated into a single machine with classical and quantum data coexisting.

For my own proposal for a universal quantum computer, using that latter definition, see my paper:

Both hardware and algorithms are limiting quantum computing

Reiterating an earlier point, it’s not just that people are waiting for quantum computer hardware to progress to a usable state, but quantum algorithms need many advances before they are usable as well.

We do have quantum simulators which support 24, 28, 32, 36, and even 40 qubits, but we still don’t have quantum algorithms that can effectively exploit all of those qubits and quantum states.
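For readers curious why simulators top out around 40 to 50 qubits, a standard back-of-the-envelope calculation makes it clear: a full state-vector simulator must store one complex amplitude for every one of the 2 to the n quantum states, and memory doubles with every added qubit. The little Python sketch below assumes double-precision complex amplitudes (16 bytes each), which is a common but not universal choice.

```python
def simulator_memory_bytes(num_qubits, bytes_per_amplitude=16):
    """Memory for a full state-vector simulation: 2**n complex amplitudes.

    16 bytes per amplitude assumes double-precision complex numbers.
    """
    return (2 ** num_qubits) * bytes_per_amplitude

# Memory doubles with every added qubit:
# 32 qubits -> 64 GiB, 40 qubits -> 16 TiB, 50 qubits -> 16 PiB.
for n in (32, 40, 50):
    gib = simulator_memory_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

So 40 qubits already demands terabytes of memory, and 50 qubits demands petabytes — which is why simulation, however valuable, cannot substitute for real quantum hardware at scale.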

A practical quantum computer would be ready for production deployment to address production-scale practical real-world problems

Just to clarify the terminology — we do have quantum computers today, but they haven’t achieved the status of being practical quantum computers since they are not yet ready for production deployment to address production-scale practical real-world problems.

We’re unlikely to see a practical quantum computer in the next few years.

Practical quantum computing isn’t really near, like within one, two, or three years

Although quantum computers do exist today, they are not ready for production deployment to address production-scale practical real-world problems. And it’s unlikely that they will be in the next one, two, or three years.

Sure, there may be some smaller niches where they can actually be used productively, but those would be the exception rather than the rule.

Four to seven years is a better bet, and even then only for moderate benefits.

Generation of true random numbers is one exception where quantum computers are actually commercially viable today, although technically a full-blown quantum computer is not needed just to use quantum effects to generate random numbers.

Why aren’t quantum computers able to address production-scale practical real-world problems today?

Why is it that we need to wait four to seven years for practical quantum computers which are finally able to address production-scale practical real-world problems? Some of the gating factors:

  1. Insufficient qubit fidelity. We have enough qubits, but the qubits don’t have the fidelity needed for advanced algorithms.
  2. Insufficient qubit connectivity. We need better (greater) connectivity between more qubits for advanced algorithms.
  3. In some cases we have the connectivity, but not enough qubits. Such as trapped-ion qubits.
  4. Lack of sufficient coherence time and qubit fidelity to support deep and complex quantum circuits.
  5. Don’t have the fine granularity of phase for advanced algorithms. Such as for quantum Fourier transform (QFT) and quantum phase estimation (QPE).
  6. Don’t have the algorithms. Regardless of the reasons. Even for simulators, which are not constrained by any of the previously listed gating factors constraining real hardware.
  7. Some algorithms would require qubits more capable than available on existing real quantum computers as well as more qubits than can be simulated. Simulators are great, but not beyond 50 qubits or maybe even 40 or 32 qubits.
  8. Lack of a high-level programming model. Too difficult to conceptualize and design quantum algorithms.
  9. Lack of high-level algorithmic building blocks.
  10. Lack of design patterns.
  11. Lack of application frameworks.
  12. Lack of examples to guide the design of new algorithms and applications.
  13. Research funding is still way too low.
  14. Research priorities may not be properly aligned with application needs.

Limitations of quantum computing

Quantum computing has great potential from a raw performance perspective, but does have its limits. And these are not mere limitations of current quantum computers, but of the overall architecture of quantum computing.

  1. Limited coprocessor function. It is universal in a primitive sense, but the operations (gates) are so primitive that only a physicist could love them.
  2. None of the richness of classical computing. No control structures. No rich data types.
  3. Quantum computation must be relatively modest. Must be simple. Must be short. No significant complexity.
  4. Complex applications must be couched in terms of physics problems. Very difficult. Very tedious.
  5. Translation of application problems to quantum algorithms is very tedious. Not so easy. Requires elite technical staff. Very time consuming. Best to wait for somebody else to solve a problem, and then license, use, or copy their solution, rather than roll your own.
  6. Most application functions must still be performed classically. Only selected functions can be performed using quantum algorithms. Difficult to achieve quantum advantage. Difficult to gain any dramatic business advantage.
  7. No, you can’t easily migrate your classical algorithms to run on a quantum computer. You need a radically different approach to exploit the radically different capabilities of quantum computers — and to achieve dramatic quantum advantage.
  8. Lack of fine granularity for phase angles may dramatically limit quantum computing. For example, Shor’s factoring algorithm may work fine for factoring smaller numbers, but not for much larger numbers such as those used for 1024, 2048, and 4096-bit encryption keys.

Quantum computation must be relatively modest

The code for a quantum computation must be simple. It must be short. It cannot have any significant complexity.

The computational complexity comes from quantum parallelism — performing this relatively modest computation a very large number of times.

Generally, quantum computing is only applicable to functions of fewer than 100 lines of classical code — or even 50, or 25, or 12. It must be a relatively simple function, not thousands of lines of convoluted spaghetti code.

You could have a moderately large number of distinct quantum computations in a quantum application, but each of those quantum computations must be relatively modest in size.

No, you can’t easily migrate your classical algorithms to run on a quantum computer

Quantum computers are not even close to being similar to classical computers.

You need a radically different approach to exploit the radically different capabilities of quantum computers.

You can’t just recompile your classical algorithms to run on a quantum computer.

You can’t even redesign your classical algorithms to run on a quantum computer.

You need a radically different approach to how to design algorithms to be able to exploit the radically different capabilities of quantum computers — and to achieve dramatic quantum advantage.

Even a fully parallel classical algorithm cannot be easily turned into a quantum algorithm with quantum parallelism.

In fact, it may take real effort to first forget how the classical algorithm works before beginning to rethink how the quantum algorithm should work.

Lack of fine granularity for phase angles may dramatically limit quantum computing

Phase angle is a useful feature of quantum state which enables a lot of sophisticated algorithms to exploit quantum parallelism, but this relies on phase angle being a continuous analog value rather than a discrete digital value. In reality, phase angle will be discrete at some level, so it’s a question of the granularity or number of gradations which a particular design of quantum computer supports.

This is not an area that is well-documented for current quantum computers, and there is no commitment from vendors as to how many gradations or how fine a granularity of phase angle is supported or will or might be supported in the future.

As an example, Shor’s factoring algorithm may work fine for factoring smaller numbers, but not for much larger numbers such as those used for 1024, 2048, and 4096-bit encryption keys.

For now, this whole area is one great big question mark.

The real issue is that vendors and researchers are still so busy and overwhelmed with supporting more basic capabilities of quantum computing, that they simply haven’t gotten around to such advanced features as fine granularity of phase.

The really big issue is not simply that vendors haven’t gotten around to it yet, but that there appear to be real, physical limitations of real hardware that will place harsh limits on fineness of granularity — since phase is essentially a continuous analog quantity rather than strictly discrete in nature. In other words, this appears likely to be a hard limit on the capabilities of quantum computing.

For more detail, see my paper:

Limitations of current quantum computers

These are real limitations, but only of current quantum computers. Future quantum computers will be able to advance beyond these limitations, to some extent. Current limitations include:

  1. Limited qubit fidelity.
  2. Limited qubit connectivity.
  3. In some cases we have the connectivity, but not enough qubits. Such as trapped-ion qubits.
  4. Fineness of granularity of phase angle. There may still be some ultimate limit eventually, but in the meantime, granularity will or at least can vary, and we can hope that it will improve over time until it does finally hit that ultimate limit.
  5. Limited circuit size.
  6. Primitive programming model. Very difficult to work with. Very difficult to translate application problems into quantum circuits.

Two most urgent needs in quantum computing: higher qubit fidelity and full qubit connectivity

Overall, the most urgent need in quantum computing is to support more-sophisticated quantum algorithms. Despite the fact that quantum computers need improvements in many areas, the two most urgent needs which block the most progress towards more-sophisticated quantum algorithms are:

  1. Need for higher qubit fidelity. Need near-perfect qubits.
  2. Need for full qubit connectivity. Full any-to-any qubit connectivity is needed to accommodate more complex algorithms, including quantum Fourier transform (QFT) and quantum phase estimation (QPE).

Two runners-up:

  1. Need to support larger quantum circuits. Support larger quantum algorithms. Implies some combination of longer coherence time and faster gate execution time. But this won’t be very useful until qubit fidelity and qubit connectivity are improved significantly.
  2. Need to support finer granularity for phase. Support larger quantum Fourier transforms (QFT) and quantum phase estimation (QPE).

For more general detail on critical needs, see my paper:

For more discussion on the most urgent needs, see my paper:

Premature commercialization is a really bad idea

The temptation to rush to early commercialization will be strong. Many will resist. But some will succumb. Premature commercialization of quantum computing is a recipe for disaster.

Expectations will likely be set (or merely presumed) too high and fail to be met.

For more discussion, see my paper:

Minimum viable product (MVP)

One could accept all of the critical technical gating factors for the initial stage of commercialization of quantum computing (C1.0) as the requirements for a minimum viable product (MVP). That would be the preference. But, it may turn out that not all customers or users need all of those capabilities or features. Or, maybe everybody wants and needs all of those capabilities and features, but they simply aren’t technically or economically feasible in a reasonable timeframe. In such situations it may make sense or at least be tempting to define a minimum viable product (MVP) which is substantially less than the more capable desired initial product.

This paper won’t attempt to predict what sort of minimum viable product (MVP) will form the ultimate initial commercialization stage for quantum computing, C1.0, but it is worth considering.

Some obvious compromises:

  1. Qubit count. 128 or 256 qubits may be a clear preference, but maybe 72 or 64 or even 48 qubits might be the best that can be achieved — or that initial customers might need — in the desired timeframe.
  2. Qubit fidelity. Five nines or at least 4.5 nines of qubit fidelity might be the preference, but four or even 3.5 nines might be the best that can be achieved — or that initial customers might need — in the desired timeframe. And we may have to settle for 3.25 nines or even 3 nines.
  3. Connectivity. Full connectivity might not be achievable. Maybe SWAP networks are feasible if qubit fidelity is high enough.
  4. Fineness of phase granularity. Critical for quantum Fourier transform and quantum phase estimation. Sufficient for at least 20 to 30 qubits = 2²⁰ to 2³⁰ gradations in a quantum Fourier transform, rather than the desired 50 to 64.
  5. Quantum Fourier transform and quantum phase estimation resolution. Preferably at least 20 to 30 qubits, but maybe only 16 bits of precision can be achieved — or even only 12, rather than 32 to 64 bits.
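To make the gradation numbers above concrete: in the standard textbook construction of an n-qubit quantum Fourier transform, the finest controlled-phase rotation has an angle of 2 pi divided by 2 to the n, so an n-qubit QFT effectively needs 2 to the n distinct gradations of phase. The small Python sketch below just restates that arithmetic; the function names are my own, purely for illustration.

```python
import math

def qft_phase_gradations(num_qubits):
    """An n-qubit quantum Fourier transform uses phase angles as fine as
    2*pi / 2**n, which amounts to 2**n distinct gradations of phase."""
    return 2 ** num_qubits

def smallest_phase_angle(num_qubits):
    """Smallest controlled-phase rotation in an n-qubit QFT, in radians."""
    return 2 * math.pi / qft_phase_gradations(num_qubits)

# 20 qubits is about a million gradations; 30 qubits about a billion;
# 50 qubits about a quadrillion — hence the hardware concern.
for n in (20, 30, 50):
    print(n, qft_phase_gradations(n), smallest_phase_angle(n))
```

This is why the jump from the compromise range (20 to 30 qubits) to the desired range (50 to 64 qubits) is so daunting: the required fineness of phase grows exponentially, not linearly, with qubit count.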

Quantum error correction (QEC) would likely come in a later stage of commercialization in any case. Near-perfect qubits should be good enough for many applications.

In any case, the precise definition of the minimum viable product (MVP) remains to be seen. It will likely evolve as pre-commercialization progresses — hopefully increasing in capabilities as more technical unknowns are resolved, favorably.

Initial commercialization stage — C1.0

The initial commercialization stage for quantum computing would be the very first product offering to result from the commercialization process. Call it C1.0. This initial product would meet all of the critical technical gating factors detailed in a preceding section of this paper, although it is possible that there might be a minimum viable product (MVP) which doesn’t meet all of those factors.

It is not the intention of C1.0 to fulfill all of the grand promises of quantum computing, but it’s the first real start.

With C1.0, quantum computing is no longer a mere laboratory curiosity. It’s actually ready to address at least some production-scale practical real-world problems.

Quantum error correction (QEC) would likely come in a later stage of commercialization in any case. Near-perfect qubits should be good enough for many applications.

Design of quantum algorithms and development of quantum applications should continue to rely on simulation as well as automatically scalable quantum algorithms, but with C1.0, quantum algorithms and quantum applications can finally run at full capacity, well beyond 50 qubits, at least to the extent of the capabilities of the hardware available at that time.

Also, with C1.0, customers, users, quantum algorithm designers and quantum application developers can expect some sense of stability and compatibility as further stages of commercialization unfold.

For more on what to expect in the initial commercialization stage, see my paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.

Subsequent commercialization stages

Research continues even as the technical details of the initial commercialization stage, C1.0, stabilize. This ongoing research will form the basis for subsequent commercialization stages.

These subsequent stages can include hardware advances, support software and tool advances, algorithm advances, algorithmic building block advances, programming model advances, etc.

More qubits. Higher fidelity qubits. Enhanced qubit connectivity. Finer granularity of phase. Higher-level programming model. Richer and higher-level algorithmic building blocks.

The first subsequent commercialization stage might be labeled C1.1, to be followed by C1.2, and so on. Eventually there will be a dramatic enough change to warrant a C2.0.

The intention is that with each subsequent commercialization stage customers, users, quantum algorithm designers, and quantum application developers can expect some sense of stability and compatibility as each stage of commercialization unfolds.

For more on what some of the expected subsequent commercialization stages might be, see my paper: Model for Pre-commercialization Required Before Quantum Computing Is Ready for Commercialization.

Simply as food for thought, here are some speculative suggestions for subsequent commercialization stages, taken from that paper:

  1. C1.0 — Reached The ENIAC Moment. All of the pieces are in place.
  2. C1.1 — Incremental improvements. Especially tracking hardware advances.
  3. C1.2 — More incremental improvements. More hardware advances.
  4. C1.5 — Reached multiple ENIAC Moments.
  5. C2.0 — First configurable packaged quantum solution.
  6. C2.5 — Reached multiple configurable packaged quantum solutions. And maybe or hopefully finally achieve full, dramatic quantum advantage somewhere along the way as well.
  7. C3.0 — Quantum Error Correction (QEC) and logical qubits. Very small number of logical qubits.
  8. C3.5 — Incremental improvements to QEC and increases in logical qubit capacity.
  9. C4.0 — Reached The FORTRAN Moment. And maybe full, dramatic quantum advantage as well.
  10. C4.5 — Widespread custom applications based on QEC, logical qubits, and FORTRAN Moment programming model. Presumption that full, dramatic quantum advantage is the norm by this stage.
  11. C5.0 — The BASIC Moment. Much easier to develop more modest applications. Anyone can develop a quantum application achieving dramatic quantum advantage.
  12. C5.5 — Ubiquitous quantum computing à la personal computing.
  13. C6.0 — More general AI, although not full AGI.
  14. C7.0 — Quantum networking. Networked quantum state.
  15. C8.0 — Integration of quantum sensing and quantum imaging with quantum computing. Real-time quantum image processing.
  16. C9.0 — Incremental advances along the path to a mature technology.
  17. C10.0 — Universal quantum computer. Merging full classical computing.

But commercialization is not imminent — still years away from even commencing

The preceding sections were intended to highlight how commercialization might or should unfold, but should not be construed as implying that commercialization was in any way imminent — it’s still years away.

In fact commercialization is still years away from even commencing.

The next few years, at a minimum, will still be the pre-commercialization stage for quantum computing.

Quantum Ready

IBM made a relatively big splash with the term Quantum Ready in late 2017, almost five years ago, when they posted:

  • Getting the World Quantum Ready
  • December 14, 2017 | Written by: Dario Gil
  • The world took a big leap forward toward quantum computing readiness, today. A dozen international organizations representing Fortune 500 companies, academia, and the government joined the newly minted IBM Q Network. Together, we are committed to exploring scientific and commercial applications of quantum computing, leveraging IBM’s recently-announced 20-qubit commercial system — and, soon, our next-generation 50 qubit processor.
  • What does it mean to be quantum ready?
  • We all know the reasons why quantum computing has attracted so much excitement. Despite the enormous progress we’ve achieved as a society with “classical” computers, they simply don’t have enough memory or processing power to solve historically intractable problems. Quantum computers, working with classical computers via the cloud, could be the answer for at least some of these.
  • It is true that much of what you have read about the promise of quantum computers will require fault-tolerance, a goal likely still a decade or more away. But we believe that the initial signs of a computational advantage will be achievable with nearer-term approximate quantum computers, which are now emerging. IBM Q systems are the most advanced superconducting universal quantum computers anywhere in the world today. With these systems, our scientists will continue to push the field towards the initial demonstrations of “quantum advantage” with applications in chemistry, optimization, and machine learning.
  • We are currently in a period of history when we can prepare for a future where quantum computers offer a clear computational advantage for solving important problems that are currently intractable. This is the “quantum ready” phase.
  • Think of it this way: What if everyone in the 1960s had a decade to prepare for PCs, from hardware to programming over the cloud, while they were still prototypes? In hindsight, we can all see that jumping in early would have been the right call. That’s where we are with quantum computing today. Now is the time to begin exploring what we can do with quantum computers, across a variety of potential applications. Those who wait until fault-tolerance might risk losing out on much nearer-term opportunities.
  • https://www.ibm.com/blogs/think/2017/12/getting-the-world-quantum-ready/

Quantum Ready — but for who and when?

IBM argues that everyone should be getting Quantum Ready now. I’m not so sure. Actually, I am indeed very sure that now is not the time for most individuals and organizations to be getting ready for quantum computing.

Yes, some day practical quantum computing will become a reality. Many individuals and organizations will need to be ready for that day. But their own lead times for readiness for that day will vary greatly, from six months to seven years.

As quantum computing remains in the pre-commercialization stage, the focus is on research, prototyping, and experimentation.

Some individuals and organizations will indeed be involved in pre-commercialization, performing research, prototyping, and experimentation.

Some individuals and organizations will ultimately be acquiring and deploying packaged quantum solutions, so that they will have no need to even be aware of the technical details of what goes on inside of those packaged quantum solutions. They won’t need to be Quantum Ready until the vendors of those packaged quantum solutions have completed and tested products to sell and deploy.

Other individuals and organizations will require a substantially higher-level programming model and tools, which don’t even exist today. Trying to get ready today would be completely counterproductive for them — they would spend great time, attention, and resources learning a lot which they will never use.

Sure, for the technical elite and the lunatic fringe, it is never too soon to become Quantum Ready, but those people already know that — they’re never sitting around idle and waiting for somebody else to tell them when to get ready for anything.

Quantum Aware — what quantum computing can do, not how it does it

What is more arguable is that a broad audience of individuals and organizations should be becoming aware of the potential and prospects for quantum computing. But becoming ready for that eventual potential is not the optimal strategy for most individuals and organizations at this stage.

A broader audience needs to understand quantum computing at a high level — its general benefits — but doesn’t need the technical details of how quantum computers actually work or how to program them.

Quantum aware is the awareness of what quantum computing can accomplish — what it can do for the organization. But not so much how quantum computers do what they do. Or even how they are programmed to do what they do.

Every organization has individuals who are always on the lookout for leading-edge technologies, and for when and how the organization needs to raise its awareness and eventually its readiness for new technologies.

The question is when to broaden that awareness beyond those select few whose job is to be aware.

The technical elite and lunatic fringe will always be ready, for anything

If some new technology exists, the technical elite and the lunatic fringe of any organization will know about it, without being prompted.

Determining how and when to raise awareness and eventually readiness for a new technology is always an ongoing issue. Premature awareness and premature readiness can set false or misleading expectations, which are then too-easily dashed when they cannot be met. And then we run the risk of a Quantum Winter.

But even with the technical elite and lunatic fringe, be prepared for the potential for several false starts

The trick with any significant new technology is that it may take several false starts before the team hits on the right mix of technology, resources, and timing.

This is why it’s best not to get the whole rest of the organization Quantum Ready until the technical elite and lunatic fringe have gotten all of the kinks out of the new and evolving quantum computing technology.

Quantum Ready — on your own terms and on your own timeline

I don’t subscribe to the notion that everybody should be getting ready for quantum computing now, as IBM suggests — or as IBM actually suggested, almost five years ago (2017). IBM said:

  • What if everyone in the 1960s had a decade to prepare for PCs, from hardware to programming over the cloud, while they were still prototypes? In hindsight, we can all see that jumping in early would have been the right call. That’s where we are with quantum computing today. Now is the time to begin exploring what we can do with quantum computers, across a variety of potential applications.
  • https://www.ibm.com/blogs/think/2017/12/getting-the-world-quantum-ready/

Well, given that even the IBM PC didn’t appear until 1981, and corporate personal computers didn’t take off until PCs with the Intel 386SX appeared in 1988, followed by Microsoft Windows 3.1 in 1992, any major corporate investment in PCs before 1992 was not such a great idea. Not a right call at all.

Sure, researchers, the technical elite, and the lunatic fringe of computing should have been experimenting and evaluating microprocessors and personal computing software much earlier than 1992 — like, maybe 1975, when Microsoft was founded, but any effort to plan for deployment of PCs before 1992 would have been a grand waste of attention, effort, resources, money, and time.

Each organization will have its own needs and interests, and hence its own timeline for adopting any new technology.

There are very few organizations that need to be getting ready for quantum computing now. Awareness, yes, but readiness, no.

IBM was jumping the gun five years ago. And they’re still jumping the gun today.

Yes, IBM and other visionary technology firms need to be investing heavily in research for these new technologies, but it’s grossly premature for most of IBM’s customers to be making such readiness investments in such an immature technology.

The organizations that were buying and deploying PCs in 1992 did not need to be investing heavily in personal computing before 1980, in the 1970s. Let alone the 1960s.

Don’t jump the gun. Wait until your needs and the technology are in sync.

Don’t confuse research, prototyping, and experimentation with product development and deployment.

Some organizations should indeed be doing research, prototyping, and experimentation with quantum computing right now, but only if they understand that production development and deployment is at least five to ten years in the future. And that any investment they make today will likely need to be discarded and reworked in five to ten years, much as any investment in the PCs of 1981 or even 1988 or 1992 was of no value in 1995.

Other organizations may indeed have the luxury of waiting another five years before they even dip their toe into the waters of quantum computing.

Understand what timeline your organization is on.

Timing is everything — when will it be time for your organization to dive deep on quantum computing?

That’s the really big, unknown question, the timing of the commitment of staff, resources, and management attention to quantum computing.

Again, it may be necessary to plow through several false starts before a productive stride can be reached and sustained.

Alternatively, at the risk of being late to the party, an organization can simply wait until peer organizations actually hit their stride before allocating resources and pulling the trigger for their own investments.

Your organizational posture towards technological advances

Each organization has its own personality, including its own posture towards technological advances. Quantum computing may be different, but the context is the organization’s overall posture towards technological advances in general. Some of the common postures:

  1. Ongoing research. No matter what the technology, the organization is always doing its own research.
  2. Bleeding edge. At the first hint of a new technological advance, the organization dives in deep to see how far they can push the next technology. Efforts may fail, but that’s just the cost of doing business. This is the realm of the lunatic fringe. Alpha testing is their thing.
  3. Leading edge. The bleeding edge gets their attention, but they simply monitor the nascent technology until it seems at least partially ripe. Beta testing is their thing.
  4. Trailing edge. Waits until most of the leading-edge organizations have invested heavily, then and only then does the organization jump in. These guys wait for the production release of a new technology.
  5. Mature technology only. Waits until even the trailing edge has settled down. There won’t be any question that the technology has merit, works, has clear benefits, and has proven to deliver substantial business value. These guys wait for release 2.0 of any new technology.
  6. Keeping up with the competition. It’s not the technology per se, but the fact that a competitor has it — or is likely to have it in the foreseeable future.
  7. Make vs. buy. Does the organization prefer to roll its own to more precisely meet its own proprietary needs, or buy off the shelf to save money and benefit from standardization?

Different organizations will dive into quantum computing at different stages of the game, based largely on their general overall posture towards technological advances.

Each organization will have to make its own call on when it chooses to become Quantum Ready.

Each organization must create its own quantum roadmap

Each organization will be on its own timeline for adoption of quantum computing and should define a roadmap of milestones for preparation for initial adoption of quantum computing. This is the set of milestones which must be reached before the organization can achieve practical quantum computing.

The organization should define a graduated sequence of milestones of incrementally increasing capabilities, both quantum computing requirements and functional application capabilities.

As quantum computing hardware, software, and algorithms become incrementally more advanced, each organization can then implement another increment of application functionality.

At some stage, the application functionality would constitute practical quantum computing for the organization. Each organization would have its own criteria for when it will achieve practical quantum computing.

Milestones which require research advances need to be especially called out in the roadmap, since they will likely require long lead times and carry high risk.

Role of academia and the private sector

Much of the effort in the development of quantum computing comes from two sectors for research and development of commercial products:

  1. Academia. University research programs. Fundamental research and prototyping of hardware, software, algorithms, and applications. And of course education of students and preparing the workforce.
  2. Private sector. Large technology companies and venture capital-funded startup companies. Build on academic research. Focus on engineering of prototypes for commercial products.

Role of national governments

National governments play a number of roles relative to the roles played by academia and the private sector for research and demand for quantum applications:

  1. Fundamental research. Such as NIST research in physics and quantum technologies that preceded a lot of academic research in quantum computing.
  2. Applied research. Focused on addressing problems faced by government agencies.
  3. Research funding. Funding a lot of academic research.
  4. Research grants. Specific government agencies funding academic research targeted to benefit those agencies.
  5. Government contracts. Government agencies tapping academia and the private sector to immediately address urgent needs. Focused on demand for quantum applications to solve production-scale practical real-world problems encountered by the government itself.
  6. Academic support. Funding designed to support educational programs in academia.
  7. Workforce training. Funding designed to develop a commercial and industrial workforce.

When will quantum computing finally be practical?

I addressed this earlier, but just to close out with it, although quantum computers do exist today, they are not ready for production deployment to address production-scale practical real-world problems. And it’s unlikely that they will be in the next one, two, or three years.

Sure, there may be some smaller niches where they can actually be used productively, but those would be the exception rather than the rule.

Four to seven years is a better bet, and even then only for moderate benefits.

Generation of true random numbers is one exception where quantum computers are actually commercially viable today, although technically a full-blown quantum computer is not needed just to use quantum effects to generate random numbers.
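To make this one concrete, here is a toy sketch in plain Python, my own illustration rather than any vendor’s code, of what a one-qubit quantum random number generator does: a Hadamard gate puts a qubit into an equal superposition of 0 and 1, and measuring it yields either value with probability one half. On real hardware the randomness is physical; here a pseudo-random draw stands in for the measurement.

```python
import random

def measure_plus_state(rng):
    """Simulate measuring the |+> state produced by a Hadamard on |0>.

    The two amplitudes are equal, so the probability of reading 0 or 1
    is 0.5 each: one random bit per measurement. On real hardware the
    randomness is physical; here a pseudo-random draw stands in for it.
    """
    prob_zero = 0.5  # squared magnitude of the |0> amplitude
    return 0 if rng.random() < prob_zero else 1

def random_bits(n, seed=None):
    """Generate n random bits the way a one-qubit QRNG would."""
    rng = random.Random(seed)
    return [measure_plus_state(rng) for _ in range(n)]

bits = random_bits(1000, seed=42)
print(sum(bits) / len(bits))  # close to 0.5
```

The point is how little machinery is needed: this is why random number generation is commercially viable today even without a full-blown quantum computer.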

Names of the common traditional quantum algorithms

These are the quantum algorithms which are generally discussed in most introductions to quantum computing:

  1. Bernstein-Vazirani algorithm.
  2. Deutsch-Jozsa algorithm.
  3. Grover’s search algorithm.
  4. Quantum approximate optimization algorithm (QAOA).
  5. Quantum Fourier transform (QFT).
  6. Quantum phase estimation (QPE).
  7. Schor’s factoring algorithm.
  8. Simon’s algorithm.

All of those links are to the IBM Qiskit Textbook.

With the exceptions of quantum Fourier transform (QFT) and quantum phase estimation (QPE), most of them will have very little utility for production-scale practical real-world quantum applications.

This list is presented here simply to illustrate where we are today in quantum computing. It also highlights that we have a long way to go before we have a decent foundation for average application developers to begin exploiting quantum computing.

Quantum teleportation — not relevant to quantum computing

I only mention quantum teleportation here because it does get mentioned a lot in introductions to quantum computing even though it has nothing to do with quantum computing. Rather, it is the foundation for quantum communication, the use of quantum states to transmit information securely, which is rather distinct from quantum computing.

But if you do indeed wish to read about quantum teleportation, check out the IBM Qiskit Textbook:

IBM model for quantum computing

IBM has a somewhat distinctive model for quantum computing. You can’t blame them for wanting to distinguish themselves from the rest of the pack. I won’t dive into it too deeply here, but you can get a better sense of it from their latest (2022) Quantum roadmap.

Their latest roadmap places a lot of emphasis on modular hardware architectures, with multiple chips, several forms of interconnections — both classical and quantum, and designs for much larger numbers of qubits.

Unfortunately, they haven’t placed as high a priority on raw physical qubit fidelity, but they are still pursuing error correction and integration of error mitigation strategies in the software stack.

They divide developers into three categories:

  1. Kernel developers. What I would call quantum algorithm designers.
  2. Algorithm developers. What I would call quantum application developers.
  3. Model developers. What I would call subject matter experts for the various application categories.

They use concepts such as:

  1. Quantum-centric supercomputers.
  2. Frictionless development experience.
  3. Runtime primitives.
  4. Threaded runtimes.
  5. Serverless.
  6. Application services.
  7. Error suppression.
  8. Entanglement forging.
  9. Quantum embedding.
  10. Circuit knitting.
  11. Circuit cutting.
  12. Dynamic circuits.
  13. Intelligent orchestration.

The press release:

The blog post:

The web page:

The video — well worth watching:

Circuit knitting

Circuit knitting breaks a large quantum circuit into smaller pieces to run on multiple quantum processors, and then combines the results together on a classical computer.

This is a fairly recent development that is not yet widely available. IBM has announced it and targeted it for 2025 in their Development Roadmap.

As their roadmap says:

  • What is circuit knitting? Circuit knitting techniques break larger circuits into smaller pieces to run on a quantum computer, and then knit the results back together using a classical computer.
  • Earlier this year, we demonstrated a circuit knitting method called entanglement forging to double the size of the quantum systems we could address with the same number of qubits. However, circuit knitting requires that we can run lots of circuits split across quantum resources and orchestrated with classical resources. We think that parallelized quantum processors with classical communication will be able to bring about quantum advantage even sooner, and a recent paper suggests a path forward.

And their blog post on circuit knitting:

  • At what cost can we simulate large quantum circuits on small quantum computers?
  • One major challenge of near-term quantum computation is the limited number of available qubits. Suppose we want to run a circuit consisting of 400 qubits, but we only have 100-qubit devices available. What do we do?
  • May 10, 2022
  • By David Sutter and Christophe Piveteau
  • https://research.ibm.com/blog/circuit-knitting-with-classical-communication
  • Over the course of the past year, the IBM Quantum team has begun researching a host of computational methods called circuit knitting. Circuit knitting techniques allow us to partition large quantum circuits into subcircuits that fit on smaller devices, incorporating classical simulation to “knit” together the results to achieve the target answer. The cost is a simulation overhead that scales exponentially in the number of knitted gates.
  • Circuit knitting will be important well into the future. Our quantum hardware development team is focused on scaling by connecting smaller processors via classical, and then via quantum links. Due to this planned hardware architecture, circuit knitting will be useful in the near future as we run problems on classically parallelized quantum processors. Techniques that boost the number of available qubits will also be relevant far into the future.
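To illustrate just the classical “knitting” step, here is a deliberately trivial sketch in plain Python, my own toy example rather than IBM’s method. It covers only the easy case where the cut crosses no entangling gates, so the joint results of two small runs can be combined by simple multiplication of probabilities. The whole difficulty of real circuit knitting lies in cuts that do cross entangling gates, where the classical overhead grows exponentially in the number of cut gates, as the IBM quote notes.

```python
from itertools import product

def knit(dist_a, dist_b):
    """Classically 'knit' results from two independent subcircuits.

    When the cut crosses no entangling gates, the joint output
    distribution factorizes, so P(ab) = P(a) * P(b). Real circuit
    knitting also handles cuts through entangling gates, at a cost
    that grows exponentially in the number of cut gates.
    """
    return {a + b: pa * pb
            for (a, pa), (b, pb) in product(dist_a.items(), dist_b.items())}

# Measured distributions from two separate one-qubit runs,
# each a qubit in equal superposition:
half = {"0": 0.5, "1": 0.5}
joint = knit(half, half)
print(joint)  # {'00': 0.25, '01': 0.25, '10': 0.25, '11': 0.25}
```

Two small runs plus a little classical arithmetic stand in for one larger run, which is the essence of the technique.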

Dynamic circuits

Dynamic circuits are a technique for interspersing classical processing in the middle of quantum circuits, allowing classical code to detect conditions in intermediate quantum results and dynamically select portions of quantum circuits to be executed based on classical logic.

This is a fairly recent development that is not yet widely available. IBM has announced it and targeted it for later in 2022 in their Development Roadmap.

As their roadmap says:

  • … we will add dynamic circuits, which allow for feedback and feedforward of quantum measurements to change or steer the course of future operations. Dynamic circuits extend what the hardware can do by reducing circuit depth, by allowing for alternative models of constructing circuits, and by enabling parity checks of the fundamental operations at the heart of quantum error correction.
  • https://research.ibm.com/blog/ibm-quantum-roadmap-2025

The major concern I have with dynamic circuits is that they are not a true integration of classical with quantum computation in that the embedded classical code must perform a measurement of qubits to access their quantum state. This has three negative consequences:

  1. The quantum state of the qubits is collapsed. The qubits no longer have their full quantum state after measurement, only the classical state after the collapse, which is only an approximation of the quantum state.
  2. Measurement only approximates the quantum state, since measurement causes the quantum state to collapse to a discrete binary classical value.
  3. Measurement interferes with quantum parallelism. Each interruption of the pure quantum computation reduces the opportunity for exploiting full quantum computation. Rather than evaluating 2^n possible values in a single quantum computation, only 2^k possible values are evaluated, where k is much less than n — for example, 1,000 vs. one million possible values evaluated.

To be sure, dynamic circuits are better than nothing, but they’re a poor facsimile of full, pure quantum computation. This is more of a stopgap measure than a major leap forward.
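To show the flavor of mid-circuit measurement and feed-forward, here is a toy sketch in plain Python, my own illustration rather than IBM or Honeywell code. It mimics a simple measure-and-correct pattern: measure a qubit in superposition mid-circuit, and if the result is 1, apply an X gate to flip it, so the final result is always 0. Note how the classical branch only ever sees the collapsed bit, never the full quantum state, which is exactly the limitation described above.

```python
import random

def run_dynamic_circuit(rng):
    """Toy one-qubit dynamic circuit with mid-circuit measurement.

    1. A Hadamard puts the qubit in equal superposition.
    2. A mid-circuit measurement collapses it to 0 or 1.
    3. Classical feed-forward: if we read 1, apply an X gate to flip
       the qubit, so the final measurement is always 0.

    After step 2 the qubit carries only a classical bit, not its
    full quantum state: the collapse is the price of the feedback.
    """
    outcome = 0 if rng.random() < 0.5 else 1  # mid-circuit measurement
    if outcome == 1:                          # classical branch
        outcome ^= 1                          # X gate flips 1 to 0
    return outcome  # final measurement

rng = random.Random(7)
results = [run_dynamic_circuit(rng) for _ in range(100)]
print(set(results))  # {0}
```

The deterministic result is achieved only by sacrificing the superposition, which is the stopgap nature of the technique in a nutshell.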

Honeywell offers a similar feature, which they call mid-circuit measurement and qubit reuse (MCMR):

Future prospects of quantum computing

The future prospects for quantum computing are quite murky. Sure, there’s plenty of speculation — including throughout this paper. And some possibilities are quite clear. But so much is flat out unknown. And it’s very unclear how long various capabilities might actually take to come to fruition.

Here are the various time frames to be considered:

  1. Near-term. The next few months.
  2. Six months.
  3. Nine months to a year.
  4. A year to 15 or 18 months.
  5. 18 months to two years.
  6. Three to four years.
  7. Five years.
  8. Five to seven years.
  9. Seven to ten years.
  10. Ten years.
  11. Ten to 12 years.
  12. 12 to 15 years.
  13. 15 to 20 years.
  14. 25 years. Will quantum computing be fully mature by then… or not? Classical computing took longer than that.

For comparison, here’s a timeline for early classical computers:

Post-quantum computing — beyond the future

What might come after quantum computing? It may seem way too soon to even contemplate what might come after quantum computing, especially since we aren’t even close to achieving practical quantum computing yet, but still it’s an intriguing thought.

Some opportunities:

  1. Alternative architectures. Is the current gate-model even close to optimal?
  2. Other quantum effects. Are there quantum effects that current quantum computing is not taking advantage of? Are there quantum effects that haven’t even been discovered yet?
  3. Applying quantum computing to Big Data. Unclear how that might work. Possibly some new form of data storage, far beyond current memory and storage models.
  4. Mimicking the human mind. Maybe a combination of the preceding two opportunities. Clearly the human mind works. But the current model of quantum computing doesn’t seem capable of doing all that the human mind can do.
  5. Beyond quantum mechanics — quantum gravity. Quantum mechanics doesn’t model gravity. When gravity and quantum mechanics are merged, maybe that will introduce additional computing opportunities.
  6. Universal quantum computing — fully integrating classical computing. There’s no clarity right now as to whether the integration of quantum computing and classical computing should be considered as part of quantum computing proper, or whether it might be a form of post-quantum computing.

Should we speak of quantum computing as an industry, a field, a discipline, a sector, a domain, a realm, or what? My choice: sector (of the overall computing industry)

What exactly is quantum computing — is it an industry, a field, a discipline, a sector, a domain, a realm, or what?

  1. Field. Makes sense for research. And academia in general. And from a purely intellectual and theoretical perspective. Seems to ignore the industrial, commercial, and business aspects.
  2. Discipline. Somewhat formal. Not clear when it would make sense. Quantum computing is certainly a discipline in the sense of being a branch of knowledge or area of study.
  3. Industry. Common usage today. But it seems too pretentious and presumptuous to me. Definitely hype. Quantum computing is substantial, but seems too insubstantial to be an entire industry. And definitely not an industry in the same sense as computing overall, software, or electronics.
  4. Sector. Quantum computing really is only one sector of the larger computing industry.
  5. Domain. A domain of knowledge and expertise. Too esoteric for most people.
  6. Realm. A realm of knowledge and expertise. Too esoteric for most people.

Overall, it’s a fielder’s choice at present, so you can’t be blamed or faulted for picking any of these categorizations.

But my clear choice is sector — quantum computing is a sector of the computing industry, the quantum computing sector.

Why am I still unable to write a brief introduction to quantum computing?

In all honesty, this paper is my best shot at providing an introduction to quantum computing. But I just wish that I could provide more of a brief introduction, like just a few pages, and that actually shows some working code, a working quantum algorithm. So why am I unable to do that, briefly? Here are the factors:

  1. Lack of a concise quantum Hello World program that effectively uses quantum parallelism to achieve quantum advantage. Not just a handful of quantum logic gates that illustrate basic functions, but an example that actually does something.
  2. Okay, there is one trivial but functional example: generate a random number. But that’s not representative of the grand promises of quantum computing.
  3. No coherent high-level programming model. Anything nontrivial quickly devolves into a blizzard of quantum logic gates.
  4. Lack of a great set of coherent quantum algorithmic building blocks. We need higher-level building blocks for quantum algorithms.
  5. No coherent methodology for mapping real-world problems to models readily implemented on a quantum computer. It’s a really hard problem.
  6. No practical real-world problems are solvable with so few qubits of such low quality and such weak connectivity.
  7. Difficulty achieving quantum advantage — the only reason to use a quantum computer.
  8. Need to combine classical code with the quantum algorithm it uses. It just gets a little too messy too quickly to show anything nontrivial.
  9. Need to illustrate the probabilistic nature of quantum computing. No easy deterministic results. Showing a graph of the distribution of quantum results doesn’t convey what a real quantum application would be doing.
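As an illustration of that last point, here is a plain-Python sketch, my own toy standing in for real quantum hardware, of the closest thing to a quantum Hello World: a two-qubit Bell state, produced by a Hadamard and a CNOT. Even this minimal example yields only a probability distribution of outcomes, roughly half 00 and half 11, rather than a single deterministic answer.

```python
import random
from collections import Counter

# Amplitudes of the two-qubit Bell state, an equal superposition of
# 00 and 11 produced by a Hadamard on qubit 0 followed by a CNOT.
BELL = {"00": 2 ** -0.5, "01": 0.0, "10": 0.0, "11": 2 ** -0.5}

def sample(amplitudes, shots, seed=None):
    """Sample measurement outcomes from a state's amplitudes.

    The probability of each outcome is the squared magnitude of its
    amplitude, which is all a real quantum computer reports back:
    counts over repeated shots, never the amplitudes themselves.
    """
    rng = random.Random(seed)
    outcomes = list(amplitudes)
    weights = [abs(a) ** 2 for a in amplitudes.values()]
    return Counter(rng.choices(outcomes, weights=weights, k=shots))

counts = sample(BELL, shots=1000, seed=1)
print(counts)  # roughly 500 '00' and 500 '11', never '01' or '10'
```

Notice that the interesting content is a histogram, not an answer, which is precisely why a compelling few-page introduction with working code remains so elusive.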

List of my papers on quantum computing

The following document lists all of the informal papers I have written about quantum computing, although quite a few have already been mentioned and linked throughout this paper:

Definitions from my Quantum Computing Glossary

  1. quantum computation. Computation on a quantum computer. Execution of the quantum logic gates of a quantum program or quantum circuit, based on quantum code which implements quantum algorithms operating on quantum bits (qubits), which follow from the principles of quantum mechanics, including and especially superposition and entanglement. The essence of quantum computation is the precise control of the quantum states of qubits. See Chapter 6 Quantum Computation by John Preskill of CalTech. See the Wikipedia Quantum computing article. Abbreviated as QC.
  2. quantum computer. A machine or computer capable of quantum computation. A machine exploiting the principles of quantum mechanics to perform computation in a way that is significantly superior to a classical computer. A computer in which the basic unit of information is a superposition of 0 and 1, and supports entanglement of values. May be either a physical quantum computer or a quantum computer simulator. Generally, the former. See the Wikipedia Quantum computing article. Abbreviated as QC.
  3. quantum computing. Computing using a quantum computer. All of the activities which surround the use of a quantum computer, from design of quantum algorithms, to development of quantum programs, to execution on a quantum computer, to integration with classical computing, to post-processing of measured results from the quantum computer. And the entire quantum computing ecosystem. See also: quantum computation. Alternatively, includes design and development of the quantum computer itself rather than only the use of the quantum computer. Abbreviated as QC.

Source, my quantum computing glossary:

Jargon and acronyms — most of it can be ignored unless you really need it

This paper won’t dive deep into any of the jargon of quantum computing, but there is a section of jargon in the preceding paper:

At this stage, you can ignore all of those terms, but this gives you an idea what you are in for if you dive deeper into quantum computing.

If you do wish to understand the meaning of any of these terms, search for them in my glossary for quantum computing terms:

Alternatively, you can search for many of the terms or topics in the IBM Qiskit Textbook for tutorial information:

  • Learn Quantum Computation using Qiskit
  • Greetings from the Qiskit Community team! This textbook is a university quantum algorithms/computation course supplement based on Qiskit to help learn:
  • The mathematics behind quantum algorithms
  • Details about today’s non-fault-tolerant quantum devices
  • Writing code in Qiskit to implement quantum algorithms on IBM’s cloud quantum systems
  • https://qiskit.org/textbook/preface.html

My glossary of quantum computing terms

I have compiled a glossary of terms related to quantum computing. It includes terms from quantum mechanics and classical computing as well. It has over 3,000 entries.

arXiv — the definitive source for preprints of research papers

Most of the academic and industrial research papers I read can be found on arXiv.org, which specializes in preprints of research papers.

If you do a Google search on some topic of quantum computing, results that come from arXiv.org can usually be trusted to be fairly definitive.

You can also just add the “arxiv” keyword to your Google search to force Google to find search results from arXiv.org.

To browse recent postings related to quantum physics (the narrowest category which covers quantum computing), view this page:

Free online books

Besides buying hardcopy and e-books, a number of books on quantum computing are available for free online.

A great list of such free books is from Shlomo Kashani:

Resources for quantum computing

There are a wealth of resources for quantum computing available online. I won’t endeavor to list even a tiny fraction of them here.

Personally, whenever I have a question about quantum computing I simply do a Google search and generally Google instantly points me to a number of resources directly addressing my question. Or even a direct answer box for my question.

I’ve already linked to a wide variety of resources throughout this paper.

Other resources which can be found online, often with just a simple Google search:

  • Books. Either complete and free books, or Amazon or author web pages, which sometimes offer free sample text or even chapters. I personally get a lot from Amazon’s “Look Inside” feature, including exploring the table of contents and searching for keywords in the text. For free online books, see the preceding section.
  • Lecture notes. Especially for notable professors. Just search for their name and keywords “lecture notes”.

Personas, use cases, and access patterns for quantum computing

In order to put quantum computing in context, we need to be able to discuss who is trying to do what and how they are going to do it. Personas are the who, use cases are the what, and access patterns are the how.

Personas, use cases, and access patterns lie at the intersection of workforce and applications, with personas corresponding to jobs, positions, or roles of the workforce, and use cases corresponding to applications.

For more discussion of personas, use cases, and access patterns for quantum computing, see my paper:

Beware of oddball, contrived computer science experiments

It is relatively easy to contrive artificial computer science experiments that seem rather impressive, but don’t represent any practical real-world problems or their solutions.

Google’s quantum supremacy experiment is one example. Boson sampling is another. These are very Interesting experiments for computer scientists, but they have no practical applications.

Beware of research projects masquerading as commercial companies

The hallmark of successful commercial companies is taking technology off the shelf or out of the textbook or research papers and rapidly turning it into viable commercial products, with little if any additional hardcore research required. Theirs is a very rapid process from conception to delivery of a viable commercial product.

Unfortunately, too often people are tempted to move science out of the research laboratory and into product development before the underlying research is even close to being ready for commercialization. They will form a company around the research project and attempt to complete the research under the guise of doing commercial product development.

Commercial companies are not a great forum for completing substantial research projects. There is a high risk of either outright failure, or a very extended timeframe before the research really is ready to be turned into a viable commercial product.

It’s one thing for a Fortune 500 company to have a dedicated research department, but usually they are wise enough to complete the research on a project in a dedicated research laboratory setting before deciding to turn it into a commercial product.

Technology transfer

It is not uncommon for a research-oriented organization to develop some new technology and sense that it has potential commercial value, but recognize that realizing commercial value is beyond their charter, capabilities, and interests. Organizations finding themselves in such a situation might then consider technology transfer to a commercial organization who is more ideally positioned and equipped to realize the perceived commercial potential.

Of course, it is the job of the acquiring organization to do sufficient due diligence and evaluation to judge that any perceived commercial value is likely to be realized. Hence the term caveat emptor: let the buyer beware. Generally, the transferring organization is not in a position to adequately assess the likelihood of success at commercialization of the technology.

A methodical technology transfer process is a far cry from the scenario in the preceding section, where the research organization attempts to pursue commercialization itself, and to do so before the research has even been completed.

Some examples of organizations seeking to transfer technology:

  1. Academic research organizations.
  2. Corporate research organizations.
  3. Government research laboratories.
  4. Research laboratories at government agencies.

Research spinoffs

A research spinoff such as a university spinoff or academic spinoff is a variation of technology transfer where the research organization creates a commercial entity to pursue commercialization of the technology developed by the researchers.

It is critical to distinguish situations where the research has been completed and truly is ready to be commercialized from situations where the research remains ongoing and simply looks promising but must be completed by the spinoff company.

The latter case runs the grave risk mentioned in the previous section, Beware of research projects masquerading as commercial companies.

But the former case has less risk.

It is highly desirable for the research spinoff to have commercial partners or commercial firms as investors, so that commercial viability is carefully vetted and the spinoff doesn’t end up simply being… a research project masquerading as a commercial company.

Don’t confuse passion for pragmatism

Enthusiasm is a good thing, but exuberant passion is not a valid substitute for pragmatism.

It is easy to be super-excited about the prospects for a new technology, but that passion for the promise can easily get unmoored from the reality of completing research, properly and fully evaluating the technology, and then finally planning and completing an industrial-grade commercialization project.

Quantum hype — enough said

There’s certainly a lot of hype out there on all things quantum. I do address a fair amount of it directly in my writing. But a lot of it doesn’t deserve our attention. Just ignore all of the noise.

I can fully understand when mainstream media falls into the hype trap, but unfortunately even tech and STEM media fall into it as well.

And it’s certainly not helpful when quantum vendor press releases lapse into extravagant promises and set unreasonable expectations.

I don’t expect hype itself to lead to a Quantum Winter, but unfulfilled promises and expectations could do the trick. I think we have another two to three years before any deep disenchantment is likely to set in, which is the predicate for a Quantum Winter. For now, most people are content to believe that the promises and expectations will be met just over the horizon, relatively soon even if not this year or next year.

For more on Quantum Winter, see my paper:

Should this paper be turned into a book?

There’s certainly enough material here to fill a book. Maybe I’ll go that route, but that would be a lot of additional effort with unclear benefits.

My default will be to decline to do a book at this stage, but I’ll be open to reconsider in the future.

My original proposal for this topic

For reference, here is the original proposal I had for this topic. It may be useful for readers wanting a more concise summary of this paper.

  • What is quantum computing? Both hardware and software. Lots of superficial and misleading puff pieces out there. See Why am I still unable to write a brief introduction to quantum computing? What is a quantum computer? How does quantum computing fit in with classical computing? What is a… Quantum application, Quantum program, Quantum circuit. Measurement. Quantum Fourier transform. Quantum phase estimation. Quantum parallelism. Interference. Phase. Entanglement. Probability amplitude. Probability. Superposition. Unitary transforms. Quantum logic gates. Rotations. Bloch sphere. Basis states. Qubits. Various formats… One paragraph. One-pager. Two-pager. Four-pager. 10-pager. 20-pager. 50–75 page mini-book. 100-page mini book. Brief glossary (or glossaries, plural) — 10 most essential terms — to drop at cocktail parties, 25 terms, 50 terms, 100 terms, 250 terms. Torn between describing quantum computing as it exists today versus a vision of what it will eventually be like once it becomes useful and supports production-scale applications and achieves dramatic quantum advantage.
  • What is quantum computing? Ditto, but the software more than the hardware. Lots of superficial and misleading puff pieces out there.

Summary and conclusions

For a good summary of this paper, see the Quantum computing in a nutshell section near the top. But just a few points here, in closing:

  1. Quantum parallelism is the secret sauce of quantum computing.
  2. Much progress has been made, but much more progress is needed.
  3. Yes, we do have real, working quantum computers, but they aren’t even close to being capable of practical quantum computing — enabling production deployment of production-scale practical real-world quantum applications.
  4. We’re still in the pre-commercialization phase of quantum computing where the focus is on research, prototyping, and experimentation.
  5. It’s too soon for commercialization of quantum computing. Too many unanswered questions. Too much research needed.
  6. Maybe in two to three years it might be appropriate to begin the commercialization stage of quantum computing.
  7. Quantum programming is much too difficult for mere mortals. Only the most elite and the lunatic fringe will thrive.
  8. 48 fully-connected near-perfect qubits might be good enough to be close to being a practical quantum computer. A real start, even if not enough for many applications.
  9. Quantum Winter is not imminent, but lots of promises need to be fulfilled over the next two to three years to avoid the level of disappointment and disenchantment which could then lead to a Quantum Winter.
