What Knowledge Is Needed to Deeply Comprehend Quantum Computing?

How much of quantum computing is real and how much is hype, fantasy, and wishful thinking? How can you tell? This informal paper examines what fields of knowledge you need to comprehend in significant depth to fully grasp the capabilities and limitations of quantum computing. Not in a shallow, casual, or superficial way, but in a very deep and very hard-core manner. Down to the deepest levels of the nitty-gritty, both software and hardware, and even down to the underlying physics, theory, and mathematics.

The goal here is to develop a comprehensive framework for asking the right questions about quantum computing, so that we can know what quantum computing really is, what makes it tick, what it really can or can’t do, and what it is best for or maybe even isn’t good for.

This paper won’t have the answers and the details of the knowledge needed to deeply understand quantum computing, but it will list the questions and topics that do appear to be needed to gain such understanding. Future papers will pursue answers and details. The point of this paper is to provide the framework for that pursuit of knowledge.

A fair amount of information in this paper was already presented in a companion paper, Knowledge Needed to Deeply Comprehend Digital Computing, to make the foundation clearer, but this paper should stand by itself with regard to quantum computing.

Work in progress

The lists in this paper will be incrementally updated as my reading, research, and thinking progress. This paper is simply an initial stab at the problem.

Sorry, no specific answers or deep knowledge here

Again, the intention of this paper is not to have all the answers or even any of the answers, or even be an introductory tutorial for quantum computing.

Framework for knowledge needed to deeply comprehend quantum computing

Rather, the goal here is to kick-start the process of developing a full framework for asking the right questions needed to fully and deeply understand quantum computing.

Answers to these questions are indeed the ultimate goal, but unless we endeavor to ask all the right questions no amount of answers will help us achieve enlightenment about quantum computing.

Subsequent work

Update 9/3/2018: Some subsequent papers cover more specific detail:

How does quantum computing compare to traditional digital computing?

Rather than start from scratch and catalog every tiny detail of knowledge needed to deeply comprehend quantum computing, it would seem to make sense to start by looking back at traditional digital computing and assessing how much these two forms of computing have in common.

My initial presumption is that they have a lot in common, possibly even much more in common than they are different. Maybe that’s not exactly true, but it’s a good starting point for inquiry and would certainly save a lot of time and energy if it is in fact true.

To be sure, there are certainly at least some very real and very significant differences.

But it would sure make this paper a lot shorter to be able to focus on simply the differences rather than reinvent the entire wheel of traditional digital computing.

My conjecture is that despite the differences between these two forms of computing there are still a significant number of parallels. Call it comparative computing, if you will.

A companion paper, Knowledge Needed to Deeply Comprehend Digital Computing, already summarizes to a fair level of detail the knowledge needed to deeply understand traditional digital computing. That paper can be used as the foundation and starting point for this paper.

Learning quantum computing

There are two basic models for learning about quantum computing:

  1. Starting with a solid knowledge of all aspects of traditional digital computing. What incremental knowledge of quantum computing is needed to build on that existing foundation?
  2. Starting from scratch. What is the total knowledge needed to deeply comprehend quantum computing? There is likely to be at least some overlap with knowledge of traditional digital computing, but the goal is only that knowledge needed for quantum computing.

Incremental knowledge needed for quantum computing

Given that one is reasonably knowledgeable about traditional digital computing, exactly how much additional knowledge must one master to become equally knowledgeable about quantum computing?

Learning about quantum computing from scratch

Granted, none of us is currently faced with a quantum-only future for computing, and some hybrid with traditional digital computing is likely for the foreseeable future. Even so, it is an enlightening thought experiment to contemplate what knowledge one would need to acquire to deeply comprehend quantum computing if one had no interest in or knowledge of traditional digital computing.

Is the knowledge needed to master quantum computing alone substantially less than the knowledge needed to master traditional digital computing?

Would it be substantially easier for young students to learn only quantum computing?

Might it be better for young students to start with only quantum computing?

Or, is a hybrid world the inevitable reality for the foreseeable future?

Which is the better foundation for the other, quantum computing built on a foundation of traditional digital computing, or traditional digital computing built on a fresh foundation of quantum computing?

Initial questions about quantum computing

I want to be able to answer any and all fundamental questions about quantum computing.

My initial set of questions about quantum computing includes but is certainly not limited to:

[Update: The questions on the list below have been moved to a newer paper, Questions About Quantum Computing. There will be no further updates to the list here. It is provided strictly for historical reference, reflecting the list at the time this paper was originally written.]

  1. What is a quantum computer? Basic definition, but technically robust and free of hype or vagueness.
  2. What is quantum computing? Basic definition, but technically robust and free of hype or vagueness.
  3. How is quantum computing distinct from traditional digital computing?
  4. What can a quantum computer compute that a traditional digital computer cannot?
  5. What can a quantum computer do better than a traditional digital computer?
  6. Is speed the only truly significant advantage of a quantum computer?
  7. Is the concept of digital even relevant to quantum computing?
  8. Do quantum and traditional digital computers have more in common or more that differentiates them from each other?
  9. Is a qubit still considered digital?
  10. Is a qubit still considered discrete?
  11. Is a quantum computer still a digital computer?
  12. What precisely does digital mean?
  13. What precisely does digital computing mean?
  14. Is a quantum computer still a discrete computer (ala digital) or can it compute continuous data as an analog computer does?
  15. Can a quantum computer compute directly from continuous values from analog sensors, such as a voltage, current, temperature, audio, and video, or is an intermediate conversion to a discrete or digital value needed?
  16. How does a quantum computer handle analog to digital and digital to analog conversions?
  17. What operations can a quantum computer perform compared to operations that a traditional digital computer can perform?
  18. What data types, operators, and functions does a quantum computer support, compared to a traditional digital computer?
  19. Does a quantum computer perform Boolean logic (AND, OR, NOT with true and false — not to be confused with binary digits of 0 and 1), such as evaluating complex conditions, comparable to a traditional digital computer?
  20. Does a quantum computer have a processing unit comparable to the central processing unit (CPU) of a traditional digital computer?
  21. Does a quantum computer have an arithmetic and logic processing unit comparable to the arithmetic and logic unit (ALU) of a traditional digital computer?
  22. Does a quantum computer have registers for small amounts of data comparable to a traditional digital computer?
  23. Does a quantum computer have memory (for large volumes of data) comparable to a traditional digital computer?
  24. How is memory of a quantum computer (for large volumes of data) organized?
  25. Do quantum computers support virtual memory and paging?
  26. Does a quantum computer have a byte or word size or data path width comparable to a traditional digital computer (8, 16, 32, 64, or 128)?
  27. Does a quantum computer have an address or pointer width comparable to a traditional digital computer? What width or range?
  28. Can a quantum computer compute values which cannot be represented on a traditional digital computer?
  29. How does a quantum computer represent real numbers (non-integers)?
  30. What are the largest and smallest real numbers (non-integers) that a quantum computer can represent?
  31. How many digits of precision can a quantum computer represent and compute for real numbers (non-integers)?
  32. What number base does a quantum computer use for real numbers (non-integers), 2, 10, or what?
  33. What does a quantum computer compute for 1.0 divided by 3.0, which has an infinite number of repeating digits?
  34. What does a quantum computer compute for 1.0 divided by 3.0 times 3.0 minus 1.0, where 1.0 divided by 3.0 times 3.0 yields 0.9999…?
  35. Does a quantum computer use so-called floating point arithmetic for real numbers like a traditional digital computer? Base 2, or what?
  36. How does a quantum computer compute infinite Taylor series expansions, compared to a traditional digital computer?
  37. What will a quantum computer compute for SQRT(2.0), which is an irrational number with infinite digits? How will it compare to a traditional digital computer?
  38. Can a quantum computer calculate SQRT(-1), otherwise known as i, since quantum mechanics is based on complex and imaginary numbers?
  39. Do quantum computers have more than one precision (bit width) for representing real numbers (non-integers)?
  40. Does a quantum computer compute with complex numbers more efficiently than a traditional digital computer, especially since complex numbers are the basis for quantum mechanics, which quantum computers are supposedly based on?
  41. How much formal knowledge of quantum mechanics does one need to fully and deeply comprehend to fully and deeply grasp all nuances of quantum computing?
  42. Can a quantum computer compute values which cannot be comprehended by a human being?
  43. How are quantum algorithms different from or similar to comparable algorithms of traditional digital computing?
  44. Can all, some, or no quantum algorithms be simulated (albeit much more slowly) on a traditional digital computer? What factors are involved in whether or how effectively any simulation can be performed?
  45. What slowdown factor can or should be expected when simulating a quantum computer (or quantum algorithm) on a traditional digital computer?
  46. What are the more common design patterns for algorithms for quantum computing?
  47. How practical is a quantum computer?
  48. How expensive is a quantum computer for a given task, especially compared to the cost of traditional digital computing?
  49. How much power (energy) does a quantum computer require for a given task?
  50. How large is a quantum computer for a given task?
  51. What technologies and knowledge are needed to design and produce a quantum computer?
  52. What physics knowledge is needed to design and produce a quantum computer?
  53. What physics knowledge is needed to understand how to use a quantum computer?
  54. What physics knowledge is needed to understand how to program a quantum computer?
  55. What mathematics knowledge is needed to design and produce a quantum computer?
  56. What mathematics knowledge is needed to understand how to use a quantum computer?
  57. What mathematics knowledge is needed to understand how to program a quantum computer?
  58. How much knowledge and skill with linear algebra (eigenfunctions, eigenvalues, Fourier transformations) is needed to be very skilled with quantum computing?
  59. How much knowledge and skill with vector spaces and quantum operators is needed to be very skilled with quantum computing?
  60. What kind of operating system is needed to run a quantum computer? What are the major components and features of a quantum operating system?
  61. Is a quantum computer more of a co-processor to be associated with a traditional general purpose digital computer, or can a quantum computer fully replace a traditional general purpose traditional digital computer?
  62. Does it make sense to speak of a grid of interoperating quantum computers or even a distributed cloud of quantum computers, or is each quantum computer a world of its own and unable to interact with another quantum computer except through an intermediary traditional digital computer or other custom traditional digital electronic or optical circuitry?
  63. Can quantum computers be networked comparable to local, wide area, and Internet networking of traditional digital computers?
  64. Does the concept of a website make sense for quantum computing? [imaginary or complex web pages??!! Just kidding.]
  65. Does the concept of networking protocols make sense for quantum computing?
  66. Would traditional network routers be relevant to networking of quantum computers?
  67. Can two or more quantum computers directly exchange qubits via quantum communication, or is some translation to and from digital format required to make the transition?
  68. Is coherence (technically, quantum decoherence) a fundamental limit or upper bound to quantum computing or simply a short-term technical matter that will be resolved soon enough?
  69. What degree of coherence can be expected over the next few to five to ten years?
  70. Is room temperature quantum computing even theoretically practical? How soon?
  71. What temperature of quantum computing will be practical over the next few to five to ten years?
  72. How much data can a quantum computer process, such as a large database or search engine, compared to the disk, flash memory, and main memory of a traditional digital computer, now and for the next few to five to ten years?
  73. What applications are most suitable for quantum computing?
  74. Are all applications suitable for quantum computing?
  75. Are any applications particularly unsuitable for quantum computing?
  76. What specific criteria can be used to determine the degree of suitability of an application for quantum computing?
  77. Can an automated profiling tool be used to determine the degree of suitability of a particular application or algorithm for quantum computing?
  78. What programming languages can be used for quantum computing?
  79. What programming languages are best or optimal for quantum computing?
  80. Is there a machine language and assembly language for quantum computing?
  81. How similar or dissimilar are quantum computers from different labs, designers, or vendors?
  82. What components are standard across all or most quantum computers?
  83. Can quantum computers run on relatively small batteries, or do they need a robust AC power source?
  84. Do quantum computers use direct current (DC)?
  85. What voltage levels do quantum computers operate on?
  86. Is statistical processing the same or different for quantum computing in contrast with traditional digital computing?
  87. How would a quantum computer compute the median (not mean or average, although those are of interest too) of a very large set of numbers or character strings? How would the performance differ from a traditional digital computer?
  88. Are all aspects of mathematics equally applicable to quantum computing and traditional digital computing?
  89. Does cybersecurity apply equally to quantum computing as to traditional digital computing?
  90. What is the quantum computing equivalent of a traditional digital Turing machine?
  91. Would a quantum computer perform natural language processing (NLP) in a qualitatively better way than a traditional digital computer?
  92. What specific aspects of artificial intelligence is quantum computing especially well-adapted for?
  93. What debugging and testing features and tools does a quantum computer provide?
  94. Can a quantum program be single-stepped to see all state changes and logic flow? Or, can this at least be done in a simulator for the quantum computer? Presumably not for the former, hopefully so for the latter.
  95. Can the full state of a quantum computer be dumped or otherwise captured for examination, analysis, debugging, and testing, or does the Heisenberg uncertainty principle and superposition preclude this?
  96. Can a quantum algorithm be interrupted and its state saved and later restored to resume where it left off?
  97. How does fabrication of chips and circuits differ for quantum computing compared to a traditional digital computer?
  98. How is color represented in a quantum computer, compared to RGB and other color models used by traditional digital computing?
  99. Would quantum computers still use pixels for representing images and video?
  100. How would audio be represented and processed in a quantum computer?
  101. Is there a decent and comprehensive glossary for quantum computing? Update: I’ve compiled an initial draft for such a glossary: Quantum Computing Glossary — Introduction.
  102. Is there a decent and comprehensive glossary for all aspects of quantum mechanics that is needed to fully comprehend quantum computing? Update: I’ve compiled an initial draft for such a glossary: Quantum Computing Glossary — Introduction.
  103. Is there a decent and robust introductory overview of quantum computing? Puff pieces and hype not welcome. Wikipedia entry is weak. Dense, academic textbooks have their place, but the question here is a decent introductory overview that is free of hype and adequately explains the technical differences from traditional digital computing.
  104. How much of quantum computing applications can be cost-effectively addressed using massively parallel grids of very small and very cheap traditional digital computers?
  105. Which is more cost effective, owning or leasing quantum computers?
  106. Is time-sharing and on-demand shared online access to quantum computers more cost effective and more efficient than either owning or leasing?
  107. How can the computational complexity of quantum computers best be described — polynomial (P), nondeterministic polynomial (NP), NP-Complete, NP-Hard, or what?
  108. What thought processes are needed to solve problems with a quantum computer, and how do they compare or contrast with the thought processes for solving problems with traditional digital computers?
  109. Is the concept of a user interface or user experience (UX) relevant to quantum computing?
  110. What logic gates are needed for quantum computing and how do they compare to the logic gates of traditional digital computing?
  111. What knowledge of logic gates is needed to develop algorithms and program applications for a quantum computer? Or are there more programmer-friendly higher-level language operators?
  112. What is the smallest quantum computing gate possible compared to the smallest traditional digital computing gate?
  113. What is the smallest qubit possible compared to the smallest traditional digital computing bit or memory cell?
  114. What is the fastest quantum computing gate possible compared to the fastest traditional digital computing gate?
  115. What is the fastest qubit possible compared to the fastest traditional digital computing bit or memory cell?
  116. What is the shortest quantum computing gate connection possible compared to the shortest traditional digital computing gate connection?
  117. What is the thinnest quantum computing gate connection possible compared to the thinnest traditional digital computing gate connection?
  118. Are there frameworks for quantum computing applications?
  119. Does the concept of software still apply for quantum computing? How might it differ from traditional digital computing?
  120. Does the concept of the software development life cycle (SDLC) (or systems software development life cycle) still apply to quantum computing? How might it differ from traditional digital computing?
  121. Does the concept of software engineering still apply to quantum computing? How might it differ from traditional digital computing?
  122. What problems are easiest to formulate for a quantum computer?
  123. What’s the preferred abbreviation for the term quantum computer?
  124. Is qbit still an acceptable variant of the term qubit?
  125. Is there a precise, correct technical term for each of the superposition terms or states in the value of a qubit? What’s a piece of a qubit?
  126. What is the precise technical term for a memory cell that holds a qubit as opposed to the value which is held in such a memory cell?
  127. How much energy is needed to represent a single qubit?
  128. How much energy is needed to represent a single value of a superposition in a qubit?
  129. Is quantum computing inherently a lot more energy efficient than traditional digital computing?
  130. Is superposition essentially free and cheap, or inherently costly?
  131. What is the technical definition of a quantum processor? How does that differ from a full quantum computer? Block diagram please.
  132. Is the cloud the best place for a quantum computer to live, as opposed to the office, your desk, or your hand? What are reasonable places for a quantum computer to live?
  133. Any github repositories for open source quantum computing work? Algorithms, code, libraries, etc.
  134. Are there any open source quantum computer designs?
  135. What are some examples of what a quantum computer could do given Hubble space telescope data (or other space telescopes)? Would they work best with the raw data or would some preprocessing be needed? Would one or more quantum computers be needed for all processing of space telescope data?
  136. Who are the top quantum computing vendors?
  137. Who are the hot startups in the quantum computing space?
  138. What are the use cases for quantum computing? Types, categories, or styles of applications.
  139. Who might use a quantum computer?
  140. How might someone use a quantum computer?
  141. Are quantum computers really probabilistic rather than deterministic? What does that actually mean? How does that limit or expand capabilities and applications?
  142. Is general-purpose quantum computing an oxymoron? Can it ever be achieved? Will it never be achieved? Based on what thinking, rationale, or logic?
  143. Can a Turing machine be simulated (or just implemented) on a quantum computer?
  144. Can finite state automata and pushdown automata be simulated or just implemented on a quantum computer?
  145. What stage of development has quantum computing achieved? Compared to traditional digital computing, what year or decade is quantum computing at — 1950, 1960, 1940, 1930??
  146. What technological, theoretical, and physical issues are constraining quantum computing at this stage?
  147. In what areas are dramatic breakthroughs required before quantum computing can come out of the shadows?
  148. How deeply does one need to comprehend Bell’s inequality to deeply comprehend quantum computing?
  149. Are all forms of quantum computers based on spin states? Could a quantum computer be based on a quantum state other than spin? Is spin only one of many possibilities for quantum computing?
  150. How much does one need to know about magnetism to comprehend quantum computing?
  151. Is the knowledge needed to master quantum computing substantially less than the knowledge needed to master traditional digital computing?
  152. Would it be substantially easier for young students to learn only quantum computing?
  153. Might it be better for young students to start with only quantum computing?
  154. Is the reality that for the foreseeable future a hybrid world is the inevitable reality?
  155. Is a hybrid of quantum computing and traditional digital computing a really good thing, a really bad thing, or an indeterminate mixed bag?
  156. Is SQL and the relational database model equally relevant for quantum computing, less relevant, or even more relevant?
  157. Will quantum computing mean the end of strong encryption?
  158. How credible is most of the hype about quantum computing?
  159. How much of the narratives about the promise of quantum computing is credible or has been proven to be true?
  160. How large a program can a quantum computer accommodate? How many instructions or operations, or lines of code?
  161. How complex can a quantum algorithm be?
  162. Do quantum computers support function calls and recursion?
  163. Do quantum computers support arrays, structures, and objects?
  164. How is code represented in a quantum computer? Same as in a traditional digital computer (bits) or using quantum computing qubits?
  165. Does a quantum computer have the equivalent of processes and threads of traditional digital computing?
  166. Does or could a quantum computer have a file system?
  167. Does or could a quantum computer have mass storage? Is it persistent or transient (only while a program is running)?
  168. How fast can a quantum computer count?
  169. How fast can a quantum computer divide two numbers? Such as to verify whether X is a factor of Y.
  170. What arithmetic and mathematical operations are most natural for a quantum computer?
  171. Do quantum computers have any special advantage for arithmetic and mathematics, other than raw speed of basic operations?
  172. How much of information theory (Shannon, et al) is still relevant to quantum computing? Or how might it be different (e.g., quantum communication)?
  173. How consequential is the length of interconnections between qubits? Does packing them much more densely yield a large benefit? Does separating them a modest amount have a large penalty?
  174. Might wafer-scale integration (WSI) have any significant benefit, performance or otherwise?
  175. Does a quantum computer have to operate within a Faraday cage, to shield from ambient electromagnetic radiation? Is just the core quantum computer itself, the qubits, enclosed within a Faraday cage, or does the whole operating environment or entire room have to be in a Faraday cage?
  176. Do qubits require Josephson junctions, or is that just one option?

This informal paper will not attempt to answer any of those or other questions. That will be left for future papers. The goal here is simply to present a framework and point of departure for further inquiry into the nature of quantum computing.

I considered organizing that list into categories with separate lists, but the categories would be somewhat blurred and even overlap [like a superposition of quantum states!], so I opted to stick with a simpler single list, at least for now, but as work progresses a clearer categorization may emerge.

That list also lacks any clear order. I originally attempted to order it, but a lot of questions defied any clear order, so it ended up simply being the order in which the questions occurred to me. Over time the ordering may improve.
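Several of the questions above (such as 44 and 45) ask whether and how well quantum algorithms can be simulated on a traditional digital computer. As a hypothetical sketch (illustrative function names of my own, not any particular library's API), a minimal classical state-vector simulator makes the core cost visible: the state of n qubits is a vector of 2^n complex amplitudes, so memory and per-gate work grow exponentially with n.

```python
import math

# Toy classical state-vector simulation of n qubits. The state is a list of
# 2**n complex amplitudes, which is the concrete reason classical simulation
# slows down (and runs out of memory) so sharply as qubit counts grow.

def zero_state(n):
    """State vector for n qubits all initialized to |0...0>."""
    state = [0j] * (2 ** n)   # 2**n amplitudes -- exponential in n
    state[0] = 1 + 0j
    return state

def apply_hadamard(state, target, n):
    """Apply a Hadamard gate to one qubit, touching all 2**n amplitudes."""
    h = 1 / math.sqrt(2)
    new_state = state[:]
    for i in range(2 ** n):
        if not (i >> target) & 1:        # index i has target bit = 0
            j = i | (1 << target)        # partner index with target bit = 1
            new_state[i] = h * (state[i] + state[j])
            new_state[j] = h * (state[i] - state[j])
    return new_state
```

Even this toy version shows the slowdown factor in question 45: each added qubit doubles the vector, so simulating even 50 qubits this way would require petabytes of memory.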

Classic or traditional digital computing

We need a decent term to refer to non-quantum computers — the kind of computers most of us use today.

An unresolved issue here is whether there is any distinction between the terms:

  1. Digital computing, digital computer.
  2. Traditional digital computing, traditional digital computer.
  3. Classic computing, classic computer.
  4. Classical computing, classical computer.
  5. Classical digital computing, classical digital computer.
  6. Ordinary computing, ordinary computer.
  7. Regular computing, regular computer.
  8. Existing computing, existing computer.
  9. Standard computing, standard computer.

I’ve seen the term classical computing used in online discussion of quantum computing.

At this stage, it does seem that traditional and classical (or classic) are all equally valid modifiers to distinguish traditional digital computing from newfangled quantum computing.

That said, I’ll stick with traditional digital computing in my own writing, to distinguish that it is strictly digital and that it is not the new form of computing currently called quantum computing.

I also see that the terms quantum algorithms and classical algorithms are now used as well.

What is a quantum computer?

This paper is not intended to answer any questions about quantum computing, including basic definitions such as the meaning of the terms quantum computer and quantum computing.

But a future paper will indeed provide a robust and technically accurate definition for a quantum computer.

What is a digital computer?

The original intention of the term digital computer was to contrast with analog computer. The latter processes information as continuously varying electrical signals, commonly voltages, while the former processes information in discrete values, commonly as integers represented as sequences of the binary digits 0 and 1.
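To make that discrete-versus-continuous contrast concrete, here is a toy sketch of analog-to-digital conversion, the step that maps a continuous voltage onto one of a fixed number of discrete levels. The function name and parameter values are illustrative assumptions, not any specific hardware's interface.

```python
# Toy analog-to-digital converter (ADC): a continuous voltage in [v_min, v_max]
# is mapped to one of 2**bits discrete integer levels. This quantization is
# what turns an analog signal into the discrete values a digital computer uses.

def quantize(voltage, v_min=0.0, v_max=5.0, bits=8):
    """Map a continuous voltage onto one of 2**bits discrete levels."""
    levels = 2 ** bits
    clamped = max(v_min, min(v_max, voltage))   # clip out-of-range inputs
    return round((clamped - v_min) / (v_max - v_min) * (levels - 1))
```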

Generally a digital computer is an electronic digital computer, meaning that information is represented and transferred in the form of electrons. Although information can also be stored magnetically (disk and tape), which technically is not electronic.

Older computers were electromechanical, such as using relays.

Some computing research seeks to create photonic computers, also known as optical computers. They would still be digital computers, but would not be electronic digital computers since they would rely primarily on photons rather than electrons.

Is a quantum computer a digital computer?

Whether quantum computers are still digital computers is an open question. Their reliance on the quantum mechanical property of superposition means that, technically, they are not operating on discrete values, since a given value can be a superposition of many values.

In a traditional, electronic digital computer a bit is either a 0 or a 1, one or the other, while in a quantum computer a qubit can be in a superposition of states rather than in only a single state.
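As a toy illustration of that contrast (a classical simulation, not how quantum hardware actually works), a single qubit can be modeled as a pair of complex amplitudes (a, b) with |a|² + |b|² = 1. Measurement yields 0 with probability |a|² and 1 with probability |b|², collapsing the superposition to a single classical bit:

```python
import random

# Toy model of measuring a single qubit with amplitudes (a, b), where
# |a|**2 + |b|**2 = 1. A classical bit is always 0 or 1; a qubit holds both
# amplitudes until measurement forces a single classical outcome.

def measure(a, b, rng=random.random):
    """Simulate measuring a qubit; returns a classical 0 or 1."""
    p0 = abs(a) ** 2                  # probability of observing 0
    return 0 if rng() < p0 else 1
```

For example, a qubit with amplitudes (1, 0) always measures as 0, while an equal superposition (1/√2, 1/√2) yields 0 or 1 with equal probability.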

This paper is not intending to take a firm stance on whether a quantum computer is or isn’t still a digital computer. A follow-up paper will indeed take a clear position on this matter, although it may end up being a somewhat arbitrary distinction and even a matter of debate and dispute.

Generally, data fed into a quantum computer or read out of a quantum computer will be in some digital format that will indeed be discrete values composed of sequences of strictly digital, binary 0’s and 1’s, but I/O generally isn’t the basis for defining how a computer operates, internally.

What does digital mean in the context of quantum computers?

Beyond how a digital computer processes and represents information internally, business, commerce, government, and consumers now refer to digital in many ways, most commonly simply to distinguish products and services which exist only online or only in electronic form, in contrast with real or physical products and services.

Also, media such as photos, images, audio, and video are now captured and stored in digital form rather than the analog forms common in the 1960’s.

Technically, that use of digital is still fully compatible with how a digital computer functions internally since data is still represented as discrete values.

But in the world of quantum computers, will digital in the real world still have the same meaning?

Technically, as currently implemented, today’s quantum computers process information internally in quantum format which uses superposition rather than discrete digital values, but the process of transferring information into a quantum computer or out of a quantum computer currently requires conversion from and to traditional, discrete digital format.

At least for the foreseeable future, any interactions between people or businesses in the real world and quantum computers will still require the use of digital data formats and protocols, so referring to businesses, products, and services as digital will still be sensible even in the world of quantum computing.

But data inside of a quantum computer would commonly not be discrete or technically digital.

In short, information on the outside would still be digital, but information on the inside would not be digital, strictly speaking.
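A toy sketch of that boundary, again in plain Python and purely as a classical simulation: the state inside is continuous complex amplitudes, and only the act of measurement produces the discrete, digital 0 or 1 that the outside world sees.

```python
import math
import random

# State inside the machine: continuous complex amplitudes (not digital).
amplitudes = [complex(1 / math.sqrt(2), 0),  # amplitude of |0>
              complex(1 / math.sqrt(2), 0)]  # amplitude of |1>

def measure(amps, rng=random.random):
    """Collapse a single-qubit state to a discrete, digital 0 or 1."""
    p0 = abs(amps[0]) ** 2
    return 0 if rng() < p0 else 1

# What crosses the boundary to the outside world is strictly digital.
readout = measure(amplitudes)
assert readout in (0, 1)
```

The function and variable names here are my own, for illustration only; real quantum hardware performs this readout physically, not in software.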

Definition of quantum computing

There are already a variety of definitions for the terms quantum computing and quantum computer floating around, but I find them vague and otherwise lacking in precision and failing to enlighten anyone in any solidly meaningful technical sense.

People can read those definitions and pretend that they know what quantum computing is or must be, but they will be deluding themselves. That’s the nature of the hype phase for a new technology.

Eventually I will indeed offer up my own definitions for these and related terms, but for now I’ll endeavor to exert a little discipline and refrain from prematurely adding to the noise around quantum computing.

For now, all I will say is that there is a need for much better definitions of these terms.

That said, here are two definitions (or at least characterizations) from Gartner, a fairly reputable information technology research organization:

  • A quantum computer uses atomic quantum states to effect computation. Data is held in qubits (quantum bits), which have the ability to hold all possible states simultaneously. This property, known as “superposition,” gives quantum computers the ability to operate exponentially faster than conventional computers as word length is increased. Data held in qubits is affected by data held in other qubits, even when physically separated. This effect is known as “entanglement.” Achieving both superposition and entanglement is extremely challenging.
  • https://www.gartner.com/it-glossary/quantum-computing/

Also from Gartner:

  • Quantum computing is a type of nonclassical computing that is based on the quantum state of subatomic particles. Quantum computing is fundamentally different from classic computers, which operate using binary bits. This means the bits are either 0 or 1, true or false, positive or negative. However, in quantum computing, the bit is referred to as a quantum bit or qubit. Unlike the strictly binary bits of classic computing, qubits can represent 1 or 0 or a superposition of both partly 0 and partly 1 at the same time.
  • https://www.gartner.com/smarterwithgartner/the-cios-guide-to-quantum-computing/

Consistent with my previous comment, these are not bad definitions per se, and may be quite reasonable given the current state of hype, but nonetheless I find them a bit too vague, simplistic, unrealistic, and unenlightening — for my own personal tastes.
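As a toy illustration of the entanglement mentioned in the first Gartner definition, here is a plain-Python classical simulation (the names and structure are my own, not any standard API) of measuring a two-qubit Bell state, whose two measurement outcomes are always perfectly correlated:

```python
import math
import random

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>)/sqrt(2): amplitude only on |00> and |11>.
s = 1 / math.sqrt(2)
bell = [s, 0.0, 0.0, s]

def measure_pair(amps):
    """Sample a joint measurement outcome for both qubits."""
    probs = [abs(a) ** 2 for a in amps]
    r = random.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i >> 1, i & 1  # (first qubit, second qubit)
    return 1, 1

# The two readouts always agree: measuring one qubit fully
# determines the other, no matter how far apart they are.
for _ in range(1000):
    a, b = measure_pair(bell)
    assert a == b
```

Of course, a classical simulation like this cannot capture what makes entanglement physically remarkable; it only shows the correlation structure of the outcomes.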

Roles which need to comprehend quantum computing

Not everybody who uses, programs, designs, builds, purchases, deploys, maintains, or repairs a quantum computer will need to have the same full depth of knowledge of all aspects of quantum computing.

A robust but not necessarily comprehensive sampling of the roles includes:

  1. Scientists. Especially physicists.
  2. Engineers. Electrical engineers. Mechanical engineers.
  3. Mathematicians.
  4. Software designers.
  5. Software developers. Programmers. Coders.
  6. Software engineers.
  7. Software test engineers.
  8. Software testers.
  9. Application software developers.
  10. Technical writers.
  11. Managers.
  12. Project managers.
  13. Technical supervisors.
  14. Middle level technical managers.
  15. Product managers.
  16. Corporate executive management.
  17. User experience designers.
  18. Test engineers.
  19. IT specialists.
  20. Managers of IT specialists.
  21. Chief technology officers.
  22. Users.
  23. Managers of users.
  24. Buyers of computers and software.
  25. Marketing.
  26. Sales.
  27. Investors.
  28. Shareholders.
  29. Public policy staff.
  30. Government officials.
  31. Legislators.

How much or what each of these roles needs to know about quantum computing remains to be seen.

Fields needed to fully and deeply comprehend quantum computing

What fields will one have to be fairly knowledgeable about to fully and deeply comprehend quantum computing?

The list presented here is the comparable list for traditional digital computing. The open issue is which of these fields is also needed to comprehend the full scope of quantum computing.

Think of it as a checklist of the fields to be considered. It may well be that any given field on this list is not needed, but making that determination will require a fairly deep comprehension of quantum computing, which I do not yet possess.

This is a semi-comprehensive list of the fields of study that are likely to encompass the vast bulk of everything that goes on inside of a traditional digital computer:

  1. Physics
  2. Chemistry of materials
  3. Semiconductor fabrication
  4. Electrical engineering
  5. Computer engineering
  6. Mechanical engineering
  7. Robotics
  8. Mathematics
  9. Computer science
  10. Software engineering
  11. Circuit design — digital and analog
  12. Energy sources — AC, batteries, solar, power management, and heat dissipation
  13. Psychology
  14. Graphic design
  15. User experience design
  16. Marketing
  17. Sales
  18. Support
  19. Philosophy
  20. Law
  21. Sociology
  22. Neuroscience

General areas

These are some or most of the general areas of knowledge and expertise needed to fully and deeply comprehend traditional digital computing; it remains to be seen which are equally relevant or relevant at all to a deep comprehension of quantum computing:

  1. Physics — Newtonian mechanics, electricity, magnetism
  2. Physics — Solid state physics
  3. Physics — Quantum mechanics
  4. Physics — Quantum field theory
  5. Physics — Flow of electrons in wires
  6. Physics — Flow of photons in waveguides, optical cables
  7. Physics — Flow of heat
  8. Chemistry of materials — Conductors
  9. Chemistry of materials — Insulators
  10. Chemistry of materials — Semiconductors
  11. Chemistry of materials — Batteries
  12. Chemistry of materials — Light sensitivity
  13. Chemistry of materials — Color properties
  14. Chemistry of materials — Solutions and solvents
  15. Software design
  16. Software architecture
  17. Math — Number theory
  18. Math — Real numbers
  19. Math — Rational numbers
  20. Math — Irrational numbers
  21. Math — Sets
  22. Math — Complex numbers
  23. Math — Algebra
  24. Math — Geometry
  25. Math — Trigonometry
  26. Math — Transcendental functions
  27. Math — Calculus
  28. Math — Probability
  29. Math — Statistics
  30. Math — Logic
  31. Math — Computational complexity
  32. Math — Linear algebra
  33. Computer science — Algorithms
  34. Computer science — Data structures
  35. Databases
  36. Distributed databases
  37. Blockchain and other distributed transaction ledger technologies
  38. Data modeling
  39. Search engines
  40. Artificial intelligence (AI)
  41. Machine intelligence
  42. Machine learning

Specific topic areas

These are many of the specific topic areas that span the full range of knowledge and expertise needed to fully master all aspects of traditional digital computing; it remains to be seen which are equally relevant or relevant at all to a deep comprehension of quantum computing:

  1. Electrons
  2. Photons
  3. Waves and particles
  4. Charge
  5. Spin
  6. Special relativity
  7. General relativity
  8. Quantum nature of electromagnetic radiation
  9. Electric field
  10. Magnetic field
  11. Potential
  12. Voltage
  13. Current
  14. Resistance
  15. Capacitance
  16. Magnetism for storage
  17. Discrete electronic components
  18. Data sheets for specification of electronic components
  19. Design of logic gates
  20. Design with logic gates
  21. Resistors, capacitors, inductors, diodes, transistors, crystals, switches
  22. Design of transistors
  23. Design with transistors
  24. Physics of transistors
  25. Analog electronic components
  26. Wires and cables
  27. Plugs, sockets, and connectors
  28. Switches and buttons
  29. Electronic digital computing
  30. Photonic digital computing
  31. The many ways a bit can be represented, stored, transferred, and operated on.
  32. Bits, bytes, words, nibbles, bit strings, characters, character strings.
  33. Decimal, binary, hex, octal, Boolean, enumeration, integer, floating point, arbitrary decimal precision, and complex number and data representations.
  34. Distinction between binary, bit, and Boolean
  35. What precisely does digital mean?
  36. What precisely does digital computing mean?
  37. Memory models
  38. Transient memory
  39. Static memory
  40. Dynamic memory
  41. Memory refresh
  42. Associative memory
  43. Parity and parity errors
  44. Error correction code (ECC) memory
  45. Cache
  46. Data cache
  47. Instruction cache
  48. Random access memory
  49. Memory management — pages and segments, protection, sharing
  50. Virtual memory and paging
  51. Flash memory
  52. Mass storage
  53. Rotating storage, spinning storage
  54. Latency
  55. Seek time
  56. Tape storage
  57. Central processing unit (CPU)
  58. Arithmetic and Logic Unit (ALU)
  59. Chip design
  60. Chip layout
  61. Circuit board layout
  62. Circuit board fabrication
  63. Encryption
  64. Language translation
  65. Software modules
  66. Software APIs
  67. Code libraries
  68. Application frameworks
  69. Network service APIs
  70. REST APIs
  71. Software Testing
  72. Software performance characterization and testing
  73. Software capacity characterization and testing
  74. Algorithm design
  75. Specific algorithms
  76. Computer science — Complexity theory
  77. Arithmetic expressions, operators, and functions
  78. Boolean logic expressions, operators, and functions
  79. Bit and bit string expressions, operators, and functions
  80. Logical expressions, operators, and functions
  81. Character expressions, operators, and functions
  82. String expressions, operators, and functions
  83. Operations
  84. Operation codes (opcodes)
  85. Data types
  86. Data type conversion — implicit and explicit
  87. Language theory
  88. Languages — Human
  89. Languages — Computer programming
  90. Languages — specialized
  91. Language grammars
  92. Unrestricted languages and grammars, Type-0 languages
  93. Context-sensitive languages and grammars, Type-1 languages
  94. Context-free languages and grammars, Type-2 languages
  95. Regular languages and grammars, Type-3 languages
  96. Regular expressions, regex
  97. Language parsing
  98. Language lexemes and tokens
  99. Language parse trees
  100. Language syntax
  101. Language semantics
  102. Language meaning
  103. Compilers
  104. Language translators
  105. Code generation, code generators
  106. Code optimization
  107. Runtime systems
  108. Interpreters
  109. Virtual machines
  110. Knowledge
  111. Knowledge representation
  112. Knowledge semantics
  113. Knowledge meaning
  114. AI
  115. Machine intelligence
  116. Machine learning
  117. Sensors
  118. Real-time systems and programming
  119. Input and output devices
  120. Internet of Things (IoT) devices
  121. Math — Computability theory
  122. Math — NP-complete
  123. Coding algorithms
  124. Specific programming languages
  125. High-level languages
  126. Assembly language programming
  127. Machine language programming
  128. Layout of machine code
  129. Dumps, hex dumps
  130. Command languages
  131. Shell scripts
  132. Protocols for data transmission
  133. Data persistence
  134. State
  135. Data formats
  136. Data formats for transmission
  137. Data formats for storage
  138. Data structures in general
  139. Data structure fields
  140. Data structure design
  141. Specific data structures — arrays, lists, sets, maps, heaps, hash tables, graphs, directed graphs, directed acyclic graph, trees, Merkle trees, blocks, records, rows, columns
  142. Objects and classes
  143. Object-oriented programming
  144. Class functions
  145. Constants
  146. Symbolic constants
  147. Arrays
  148. Lists
  149. Trees
  150. Records
  151. Rows
  152. Packets
  153. Database principles
  154. Data model design principles
  155. Data consistency
  156. ACID data principles
  157. Relational database principles
  158. Query languages
  159. SQL
  160. NoSQL
  161. Security
  162. Privacy
  163. Data protection
  164. Data privileges
  165. Code protection
  166. Code privileges
  167. Code synchronization
  168. Distributed computing
  169. CAP theorem
  170. Data partition
  171. Data replication
  172. Server mirroring
  173. Data centers
  174. Cloud computing
  175. Clusters
  176. Nodes
  177. Single point of failure (SPOF)
  178. Client applications
  179. Middleware
  180. Writing requirements specifications
  181. Writing design specifications
  182. Writing architectural specifications
  183. Designing user interfaces
  184. Testing software
  185. Installing a computer
  186. Using a computer
  187. Maintaining a computer
  188. Human computer interaction (HCI)
  189. Human factors
  190. Response time
  191. Latency
  192. Automata
  193. Turing machines
  194. Finite state machine, automaton
  195. Pushdown automaton
  196. Co-processors
  197. Auxiliary processing units
  198. GPU (Graphics Processing Unit)
  199. Central processor architecture
  200. Central processor block diagram
  201. Multi-core processors
  202. Hyperthreaded processing
  203. Parallel processing
  204. Supercomputers
  205. Microprocessors
  206. Embedded computers
  207. Data flow diagrams
  208. Flow charts
  209. Block diagrams
  210. Abstraction
  211. Representation
  212. Markup languages
  213. Specific markup languages — SGML, HTML, XML
  214. Graphic representations — 2D and 3D
  215. Picture and photographic representations
  216. Audio and video representations
  217. Analog and digital conversion for speakers, microphones, and cameras (still and motion video)
  218. Color models — RGB, HSV, HSL, CMYK
  219. Pixels
  220. Displays
  221. Keyboards
  222. Pointing devices
  223. Software tools
  224. Source code control
  225. Configuration management
  226. Debugging features and tools
  227. Testing features and tools
  228. Nondeterministic polynomial time (NP) problems
  229. NP-Complete problems
  230. NP-Hard problems
  231. Polynomial time (P) problems
  232. Church-Turing thesis
  233. Turing test for intelligence
  234. Hilbert space
  235. Hamiltonian energy operator
  236. Hermitian operators
  237. Adjoint operators
  238. Fourier transforms
  239. Heat flow
  240. Heat dissipation
  241. Cooling
  242. Websites, web pages
  243. Blogs
  244. E-commerce
  245. Email
  246. Social media
  247. Backup
  248. Checkpoint and restart
  249. Undo and redo
  250. Audit trail logging
  251. Event tracing
  252. Information theory (Shannon, et al) and the intersection of communication and computing; data and information coding
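As a small concrete instance of the last item on the list, Shannon information theory: a sketch of computing the Shannon entropy of a symbol sequence in Python (the function name here is my own, for illustration):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy, in bits per symbol, of a sequence of symbols."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair two-symbol source carries 1 bit per symbol;
# a constant source carries no information at all.
assert abs(shannon_entropy("01" * 100) - 1.0) < 1e-9
assert shannon_entropy("0000") == 0.0
```

Entropy of this kind is the yardstick behind data compression and channel capacity, which is why information theory sits at the intersection of communication and computing.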

Specific questions, unknowns, or issues

These are some of the specific questions and issues beyond all of the general and specific areas of interest to traditional digital computing; it remains to be seen which are equally relevant or relevant at all to a deep comprehension of quantum computing:

  1. Smallest transistor possible.
  2. Fastest transistor possible.
  3. Smallest logic gates (AND, OR, NOT, flip flop) possible.
  4. Fastest logic gates possible.
  5. Thinnest wire or connection possible between gates.
  6. Shortest wire or connection possible between gates.
  7. Smallest memory cell possible.
  8. Fastest memory cell possible.
  9. Impact of cosmic radiation on electronic components.
  10. Impact of background radiation on electronic components.
  11. Impact of impurities on electronic components.
  12. Speed of electron flow within and between electronic components.
  13. Impact of speed of light on electronic circuits.
  14. Impact of neutrinos on electronic circuits.
  15. Generation of random numbers. Quality of randomness or pseudorandomness.
  16. Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by even the most sophisticated professional or certified genius.
  17. Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by an average professional.
  18. Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by even a team of the most sophisticated professionals.
  19. Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by a team of average professionals.
  20. All of the above applied to photonic components as well.
  21. What makes metals such as copper, aluminum, silver, and gold be conductors?
  22. What makes materials such as silicon dioxide be insulators?
  23. What makes materials be semiconductors?
  24. What thought processes are needed to solve problems with a traditional digital computer?
  25. How does a transistor really work (physics)?
  26. What’s so special about electrons (physics)?

Quantum mechanics

My presumption is that a deep understanding of quantum mechanics is essential to a deep understanding of quantum computing. Again, my interest is not in understanding quantum computing at a shallow level, the way an undergraduate computer science major would understand traditional digital computing, but in understanding it at a deep enough level to sift through and dispel any and all hype.

Resources

There are so many online resources for quantum computing that it is difficult to know where to start. To a large extent, it depends on what your objectives and immediate needs are.

This paper won’t endeavor to catalog or even highlight the more notable resources.

But I can’t resist linking to Prof. Richard Feynman’s watershed 1982 paper, Simulating Physics with Computers.

The official publication page:

A (bootleg?) copy of the full journal article:

A related article:

Again, the goal here is not to provide the reader with answers on quantum computing, or even attempt to point them in the right direction, but simply to begin developing a framework of the right questions to be asking about quantum computing.

But, I can’t resist providing at least a few resources…

Wikipedia is notorious for being a dubious source of information about anything, but for anybody with at least half a brain capable of properly filtering dubious input it is always a semi-decent starting point:

There is a separate Wikipedia page for a Timeline of quantum computing, which lists notable events, breakthroughs, and commercial offerings in the history of quantum computing, and is kept reasonably up to date:

IBM is doing a lot of work in quantum computing:

Microsoft as well:

As might be expected, Google is right there as well:

Many, if not most, premier academic institutions are deep into all things quantum.

MIT:

CalTech:

Stanford:

For a relatively complete list of major academic institutions with quantum computing programs:

Books? Yes, there are quite a few books on quantum computing. A Google search for “quantum computing books” will quickly highlight many of them, some even available as free PDFs. Personally, I’m more interested in informal or web-style resources rather than dense, formal textbooks, which can be quite expensive. In any case, I’ll personally refrain from recommending specific books on quantum computing, at least for now.

I’m eager to dig into the variety of startups in the quantum computing space, but I don’t have a decent list to present at this time. I will, however, mention one that gets a fair amount of press (and hype?), D-Wave Systems:

What’s next?

As mentioned upfront, this informal paper is a work in progress, so I’ll continue to update the lists in this paper as I dig deeper into quantum computing and realize how many points of commonality or difference there really are between quantum and traditional digital computing.

I won’t be attempting to provide hard, definitive answers to any of the questions in this paper — that will be left to yet another paper or multiple papers, depending on how simple or complex the correspondence between quantum and traditional digital computing turns out to really be.

I’m also interested in compiling a relatively comprehensive glossary of quantum computing terms which have distinct meanings from similar terms in traditional digital computing, as well as a glossary of the subset of quantum mechanical terms which are essential to a deep comprehension of quantum computing. Update: I’ve compiled an initial draft for such a glossary: Quantum Computing Glossary — Introduction.

At some point I intend to produce a What is Quantum Computing? paper. The puff pieces and hype that are out there, even the Wikipedia page, are all woefully inadequate, in my own view.

Meanwhile, my main focus in the near term will be on completing the three MIT online courses that cover quantum mechanics, or at least reviewing them. Even though I have no expectation of truly mastering the material, I do expect to learn enough to pass judgment on the veracity of a lot of the claims that are being made about quantum computing, at least from the perspective of the underlying quantum mechanical theory.

Once I get basic quantum mechanics nailed, then I will move on to slogging through a few of the various online courses on quantum computing in particular.

And I am very interested in cataloging the startup space.

How much of all of this I will succeed at completing, and in what timeframe, remains to be seen. My hope is that my preliminary list of questions about quantum computing will be sufficient to kick-start the thought and discussion process even if I accomplish very little beyond that.

Update 9/3/2018: Some subsequent papers cover more specific detail:
