Knowledge Needed to Deeply Comprehend Digital Computing

Jack Krupansky
19 min read · Feb 20, 2018


What fields of knowledge do you need to comprehend in significant depth to fully grasp the capabilities and limitations of digital computing? Not in a shallow, casual, or superficial way, but in a very deep and very hard-core manner. Down to the deepest levels of the nitty-gritty.

My main motivation here is to lay the groundwork for asking the same question about quantum computing since it is so new and so over-hyped that a superficial examination won’t tell us how much of the hype is real and how much is mere fantasy and wishful thinking.

The theory here is that by asking similar questions about traditional digital computing we can develop a more comprehensive framework for asking the right questions about quantum computing, so that we can know what quantum computing really is, what makes it tick, what it really can or can’t do, and what it is best for.

As a side effect, the lists related to traditional digital computing will provide a reasonable checklist for evaluating any academic program which seeks to fully train computer scientists and software engineers.

Initial questions about quantum computing

I want to be able to answer any and all fundamental questions about quantum computing.

The purpose of presenting a fairly detailed list of questions about quantum computing up front, in a paper nominally about traditional digital computing, is to make it clear why I am bothering to ask similar questions about traditional digital computing at all.

My conjecture is that despite the differences between these two forms of computing there are still a significant number of parallels. Call it comparative computing, if you will.

[NOTE: An updated list of these questions is contained in a follow-on paper, What Knowledge Is Needed to Deeply Comprehend Quantum Computing?. The list presented here is still valid, but somewhat out of date and will become more outdated as the list in that other paper continues to be updated.]

My initial set of questions about quantum computing includes but is certainly not limited to:

  1. What is a quantum computer? Basic definition.
  2. What is quantum computing? Basic definition.
  3. How is quantum computing distinct from traditional digital computing?
  4. What can a quantum computer compute that a traditional digital computer cannot?
  5. What can a quantum computer do better than a traditional digital computer?
  6. Is speed the only truly significant advantage of a quantum computer?
  7. Is the concept of digital even relevant to quantum computing?
  8. Do quantum and traditional digital computers have more in common or more that differentiates them from each other?
  9. Is a qubit still considered digital?
  10. Is a qubit still considered discrete?
  11. Is a quantum computer still a digital computer?
  12. What precisely does digital mean?
  13. What precisely does digital computing mean?
  14. Is a quantum computer still a discrete computer (à la digital), or can it compute continuous data as an analog computer does?
  15. Can a quantum computer compute directly from continuous values from analog sensors, such as a voltage, current, or temperature, or is an intermediate conversion to a discrete or digital value needed?
  16. How does a quantum computer handle analog to digital and digital to analog conversions?
  17. What operations can a quantum computer perform compared to operations that a traditional digital computer can perform?
  18. What data types, operators, and functions does a quantum computer support, compared to a traditional digital computer?
  19. Does a quantum computer perform Boolean logic (AND, OR, NOT with true and false — not to be confused with binary digits of 0 and 1), such as evaluating complex conditions, comparable to a traditional digital computer?
  20. Does a quantum computer have a processing unit comparable to the central processing unit (CPU) of a traditional digital computer?
  21. Does a quantum computer have an arithmetic and logic processing unit comparable to an arithmetic and logic unit (ALU) of a traditional digital computer?
  22. Does a quantum computer have memory (for large volumes of data) comparable to a traditional digital computer?
  23. How is memory of a quantum computer (for large volumes of data) organized?
  24. Does a quantum computer have a byte or word size or data path width comparable to a traditional digital computer (8, 16, 32, 64, or 128 bits)?
  25. Can a quantum computer compute values which cannot be represented on a traditional digital computer?
  26. How does a quantum computer represent real numbers?
  27. What are the largest and smallest real numbers (non-integers) that a quantum computer can represent?
  28. How many digits of precision can a quantum computer represent and compute for real numbers (non-integers)?
  29. What does a quantum computer compute for 1.0 divided by 3.0, which has an infinite number of repeating digits?
  30. What does a quantum computer compute for 1.0 divided by 3.0, times 3.0: 1.0 or 0.9999…?
  31. Does a quantum computer use so-called floating point arithmetic for real numbers like a traditional digital computer? Base 2, or what?
  32. How does a quantum computer compute infinite Taylor series expansions, compared to a traditional digital computer?
  33. What will a quantum computer compute for SQRT(2.0), which is an irrational number with infinite digits? How will it compare to a traditional digital computer?
  34. Can a quantum computer calculate SQRT(-1), otherwise known as i, since quantum mechanics is based on complex and imaginary numbers?
  35. Do quantum computers have more than one precision for representing real numbers (non-integers)?
  36. Does a quantum computer compute with complex numbers more efficiently than a traditional digital computer, especially since complex numbers are the basis for quantum mechanics, which quantum computers are supposedly based on?
  37. How much quantum mechanics does one need to fully and deeply comprehend to fully and deeply grasp all nuances of quantum computing?
  38. Can a quantum computer compute values which cannot be comprehended by a human being?
  39. How are quantum algorithms different from or similar to comparable algorithms of traditional digital computing?
  40. Can all, some, or no quantum algorithms be simulated (albeit much more slowly) on a traditional digital computer? What factors are involved in whether or how effectively any simulation can be performed?
  41. How practical is a quantum computer?
  42. How expensive is a quantum computer for a given task?
  43. How much power (energy) does a quantum computer require for a given task?
  44. How large is a quantum computer for a given task?
  45. What technologies are needed to design and produce a quantum computer?
  46. What physics knowledge is needed to design and produce a quantum computer?
  47. What physics knowledge is needed to understand how to use a quantum computer?
  48. What physics knowledge is needed to understand how to program a quantum computer?
  49. What mathematics knowledge is needed to design and produce a quantum computer?
  50. What mathematics knowledge is needed to understand how to use a quantum computer?
  51. What mathematics knowledge is needed to understand how to program a quantum computer?
  52. Is great knowledge and skill with linear algebra (eigenfunctions, eigenvalues, Fourier transformations) needed to be very skilled with quantum computing?
  53. What kind of operating system is needed to run a quantum computer?
  54. Is a quantum computer more of a co-processor to be associated with a traditional general purpose digital computer, or can a quantum computer fully replace a traditional general purpose digital computer?
  55. Does it make sense to speak of a grid of interoperating quantum computers or even a distributed cloud of quantum computers, or is each quantum computer a world of its own and unable to interact with another quantum computer except through an intermediary traditional digital computer or other custom traditional digital electronic or optical circuitry?
  56. Can quantum computers be networked comparable to local, wide area, and Internet networking of traditional digital computers?
  57. Does the concept of a website make sense for quantum computing? [imaginary or complex web pages??!! Just kidding.]
  58. Does the concept of networking protocols make sense for quantum computing?
  59. Can two or more quantum computers directly exchange qubits via quantum communication, or is some translation to and from digital format required to make the transition?
  60. Is coherence (technically, quantum decoherence) a fundamental limit or upper bound to quantum computing or simply a short-term technical matter that will be resolved shortly?
  61. What degree of coherence can be expected over the next few to five to ten years?
  62. Is room temperature quantum computing even theoretically practical? How soon?
  63. What temperature of quantum computing will be practical over the next few to five to ten years?
  64. How much data can a quantum computer process, such as a large database or search engine, compared to the disk, flash memory, and main memory of a traditional digital computer, now and for the next few to five to ten years?
  65. What applications are most suitable for quantum computing?
  66. Are all applications suitable for quantum computing?
  67. Are any applications particularly unsuitable for quantum computing?
  68. What specific criteria can be used to determine the degree of suitability of an application for quantum computing?
  69. Can an automated profiling tool be used to determine the degree of suitability of a particular application for quantum computing?
  70. What programming languages can be used for quantum computing?
  71. What programming languages are best or optimal for quantum computing?
  72. Is there a machine language and assembly language for quantum computing?
  73. How similar or dissimilar are quantum computers from different labs, designers, or vendors?
  74. What components are standard across all or most quantum computers?
  75. Can quantum computers run on relatively small batteries, or do they need a robust AC power source?
  76. Do quantum computers use direct current (DC)?
  77. What voltage levels do quantum computers operate on?
  78. Is statistical processing the same or different for quantum computing in contrast with traditional digital computing?
  79. How would a quantum computer compute the median (not mean or average, although those are of interest too) of a very large set of numbers or character strings? How would the performance differ from a traditional digital computer?
  80. Are all aspects of mathematics equally applicable to quantum computing and traditional digital computing?
  81. Does cybersecurity apply equally to quantum computing as to traditional digital computing?
  82. What is the quantum computing equivalent of a traditional digital Turing machine?
  83. Would a quantum computer perform natural language processing (NLP) in a qualitatively better way than a traditional digital computer?
  84. What specific aspects of artificial intelligence is quantum computing especially well-adapted for?
  85. What debugging and testing features and tools does a quantum computer provide?
  86. Can the full state of a quantum computer be dumped or otherwise captured for examination, analysis, debugging, and testing, or does the Heisenberg uncertainty principle and superposition preclude this?
  87. How does fabrication of chips and circuits differ for quantum computing compared to a traditional digital computer?
  88. How is color represented in a quantum computer, compared to RGB and other color models used by traditional digital computing?
  89. Would quantum computers still use pixels for representing images and video?
  90. How would audio be represented and processed in a quantum computer?
  91. Is there a decent and comprehensive glossary for quantum computing?
  92. Is there a decent and comprehensive glossary for all aspects of quantum mechanics that is needed to fully comprehend quantum computing?
  93. Is there a decent and robust introductory overview of quantum computing? Puff pieces and hype not welcome. Wikipedia entry is weak. Dense, academic textbooks have their place, but the question here is a decent introductory overview that is free of hype and adequately explains the technical differences from traditional digital computing.
  94. How much of quantum computing applications can be cost-effectively addressed using massively parallel grids of very small and very cheap traditional digital computers?
  95. Which is more cost effective, owning or leasing quantum computers?
  96. Is time-sharing and on-demand shared online access to quantum computers more cost effective and more efficient than either owning or leasing?
  97. How can the computational complexity of quantum computers best be described — polynomial (P), nondeterministic polynomial (NP), NP-Complete, NP-Hard, or what?
  98. What thought processes are needed to solve problems with a quantum computer, and how do they compare or contrast with the thought processes for solving problems with traditional digital computers?
  99. Is the concept of a user interface or user experience (UX) relevant to quantum computing?
  100. What is the smallest quantum computing gate possible compared to the smallest traditional digital computing gate?
  101. What is the smallest qubit possible compared to the smallest traditional digital computing bit?
  102. What is the fastest quantum computing gate possible compared to the fastest traditional digital computing gate?
  103. What is the fastest qubit possible compared to the fastest traditional digital computing bit?
  104. What is the shortest quantum computing gate connection possible compared to the shortest traditional digital computing gate connection?
  105. What is the thinnest quantum computing gate connection possible compared to the thinnest traditional digital computing gate connection?
  106. Are there frameworks for quantum computing applications?
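Some of these questions already have well-known answers on the traditional digital computing side. Questions 29 through 31, for example, come down to IEEE 754 binary floating point, the representation used by essentially all modern digital computers. A minimal sketch in Python (used here purely as a convenient illustration language):

```python
# On a traditional digital computer, 1/3 cannot be represented exactly in
# binary floating point, so the stored value is a close approximation of
# the infinitely repeating true value.
third = 1.0 / 3.0
print(third)               # 0.3333333333333333 (roughly 16 significant digits)

# Multiplying back by 3 happens to round to exactly 1.0 under IEEE 754
# round-to-nearest-even, so the error is not always visible.
print(third * 3.0)         # 1.0

# The classic demonstration that binary floating point is approximate:
print(0.1 + 0.2)           # 0.30000000000000004
print(0.1 + 0.2 == 0.3)    # False
```

How a quantum computer would answer the same questions is exactly the kind of thing the list above is probing.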

Those are some of the questions I’d like to be able to answer. A subsequent paper will expand that list.

This informal paper will not attempt to answer any of those questions for quantum computing, or any other questions about quantum computing in general. That will be left for future papers.

Classic or traditional digital computing

We need a decent term to refer to non-quantum computers — the kind of computers most of us use today.

An unresolved issue here is whether there is any distinction between the terms:

  1. Digital computing.
  2. Traditional digital computing.
  3. Classic computing.
  4. Classical computing.
  5. Classical digital computing.

I’ve seen the term classical computing used in online discussion of quantum computing.

At this stage, it does seem that traditional, classical, and classic are equally valid modifiers to distinguish traditional digital computing from newfangled quantum computing.

That said, I’ll stick with traditional digital computing in my own writing, to distinguish that it is strictly digital and that it is not the new form of computing currently called quantum computing.

I also see that the terms quantum algorithms and classical algorithms are now used as well.

What is a quantum computer?

This paper is not intended to answer any questions about quantum computing, including basic definitions such as the meaning of the terms quantum computer and quantum computing.

Although this paper suggests a number of potential questions about quantum computing, it isn’t even intended to present a comprehensive set of questions about quantum computing, let alone answer any of those questions.

A subsequent paper will present a relatively comprehensive set of questions about quantum computing, based in large part on a list of topics and questions about traditional digital computing, which is the focus of this particular paper.

What is a digital computer?

The original intention of the term digital computer was to contrast with analog computer. The latter processes information as continuously varying electrical signals, commonly voltages, while the former processes information in discrete values, commonly as integers represented as sequences of the binary digits 0 and 1.
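That discreteness is easy to make concrete: inside a digital computer, an integer is just a finite sequence of binary digits, with nothing continuous about it. A quick sketch in Python (chosen here only for illustration):

```python
# The integer 42 is stored as a discrete bit pattern, not as a
# continuously varying signal.
n = 42
print(bin(n))            # '0b101010'

# The same sequence of binary digits converts back to the exact integer,
# with no loss of information.
print(int('101010', 2))  # 42
```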

Generally a digital computer is an electronic digital computer, meaning that information is represented and transferred in the form of electrons, although information can also be stored magnetically (disk and tape), which technically is not electronic.

Older computers were electromechanical, such as using relays.

Some computing research seeks to create photonic computers, also known as optical computers. They would still be digital computers, but would not be electronic digital computers since they would rely primarily on photons rather than electrons.

Is a quantum computer a digital computer?

Whether quantum computers are still digital computers is an open question, particularly since their reliance on the quantum mechanical property of superposition means that technically they are not operating on discrete values since a given value can be a superposition of many values.

In a traditional, electronic digital computer a bit is either a 0 or a 1, one or the other, while in a quantum computer a qubit can be in a superposition of both states rather than in only a single state.

This paper is not intending to take a firm stance on whether a quantum computer is or isn’t still a digital computer.

Generally, data fed into a quantum computer or read out of it will be in some digital format, that is, discrete values composed of sequences of strictly binary 0’s and 1’s, but I/O generally isn’t the basis for defining how a computer operates internally.

What does digital mean in the context of quantum computers?

Beyond how a digital computer processes and represents information internally, business, commerce, government, and consumers now refer to digital in many ways, most commonly simply to distinguish products and services which exist only online or only in electronic form, in contrast with real or physical products and services.

Technically, that use of digital is still fully compatible with how a digital computer functions internally since data is still represented as discrete values.

But in the world of quantum computers, will digital in the real world still have the same meaning?

Technically, as currently implemented, today’s quantum computers process information internally in quantum format which uses superposition rather than discrete digital values, but the process of transferring information into a quantum computer or out of a quantum computer currently requires conversion from and to traditional, discrete digital format.

At least for the foreseeable future, any interactions between people or businesses in the real world and quantum computers would still require the use of digital data formats and protocols, so referring to business, products, and services as digital will still be sensible even in the world of quantum computing.

But data inside of a quantum computer would commonly not be discrete or technically digital.

In short, information on the outside would still be digital, but information on the inside would not be digital, strictly speaking.

Roles which need to comprehend traditional digital computing

Not everybody who uses, programs, designs, builds, purchases, deploys, maintains, or repairs a traditional digital computer needs to have the full depth of knowledge of all aspects of traditional digital computing.

A robust but not necessarily comprehensive sampling of the roles includes:

  1. Scientists.
  2. Engineers.
  3. Mathematicians.
  4. Software designers.
  5. Software developers.
  6. Software engineers.
  7. Software test engineers.
  8. Software testers.
  9. Application software developers.
  10. Technical writers.
  11. Managers.
  12. Project managers.
  13. Technical supervisors.
  14. Middle level technical managers.
  15. Product managers.
  16. Corporate executive management.
  17. User experience designers.
  18. Test engineers.
  19. IT specialists.
  20. Managers of IT specialists.
  21. Chief technology officers.
  22. Users.
  23. Managers of users.
  24. Buyers of computers and software.
  25. Marketing.
  26. Sales.
  27. Investors.
  28. Shareholders.
  29. Public policy staff.
  30. Government officials.
  31. Legislators.

Fields needed to fully and deeply comprehend traditional digital computing

This is a semi-comprehensive list of the fields of study that encompass the vast bulk of everything that goes on inside of a traditional digital computer:

  1. Physics
  2. Chemistry of materials
  3. Semiconductor fabrication
  4. Electrical engineering
  5. Computer engineering
  6. Mechanical engineering
  7. Robotics
  8. Mathematics
  9. Computer science
  10. Software engineering
  11. Circuit design — digital and analog
  12. Energy sources — AC, batteries, solar, power management, and heat dissipation
  13. Psychology
  14. Graphic design
  15. User experience design
  16. Marketing
  17. Sales
  18. Support
  19. Philosophy
  20. Law
  21. Sociology
  22. Neuroscience

General areas

These are some or most of the general areas of knowledge and expertise needed to fully and deeply comprehend traditional digital computing:

  1. Physics — Newtonian mechanics, electricity, magnetism
  2. Physics — Solid state physics
  3. Physics — Quantum mechanics
  4. Physics — Quantum field theory
  5. Physics — Flow of electrons in wires
  6. Physics — Flow of photons in waveguides, optical cables
  7. Physics — Flow of heat
  8. Chemistry of materials — Conductors
  9. Chemistry of materials — Insulators
  10. Chemistry of materials — Semiconductors
  11. Chemistry of materials — Batteries
  12. Chemistry of materials — Light sensitivity
  13. Chemistry of materials — Color properties
  14. Chemistry of materials — Solutions and solvents
  15. Software design
  16. Software architecture
  17. Math — Number theory
  18. Math — Real numbers
  19. Math — Rational numbers
  20. Math — Irrational numbers
  21. Math — Sets
  22. Math — Complex numbers
  23. Math — Algebra
  24. Math — Geometry
  25. Math — Trigonometry
  26. Math — Transcendental functions
  27. Math — Calculus
  28. Math — Probability
  29. Math — Statistics
  30. Math — Logic
  31. Math — Computational complexity
  32. Math — Linear algebra
  33. Math — Game theory
  34. Computer science — Algorithms
  35. Computer science — Data structures
  36. Databases
  37. Distributed databases
  38. Blockchain and other distributed transaction ledger technologies
  39. Data modeling
  40. Search engines
  41. Artificial intelligence (AI)
  42. Machine intelligence
  43. Machine learning

Specific topic areas

These are many of the specific topic areas that span the full range of knowledge and expertise needed to fully master all aspects of traditional digital computers:

  1. Electrons
  2. Photons
  3. Waves and particles
  4. Charge
  5. Spin
  6. Special relativity
  7. General relativity
  8. Quantum nature of electromagnetic radiation
  9. Electric field
  10. Magnetic field
  11. Potential
  12. Voltage
  13. Current
  14. Resistance
  15. Capacitance
  16. Magnetism for storage
  17. Discrete electronic components
  18. Data sheets for specification of electronic components
  19. Design of logic gates
  20. Design with logic gates
  21. Resistors, capacitors, inductors, diodes, transistors, crystals, switches
  22. Design of transistors
  23. Design with transistors
  24. Physics of transistors
  25. Analog electronic components
  26. Wires and cables
  27. Plugs, sockets, and connectors
  28. Switches and buttons
  29. Electronic digital computing
  30. Photonic digital computing
  31. The many ways a bit can be represented, stored, transferred, and operated on.
  32. Bits, bytes, words, nibbles, bit strings, characters, character strings.
  33. Decimal, binary, hex, octal, Boolean, enumeration, integer, floating point, arbitrary decimal precision, and complex number data representations.
  34. Distinction between binary, bit, and Boolean
  35. What precisely does digital mean?
  36. What precisely does digital computing mean?
  37. Memory models
  38. Transient memory
  39. Static memory
  40. Dynamic memory
  41. Memory refresh
  42. Associative memory
  43. Parity and parity errors
  44. Error correction code (ECC) memory
  45. Cache
  46. Data cache
  47. Instruction cache
  48. Random access memory
  49. Memory management — pages and segments, protection, sharing
  50. Virtual memory and paging
  51. Flash memory
  52. Mass storage
  53. Rotating storage, spinning storage
  54. Latency
  55. Seek time
  56. Tape storage
  57. Operating systems
  58. Device drivers
  59. Process scheduling
  60. Resource contention
  61. Central processing unit (CPU)
  62. Arithmetic and Logic Unit (ALU)
  63. Chip design
  64. Chip layout
  65. Circuit board layout
  66. Circuit board fabrication
  67. Encryption
  68. User credentials — user IDs and names, passwords, keys
  69. User authentication
  70. User authorization
  71. User privileges
  72. Language translation
  73. Software modules
  74. Software APIs
  75. Code libraries
  76. Application frameworks
  77. Network service APIs
  78. Web services
  79. REST APIs
  80. Software Testing
  81. Software performance characterization and testing
  82. Software capacity characterization and testing
  83. Algorithm design
  84. Specific algorithms
  85. Design patterns
  86. Brute force approaches — algorithms, attacks, calculations, methods, search
  87. Computer science — Complexity theory
  88. Arithmetic expressions, operators, and functions
  89. Boolean logic expressions, operators, and functions
  90. Bit and bit string expressions, operators, and functions
  91. Logical expressions, operators, and functions
  92. Character expressions, operators, and functions
  93. String expressions, operators, and functions
  94. Operations
  95. Operation codes (opcodes)
  96. Data types
  97. Data type conversion — implicit and explicit
  98. Language theory
  99. Languages — Human
  100. Languages — Computer programming
  101. Languages — specialized
  102. Language grammars
  103. Unrestricted languages and grammars, Type-0 languages
  104. Context-sensitive languages and grammars, Type-1 languages
  105. Context-free languages and grammars, Type-2 languages
  106. Regular languages and grammars, Type-3 languages
  107. Regular expressions, regex, pattern matching
  108. Grammar syntax rules
  109. BNF — Backus–Naur form for grammars, extended BNF
  110. Railroad diagrams for grammar syntax rules
  111. Language parsing
  112. Language lexemes and tokens
  113. Language parse trees
  114. Language syntax
  115. Language semantics
  116. Language meaning
  117. Compilers
  118. Language translators
  119. Code generation, code generators
  120. Code optimization
  121. Runtime systems
  122. Interpreters
  123. Virtual machines
  124. Knowledge
  125. Knowledge representation
  126. Knowledge semantics
  127. Knowledge meaning
  128. AI
  129. Machine intelligence
  130. Machine learning
  131. Sensors
  132. Real-time systems and programming
  133. Input and output devices (I/O)
  134. Interrupts and interrupt levels
  135. Internet of Things (IoT) devices
  136. Math — Computability theory
  137. Math — NP-complete
  138. Coding algorithms
  139. Specific programming languages
  140. High-level languages
  141. Assembly language programming
  142. Machine language programming
  143. Layout of machine code
  144. Dumps, hex dumps
  145. Command languages
  146. Shell scripts
  147. Protocols for data transmission
  148. Data persistence
  149. State
  150. Data formats
  151. Data formats for transmission
  152. Data formats for storage
  153. Data structures in general
  154. Data structure fields
  155. Data structure design
  156. Specific data structures — arrays, lists, sets, maps, heaps, hash tables, graphs, directed graphs, directed acyclic graph, trees, Merkle trees, blocks, records, rows, columns
  157. Objects and classes
  158. Object-oriented programming
  159. Class functions
  160. Memory allocation
  161. Heap memory
  162. Memory leaks
  163. Garbage collection
  164. Function call processing
  165. Pushdown stack
  166. Function recursion
  167. Constants
  168. Symbolic constants
  169. Arrays
  170. Lists
  171. Trees
  172. Records
  173. Rows
  174. Packets
  175. Database principles
  176. Data model design principles
  177. Data consistency
  178. ACID data principles
  179. Relational database principles
  180. Query languages
  181. SQL
  182. NoSQL
  183. Security
  184. Privacy
  185. Data protection
  186. Data privileges
  187. Code protection
  188. Code privileges
  189. Code synchronization — semaphores and locks
  190. Timeouts, retries, exponential backoff
  191. Distributed computing
  192. CAP theorem
  193. Data partition
  194. Data replication
  195. Server mirroring
  196. Data centers
  197. Cloud computing
  198. Clusters
  199. Nodes
  200. Single point of failure (SPOF)
  201. Client applications
  202. Middleware
  203. Writing requirements specifications
  204. Writing design specifications
  205. Writing architectural specifications
  206. Designing user interfaces
  207. Testing software
  208. Installing a computer
  209. Using a computer
  210. Maintaining a computer
  211. Human computer interaction (HCI)
  212. Human factors
  213. Response time
  214. Latency
  215. Automata
  216. Turing machines
  217. Finite state machine, automaton
  218. Pushdown automaton
  219. Cellular automata
  220. State machines
  221. Co-processors
  222. Auxiliary processing units
  223. GPU (Graphics Processing Unit)
  224. Central processor architecture
  225. Central processor block diagram
  226. Multi-core processors
  227. Hyperthreaded processing
  228. Parallel processing — Single Instruction, Multiple Data (SIMD) and Multiple Instruction, Multiple Data (MIMD)
  229. Supercomputers
  230. Microprocessors
  231. Embedded computers
  232. Data flow diagrams
  233. Flow charts
  234. Block diagrams
  235. Abstraction
  236. Representation
  237. Markup languages
  238. Specific markup languages — SGML, HTML, XML
  239. Graphic representations — 2D and 3D
  240. Picture and photographic representations
  241. Audio and video representations
  242. Analog and digital conversion for speakers, microphones, and cameras (still and motion video)
  243. Color models — RGB, HSV, HSL, CMYK
  244. Pixels
  245. Displays
  246. Keyboards
  247. Pointing devices
  248. Software tools
  249. Source code control
  250. Configuration management
  251. Debugging features and tools
  252. Testing features and tools
  253. Nondeterministic polynomial time (NP) problems
  254. NP-Complete problems
  255. NP-Hard problems
  256. Polynomial time (P) problems
  257. Church-Turing thesis
  258. Turing test for intelligence
  259. Hilbert space
  260. Hamiltonian energy operator
  261. Hermitian operators
  262. Adjoint operators
  263. Fourier transforms
  264. Heat flow
  265. Heat dissipation
  266. Cooling
  267. Websites, web pages
  268. Blogs
  269. E-commerce
  270. Email
  271. Social media
  272. Backup
  273. Checkpoint and restart
  274. Undo and redo
  275. Audit trail logging
  276. Event tracing
  277. Information theory (Shannon, et al) and the intersection of communication and computing; data and information coding
  278. Numerical analysis
  279. Matrices, matrix arithmetic, and matrix functions
  280. Computer gaming
  281. Simulation
  282. Monte Carlo simulation
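To illustrate the last item in the list, here is a minimal Monte Carlo simulation sketch in Python, estimating π by random sampling. The seed and sample count are arbitrary choices made here for reproducibility, not anything prescribed:

```python
import math
import random

# Estimate pi by sampling random points in the unit square and counting
# the fraction that fall inside the quarter circle of radius 1.
random.seed(0)           # fixed seed so the run is reproducible
n = 100_000
inside = sum(1 for _ in range(n)
             if random.random() ** 2 + random.random() ** 2 <= 1.0)
estimate = 4.0 * inside / n
print(estimate)          # close to math.pi
```

Randomized simulation of this sort is one area where the comparison between traditional and quantum computing performance is frequently raised.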

Specific questions, unknowns, or issues

Beyond all of the general and specific areas of interest to traditional computing, some specific questions or issues include:

  1. Smallest transistor possible.
  2. Fastest transistor possible.
  3. Smallest logic gates (AND, OR, NOT, flip flop) possible.
  4. Fastest logic gates possible.
  5. Thinnest wire or connection possible between gates.
  6. Shortest wire or connection possible between gates.
  7. Smallest memory cell possible.
  8. Fastest memory cell possible.
  9. Impact of cosmic radiation on electronic components.
  10. Impact of background radiation on electronic components.
  11. Impact of impurities on electronic components.
  12. Speed of electron flow within and between electronic components.
  13. Impact of speed of light on electronic circuits.
  14. Impact of neutrinos on electronic circuits.
  15. Generation of random numbers. Quality of randomness or pseudorandomness.
  16. Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by even the most sophisticated professional or certified genius.
  17. Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by an average professional.
  18. Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by even a team of the most sophisticated professionals.
  19. Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by a team of average professionals.
  20. All of the above applied to photonic components as well.
  21. What makes metals such as copper, aluminum, silver, and gold conductors?
  22. What makes materials such as silicon dioxide insulators?
  23. What makes a material a semiconductor?
  24. What thought processes are needed to solve problems with a traditional digital computer?
  25. How does a transistor really work (physics)?
  26. What’s so special about electrons (physics)?
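Question 15 above, on the quality of randomness versus pseudorandomness, can be made concrete with a sketch. The code below is a minimal linear congruential generator using the well-known Numerical Recipes constants (this is an illustrative example, not something proposed in the original list): it shows that a pseudorandom sequence is entirely deterministic, since the same seed always reproduces the same values.

```python
def lcg(seed: int):
    """Minimal linear congruential generator (Numerical Recipes constants).

    Deterministic: the same seed always yields the same sequence,
    which is why such generators are only 'pseudo' random.
    """
    a, c, m = 1664525, 1013904223, 2**32
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m  # scale into [0, 1)

gen = lcg(seed=1)
first_three = [next(gen) for _ in range(3)]
print(first_three)
```

Generators like this are fine for casual simulation but fail statistical tests of randomness and are useless for cryptography, which is precisely why "quality of randomness" is a question worth asking for both classical and quantum machines.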

Quantum mechanics

My presumption is that a deep understanding of quantum mechanics is essential to a deep understanding of quantum computing. Again, my interest is not in understanding quantum computing at a shallow level, the way an undergraduate computer science major would understand traditional digital computing, but in understanding it at a deep enough level to sift through and dispel any and all hype.

Are quantum mechanical effects visible in digital computing?

One interesting question is the degree of impact of quantum mechanics on traditional digital computing.

Are quantum mechanical properties visible at the macroscopic level of electronic circuits, such as in the behavior of transistors and the motion of electrons?

Of course one could argue that all macroscopic effects exist on top of and can only exist because of quantum mechanical interactions.

But the question remains: how much knowledge of quantum mechanics itself is needed to comprehend traditional digital computing?

And does an understanding of the role of quantum mechanics in digital computing help in any way to understand the special and distinctive qualities of quantum computing?

What’s next?

I’ll continue to update the lists in this paper as I dig deeper into quantum computing and realize how many points of commonality or difference quantum and traditional digital computing really have.

I’ll also start a parallel paper based on the initial list of questions for quantum computing listed at the front of this paper. The tentative title for that paper will be Knowledge Needed for Quantum Computing. I won’t be attempting to provide hard, definitive answers to any of those questions in that paper — that will be left to yet another paper or multiple papers, depending on how simple or complex the correspondence between quantum and traditional digital computing turns out to really be.

I’m also interested in compiling a relatively comprehensive glossary of quantum computing terms that have meanings distinct from similar terms in traditional digital computing, as well as a glossary of the subset of quantum mechanical terms that are essential to a deep comprehension of quantum computing.

At some point I intend to produce a What is Quantum Computing? paper. The puff pieces and hype that are out there, even the Wikipedia page, are all woefully inadequate.

Meanwhile, my main focus in the near term will be on completing the three MIT online courses that cover quantum mechanics, or at least reviewing them. Even though I have no expectation of truly mastering the material, I do expect to learn enough to pass judgment on the veracity of many of the claims being made about quantum computing, at least from the perspective of the underlying quantum mechanical theory.

How much of all of this I will succeed at completing, and in what timeframe, remains to be seen. My hope is that my preliminary list of questions about quantum computing at the front of this paper will be sufficient to kick-start the thought and discussion process even if I accomplish very little beyond that.
