What Knowledge Is Needed to Deeply Comprehend Quantum Computing?
How much of quantum computing is real and how much is hype, fantasy, and wishful thinking? How can you tell? This informal paper examines what fields of knowledge you need to comprehend in significant depth to fully grasp the capabilities and limitations of quantum computing. Not in a shallow, casual, or superficial way, but in a very deep and very hard-core manner. Down to the deepest levels of the nitty-gritty, both software and hardware, and even down to the underlying physics, theory, and mathematics.
The goal here is to develop a comprehensive framework for asking the right questions about quantum computing, so that we can know what quantum computing really is, what makes it tick, what it really can or can’t do, and what it is best for or maybe even isn’t good for.
This paper won’t have the answers and the details of the knowledge needed to deeply understand quantum computing, but it will list the questions and topics that do appear to be needed to gain such understanding. Future papers will pursue answers and details. The point of this paper is to provide the framework for that pursuit of knowledge.
A fair amount of information in this paper was already presented in a companion paper, Knowledge Needed to Deeply Comprehend Digital Computing, to make the foundation clearer, but this paper should stand by itself with regard to quantum computing.
Work in progress
The lists in this paper will be incrementally updated as my reading, research, and thinking progress. This paper is simply an initial stab at the problem.
Sorry, no specific answers or deep knowledge here
Again, the intention of this paper is not to have all the answers, or even any of the answers, or to be an introductory tutorial for quantum computing.
Framework for knowledge needed to deeply comprehend quantum computing
Rather, the goal here is to kick-start the process of developing a full framework for asking the right questions needed to fully and deeply understand quantum computing.
Answers to these questions are indeed the ultimate goal, but unless we endeavor to ask all the right questions no amount of answers will help us achieve enlightenment about quantum computing.
Update 9/3/2018: Some subsequent papers cover more specific detail:
- Criteria for Judging Progress of the Development of Quantum Computing
- Comprehensive Glossary for Quantum Computing (over 2,800 entries)
- Hardware and algorithms are the greatest challenges for quantum computing
- What Is a Universal Quantum Computer?
How does quantum computing compare to traditional digital computing?
Rather than start from scratch and detail every tiny detail of knowledge needed to deeply comprehend quantum computing, it would seem to make sense to start by looking back at traditional digital computing and assessing how much these two forms of computing have in common.
My initial presumption is that they have a lot in common, possibly even much more in common than they are different. Maybe that’s not exactly true, but it’s a good starting point for inquiry and would certainly save a lot of time and energy if it is in fact true.
To be sure, there are at least some differences, and they are very real and very significant.
But it would sure make this paper a lot shorter to be able to focus on simply the differences rather than reinvent the entire wheel of traditional digital computing.
My conjecture is that despite the differences between these two forms of computing there are still a significant number of parallels. Call it comparative computing, if you will.
A companion paper, Knowledge Needed to Deeply Comprehend Digital Computing, already summarizes to a fair level of detail the knowledge needed to deeply understand traditional digital computing. That paper can be used as the foundation and starting point for this paper.
Learning quantum computing
There are two basic models for learning about quantum computing:
- Starting with a solid knowledge of all aspects of traditional digital computing. What incremental knowledge of quantum computing is needed to build on that existing foundation?
- Starting from scratch. What is the total knowledge needed to deeply comprehend quantum computing? There is likely to be at least some overlap with knowledge of traditional digital computing, but the goal is only the knowledge needed for quantum computing.
Incremental knowledge needed for quantum computing
Given that one is reasonably knowledgeable about traditional digital computing, exactly how much additional knowledge must one master to become equally knowledgeable about quantum computing?
Learning about quantum computing from scratch
Granted, none of us is currently faced with a quantum-only future for computing, and some hybrid with traditional digital computing is likely for the foreseeable future. Nonetheless, it is an enlightening thought experiment to contemplate what knowledge one would need to acquire to deeply comprehend quantum computing if one had no interest in or knowledge of traditional digital computing.
Is the knowledge needed to master quantum computing alone substantially less than the knowledge needed to master traditional digital computing?
Would it be substantially easier for young students to learn only quantum computing?
Might it be better for young students to start with only quantum computing?
Or, is a hybrid world the inevitable reality for the foreseeable future?
Which is the better foundation for the other, quantum computing built on a foundation of traditional digital computing, or traditional digital computing built on a fresh foundation of quantum computing?
Initial questions about quantum computing
I want to be able to answer any and all fundamental questions about quantum computing.
My initial set of questions about quantum computing includes but is certainly not limited to:
[Update: The questions on the list below have been moved to a newer paper, Questions About Quantum Computing. There will be no further updates to the list here. It is provided strictly for historical reference, reflecting the list at the time this paper was originally written.]
- What is a quantum computer? Basic definition, but technically robust and free of hype or vagueness.
- What is quantum computing? Basic definition, but technically robust and free of hype or vagueness.
- How is quantum computing distinct from traditional digital computing?
- What can a quantum computer compute that a traditional digital computer cannot?
- What can a quantum computer do better than a traditional digital computer?
- Is speed the only truly significant advantage of a quantum computer?
- Is the concept of digital even relevant to quantum computing?
- Do quantum and traditional digital computers have more in common or more that differentiates them from each other?
- Is a qubit still considered digital?
- Is a qubit still considered discrete?
- Is a quantum computer still a digital computer?
- What precisely does digital mean?
- What precisely does digital computing mean?
- Is a quantum computer still a discrete computer (a la digital) or can it compute continuous data as an analog computer does?
- Can a quantum computer compute directly from continuous values from analog sensors, such as a voltage, current, temperature, audio, and video, or is an intermediate conversion to a discrete or digital value needed?
- How does a quantum computer handle analog to digital and digital to analog conversions?
- What operations can a quantum computer perform compared to operations that a traditional digital computer can perform?
- What data types, operators, and functions does a quantum computer support, compared to a traditional digital computer?
- Does a quantum computer perform Boolean logic (AND, OR, NOT with true and false — not to be confused with binary digits of 0 and 1), such as evaluating complex conditions, comparable to a traditional digital computer?
- Does a quantum computer have a processing unit comparable to the central processing unit (CPU) of a traditional digital computer?
- Does a quantum computer have an arithmetic and logic processing unit comparable to the arithmetic and logic unit (ALU) of a traditional digital computer?
- Does a quantum computer have registers for small amounts of data comparable to a traditional digital computer?
- Does a quantum computer have memory (for large volumes of data) comparable to a traditional digital computer?
- How is memory of a quantum computer (for large volumes of data) organized?
- Do quantum computers support virtual memory and paging?
- Does a quantum computer have a byte or word size or data path width comparable to a traditional digital computer (8, 16, 32, 64, or 128)?
- Does a quantum computer have address or pointer widths comparable to a traditional digital computer? What width or range?
- Can a quantum computer compute values which cannot be represented on a traditional digital computer?
- How does a quantum computer represent real numbers (non-integers)?
- What are the largest and smallest real numbers (non-integers) that a quantum computer can represent?
- How many digits of precision can a quantum computer represent and compute for real numbers (non-integers)?
- What number base does a quantum computer use for real numbers (non-integers), 2, 10, or what?
- What does a quantum computer compute for 1.0 divided by 3.0, which has an infinite number of repeating digits?
- What does a quantum computer compute for 1.0 divided by 3.0 times 3.0: exactly 1.0, or 0.9999…?
- Does a quantum computer use so-called floating point arithmetic for real numbers like a traditional digital computer? Base 2, or what?
- How does a quantum computer compute infinite Taylor series expansions, compared to a traditional digital computer?
- What will a quantum computer compute for SQRT(2.0), which is an irrational number with infinite digits? How will it compare to a traditional digital computer?
- Can a quantum computer calculate SQRT(-1), otherwise known as i, since quantum mechanics is based on complex and imaginary numbers?
- Do quantum computers have more than one precision (bit width) for representing real numbers (non-integers)?
- Does a quantum computer compute with complex numbers more efficiently than a traditional digital computer, especially since complex numbers are the basis for quantum mechanics, which quantum computers are supposedly based on?
- How much formal knowledge of quantum mechanics does one need to fully and deeply comprehend to fully and deeply grasp all nuances of quantum computing?
- Can a quantum computer compute values which cannot be comprehended by a human being?
- How are quantum algorithms different from or similar to comparable algorithms of traditional digital computing?
- Can all, some, or no quantum algorithms be simulated (albeit much more slowly) on a traditional digital computer? What factors are involved in whether or how effectively any simulation can be performed?
- What slowdown factor can or should be expected when simulating a quantum computer (or quantum algorithm) on a traditional digital computer?
- What are the more common design patterns for algorithms for quantum computing?
- How practical is a quantum computer?
- How expensive is a quantum computer for a given task, especially compared to the cost of traditional digital computing?
- How much power (energy) does a quantum computer require for a given task?
- How large is a quantum computer for a given task?
- What technologies and knowledge are needed to design and produce a quantum computer?
- What physics knowledge is needed to design and produce a quantum computer?
- What physics knowledge is needed to understand how to use a quantum computer?
- What physics knowledge is needed to understand how to program a quantum computer?
- What mathematics knowledge is needed to design and produce a quantum computer?
- What mathematics knowledge is needed to understand how to use a quantum computer?
- What mathematics knowledge is needed to understand how to program a quantum computer?
- How much knowledge and skill with linear algebra (eigenfunctions, eigenvalues, Fourier transformations) is needed to be very skilled with quantum computing?
- How much knowledge and skill with vector spaces and quantum operators is needed to be very skilled with quantum computing?
- What kind of operating system is needed to run a quantum computer? What are the major components and features of a quantum operating system?
- Is a quantum computer more of a co-processor to be associated with a traditional general purpose digital computer, or can a quantum computer fully replace a traditional general purpose traditional digital computer?
- Does it make sense to speak of a grid of interoperating quantum computers or even a distributed cloud of quantum computers, or is each quantum computer a world of its own and unable to interact with another quantum computer except through an intermediary traditional digital computer or other custom traditional digital electronic or optical circuitry?
- Can quantum computers be networked comparable to local, wide area, and Internet networking of traditional digital computers?
- Does the concept of a website make sense for quantum computing? [imaginary or complex web pages??!! Just kidding.]
- Does the concept of networking protocols make sense for quantum computing?
- Would traditional network routers be relevant to networking of quantum computers?
- Can two or more quantum computers directly exchange qubits via quantum communication, or is some translation to and from digital format required to make the transition?
- Is coherence (technically, quantum decoherence) a fundamental limit or upper bound to quantum computing or simply a short-term technical matter that will be resolved soon enough?
- What degree of coherence can be expected over the next few to five to ten years?
- Is room temperature quantum computing even theoretically practical? How soon?
- What temperature of quantum computing will be practical over the next few to five to ten years?
- How much data can a quantum computer process, such as a large database or search engine, compared to the disk, flash memory, and main memory of a traditional digital computer, now and for the next few to five to ten years?
- What applications are most suitable for quantum computing?
- Are all applications suitable for quantum computing?
- Are any applications particularly unsuitable for quantum computing?
- What specific criteria can be used to determine the degree of suitability of an application for quantum computing?
- Can an automated profiling tool be used to determine the degree of suitability of a particular application or algorithm for quantum computing?
- What programming languages can be used for quantum computing?
- What programming languages are best or optimal for quantum computing?
- Is there a machine language and assembly language for quantum computing?
- How similar or dissimilar are quantum computers from different labs, designers, or vendors?
- What components are standard across all or most quantum computers?
- Can quantum computers run on relatively small batteries, or do they need a robust AC power source?
- Do quantum computers use direct current (DC)?
- What voltage levels do quantum computers operate on?
- Is statistical processing the same or different for quantum computing in contrast with traditional digital computing?
- How would a quantum computer compute the median (not mean or average, although those are of interest too) of a very large set of numbers or character strings? How would the performance differ from a traditional digital computer?
- Are all aspects of mathematics equally applicable to quantum computing and traditional digital computing?
- Does cybersecurity apply equally to quantum computing as to traditional digital computing?
- What is the quantum computing equivalent of a traditional digital Turing machine?
- Would a quantum computer perform natural language processing (NLP) in a qualitatively better way than a traditional digital computer?
- What specific aspects of artificial intelligence is quantum computing especially well-adapted for?
- What debugging and testing features and tools does a quantum computer provide?
- Can a quantum program be single-stepped to see all state changes and logic flow? Or, can this at least be done in a simulator for the quantum computer? Presumably not for the former, hopefully so for the latter.
- Can the full state of a quantum computer be dumped or otherwise captured for examination, analysis, debugging, and testing, or does the Heisenberg uncertainty principle and superposition preclude this?
- Can a quantum algorithm be interrupted and its state saved and later restored to resume where it left off?
- How does fabrication of chips and circuits differ for quantum computing compared to a traditional digital computer?
- How is color represented in a quantum computer, compared to RGB and other color models used by traditional digital computing?
- Would quantum computers still use pixels for representing images and video?
- How would audio be represented and processed in a quantum computer?
- Is there a decent and comprehensive glossary for quantum computing? Update: I’ve compiled an initial draft for such a glossary: Quantum Computing Glossary — Introduction.
- Is there a decent and comprehensive glossary for all aspects of quantum mechanics that is needed to fully comprehend quantum computing? Update: I’ve compiled an initial draft for such a glossary: Quantum Computing Glossary — Introduction.
- Is there a decent and robust introductory overview of quantum computing? Puff pieces and hype not welcome. Wikipedia entry is weak. Dense, academic textbooks have their place, but the question here is a decent introductory overview that is free of hype and adequately explains the technical differences from traditional digital computing.
- How much of quantum computing applications can be cost-effectively addressed using massively parallel grids of very small and very cheap traditional digital computers?
- Which is more cost effective, owning or leasing quantum computers?
- Is time-sharing and on-demand shared online access to quantum computers more cost effective and more efficient than either owning or leasing?
- How can the computational complexity of quantum computers best be described — polynomial (P), nondeterministic polynomial (NP), NP-Complete, NP-Hard, or what?
- What thought processes are needed to solve problems with a quantum computer, and how do they compare or contrast with the thought processes for solving problems with traditional digital computers?
- Is the concept of a user interface or user experience (UX) relevant to quantum computing?
- What logic gates are needed for quantum computing and how do they compare to the logic gates of traditional digital computing?
- What knowledge of logic gates is needed to develop algorithms and program applications for a quantum computer? Or are there more programmer-friendly higher-level language operators?
- What is the smallest quantum computing gate possible compared to the smallest traditional digital computing gate?
- What is the smallest qubit possible compared to the smallest traditional digital computing bit or memory cell?
- What is the fastest quantum computing gate possible compared to the fastest traditional digital computing gate?
- What is the fastest qubit possible compared to the fastest traditional digital computing bit or memory cell?
- What is the shortest quantum computing gate connection possible compared to the shortest traditional digital computing gate connection?
- What is the thinnest quantum computing gate connection possible compared to the thinnest traditional digital computing gate connection?
- Are there frameworks for quantum computing applications?
- Does the concept of software still apply for quantum computing? How might it differ from traditional digital computing?
- Does the concept of the software development life cycle (SDLC) (or systems software development life cycle) still apply to quantum computing? How might it differ from traditional digital computing?
- Does the concept of software engineering still apply to quantum computing? How might it differ from traditional digital computing?
- What problems are easiest to formulate for a quantum computer?
- What’s the preferred abbreviation for the term quantum computer?
- Is qbit still an acceptable variant of the term qubit?
- Is there a precise, correct technical term for each of the superposition terms or states in the value of a qubit? What’s a piece of a qubit?
- What is the precise technical term for a memory cell that holds a qubit as opposed to the value which is held in such a memory cell?
- How much energy is needed to represent a single qubit?
- How much energy is needed to represent a single value of a superposition in a qubit?
- Is quantum computing inherently a lot more energy efficient than traditional digital computing?
- Is superposition essentially free and cheap, or inherently costly?
- What is the technical definition of a quantum processor? How does that differ from a full quantum computer? Block diagram please.
- Is the cloud the best place for a quantum computer to live, as opposed to the office, your desk, or your hand? What are reasonable places for a quantum computer to live?
- Any github repositories for open source quantum computing work? Algorithms, code, libraries, etc.
- Are there any open source quantum computer designs?
- What are some examples of what a quantum computer could do given Hubble space telescope data (or other space telescopes)? Would they work best with the raw data or would some preprocessing be needed? Would one or more quantum computers be needed for all processing of space telescope data?
- Who are the top quantum computing vendors?
- Who are the hot startups in the quantum computing space?
- What are the use cases for quantum computing? Types, categories, or styles of applications.
- Who might use a quantum computer?
- How might someone use a quantum computer?
- Are quantum computers really probabilistic rather than deterministic? What does that actually mean? How does that limit or expand capabilities and applications?
- Is general-purpose quantum computing an oxymoron? Can it ever be achieved? Will it never be achieved? Based on what thinking, rationale, or logic?
- Can a Turing machine be simulated (or just implemented) on a quantum computer?
- Can finite state automata and pushdown automata be simulated or just implemented on a quantum computer?
- What stage of development has quantum computing achieved? Compared to traditional digital computing, what year or decade is quantum computing at — 1950, 1960, 1940, 1930??
- What technological, theoretical, and physical issues are constraining quantum computing at this stage?
- In what areas are dramatic breakthroughs required before quantum computing can come out of the shadows?
- How deeply does one need to comprehend Bell’s inequality to deeply comprehend quantum computing?
- Are all forms of quantum computers based on spin states? Could a quantum computer be based on a quantum state other than spin? Is spin only one of many possibilities for quantum computing?
- How much does one need to know about magnetism to comprehend quantum computing?
- Is the knowledge needed to master quantum computing substantially less than the knowledge needed to master traditional digital computing?
- Would it be substantially easier for young students to learn only quantum computing?
- Might it be better for young students to start with only quantum computing?
- Is a hybrid world the inevitable reality for the foreseeable future?
- Is a hybrid of quantum computing and traditional digital computing a really good thing, a really bad thing, or an indeterminate mixed bag?
- Is SQL and the relational database model equally relevant for quantum computing, less relevant, or even more relevant?
- Will quantum computing mean the end of strong encryption?
- How credible is most of the hype about quantum computing?
- How much of the narratives about the promise of quantum computing is credible or has been proven to be true?
- How large a program can a quantum computer accommodate? How many instructions or operations, or lines of code?
- How complex can a quantum algorithm be?
- Do quantum computers support function calls and recursion?
- Do quantum computers support arrays, structures, and objects?
- How is code represented in a quantum computer? Same as in a traditional digital computer (bits) or using quantum computing qubits?
- Does a quantum computer have the equivalent of processes and threads of traditional digital computing?
- Does or could a quantum computer have a file system?
- Does or could a quantum computer have mass storage? Is it persistent or transient (only while a program is running)?
- How fast can a quantum computer count?
- How fast can a quantum computer divide two numbers? Such as to verify whether X is a factor of Y.
- What arithmetic and mathematical operations are most natural for a quantum computer?
- Do quantum computers have any special advantage for arithmetic and mathematics, other than raw speed of basic operations?
- How much of information theory (Shannon, et al) is still relevant to quantum computing? Or how might it be different (e.g., quantum communication)?
- How consequential is the length of interconnections between qubits? Does packing them much more densely yield a large benefit? Does separating them a modest amount have a large penalty?
- Might wafer-scale integration (WSI) have any significant benefit, performance or otherwise?
- Does a quantum computer have to operate within a Faraday cage, to shield from ambient electromagnetic radiation? Is just the core quantum computer itself, the qubits, enclosed within a Faraday cage, or does the whole operating environment or entire room have to be in a Faraday cage?
- Do qubits require Josephson junctions, or is that just one option?
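While this paper defers the quantum side of these questions, the traditional digital side of the floating point questions above (1.0 divided by 3.0, SQRT(2.0)) is easy to demonstrate. Here is a quick Python sketch of what IEEE 754 double-precision arithmetic, the norm on traditional digital computers, actually produces:

```python
import math

# 1.0 / 3.0 has infinitely repeating digits in binary as well as decimal;
# IEEE 754 double precision rounds it to the nearest representable value.
third = 1.0 / 3.0
print(third)               # 0.3333333333333333

# Multiplying back by 3.0 happens to round to exactly 1.0 in this case,
# a property of the rounding rules, not of exact arithmetic.
print(third * 3.0 == 1.0)  # True

# SQRT(2.0) is irrational; the result is a 53-bit-significand approximation,
# and squaring that approximation does not give back exactly 2.0.
r = math.sqrt(2.0)
print(r)                   # 1.4142135623730951
print(r * r == 2.0)        # False
```

What a quantum computer would compute for these expressions depends entirely on how (or whether) it represents real numbers at all, which is exactly why the questions above matter.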
This informal paper will not attempt to answer any of those or other questions. That will be left for future papers. The goal here is simply to present a framework and point of departure for further inquiry into the nature of quantum computing.
I considered organizing that list into categories with separate lists, but the categories would be somewhat blurred and even overlap [like a superposition of quantum states!], so I opted to stick with a simpler single list, at least for now, but as work progresses a clearer categorization may emerge.
And that list lacks any clear order as well. I originally attempted to order it, but a lot of questions defied any clear order, so it ended up simply being the order that the questions occurred to me. Over time the ordering may improve.
Classic or traditional digital computing
We need a decent term to refer to non-quantum computers — the kind of computers most of us use today.
An unresolved issue here is whether there is any distinction between the terms:
- Digital computing, digital computer.
- Traditional digital computing, traditional digital computer.
- Classic computing, classic computer.
- Classical computing, classical computer.
- Classical digital computing, classical digital computer.
- Ordinary computing, ordinary computer.
- Regular computing, regular computer.
- Existing computing, existing computer.
- Standard computing, standard computer.
I’ve seen the term classical computing used in online discussion of quantum computing.
At this stage, it does seem that traditional and classical (or classic) are all equally valid modifiers to distinguish traditional digital computing from newfangled quantum computing.
That said, I’ll stick with traditional digital computing in my own writing, to distinguish that it is strictly digital and that it is not the new form of computing currently called quantum computing.
I also see that the terms quantum algorithms and classical algorithms are now used as well.
What is a quantum computer?
This paper is not intended to answer any questions about quantum computing, including basic definitions such as the meaning of the terms quantum computer and quantum computing.
But a future paper will indeed provide a robust and technically accurate definition for a quantum computer.
What is a digital computer?
The original intention of the term digital computer was to contrast with analog computer. The latter processes information as continuously varying electrical signals, commonly voltages, while the former processes information in discrete values, commonly as integers represented as sequences of the binary digits 0 and 1.
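To make that contrast concrete, here is a toy analog-to-digital conversion in Python: a continuous voltage is quantized to one of a fixed number of discrete levels, which is all a digital computer ever sees. The 8-bit resolution and 5-volt reference are arbitrary assumptions for illustration:

```python
def adc_sample(voltage, v_ref=5.0, bits=8):
    """Quantize a continuous voltage in [0, v_ref) to a discrete integer code."""
    levels = 2 ** bits                    # 256 discrete levels for 8 bits
    code = int(voltage / v_ref * levels)  # truncate to the level below
    return min(max(code, 0), levels - 1)  # clamp to the valid code range

# Two nearby analog voltages can map to the same discrete value:
print(adc_sample(2.500))  # 128
print(adc_sample(2.505))  # 128 -- below one quantization step away
print(adc_sample(2.520))  # 129
```

Analog inputs closer together than one quantization step (about 19.5 millivolts here) become indistinguishable once digitized.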
Generally a digital computer is an electronic digital computer, meaning that information is represented and transferred in the form of electrons, although information can also be stored magnetically (disk and tape), which technically is not electronic.
Older computers were electromechanical, such as using relays.
Some computing research seeks to create photonic computers, also known as optical computers. They would still be digital computers, but would not be electronic digital computers since they would rely primarily on photons rather than electrons.
Is a quantum computer a digital computer?
Whether quantum computers are still digital computers is an open question, particularly since their reliance on the quantum mechanical property of superposition means that technically they are not operating on discrete values: a given value can be a superposition of many values.
In a traditional, electronic digital computer a bit is either a 0 or a 1, one or the other, while in a quantum computer a qubit can be in a superposition of states rather than in only a single state.
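The difference can be illustrated with a toy simulation on a traditional digital computer. This is only a sketch of the standard amplitude formalism, not how real quantum hardware works: a single qubit is described by two complex amplitudes, and measurement collapses the superposition to a classical 0 or 1 with probabilities given by the squared magnitudes.

```python
import random

# A single qubit state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. The classical bit values 0 and 1 correspond
# to (1, 0) and (0, 1); anything in between is a superposition of both.
def measure(alpha, beta):
    """Collapse the superposition: 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition, like the output of a Hadamard gate applied to |0>.
alpha = beta = 2 ** -0.5

# Measuring many freshly prepared qubits yields roughly 50/50 results,
# illustrating the probabilistic readout of quantum measurement.
counts = [0, 0]
for _ in range(10000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly [5000, 5000]
```

Note that reading the qubit yields only a single classical bit per measurement; the amplitudes themselves are never directly observable.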
This paper is not intending to take a firm stance on whether a quantum computer is or isn’t still a digital computer. A follow-up paper will indeed take a clear position on this matter, although it may end up being a somewhat arbitrary distinction and even a matter of debate and dispute.
Generally, data fed into a quantum computer or read out of a quantum computer will be in some digital format that will indeed be discrete values composed of sequences of strictly digital, binary 0’s and 1’s, but I/O generally isn’t the basis for defining how a computer operates internally.
What does digital mean in the context of quantum computers?
Beyond how a digital computer processes and represents information internally, business, commerce, government, and consumers now refer to digital in many ways, most commonly simply to distinguish products and services which exist only online or only in electronic form, in contrast with real or physical products and services.
Also, media such as photos, images, audio, and video are now digitized rather than kept in the analog forms common in the 1960s.
Technically, that use of digital is still fully compatible with how a digital computer functions internally since data is still represented as discrete values.
But in the world of quantum computers, will digital in the real world still have the same meaning?
Technically, as currently implemented, today’s quantum computers process information internally in quantum format which uses superposition rather than discrete digital values, but the process of transferring information into a quantum computer or out of a quantum computer currently requires conversion from and to traditional, discrete digital format.
At least for the foreseeable future, any interactions between people or businesses in the real world and quantum computers would still require the use of digital data formats and protocols, so referring to business, products, and services as digital will still be sensible even in the world of quantum computing.
But data inside of a quantum computer would commonly not be discrete or technically digital.
In short, information on the outside would still be digital, but information on the inside would not be digital, strictly speaking.
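To make that inside/outside distinction concrete, here is a minimal sketch in plain Python. The names alpha, beta, and measure are my own illustrative choices, not drawn from any real quantum SDK: the point is simply that the internal state is a pair of continuous complex amplitudes, and only the act of readout produces a strictly digital bit.

```python
import random

# Purely illustrative sketch (not any real quantum framework): a single
# qubit's internal state is a pair of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1 -- continuous values, not discrete digits.
alpha = complex(1 / 2 ** 0.5, 0)  # amplitude for the |0> outcome
beta = complex(0, 1 / 2 ** 0.5)   # amplitude for the |1> outcome

p0 = abs(alpha) ** 2  # probability of reading out a 0
p1 = abs(beta) ** 2   # probability of reading out a 1

def measure():
    """Readout collapses the continuous state to a strictly digital 0 or 1."""
    return 0 if random.random() < p0 else 1

bit = measure()  # the outside world only ever sees this digital bit
```

The amplitudes themselves are never visible from the outside; repeated runs only reveal the probabilities, which is one concrete sense in which the inside of a quantum computer is not digital.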
Definition of quantum computing
There are already a variety of definitions for the terms quantum computing and quantum computer floating around, but I find them vague and otherwise lacking in precision and failing to enlighten anyone in any solidly meaningful technical sense.
People can read those definitions and pretend that they know what quantum computing is or must be, but they will be deluding themselves. That’s the nature of the hype phase for a new technology.
Eventually I will indeed offer up my own definitions for these and related terms, but for now I’ll endeavor to exert a little discipline and refrain from prematurely adding to the noise around quantum computing.
For now, all I will say is that there is a need for much better definitions of these terms.
That said, here are two definitions (or at least characterizations) from Gartner, a fairly reputable information technology research organization:
- A quantum computer uses atomic quantum states to effect computation. Data is held in qubits (quantum bits), which have the ability to hold all possible states simultaneously. This property, known as “superposition,” gives quantum computers the ability to operate exponentially faster than conventional computers as word length is increased. Data held in qubits is affected by data held in other qubits, even when physically separated. This effect is known as “entanglement.” Achieving both superposition and entanglement is extremely challenging.
Also from Gartner:
- Quantum computing is a type of nonclassical computing that is based on the quantum state of subatomic particles. Quantum computing is fundamentally different from classic computers, which operate using binary bits. This means the bits are either 0 or 1, true or false, positive or negative. However, in quantum computing, the bit is referred to as a quantum bit or qubit. Unlike the strictly binary bits of classic computing, qubits can represent 1 or 0 or a superposition of both partly 0 and partly 1 at the same time.
Consistent with my previous comment, these are not bad definitions per se, and may be quite reasonable given the current state of hype, but nonetheless I find them a bit too vague, simplistic, unrealistic, and unenlightening — for my own personal tastes.
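One way to make Gartner’s claim of operating “exponentially faster as word length is increased” at least precise, without endorsing it, is to note what it costs to simulate qubits classically: a state of n qubits requires tracking 2^n complex amplitudes. A sketch, where the memory figures in the comments assume 16 bytes per complex amplitude (two 64-bit floats):

```python
def statevector_size(n_qubits):
    """Number of complex amplitudes needed to classically represent n qubits."""
    return 2 ** n_qubits

# Rough memory cost at 16 bytes per amplitude:
#   10 qubits -> 1,024 amplitudes, about 16 KB
#   30 qubits -> about 17 GB
#   50 qubits -> about 18 PB, beyond any practical classical machine
cost_bytes = {n: statevector_size(n) * 16 for n in (10, 30, 50)}
```

This exponential growth of the classical description is the kernel of truth inside the hype; whether it translates into practical speedups for real problems is exactly the kind of question this framework is meant to sharpen.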
Roles which need to comprehend quantum computing
Not everybody who uses, programs, designs, builds, purchases, deploys, maintains, or repairs a quantum computer will need to have the same full depth of knowledge of all aspects of quantum computing.
A robust but not necessarily comprehensive sampling of the roles includes:
- Scientists. Especially physicists.
- Engineers. Electrical engineers. Mechanical engineers.
- Software designers.
- Software developers. Programmers. Coders.
- Software engineers.
- Software test engineers.
- Software testers.
- Application software developers.
- Technical writers.
- Project managers.
- Technical supervisors.
- Middle level technical managers.
- Product managers.
- Corporate executive management.
- User experience designers.
- Test engineers.
- IT specialists.
- Managers of IT specialists.
- Chief technology officers.
- Managers of users.
- Buyers of computers and software.
- Public policy staff.
- Government officials.
How much or what each of these roles needs to know about quantum computing remains to be seen.
Fields needed to fully and deeply comprehend quantum computing
What fields will one have to be fairly knowledgeable about to fully and deeply comprehend quantum computing?
The list presented here is the comparable list for traditional digital computing. The open issue is which of these fields is also needed to comprehend the full scope of quantum computing.
Think of it as a checklist of the fields to be considered. It may well be that any given field on this list is not needed, but making that determination will require a fairly deep comprehension of quantum computing, which I do not yet possess.
This is a semi-comprehensive list of the fields of study that are likely to encompass the vast bulk of everything that goes on inside of a traditional digital computer:
- Chemistry of materials
- Semiconductor fabrication
- Electrical engineering
- Computer engineering
- Mechanical engineering
- Computer science
- Software engineering
- Circuit design — digital and analog
- Energy sources — AC, batteries, solar, power management, and heat dissipation
- Graphic design
- User experience design
These are some or most of the general areas of knowledge and expertise needed to fully and deeply comprehend traditional digital computing; it remains to be seen which are equally relevant or relevant at all to a deep comprehension of quantum computing:
- Physics — Newtonian mechanics, electricity, magnetism
- Physics — Solid state physics
- Physics — Quantum mechanics
- Physics — Quantum field theory
- Physics — Flow of electrons in wires
- Physics — Flow of photons in waveguides, optical cables
- Physics — Flow of heat
- Chemistry of materials — Conductors
- Chemistry of materials — Insulators
- Chemistry of materials — Semiconductors
- Chemistry of materials — Batteries
- Chemistry of materials — Light sensitivity
- Chemistry of materials — Color properties
- Chemistry of materials — Solutions and solvents
- Software design
- Software architecture
- Math — Number theory
- Math — Real numbers
- Math — Rational numbers
- Math — Irrational numbers
- Math — Sets
- Math — Complex numbers
- Math — Algebra
- Math — Geometry
- Math — Trigonometry
- Math — Transcendental functions
- Math — Calculus
- Math — Probability
- Math — Statistics
- Math — Logic
- Math — Computational complexity
- Math — Linear algebra
- Computer science — Algorithms
- Computer science — Data structures
- Distributed databases
- Blockchain and other distributed transaction ledger technologies
- Data modeling
- Search engines
- Artificial intelligence (AI)
- Machine intelligence
- Machine learning
Specific topic areas
These are many of the specific topic areas that span the full range of knowledge and expertise needed to fully master all aspects of traditional digital computing; it remains to be seen which are equally relevant or relevant at all to a deep comprehension of quantum computing:
- Waves and particles
- Special relativity
- General relativity
- Quantum nature of electromagnetic radiation
- Electric field
- Magnetic field
- Magnetism for storage
- Discrete electronic components
- Data sheets for specification of electronic components
- Design of logic gates
- Design with logic gates
- Resistors, capacitors, inductors, diodes, transistors, crystals, switches
- Design of transistors
- Design with transistors
- Physics of transistors
- Analog electronic components
- Wires and cables
- Plugs, sockets, and connectors
- Switches and buttons
- Electronic digital computing
- Photonic digital computing
- The many ways a bit can be represented, stored, transferred, and operated on.
- Bits, bytes, words, nibbles, bit strings, characters, character strings.
- Decimal, binary, hex, octal, Boolean, enumeration, integer, floating point, arbitrary decimal precision, and complex number and data representations.
- Distinction between binary, bit, and Boolean
- What precisely does digital mean?
- What precisely does digital computing mean?
- Memory models
- Transient memory
- Static memory
- Dynamic memory
- Memory refresh
- Associative memory
- Parity and parity errors
- Error correction code (ECC) memory
- Data cache
- Instruction cache
- Random access memory
- Memory management — pages and segments, protection, sharing
- Virtual memory and paging
- Flash memory
- Mass storage
- Rotating storage, spinning storage
- Seek time
- Tape storage
- Central processing unit (CPU)
- Arithmetic and Logic Unit (ALU)
- Chip design
- Chip layout
- Circuit board layout
- Circuit board fabrication
- Language translation
- Software modules
- Software APIs
- Code libraries
- Application frameworks
- Network service APIs
- REST APIs
- Software Testing
- Software performance characterization and testing
- Software capacity characterization and testing
- Algorithm design
- Specific algorithms
- Computer science — Complexity theory
- Arithmetic expressions, operators, and functions
- Boolean logic expressions, operators, and functions
- Bit and bit string expressions, operators, and functions
- Logical expressions, operators, and functions
- Character expressions, operators, and functions
- String expressions, operators, and functions
- Operation codes (opcodes)
- Data types
- Data type conversion — implicit and explicit
- Language theory
- Languages — Human
- Languages — Computer programming
- Languages — specialized
- Language grammars
- Unrestricted languages and grammars, Type-0 languages
- Context-sensitive languages and grammars, Type-1 languages
- Context-free languages and grammars, Type-2 languages
- Regular languages and grammars, Type-3 languages
- Regular expressions, regex
- Language parsing
- Language lexemes and tokens
- Language parse trees
- Language syntax
- Language semantics
- Language meaning
- Language translators
- Code generation, code generators
- Code optimization
- Runtime systems
- Virtual machines
- Knowledge representation
- Knowledge semantics
- Knowledge meaning
- Machine intelligence
- Machine learning
- Real-time systems and programming
- Input and output devices
- Internet of Things (IoT) devices
- Math — Computability theory
- Math — NP-complete
- Coding algorithms
- Specific programming languages
- High-level languages
- Assembly language programming
- Machine language programming
- Layout of machine code
- Dumps, hex dumps
- Command languages
- Shell scripts
- Protocols for data transmission
- Data persistence
- Data formats
- Data formats for transmission
- Data formats for storage
- Data structures in general
- Data structure fields
- Data structure design
- Specific data structures — arrays, lists, sets, maps, heaps, hash tables, graphs, directed graphs, directed acyclic graph, trees, Merkle trees, blocks, records, rows, columns
- Objects and classes
- Object-oriented programming
- Class functions
- Symbolic constants
- Database principles
- Data model design principles
- Data consistency
- ACID data principles
- Relational database principles
- Query languages
- Data protection
- Data privileges
- Code protection
- Code privileges
- Code synchronization
- Distributed computing
- CAP theorem
- Data partition
- Data replication
- Server mirroring
- Data centers
- Cloud computing
- Single point of failure (SPOF)
- Client applications
- Writing requirements specifications
- Writing design specifications
- Writing architectural specifications
- Designing user interfaces
- Testing software
- Installing a computer
- Using a computer
- Maintaining a computer
- Human computer interaction (HCI)
- Human factors
- Response time
- Turing machines
- Finite state machine, automaton
- Pushdown automaton
- Auxiliary processing units
- GPU (Graphics Processing Unit)
- Central processor architecture
- Central processor block diagram
- Multi-core processors
- Hyperthreaded processing
- Parallel processing
- Embedded computers
- Data flow diagrams
- Flow charts
- Block diagrams
- Markup languages
- Specific markup languages — SGML, HTML, XML
- Graphic representations — 2D and 3D
- Picture and photographic representations
- Audio and video representations
- Analog and digital conversion for speakers, microphones, and cameras (still and motion video)
- Color models — RGB, HSV, HSL, CMYK
- Pointing devices
- Software tools
- Source code control
- Configuration management
- Debugging features and tools
- Testing features and tools
- Nondeterministic polynomial time (NP) problems
- NP-Complete problems
- NP-Hard problems
- Polynomial time (P) problems
- Church-Turing thesis
- Turing test for intelligence
- Hilbert space
- Hamiltonian energy operator
- Hermitian operators
- Adjoint operators
- Fourier transforms
- Heat flow
- Heat dissipation
- Websites, web pages
- Social media
- Checkpoint and restart
- Undo and redo
- Audit trail logging
- Event tracing
- Information theory (Shannon, et al) and the intersection of communication and computing; data and information coding
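Several of the math entries above, notably complex numbers, linear algebra, and probability, already come together in the smallest possible quantum computation. As an illustration only (plain Python, no quantum library assumed), here is the Hadamard gate applied to a qubit state as an ordinary matrix-vector product:

```python
import math

# The Hadamard gate as a 2x2 matrix (a unitary transformation).
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_gate(gate, state):
    """Matrix-vector product: how any single-qubit gate acts on a state."""
    return [sum(gate[r][c] * state[c] for c in range(2)) for r in range(2)]

zero = [1.0, 0.0]              # the state |0>
plus = apply_gate(H, zero)     # equal superposition of |0> and |1>
probs = [a * a for a in plus]  # measurement probabilities: 0.5 and 0.5
```

Applying H a second time returns the state exactly to |0>, a small concrete instance of why linear algebra, not Boolean logic, is the native language of quantum gates.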
Specific questions, unknowns, or issues
These are some of the specific questions and issues beyond all of the general and specific areas of interest to traditional digital computing; it remains to be seen which are equally relevant or relevant at all to a deep comprehension of quantum computing:
- Smallest transistor possible.
- Fastest transistor possible.
- Smallest logic gates (AND, OR, NOT, flip flop) possible.
- Fastest logic gates possible.
- Thinnest wire or connection possible between gates.
- Shortest wire or connection possible between gates.
- Smallest memory cell possible.
- Fastest memory cell possible.
- Impact of cosmic radiation on electronic components.
- Impact of background radiation on electronic components.
- Impact of impurities on electronic components.
- Speed of electron flow within and between electronic components.
- Impact of speed of light on electronic circuits.
- Impact of neutrinos on electronic circuits.
- Generation of random numbers. Quality of randomness or pseudorandomness.
- Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by even the most sophisticated professional or certified genius.
- Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by an average professional.
- Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by even a team of the most sophisticated professionals.
- Most complex algorithm, computer program, software system, application, or computing system that can be fully comprehended by a team of average professionals.
- All of the above applied to photonic components as well.
- What makes metals such as copper, aluminum, silver, and gold be conductors?
- What makes materials such as silicon dioxide be insulators?
- What makes materials be semiconductors?
- What thought processes are needed to solve problems with a traditional digital computer?
- How does a transistor really work (physics)?
- What’s so special about electrons (physics)?
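One item above, the quality of randomness or pseudorandomness, can at least be probed empirically even before quantum hardware enters the picture. A deliberately crude sketch of a frequency test in Python follows; real test suites, such as the NIST SP 800-22 battery, go far beyond this single check:

```python
import random

def ones_fraction(bits):
    """Fraction of 1s in a bit stream; should be near 0.5 for good randomness."""
    return sum(bits) / len(bits)

random.seed(42)  # fixed seed so the sketch is reproducible
bits = [random.getrandbits(1) for _ in range(10_000)]
ratio = ones_fraction(bits)  # roughly 0.5 from Python's Mersenne Twister
```

A quantum computer, by contrast, is claimed to offer true physical randomness from measurement, which is one of the simplest claims this framework should eventually test against the underlying physics.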
My presumption is that a deep understanding of quantum mechanics is essential to a deep understanding of quantum computing. Again, my interest is not understanding quantum computing at a shallow level, the way an undergraduate computer science major would understand traditional digital computing, but to understand at a deep enough level to sift through and dispel any and all hype.
There are so many online resources for quantum computing that it is difficult to know where to start. To a large extent, it depends on what your objectives and immediate needs are.
This paper won’t endeavor to catalog or even highlight the more notable resources.
But I can’t resist linking to Prof. Richard Feynman’s watershed 1982 paper, Simulating Physics with Computers.
The official publication page:
A (bootleg?) copy of the full journal article:
A related article:
Again, the goal here is not to provide the reader with answers on quantum computing, or even attempt to point them in the right direction, but simply to begin developing a framework of the right questions to be asking about quantum computing.
But, I can’t resist providing at least a few resources…
Wikipedia is notorious for being a dubious source of information about anything, but for anybody with at least half a brain capable of properly filtering dubious input it is always a semi-decent starting point:
There is a separate Wikipedia page for a Timeline of quantum computing, which lists notable events, breakthroughs, and commercial offerings in the history of quantum computing, and is kept reasonably up to date:
IBM is doing a lot of work in quantum computing:
Microsoft as well:
As might be expected, Google is right there as well:
Many if not most premier academic institutions are deep into all things quantum.
For a relatively complete list of major academic institutions with quantum computing programs:
Books? Yes, there are quite a few books on quantum computing. A Google search for “quantum computing books” will quickly highlight many of them, even some available as free PDFs. Personally, I’m more interested in informal or web-style resources rather than dense formal textbooks, which are quite expensive. In any case, I’ll personally refrain from recommending specific books on quantum computing, at least for now.
I’m anxious to dig into the variety of startups in the quantum computing space, but I don’t have a decent list to present at this time. But I would mention one that gets a fair amount of press (and hype?): D-Wave Systems:
As mentioned upfront, this informal paper is a work in progress, so I’ll continue to update the lists in this paper as I dig deeper into quantum computing and realize how many points of commonality or difference there really are between quantum and traditional digital computing.
I won’t be attempting to provide hard, definitive answers to any of the questions in this paper — that will be left to yet another paper or multiple papers, depending on how simple or complex the correspondence between quantum and traditional digital computing turns out to really be.
I’m also interested in compiling a relatively comprehensive glossary of quantum computing terms which have distinct meaning from similar terms in traditional digital computing. As well as a glossary which is a subset of quantum mechanical terms which are essential to a deep comprehension of quantum computing. Update: I’ve compiled an initial draft for such a glossary: Quantum Computing Glossary — Introduction.
At some point I intend to produce a What is Quantum Computing? paper. The puff pieces and hype that are out there, even the Wikipedia page, are all woefully inadequate, in my own view.
Meanwhile, my main focus in the near term will be on completing the three MIT online courses that cover quantum mechanics. Or at least reviewing them: I have no expectation of truly mastering the material, but I do expect to learn enough to pass judgment on the veracity of a lot of the claims being made about quantum computing, at least from the perspective of the underlying quantum mechanical theory.
Once I get basic quantum mechanics nailed, then I will move on to slogging through a few of the various online courses on quantum computing in particular.
And I am very interested in cataloging the startup space.
How much of all of this I will succeed at completing and in what timeframe remains to be seen. My hope is that my preliminary list of questions about quantum computing should be sufficient to kick start the thought and discussion process even if I accomplish very little beyond that.
Update 9/3/2018: Some subsequent papers cover more specific detail: