# Lingering Obstacles to My Full and Deep Understanding of Quantum Computing

After over a full year of intensive effort on my part, there are still more than a few essential concepts of quantum computing which remain more than a bit beyond my full grasp. The goal of this informal paper is simply to enumerate those areas where my knowledge of quantum computing has gaps which are serious enough to leave me feeling rather uncomfortable.

My goal in quantum computing is not to become an absolute expert or even a competent practitioner, but simply to be able to critically, competently, and credibly evaluate claims as to what computations can or cannot be performed by a quantum computer. In other words, to help dispel the massive hype that has arisen around quantum computing.

This informal paper is not designed to either provide a tutorial on quantum computing or to delve deeply into any of the issues that are raised.

My personal background is as a professional software developer with an undergraduate and graduate education in computer science. I’m neither a physicist nor a mathematician, which makes it a lot more difficult to evaluate a lot of the material published about quantum computing.

Before diving deep into quantum computing in mid-2018, I first spent a couple of months laboring through the better part of two online MIT undergraduate courses in quantum mechanics (or quantum *physics* as they call it.) That doesn’t qualify me as an expert in quantum mechanics or linear algebra, but that gave me a reasonably solid foundation to actually dive into quantum computing. Or so I thought.

I began compiling a glossary of terms used in quantum computing, including a fair chunk of quantum mechanics, as my reading on quantum computing progressed. I first published my glossary on Medium in July 2018 with over 2,000 entries and have been updating it ever since. As of March 2019 it contains over 3,000 entries. You can find it here:

I can’t tell you how many academic papers, online lecture notes, and even chapters of books I’ve labored through over the past year, but even with all of that, I’ve still found that there are more than a few key concepts that are more than a little too fuzzy in my head for my taste.

I’ve already cataloged a rather long list of all of the minute details of quantum computing for which I have even a hint of uncertainty. You can find it here:

I have reached a semi-decent understanding of many or even most of the key areas, maybe even to a 90% level of understanding, but still not to a 100% level of understanding.

I may have an opinion as to how to answer a lot of the questions in my document, but I still lack an absolute 100% certainty.

After diving into quantum computing proper for a few months in the summer of 2018, I spent about the past six months focused most intently on quantum algorithms, including and specifically Shor’s algorithm for factoring integers, which could presumably be used to crack even the strongest public key encryption using quantum parallelism, once we have quantum computers with enough qubits, in theory. I achieved mixed results — I definitely learned a lot and a lot is clear to me, but there are still way too many gaps and soft spots where I don’t feel as if I have a 100% comprehension of what is really going on. That uncertainty is the primary motivation for this paper.

I have already documented my knowledge and concerns about Shor’s algorithm in two informal papers:

- *Some Preliminary Questions About Shor’s Algorithm for Cracking Strong Encryption Using a Quantum Computer*
- *Ingredients for Shor’s Algorithm for Cracking Strong Encryption Using a Quantum Computer*

To put it as simply as possible, I feel that I need to achieve absolute 100% comprehension of the basic, popular algorithms of quantum computing before I will have a solid enough foundation to consider quantum computing more broadly.

I have been unable to locate a single source of knowledge on quantum computing which addresses all of my concerns to a 100% level of certainty. Each treatment appears to have its own areas of focus. Even in areas of overlap the treatments are not 100% consistent. Even if not outright explicitly inconsistent, some degree of vagueness precludes 100% certainty — at least on my part.

Some issues are more of a comfort factor — if I just understood better what is really going on, I’d feel more comfortable that I really know what is actually going on.

# Why all of the fuzziness?

A lot of the concepts of quantum computing are fuzzy for a number of reasons:

- Arcane conceptual basis of quantum mechanics itself.
- Much material was sourced from or for physicists and mathematicians, not classical computer scientists.
- The difficulty of linear algebra, especially how it meshes with real physical systems.
- Terminology that is needlessly inconsistent with classical computing. Operations or instructions on a quantum computer are called “gates”, which is classically a hardware term. A quantum program is referred to as a “circuit”, which is classically a physically-connected collection of hardware components rather than the sequence of operations which comprise a classical program. A qubit refers to physical hardware, comparable to a flip flop, in contrast to a classical bit, which is information rather than the hardware device which *holds* a bit.
- Quantum computing has not yet developed its own level of abstraction to remove it from the realm of physics, arcane mathematics, and discrete electronic devices (e.g., transistors, digital logic gates, and flip flops), comparable to how classical computing has developed with Boolean logic, Turing machines, and computer science in general. There is a need for the development of formalisms for a **quantum computer science**.
- The **probabilistic** nature of quantum computing requires a different mindset than the hard and reliable determinism of classical computing.
- Lack of clarity, specificity, and detail in the terminology of quantum computing, beyond the difficulties with quantum mechanics and linear algebra.
- Vagueness in the specification of operations (gates) in quantum programs.
- Lack of detail in the specification of edge cases for quantum operations.
- Lack of detail for parameters of quantum operations which are continuous rather than a discrete 0 or 1.
- A little too much **hand-waving**. Okay, *way* too much hand-waving.
- More than a little too much **hype**. Too much marketing when more basic science is needed.
- Significant **intuition** is needed, but the intuition is distinctly *unintuitive*. Much better guidance is needed for developing the requisite intuition.
- Insufficient guidance for the **design of algorithms**.
- Insufficient guidance and detail for the **basic building blocks for algorithms**.
- Lack of consistency in the structure of documentation for quantum computers from different vendors.
- Severe hardware limitations of existing and near-term quantum computers make it very difficult to conceptualize practical solutions to practical problems that have a chance of running on current and near-term real quantum computers.
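To make the terminology point above concrete, here is a minimal sketch, in plain Python, of what a quantum “program” actually is: the “circuit” is just a sequence of “gate” applications to a vector of complex amplitudes. All names here are my own, for illustration; this is a toy state-vector simulator, not any vendor’s API.

```python
import math

# Toy state-vector simulator (names are my own, for illustration only).
# A "circuit" is literally just a sequence of "gate" applications, which
# is the sense in which a quantum program is called a circuit even
# though no hardware circuit is being wired up.

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # Hadamard gate (2x2 unitary)

def apply_1q(state, gate, target):
    """Apply a one-qubit gate to qubit `target` of a state vector."""
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        bit = (i >> target) & 1
        for out in (0, 1):                    # mix the paired basis states
            j = i ^ ((bit ^ out) << target)
            new[j] += gate[out][bit] * amp
    return new

def apply_cnot(state, control, target):
    """Flip `target` in every basis state whose `control` bit is 1."""
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        j = i ^ (1 << target) if (i >> control) & 1 else i
        new[j] = amp
    return new

# The "program": start in |00>, then H on qubit 0, then CNOT(0 -> 1).
# Qubit 0 is the low-order bit of the state-vector index.
state = [0j] * 4
state[0] = 1 + 0j
state = apply_1q(state, H, 0)
state = apply_cnot(state, 0, 1)
# Result is the Bell state (|00> + |11>)/sqrt(2): equal amplitudes on
# indices 0 and 3, zero elsewhere.
```

Note that the entire “program” is three lines; everything else is the simulator. That imbalance is itself a fair picture of how primitive today’s quantum programming model is.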

# The lingering obstacles

Enough with the preliminaries. Here’s my list of lingering obstacles, in two lists. The first is a more general list and the second is comprised of more narrow and specific issues.

# The more general obstacles

To be clear, I am not claiming that I am completely or even mostly ignorant about the concepts listed in this section (as well as more basic concepts which are not even listed), but simply that I have lingering uncertainties which I have been unable to resolve as of this date.

- The **Bloch sphere** is too simplistic and incomplete an account of how qubits really function. Amplitude is not represented. The orthonormal nature of the basis vectors is not represented. Joint state is not represented, such as a register of qubits which are each superposed using a Hadamard gate but not entangled.
- Programming a quantum computer is comparable to **machine or assembly language** on a classical computer, but even more primitive, with no notion of even **basic arithmetic** or the features of a **Turing machine**, and lacking higher-level language abstractions such as functions with parameters, conditional branching, looping, and rich data types (including text, images, and media data). Developing software or designing algorithms for a quantum computer is like working with raw transistors, simple digital logic gates, and bare flip flops.
- How does **entanglement** factor into the design of algorithms? The question is not how to entangle two qubits, but how generally to exploit entanglement in algorithms. To put it more plainly, what exactly is entanglement good for, and how should it be used? How does entanglement work in a typical algorithm? Need for explicit design principles and design patterns.
- How does **interference** factor into the design of algorithms? Similar issues as with entanglement. How does interference work in a typical algorithm? Need for explicit design principles and design patterns.
- How does **quantum parallelism** factor into the design of algorithms? Similar issues as with entanglement and interference. How does quantum parallelism work in a typical algorithm? Need for explicit design principles and design patterns. A lot of the steps are fairly straightforward, but there are some unintuitive intuitive leaps as well. The unintuitive intuition needs more detailed elaboration with less hand waving. A lot of the treatments explicitly admit that you have to be quite *clever* at extracting usable (classical) data from the results of a quantum parallel computation, but offer little explanation of how to arrive at that cleverness.
- How does **phase estimation** factor into the design of algorithms? Need for explicit design principles and design patterns.
- How does **phase** in general factor into the design of algorithms? Need for explicit design principles and design patterns.
- How exactly does **amplitude amplification** work, and how is it factored into the design of algorithms? It is referenced in a few places, but sketchy on details, let alone more general guidance with principles and design patterns.
- Are rotations about the three axes inherently **analog and continuous** rather than digital or quantized? In contrast to the binary nature of the two basis states |0> and |1>.
- Why must all quantum computation necessarily be **reversible**? Okay, quantum mechanics says so, but… *why*? What is the actual phenomenon at work here?
- Why does **measurement** necessarily result in **collapse of the wave function**? Okay, quantum mechanics says so, but… *why*? What is the actual phenomenon at work here?
- Why is the **no-cloning theorem** necessarily true? Again, okay, quantum mechanics says so, but… *why*? What is the actual phenomenon at work here? And how can the SWAP gate work at all if the no-cloning theorem is true?
- It is difficult to mentally conceptualize how to factor **quantum decoherence** into the design of quantum algorithms beyond toy-like size. Such as with Shor’s algorithm, where millions of gates will have to be executed for factoring 2048 and 4096-bit integers.
- It is difficult to mentally conceptualize how to factor the **probabilistic** nature of quantum parallelism into the design of quantum algorithms beyond toy-like size. Such as with Shor’s algorithm, where modular exponentiation will be performed on 2048 and 4096-bit integers — and deterministic results are needed.
- The relevance of **number theory** to quantum computation, such as order-finding, Shor’s algorithm, and phase estimation. It’s too much to expect algorithm designers to be expert in the arcane mathematics of number theory, so a customized, sanitized subset as well as principles and design patterns focused on quantum algorithms are needed. Not just the magic formulas to be used, but guidance for developing an intuition for how to integrate number theory and quantum computation.
- Need for a lot more **examples** of quantum computing concepts being applied to real-world problems, with detailed plain English narrative about how the quantum computing concepts are actually being used and how they actually work.
- Why exactly does the Clifford gate set plus the T gate constitute a **universal gate set**? Universal with respect to what? Plain language, please.
- What **computational tasks** are possible using quantum computing? Current descriptions are rather minimal at best.
- Need for a richer set of quantum **algorithmic building blocks**, rather than simply raw gates and a few larger algorithms such as QFT and phase estimation. With both their external function and their internal operation explained fully, in detail, in plain language.
- Need for a plain language narrative **design guide** for how to convert plain language real-world problems into quantum solutions using quantum algorithmic building blocks.
- Quantum computing is *infected* with too many **Greek symbols and odd notations** inherited from quantum mechanics and mathematics, which make it much more difficult to read and certainly more difficult to write. Simple symbolic names are needed for most constructs.
- More **online interactive quantum simulators** are needed, both for real machines and for general, abstract, theoretical quantum computers. They should display all amplitude transitions, even though amplitudes are not observable in real quantum computers.
- More **online interactive quantum code repositories** (e.g., on GitHub) are needed for common gate sequences. And easy ways to access them within online interactive simulators.
- Detailed **comments** are needed for all quantum code. Detail *why* the code is doing what it is doing, especially any interesting or unexpected consequences or side effects. Include links to any online academic papers from which the gate sequences were derived.
- Need for a more formalized specification and **documentation of the programming model** for quantum computers, both in general and for each specific model of machine. Focusing on the instruction set, gates, or operations. Everything which a quantum programmer needs to know to successfully develop a quantum program, without the need for experimental trial and error just to figure out what operations (gates) actually do. See my paper *Framework for Principles of Operation for a Quantum Computer* for a proposal for the content of such a document. Current documentation is vague, imprecise, and incomplete.
- Even with relatively complete documentation of the programming model, it may still be necessary to have **detailed documentation of the implementation of the programming model** to gain enough insight to develop a true intuition for programming a particular quantum computer, or even quantum programming in general. See the *Implementation Specification* section of my paper *Framework for Principles of Operation for a Quantum Computer* for a proposal for the content of such a document.
- Even with relatively complete documentation of the programming model *and* its implementation, it may still be necessary to have access to the **source code for the firmware** and possibly even the **schematics for the electronics** of the quantum computer, including any configuration parameter settings, to gain enough insight to develop a true intuition for programming a particular quantum computer or even quantum programming in general. But the theory is that the documentation — principles of operation and implementation — should be sufficient and have all the necessary hints to develop the necessary intuition. Still, a fully open source quantum computer would be a very good thing.
- Being **probabilistic**, how close to **deterministic solutions** can we expect from quantum computing? How does this constrain or limit what problems or algorithms are appropriate for quantum computing? Alternatively, to what extent have we been fooling ourselves all along when accepting strictly deterministic solutions from classical computing for real-world problems which are inherently probabilistic rather than strictly deterministic?
- How much depth of knowledge is needed about **quantum information theory**?
- We need to devise a **new set of terminology and language constructs for computer science** that facilitate discussion of quantum computing in the context of a hybrid of classical and quantum computing. We need semantics for one world of computing, not two independent worlds. We need both highly technical terms and plain, natural language constructs as well. For example, “quantum parallelism” is fine as far as it goes as a simple, generic, abstract umbrella term, but it dead-ends at a semantic cliff with no further plain language, non-jargon support.
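Several of the items above (the probabilistic mindset, quantum parallelism, and whether joint states are “real”) come down to what the amplitudes of a multi-qubit state actually look like numerically. Here is a small sketch of my own, in plain Python, showing the uniform superposition produced by a Hadamard on each of n qubits, and measurement as nothing more than sampling. The function names are mine, purely for illustration.

```python
import math
import random

# Sketch (my own illustration): after a Hadamard on each of n qubits,
# starting from |0...0>, the joint state is a uniform superposition:
# all 2**n basis states share the same amplitude, 1/sqrt(2**n).

def hadamard_all(n):
    amp = 1 / math.sqrt(2 ** n)
    return [amp] * (2 ** n)

def measure(state):
    """Measurement as sampling: return one basis-state index with
    probability |amplitude|**2. This is the only kind of output a real
    quantum computer yields; amplitudes themselves are not observable."""
    r, acc = random.random(), 0.0
    for i, a in enumerate(state):
        acc += abs(a) ** 2
        if r < acc:
            return i
    return len(state) - 1

state = hadamard_all(3)                 # 8 amplitudes, each 1/sqrt(8)
probs = [abs(a) ** 2 for a in state]    # each 0.125; they sum to 1.0
outcome = measure(state)                # a random integer in 0..7
```

A single measurement of this state tells you almost nothing, since every outcome is equally likely. The “cleverness” the treatments allude to is in arranging interference so that amplitude concentrates on the answers you want before you measure.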

# The more specific obstacles

- How can quantum parallelism work for an n-qubit “register” if those input qubits are not entangled in some way? The Hadamard gates on each qubit create a superposition within each qubit, but not entanglement between qubits.
- What is the simplest (and useful) example of quantum parallelism? With a full and detailed plain language explanation of how entanglement, interference, and other quantum computing concepts are being used. Most treatments skip over too many essential details — leaving them as “an exercise for the reader.”
- Can more than two qubits be entangled — on current machines? So-called multipartite entanglement.
- Can the |GHZ> and |W> states be constructed — on current machines, using only pairwise, bipartite entanglement gates?
- Can CNOT gates be daisy-chained after an H gate to entangle more than two qubits — on current machines?
- Which unitary transform formulations necessarily result in entanglement? What is the rule or magic that causes entanglement in some cases but not others?
- Which gates necessarily result in entanglement? Only CNOT, or others as well?
- Can some gates conditionally result in entanglement? Why? What is the underlying phenomenological mechanism at work?
- What is the number of binary bits or decimal digits of precision supported for entries of unitary matrices and gate parameters, such as rotation and phase? Single-precision floating point? Double? Quad precision? What does the software API support, what does the firmware support, what does the gate execution support, what do the lasers, microwaves or other qubit manipulation technology support, and what do the actual qubits support?
- What is the finest resolution of rotation for rotation gates? What is the smallest possible rotation? What is the largest possible rotation? What is the smallest possible difference between two rotations which can have a discernible impact on the results? Are rotations inherently *analog* rather than digital — continuous with a large number of values vs. discrete with a relatively small number of values? Is there any reason to think of rotations as digital rather than analog — other than rotation angles of multiples of pi/2? How many unique angles can be represented for each of the three axes? How many unique pairs of angles can be represented by simultaneous rotations about two axes? How many distinct states could a qubit be in?
- Is the precision supported in unitary matrices and gate parameters sufficient for reasonably large quantum Fourier transforms (QFT) and phase estimation, such as is needed for using Shor’s algorithm to factor large integers, such as for 1024, 2048, and 4096-bit public encryption keys?
- How does the SWAP gate avoid the no-cloning theorem?
- If the two qubits for a SWAP gate are entangled with two other qubits, what exactly happens with those other two qubits? Sequence: entangle 1 and 2, entangle 3 and 4, now try to swap 2 and 3.
- If the two qubits for a SWAP gate were entangled with each other, what exactly happens? Does anything change at all? It should be possible to answer this question purely from reading the doc for the SWAP gate. Or from the doc for CNOT, if SWAP simply expands to three consecutive CNOT gates.
- What happens if the target qubit of a CNOT gate is already in a superposed state — not exactly 1.0|0> or 1.0|1>? What if the magnitudes of the amplitudes of the two basis states of the target qubit are unequal (and non-zero), such as sqrt(0.75)|0> + sqrt(0.25)|1>?
- What happens if CNOT is performed on two qubits which are already entangled with two other qubits? Are all four qubits then entangled? Do the other two qubits become unentangled? Sequence: entangle 1 and 2, entangle 3 and 4, now try to entangle 2 and 3. And what happens if this last step was replaced with trying to entangle 1 and 4? Again, this needs to be clear from the doc — either or both of the CNOT and entanglement doc.
- What is the physical realization of phase? How discrete/quantum is it, or is it more analog? What resolution of rotation can be realized at the hardware/physics level, as distinct from what the machine is designed to provide to the programmer?
- Why are there no Bell states with complex amplitudes? Or at least no examples that I have seen.
- What are mixed and pure states? Is there more than one meaning for these terms? Are the terms reserved only for multi-qubit ensembles, or can they apply to the two basis states of a single qubit?
- If a state of |0> or |1> has an amplitude of 1.0, is it considered a *pure state*, or is that a misuse of the term, reserving it for multi-qubit *ensembles*? If pure state is not the proper term, what is the proper and preferred term?
- Are joint states real, or just an artificial mental device? Such as performing a Hadamard transform on n qubits.
- What is the smallest possible amplitude? If I have 1024 qubits and perform a Hadamard transform on all of them, is the amplitude of each of the 2¹⁰²⁴ computational basis states of the joint state really 1/sqrt(2¹⁰²⁴), which seems too impossibly small to be real?
- What exactly does it mean to measure in other than the computational basis? When might this be useful? More plain language and more examples are needed.
- How is the amplitude of the basis vectors represented in the Bloch sphere?
- What exactly does a CNOT gate do if the amplitude of |1> in the control or target qubit is not exactly 0.0, 1.0, or 1/sqrt(2)? Most descriptions simply refer to “if the control qubit is |1>” without describing what happens when a superposition with amplitude other than 1.0 or 0.0 is used.
- What is a density matrix or density operator, and how is it used or required in quantum computation?
- What is the significance of the **trace** of a unitary matrix? How should programmers be using the concept in algorithm design? It’s simply defined as the sum of the entries on the main diagonal of a square matrix, but phenomenologically, what is actually going on and why is it significant? It has something to do with a density matrix and pure vs. mixed states. A clearer definition of pure and mixed states might clarify the matter.
- How does Shor’s algorithm really work, or how *well* does it work? I have more detailed questions in my paper *Some Preliminary Questions About Shor’s Algorithm for Cracking Strong Encryption Using a Quantum Computer*.
- How does quantum parallelism actually work? The technical details, but at a computational level rather than (and in addition to) the raw physics and raw quantum mechanics.
- How does interference actually work? The technical details, but at a computational level rather than just the raw physics and raw quantum mechanics.
- How is it that amplitude cannot be directly measured, but phase can be estimated?
- How does phase estimation really work? If you can design a clever algorithm to estimate phase, why not just encode that algorithm in hardware or firmware and offer it as a gate, possibly with some parameters, including how many bits of precision you want, and then say that this pseudo-gate *can* measure phase (otherwise part of amplitude, which cannot be directly measured)?
- How does a Fourier transform (quantum or non-quantum) actually work? What are the exponentials actually doing — in plain language? What is the *intuition* needed to feel comfortable with quantum Fourier transforms?
- How much precision is needed for a quantum Fourier transform to be useful? What considerations and factors are relevant to the algorithm designer?
- How much resolution is needed for a typical QFT? What factors need to be considered by the algorithm designer?
- How restrictive can *banding* or other forms of *approximate* Fourier transform be without resulting in such an extreme loss of precision as to render them relatively useless for other than mere toy examples?
- There is disagreement between various treatments as to whether the exponent for a QFT is positive or negative. And hence, whether an *inverse* QFT has a negative or positive exponent. A classical DFT has a negative exponent, and its inverse has a positive exponent, but it seems common for treatments to claim that a QFT has a positive exponent. Maybe they are actually using an inverse QFT but neglecting the adjective. It feels too loose and sloppy to me. I see this treatment in Shor’s paper.
- Is the divisor in the exponent of a QFT the number of bits or the number of states superposed in those bits — n vs. 2^n?
- How much do we need to know about Clifford groups and Clifford gates to adequately exploit quantum computing? Again, all in plain language — the raw mathematics is completely unhelpful.
- What exactly would a continued fraction expansion look like? I’ve seen a couple of references, particularly for Shor’s algorithm, but no actual code. Presumably, this would be classical code, but what exactly would the quantum input look like?
- The specific meaning of the phrases “*up to a factor*”, “*up to a global phase*”, and “*up to a global phase factor*”. Again, in plain language.
- Is there a connection between period-finding (or order-finding) and phase estimation? Phase estimation is thrown around a little too loosely to be very helpful.
- It would be helpful to give an intuitive, plain language justification for the need for complex numbers for amplitudes. Just because quantum mechanics uses them is not a sufficient justification. Maybe physicists and mathematicians are comfortable with them, but that’s hardly a justification for their use in quantum computing. Again, plain language.
- It would be helpful to give an intuitive, plain language justification for the need for phase. I suspect that phase is one of the features of quantum computing which has a lot of untapped potential, but it’s still a bit too vague to know for sure. It hasn’t been presented well. Again, plain language is needed.
- What operations can be performed on entangled qubits without disrupting (collapsing) their entanglement? It would be helpful to have an intuitive, plain language enumeration and explanation of the utility of various ways of working with entangled qubits.
- Is there an explicit way to **unentangle** two (or more) qubits? I’ve seen no mention of this anywhere. And what state does it leave each qubit in?
- Need for a clear definition, explanation, discussion, and examples for *global property*. Including how it relates to interference and quantum parallelism. Again, plain language.
- The concept of **uncomputation** needs more thorough treatment. It is related to the concept of all quantum computation needing to be *reversible*. It seems to be more narrowly related to the reuse of *ancillary* qubits. The claim is that without it, the *interference* needed for quantum computation will be disrupted. But it’s all a bit too vague. Maybe a more thorough treatment of interference will shed some light. In particular, it’s unclear why uncomputation won’t be just as disruptive to interference as raw reuse of ancillary qubits.
- How much of a deep knowledge of **linear algebra** is needed to fully comprehend and fully exploit quantum computation?
- How much of a deep knowledge of **Hilbert spaces** is essential to fully comprehend and fully exploit quantum computation? Or of **vector spaces** in general?
- How much of a deep knowledge of **complex numbers** is needed to fully comprehend and fully exploit quantum computation?
- How much of a deep knowledge of **adjoint operators** is needed to fully comprehend and fully exploit quantum computation?
- How much of a deep knowledge of **unitary matrices** and **unitary operators** is needed to fully comprehend and fully exploit quantum computation?
- How much of a deep knowledge of **Hermitian operators** is needed to fully comprehend and fully exploit quantum computation?
- Is there any Bloch sphere-like visual model for Bell states?
- Are there Bell states for entanglement of more than two qubits? There are also the |GHZ> and |W> states — are they technically “Bell” states or not?
- How much of a deep knowledge of **Ising coupling** is needed to fully comprehend and fully exploit quantum computation?
- How much of a deep knowledge of **spin** is needed to fully comprehend and fully exploit quantum computation?
- How much depth of knowledge is needed about **Hamiltonians**?
- Is a deep understanding of the **wave-particle duality** of quantum mechanics necessary to fully comprehend and fully exploit quantum computation? When are wave aspects more helpful, and when are particle aspects more helpful?
- How much depth of knowledge of **wave functions** is needed to fully comprehend and fully exploit quantum computation?
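For a couple of the SWAP and CNOT questions above, I believe the answer follows from linearity: a gate acts on each basis state of a superposition independently, so “if the control qubit is |1>” really means “on each basis state whose control bit is 1”. Here is a toy check of my own (not from any vendor’s documentation) of the claim that SWAP is three CNOTs, showing that SWAP *moves* amplitudes rather than copying them, which is why it does not conflict with the no-cloning theorem.

```python
import math

# Toy check (my own code) that SWAP = three CNOTs on a 2-qubit state
# vector, and that SWAP moves an amplitude pattern from one qubit to
# another rather than copying it. Qubit 0 is the low-order bit of the
# state-vector index.

def cnot(state, control, target):
    """CNOT acts linearly: it flips `target` in each basis state whose
    `control` bit is 1, whatever the amplitudes happen to be."""
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        j = i ^ (1 << target) if (i >> control) & 1 else i
        new[j] = amp
    return new

def swap01(state):
    # The standard decomposition: CNOT(0,1), then CNOT(1,0), then CNOT(0,1).
    state = cnot(state, 0, 1)
    state = cnot(state, 1, 0)
    return cnot(state, 0, 1)

# Qubit 0 in the unequal superposition sqrt(0.75)|0> + sqrt(0.25)|1>
# (one of the cases asked about above), qubit 1 in |0>:
a, b = math.sqrt(0.75), math.sqrt(0.25)
state = [a + 0j, b + 0j, 0j, 0j]      # a|00> + b|01>
state = swap01(state)
# The superposition now lives entirely on qubit 1: a|00> + b|10>.
# Nothing was duplicated; the amplitude on index 1 moved to index 2.
```

After the swap, qubit 0 is back in |0> and qubit 1 carries the original superposition; at no point do two qubits hold independent copies of the same unknown state, which is what the no-cloning theorem forbids.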

# And even more obstacles

Sure, there are plenty of additional questions on my list (**Questions About Quantum Computing**), but I feel that if I could just get to 100% confidence on the issues presented in this paper, then I’d be a lot more certain about quantum computing in general, and likely to be well-positioned to answer a lot of my other questions on my own. And, I’d be well-positioned to begin turning my master list of questions into an FAQ with actual, confident answers.

# Greatest challenges for quantum computing

And beyond all of the obstacles for my own comprehension of quantum computing listed thus far, I’ve also written an informal paper which documents the greatest challenges faced by quantum computing itself:

# Under the hood

Looking under the hood is a time-honored solution to figuring out how things really work for any device, machine, or product, but this is problematic with proprietary machine designs and still difficult even with fully-open source designs. The proper solution is for the doc for all features visible to the quantum programmer to be fully explicit to the degree that the answers to all of the types of questions I am asking are clearly, fully, and readily available.

# Quantum ready?

Hah! Not even close! Current and projected near-term quantum computers are not even remotely close to being *quantum ready*. There’s no point in programmers being quantum ready for quantum computers which are nowhere close to being ready for real-world applications rather than small, toy-like applications.

Any efforts to become intimately familiar with today’s quantum computers will *not* prepare software developers for the quantum computers which will be commonplace five to ten years from now.

The technology for quantum computers and the algorithm design techniques for quantum computation will have to evolve dramatically, no, make that *radically*, before quantum computers are truly ready for general use by other than the most elite of software developers.

Even three to four years would be extremely optimistic. Five to seven years would be a safer bet. And eight to ten years would be closer to a slam dunk.

Personally, I don’t expect much more than incremental progress over the next two years. Somewhere in the 160 to 256 qubit range. Maybe 512 qubits at the very outside of optimism in two (to three) years. And that’s just the raw qubit hardware, without any notion as to how algorithm design techniques will evolve.

Whether we can start turning the corner in three or four years is a real crap shoot, in my opinion. Only then can we start *considering* what *quantum ready* will really mean.

# What’s next?

I’ll keep slogging through every bit of literature on quantum computing which I can get my hands on, hoping to resolve at least a tiny fraction of my uncertainties as the months go by. I’ll update my **Quantum Computing Glossary** as I go.

I’m thinking of going back and reading the oldest papers I can find, to get a stronger sense of what was originally motivating the design of quantum hardware and how they expected the hardware to be used. I’ll start with Feynman’s 1982 paper and go from there.

I’m also thinking of going back and reading older papers on quantum hardware design to get a better sense of what the hardware under the hood can actually do.

And I’ll keep an eye out for fresh papers and treatments which may illuminate any of the issues I have raised in this paper.

My hope is that I will incrementally stumble on treatments which do provide clear and complete accounts of my specific uncertainties, but I don’t have great confidence in uncovering adequate treatments for the vast majority of my uncertainties. Instead, I hope to uncover enough bits and pieces or “puzzle pieces” that enable me to come up with adequate treatments on my own.

I may or may not continue to expand the lists in this paper in the months to come since I feel that I really have already uncovered most of the stumbling blocks. A few may remain undiscovered, but probably not many.

At some stage I will finally turn the corner and start answering more questions than I can ask. I really would like to start turning **Questions About Quantum Computing** into a true FAQ. But until that point, I’ll continue to expand my lists of questions.

My ultimate goal at some stage is to write a semi-decent tutorial for quantum computing which addresses all of the shortcomings I’ve identified in this informal paper. And also a guide for design of quantum algorithms.

I’d also like to eventually take a stab at designing a true quantum programming language in which all of the principles and design patterns for algorithms are fully supported as first-class operations, rather than manually constructed quantum programs based on discrete, low-level gates.

Or… maybe I’ll just take a break, a hiatus, for three to six or nine months or so and give this emerging and evolving industry a chance to catch up with me.

In any case, I expect that I will have a more confident report a year from now.
