NISQ has enabled rapid progress in quantum computing, but its limitations, particularly its noisiness and its lack of robust, automatic, and transparent error correction, preclude it from being a viable path to true, dramatic, and compelling *quantum advantage*: compute-intensive applications, developed by non-elite developers, which would simply not be feasible using classical computing. Absent perfect qubits, automatic and transparent *quantum error correction* (QEC) is needed to achieve *fault-tolerant qubits*, known as *logical qubits*, to support *fault-tolerant quantum computing* (FTQC). …

Living in Washington, DC, it was easy for me to swing over to the U.S. Capitol to attend President Trump’s first impeachment trial in the Senate back in 2020. Now that Trump is on trial for impeachment *again*, I figured that my experiences might be of interest. This informal paper is a compilation of my contemporaneous Facebook posts from late January and early February of 2020.

- For the most part, my writing here is strictly apolitical, merely describing my experiences and direct observations, with no partisan political interpretation. The only possible exceptions are the postscript sections with my views on…

Here are my top wishes for developments in quantum computing for Christmas in 2020. Okay, that’s too tall an order with too little time left, so this informal paper lists the quantum computing developments I really want to see in the coming year, 2021. It’s a fun list, but it’s also a real list — these are advances that have a realistic chance of occurring over the coming year.

For reference, here’s my Christmas wish list from last year:

I still want everything from my 2019 wish list…

Phase is central to quantum computing, but there is a risk that algorithm designers may lean too heavily on fine granularity or gradations of phase, seeking a precision that just isn’t there, either in the theory or in the actual physics and engineering of real qubits. This informal paper will explore the phase property of qubits, with an eye on the limits of precision or granularity of phase.
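To make the precision concern concrete, here is a minimal sketch (my own illustration, not from any particular paper): in a simple interference experiment (Hadamard, phase rotation by phi, Hadamard), the probability of measuring |0⟩ is cos²(phi/2). Distinguishing two adjacent phase gradations then means statistically resolving a tiny probability gap, which requires on the order of 1/Δp² shots. The choice of 1,024 gradations is an arbitrary assumption for illustration.

```python
import math

def p_zero(phi: float) -> float:
    """Probability of measuring |0> after H, phase(phi), H applied to |0>."""
    # Amplitude of |0> is (1 + e^{i*phi}) / 2, so |amp|^2 = cos^2(phi/2).
    return math.cos(phi / 2) ** 2

# Two adjacent gradations, assuming (hypothetically) 1,024 divisions of phase.
phi1, phi2 = 0.0, 2 * math.pi / 1024
delta_p = abs(p_zero(phi1) - p_zero(phi2))

# Rule of thumb: resolving a probability difference delta_p statistically
# takes on the order of 1 / delta_p**2 repetitions (shots).
shots_needed = round(1 / delta_p ** 2)
print(f"probability gap {delta_p:.2e} -> roughly {shots_needed:,} shots")
```

The point of the sketch is only that very fine phase gradations translate into astronomically large shot counts, quite apart from whether the hardware can even apply such fine rotations.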

Unfortunately, this paper may provide much more in the way of questions than in actionable answers, but coming up with deep and clear questions is the first step to getting answers that are both meaningful…

The IBM paper which introduced the notion of *quantum volume* as a metric for the power of a quantum computer has the odd caveat that it applies only to quantum computers of “*modest size*”, up to approximately 50 qubits. Why this odd limitation? Simple: IBM’s method requires classical simulation of randomly generated quantum circuits, whose cost grows exponentially in the number of qubits, so 2⁵⁰, which is roughly one quadrillion (1,000,000,000,000,000, a million billion), is considered the limit on the number of quantum states which can be represented and *simulated on a current classical computer*. …
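A back-of-the-envelope sketch of why roughly 50 qubits is the practical ceiling for brute-force simulation (my own illustration; the 16 bytes per amplitude, two 64-bit floats, is an assumption):

```python
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory in bytes to hold all 2**n complex amplitudes of an n-qubit state."""
    return (2 ** n_qubits) * bytes_per_amplitude

# 2**50 amplitudes is roughly one quadrillion quantum states.
print(f"2**50 = {2 ** 50:,}")

for n in (30, 40, 50):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n} qubits -> {gib:,.0f} GiB of state vector")
```

At 30 qubits the state vector fits in a workstation’s memory (16 GiB); at 40 it strains a large cluster (16 TiB); at 50 it requires about 16 PiB, which is why the metric stops being computable classically around that point.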

No, quantum computers are not appropriate for *big data* problems. Rather, they are best for problems with a fairly small amount of data but a very large solution space — the result of a so-called *combinatorial explosion*. So, rather than call it *Big Data*, I call it *Little Data* with a *Big Solution Space*. This informal paper introduces the notion of *Little Data with a Big Solution Space*.
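Two standard examples of the combinatorial explosion make the contrast concrete (illustrative numbers only; the specific problem sizes are my own choices):

```python
import math

def assignments(n: int) -> int:
    """Candidate solutions for n binary decision variables: 2**n."""
    return 2 ** n

def tours(n_cities: int) -> int:
    """Distinct orderings (tours) of n cities: n factorial."""
    return math.factorial(n_cities)

# "Little data": the input is just 40 variables or 20 city coordinates.
# "Big solution space": the candidate solutions number in the trillions
# and quintillions, respectively.
print(f"40 binary variables -> {assignments(40):,} candidate solutions")
print(f"20 cities           -> {tours(20):,} possible tours")
```

The input in each case fits on a single page, yet exhaustively enumerating the solutions is hopeless classically — which is exactly the regime where quantum parallelism is supposed to pay off.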

Topics in this informal paper:

- In a nutshell
- Quantum computer as a coprocessor
- Centrality of quantum parallelism
- Quantum advantage is the whole point
- Quantum supremacy
- Combinatorial explosion
- The essential goal: exponential speedup
- Big solution…

How did I get started in quantum computing? How did I first become aware of quantum computing? How did I get to where I am now in quantum computing? What was the arc of my trajectory? This informal paper chronicles the major milestones — and obstacles — along the way of my journey into the quantum world of quantum computing, quantum mechanics, and quantum information science in general. Maybe something in my own trajectory might benefit others as they consider their own entry and path in this new field of study and sector of technology and commerce.

There is a…

Although quantum computing has proved to be feasible to some degree, it still has not been able to advance beyond being a *mere laboratory curiosity*. The primary impediment is its inability to handle production-scale real-world problems and deliver substantial real-world value. This informal paper will explore the question of what it will take for quantum computing to transition to commercial success, enabling practical applications which solve production-scale real-world problems, deliver substantial real-world value, and achieve a dramatic *quantum advantage* over classical computing.

Topics to be discussed in this paper:

- What is…

A scientific discovery or an engineering prototype in a laboratory may or may not have a significant application in the real world. When is a technology merely a *laboratory curiosity* and when does it warrant the attention of the real world? This informal paper will explore what criteria can be used to distinguish the two.

My personal underlying motivation is to set the stage for discussing this topic in the context of particular advanced technologies such as artificial intelligence and quantum computing, but they will be pursued in separate papers. …

Unlike classical computers, which are known for their *predictable determinism*, quantum computers are inherently *probabilistic*, a quality they inherit from the quantum mechanical physics underlying the implementation of their qubits, and they are made even less predictable by the variety of *noise, errors, and environmental interference* inherent in the technology of NISQ devices. The solution is to *run the same quantum circuit many times* and see what results occur most commonly. The average result of a number of repetitions of a quantum circuit (sometimes called *shots*) is roughly what physicists call the *expectation value* and is the closest that a quantum…
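A minimal sketch of the shots-and-averaging idea, assuming an idealized single-qubit circuit that yields |1⟩ with a fixed probability p (the circuit, probability, and seed are all hypothetical choices for illustration):

```python
import random

def run_shots(p_one: float, shots: int, seed: int = 42) -> float:
    """Average measured value over `shots` runs of an idealized circuit
    that measures 1 with probability p_one and 0 otherwise."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    ones = sum(1 for _ in range(shots) if rng.random() < p_one)
    return ones / shots

# A single shot returns only 0 or 1; averaging many shots approximates
# the underlying expectation value, here 0.25.
estimate = run_shots(p_one=0.25, shots=10_000)
print(f"estimated expectation value: {estimate:.3f} (true value 0.25)")
```

No single run tells you much; it is only the distribution over many shots that carries the answer, which is why shot counts are a first-class parameter on real quantum hardware.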

Freelance Consultant