Phase is central to quantum computing, but there is a risk that algorithm designers may lean too heavily on fine granularity or gradations of phase, seeking a precision that simply isn’t there in the theory, the physics, or the engineering of real qubits. This informal paper will explore the phase property of qubits, with an eye on the limits of precision or granularity of phase.

Unfortunately, this paper may raise far more questions than it offers actionable answers, but coming up with deep and clear questions is the first step toward getting answers that are both meaningful and actionable. …

The IBM paper that introduced the notion of *quantum volume* as a metric for the power of a quantum computer has the odd caveat that it applies only to quantum computers of “*modest size*”, up to approximately 50 qubits. Why this odd limitation? Simple: IBM’s method requires classical simulation of randomly generated quantum circuits, which is exponential in the number of qubits. At 50 qubits that means 2⁵⁰ quantum states, roughly one quadrillion (1,000,000,000,000,000, a million billion), which is considered the limit of the number of quantum states that can be represented and *simulated on a current classical computer*. …
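To see why 50 or so qubits is where classical simulation runs out of room, consider a full state-vector simulation: it must store one complex amplitude per quantum state, and the state count doubles with each added qubit. A minimal back-of-the-envelope sketch (assuming 16 bytes per double-precision complex amplitude):

```python
# Memory needed for a full state-vector simulation of n qubits,
# assuming one complex128 amplitude (16 bytes) per quantum state.
for n in (20, 30, 40, 50):
    states = 2 ** n
    gib = states * 16 / 2**30  # gibibytes of amplitude storage
    print(f"{n} qubits: {states:,} states, {gib:,.3f} GiB")
```

At 30 qubits the state vector already needs 16 GiB; at 50 qubits it needs 16 million GiB (16 PiB), far beyond any current classical machine, which is exactly the wall the quantum volume paper runs into.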

No, quantum computers are not appropriate for *big data* problems. Rather, they are best for problems with a fairly small amount of data but a very large solution space, a so-called *combinatorial explosion*. So, rather than call it *Big Data*, I call it *Little Data* with a *Big Solution Space*. This informal paper introduces the notion of *Little Data with a Big Solution Space*.
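A quick way to see the *Little Data, Big Solution Space* pattern is a classic combinatorial problem such as the traveling salesman. The input is tiny (a list of city coordinates), yet the number of candidate solutions explodes factorially. A small illustrative calculation, using a hypothetical 20-city instance:

```python
import math

# Little data: just 20 city coordinates as input.
# Big solution space: (n - 1)! / 2 distinct round-trip tours to consider
# (fix the starting city, and count each tour and its reverse once).
n = 20
tours = math.factorial(n - 1) // 2
print(f"{n} cities -> {tours:,} possible tours")
```

Twenty cities, an input that fits in a few hundred bytes, yield over 60 quadrillion distinct tours: exactly the kind of small-input, huge-search-space problem at issue here.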

Topics in this informal paper:

- In a nutshell
- Quantum computer as a coprocessor
- Centrality of quantum parallelism
- Quantum advantage is the whole point
- Quantum supremacy
- Combinatorial explosion
- The essential goal: exponential speedup
- Big solution…

How did I get started in quantum computing? How did I first become aware of quantum computing? How did I get to where I am now in quantum computing? What was the arc of my trajectory? This informal paper chronicles the major milestones, and obstacles, along my journey into the quantum world of quantum computing, quantum mechanics, and quantum information science in general. Maybe something in my own trajectory might benefit others as they consider their own entry and path in this new field of study and sector of technology and commerce.

There is a lot of material here. After reading the first few sections for background, you might want to skip ahead to the central…

Although quantum computing has proved to be feasible to some degree, it still has not been able to advance beyond being a *mere laboratory curiosity*. The primary impediment is its inability to handle production-scale real-world problems and deliver substantial real-world value. This informal paper will explore what it will take for quantum computing to transition to commercial success: enabling practical applications that solve production-scale real-world problems, deliver substantial real-world value, and achieve a dramatic *quantum advantage* over classical computing.

Topics to be discussed in this paper:

- What is a laboratory curiosity? …

A scientific discovery or an engineering prototype in a laboratory may or may not have a significant application in the real world. When is a technology merely a *laboratory curiosity* and when does it warrant the attention of the real world? This informal paper will explore what criteria can be used to distinguish the two.

My personal underlying motivation is to set the stage for discussing this topic in the context of particular advanced technologies such as artificial intelligence and quantum computing, but those discussions will be pursued in separate papers. …

Unlike classical computers, which are known for their *predictable determinism*, quantum computers are inherently *probabilistic*, a quality they inherit from the quantum mechanical physics underlying the implementation of their qubits, and one made even less predictable by the variety of *noise, errors, and environmental interference* inherent in the technology of NISQ devices. The solution is to *run the same quantum circuit many times* and see which results occur most commonly. The average result over a number of repetitions of a quantum circuit (each repetition sometimes called a *shot*) is roughly what physicists call the *expectation value*, and is the closest a quantum computer can come to the deterministic results characteristic of a classical computer. …

I’m a technologist rather than an application developer, so I have no personal application for quantum computing per se, but as a technologist I’m interested in the nature of the technology itself — the **capabilities** of quantum computers, their **limitations**, and whatever **issues** might interfere with the exploitation of the technology by real-world application developers. This informal paper will outline my personal focus on the capabilities, limitations, and issues with quantum computing.

Although I couch this informal paper in terms of my own personal interests, I am also saying that this is my personal view of the opportunities and challenges that almost everyone will encounter with this new technology. As such, the reader can and should presume that the views expressed herein represent a rough summary of the topic areas relevant to the state of the art of quantum computing: not so much a snapshot of the moment as the likely trajectory of the sector in the coming years, and not so much specific numbers and technical features as the more general capabilities, limitations, and issues to be addressed. …

The *quantum effects* of physics are at the atomic and subatomic level, brought to us courtesy of *quantum mechanics*, and hold the key to major advances — *quantum* leaps — in computing, communication, measurement, and sensing, known collectively as *quantum information science*. …

Before deciding to utilize a quantum computer for a particular application, one should first assess whether a chosen quantum algorithm will indeed offer a dramatic performance advantage — a so-called *quantum advantage* — over a comparable algorithm running on a classical computer. Alas, a typical published quantum algorithm tends *not* to offer a clear statement of exactly what the *quantum advantage* of the algorithm might be. In short, the concept of quantum advantage is rather problematic, at least as currently practiced. …
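One of the few algorithms whose advantage *is* clearly stated is Grover's unstructured search, which needs on the order of √N oracle queries versus roughly N/2 for an expected-case classical scan. A small sketch of how one might quantify such a claimed advantage before committing to a quantum approach (the problem sizes are arbitrary examples):

```python
import math

# Comparing query counts: classical linear search (~N/2 expected)
# versus Grover's algorithm (on the order of sqrt(N); the tighter
# constant is about (pi/4) * sqrt(N), omitted here for clarity).
for N in (10**6, 10**9, 10**12):
    classical = N // 2
    grover = math.isqrt(N)
    print(f"N={N:.0e}: ~{classical:,} classical vs ~{grover:,} Grover queries")
```

This kind of concrete, side-by-side accounting of resource counts is precisely the statement most published quantum algorithms fail to provide.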
