Is Lack of Fine Granularity of Phase and Probability Amplitude the Fatal Achilles Heel Which Dooms Quantum Computing to Severely Limited Utility?

In a nutshell

The focus of this paper is solely on gate-based quantum computers

The focus of this paper is solely on two-level quantum systems (qubits)

Granularity and gradations of continuous values

The essence of the problem

The essence of the problem, more simply

Dramatic quantum advantage — delivering on the full promise of quantum computing

Fractional quantum advantage — an advantage, but not delivering on the full promise of quantum computing

Exponential speedup — the great promise of quantum computing, but can it really be delivered?

Background

What is a quantum Fourier transform (QFT)?
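For reference, the QFT on n qubits is the 2^n x 2^n unitary whose (j, k) entry is omega^(j*k)/sqrt(N), with N = 2^n and omega = e^(2*pi*i/N) — the same matrix as the classical discrete Fourier transform, up to normalization and sign convention. A minimal numpy sketch (the helper name `qft_matrix` is mine, not from any library):

```python
import numpy as np

def qft_matrix(n_qubits):
    """Unitary matrix of the quantum Fourier transform on n_qubits qubits.

    Entry (j, k) is omega**(j*k) / sqrt(N), where N = 2**n_qubits and
    omega = exp(2*pi*i / N).
    """
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    jk = np.outer(np.arange(N), np.arange(N))  # exponent j*k for entry (j, k)
    return omega ** jk / np.sqrt(N)

F = qft_matrix(3)
# The QFT is unitary: F times its conjugate transpose is the identity.
print(np.allclose(F @ F.conj().T, np.eye(8)))  # True
```

Note that the matrix entries involve phase angles as fine as 2*pi/N, which is where the granularity concern discussed in this paper enters.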

The problem

My conjecture

Implications of my conjecture

I can’t prove my conjecture, but technical specifications don’t disprove it either

I’m at least half-convinced that my conjecture is true

To be clear, my conjecture is still only tentative and may yet be proven wrong

Utility: quantum parallelism and quantum advantage

How severely limited will utility be?

People don’t currently see a problem because limited qubit fidelity, limited qubit connectivity, and limited circuit depth prevent them from actually running into it

Beware of Quantum Algorithms Dependent on Fine Granularity of Phase

Quantum Fourier transform is critically dependent on fine granularity of phase

Quantum phase estimation (QPE) and quantum amplitude estimation (QAE) are also dependent on quantum Fourier transform and fine granularity of phase

Any other techniques which also rely on fine granularity of phase or probability amplitude will also encounter the same limitations on quantum advantage

What is a DAC (Digital to Analog Converter)?
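As a rough illustration of what DAC resolution means for granularity, here is an idealized quantization model (the helper `dac_quantize` and the 14-bit figure are illustrative assumptions on my part, not specifications from any vendor): a k-bit DAC can output only 2^k discrete levels, so any continuous control value gets snapped to the nearest of those levels.

```python
def dac_quantize(x, bits):
    """Quantize x in [0.0, 1.0] to the nearest of 2**bits discrete levels,
    as an idealized model of a k-bit DAC's output granularity."""
    levels = 2 ** bits
    code = round(x * (levels - 1))  # nearest representable digital code
    return code / (levels - 1)      # back to the normalized analog range

# A hypothetical 14-bit DAC has only 2**14 gradations across its full range.
print(2 ** 14)  # 16384
```

Whatever the real resolution of the control electronics is, the number of distinguishable phase gradations cannot exceed this level count.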

Where is the DAC?

The DAC effectively limits the granularity of phase and probability amplitude

Analog signals likely also severely limit the granularity of phase and probability amplitude

Additional analog circuitry after the DAC may also limit the granularity of phase and probability amplitude

The analog signal must be transported to the qubit, typically as an electromagnetic signal, which probably limits the granularity of phase and probability amplitude

The qubit itself limits the granularity of phase and probability amplitude

Ultimately the physics of the underlying quantum phenomenon used to implement the quantum state by the qubits limits the granularity of phase and probability amplitude

Some ultimate limit at the Planck level as well as quantum uncertainty itself

General uncertainties around all factors

It’s not clear which of the factors is the most limiting

I personally don’t know enough about classical digital and analog electronics to figure all of this out

Gate execution and qubit measurement

Fidelity of SWAP gates

Do the same limits apply to both phase and probability amplitude? Likely but not certain

How did this all happen? Too many silos with gaps between them

When will we hit the wall and start seeing the problem? Maybe two to three years

Quantum information — discrete binary and continuous values

Some alternatives

Are there any reasonable alternatives to quantum Fourier transform?

What might be more powerful than quantum Fourier transform?

No, variational methods don’t show any promise of delivering any dramatic quantum advantage

Are there any reasonable alternative approaches to implementation of quantum Fourier transform?

Might an alternative quantum computing architecture or programming model avoid my concern?

Is a radically different technology and architecture, and maybe even programming model required to get past my concern?

Might placing the classical electronics in the cryostat reduce interference to enable a few more bits of precision?

Bits of precision vs. qubits of precision

Currently this limitation is a non-issue since existing limits on qubit fidelity and connectivity prevent anyone from actually running into it

The exponential math issue
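The core arithmetic can be made concrete: an n-qubit QFT requires controlled phase rotations as small as 2*pi / 2^n radians, a quantity that shrinks exponentially with n. A minimal sketch, where the 16-bit DAC figure is purely illustrative (real control-electronics resolution is undisclosed, as discussed later):

```python
import math

def smallest_qft_rotation(n_qubits):
    """Smallest controlled phase rotation an n-qubit QFT requires:
    2*pi / 2**n radians."""
    return 2 * math.pi / 2 ** n_qubits

# An ideal k-bit DAC distinguishes 2**k gradations over a full 2*pi turn.
# A 50-qubit QFT needs rotations of 2*pi / 2**50 -- vastly finer than what
# a hypothetical 16-bit DAC (2**16 gradations) could resolve.
print(smallest_qft_rotation(50) < 2 * math.pi / 2 ** 16)  # True
```

Under this idealized model, each additional qubit of QFT precision demands one additional bit of phase resolution from the control electronics.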

Inverse quantum Fourier transform

Euler’s formula

Two pi conversion factor from normalized phase to phase angle in radians
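To make the conversion concrete, a normalized phase t in [0, 1) maps to a phase angle theta = 2*pi*t radians, and by Euler’s formula the corresponding complex phase factor is e^(i*theta) = cos(theta) + i*sin(theta). A minimal sketch (the helper name `phase_factor` is mine):

```python
import cmath
import math

def phase_factor(t):
    """Map a normalized phase t in [0, 1) to the complex phase factor
    e^(i * 2*pi*t), per Euler's formula."""
    theta = 2 * math.pi * t  # phase angle in radians
    return cmath.exp(1j * theta)

z = phase_factor(0.25)            # a quarter turn: e^(i*pi/2) = i
print(abs(z - 1j) < 1e-12)        # True
```

Any quantization of the normalized phase t (e.g., by a DAC) quantizes the phase angle by the same factor of 2*pi.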

Phase angle of two superposed basis states

Mapping phase angle to the input range for the DAC

Limited precision of entries in a unitary transform matrix

Total qubits vs. quantum Fourier transform precision

Pulses to control qubits

Pulse control

No hints about pulse precision in IBM Qiskit Pulse documentation and tutorials

Notes on IBM pulse control

How many qubits is reasonable to expect for a quantum Fourier transform?

10–12 qubits seems like a slam dunk, maybe 14–16 qubits

16 to 18 qubits is likely a slam dunk for two to three years from now

20 or 24 qubits as a reasonable expectation for what we might have in two to three years

28 to 32 qubits seems out of reach, but who really knows

36 to 40 qubits is clearly beyond the realm of what we can expect, but it’s worth contemplating as a theoretical matter

Nobody should be expecting quantum Fourier transform for more than 40 qubits

44 to 48 qubits are well beyond the realm of expectation for what a quantum Fourier transform could handle

Demand transparency — transparency is mandatory

Insist or demand that all limitations be clearly documented

Transparency is needed everywhere

No transparency on internal electronics for digital and analog circuitry

Transparency needed for number of gradations for phase and probability amplitude

The number of gradations can’t be greater than the DAC resolution, by definition

Related circuitry and noise may reduce the absolute resolution of the DAC

An open source design for a quantum computer would be helpful

Quantum advantage must be fully disclosed and adequately discussed

What’s your quantum advantage?

What other approaches are there if quantum Fourier transform isn’t available?

Are there ANY known opportunities to address the root issues? Nope.

Isn’t there ANY pathway to fine granularity of phase and probability amplitude? Nope.

Isn’t there ANY pathway to quantum Fourier transform for over 50 qubits? Nope.

What about Shor’s factoring algorithm? Sorry, but it will only work for small numbers, not very large numbers

What about derivative algorithms of Shor’s factoring algorithm? Not if they still rely on quantum Fourier transform

Large-scale physics simulations may still be feasible even without support for quantum Fourier transform

Computer science experiments can also achieve significant quantum advantage

Google’s quantum supremacy experiment or cross-entropy benchmarking (XEB) can achieve dramatic quantum advantage

What about quantum computational chemistry using quantum phase estimation (QPE)? Maybe, sometimes, it depends

Still open questions as to the quality of the final quantum Fourier transform result even if a certain number of bits is supported

Scalability of quantum algorithms is vital, but problematic in the absence of clarity and predictability for fine granularity of phase and probability amplitude

How does quantum error correction (QEC) deal with fine granularity of phase and probability amplitude? Unknown!

Quantum amplitude amplification has a similar issue to phase granularity

Simulators need to be configured to reflect granularity of phase and probability amplitude

No hands-on testing or empirical data

Need for benchmarking

Include granularity on the label of quantum capabilities for quantum computers, algorithms, and applications

Clarion call: All silos on deck!

Limited granularity is a long-term issue that won’t be fixed in a few years and then just go away

48 fully-connected near-perfect qubits may be the sweet spot goal for near-term quantum computing

But start with an upgraded 27-qubit quantum computer

Maybe even a 36-qubit stepping stone

We’re stuck in quantum toyland with no prospect of escape

But maybe what seems toy-like to some may actually be great for others

Once again, my conjecture is still only tentative and may yet be proven wrong, but I’m not holding my breath

Never say never

Now what? What’s next?

Focus on more research to potentially discover alternative approaches

A golden opportunity for some fresh academic research

My original proposal for this topic

Summary and conclusions
