My Top Open Questions in Quantum Computing

Jack Krupansky
3 min read · Sep 25, 2023


These are my additions to the list of Open Questions in Quantum Computing posted on LinkedIn by IBM’s Zlatko Minev.

His post on LinkedIn:

His document with his actual list:

My list of additions:

  1. What are the fundamental limitations of quantum computing?
  2. How close to perfect can we make uncorrected qubits? Even with the most ideal qubits, does quantum uncertainty impose some minimum uncertainty or error rate?
  3. Is there an upper limit to the gain from quantum error correction? Even with the most ideal quantum error correction, does quantum uncertainty impose some minimum uncertainty or error rate?
  4. If there really are no hidden variables, is quantum computing the ultimate limit for any computation, such that no form of computation can be more powerful (beyond exponential speedup)? Would being more powerful than an exponential speedup really imply a need for hidden variables?
  5. Is there some minimum level of probability amplitude or phase (angles, theta and phi), at the Planck level, and some minimum difference between two adjacent probability amplitudes or phases (angles, delta theta and delta phi)? And would this imply an upper limit to how many qubits could be superimposed and fully entangled into 2^n distinct states, each with a probability amplitude of 1/sqrt(2^n)? Granted, our modeling is “just math” which would allow for an infinite count of infinitely small probability amplitudes (real/complex numbers), but does reality and practicality impose a hard limit on our elegant math and physics models? Consider Shor’s factoring algorithm as a test case.
  6. Will photonic quantum computing ever be able to compete with superconducting transmon and trapped-ion quantum computing? Does it have any future for general-purpose quantum computing, or will it never be more than a niche, special-purpose quantum computing device?
  7. Can qudits (qutrits or d > 3) add any significant power to quantum computation? Or are they too difficult to conceptualize and use to make a real difference?
  8. What would it take to demonstrate convincingly to most people that quantum computers are real enough to address practical, production-scale, real-world problems, to achieve a significant if not dramatic quantum advantage, and to deliver unbelievable business value? This is what I call the ENIAC moment for quantum computing.
  9. What precision is needed for the entries in a unitary transformation matrix? In particular, how many digits or bits of precision are needed when irrational numbers are involved, such as pi and the square roots of small integers (2, 3, 5, etc.), for rotation angles and probability amplitudes? How much precision is needed in the software and firmware which convert unitary transformation matrix entries to digital and analog signals and ultimately apply them to the physical hardware of qubits, such as spin? Is even quad-precision floating point sufficient for complex and advanced quantum algorithms such as Shor’s factoring algorithm? Does quantum mechanics determine a limit, or do the physical phenomena of qubits and their controlling hardware determine the limits and constrain the effective precision? Given the role of irrational numbers (pi), will a rotation by any angle followed by a rotation by the negative of that angle really return a qubit reliably to its origin state, or will there always be a small but nonzero error due to truncation of the irrational value of pi? Should quantum algorithm designers be paying attention to this issue? How long can it be avoided? [For as long as our quantum algorithms are merely an Adventure in Quantum Toyland!] Does every vendor have to answer the question of which precision they support and require?
  10. Will we ever achieve a truly universal quantum computer which includes all capabilities of classical computers and supports those capabilities under quantum parallelism? Are there stages or subsets of classical computing capabilities which can be incorporated under quantum parallelism short of supporting all classical computing capabilities? What are the limiting factors which would have to be overcome or which would present insurmountable obstacles?
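As a quick illustration of the amplitude-granularity concern in question 5: whether or not reality imposes a floor, our everyday classical model of amplitudes certainly does. This is a minimal Python sketch of my own (no quantum SDK involved) showing that the uniform amplitude 1/sqrt(2^n) = 2^(-n/2), evaluated in IEEE-754 double precision, underflows to exactly zero somewhere past n ≈ 2100 qubits.

```python
def uniform_amplitude(n_qubits: int) -> float:
    """Amplitude of each basis state in an equal superposition of n qubits:
    1/sqrt(2^n) = 2^(-n/2), evaluated in IEEE-754 double precision."""
    return 2.0 ** (-n_qubits / 2)

# A 60-qubit register: tiny amplitudes, but comfortably representable.
print(uniform_amplitude(60))           # 2^-30, about 9.3e-10

# Near n = 2100 the amplitude (2^-1050) is a subnormal double -- still
# nonzero, but with very few significant bits left.
print(uniform_amplitude(2100) > 0.0)

# By n = 2200 the amplitude (2^-1100) is below the smallest subnormal
# double (2^-1074) and underflows to exactly zero.
print(uniform_amplitude(2200) == 0.0)
```

Of course this is only a statement about classical floating-point simulation, not about nature; but it makes concrete what a "hard limit on our elegant math" would look like if reality behaved analogously.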
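On the rotation round-trip part of question 9, the effect is easy to observe on a classical simulator. This plain-Python sketch (my own illustration; the gate is the standard Ry rotation matrix) rotates a single-qubit state forward by an angle many times and then backward by the same angle the same number of times. In exact arithmetic the state would return precisely to |0>; in double precision, the truncated value of pi and the rounding of cos/sin leave a small residual error.

```python
import math

def ry(theta: float) -> list[list[float]]:
    """Standard Ry gate: [[cos(t/2), -sin(t/2)], [sin(t/2), cos(t/2)]]."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -s], [s, c]]

def apply(gate: list[list[float]], state: list[float]) -> list[float]:
    """Multiply a 2x2 real gate matrix into a 2-component state vector."""
    a, b = state
    return [gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b]

theta = 2 * math.pi / 3          # an angle built from the truncated double of pi

state = [1.0, 0.0]               # the |0> state
for _ in range(100_000):
    state = apply(ry(theta), state)
for _ in range(100_000):
    state = apply(ry(-theta), state)

# Exact arithmetic would give [1.0, 0.0] again; the residual is the
# rounding floor accumulated over 200,000 gate applications.
error = math.hypot(state[0] - 1.0, state[1])
print(f"round-trip error after 200k rotations: {error:.2e}")
```

This only measures classical simulation round-off, not hardware control precision, but it shows why the "rotate and un-rotate" question does not have a trivially clean answer even before the physics enters.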

This list will be supplemented as new questions come to mind.