# Future Topics for My Writing on Quantum Computing

This informal paper lists many of the topics I may write about related to quantum computing in the coming months, years, and even decades. And on quantum information science in general, or indeed anything quantum, including quantum mechanics.

**Minor caveat:** My primary intention for this informal paper is my own personal benefit: to simply capture, organize, maintain, and review my list of potential writing topics for quantum computing. Hopefully others will find my list of topics enlightening as well, but my only real goal is to facilitate my own writing. That said, I did put in a reasonable effort to make the list readable by others as well.

**Update #2 — May 16, 2021:**

- Added a leading section, **My current active topics**, which is a short list of the topics which I am either currently working on or hope to begin working on soon. It’s always a horse race as to which of these topics makes it to completion first. Sometimes I get bogged down in a topic and then switch to another topic where I feel I can make more rapid progress, or which holds more interest for me.

**Update #1 — May 6, 2021:**

The original version of this list, posted in October 2019, had 308 topics — they are mostly intact as before, but many additional topics have been added, bringing the total to over 1,000 topics. Some of the existing topic titles or notes may have been edited as well, but generally the existing topics are the same.

- I spent over two months poring through hundreds of pages of my quantum-related notes, culling out topics that I had thought about over the past four years.
- In addition, there is now a **Top 25 topics** list as well as a **Runner-up topics** list (about 75 topics) which precede the full list. Those two lists highlight topics which feel more likely to get earlier attention in my thinking and writing. The full list (“**Full list of topics**”) includes all of the top topics.
- There is also a third extra section, **Additional runner-up topics**, with close to 100 topics. I know it’s extra clutter since they’re already on the main list, but I just couldn’t bring myself to refrain from highlighting them as higher priority than merely being on the main list. I’m hoping to winnow this list down over time. Although every time I think I have my lists under control they just seem to grow even more!
- In truth, all 200 or so topics on the three top-topic lists are equally fair game when I’m choosing a next topic to write about.
- The Top Topic lists will be fluid as priorities and interests change, while the full list will remain mostly static (other than occasional content updates), with new topics added at the end chronologically.
- Looking for the new arrivals on the list, my latest thinking? Jump to the end of the main list (“**Full list of topics**”) to find the most recent additions. The last 10, 20, 50, or 100 topics on the list will give you a sense of my most recent thinking. I also sometimes update the notes for older topics on the list, but not very frequently.

# Preface

I have no idea in what order I’ll tackle my remaining writing ideas. In fact, I commonly come up with other topics and write about them first.

Readers should feel free to comment on what they would find most helpful in the near term. And suggest additional topics as well. And also cite existing online writing which might seem to at least partially satisfy any of the listed topics.

There is no particular order to this list; topics appear simply as they came to mind and as I collected them from my old notes of the past four years.

Currently I am doing a fair amount more reading and research than writing, particularly on the quantum mechanics and physics side of the house, but I hope to get back to writing more purely about quantum computing within a few months. Or not — I may take a longer break, three months, six months, a year, or maybe even 18 months to give the field time to catch up with my writing.

Many of my proposed writing topics in fact require significant reading and research before I’ll be prepared to write in any detail about the topic. In fact, in some cases it may be more appropriate for someone else to write on some of these topics. Either way, the presence of a topic on my list indicates a strong interest on my part that the topic be written about, whether by me or someone else.

I have already written fairly extensively on many of these topics, but my real goal is to take it to the next level, the next level of depth. There’s too much superficial hype, useless jargon, and extraneous detail floating around; my goal is to dig down to real, actual ground truth. And to limit myself to plain language to the maximal extent possible — minimal Greek, symbols, and math.

In a lot of cases I don’t yet know enough to write with any true sense of authority. I view research, reading, and writing as one integrated super-activity. If I find that I have gaps in knowledge when writing, that motivates me to do more reading and research to endeavor to fill those gaps. Writing is a good test of how complete and deep your knowledge really is.

Ultimately, I think I’d like to see 50 to 100 “*What is…?*” papers to cover the breadth of key concepts of quantum computing. That might be suited to be a full book, but I’m more interested in having easily digested chunks of free online material than a physical book.

There is another, shorter list at the end, of topics that I have no intention of writing about.

This is a living list and will grow on an almost-daily basis.

# My interests — capabilities, limitations, and issues

My writing about quantum computing revolves around my interests in quantum computing — I’m a technologist, not a researcher or application developer, so I am most interested in the capabilities of quantum computing, its limitations, and any and all issues related to quantum computing. And quantum information science as well, and quantum mechanics as well, to some degree. I’m less interested in hands-on issues, specific applications, deep theory, or hard-core math.

I actually wrote an informal paper to discuss my interests:

That should give a sense of why the topics listed in this paper were included.

# Simple text only

Since I am a text-only guy, I won’t be publishing pictures or photos, diagrams, charts, tables, artwork, or complex mathematical formulas or anything involving lots of Greek letters or mathematical symbols. Just plain-language text and bullet points.

That also means I won’t be writing detailed papers on topics which require complex math, lots of Greek letters, symbols, etc.

# Blog posts?

Some of the topics may in fact be suitable for relatively simple blog posts, but generally I will endeavor to cover topics to a depth and level of detail far beyond what makes sense for a typical blog post. I rarely write fewer than ten to thirty pages on a given topic, and sometimes fifty to a hundred or more pages — hardly appropriate for a blog post. I’ll leave normal blogging (a few paragraphs, a page or two) to others.

# FAQ?

Many of the topics are couched as questions and at least superficially seem appropriate for an FAQ, but generally my intention is to cover topics to a level of depth and detail far beyond a typical FAQ entry. A brief summary for some topics may indeed be appropriate for an FAQ — and some day I may in fact do a true FAQ, but that’s not my intention at this stage, and not my intention for questions listed here as topics for writing.

# Long form writing on Medium

All of my writing is posted on *Medium*, a perfect medium for long-form writing (as opposed to blogging.)

# Topics for others as well

Although all of the topics I list here are my own interests, others are certainly free to research and write about them as well — with or without my permission.

# Quantum information science in general as well

My writing interests are not limited to quantum computing proper, but range over all of *quantum information science*, including quantum communication, quantum networking, quantum metrology, and quantum sensing. And quantum mechanics and quantum effects as well. Anything related to quantum.

For a general overview of quantum information science, check out:

# My existing writing

First, here’s a full list of my existing writing related to quantum computing:

# My list of questions

I’ve also compiled a separate list of questions that I have about quantum computing. Some of those questions also show up as topics here in this paper. Generally a question will become a topic here if an answer to the question is likely to be much more extensive than a relatively brief paragraph. And even if any of the questions are not listed here, they are still also fair game for future writing topics for me — or someone else. That separate list of questions can be found here:

# Ripe to write?

Generally, topics will sit on my list indefinitely until they become *ripe to write*, meaning that a critical mass of my own interest, my own knowledge, and relevance to the community has been achieved. Generally, all three of those factors have to coalesce and converge before I’ll begin to tackle a topic.

Sure, I may make preliminary notes or start any number of topics, but those preliminary efforts don’t always bear fruit.

# Everything I write is informal

Some of my interests are worthy of formal treatment in a formal paper, but formality is not my thing. I’ll leave formal academic papers to academics and researchers. I try to focus on the content that I feel is important, and not worry about the formal structure of an academic paper, formal citations, peer review, etc.

My goal is simply to identify key ideas and concepts and express them as simply, concisely, clearly, and completely as I can muster.

I’m especially interested in nuances and finer details, which I feel are woefully under-covered in the quantum computing sector, even in lofty, peer-reviewed academic papers. That’s what motivates a lot of my questions.

# New arrivals — most recent additions to the list

Looking for the new arrivals on the list, my latest thinking? Jump to the end of the main list (full list — “**Full list of topics**”) to find the most recent additions to the list. The list is maintained in strict chronological order.

The last 10, 20, 50, or 100 topics on the main list will give you a sense of my most recent thinking.

# My last hurrah?

I’ve already done a ton of writing about quantum computing. I’m well ahead of the state of the art, so it might be beneficial for me to hit *pause* and hold off on too much additional writing until research and product development catches up to at least some degree. But I will continue writing until I feel that I no longer have anything I urgently need to say.

My current expectation is that I may write another five to twenty informal papers before pausing. That’s not a precise or specific goal, just a rough expectation. And very subject to change.

Even if I do pause overall, I may still find occasion to write something, just not as frequently.

# My current active topics

This is a short list of the topics which I am either currently working on or hope to begin working on soon. It’s always a horse race as to which of these topics makes it to completion first. Sometimes I get bogged down in a topic and then switch to another topic where I feel I can make more rapid progress, or which holds more interest for me. This list will be updated fairly frequently, particularly as I complete one topic and begin working on the next.

Topics may be on my Top 25 list but not this list if my interest is high but the topic is not quite ripe for writing. For example, they may require significant reading, research, or thinking first.

Only the topic titles are given here. Additional notes can be found in the top, runner-up, and full lists.

- **Five modes of a classical quantum simulator.**
- **Where are all of the 40-qubit algorithms?**
- **Staged model for scaling of algorithms.**
- **Fractional quantum advantage or partial quantum advantage or degrees of quantum advantage.**
- **No point to quantum computing until and unless dramatic quantum advantage is achieved.**
- **Will quantum computing be effectively useless until quantum phase estimation and quantum Fourier transform are practical at production-scale?**
- **Prescription for forward movement in quantum computing: dramatically dial down commercialization efforts and dramatically dial up basic research in hardware, simulation, programming models, and algorithms.**
- **Reality of quantum computing now and in the near-term.**
- **What should people be expecting to see in quantum computers in five years?**
- **Quantum computing needs a massive reset.**
- **Can a NISQ quantum computer ever achieve dramatic quantum advantage?**
- **Beyond NISQ — Proposed terms for quantum computers based on noisy, near-perfect, and fault-tolerant qubits of various sizes.**
- **What is a near-perfect qubit?**
- **Need to focus on near-perfect qubits.**
- **Why NISQ starts at 50 qubits.**
- **No, most current quantum computers are technically not true NISQ devices.**
- **Cautions for enterprise managers and executives when contemplating quantum computing investments.**
- **Personas, use cases, and access patterns for quantum computing.**

# Top 25 topics

Before diving into the full list, here are my current Top 25 (or so) candidates for topics to write about in the near future, in no particular order, but particularly appealing topics will tend to bubble up to the top.

See the next section for 75 *Runner-up topics*, and the section after that for another 100 or so additional runner-up topics.

- **Five modes of a classical quantum simulator.** Classical quantum simulators. 1) Ideal, perfect simulation, noise-free, no limits, maximum precision. 2) Very closely match the error and limit profile for a particular real quantum computer — so simulation results closely match the real machine, and given a relatively small number of real machines, could run a very large number of simulations on more plentiful classical machines. 3) Tune noise and error profile for proposed enhancements to evaluate how effective they will be — exactly match proposed evolution for new machines (simulate before you build.) 4) Ideal real simulation — not quite perfect since there are theoretical limits from quantum mechanics. 5) Near-perfect qubits — configure for number of nines. Five distinct purposes. Make sure intentions for each simulator run are clear.
- **Where are all of the 40-qubit algorithms?** Toy algorithms (using less than 28 qubits) are great to prove that basic hardware features of a quantum computer function properly, but they don’t even come close to demonstrating scalability or potential for dramatic quantum advantage. We need to see algorithms using at least 32 to 44 qubits in a single Hadamard transform for quantum parallelism to show that we are on the cusp of dramatic quantum advantage. Show algorithms that scale cleanly and smoothly from 24 to 28 to 32 to 40 to 44 to prove that scaling to 48 and 50 and 56 and 64 and 72 is credibly (and provably) within reach. Simulating algorithms for 32 to 44 qubits on a classical quantum simulator should be practically doable at this time, so… why aren’t we seeing algorithms in that range? Counterpoint: No real need for 40-qubit algorithms until qubit fidelity, connectivity, and phase precision are sufficient for 40-qubit quantum phase estimation (QPE) or quantum Fourier transform (QFT).
- **Staged model for scaling of algorithms.** Initially prove the algorithm at 4 and 8 qubits using both classical quantum simulators and actual quantum hardware. Then incrementally scale in stages, testing on both classical quantum simulators and actual quantum hardware, the stages being 4 qubits, 8, 12, 16, 20, 24, 28, 32, 40, 44, 48, 56, 64, 72, 80, 96, 128, 192, 256 qubits. Prove scaling from 4 to 40 qubits using classical quantum simulators as well as actual quantum hardware. The expectation is that algorithms working at one stage will scale reliably to subsequent stages. If scaling works for each stage from 4 to 40 qubits, the expectation is that scaling will work for stages from 40 to 64 qubits, and beyond as higher-capacity and higher-fidelity hardware becomes available. At least this is the model, in theory. Whether real quantum hardware does indeed scale in this way remains to be seen. Actually, I’m reasonably confident that it *doesn’t* scale this way at present, but if we have the scaling and testing framework in place, we can test and prove advances in scalability as they become possible as the hardware progresses. The goal is to provide algorithm designers with a model for development and testing, and to provide hardware vendors with a model for evaluating their progress on the scaling fronts — both hardware and algorithms using that hardware.
- **Why NISQ starts at 50 qubits.** Actually, it’s *intermediate-scale* that starts at 50 qubits. I don’t know with certainty, but my informed speculation is that a few dozen qubits was viewed as being too few to do anything useful, 2⁵⁰ quantum states was viewed as the point where quantum advantage over classical computing could be achieved, and a number of machines had been announced at the time (2018) with 50 or more qubits — Intel at 49 qubits, IBM at 50 qubits, Google at 72 qubits, and Rigetti at 128 — so there wouldn’t have seemed to be any need to place any attention on machines with less than 50 qubits. Or so it seems.
- **Can a NISQ quantum computer ever achieve dramatic quantum advantage?** Even if there is some advantage, is there any prospect of it being dramatic? Or is the noisiness too much of an impediment? It depends on how strictly or loosely quantum advantage is defined, but for purposes here, strict, true, dramatic quantum advantage is the target. What criteria would have to be met to achieve true, dramatic quantum advantage using a NISQ device? Maybe based on the product of qubit fidelity and circuit depth — shallow circuits allow more noise, deep circuits allow much less noise. Presume 64 qubits in a single Hadamard transform as the target algorithm width. 50 qubits might be okay, or maybe 53–55 as the minimum.
- **Will quantum computing be effectively useless until quantum phase estimation and quantum Fourier transform are practical at production-scale?** I strongly suspect so. Variational methods can work, in a fashion, but are unlikely to deliver dramatic quantum advantage. Dramatic quantum advantage is the threshold minimum for declaring that a quantum computer is useful, in my mind. Until applications can regularly utilize quantum phase estimation (QPE) and quantum Fourier transform (QFT) on qubit registers of at least 50 if not 55 or more qubits wide, any advantage over classical computers will be minimal at best. QPE and QFT on 32 to 40 qubits would be a great stepping stone on the path to 50, 55, and 60 qubits. All of this is predicated on QPE and QFT being the most powerful algorithmic building blocks currently known. Further research might produce even more powerful building block algorithms, but QPE and QFT are our best bets today — and they really aren’t even practical today or in the near term. They’re unlikely in the next two years. They’ll have to wait until qubit fidelity is up into three to five nines (99.9%, 99.99%, or 99.999%.) Five years might be a safe bet, but even that is pure speculation.
- **Prescription for forward movement in quantum computing: dramatically dial down commercialization efforts and dramatically dial up basic research in hardware, simulation, programming models, and algorithms.** Commercialization of current technology will not lead to dramatic quantum advantage. Little if any of the current technology will be relevant in 5–10 years. Better to focus algorithm research on expected hardware 2–7 years out. Better to focus on simulating 40-qubit algorithms using higher-fidelity qubits than on smaller NISQ algorithms, to be ready to exploit future hardware as it becomes available. Better to push for higher-performance and higher-capacity classical quantum simulators to enable current simulation of 32-qubit, 40-qubit, and even 48-qubit algorithms with deeper circuits. The current programming model is grossly insufficient for commercial consumption.
- **Quantum computing needs a massive reset.** Need for a reset for the entire field. Back to square one for theory, terminology, programming model, algorithm design, and qubit technology. In fact, we may need multiple resets, not unlike the numerous generations of classical computing. I sincerely don’t believe that we’ll get to dramatic quantum advantage with the current approach to quantum computing (NISQ devices.) Radically different or improved qubit technologies are needed. Significantly more nines of qubit fidelity are needed. A higher-level programming model is needed. A richer set of algorithmic building blocks is needed. We may need a *Quantum Computing 2.0* to get to The ENIAC Moment, a *Quantum Computing 3.0* to get to The FORTRAN Moment, and a *Quantum Computing 4.0* to get to a Universal Quantum Computer integrating full quantum and classical computing capabilities.
- **The 10 most important things I can say about quantum computing.** Reference my papers for details, but provide concise summaries. Focus on the here and now as well as the future, but not solely on ultimate promise.
- **Thoughts on probability, stochastic processes, statistical aggregation, and determinism.**
- **Cautions for enterprise managers and executives when contemplating quantum computing investments.** Quantum computing isn’t anywhere near to being “ready” for enterprise deployment of production-scale applications to address real-world problems with dramatic *quantum advantage* over classical systems and deliver dramatic real business value. It’s okay to “get ready” and do some preliminary research and studies and experiments, but it’s *not* time to make significant investments and expect business value to be delivered within the next two years. Be prepared to wait two or three or four or five years — or even longer — before quantum computing actually can deliver dramatic quantum advantage for real-world problems.
- **How would you explain quantum computing to someone unfamiliar with the technology?** “*Quantum computing isn’t just a step-change in quantum or in computation — it’s a complete paradigm shift, like moving from a candle to a lightbulb. It’s operating off entirely new principles of physics, offering a novel way of interacting with and manipulating information.*” — is that a reasonable, useful, and actionable thing to say? Niche applications. Small portion(s) of applications. Little data with huge solution space and little output. Coprocessor. Hybrid quantum/classical. Need elite staff to analyze real-world problems and synthesize quantum solutions. Need prepackaged quantum solutions so that less-elite organizations can deploy quantum solutions.
- **My journey into quantum computing has given me a newfound appreciation for the incredible intellectual power of classical computing.** Quantum computing may do a few things much better (quantum parallelism, probabilistic computing, generation of random numbers), but classical computing has so many features to offer that are not available in quantum computing, yet. At best, a quantum circuit is no more than a simple code block in a classical program. So many wonderful classical features… Loops. Conditionals. Functions with parameters. Rich data types. Arrays, matrices, tables, lists, trees, graphs, and maps. Limitations of quantum computing: everything must be expressed in terms of raw physics, all operations must be reversible, no fanin allowed, no fanout allowed — the no-cloning theorem precludes copying, no persistence — no I/O, mass storage, file systems, databases, or network access.
- **Why am I still unable to write a brief introduction to quantum computing?** Lack of a concise quantum Hello World program that effectively uses quantum parallelism to achieve quantum advantage. No coherent high-level programming model. Lack of a great set of coherent quantum algorithmic building blocks. No coherent methodology for mapping real-world problems to models readily implemented on a quantum computer. No real-world problems solvable with so few qubits. Difficulty achieving quantum advantage — the only reason to use a quantum computer. Particulars of quantum parallelism, especially limitations. Product state, multi-qubit computational basis state. Utility and application of phase. Multi-qubit entanglement vs. strict bipartite — any limits — short-term, theoretical, medium-term? Top reason: I’m not so interested in what you can do with current or near-term quantum computers — which are too limited and not suited to production-scale applications. I’m far more interested in what we will eventually be able to do once the technology has evolved and matured to support production-scale applications and easily achieve dramatic quantum advantage. I fully expect the technology and programming model to evolve to something very different from what exists today, so that describing one does not help you understand the other to any significant degree. The singular benefit of current quantum computing technology is to justify funding for research to evolve to the quantum computing technology of the future — what quantum computing will be once it can support production-scale applications which can easily achieve dramatic quantum advantage. Current technology is more of a *trivial sandbox project*, not so worthy of my attention.
- **What is quantum computing?** Both hardware and software. Lots of superficial and misleading puff pieces out there. See *Why am I still unable to write a brief introduction to quantum computing?* What is a quantum computer? How does quantum computing fit in with classical computing? What is a… Quantum application, Quantum program, Quantum circuit. Measurement. Quantum Fourier transform. Quantum phase estimation. Quantum parallelism. Interference. Phase. Entanglement. Probability amplitude. Probability. Superposition. Unitary transforms. Quantum logic gates. Rotations. Bloch sphere. Basis states. Qubits. Various formats… One paragraph. One-pager. Two-pager. Four-pager. 10-pager. 20-pager. 50–75 page mini-book. 100-page mini-book. Brief glossary (or glossaries, plural) — 10 most essential terms — to drop at cocktail parties, 25 terms, 50 terms, 100 terms, 250 terms. Torn between describing quantum computing as it exists today versus a vision of what it will eventually be like once it becomes useful and supports production-scale applications and achieves dramatic quantum advantage.
- **Can quantum computing address these applications?** List application areas which are compute-intensive, but where it may not be obvious whether or how quantum computing can deliver a substantial and dramatic quantum advantage over classical computing solutions. Such as a large amount of input data or a large amount of complex logic which may not be reducible to realistic quantum circuits. Or real-time sensor processing. Or generation of large amounts of output data.
- **Take a stab at an introduction to quantum computing.** As limited and problematic as that might be. At least describe the visible tip of the iceberg even if much of the subsurface of the iceberg is not quite so clear. Start with what I wrote in *What Is Quantum Information Science?*. Also see *Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing*.
- **The allure of quantum computing (for me) is dimming at a rapid pace.** My growing disillusionment with both progress and direction. Depth of existing hype. Pace of fresh hype. Problems which are super-exponential. Slow progress towards quantum advantage. Lack of discussion of technical criteria for practical applications. Insufficient funding for research.
- **How long before a quantum computer can do anything practical at production scale?** The ENIAC moment, or some other criteria? Specific technical criteria for practical. Quantum advantage. Well beyond the largest supercomputer — 10X? 100X? Or maybe all that matters is that existing computers can’t achieve solutions that a quantum computer can achieve. Can anything practical be done with 40–45 perfect qubits (the limit of simulators)? Do ethical issues matter before practicality occurs?
- **The four basic building blocks of quantum computing.** Quantization — two discrete energy levels, Superposition, Entanglement, Interference. And Hadamard transform, quantum parallelism, quantum phase estimation, quantum Fourier transform, etc.
- **How scalable is your quantum algorithm?** Or, is your quantum algorithm scalable?
- **What greatest advance would I like to see next on the quantum computing front?** If I could only see one major advance in quantum computing in the coming year, what would it be?
- **Proposal for a criterion for quantum advantage — can your quantum algorithm do in no more than an hour what would take 50,000 classical computers more than eight hours?** The whole point of quantum advantage is not just that a quantum computer is faster than a classical computer, but that a complex problem can be solved promptly using a quantum computer while it would take an incredible level of classical resources to achieve the same solution, and even then, likely not in a timely manner. As a prototypical example, suppose a business had a thorny optimization problem such as route scheduling which they needed to solve once a day within much less than eight hours to be acceptable. An hour or two might be acceptable, but more than a few hours would simply not be viable for the business requirements. So, 50,000 classical computers working for eight hours seems like a good metric for comparison. These are not precise numbers, just a rough ballpark to express the conceptual comparison. Alternatives are what a quantum computer can do in one second, ten seconds, one minute, ten minutes, or thirty minutes versus what a large number of classical computers can do in one day, 48 hours, 72 hours, one week, one month, three months, or one year.
- **Fractional quantum advantage or partial quantum advantage or degrees of quantum advantage.** There is no generally accepted numeric metric for how much of a performance advantage of a quantum solution over a classical solution constitutes *true quantum advantage*, but I generally refer to *dramatic quantum advantage* to emphasize that the advantage is much more than a relatively minor advantage. Generally, I would say that *true, dramatic quantum advantage* is more than a few *orders of magnitude* (powers of ten) advantage. Thousands or millions or more would clearly be a dramatic advantage. If one can match or exceed the advantage of a quantum solution simply by adding ten, a hundred, or even 1,000 classical machines, that’s *not* what we mean by a true, dramatic quantum advantage, since those are tasks easily accomplished by any competent IT staff today — no quantum computer required. That said, even a solution which is only ten or twenty times faster, or even four or five times faster, is still interesting in many niches, but I would call such quantum solutions *fractional quantum advantage* or *partial quantum advantage* since they aren’t delivering the full, promised potential of quantum computing. I don’t have particular metrics in mind, but one million and ten thousand are two candidates, among others. So if a quantum solution could do the work of 500,000 (or 5,000) classical computers, that could be considered a fractional quantum advantage of 0.5. Or, doing the work of 1,000 (or 10) classical computers could be considered a fractional quantum advantage of 0.001. Alternatively, consider degrees of quantum advantage: 1) Below classical, 2) Near parity with classical, 3) Modestly better than classical, 4) Better than even a massively parallel and distributed classical solution, 5) Degrees of parallel/distributed — 2x, 4x, 8x, 16x, 64x, 256x, 1024x, 4Kx, 16Kx, 64Kx, 256Kx, or 10x, 25x, 50x, 100x, 500x, 1Kx, 10Kx, 25Kx, 50Kx, 250Kx, 6) Moderately better than classical, 7) Well above classical, and 8) Amazingly above classical — quantum supremacy — classical can’t even get the job done with any realistic level of resources and time.
- **Personas, use cases, and access patterns for quantum computing.** Who, what, and how for the use of quantum computers. Including managers, executives, non-quantum technical staff, IT service staff, scientists, engineers, operations staff (e.g., those seeking optimization), marketing, sales, technical support, etc. I’ve written such a categorization for databases and cybersecurity; every field could use one. Important for focusing on audiences for writing, products, and services.
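To make the fractional-advantage arithmetic concrete, here is a minimal sketch, assuming the fraction is simply the number of classical machines a quantum solution can match, divided by the candidate target count (that is my reading of the 0.5 and 0.001 examples; the formula and function names are just illustrative, not an established metric):

```python
# Sketch of "fractional quantum advantage" as a linear fraction of a target
# classical-machine count. One plausible reading of the examples above.

def fractional_quantum_advantage(machines_matched: float,
                                 target_machines: float = 1_000_000) -> float:
    """Fraction of 'full' quantum advantage: machines matched / target."""
    return machines_matched / target_machines

# The examples from the note, with the one-million-machine target:
print(fractional_quantum_advantage(500_000))  # 0.5
print(fractional_quantum_advantage(1_000))    # 0.001

# The alternate ten-thousand-machine candidate target:
print(fractional_quantum_advantage(5_000, target_machines=10_000))  # 0.5

# The proposed advantage criterion: one quantum hour matching
# 50,000 classical computers running for eight hours.
classical_machine_hours = 50_000 * 8
print(classical_machine_hours)  # 400000 machine-hours vs. 1 quantum hour
```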

This is a live list and could be updated on any day.

# Runner-up topics

These topics are just as important and interesting as the top choices, but didn’t quite make the cut. In some cases a topic might be sufficiently appealing, but I may not yet have enough information about it to do it justice.

- **What might you do with 1,000 qubits?** Any off-the-shelf algorithms that can immediately scale up (or down)? What about Shor’s algorithm? Questions about total gates, maximum circuit depth, and coherence time. Is quantum error correction needed?
- **What is quantum parallelism?** This is probably the single most important topic. What exactly is computed in parallel? How is the result measured? Is statistical repetition of the calculation required to tease out the range of possible solutions to be evaluated classically? Ditto for Shor’s and calculating order — repeat the same exact quantum calculation k times and examine the distribution?
- **What does the field really need right now, for the coming year?** Deliver promised qubit increases. Improve coherence time.
- **What application is likely to reach dramatic quantum advantage first?** Or at least which application category? I’m personally rooting for quantum computational chemistry. Business process optimization would seem to be a slam dunk. What factors will make the choice? Could the ability to tolerate an approximate solution be a key factor?
- **What application is likely to reach The ENIAC Moment first?** This could be the same as *What application is likely to reach dramatic quantum advantage first?*, but a true dramatic quantum advantage is not required provided that enough of a *partial quantum advantage* is achieved that really impresses people.
- **Need for multiple ENIAC Moments, one for each major application category.** The needs of the various application categories are too different for a solution in one to automatically translate into a solution in any of the others. But maybe the others follow in fairly rapid succession from the first one, which blazes the trail.
- **What might you do with a million qubits?** Scarcity of algorithms. Something that is commercially relevant or scientifically interesting. Besides quantum error correction and logical qubits. Sensor-backed qubits? Still too small for Big Data, and no I/O. What single computations might need a million qubits, as opposed to a collection of independent computations which could be executed in parallel on a modular quantum computer with, say, 16K qubits on each of 64 QPU modules? What single Hadamard transform or QFT would need a million qubits or a sizeable fraction thereof?
- **What might you do with 512 qubits?**
- **What might you do with 256 qubits?**
- **What might you do with 128 qubits?**
- **Need for 32 to 64-qubit algorithms.** Smaller algorithms don’t really demonstrate the true potential of quantum computing — dramatic quantum advantage. 32–40 qubits can be classically simulated, so focus there first, with emphasis on scaling from 32 to 64 qubits.
- **How soon will algorithms hit a hard wall without quantum error correction?** Algorithms using more than 28 or 32 or 36 or 40 qubits? For usable, practical algorithms addressing real-world business problems, not specialized computer science experiments.
- **Wanted: True Nobel-class geniuses to propel quantum computing more than a few quantum leaps forward.** Mere incremental engineering just won’t cut it to finally make quantum computing usable and useful at production scale for real-world problems. Breathtaking advances are needed; incremental advances are simply not enough. Quantum computing will just be dead air until then.
- **Would moonshot projects or Manhattan Project-style projects really help quantum computing make a quantum leap to its full potential?** Timing is everything — doing either Project Apollo or the Manhattan Project even a mere five years earlier would have been an absolute disaster — a critical mass of the critical technologies, coupled with a critical mass of resources and a critical mass of commitment, are essential. Quantum computing is still basically stumbling around in the dark. Besides, look how well classical computing developed without a moonshot project.
But also look at how the long stream of critical developments occurred over an extended period of time.
- **Can quantum variational methods ever achieve quantum advantage?** Sure, variational methods are a great way of performing computational chemistry calculations on NISQ devices, but not with any apparent, let alone dramatic, quantum advantage. Too fragmented. Each invocation is too limited. Need to use 50 or more qubits in a single Hadamard transform computation to achieve quantum advantage.
- **Are variational methods a dead end?** Inability to achieve dramatic quantum advantage?
- **Near-perfect qubits.** Much less noisy than NISQ devices, but still not as absolutely stable as full quantum error correction. Exactly how close to perfect is near-perfect? How close to perfect would many applications require? See also: *Nines of qubit fidelity*. See also: *Preliminary Thoughts on Fault-Tolerant Quantum Computing, Quantum Error Correction, and Logical Qubits*.
- **Importance of focusing on near-perfect qubits.** Especially as a near-term research target. Full-blown quantum error correction (QEC) is too far out. Near-perfect qubits are needed for QEC anyway. NISQ won’t get to quantum advantage. Range of possible nines — generally, 3 to 6. Nines and fractional nines.
- **NPISQ vs. NISQ devices.** NPISQ = Near-Perfect Intermediate-Scale Quantum. Not quite as error-free as full quantum error correction, but getting close, and far less noisy than NISQ. May be good enough for many applications. And doesn’t require the sheer volume of qubits that full QEC requires. Full QEC (FTISQ = Fault-Tolerant Intermediate-Scale Quantum) may still be needed for some applications, but NPISQ may be good enough for many.
- **Noisy, near-perfect, and fault-tolerant quantum computers.** Fault tolerance requires dramatically more qubits (redundancy). Noisy is too noisy for most applications. Near-perfect still has some minor amount of noise and isn’t as absolutely perfect as fault-tolerant QEC, but may be good enough for many applications. Prefixes: N, NP, FT.
- **Small scale, intermediate scale, and large scale quantum computers.** Intermediate-scale quantum computers, as in NISQ, have 50 to a few hundred qubits. Small scale would be under 50. Large scale would be more than a few hundred, thousands, or even millions. Prefixes: SS, IS, LS. Combine with the prefixes for noisiness (N, NP, FT): NSSQ, NISQ, NLSQ, NPSSQ, NPISQ, NPLSQ, FTSSQ, FTISQ, FTLSQ.
- **Importance of focusing classical quantum simulators on near-perfect qubits.** Although the definition and specification of qubit fidelity for near-perfect qubits will vary over time, it would be very beneficial to configure classical quantum simulators to precisely match the qubit fidelity of real near-perfect qubits so that simulation runs will closely match the results of execution on a real quantum computer constructed of near-perfect qubits meeting those qubit fidelity specifications. In addition to precisely matching existing machines, qubit specifications could be tuned to match proposed qubits to see how they are likely to perform even before the machine is actually built. A wide range of possible near-perfect qubit specifications could be simulated to get a sense of expected results even before real machines are actually built. Specify the number of nines for qubit fidelity. Focus algorithm designers and application developers on how many nines of qubit fidelity they are working with. Focusing on both NISQ and perfect qubits is an unproductive distraction. Granted, initial focus will be on very low nines (1 to 2 — 90% to 99%), but explicit focus on nines is a more productive mindset.
- **We desperately need near-perfect qubits to accelerate algorithm development.** Noisy qubits are an incredible burden and impediment to designing quantum algorithms and developing quantum applications.
Logical qubits based on full, automatic, and transparent quantum error correction will solve a plethora of problems, but won’t be available for quite a few years, so the intermediate steppingstone of near-perfect qubits will enable a fairly wide range of algorithms and applications. Near-perfect qubits are needed to accelerate algorithm design, application development, and deployment of quantum computing solutions.
- **Might an application-specific or domain-specific quantum computer lead to the greatest breakthrough in quantum computing?** We do have D-Wave, but it is algorithm-specific without general-purpose capabilities. Classical computing leaped over analog computing with the capability of a general-purpose Turing machine.
- **Where is quantum computing today?** Outside chance of something useful in 2 years or 2–3 years? Greater chance of something useful in 5 years or 4–5 years? Nothing really useful today? Unlikely in the next 6 months? Still rather unlikely in 12 months? Or even 18 months? Is two years still a little too far out to be trying to accurately predict? Will Honeywell and IonQ break out, stall, or stumble? Toy apps — 5 to 17 qubits, no real apps with over 20–24 qubits, other than the Google quantum supremacy example? Unclear how many qubits are usable with quantum simulators — very small numbers, 20, 40 (only in theory)? Connectivity is still a limiting factor — transmon — hard limitation; trapped ion — promises any-to-any, but not yet a reality for a significant number of qubits?
- **Three stages of deployment for quantum computing: The ENIAC Moment, configurable packaged solutions, and The FORTRAN Moment.** The initial stage of deployment — The ENIAC Moment — for quantum computing solutions relies on super-elite STEM professionals using a wide range of tricks and supreme cleverness to achieve solutions. For example, manual error mitigation. The second stage — configurable packaged solutions — also relies on similar super-elite professionals to create frameworks for solutions which can then be configured by non-elite professionals to achieve solutions. Those non-elite professionals are able to prepare their domain-specific input data in a convenient form compatible with their non-elite capabilities, but don’t have to comprehend or even touch the underlying quantum algorithms or code. The final stage — The FORTRAN Moment — relies on a much more advanced and high-level programming model, application frameworks, and libraries, as well as logical qubits based on full, automatic, and transparent quantum error correction, to enable non-elite professionals to develop solutions from scratch without the direct involvement of or dependence on super-elite professionals.
- **No, most current quantum computers are not NISQ devices.** Technically, *intermediate-scale* (the IS in NISQ) means at least 50 qubits, up to a few hundred qubits. As such, quantum computers with 5 to 32 qubits — the bulk of current quantum computers — don’t technically qualify as true *NISQ devices*. That said, most people consider these smaller quantum computers to be NISQ devices anyway. That’s life in a young sector where hype is rampant.
- **People are jumping the gun and prematurely acting as if quantum computing technology were near to being ready for deployment in the next few years.** A belief in deployment of quantum computing in the next two or three years is clearly not currently justified. Even deployment in three to five years is not currently technically justified. Five to seven years *might* be a practical timeframe, but that’s really mere speculation at this juncture.
- Enhancements to *What Are Quantum Effects and How Do They Enable Quantum Information Science?* The quantum hypothesis — in general. Historical role in transitioning from the classical world to the quantum world. Momentum, angular momentum, orbital momentum. Add helicity (spin, angular momentum), chirality (handedness), polarization, and detail on **creation and annihilation operators** (primarily to facilitate quantum mechanical modeling of many-particle systems). Harmonics? Localization vs. spooky action at a distance? Pure state, mixed state, density matrix, density operator. Wave packets. Nuclear magnetic resonance (NMR). Some early efforts in quantum computing relied on NMR. Nuclear electric resonance. Gravity — nominally lies in the realm of General Relativity rather than quantum mechanics, but… who knows for sure. Can a quantum computer be used to accurately simulate or model gravity and gravitational effects, including effects on time and space? Clearly quantum metrology relates to gravity and gravity waves in some way. Is friction a quantum effect? Avalanche and threshold effects as quantum effects? Correlated quantum matter and quantum materials. Mention hidden variables. Mention Hilbert space.
- **Toolkits, libraries, frameworks, and packaged solutions for quantum computing applications.** Relative value. Relative benefits. Limitations. Toolkits help and give you some, but limited, leverage: little in the way of deep, dramatic abstractions; significant knowledge of under the hood is required; work across multiple or even all domains; intimate knowledge of probabilistic and statistical solutions rather than deterministic solutions. Frameworks give you significant leverage: many deep and meaningful abstractions, but still a lot of gaps and connections that the user must fill in; some, but limited, knowledge of under the hood may be required; degree and depth of knowledge will vary from framework to framework and domain to domain; may work across at least some domains, but may be domain-specific; some degree of knowledge of probabilistic and statistical solutions rather than deterministic solutions. Packaged solutions: it’s all there; the user gets to work exclusively in terms of higher-order abstractions; the user can do some degree of configuration and customization, but all in terms of higher-order abstractions; no knowledge of under the hood required; usually domain-specific, although there may be some niche horizontal solutions; deterministic solutions, or maybe some degree of statistical solutions if that is appropriate for the application domain, otherwise strictly deterministic, even if that is accomplished using statistical methods. Tools vs. toolkit — standalone tools as apps vs. modules linked from libraries? Libraries separate, or simply the implementation mechanism for toolkits and frameworks? Under the hood: math, physics, Greek symbols, arcane jargon.
- **Quantum hell.** Waiting for… hardware, algorithm metaphors, and applications. Needed and critical: genius-level innovation.
- **How do you talk about quantum to lay people?** So many different audiences. So many different interests. So many different levels of sophistication. Granularity of the universe. Qubits behave communally — enormous parallelism. Chemistry is fundamentally quantum. Molecules are quantum. Much greater sensitivity. Classical mechanics is an approximation — it doesn’t capture everything. Quantum mechanics has been around for a long time. A lot of uncertainty in and about quantum technology.
- **Which will come next: more qubits or higher-quality qubits?** Either way, we need near-perfect qubits, even for quantum error correction.
Quality would help a lot. Not clear that algorithms are capable of utilizing more qubits yet, especially if better connectivity is needed. Will ion traps reach commercialization soon or have a slower slog towards a higher number of qubits?
- **What’s wrong with quantum volume?** Limitations. Low utility. Lack of specificity.
- **Alternatives to quantum volume for benchmarking.** Separate grade for each capability. Number of qubits. Nines of qubit fidelity for gate execution and measurement. Coherence time. Connectivity — any-to-any vs. SWAP network fidelity? Gate execution time as well (ion traps are supposedly much slower).
- **The essential concepts in quantum computing.** Not all of the technical details, math, physics, and nuances, but what really matters. Quantum parallelism. Interference. Phase. Circuit repetitions (shots) for statistical significance. Probabilistic. Little data with a big solution space and little output. Also see *Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing*.
- **NISQ era.** Does it have any real practical consequences, or is it simply an inconsequential steppingstone to get to a proverbial *post-NISQ era* where there actually are dramatic practical consequences? AFAICT, NISQ alone won’t enable production-scale practical applications with dramatic quantum advantage.
- Add the recent (2019) Preskill comments from Quanta to my quantum advantage paper. Preskill in Quanta: *Why I Called It ‘Quantum Supremacy’* — https://www.quantamagazine.org/john-preskill-explains-quantum-supremacy-20191002/
- **Quantum parallelism as the central transform for quantum algorithms.**
- Update *What Is Quantum Algorithmic Breakout and When Will It Be Achieved?* Large number of measurement shots needed — can force use of sub-optimal algorithms in a bid to dramatically reduce the number of measurements/shots needed. Is the ultimate rule for shot count polynomial or exponential? More emphasis on advanced, powerful simulators. Or is the current section enough? Maybe mention Atos, hardware accelerators. More massive memory.
- **Fractional parallelism.** Unable to do the full desired parallel computation in one step — must break it into pieces with optimization or other processing between the pieces. Blocks for D-Wave. Variational methods. Machine learning?
- **Status of two-year path to initial success.** When did I first say I saw a 2-year path to a real, meaningful application of quantum computing? Later I codified it as *The ENIAC Moment*. Now, how much longer is it likely to take? Is it slipping a year every year?
- **Still in the 1930s of classical computing.** Current quantum computers are not even comparable to what we could do with classical computers in the early 1940s. For a timeline of historical classical computers, see *Timeline of Early Classical Computers*.
- **What happens during the gap between the ENIAC moment and the FORTRAN moment?** The Lunatic Fringe reigns supreme, with raw machine language serving a limited, elite audience and market. Packaged solutions requiring only configuration but no algorithm coding can enable solutions to problems for a wider audience. A number of mini-stages before the full FORTRAN moment is achieved — partial FORTRAN moments, such as a limited number of logical qubits. Some niche applications may be able to verge on The FORTRAN Moment even before many or most applications are enabled for the full sense of The FORTRAN Moment.
- **Is a quantum computer merely a quantum calculator?** Is a quantum computer really a computer or simply a calculator? The coprocessor model.
Merely a single block of code rather than a full program or application, with no control structures, data structures, nested function calls, rich data types, I/O, database access, or network access. Separate paper for the quantum coprocessor? Hybrid computation — quantum subroutine; the quantum computer doesn’t perform a full computation, only a portion, a “calculation”, such as the variational quantum eigensolver (VQE) method. What are the key differentiators between a computer and a calculator? Such as: both perform basic operations; logic — conditional execution; loops, iteration; nested functions, with parameters; input and output during processing; persistent storage — shared between runs of the same or different programs. Full Turing machine? What defines what that is?
- **Qubit is a hybrid combination of a storage element and a processing element.** And information as well. Can storage and processing be separated, either completely or temporally?
- **Quantum computing for dummies.** The easy stuff that everybody needs and wants to know. But none of the deep stuff, no math, no Greek, no physics. Plenty of these out there, but I know I can do it better and more simply — and more correctly.
- **My focus for quantum computing.** The technology and how it can be applied, but no specific applications per se. 2–5 year timeframe. Not so interested in the next year. Likely not the second year either. Not as interested in 10+ years, other than the eventual destination and the universal quantum computer — hybrid with classical. 5–10 years is less of an interest, but maybe 7 if there are delays for technology which should have happened in 5 years. But… the likelihood of dramatic progress in 2–5 years is a coin flip, so expectations for 2–5 years may not transpire until 5–10 years.
- **Classical computers are based on quantum mechanics and quantum effects too.** Yes, they are. Quantum effects within transistors — and in wires as well. All classical bits and gates are ultimately derived solely from quantum effects. Somehow, quantum effects can be converted into non-quantum macroscopic effects — through the magic of statistical aggregation. Probability plus statistical aggregation equals approximate but practical determinism. Contrast with absolute determinism. Or does an exponential limit plus a quantum threshold guarantee determinism, short of hardware failure? Every classical computer is also a quantum computer, in some sense. It simply collapses all of the quantum states, but still depends on quantum state transitions between the collapses. Technically, it may be more correct to say that classical computing is based on quantum effects rather than on the logical operations of the Bloch sphere and unitary matrices on qubits. It would be fascinating to explore the boundary between the quantum and non-quantum worlds, possibly even producing a hybrid, mixed model of computation.
- **Configurable packaged solutions are the greatest opportunity for widespread adoption of quantum computing.** Prewritten code — complete applications — addressing particular niches of quantum computing applications. The user supplies the input data and a variety of application-oriented configuration parameters. Prewritten classical code will generate the necessary quantum circuits needed to implement the underlying algorithms using the user’s input data and configuration parameters. Algorithms and code for such solutions must be designed and developed by very elite professional teams, well beyond the ability of even most Fortune 500 companies, which will be the target customers of such packaged solutions. No knowledge needed of the physics or math of quantum computing. No knowledge needed of the internals of the underlying algorithms of the packaged solutions. Examples — none exist, yet. D-Wave is a step in this direction, but falls short.
- **Roadmap for molecular modeling of chemistry.** Define a wide range of molecules, from simplest to most complex, as the milestones on the roadmap.
Such as salt, caffeine, sugar, gasoline, etc. Estimate the qubits, circuit depth, and coherence needed for each milestone.
- **When will an application be appropriate for a quantum computer?** Solving a substantial, production-scale, real-world problem using a quantum computer.
- **Levels of use.** Related to personas and use cases, but a simpler formulation. 1) Direct use of qubits by algorithm designers. 2) Application developers use high-level libraries which generate quantum circuits and convert qubit results to application results. 3) STEM managers and executives who have teams to design, implement, deploy, and operate quantum-based solutions. 4) Non-STEM business and operational managers and executives who rely on the results of quantum-based applications. 5) Executives of organizations which rely on quantum-based computations, such as drug discovery, material design, business process optimization, etc. — better or more efficient operations, better or more efficient financial results. 6) Customers and users who use the products and services of organizations which rely on quantum computations.
- **Not ready for Quantum Ready.** Because quantum is not ready for us. Not just a need for much more advanced hardware, but algorithms and programming models as well. Algorithmic building blocks — a richer set of levels of abstraction. Standardized primitive operations. Programming language. Standardized APIs. Design patterns. Clearly explained examples — fully commented.
- **What computational tasks are possible using quantum computing?** Not applications or application problems per se, but specific types of computations.
- **How many more qubits are needed before we can do anything significant with a quantum computer?** Maybe that’s *The ENIAC Moment*, or maybe something short of the full ENIAC moment.
- **Deep dive on phase estimation.** What makes it work? How does phase actually get captured? Why isn’t this a primitive operation of the firmware? How many 9’s of qubit fidelity are needed to capture each bit of phase? Need solid, real-world examples.
- **Is quantum computing on the verge of a breakout, or a breakdown into a quantum computing winter?**
- **Beware of quantum algorithms which don’t discuss and fully characterize scalability.** Discuss, demonstrate, and even prove scalability, if possible. Fine to run on a small real quantum computer, but it should also accurately simulate on larger classical quantum simulators (up to 40 to 45 qubits). Simulation should use a realistic noise model comparable to expected real machines in a target timeframe (one year, two years, five years, seven years, ten years). Identify and discuss any limiting factors, such as phase granularity.
- **Scaling of algorithms and applications won’t be easy, automatic, and free.** Too many odd factors and tight limits. Someday it may get easier, more automatic, and relatively cheap, but not anytime soon. For the foreseeable future it will be tedious, difficult, and outright problematic. Quantum Fourier transform, quantum phase estimation, and anything dependent on fine granularity of phase will be especially problematic.
- **When will we get to see the emperor’s clothes?** The tailors have made many wildly extravagant promises, and we’re all presuming that the reality will be even more extravagant than the promises, but… here we are, waiting patiently (okay, *very* impatiently!) for the emperor and his super-extravagant clothes to make their appearance. We won’t be disappointed… will we? Seriously, how much of the hype should be treated as fact, or even understatement of fact?
- **What is the truth about quantum computing?** Beyond the hype. What are the strongest statements we can make which are actually grounded in fact?
- **When will quantum computing start to become interesting?** 128 vs. 256 vs. 1K qubits? 100 vs. 1K vs. 10K gates? 100 vs. 250 vs. 500 vs. 1K circuit depth (coherence)? What algorithms, besides Shor’s? Only when production scale is reached?
Short of production scale, but sufficient to demonstrate that production scale is close to being in reach? Maybe it’s *The ENIAC Moment*, by definition?
- **What can a quantum computer actually do?** Problems. Applications. Functions. Mathematical functions. Computable functions. Tasks. Logic gates. Absolutely, and relative to what a classical computer can accomplish.
- **When is quantum computing projected to be able to do anything interesting or practical?** Maybe 2–3 years. Maybe 5 years. Maybe 7 years. Maybe 10 years. Maybe 15 years. When might The ENIAC Moment arrive?
- **Is a Quantum Winter coming?** Based on what criteria? What would it look like? How might it end and transition to a Quantum Spring and Quantum Summer?
- **Quantum computing is (currently) far too complicated for all but the most elite of professionals.** To do anything useful, that is. When will that change? Maybe, or in theory, The FORTRAN Moment.
- **What criteria must an application meet to be appropriate for a quantum computer?** Probabilistic results are okay. Approximate results are okay. The application problem must be reducible to a problem in physics. Can’t require complex logic with conditionals, looping, functions with parameters, or rich data types. Must utilize quantum parallelism with a register of substantially more than fifty qubits under a Hadamard transform to achieve dramatic quantum advantage. Only dramatic quantum advantage justifies the use of a quantum computer.
- **Need for a lone physicist with a single great idea.** Achieving The ENIAC Moment for quantum computing might be less about some big corporate push, and more about a solitary elite scientist or engineer identifying the kernel of a simple but great idea and pushing forward to a breakthrough implementation that dramatically blows away everyone and everything else. Possibly even developing new hardware, new tools, and a new programming model focused on achieving great results for that single great idea. Minimal resources, but maximal impact. Even if that one implementation is specialized for that one idea, it could provide the foundation for generalizing to a much broader class of problems. Are there any young Feynmans or Oppenheimers out there poised for the challenge?
- **The Mantra: Quantum computing must deliver substantial and dramatic enterprise value far beyond what classical computing can deliver.** In the form of quantum advantage — *dramatic* quantum advantage.
- **The siren song of quantum computing: The lure and promise of quantum parallelism.** Making the promise a reality.
- **Quantum variational methods considered harmful.** Variational methods are useful in general in physics, and seem to work modestly well on NISQ devices, but they don’t appear to offer any prospect of achieving dramatic quantum advantage. They are a mediocre substitute for true quantum parallelism on a scale capable of achieving dramatic quantum advantage.
- **How best to frame quantum computing.** Especially for different audiences or personas. Will evolve over time as the technology, use cases, and access patterns evolve.
- **What is the largest semiprime number which a quantum computer can reasonably be expected to factor within the next 2–3 years?** Or are the precision and fidelity requirements for quantum Fourier transform still too great for near-term machines? Can we achieve enough nines of qubit fidelity for near-perfect qubits, or are even more nines or full quantum error correction required?
- **What do senior executives need to know about quantum computing?** Right now, just that it is a research area — fund much more research before expecting that it can be deployed to solve real-world problems. Less about the raw technology and what’s under the hood, and more about applications, benefits, and limitations.
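Several of the topics above lean on “nines of qubit fidelity”, including fractional nines. The underlying arithmetic is simple: n nines of fidelity means an error rate of 10^(-n), so fidelity = 1 - 10^(-n). A minimal sketch of the conversion (the function names are my own, hypothetical):

```python
import math

def fidelity_from_nines(nines: float) -> float:
    """n nines of fidelity = 1 - 10**(-n). E.g., 2 nines = 99%."""
    return 1.0 - 10.0 ** (-nines)

def nines_from_fidelity(fidelity: float) -> float:
    """Inverse: how many nines (possibly fractional) a fidelity represents."""
    return -math.log10(1.0 - fidelity)

print(fidelity_from_nines(1))      # 0.9 (90%, the low end of "very low nines")
print(fidelity_from_nines(2))      # 0.99 (99%)
print(fidelity_from_nines(4.5))    # ~0.9999684 (a fractional nine)
print(nines_from_fidelity(0.999))  # ~3.0
```

This is why “3 to 6 nines” is shorthand for error rates somewhere between one in a thousand and one in a million.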

This is a live list and could be updated on any day.

# Additional runner-up topics

I know it’s a lot of extra clutter since these topics are also in the full list, but all of them feel important enough to warrant my attention sooner rather than later. I just couldn’t bring myself to remove them. I’m hoping to winnow this list down over time. Although every time I think I have my lists under control they just seem to grow even more!

**What mindset is needed to excel at quantum computing?**How to analyze real-world problems. How to couch problems in quantum terms. How to architect and synthesize quantum solution approaches. How to design quantum algorithms. How to develop applications which use quantum algorithms.**What are quantum computers good for?**What problems are they most applicable to? Key benefits — besides raw performance, solving problems previously thought unsolvable using classical computers.**Integrating quantum computing with real-time quantum sensing.**I’m not sure exactly what this would really look like, but the potential is… awesome. Need some sort of real-time loop — snapshot quantum sensed data, process, optionally output classically, rinse and repeat. Actual quantum processing would have to be re-thought from the ground up — not limited by or based on classical real-time processing. A quantum version of a CCD image sensor would be an obvious use case.**How can we automatically validate scalability of a quantum algorithm?****What is the secret sauce (or sauces) of quantum computing?**What really makes quantum parallelism tick… and roar? How is quantum advantage actually achieved?**Spoiled by the richness of classical computing, Turing machine, data type abstractions.****Quantum algorithm design, quantum application development, and quantum application deployment.**Overall architectural and methodology model for quantum computation. Core quantum algorithms. Hybrid classical code to bridge the gap between application data and quantum circuits. Classical code for applications which use those quantum algorithms. Real-world deployment of those applications.**What are pure states and mixed states?**And density matrix and density operator. Do they relate to superposition, entanglement, or both? What’s the point? Why should we care? What are they good for? How do we use them? FWIW, I rarely see mention of them. 
What are the proper terms to use when the probability amplitudes of the basis states of a single, unentangled qubit are 0.0 and 1.0 vs. neither 0.0 nor 1.0 — pure and mixed states, or some other terms? See *Introduction to Quantum Information Science Lecture Notes* by Scott Aaronson.
- **Bespoke algorithms vs. reusable algorithms.** Tradeoff between algorithms custom-designed and tailored for particular applications, and general-purpose algorithms usable across many applications. Advantages and disadvantages to both. Bespoke algorithms can be more efficient, run faster, and use fewer resources, but require dramatically more intellectual effort and can be much more difficult to maintain. General-purpose algorithms can be slower and use more resources, but are much easier to understand and use. Generalities, but many exceptions.
- **Challenges of modeling complex systems for quantum computing.** Other than physics and chemistry, which are naturally modeled by quantum mechanics. Telling people to simply model their problem as a physics problem is too simplistic and not so helpful for non-physicists and non-chemists.
- **GIGO — Garbage In, Garbage Out — how will your quantum algorithm or application behave if the input data is problematic?** Detecting errors. Rules for good, clean data. Issues with quantum circuits. Validate data before generating a quantum circuit.
- **What does it mean to be quantum?** In general, but simplified. See also: *What Are Quantum Effects and How Do They Enable Quantum Information Science?*
- **Necessity is the mother of invention: Let application requirements drive the design of the quantum computer.** The designs of all of the early classical computers were driven by the intense needs of particular high-value applications. Bell Labs, Harvard, Zuse, Atanasoff, Colossus, ENIAC, EDVAC, EDSAC, SEAC, Whirlwind, MANIAC, SAGE, et al. But somehow there was still an intense interest in building a fairly general-purpose machine.
- **My pet peeves about quantum computing.** So many!
Where to start?!
- **What are the most problematic aspects of quantum computing today?** Which are preventing production-scale applications which deliver substantial business value not achievable using classical computers. Capacity and fidelity are the first two.
- **Debunking quantum hype.** Where to start! Never-ending! Not much left?
- **What might a BASIC language for quantum computing look like?** What types of problems can be solved very easily?
- **No need to start learning about quantum computing until 40-qubit algorithms are the commonplace norm.** Scalability is one of the more important and urgent early lessons — it’s okay to code and test an 8 to 12-qubit algorithm, but you must not presume that it works until you scale it up to 32 to 48 qubits. And this is just the starting point — quantum computing won’t be in full swing until 64 to 80-qubit algorithms are the commonplace norm. IOW, dramatic quantum advantage is the norm.
- **What’s common across all quantum computing application categories?** Opportunities for sharing and reuse vs. need for unique and specialized approaches.
- **Essence of quantum computing.** In one sentence, one paragraph, one page, two pages, four pages, ten pages, twenty pages, fifty pages, and 100 pages. What gets added at each stage, and what gets left out to get to the previous stage.
- **The great seduction of quantum computing.** How can we reclaim our dignity?
- **Ideal qubit technology has not yet been invented.** Ideal as in good enough to achieve dramatic quantum advantage. Still in the early days. May be another 5–10 years before we see qubit technology which will enable dramatic quantum advantage for a wide range of applications.
- **Never underestimate the boundless cleverness which can propel classical computing to great new levels of capability.** Clever heuristics and intuitive leaps can surmount even extremely daunting obstacles.
- **Proof points for quantum computing to achieve widespread adoption.** Milestones to quantum advantage and beyond.
The ENIAC Moment and The FORTRAN Moment would be two. Various numbers of qubits would be others — 72, 80, 96, 128, 192, 256, 512, 1024, etc. Various nines of qubit fidelity would be others — two nines, three nines, four nines, five nines, six nines, etc. Full quantum error correction would be another. Various circuit depths of coherence would be others — 10, 20, 50, 75, 100, 150, 250, 500, 1,000, etc. Fractions of full, dramatic quantum advantage would be others — 10%, 25%, 50%, 75%, 90%, etc., although different application categories might have different requirements.
- **What is the key to quantum parallelism and quantum advantage?**
- **Things I like to say about quantum computing.** Wisdom. Fundamental general knowledge. Like, the ideal qubit technology has not yet been invented. Summarize the field. Simplify the field. Characterize the limits of the field. Cut through all of the hype.
- **How much can a quantum computer compute without recursion?** Seems like a rather severe limitation.
- **How do you do anything useful on a quantum computer — like what?** Tends to be too opaque and inscrutable. Need some direct, plain-language description.
- **The power of randomness in a deterministic world.** Using randomness, probability, and statistical aggregation to approximate determinism. Many real events have a random character, so coping with randomness is a good first step. Being overly reliant on and overly dependent on absolute determinism can cause problems when so many random factors are in play. Randomness can average out to an approximate determinism.
- **My function as doing due diligence for quantum computing.**
- **Quantum algorithms need to be provably scalable.** Generally scalable via parameterized dynamic generation of quantum circuits.
Test on both real machines and simulators for smaller sizes, simulator-only for medium sizes (32 to 50 qubits), and reliance on proof of scalability for the largest sizes, beyond 50 or so qubits.
- **Am I on the verge of becoming an apostate for quantum computing?** Yes, I’m getting more gloomy about the short-term prospects for quantum computing, but I still haven’t given up on the longer-term prospects, yet.
- **The truths you need to know about quantum computing.**
- **Bitter truths about quantum computing.** Tough to accept.
- **Why is quantum computing such a big deal?** Quantum parallelism provides an exponential speedup which gives a dramatic quantum advantage over classical computing.
- **Probability plus statistics equals approximate determinism.** Statistical aggregation of probabilistic data can approximate determinism. Is all apparent determinism really simply approximate determinism? Is there any absolute determinism in a universe based on quantum mechanics? Dampening of quantum mechanical effects, oscillation at the Planck level.
- **How to Get Started in Quantum Computing?** To do what? What target persona? What target use case? Preparation for what? Role-specific — producers — research and vendors, consumers. Application architects. Application developers. Algorithm designers. Business managers with business operations problems to solve. STEM managers with STEM problems to solve. Clearly document your intentions — what problems are you trying to solve, specific use cases.
- **When should senior managers and senior executives expect that quantum computing will be capable of delivering dramatic top-line revenue and bottom-line profit — 3, 5, 7, 10, 12, 15, or 20 years?** Tough question.
Significant investment will be needed well in advance of production-scale deployment, but business managers need to focus on delivering dramatic business value using mature technology, not availability of risky early versions of technology.
- **The ugly truth about NISQ computers — great for testing but not for production.** Small scale is a possibility, but not production scale.
- **What is a post-NISQ quantum computer?** What would it look like, and what can it do that a NISQ quantum computer can’t do?
- **My concerns about quantum computing at this stage.** Lack of robust application examples. Lack of great algorithmic building blocks.
- **I’m not ready to give up on quantum computing yet, but…** Not even close to being useful. Not even just over any reasonable horizon. Serious issues even on a much longer timeline.
- **Quantum computing: All dressed up and no place to go.** Maybe great (or at least decent) algorithms that can be simulated, up to a degree, but no hardware to achieve true quantum advantage, for now. “Maybe next year [or the year after that or…].”
- **Stuck/mired in the quantum swamp.** A superposition of futures — simultaneously very bright and very discouraging! Great potential for various niches, but little or no value for many others.
- **What will it take to get to the ENIAC moment?**
- **When can we expect practical, production-scale real-world applications with a dramatic quantum advantage?** 5–7 years?
Two years after The ENIAC Moment — the first production-scale application.
- **A quantum computer as a function, subroutine, or coprocessor.** Think of a quantum computer as a function, subroutine, or coprocessor.
- **Can quantum computing ever expand beyond being a mere coprocessor for narrow computations?** Complex logic, rich data types, complex data structures, Big Data, functions, classes/objects, I/O, databases — all while under quantum parallelism.
- **Future of the quantum computer/processor as a feature of a universal computer rather than as a separate computer.** Coprocessor vs. just additional instructions. Special data types vs. an attribute of existing data types.
- **Should I take a break from quantum computing, maybe even a protracted Rip Van Winkle sleep?** How long? Maybe two years, or maybe just six months, or maybe even sleep for 2–4 years, or 5–7 years, or even 15–25 years and wake up when quantum computing really is universal and finally ready for deployment at production scale, or a year before The ENIAC Moment. Criteria for resumption, threshold. Let the actual technology catch up with at least a fraction of the hype — and my own expectations. What to monitor while I wait — news, papers, conferences, books. Definitely monitor progress, but how to do that most effectively.
Maybe not a break so much as shifting gears or slowing down or going part time so I can focus on other things as well.
- **Quantum computing as a Mount Everest problem.** Relatively easy to climb up the lower portions of the slope, but lack of oxygen on the upper reaches of the slope can just crush your soul and spirit and bank account — or at least the patience of your management and backers.
- **Clarify the terminology: quantum computer, quantum computer system, quantum computing, quantum computation, quantum processor, quantum processing unit, QPU, quantum device, NISQ device.** Where are the boundaries between pure quantum and non-quantum hardware?
- **Venture capital opportunities in quantum computing.** Really only a research play right now. “Quantum Ready” is more of a fiction than a reality. More than a few years from commercial or even laboratory availability of a quantum computer capable of delivering dramatic quantum advantage for production-scale real-world problems.
- **Quantum volume — how does it work, what does it mean?** Deeper dive. List all of the factors. Explain in plain language. Can it be reverse engineered? What does it really mean for application developers? Does it really not work for >50 qubits?
- **A few general comments on quantum volume.**
- **How to decode quantum volume.** How to get square size. How wide and long a non-square rectangle is supported. Sense of overall fidelity. Sense of connectivity. Anything else? Or is a single number a one-way hash which hides rather than enhances value?
Or is it only useful for comparing machines, relative to each other, and not comparing rectangle size, maximum circuit length, or connectivity?
- **Companies looking to deliver data science, machine learning, AI, drug discovery, material design, and business optimization over the next 2–3 years should not be looking to quantum solutions in that timeframe.** Experimentation, prototyping, and leading-edge research, yes, but not production-ready solutions.
- **The fantasy of quantum computing.** Top 10 fantasies — coming soon — within 2 years, reliable qubits, coherent qubits, high-capacity simulator, portability of algorithms and applications between vendors (hardware platforms), high-level programming model and language, transition from hyper-hype to real reality.
- **The problem and opportunity of quantum networking.** No-cloning theorem — not simply copying information from A to B. Entanglement — transmitting shared information, establishing a portal (of sorts.)
- **Breakthroughs vs. incremental progress.** Will quantum error correction be enough? Are more dramatic hardware breakthroughs needed? Is a dramatic algorithmic breakthrough needed? Need a much higher-level programming model, with logical operations. What is needed for The ENIAC Moment, or The FORTRAN Moment?
- **What’s my most valuable contribution to quantum computing at this stage?** Fighting hype. Simplifying jargon. Uncovering nuance. Uncovering issues. Understanding — and explaining — limits and limitations.
- **Essential issues for quantum computing.** Probabilistic and statistical rather than deterministic. Noise and errors using NISQ vs. eventual QEC fault-tolerant quantum computing. Reduction of problems from application terms and traditional math and computer science to raw physics. Need for elite staff. Classical computing still has plenty of gas and runway. Classical Boolean logic and classical mathematical algebraic expressions don’t translate well (at all) into quantum code. Individual qubits vs.
numbers.
- **Need a roadmap for quantum computing for what is needed, not simply what can be done.** Needs to be more about problem solutions than the raw technology.
- **Should quantum applications even need to be aware of qubits, even logical qubits, or should higher-level abstractions be much more appropriate for an application-level quantum programming model?** Classical programming doesn’t require knowledge of bits: integers, real numbers, floating point, Booleans — logical true and false, binary, but not necessarily implemented as a single bit — text, strings, characters, character codes, structures, objects, arrays, trees, maps, graphs, media — audio, video, structured data, semi-structured data.
- **What is quantum networking?** Doesn’t exist yet, but hypothetically.
- **Quantum area networks (QAN).** Goal is modular quantum computing systems. Limit to qubits in a micro-area (1 mm or 1 cm?). Need the ability to “shuttle” quantum state across relatively short distances — a few mm, a cm, maybe a few cm, or maybe a few feet, or even 10–20 feet — to enable two or more modular quantum systems to be combined into a single quantum computer. Maybe some way to daisy-chain QPUs to have even a large data center complex as a single quantum computer system. Distinguish “moving” the entirety of a qubit’s quantum state vs. enabling two-qubit gates for entanglement between two adjacent modules.
- **Is Shor’s algorithm the worst-case limit for quantum algorithms, or just the starting point for serious, heavy-duty quantum algorithms?**
- **What’s the largest single quantum computation we can imagine at this time?** If we could build the largest quantum computer we wanted to solve the largest and hardest computational problems we could possibly imagine, what computations or problems might they be?
- **Google quantum supremacy one year on.** Why didn’t this feat open the floodgates for practical applications or even one additional example of quantum supremacy? Preskill: What’s Next After Quantum Supremacy?
http://theory.caltech.edu/~preskill/talks/Preskill-Q2B-2019.pdf
- **Risk of intellectual property (IP) causing a quantum dark age.** Accenture patents? Put emphasis on open source.
- **Hello World program for quantum computing.** No clue what it should really look like. A single X or H gate might suffice, but my inclination is that it should be the simplest program which shows a practical example of quantum parallelism with results that most people can understand. Something functional, such as 2, 3, and 4-qubit QFT. Something that most people can relate to and say “*Wow, that’s cool!*” Should attempt to show quantum advantage, but that could take 50 qubits or more. Maybe a graduated collection of Hello World programs is best.
- **What’s wrong with the traditional quantum example algorithms.** Not solving real problems. Too artificial. Too inscrutable. Grover provides only a quadratic speedup and really isn’t appropriate for “databases” per se. The examples, or their narratives, are unclear about how quantum parallelism really works.
- **Need a better collection of example algorithms.** Solve problems people can relate to. Solve real-world problems. Demonstrate quantum advantage.
- **My five (or so) key stumbling blocks in quantum computing.** How are multi-qubit product states represented physically — for n qubits, 2^n quantum states. Granularity of phase and probability amplitude.
- **Top priorities for quantum computing research.** Better qubit hardware. Qubit fidelity — coherence, gate operations, measurements. More qubit hardware technologies. Better environmental isolation and shielding. Logical qubits, error correction. Larger numbers of qubits. Modular quantum processor designs for very large numbers of qubits. Algorithms capable of scaling to quantum advantage. Tools to analyze algorithms for scalability issues. Tools for testing and debugging algorithms.
Better classical quantum simulators — performance, capacity, accuracy, configurability.
- **We need more basic, fundamental research for quantum computing.** We need at least 15 or 25 projects for alternative qubit technologies — qubits which are higher-reliability, smaller, cheaper, and faster.
- **Some day all computers will be quantum computers.** Quantum-level logic could enable much smaller and much more efficient classical logic. Use logical qubit technology to construct classical bits. Simultaneously use the desire for smaller and more efficient classical bits to drive down the size and drive up the performance of physical qubits and logical qubits.
- **What practical, production-scale real-world problems can quantum computing solve — and deliver dramatic quantum advantage and dramatic real business value?** And be clearly beyond what classical computing can deliver.
- **Quantum phase estimation (QPE) and quantum Fourier transform (QFT) as discrete (parameterized) firmware operations.** Permit algorithm designers and application developers to use QPE and QFT as simple, atomic black boxes rather than a blizzard of gates. Optimize any SWAP networks needed for connectivity within the firmware and hardware.
- **Quantum computing as it exists today and in the near future is a dead end.** No hint of The ENIAC Moment. Quantum advantage is not possible without quantum error correction or very near-perfect qubits. Next two years. For the foreseeable future? Hardware — not enough qubits, weak fidelity, limited connectivity. Algorithms — no robust programming model, no robust set of algorithmic building blocks, no easy paths from problem statements to quantum solutions, no robust set of introductory example programs, no clear path to quantum advantage. Not suitable for production-scale real-world problems.
- **How precisely can a probability amplitude or probability or phase be estimated?** Can 0.0 or 1.0 or even 0.50 be precisely achieved?
Or is there always a tiny epsilon of uncertainty down at the Planck level — 0.0 plus epsilon, 1.0 minus epsilon, 0.50 +/- epsilon? Same or different epsilon for probability amplitude and probability? Separate epsilons for the real and imaginary parts of a probability amplitude? Epsilon as a Planck probability? How tiny? Also for the minimum difference between two probability amplitudes. Even for 0.50, how close can two probabilities be? Has anyone else written about this?
- **Applying quantum effects to general AI and simulation of the human brain/mind.** Probability and statistical aggregation. Data-driven processing, not program-driven. And sensor-driven as well — quantum image sensors.
- **Is quantum computing needed to achieve human-level artificial intelligence?** Maybe not quantum computers as currently envisioned, but some level of computing using quantum effects. Turing u-machine. Quantum operations — cannot be computed classically.
- **Is quantum computing real?** Capable of solving production-scale real-world problems and delivering substantial real-world value?
- **Is there any material or substance that can’t be used as a qubit?** A grain of salt, sugar, or sand? A drop of water? A small ball bearing, BB pellet, or lead shot? A snippet of copper wire? A capacitor? In theory, anything could be used to build a quantum computer (or qubit) — everything is based on the same fundamental quantum mechanics. Isolation, control, coherence, and measurement are the gating factors. Since the natural state of the world is quantum, “Almost anything becomes a quantum computer if you shine the right kind of light on it.” — MIT physicist Seth Lloyd
- **Quantum Internet.** What exactly is it? Is it simply a special instance or application of quantum networking, or a distinct concept? Regular Internet using quantum links? Quantum channels?
Quantum hubs — store and forward?
- **Short piece on achieving quantum parallelism using the Hadamard transform (gate).**
- **Where is the value in quantum machine learning if there is no support for Big Data?**
- **Grover’s algorithm considered harmful.** Or at least less than helpful. Not really relevant to traditional Big Data databases. Only a quadratic speedup — not an exponential speedup. What would be a better example? What would be a realistic app to use Grover? How many qubits? What phase granularity requirement?
- **Citation of Shor’s algorithm considered harmful.** There are no practical implementations of the original algorithm, as written, so there is no point to citing it as an example of a practical quantum algorithm — and this is not likely to change any time soon. It’s very misleading to speak of Shor’s algorithm for factoring large semiprime numbers as having been “demonstrated” or implying that it is feasible or practical for relatively large numbers. It adds no real value to any published paper to cite Shor’s algorithm. At best, it’s a mere distraction. At worst, it detracts attention from more worthy near-term pursuits.
- **Quantum computing is stuck in Toy Land.** Currently only suitable for relatively trivial toy-like algorithms and applications. Not practical for real-world production-scale problems.
- **What are the quantum volume requirements for an algorithm or application?** Applying quantum volume to algorithms and applications. Need automated analysis.
Especially tricky for dynamically-generated algorithms and applications, such as a Python program which generates and executes quantum circuits.
- **How to map your application problem solution to a problem in physics.** Need for application frameworks and domain-specific mappings so elite pioneers can do all the heavy lifting to map the problem area to physics problems so that their followers can then more directly exploit (reuse) those mappings without necessarily even understanding what’s going on under the hood.
- **Tasks to pursue to get deeper into quantum computing.** Read a lot more of the papers coming out. Read more of the older papers, the foundational material. Deeper understanding of how qubits are actually implemented. Deeper understanding of quantum mechanics. Deeper understanding of quantum chemistry. Deeper understanding of Shor’s algorithm, and others — look at the derivative algorithms, read Miller’s ERH paper. Study number theory. Quantum error correction. Quantum communication. Post-quantum cryptography. Fill in as many of the TBDs in my glossary as possible. Consider a quantum Rip Van Winkle nap to wake up when quantum computers finally are mainstream with widespread production-scale applications very common — how many years should I set my alarm clock for?
- **Might Fortran make a comeback for quantum computing?** Purely speculative, but based on scientific computing being a primary focus for quantum computing. My notion of The FORTRAN Moment was merely using the original FORTRAN programming language as a metaphor for a transition to mass adoption by non-elite professionals, as occurred in the 1950s for classical computing when the original FORTRAN was introduced, but if scientists flock to quantum computing, many of them are already and still using Fortran. Note: FORTRAN (all caps) became Fortran (capitalized) beginning with the Fortran 90 standard.
Many existing scientific applications may have been developed using the FORTRAN 77 standard, or even earlier versions of FORTRAN (FORTRAN IV, FORTRAN 66.)
- **Model for scalability of algorithms.** General approach and need for discussion of scalability. Demonstrating algorithms on smaller machines, with details for how the algorithm would scale to larger machines. Demonstrating algorithms on classical quantum simulators with noise models that match actual or projected real machines.
- **Potential for quantum sensors and quantum effectors.** Super-fine sensing and robotics. Integration of quantum sensing and quantum computing.
- **Preliminary thoughts on an ontology for quantum computing.** And overall quantum information science as well. My glossary of terms for quantum computing might be a good starting point for terms and concepts. The goal for this particular topic is not the full ontology, but more of a framework or outline for the full ontology. Criteria. Goals. In theory, an ontology should be machine-readable, along with templates and automated processes for producing human-readable versions of the machine-readable ontology. Not clear who or what would use the ontology, although it should be a definitive map of the landscape of quantum computing.
- **Fundamental organizing principles and key insights for quantum computing.** What really makes quantum computing tick. What quantum parallelism is really all about. How does the product state for entangled qubits really work its magic? Not all the gory low-level technical details, but the key abstractions and functional elements of quantum computing. See also: *What Are Quantum Effects and How Do They Enable Quantum Information Science?*
- **Key questions about quantum advantage to ask when reviewing algorithms, applications, papers, projects, and products.** Mostly just making sure the matter is discussed fully and thoroughly, and in specific detail. How scalable is it as well. See section in *What Is Dramatic Quantum Advantage?*
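As an aside on the “How to decode quantum volume” topic: one piece of it is mechanical. Under IBM’s published definition, a machine’s quantum volume is reported as QV = 2^n, where n is both the width and the depth of the largest square random circuit the machine can reliably pass, so the square size can be recovered with a base-2 logarithm. A minimal sketch in Python (the function name is my own, purely illustrative):

```python
import math

def decode_quantum_volume(qv: int) -> int:
    """Recover the square circuit size n from a reported quantum volume.

    Per IBM's definition, QV = 2^n where n is both the number of qubits
    and the depth of the largest square random circuit the machine
    passes, so n = log2(QV). Note that this single number says nothing
    directly about non-square rectangles, fidelity, or connectivity.
    """
    n = int(round(math.log2(qv)))
    if n < 1 or 2 ** n != qv:
        raise ValueError(f"quantum volume {qv} is not a power of two")
    return n

# A machine reporting QV 64 has passed 6-qubit, depth-6 square circuits.
print(decode_quantum_volume(64))  # 6
```

Of course, this only answers the “square size” question; the harder questions above — rectangles, fidelity, connectivity — are exactly what the single number hides.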

This is a live list and could be updated on any day.

# Full list of topics

Without further ado, here is the full list of topics I could conceivably write on in the coming months and years, and maybe even decades, roughly in chronological order as they were added to the list, newest at the bottom. It includes all of the top topics highlighted in the previous lists:

- **FAQ.** I’ve catalogued quite a few questions, but answers are needed. I already have many answers, at least at a superficial level, but I want to get to the bottom of things, to ground truth, before trying to sound too authoritative. The FAQ will contain only relatively brief answers, and link to individual papers which have much more expansive detail on the topic, if necessary. Distinct FAQs may be needed for distinct audiences (personas.)
- **Glossary for quantum computing.** I already have one, but it is too comprehensive — over 3,000 entries — and is more of a comprehensive dictionary than a brief glossary. May need different glossaries for different audiences (personas.) Something on the order of 50 to 100 terms would be ideal. In addition, my current glossary has many TBD entries which still need to be fleshed out.
- **Personas, use cases, and access patterns for quantum computing.** Who, what, and how for the use of quantum computers. Including managers, executives, non-quantum technical staff, IT service staff, scientists, engineers, operations staff (e.g., those seeking optimization), marketing, sales, technical support, etc. I’ve written such a categorization for databases and cybersecurity; every field could use one. Important for focusing on audiences for writing, products, and services.
- **Tutorial.** Plural, for different audiences, personas, use cases, and access patterns. Actually, I have no intention of doing such writing for the foreseeable future since I’m more focused on ideas, principles, and theory than hands-on usage. Still, at some stage I may be tempted to do at least some kind of tutorial. Meanwhile, I expect that much of my writing will be usable for an introduction to many facets of quantum computing.
- **What is Quantum Ready?** Seems a bit vague and ambiguous. Needs a crisper presentation.
Existing focus seems to be on users being ready for the future and some not-yet-existent future technology rather than existing technology being ready for clear and existing use cases. So, it’s more of a marketing pitch for the most part. Still, some are likely to be confused about it, so it is worth pointing out in detail what it is not.
- **Are current quantum computers really Quantum Ready?**
- **Current quantum computers are not really Quantum Ready.**
- **Are current quantum computers really ready for Quantum Ready?**
- **What are the criteria for a quantum computer to be Quantum Ready?** Moving beyond the stage of being a mere laboratory curiosity. See *When Will Quantum Computing Advance Beyond Mere Laboratory Curiosity?*
- **For what personas and use cases are current quantum computers Quantum Ready?** Researchers. The Lunatic Fringe — they’re ready to try anything, regardless of how ready it might be for practical applications. See *When Will Quantum Computing Be Ready to Move Beyond the Lunatic Fringe?*
- **Which personas are most ripe for Quantum Ready?** Researchers. The Lunatic Fringe — they’re ready to try anything, regardless of how ready it might be for practical applications. See *When Will Quantum Computing Be Ready to Move Beyond the Lunatic Fringe?*
- **What are the stages of quantum readiness for the various personas and use cases?**
- **Will understanding of current quantum computers really help that much if and when the technology evolves substantially over the next five to ten years?** I suspect not. Maybe 30% of the subject matter will still… matter and be relevant.
- **Might most organizations be much better off by waiting for The ENIAC Moment, if not The FORTRAN Moment, before even dipping a toe into the waters of quantum computing?** Might The ENIAC Moment and The FORTRAN Moment be better standards to judge the onset of the Quantum Ready era?
Unless of course an organization has one or more teams of Lunatic Fringe personas who can handle any and all technologies no matter how difficult and undeveloped.
- **What are appropriate demonstration projects for quantum computing?**
- **What are the advantages of quantum computing?** Only two: the exponential speedup of quantum parallelism, and probabilistic computation rather than strict determinism (true random numbers as an intrinsic feature rather than requiring an external source of entropy.)
- **Where is quantum computing as of *<today>*?** Status update every few months. Not all the gory details, but just an overall report card for progress towards mainstream adoption.
- **Is quantum computing still at the stage of basic research, experimentation, and waiting?** IOW, not close to being ready for mainstream adoption? Is it even ready for the Lunatic Fringe yet? We’ll be at this stage for the foreseeable future — research and experimentation will advance, but “Waiting for Quantum” will be the M.O. for most organizations.
- **What’s next for quantum computing as of *<today>*?** Companion to “*Where is quantum computing…*” Focus on a 6–12 month outlook. Update several times a year, as advances occur.
- **How close to the ENIAC moment are we as of *<today>*?**
- **How close to the FORTRAN moment are we as of *<today>*?**
- **Are quantum computers still only usable by the Lunatic Fringe as of *<today>*?**
- **What big breakthrough would open up the floodgates for quantum computing?**
- **What is the maximal number of qubits which can be simulated using classical processors?** Somewhere in the 48 to 53 range?
- **What is a quantum computer?** Short and sweet, but very, very precise. Still too many areas which are more than a bit foggy. Of one thing I am certain: nobody needs another puff piece on quantum computing. The difficulty is that it’s problematic to reach many different audiences with one paper.
- **What is quantum computing?** Both hardware and software.
Lots of superficial and misleading puff pieces out there. See *Why am I still unable to write a brief introduction to quantum computing?* What is a quantum computer? How does quantum computing fit in with classical computing? What is a… Quantum application, Quantum program, Quantum circuit. Measurement. Quantum Fourier transform. Quantum phase estimation. Quantum parallelism. Interference. Phase. Entanglement. Probability amplitude. Probability. Superposition. Unitary transforms. Quantum logic gates. Rotations. Bloch sphere. Basis states. Qubits. Various formats… One paragraph. One-pager. Two-pager. Four-pager. 10-pager. 20-pager. 50–75 page mini-book. 100-page mini-book. Brief glossary (or glossaries, plural) — 10 most essential terms — to drop at cocktail parties, 25 terms, 50 terms, 100 terms, 250 terms. Torn between describing quantum computing as it exists today versus a vision of what it will eventually be like once it becomes useful, supports production-scale applications, and achieves dramatic quantum advantage.**What is quantum computing?**Ditto, but the software more than the hardware. Lots of superficial and misleading puff pieces out there.**Introduction to quantum computing.**Alternate title.**What are the basics of quantum computing?**Not a puff piece, but solid information. How does quantum parallelism actually work — how do you use it?**Hello World program for quantum computing.**No clue what it should really look like. A single X or H gate might suffice, but my inclination is that it should be the simplest program which shows a practical example of quantum parallelism with results that most people can understand. Something functional, such as a 2, 3, and 4-qubit QFT. Something that most people can relate to and say “*Wow, that’s cool!*” Should attempt to show quantum advantage, but that could take 50 qubits or more.
Maybe a graduated collection of Hello World programs is best.**Need 4–6 introductory algorithms that have obvious practical application to replace the current introductory algorithms which do not have obvious practical application.**Hello World quantum program could be one of them, or incorporate one of them. And probably at least one quantum program which incorporates at least two if not three of the algorithms. Focus on algorithmic building blocks.**What are the 5–100 concepts which must be understood to understand the power of quantum computing?**Unclear how many or even what the minimal list is.**What are the 500 to 1,000 concepts which must be mastered to master quantum computing?**Where does it begin? And where does it end?**What is quantum computation?**The terminology can get confusing. What does the hardware (and firmware) do vs. what can be accomplished using a quantum circuit.**What does a quantum algorithm look like?****What does a quantum program look like?****What does a quantum application look like?****What is a quantum circuit?**No, it’s not hardware.**What is hybrid quantum computing?****What is quantum parallelism?**This is probably the single most important topic. What exactly is computed in parallel? How is the result measured? Is statistical repetition of the calculation required to tease out the range of possible solutions to be evaluated classically? Ditto for Shor’s factoring algorithm and calculating order — repeat the same exact quantum calculation k times and examine the distribution?**What is quantum mechanics?****What is quantum communication?**The world of Bob and Alice.**No, quantum computing and quantum communication are not the same thing.****What is quantum networking?**Doesn’t exist yet, but hypothetically.**What is quantum information science?**Quantum computing and quantum communication, and eventually quantum networking. And quantum metrology and quantum sensing. 
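One candidate shape for the Hello World program discussed above: a single qubit put into equal superposition by a Hadamard gate and then measured many times, yielding a roughly 50/50 mix of 0s and 1s. A minimal sketch using a hand-rolled sampler rather than any vendor SDK (the function name is mine, and the Born rule probability is computed analytically rather than by matrix simulation):

```python
import random
from math import sqrt

def hadamard_hello_world(shots: int = 1000, seed: int = 42) -> dict:
    """After H, the qubit state is (|0> + |1>)/sqrt(2), so each
    measurement returns 0 or 1 with probability 0.5 (Born rule:
    probability = |amplitude|^2)."""
    amp0 = amp1 = 1 / sqrt(2)
    p0 = amp0 * amp0
    rng = random.Random(seed)  # fixed seed for reproducible counts
    counts = {"0": 0, "1": 0}
    for _ in range(shots):
        counts["0" if rng.random() < p0 else "1"] += 1
    return counts

print(hadamard_hello_world())  # roughly 500 of each
```

This is arguably too trivial to elicit “Wow, that’s cool!”, which is why a graduated collection, ending with a small QFT, may indeed be the better answer.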
And who knows what else in the more distant future.**What is quantum information?**Tricky. Lots of hype. Qubits vs. quantum state vs. subset of quantum state. No definitive definition. Somewhere between qubits and quantum state. Information vs. representation. Are probabilities “information” per se? Is phase “information” per se? Does phase represent separate information from the information implied by probability amplitude?**Quantum information theory.**New concept, or at least a new term. No clear definition. See *From Classical to Quantum Shannon Theory* by Wilde.**Does it make sense to contemplate quantum storage?**Hypothetically. Qubits are quantum “storage” technically, but they can’t be read, copied, or moved around freely as with bits in classical storage. Still, an interesting topic to contemplate for the more distant future.**What is a qubit?**Seems obvious, but there is plenty of nuance. A treatment is needed that is functionally complete and easy to understand. In plain language.**What is quantum state?****What can I do (and not do) with a qubit?****How does a qubit work?**Partially independent of physical technology, but partially dependent on the particular physical technology.**Is a qubit comparable to the transistor of classical computing?****A qubit is comparable to a flip flop in classical computing.**It can hold one unit of information (quantum state) and operate on it.**A qubit and a bit are not comparable — one is a hardware device and the other is information.****A qubit is a storage and processing device for quantum state rather than being information per se.****What is a quantum gate (or quantum logic gate)?**Not hardware; comparable to a classical software operation or instruction.**What is a quantum circuit?**A sequence of quantum logic gates; a quantum program.**What is a quantum program?**Basically the same as a quantum circuit, although a quantum circuit could be embedded in a larger quantum circuit — a fraction or portion of the larger quantum
circuit, while a quantum program is the totality of what a quantum computer is given to execute, with measurements of qubits to be returned as “results.”**What is measurement?**Capture the final state of a qubit upon completion of a quantum program, to be passed back to the classical application which invoked the quantum program. Captures either a binary 0 or a binary 1 even if the qubit has some more complex quantum state.**What is a quantum application?**A classical application which utilizes one or more quantum programs (circuits.)**What constitutes a “working” quantum computer?**Such as a minimum number of qubits, minimum coherence time, and degree of connectivity. This is an evolving standard — at one time even a single “working” qubit was a super-big deal. Or even 5 or 8 qubits. Now, is 53 really enough to “work” on real-size real-world problems? 128? 256? Or is coherence time the main limiting factor? Or connectivity? Still, the term gets used and thrown around far too casually, begging for clarification.**What is linear algebra?**What does it accomplish for us? What was the original source for linear algebra? What was the original motivation for linear algebra?**How much physics do you need to know to understand the basics of quantum computing?****How much physics do you need to know to master quantum computing?****How much quantum mechanics do you need to know to understand the basics of quantum computing?****How much quantum mechanics do you need to know to master quantum computing?****What is bra-ket notation?**How much of it is relevant to quantum computing?**Three interpretations of Schrödinger’s cat.**1) ignorance — the cat is either alive or dead, but the observer just doesn’t know until the box is opened, 2) dead, alive, or simultaneously dead and alive, and 3) dead, alive, or rapidly alternating or oscillating between dead and alive with a probability distribution. Which interpretation applies to quantum mechanics and quantum computing? 
And likely none are absolutely correct. The original model does not take into account asymmetric probability amplitudes. Applying real-world examples to the world of quantum mechanics may never be exactly correct. Many-Worlds Interpretation (MWI) as a fourth interpretation? Is the state of Schrödinger’s cat an underlying physical phenomenological quality or simply an artifact of our modeling of the phenomenon?**What are the key concepts of quantum mechanics?****How much linear algebra do you need to know to understand the basics of quantum computing?****How much linear algebra do you need to know to master quantum computing?****How much number theory do you need to know to understand advanced quantum algorithms?**Such as those using order-finding (period-finding), including Shor’s factoring algorithm.**What are eigenstates, eigenvectors, and eigenvalues?****What are the eigenvalues for a qubit?**The basis states or the probability amplitudes? Or is an eigenvector an eigenvalue plus an amplitude?**What are the eigenvalues for entangled qubits?**The computational basis states or the probability amplitudes?**What is probability amplitude?****What are pure states and mixed states?**And density matrix and density operator. Do they relate to superposition, entanglement, or both? What’s the point? Why should we care? What are they good for? How do we use them? FWIW, I rarely see mention of them. What are the proper terms to use when the probability amplitudes of the basis states of a single, unentangled qubit are 0.0 and 1.0 vs. neither 0.0 nor 1.0 — pure and mixed states, or some other terms? See *Introduction to Quantum Information Science Lecture Notes* by Scott Aaronson.**How do you debug a quantum application?****How effective are quantum simulators for debugging quantum circuits?**Are quantum simulators the answer for most common debugging issues?**Trick for debugging on a real quantum computer.**Already described in *The Greatest Challenges for Quantum Computing Are Hardware and Algorithms*. Can effectively single step by executing only the first k gates and then measuring, then reset and execute the first k+1 gates and measure again, rinse and repeat. Rerun each k steps some number of times to get a statistically valid sample of how stable or distributed qubit values are at each step. Can apply the same technique to both physical and simulated quantum computers, although with a simulator the quantum state could be examined without collapse, so reruns would not be needed, although repetitions are still needed for statistically valid results. Issue: As described, this doesn’t allow capture of phase — phase estimation could be used after each debug step, but that wouldn’t capture both probability amplitude and phase at the same time, although maybe capturing both separately might generally be close enough for many or even most situations.**What criteria should be used to judge the quality of quantum code?****How should quantum code be commented?****What is quantum computational chemistry?**Sometimes just *quantum chemistry* or *computational chemistry*, in contrast to *classical computational chemistry*. Read *Quantum computational chemistry* by McArdle, et al. for a more in-depth treatment.**Glossary for quantum computational chemistry.****What is VQE?****What is a variational quantum eigensolver?****What is quantum simulation?****What is the difference between quantum simulation and a quantum simulator?**The former is a simulation of quantum physics on a quantum computer, while the latter is a simulation of a quantum computer and quantum circuit on a classical computer. Still, people get confused and conflate them.
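The debugging trick above — run only the first k gates, then look — is easy to demonstrate on a classical simulator, where the state can be inspected without collapse. A minimal single-qubit sketch with hand-rolled gate matrices (on real hardware you would measure and rerun for each k instead; all names here are mine):

```python
from math import sqrt

# Single-qubit gates as 2x2 matrices.
H = [[1 / sqrt(2), 1 / sqrt(2)], [1 / sqrt(2), -1 / sqrt(2)]]
X = [[0.0, 1.0], [1.0, 0.0]]

def apply(gate, state):
    """Multiply a 2x2 gate matrix into a 2-element state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

def run_first_k(gates, k):
    """Execute only the first k gates of the circuit, starting from |0>,
    and return the state vector for inspection (no collapse needed here)."""
    state = [1.0, 0.0]
    for gate in gates[:k]:
        state = apply(gate, state)
    return state

circuit = [X, H]
# After k=1 gates (just X): state is |1>.
# After k=2 gates (X then H): state is (|0> - |1>)/sqrt(2).
```

On a physical machine the loop over k would instead submit k-gate prefixes of the circuit, each run many times, exactly as the text describes.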
More recently, I’ve taken to writing *classical quantum simulator* to emphasize that the simulation of a quantum computer is running on a classical computer.**What are the key features of a quantum simulation?****What criteria must an application meet to be appropriate for a quantum computer?**Probabilistic results are okay. Approximate results are okay. The application problem must be reducible to a problem in physics. Can’t require complex logic with conditionals, looping, functions with parameters, or rich data types. Must utilize quantum parallelism with a register of substantially more than fifty qubits under a Hadamard transform to achieve dramatic quantum advantage. Only dramatic quantum advantage justifies the use of a quantum computer.**When is and when isn’t quantum computing appropriate for an application?****What is computational diversity?**The use of a variety of types of computing hardware — classical digital processors, both basic and high-end, multiple classical processors, supercomputers with a large number of processors, analog signal processing, GPUs, FPGAs, custom hardware, and finally, quantum computers.**How many qubits will be needed before we see a significant quantum application?**To reach The ENIAC Moment.**What is a quantum resource?**Superposition, entanglement, interference, quantum parallelism.
See *What Are Quantum Effects and How Do They Enable Quantum Information Science?***What benefits does quantum superposition provide?****What’s really going on with superposition?**How does it really work, under the hood?**What benefits does quantum entanglement provide?****What is the precise phenomenological mechanism for entanglement?**How does it really work, under the hood?**What benefits does quantum interference provide?****How does quantum interference really work, under the hood?**What is the underlying physics?**What are the theoretical limits of quantum computing?****What are the practical limits of quantum computing?****How quantum computing has given me a much deeper appreciation of the fantastic intellectual power of classical computing.**See the alternate title, with details: *My journey into quantum computing has given me a newfound appreciation for the incredible intellectual power of classical computing*.**Will quantum Fourier transforms work for a large number of qubits?**See Shor’s factoring algorithm. Banding or approximate QFTs work, sort of, but will they have enough precision for applications needing a fairly deterministic result, such as factoring of large numbers with Shor’s factoring algorithm?**What qualities are needed for a true quantum programming language?****What are the criteria for a quantum high-level language?**Some interesting high-level abstractions which can automatically be translated or compiled into raw quantum circuits. Ability of the compiler to optimize across those high-level abstractions, and even to optimize globally. The abstractions should be semantically rich so that many common mistakes or misuses can be detected and reported by the compiler.
Some sort of quantum data types are needed, which can be compiled into qubits.**What is the programming model for quantum computing?**Needs to be much richer than simply rotations of the Bloch sphere and CNOT.**What criteria must a programming model meet to be suitable for quantum computing?****Is quantum computing a model which only a physicist could love?****Is D-Wave a true general-purpose quantum computer?****When is D-Wave a better choice than a gate-based quantum computer?****What is a gate-based quantum computer?****What is NISQ?**How noisy?**What are NISQ and FTQC devices?**How fault-tolerant?**Do we really need quantum error correction (QEC)?**Or can we just ride the wave of steadily improving qubit quality?**Is there a happy medium between NISQ and FTQC devices?**See NPISQ — Near-Perfect Intermediate-Scale Quantum devices.**What is quantum error mitigation?****Is nearest-neighbor connectivity a major limitation, just an annoyance, or a non-problem?****Hype about quantum computing.**Most of what I write is intended to dispel hype about quantum computing as it is. Is all hype harmful? Is any hype beneficial?**Is hype about quantum computing deterring meaningful progress?****To what extent can we project quantum computing based on the historical trajectory of classical computing?****Parallels to the evolution of computers in the 1940’s, 1950’s, 1960’s, 1970’s, and 1980’s.**Dramatic changes, sometimes leaps, sometimes gradual incremental advances. Significant changes in underlying technologies.**Parallels to the maturing of 1940’s computing.**Still some relays. Ascendancy of vacuum tubes. Switch from decimal to binary. Invention but not use of transistors. Physical size.
But… more amenable to traditional math and algorithms, and Turing machines.**What does the future hold for quantum computing?****Need for much-higher-performance and much-higher-accuracy quantum computer simulators.**Possibly even using massively parallel classical supercomputers, with thousands of processors, to get as far as we can, especially until we have more powerful quantum computers with enough qubits and long enough coherence. Possibly using GPUs and FPGAs, or even full-custom hardware.**Are we on the verge of entering a dark age for quantum computing where competitive companies and secretive government agencies which have a proprietary or security interest in secrecy will be reluctant to publicly and transparently publish their quantum accomplishments?**Both hardware and algorithms. Or, they may publish *some* of their work, but *sanitized* to hide the most significant work.**Quantum computer as a coprocessor.**Much of the processing for a typical application — or even something such as Shor’s factoring algorithm — must be performed on a classical computer, with a quantum circuit (or quantum program) being simply a “subroutine” called in the middle of overall processing.**Is a quantum computer merely a quantum calculator?**Is a quantum computer really a computer or simply a calculator? The coprocessor model. Merely a single block of code rather than a full program or application, with no control structures, data structures, nested function calls, rich data types, I/O, database access, or network access. Separate paper for the quantum coprocessor? Hybrid computation — with a quantum subroutine, the quantum computer doesn’t perform the full computation, only a portion, a “calculation”, as in the variational quantum eigensolver (VQE) method. What are the key differentiators between a computer and a calculator?
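The coprocessor model above — a classical driver calling a quantum “calculation” in an inner loop, as in VQE — can be sketched in a few lines. The quantum subroutine here is a stand-in that returns the expectation value ⟨Z⟩ = cos(θ) for the state Ry(θ)|0⟩ analytically; on real hardware that one number would come from many shots on the QPU. All names are illustrative, not any vendor’s API:

```python
from math import cos, pi

def quantum_subroutine(theta: float) -> float:
    """Stand-in for the quantum coprocessor call: for Ry(theta)|0>,
    the expectation value <Z> is cos(theta)."""
    return cos(theta)

def hybrid_minimize(steps: int = 100) -> float:
    """Classical driver loop (VQE-style): sweep the circuit parameter and
    keep the value minimizing the 'energy' returned by the quantum call.
    Everything except quantum_subroutine() runs on the classical side."""
    best_theta, best_energy = 0.0, quantum_subroutine(0.0)
    for i in range(1, steps + 1):
        theta = 2 * pi * i / steps
        energy = quantum_subroutine(theta)
        if energy < best_energy:
            best_theta, best_energy = theta, energy
    return best_theta
```

The minimum of cos θ is at θ = π, which the sweep finds — the quantum device contributes only the repeated single-number “calculation”, underscoring the calculator-vs-computer distinction.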
Such as: both perform basic operations and logic — conditional execution, loops, iteration, nested functions with parameters, input and output during processing, and persistent storage shared between runs of the same or different programs. A full Turing machine? What defines that?**What does it mean to have a quantum computer in the cloud?****What potentially new applications might quantum computing enable or outright create, rather than existing applications it can be applied to?**Of course, who could possibly know in advance? Consider it as basic research. Apply this principle to algorithms as well as machines and applications. Who knew what applications the telephone, TV, or the Internet would eventually enable — or classical computers in the early 1940’s?**Characteristics of algorithms to document.**Qubit requirements. Connectivity requirements — is nearest-neighbor enough; big-O for swaps needed based on input size. Coherence needed — big-O for gates based on input size.**Algorithmic building blocks, design patterns, quantum circuit libraries, and quantum application frameworks.**Building blocks for quantum programs. More semantically meaningful than raw, low-level quantum logic gates.**What are the most important algorithmic building blocks for quantum applications?****What are the physical concepts of quantum computing?**But abstracted away from particular implementations. And maybe separately link from each physical abstraction to each concrete physical conception.**What is a computational basis state?****What is a computational basis?****How is each distinct computational basis state of an n-qubit ensemble represented physically?**Using energy? Or some other physical phenomenon? If n qubits subjected to n Hadamard gates have 2^n computational basis states, how are all of those computational basis states represented, physically? If by energy, that’s a huge amount of energy.
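On the basis-state question above: in the standard state-vector formalism, at least, the 2^n computational basis states are just the index labels 0…2^n−1 of the amplitude vector. After a Hadamard on each of n qubits, every label carries amplitude 1/√(2^n) — a sketch (function name is mine):

```python
from math import sqrt

def uniform_superposition(n_qubits: int) -> dict:
    """State after applying H to each of n qubits starting from |0...0>:
    all 2**n computational basis states share amplitude 1/sqrt(2**n).
    The basis states are just bitstring labels for vector indices."""
    dim = 2 ** n_qubits
    amplitude = 1 / sqrt(dim)
    return {format(i, "0{}b".format(n_qubits)): amplitude for i in range(dim)}

state = uniform_superposition(3)
# 8 labels '000'..'111', each with amplitude ~0.3536; probabilities sum to 1.
```

Whether those labels correspond to anything physically separate (energy or otherwise), or are purely bookkeeping in the model, is exactly the open question posed in the text.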
Or, are computational basis states merely a bookkeeping fiction, and if so, what do they really stand in for, physically?**What is a Hilbert space?**Does the average user need to know? If so, what exactly do they need to know?**What can and can’t the Bloch sphere tell you about quantum computing?****What are the merits and limits of the various technologies for implementing quantum computers?****How much can a layperson understand about quantum computing?****Need for algorithm design guidelines.**What factors to keep in mind when designing quantum algorithms. And guidelines for criteria for reviewing quantum algorithms.**Need for guidelines for problem decomposition to develop a quantum algorithm.**What opportunities to look for to exploit the capabilities of quantum computing, primarily quantum parallelism.**Design and development process and tasks for quantum algorithms.****How can we understand quantum computing when there are no classical analogs?**Actually, we have plenty of analogs from the real world that don’t fit cleanly into classical computing. And we do implement many of them using classical computing, just not very efficiently.**Quantum supremacy for one application does not imply quantum supremacy for any other applications.****Will any of the current crop of quantum computing hardware technologies be the one which achieves broad quantum supremacy, across a wide range of practical applications, or has that ultimate hardware technology not yet been invented?****When can we expect to see quantum supremacy for a practical problem?**Business, science, engineering, finance.**What is the smallest quantum computer which will solve a practical problem?**Number of qubits, length of coherence. May imply quantum supremacy. Or maybe simply easier to implement than a classical solution. 
Maybe a classical solution is available, but requires hundreds or thousands of processors.**What is a quantum Fourier transform (QFT)?**For that matter, what is a Fourier transform (or discrete Fourier transform)? Why does it matter, and when can and should a QFT be used? How does an algorithm have to be designed or adapted to utilize a QFT? What are the limits?**What is a resonator?**Or coupler. What specific functions does it perform?**How is a qubit read?**Measured.**How are two qubits entangled?****What is quantum coherence?****What is quantum decoherence?****How reliable do qubits need to be to solve a wide range of practical problems?****What is the minimum quantum coherence of qubits needed to solve a wide range of practical problems?****Need for a free, online, interactive quantum computer simulator.**And it would be nice to integrate that with the option to run on a real, physical quantum computer as well, with the same user interface. There actually are some.**How good are quantum simulators?****How fast are quantum simulators?****What does it mean that quantum computing is probabilistic rather than deterministic?****How many repetitions of a quantum circuit are needed to assure that an accurate result is captured?**What formula or rule of thumb to use to calculate it? See *Shots and Circuit Repetitions: Developing the Expectation Value for Results from a Quantum Computer*.**What is computational complexity and Big-O?**Or algorithmic complexity. See *What Is Algorithmic Complexity (or Computational Complexity) and Big-O Notation?***What are BQP and QMA and which is better and why?**Bounded-error quantum polynomial complexity class, quantum Merlin-Arthur complexity class. BQP vs. QMA == P vs. NP. P and BQP are “efficient” while QMA and NP are not — polynomial vs.
exponential (or worse.)**Complexity classes.**Demystify the complexity of complexity!**Quantum computing for technical managers.**Not the details of quantum algorithms, but enough to facilitate the management of technical teams who are deep in the details.**Quantum computing for non-technical managers.****Quantum computing for technical executives.****Quantum computing for non-technical executives.****Quantum computing for IT staff.**They’re not deep into the details of quantum circuits, but they need to plan, deploy, support, and maintain infrastructure to support quantum computing.**Quantum computing for senior classical algorithm designers.**What conceptual framework do they need to change their mindset?**Quantum computing for entry-level quantum software engineers.**They are free of the baggage of classical computing (for the most part), and lack an emotional commitment to it. Quantum computing from scratch. Very clean. Devoid of casting quantum computing in terms of classical computing — no “It’s like a classical 0 and 1, but…” Quantum natives.**Quantum computing for technical journalists.**Definitely a need for this!**Quantum computing for non-technical journalists.**Definitely a need for this! But it may be a lost cause. Maybe just treat quantum as a black box and vaguely focus on summarizing benefits in nontechnical terms. But stay away from hype. If anything, try to counter the hype — moderate wild promises.**Quantum computing for policymakers.**Government types. Definitely a need for this!**Budgeting for quantum computing.**Especially for moving from the experimental stage to production-scale operations. What do managers and executives need to know?
Free prototyping does not translate into free production operation.**What is a unitary transformation?**Or unitary transform, or unitary matrix.**Relationship between quantum logic gates and unitary transformations.****Which unitary transformations are permitted and which are prohibited?****Why must a quantum computation be reversible?**Derived from quantum mechanics. Derived from the laws of physics? Strict, hard determinism?**Why are complex numbers needed?****What is phase?**See *What Are Quantum Effects and How Do They Enable Quantum Information Science?***Why must phase be estimated rather than directly measured?****What is phase estimation?**And how to use it. When might this be feasible?**What are the limits to phase estimation?**The number of distinct phases and the minimum difference between any two phases.**When will quantum phase estimation be practical?**How many qubits, what circuit depth, and what coherence time will be needed to get various levels of precision? How much, if any, can we do with today’s hardware (18–20 qubits and limited coherence time)? Will 53 qubits enable a useful degree of quantum phase estimation? If not, how many qubits and how much coherence time will be needed to get interesting levels of precision?**What is order-finding?**And how to use it.**What is period-finding?**And how to use it.**What is the difference between order-finding and period-finding?**None that I can discern, but there may be some nuance that has escaped my comprehension.**What is amplitude amplification?**And how to use it.**What is a wave function?****What does it mean for a wave function to collapse?****What can quantum tomography tell us about what is happening to the quantum states of the qubits in a quantum computer while and after a quantum circuit is executed?****How does a SWAP gate work, especially in light of the no-cloning theorem?**How does SWAP get around the no-cloning theorem?
Background: *Breaking Down the Quantum Swap*.**How can SWAP and “routing” be used to overcome limited connectivity between qubits?**Background: *Breaking Down the Quantum Swap*.**What is the no-cloning theorem?**Why does it matter? Does it really impact much at all?**Why can’t I set a qubit to a specific quantum state?**Unless you know its current state already. Can only rotate relatively, not set to a specific angle of rotation. Technically, one could set an ancillary qubit and then swap it with the desired qubit. IBM is adding a qubit reset feature.**How to achieve explainable quantum computing.**As with AI. Unfortunately, that could be a real challenge when using quantum parallelism and probabilistic computing.**How do you create a quantum program?****Using classical code to generate quantum circuits.**Commonly with Python libraries.**Using templates to generate quantum circuits.****Using application frameworks to generate quantum circuits.****Need to highlight the quantum parallelism portion of every algorithm.**What gives it an advantage over a functionally equivalent classical program?**When might we expect to see the first universal quantum computer — merging quantum gates and classical instructions?****How complex can a quantum program be?**Maybe that’s the same as total gate count, or maybe some gates are more expensive. Graph complexity as well. And many gates could be executed in parallel, presuming that the firmware supports execution of multiple gates in parallel.**Is quantum computing required to achieve artificial general intelligence?**The answer may be yes, but for now it is not known, or at least unproven.**What are the potential and prospects for photonic quantum computing?**What is Xanadu really up to, and who else might be pursuing similar approaches?**What’s the simplest quantum computer possible?**Just for curiosity.**What is post-quantum cryptography?**AKA quantum-safe cryptography or quantum-resistant cryptography.
Is it needed — will Shor’s algorithm work any time in the next 25 years? And when is it needed? What will it cost? NIST has ongoing efforts on this front. What’s available today? What will be available in 2–3 years, 5 years, 7 years? What should people be targeting for implementation?**When will Shor’s factoring algorithm be able to break strong encryption?****Suggested milestones for judging progress on implementing Shor’s algorithm for cracking public key encryption.****Will Shor’s factoring algorithm really work for very large public keys?**What are the factors working against an effective solution? How many hardware advances will be required?**Need for a rewrite of Shor’s algorithm that is more complete and provides full justification of all details.**See my list of issues with Shor’s algorithm.**Need a baseline implementation of Shor’s algorithm.**There can be and are many derivative algorithms, implementations, and improvements, but there should be a single agreed-upon starting point even if suboptimal. A baseline for technical comparison.**Need a baseline classical implementation of Shor’s algorithm.**Something to compare against, at least for smaller numbers. Maybe up to 16 or even 32 bits? Simply code the quantum order-finding subroutine in classical code. Three levels of implementation: 1) simple, naive, single-processor, trying each order candidate sequentially, 2) multi-processor, trying each candidate order in parallel, 3) parallel multi-processor computations for a batch of trial random numbers in parallel.**Citation of Shor’s algorithm considered harmful.**There are no practical implementations of the original algorithm, as written, so there is no point in citing it as an example of a practical quantum algorithm — and this is not likely to change any time soon. It’s very misleading to speak of Shor’s algorithm for factoring large semiprime numbers as having been “demonstrated” or to imply that it is feasible or practical for relatively large numbers.
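A classical baseline along the lines suggested above — the simple, naive, single-processor level — just brute-forces the order r of a modulo N and then applies Shor’s classical reduction. A sketch for small numbers (function names are mine):

```python
from math import gcd

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r mod n == 1 -- the step Shor's quantum
    subroutine performs efficiently; here brute-forced classically."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_baseline(n: int, a: int):
    """Shor's reduction: if the order r of a is even and a**(r/2) != -1
    mod n, then gcd(a**(r/2) +/- 1, n) yields nontrivial factors of n."""
    if gcd(a, n) != 1:
        return tuple(sorted((gcd(a, n), n // gcd(a, n))))  # lucky guess
    r = find_order(a, n)
    if r % 2 != 0:
        raise ValueError("odd order; retry with a different a")
    x = pow(a, r // 2, n)
    if x == n - 1:
        raise ValueError("a**(r/2) == -1 mod n; retry with a different a")
    return tuple(sorted((gcd(x - 1, n), gcd(x + 1, n))))

# Example: factoring 15 with a = 7 (order 4) gives the factors (3, 5).
```

Since the brute-force order search is exponential in the bit length, this stays feasible only for the small (16 to 32 bit) range the text suggests — which is precisely the point of a baseline for comparison.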
It adds no real value to any published paper to cite Shor’s algorithm. At best, it’s a mere distraction. At worst, it detracts attention from more worthy near-term pursuits.
- **In what areas is research still required for quantum computing?** Plenty. Like all areas. We need better hardware, more hardware choices, and better approaches and tools for algorithm design.
- **How much additional money should be pumped into quantum computing?** And in what areas would additional money really make a difference? Focus primarily on research, both hardware and algorithms.
- **How much additional basic research on quantum computing should the federal government itself fund?** As opposed to the commercial sector. And in what areas? Which areas benefit more from federal government funding than commercial sector funding?
- **GitHub as the repository of record for quantum algorithms, circuits, and applications.** Include configuration data as well as at least a few sets of sample input data and sample results — to facilitate tests for reproducibility. Include minimal documentation as well, and link to any relevant published papers (preferably on arXiv.org).
- **arXiv.org as the repository of record for preprints of any and all formal papers related to quantum computing.** Hiding the full text of papers behind paywalls is simply not acceptable.
- **Which aspects of artificial intelligence (AI) can benefit significantly from quantum computing, and which are not likely to get any significant benefit?** Significance must be measured as true, dramatic quantum advantage.
- **Could video games benefit significantly from quantum computing?** Just curious.
- **Could high-resolution image and video processing benefit significantly from quantum computing?** May depend on the availability of much higher qubit counts.
- **Could audio processing benefit significantly from quantum computing?**
- **Could quantum computing enable 3-D interactive video?**
- **What are the GHZ and W quantum states, and how can they be exploited?**
- **Need for open source and full transparency for all libraries used to build quantum applications.**
- **What quantum computing advances are covered by intellectual property restrictions?**
- **Might intellectual property (IP — patents) deter rapid progress in quantum computing, or might IP incentivize progress in alternative technologies to get around restrictive IP policies?**
- **Do we need open source designs for quantum computers?**
- **Need for open source and transparency for the software, firmware, and control logic for quantum computers.** The digital logic, firmware, and software which directly controls qubits — maps quantum logic gates and unitary transforms to qubit control signals (laser, microwave, flux bias, etc.). All math formulas and equations, and all critical constants and tunable settings.
- **Big Data and quantum computing.** Need to “chunk” data and “stitch” results, even for D-Wave.
- **No, quantum computing doesn’t magically solve all Big Data problems.** Or any of them for that matter. A quantum computer can only work with a very limited amount of input data, which must be encoded in the gate structure of the quantum circuit, and can only produce a very limited amount of output data, limited to one classical bit for every qubit. See *Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing*.
- **I/O, database access, and network access for quantum computing.** There is no I/O, database access, or network access from a quantum circuit. Need preprocessing and post-processing to feed relatively small chunks of data into a quantum circuit and then post-process a relatively small number of classical bits of output. The quantum computer is used as a coprocessor.
- **Might quantum-only computing be meaningful?** Purely speculative. May be irrelevant if we achieve a universal quantum computer which merges both quantum and classical computing.
- **Quantum-inspired algorithms for classical computers.** Especially for massively parallel systems (dozens, hundreds, thousands of processors) and distributed systems. Essentially a stop-gap until we get to large-scale quantum computers (hundreds or thousands of qubits, or even 50–75 qubits), but may still have significant value for algorithms which cannot be cleanly mapped to pure quantum algorithms.
- **What is quantum-inspired computing?** Design algorithms and applications based on what’s optimal for quantum computers — the starting point — but also to run reasonably well on classical computers, especially parallel and massively distributed systems. And segue well to true quantum computers. Design patterns for algorithms so that they can be “compiled” to run optimally on native classical computers, and also run directly on real quantum computers.
- **Can quantum computers ever completely replace classical computers?** Maybe when we achieve universal quantum computers which are a deeply-integrated hybrid of quantum and classical operations and data.
- **How might quantum computing fit in with Kurzweil’s Singularity?** Pure speculation, but interesting nonetheless. Which might occur first? Are they definitely interrelated, or categorically distinct? Related, somehow. I just don’t ever recall Kurzweil discussing quantum computing or quantum information science as core to his Singularity.
- **What are the advantages of a trapped-ion quantum computer?** Pros and cons. Any-to-any connectivity. Longer qubit coherence. Slower gate execution.
- **How many different technologies are there for implementing quantum computers?**
- **Can qubits have more than two states?** Asymmetric probability amplitudes and phase. Qutrits and qudits as well. Also the *continuous-variable model* (CV) qumodes of Xanadu’s photonic quantum computing.
- **What datatypes are supported by quantum computers?** Just raw, individual qubits — anything else is purely interpreted by the application.
- **What programming languages are supported by quantum computers?** None really. It’s all raw machine language. A wide range of classical programming languages can be used to construct a quantum circuit, one quantum logic gate at a time, which is then downloaded to an actual quantum computer (or a quantum computer simulator) for execution and to retrieve the results.
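That gate-at-a-time construction can be demonstrated without any quantum SDK at all. The following sketch (plain Python, with hypothetical helper names of my own) builds a two-qubit Bell state by applying a Hadamard gate and then a CNOT to a simulated statevector:

```python
import math

# Minimal 2-qubit statevector "machine": a list of 4 complex amplitudes,
# indexed by the basis state |q1 q0>. Illustrative sketch, not a vendor API.

def apply_h(state, qubit):
    """Apply a Hadamard gate to the given qubit (0 or 1)."""
    s = 1 / math.sqrt(2)
    new = state[:]
    for i in range(4):
        if not (i >> qubit) & 1:      # i has the qubit = 0; its partner has it = 1
            j = i | (1 << qubit)
            a, b = state[i], state[j]
            new[i] = s * (a + b)
            new[j] = s * (a - b)
    return new

def apply_cnot(state, control, target):
    """Apply CNOT: swap amplitude pairs differing in the target bit
    wherever the control bit is 1."""
    new = state[:]
    for i in range(4):
        if (i >> control) & 1:
            new[i] = state[i ^ (1 << target)]
    return new

# Build a Bell state one gate at a time: H on qubit 0, then CNOT(0 -> 1).
state = [1, 0, 0, 0]                  # start in |00>
state = apply_h(state, 0)
state = apply_cnot(state, 0, 1)

probs = [abs(a) ** 2 for a in state]
print(probs)   # ~[0.5, 0, 0, 0.5]: only |00> and |11> are ever measured
```

On real hardware the same sequence of gate calls would be serialized into a circuit description and downloaded to the machine; here the “machine” is just a list of four amplitudes.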
But no high-level language code is executed on a quantum computer.
- **List of early, seminal papers for quantum computing.** Beyond Feynman’s.
- **The ethics of quantum computing.** Not clear, but it is worth pondering.
- **Software engineering for quantum computing.** Summary of the issues.
- **Proposal for quantum software engineering.** À la software engineering not being the same as programming or software development.
- **The first qubit.** When was it? And the first gate execution. And the first entanglement of two qubits. Wikipedia is a bit fuzzy on this.
- **The first quantum computer.** That depends on criteria for what constitutes a working quantum computer.
- **What was the first notable quantum computer?** IBM 5-qubit? Some earlier lab experiments? NMR-based lab experiments? D-Wave (but not general-purpose)? Rigetti 3-qubit?
- **What are the Chinese up to?** Or the Russians? Quantum communication vs. quantum computing.
- **What is the killer application for quantum computing?**
- **What did Feynman have to say about quantum computing?**
- **Need for much higher standards for documentation.** Current doc is very uneven and even incomplete. Make it easier to compare across machines.
- **Need for standards for quantum computing.** Hardware and software. All interfaces and APIs. Common libraries of circuits, algorithmic building blocks, design patterns, and frameworks. Common programming models and programming languages, both high-level and low-level. Common terminology, vocabulary, and glossaries. Common metrics for performance, timing, reliability, and coherence. All available across all, or at least multiple, families of disparate quantum computers.
- Refine and maybe even resolve my **Lingering Obstacles to My Full and Deep Understanding of Quantum Computing**.
- **Need simple plain text notation for even complex quantum states.** No Greek letters, math symbols, or other obscure notation. No complex or obscure graphics. 0 and 1 are obvious. 0/1 for superposition of 0 and 1. Etc.
- **How to generate a random decimal number using only qubit operations.** Powers of 2 are easy — n qubits give n random bit values, or 2^n possibilities. But how to generate non-power-of-2 random numbers? How to map power-of-2 random numbers to any decimal range? Dice — 6, 36. Can simulate the roll of a die with six qubits in a W state. Decimal — 10, 100, 1,000, 1,000,000. How to map n qubits to a 0.0 to 1.0 range of real numbers — parameterize by precision.
- **How much of the Quantum Algorithm Zoo has relevance to current quantum computers and simulators?** How many of the algorithms are still relevant and useful?
- **Quantum computing 2.0 — quantum computing needs a reboot.** The underlying basics (physics) may not have changed, but a more modern formulation is needed that is simultaneously algorithmically more powerful and easier to comprehend.
- **What is spin physically and how does it fit in with quantum computing?**
- **How does spin make a qubit work?**
- **How does magnetism fit in with quantum computing?**
- **How are qubits reset to zero at the start of a quantum computation?** By what physical mechanism? How long does that take compared to executing a single quantum logic gate?
- **What is quantum co-design?** Quantum co-design principles.
- **What does it mean to measure in the computational basis?**
- **What does it mean to measure in other than the computational basis?** Simply rotate qubits around the X or Y axis before measurement? Is measurement always in the Z-axis basis? But why do it — what’s the motivation?
- **How many pieces of information can a qubit represent?** Phase — the imaginary portion of the complex probability amplitude — in addition to the difference between the probability of |0> and |1>. Is that three? Or still just two? Or… what?
- **What is a universal gate set?** What constitutes it? What can and can’t you do with it? Clifford group and a single two-qubit gate (CNOT)?
- **What is a Clifford group?** What constitutes it? What can and can’t you do with it? What does the average quantum algorithm designer or quantum application developer need to know about Clifford groups? How do Clifford gates relate to Clifford groups?
- **Can we estimate the probability amplitude(s) for a qubit?** Not directly observable (measurable), but maybe some combination of rotation and phase estimation? How many repeated runs of a circuit would be needed to produce an accurate estimation?
- **Full list of the common quantum logic gates.** Maybe graphical symbols as well — or links. Link to details of the specific unitary transforms. Plain language descriptions of the gates. Not-so-common gates as well. Each quantum computer should have its own doc, but a fair amount of this doc should be common across machines.
- **What do we know about Google’s 72-qubit quantum computer?** Dead? Reduced to their 53-qubit machine? Still in process? What will come after it, and when?
- **The twin challenges of how to do basic computations and how to exploit parallelism on a quantum computer.** Even relatively simple math and algebraic expressions are not directly available on a quantum computer. And how to restructure iterative algorithms for parallel execution is a real challenge.
- **What does it mean to be Quantum Native?** As in **#QuantumNative**. Actually, it’s ambiguous — at least three if not four distinct meanings.
- **What is a Hamiltonian?** Total energy of a system, but what does that really involve and why does it matter? What do quantum algorithm designers and quantum application developers need to know about Hamiltonians?
- **What are the key performance indicators (KPI) or metrics for an algorithm?** And how do different algorithms compare?
Ditto for overall quantum applications.
- **What is the difference between code, algorithm, and circuit?**
- **Quantum algorithms as a process that produces a circuit.**
- **What is adiabatic quantum computing?**
- **What is time evolution?**
- **The importance of heuristic methods.** Even on a quantum computer, many quantum simulation problems are still exponentially hard, so clever shortcuts are needed.
- **What are T1 and T2?** And T2*. What are common, desirable, and acceptable values?
- **What is the Hartree-Fock method?** Quantum simulation of both physics and chemistry — quantum computational chemistry. When is it relevant, when can it be avoided, and how? What are its implications?
- **Career opportunities in quantum computing.** In general, and in detail, to some degree. Dovetails with personas.
- **What tasks and applications might still be beyond even quantum computing in 20 to 25 years?**
- **When will a quantum computer be able to calculate the ground-state energy of aspirin?** C9H8O4. How many qubits? How many gates? What connectivity? How many repetitions (“shots”)? If drug design is a key future for quantum computing, it seems as if aspirin is a good test as a “starting gate” or baseline. Or caffeine. Or sugar. Or salt. Or gasoline. Question of chemical accuracy. What accuracy is really desired — and needed to have a dramatic advantage over classical computers?
- **What might post-quantum computing look like?** Speculation. Just for fun. Or maybe a great science fiction story. Make use of wormholes for networking? Make use of time travel (e.g., send data into the past, compute, and then it is complete in the present, or retrieve data or results from the future)? Spiritual computing? Extrasensory perception for UX using quantum sensing? Or, might the purported Singularity technically be post-quantum computing? Maybe qubits which can do everything a neuron can? True, human-level AI?
- **How to get quantum computing beyond the mere laboratory curiosity stage?** See *When Will Quantum Computing Advance Beyond Mere Laboratory Curiosity?*
- **Is the inability to directly observe or measure probability amplitudes of qubits a fatal flaw of quantum computing?** Its Achilles heel? Can it be corrected?
- **What would Turing say about quantum computing?** Very interesting question.
- **How does quantum computing relate to a Turing machine?**
- **Would it make sense to have a multiprocessor quantum computer?** Could run multiple quantum programs simultaneously, or multiple variations (ansatze) simultaneously for variational methods. How many parallel processors? 4, 8, 16, 64, 128, 256, 1024, 4096?
- **My personal cut on Google’s quantum supremacy announcement.** Technical “cheat”, profound significance, or somewhere in the middle? What might be the first “real” (practical) application to achieve quantum supremacy? What might be the first “real” (practical) application to use more than 20–24 qubits, or say, 40 of those 53 qubits? What does Google’s feat actually allow us to do, today?
- **When did quantum computing initially earn the status of being an emerging technology?** Maybe IBM Q Experience availability? Or, has it yet?
- **When will quantum computing exit from being an emerging technology and enter the mainstream?** Break out as a standalone topic, but currently embedded in *What Is Quantum Algorithmic Breakout and When Will It Be Achieved?*
- **Checklists for documentation of algorithms, implementations, and test cases.** And a secondary checklist for implementations for documenting deviations from a published algorithm. Including what documentation should appear in a GitHub repository.
- **Is quantum computing a fad?** Or at least going through a fad stage (or stages, plural) even if eventually it will achieve practical results.
- **What problems can a quantum computer compute that a Turing machine cannot compute — at all?** True random numbers. Possibly some aspects of human intelligence — creativity (based on true random numbers?)
- **Can |0> or |1> be forced in the middle of a quantum circuit?** Given a qubit in a random state, can it be forced to 0 or 1? Other than maybe measuring two qubits and then some CNOT combination? Or just use an ancilla qubit — but you need one ancilla for each |0> or |1> you might need.
- **Is |0> a reasonable initial state for a quantum system, or is a random state more realistic?** Is |0> too artificial? Some applications require an initial 0, while others would benefit from purely random initialization.
- **Does an ion trap quantum computer require cryogenic temperature?** Isn’t a high vacuum cryogenic in temperature by definition, even if there is no helium cooling?
- **Will quantum computing break blockchain/bitcoin?** Exhaustively try strings to hash? But you can’t change a hash that is already recorded. Impact on mining?
- **How large a quantum program can be simulated essentially instantaneously — in less than a single second?** Maximum circuit depth. Maximum number of gates. Maximum number of qubits. Formula for combining all three — qubits, gates, depth. Impact of entanglement — minor (20%), insignificant (5%), major (50–75%), or dramatic (2x or more)?
- **Why am I still unable to write a brief introduction to quantum computing?** Lack of a concise quantum Hello World program that effectively uses quantum parallelism to achieve quantum advantage. No coherent high-level programming model. Lack of a great set of coherent quantum algorithmic building blocks. No coherent methodology for mapping real-world problems to models readily implemented on a quantum computer. No real-world problems solvable with so few qubits. Difficulty achieving quantum advantage — the only reason to use a quantum computer. Particulars of quantum parallelism, especially limitations. Product state, multi-qubit computational basis state. Utility and application of phase. Nuances of interference and how to exploit them.
Multi-qubit entanglement vs. strict bipartite — any limits — short-term, theoretical, medium-term? Top reason: I’m not so interested in what you can do with current or near-term quantum computers — which are too limited and not suited to production-scale applications. I’m far more interested in what we will eventually be able to do once the technology has evolved and matured to support production-scale applications and easily achieve dramatic quantum advantage. I fully expect the technology and programming model to evolve to something very different from what exists today, so that describing one does not help you understand the other to any significant degree. The singular benefit of current quantum computing technology is to justify funding for research to evolve to the quantum computing technology of the future — what quantum computing will be once it can support production-scale applications which can easily achieve dramatic quantum advantage. Current technology is more of a *trivial sandbox project*, not so worthy of my attention.
- **My journey into quantum computing has given me a newfound appreciation for the incredible intellectual power of classical computing.** Quantum computing may do a few things much better (quantum parallelism, probabilistic computing, generation of random numbers), but classical computing has so many features to offer that are not available in quantum computing, yet. At best, a quantum circuit is no more than a simple code block in a classical program. So many wonderful classical features… Loops. Conditionals. Functions with parameters. Rich data types. Arrays, matrices, tables, lists, trees, graphs, and maps. Limitations of quantum computing: everything must be expressed in terms of raw physics, all operations must be reversible, no fan-in allowed, no fan-out allowed — the no-cloning theorem precludes copying — and no persistence — no I/O, mass storage, file systems, databases, or network access.
- **How much depth of knowledge is needed about quantum information theory?**
- **Musings on Schrödinger’s cat.** (Mew-sings?!) What state is it really in?
- **What are killer apps for quantum computing?**
- **Applying quantum effects to general AI and simulation of the human brain/mind.** Probability and statistical aggregation. Data-driven processing, not program-driven. And sensor-driven as well — quantum image sensors.
- **Synchronous vs. asynchronous processing.**
- **Natural quantum entanglement.** How much of the universe was entangled in the earlier stages? How much remained entangled? How much of the original entanglement persists today? How much of the universe is entangled today? What fraction? How common? What are the most common mechanisms for quantum entanglement today? Can or do chemical reactions induce entanglement? Can or do chemical reactions reduce entanglement? What is the primary function of entanglement in the natural universe? Is it essential or merely a side effect?
- **What is Xanadu really up to?** Are they real (finally) or just more smoke and mirrors? In what timeframe might they have a real machine? Do they have any prototype machines? What’s their development roadmap? What milestones? Seems like more of an ongoing research project rather than simply commercialization of existing academic research.
- **The allure of quantum computing (for me) is dimming at a rapid pace.** My growing disillusionment with both progress and direction. Depth of existing hype. Pace of fresh hype. Problems which are super-exponential. Slow progress towards quantum advantage. Lack of discussion of technical criteria for practical applications. Insufficient funding for research.
- **My passion for quantum computing is dwindling — fast.** Still have curiosity. Still some potential. But the potential is not as great as previously claimed or perceived. Exponential speedup doesn’t solve super-exponential problems.
- **My disappointment/disenchantment with the current state of the quantum computing field.**
- **How long before a quantum computer can do anything practical at production scale?** The ENIAC moment, or some other criteria? Specific technical criteria for “practical”. Quantum advantage. Well beyond the largest supercomputer — 10X? 100X? Or is all that matters that existing computers can’t achieve solutions that a quantum computer can achieve? Can anything practical be done with 40–45 perfect qubits (the limit of simulators)? Do ethical issues matter before practicality occurs?
- **Can quantum computing ever expand beyond being a mere coprocessor for narrow computations?** Complex logic, rich data types, complex data structures, Big Data, functions, classes/objects, I/O, databases — all while under quantum parallelism.
- **Future of the quantum computer/processor as a feature of a universal computer rather than as a separate computer.** Coprocessor vs. just additional instructions. Special data types vs. an attribute of existing data types.
- **Can quantum computing be considered practical without quantum advantage?** I don’t think so.
- **Quantum algorithmic tools.** Phase kickback. Quantum teleportation — other than communication? What else?
- **Distance scales for interconnecting quantum processing elements.** We need decent terminology and scales of reference when discussing the interconnection of quantum processing elements, whether they be individual qubits, modules of qubits, or distinct quantum computers, from 1 angstrom to 50,000 miles. This covers *quantum networking* as well as individual modular quantum computer systems.
- **The four basic building blocks of quantum computing.** Quantization — two discrete energy levels — superposition, entanglement, and interference. And the Hadamard transform, quantum parallelism, quantum phase estimation, quantum Fourier transform, etc.
- **Quantum ethics.** What are the implications of a quantum computing society? What if classical computers were a thousand, million, billion, trillion, quadrillion, quintillion times faster? What could a terrorist or anarchist do with a quantum computer? Or criminals. Or hackers. Or young people with no sense of context or judgment. Will quantum computers truly serve humanity? Can there really be any ethical issues before quantum advantage? Can hype be considered an ethical issue — if over-promising causes attention to be diverted to quantum computing that could have been focused on a classical or traditional solution, or if not advancing classical computing as fast as it could be advanced?
- **How to Get Started in Quantum Computing?** To do what? What target persona? What target use case? Preparation for what? Role-specific — producers (research and vendors) and consumers. Application architects. Application developers. Algorithm designers. Business managers with business operations problems to solve. STEM managers with STEM problems to solve. Clearly document your intentions — what problems are you trying to solve, specific use cases.
- **How can we automatically validate scalability of a quantum algorithm?**
- **What greatest advance would I like to see next on the quantum computing front?** If I could only see one major advance in quantum computing in the coming year, what would it be?
- **What will be the next major breakthroughs for quantum computing?** More than simply lots of incremental advances.
- **What does the field really need right now, for the coming year?** Deliver promised qubit increases.
Improve coherence time.
- **What might you do with 1,000 qubits?** Any off-the-shelf algorithms that can immediately scale up (or down)? What about Shor’s algorithm? Questions about total gates, maximum circuit depth, and coherence time. Is quantum error correction needed?
- **What might you do with a million qubits?** Scarcity of algorithms. Something that is commercially relevant or scientifically interesting. Besides quantum error correction and logical qubits. Sensor-backed qubits? Still too small for Big Data, and no I/O. What single computations might need a million qubits, as opposed to a collection of independent computations which could be executed in parallel on a modular quantum computer with say 16K qubits on each of 64 QPU modules? What single Hadamard transform or QFT would need a million qubits or a sizeable fraction thereof?
- **What might you do with 512 qubits?**
- **What might you do with 256 qubits?**
- **What might you do with 128 qubits?**
- **Thoughts on the IBM Quantum Roadmap.** See *IBM’s Roadmap For Scaling Quantum Technology*.
- **Quantum computing as a Mount Everest problem.** Relatively easy to climb up the lower portions of the slope, but lack of oxygen on the upper reaches of the slope can just crush your soul and spirit and bank account — or at least the patience of your management and backers.
- **Clarify the terminology: quantum computer, quantum computer system, quantum computing, quantum computation, quantum processor, quantum processing unit, quantum device, NISQ device.** Where are the boundaries between pure quantum and non-quantum hardware?
- **Venture capital opportunities in quantum computing.** Really only a research play right now. “Quantum Ready” is more of a fiction than a reality. More than a few years from commercial or even laboratory availability of a quantum computer capable of delivering dramatic quantum advantage for production-scale real-world problems.
- **Issues for more qubits.** Connectivity. Fraction which are most reliable. Topology for connectivity and specific algorithm subcircuits. Opportunities for error mitigation or even correction. Perform the same subcircuit multiple times in a single overall circuit, or is shot count a better approach?
- **Annual Christmas and New Year wish list.**
- **Christmas wish list half-year update.** Noteworthy progress. Notable lack of progress. New items to add or remove. Maybe a full copy of the original with annotations. Or just a list of updates.
- **Simpler introduction to quantum effects.**
- **What would robots be able to do if they had access to a quantum computer?**
- **What would robots be able to do if they were based on a quantum computer?**
- **A new business model: research venture capital.** Fund research with an exit strategy of selling research results and expertise to a product-development company. Could also turn the research venture into a product development organization, but… different staff needed, different skills needed, and interests are different. And time scales are different. Could license IP and expertise to multiple vendors.
- **Does quantum computing need to be domain-specific?** Is there some advantage to designing and building more-specialized quantum computers? Is a fully-general quantum computer too difficult, too complex, or too time-consuming to achieve in a timely manner? Simulating physics. Simulating chemistry. Business optimization. Material design. Battery design. Drug design.
- **When will we transition to talking primarily about logical qubits rather than physical qubits?**
- **When will algorithm designers and application developers no longer be working directly with physical qubits?** Some, the elite, may continue working with raw physical qubits for extreme niches. The FORTRAN Moment will likely provide the transition for non-elite technical staff.
- **The art of crafting a quantum algorithm.** The art of crafting an interesting interference pattern.
- **Quantum volume — how does it work, what does it mean?** Deeper dive. List all of the factors. Explain in plain language. Can it be reverse engineered? What does it really mean for application developers? Does it really not work for >50 qubits? See *Why Is IBM’s Notion of Quantum Volume Only Valid up to About 50 Qubits?*
- **A few general comments on quantum volume.**
- **How to decode quantum volume.** How to get square size. How wide and long a non-square rectangle is supported. Sense of overall fidelity. Sense of connectivity. Anything else? Or is a single number a one-way hash which hides rather than enhances value? Or is it only useful for comparing machines, relative to each other, and not for comparing rectangle size, maximum circuit length, or connectivity?
- **Companies looking to deliver data science, machine learning, AI, drug discovery, material design, and business optimization over the next 2–3 years should not be looking to quantum solutions in that timeframe.** Experimentation, prototyping, and leading-edge research, yes, but not production-ready solutions.
- **The fantasy of quantum computing.** Top 10 fantasies — coming soon — within 2 years: reliable qubits, coherent qubits, high-capacity simulator, portability of algorithms and applications between vendors (hardware platforms), high-level programming model and language, transition from hyper-hype to real reality.
- **The problem and opportunity of quantum networking.** No-cloning theorem — not simply copying information from A to B. Entanglement — transmitting shared information, establishing a portal (of sorts).
- **Five modes of a classical quantum simulator.** Classical quantum simulators: 1) Ideal, perfect simulation — noise-free, no limits, maximum precision. 2) Very closely match the error and limit profile of a particular real quantum computer — so simulation results closely match the real machine, and given a relatively small number of real machines, could run a very large number of simulations on more plentiful classical machines.
3) Tune noise and error profile for proposed enhancements to evaluate how effective they will be — exactly match proposed evolution for new machines (simulate before you build.) 4) Ideal real simulation — not quite perfect since there are theoretical limits from quantum mechanics. 5) Near-perfect qubits — configure for number of nines. Five distinct purposes. Make sure intentions for each simulator run are clear.**Three purposes for classical quantum simulators.**Oops… see*Five modes of a classical quantum simulator*.**Quantum effects in biological life.**Quantum effects in human senses — sight, hearing, touch. Smell and taste? Quantum effects in human (and animal) memories. Quantum effects in insects (and birds). Quantum effects in plant life. Quantum effects… everywhere. Any quantum effects peculiar to biological life?**Breakthroughs vs. incremental progress.**Will quantum error correction be enough? Are more dramatic hardware breakthroughs needed? Is a dramatic algorithmic breakthrough needed? Need a much higher level programming model, with logical operations. What is needed for The ENIAC Moment, or The FORTRAN Moment?**What’s my most valuable contribution to quantum computing at this stage?**Fighting hype. Simplifying jargon. Uncovering nuance. Uncovering issues. Understanding — and explaining — limits and limitations.**Essential issues for quantum computing.**Probabilistic and statistical rather than deterministic. Noise and errors using NISQ vs. eventual QEC fault-tolerant quantum computing. Reduction of problems from application terms and traditional math and computer science to raw physics. Need for elite staff. Classical computing still has plenty of gas and runway. Classical Boolean logic and classical mathematical algebraic expressions don’t translate well (at all) into quantum code. Individual qubits vs. numbers.**Quantum computing needs a massive reset.**Need for a reset for the entire field. 
Back to square one for theory, terminology, programming model, algorithm design, and qubit technology. In fact, we may need multiple resets, not unlike the numerous generations of classical computing. I sincerely don’t believe that we’ll get to dramatic quantum advantage with the current approach to quantum computing (NISQ devices). Radically different or improved qubit technologies are needed. Significantly more nines of qubit fidelity are needed. A higher-level programming model is needed. A richer set of algorithmic building blocks is needed. We may need a *Quantum Computing 2.0* to get to The ENIAC Moment, a *Quantum Computing 3.0* to get to The FORTRAN Moment, and a *Quantum Computing 4.0* to get to a Universal Quantum Computer integrating full quantum and classical computing capabilities.**Need a roadmap for quantum computing for what is needed, not simply what can be done.**Needs to be more about problem solutions than the raw technology.**Relation of a qubit grid or lattice to a classical systolic array.**Potential to reorganize a qubit grid with a processing bus so that the quantum state of two qubits can be shuttled to a processing unit to be processed and then shuttled back to qubit storage.**What will it take for Honeywell to double quantum volume every year?**Simply add one single qubit, as well as assure coherence and gate errors are sufficient to add another layer of gates, and presuming that full any-to-all connectivity remains supported. 
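The quantum volume arithmetic here is simple to sketch: under IBM’s convention QV = 2^n, where n is the side of the largest "square" circuit (n qubits by n layers) the machine can run reliably, so doubling quantum volume every year is exactly one more qubit and one more layer per year. A minimal illustration in plain Python:

```python
import math

def qv_square_size(quantum_volume: int) -> int:
    """Side length n of the largest square circuit (n qubits x n layers)
    implied by a quantum volume of 2^n, per IBM's log2 convention."""
    return int(math.log2(quantum_volume))

# Doubling quantum volume each year adds exactly one to the square size:
for qv in (64, 128, 256, 512):
    print(qv, "->", qv_square_size(qv))  # 64 -> 6, 128 -> 7, 256 -> 8, 512 -> 9
```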
But quantum volume requires classical simulation to validate results, so limited to roughly 50 qubits, max.**Should quantum applications even need to be aware of qubits, even logical qubits, or should higher-level abstractions be much more appropriate for an application-level quantum programming model?**Classical programming doesn’t require knowledge of bits: Integers, real numbers, floating point, Booleans — logical true and false — binary, but not necessarily implemented as a single bit, text, strings, characters, character codes, structures, objects, arrays, trees, maps, graphs, media — audio, video, structured data, semi-structured data.**Quantum area networks (QAN).**Goal is modular quantum computing systems. Limit to qubits in a micro-area (1 mm or 1 cm?). Need ability to “shuttle” quantum state across relatively short distances, a few mm, a cm, maybe a few cm, or maybe a few feet, or even 10–20 feet to enable two or more modular quantum systems to be combined into a single quantum computer. Maybe some way to daisy-chain QPUs to have even a large data center complex as a single quantum computer system. Distinguish “moving” entirety of a qubit quantum state vs. enabling two-qubit gates for entanglement between two adjacent modules.**Is Shor’s algorithm the worst-case limit for quantum algorithms, or just the starting point for serious, heavy-duty quantum algorithms?****What’s the largest single quantum computation we can imagine at this time?**If we could build the largest quantum computer we wanted to solve the largest and hardest computation problems we could possibly imagine, what computations or problems might they be?**What might a modern-day alchemist be able to do with a quantum computer?**No longer limited by trial and error experimentation — at least in theory.**Google quantum supremacy one year on.**Why didn’t this feat open the floodgates for practical applications or even one additional example of quantum supremacy? 
Preskill: What’s Next After Quantum Supremacy? http://theory.caltech.edu/~preskill/talks/Preskill-Q2B-2019.pdf**Risk of intellectual property (IP) causing a quantum dark age.**Accenture patents? Put emphasis on open source.**How best to fight all of the hype surrounding quantum computing?****My two-year outlook for quantum computing.**Should update every year.**Should I take a break from quantum computing, maybe even a protracted Rip Van Winkle sleep?**How long? Maybe two years, or maybe just six months, or maybe even sleep for 2–4 years, or 5–7 years, or even 15–25 years and wake up when quantum computing really is universal and finally ready for deployment at production scale, or a year before The ENIAC Moment. Criteria for resumption, threshold. Let the actual technology catch up with at least a fraction of the hype — and my own expectations. What to monitor while I wait — news, papers, conferences, books. Definitely monitor progress, but how to do that most effectively. Maybe not a break so much as shifting gears or slowing down or going part-time so I can focus on other things as well.**Quantum computational biology.**Quantum computing for bioinformatics and genomics.**What’s wrong with the traditional quantum example algorithms.**Not solving real problems. Too artificial. Too inscrutable. Grover provides only a quadratic speedup and really isn’t appropriate for “databases” per se. The examples, or their narratives, are unclear about how quantum parallelism really works.**Need a better collection of example algorithms.**Solve problems people can relate to. Solve real-world problems. 
Demonstrate quantum advantage.**Parse problem definition statements to determine if the problem could be solved with a quantum computer.**If you wanted to program an AI to parse problem definition statements to determine if the problem could be solved with a quantum computer — with a significant, dramatic advantage over a classical solution — what exactly should the AI look for?**Initial thoughts for quantum computing abstraction.**How to abstract away from quantum mechanics. Criteria. Opportunities. Obstacles. Hardware resources vs. software resources. Is interference a quantum mechanical resource? It’s just the imaginary part of probability amplitude. Is probability amplitude even a quantum mechanical resource? Is phase a quantum mechanical resource? Does it have an existence distinct from merely probability amplitude? RFP for algorithm abstraction. Application domain: concepts, processes, goals, inputs, data structures.**Significance of PSPACE.**How exactly to characterize algorithmic complexity for quantum computing. What qualities from problem space to map to qubits vs. computational basis states.**Quantum multiprocessing.**2, 4, 8, 16, 64, 128, 256, 1024, 4096 QPUs. Architecture. Connectivity, networking. Algorithm design.**Quantum race.**Players. Objectives. Milestones. Multiple races.**Anatomy of a quantum algorithm.**What are the pieces? How do they relate or interconnect? How many distinct design patterns? SPAM — State Preparation And Measurement. Hybrid models. Clever reduction of algorithmic complexity.**Algorithm categories from Quantum Algorithm Zoo.**See *Quantum Algorithm Zoo*.**Possible abstractions for a quantum programming language.****My five (or so) key stumbling blocks in quantum computing.**How are multi-qubit product states represented physically — for n qubits, 2^n quantum states. 
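The "n qubits, 2^n quantum states" stumbling block can be made concrete with a toy statevector in plain Python (a sketch of the math, not any vendor’s API): applying a Hadamard to each of n qubits takes |00…0⟩ to an equal superposition over all 2^n basis states.

```python
import math

def hadamard_on_all(n: int) -> list[float]:
    """Statevector after applying H to each of n qubits, starting from |0...0>.
    Every one of the 2^n basis states ends up with amplitude 1/sqrt(2^n)."""
    state = [1.0] + [0.0] * (2**n - 1)   # amplitude 1 on |00...0>
    for q in range(n):                    # apply H to qubit q
        new = [0.0] * len(state)
        for i, amp in enumerate(state):
            if amp == 0.0:
                continue
            j = i ^ (1 << q)              # basis state with qubit q flipped
            s = amp / math.sqrt(2)
            # H|0> = (|0>+|1>)/sqrt(2),  H|1> = (|0>-|1>)/sqrt(2)
            new[i] += s if not (i >> q) & 1 else -s
            new[j] += s
        state = new
    return state

state = hadamard_on_all(3)   # 8 equal amplitudes of 1/sqrt(8) each
```

For n = 50 this list would need 2^50 entries, which is one way to see why a Hadamard transform across 50 qubits is already at the edge of full classical simulation.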
Granularity of phase and probability amplitude.**Spoiled by the richness of classical computing, Turing machine, data type abstractions.**How to live without those rich capabilities. See also: *My journey into quantum computing has given me a newfound appreciation for the incredible intellectual power of classical computing*.**Standardized syllabus for introduction to quantum computing.**But there may be multiple levels of introduction. Various end states. General awareness, no technical depth. Ready for sophisticated applications. Each application area has different depth requirements. Also introduction to quantum and quantum information science as well as quantum computing itself. Different programming languages and runtime environments. And I’m still not ready to write my own *What is quantum computing?* paper!**Top priorities for quantum computing research.**Better qubit hardware. Qubit fidelity — coherence, gate operations, measurements. More qubit hardware technologies. Better environmental isolation and shielding. Logical qubits, error correction. Larger numbers of qubits. Modular quantum processor designs for very large numbers of qubits. Algorithms capable of scaling to quantum advantage. Tools to analyze algorithms for scalability issues. Tools for testing and debugging algorithms. Better classical quantum simulators — performance, capacity, accuracy, configurability.**We need more basic, fundamental research for quantum computing.**We need at least 15 or 25 projects for alternative qubit technologies — qubits which are higher reliability, smaller, cheaper, and faster.**Some day all computers will be quantum computers.**Quantum-level logic could enable much smaller and much more efficient classical logic. Use logical qubit technology to construct classical bits. 
Simultaneously use desire for smaller and more efficient classical bits to drive down size and performance of physical qubits and logical qubits.**Introduction to quantum variational methods.****Can quantum variational methods ever achieve quantum advantage?**Sure, variational methods are a great way of performing computational chemistry calculations on NISQ devices, but not with any apparent, let alone dramatic, quantum advantage. Too fragmented. Each invocation is too limited. Need to use 50 or more qubits in a single Hadamard transform computation to achieve quantum advantage.**Take a stab at introduction to quantum computing.**As limited and problematic as that might be. At least describe the visible tip of the iceberg even if much of the subsurface of the iceberg is not quite so clear. Start with what I wrote in *What Is Quantum Information Science?*. Also see *Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing*.**What practical, production-scale real-world problems can quantum computing solve — and deliver dramatic quantum advantage and dramatic real business value?**And be clearly beyond what classical computing can deliver.**Quantum phase estimation (QPE) and quantum Fourier transform (QFT) as discrete (parameterized) firmware operations.**Permit algorithm designers and application developers to use QPE and QFT as simple, atomic black boxes rather than a blizzard of gates. Optimize any SWAP networks needed for connectivity within the firmware and hardware.**Quantum computing as it exists today and in the near future is a dead end.**No hint of The ENIAC Moment. Quantum advantage is not possible without quantum error correction or very near-perfect qubits. Next two years. For the foreseeable future? Hardware — Not enough qubits, Weak fidelity, Limited connectivity. 
Algorithms — No robust programming model, No robust set of algorithmic building blocks, No easy paths from problem statements to quantum solutions, No robust set of introductory example programs, No clear path to quantum advantage. Not suitable for production-scale real-world problems.**The fundamental tension between isolation, control, interaction and entanglement at the quantum level.**How the conflicts can be managed and what are the limits to coping with the tension.**Does information have a state, or do only devices and media have states?****What is quantum error correction?****What might lurk beyond surface codes for quantum error correction?****How precisely can a probability amplitude or probability or phase be estimated?**Can 0.0 or 1.0 or even 0.50 be precisely achieved? Or is there always a tiny epsilon of uncertainty down at the Planck level — 0.0 plus epsilon, 1.0 minus epsilon, 0.50 +/- epsilon? Same or different epsilon for probability amplitude and probability? Separate epsilons for the real and imaginary parts of a probability amplitude? Epsilon as a Planck probability? How tiny? Also for minimum difference between two probability amplitudes. Even for 0.50, how close can two probabilities be? Has anyone else written about this?**Cautions for enterprise managers and executives when contemplating quantum computing investments.**Quantum computing isn’t anywhere near to being “ready” for enterprise deployment of production-scale applications to address real-world problems with dramatic *quantum advantage* over classical systems and deliver dramatic real business value. It’s okay to “get ready” and do some preliminary research and studies and experiments, but it’s *not* time to make significant investments and expect business value to be delivered within the next two years. 
Be prepared to wait two or three or four or five years — or even longer — before quantum computing actually can deliver dramatic quantum advantage for real-world problems.**Could some portion of my writing be easily turned into a book?**Some collection of papers or a summary targeting some particular audience. But is it really worth my time, energy, and attention, or just a distraction. Who would really benefit — what audience to target?**Is quantum computing needed to achieve human-level artificial intelligence?**Maybe not quantum computers as currently envisioned, but some level of computing using quantum effects. Turing u-machine. Quantum operations — cannot be computed classically.**Is quantum computing real?**Capable of solving production-scale real-world problems and delivering substantial real-world value?**How would you explain quantum computing to someone unfamiliar with the technology?**“*Quantum computing isn’t just a step-change in quantum or in computation — it’s a complete paradigm shift, like moving from a candle to a lightbulb. It’s operating off entirely new principles of physics, offering a novel way of interacting with and manipulating information.*” — is that a reasonable, useful, and actionable thing to say? Niche applications. Small portion(s) of applications. Little data with huge solution space and little output. Coprocessor. Hybrid quantum/classical. Need elite staff to analyze real-world problems and synthesize quantum solutions. Need prepackaged quantum solutions so that less-elite organizations can deploy quantum solutions.**Is there any material or substance that can’t be used as a qubit?**A grain of salt, sugar, or sand? A drop of water? A small ball bearing, BB pellet, or lead shot? A snippet of copper wire? A capacitor? In theory, anything could be used to build a quantum computer (or qubit) — everything is based on the same fundamental quantum mechanics. Isolation, control, coherence, and measurement are the gating factors. 
Since the natural state of the world is quantum, “*Almost anything becomes a quantum computer if you shine the right kind of light on it.*” — MIT physicist Seth Lloyd**Models for quantum computing hardware.**Simple bare hardware. Complex software and optimization over simple hardware. Complex firmware over simple hardware. Simple software over complex hardware, with macro operations in firmware and hardware. Multiple parallel hardware — multiple shots of the same circuit in parallel (2, 3, 4, 16, 64, 256, 1024), possibly using simple voting to select the dominant result. Argue for extra research for more reliable qubits vs. error correction/mitigation.**Probability plus statistics equals approximate determinism.**Statistical aggregation of probabilistic data can approximate determinism. Is all apparent determinism really simply approximate determinism? Is there any absolute determinism in a universe based on quantum mechanics? Dampening of quantum mechanical effects, oscillation at the Planck level.**Quantum computing today is NOT like buying a PC in 1981.**Not even like buying a MITS Altair in 1974. Not even like building a custom system using a 4004 microprocessor. Not even like building a computer from scratch using discrete transistors (1958–1970) — or vacuum tubes or electromechanical relays.**Is the wave function the unit of quantum information?**Combines 0 and 1 — superposition. Represents entanglement. Wave function volume = total number of terms of all wave functions for a system with non-zero probability amplitudes. Basis states may be fractions of a unit of quantum information, but not a complete unit of quantum information.**Quantum Internet.**What exactly is it? Is it simply a special instance or application of quantum networking, or a distinct concept? Regular Internet using quantum links? Quantum channels? 
Quantum hubs — store and forward?**What if we had a 10X quantum computer — 10X qubits, 10X coherence, 10X reduction in all errors, dramatically greater connectivity (or full A2A), what could/would we do with it?**Are there algorithms on the shelf ready to go? QPE, QFT? Is even 10X enough? If not, how much is? If enough, which applications are most promising?**Short piece on achieving quantum parallelism using Hadamard transform (gate).****Will IBM end up as they did with the PC — early lead but then overtaken and overshadowed by Compaq, Microsoft, Apple, and Dell?****Quantum computing as analog computing.**Phase and probability amplitude as a continuous value (albeit ultimately discrete.)**Quantum computing technology vs. products.**Raw technical capabilities vs. features and brands.**Integrate the other aspects of quantum information science with traditional quantum computing.**Quantum communication. Quantum networking. Quantum metrology. Especially quantum sensing for real-time imaging and signal processing.**How small can we make a controllable qubit?****What exactly would we use more than 50 (or even 32) qubits for?****How exactly does AI mesh with quantum computing?**What is quantum machine learning really all about? How much can it do? What are its limits? How does it scale?**Where is the value in quantum machine learning if there is no support for Big Data?****Foundational skills for quantum computing.**What are they? Where’s the line, and what belongs on the other side of the line?**Can we control a photon’s quantum state?**Or can we capture enough of its state with a matter interaction, do the control, and then regenerate a fresh photon with the updated state?**Amplitude amplification.**Definition. Earliest paper. Definitive paper. Lecture notes. Applications.**Squeezed states of light.**What are they and how can they be used for quantum computing?**Potential for distributed quantum simulator.**Is it easy or hard? Theoretical limits. Practical limits. 
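On those practical limits: memory, not CPU, is usually the wall for a full statevector simulator, since n qubits require 2^n complex amplitudes. A back-of-envelope sketch (assuming 16 bytes per amplitude, i.e. two 64-bit floats) shows why ~50 qubits is the commonly quoted classical ceiling:

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory for a full 2^n statevector, assuming complex128 amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
# 30 qubits -> 16 GiB; 40 qubits -> 16,384 GiB; 50 qubits -> 16,777,216 GiB
```

Each additional qubit doubles the memory, which is why distributing the statevector across many machines only buys a handful of extra qubits.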
Using specialized processors, memory, and connectivity. Using commodity hardware and networks only.**When will we have the first customer installation of a quantum computer?**A commercial model, not bespoke from a vendor such as IBM, Rigetti, IonQ, or Honeywell?**What can we learn from the timelines of ENIAC, EDVAC, EDSAC, et al to UNIVAC I and IBM 701?****When will we have the first deployment of a commercial quantum application on a customer’s on-premise quantum computer?****Is quantum computing appropriate for problems of factorial complexity?**Or must they first be reduced to exponential complexity? Well, they do have to be reduced to polynomial complexity anyway. If it takes two steps with an intermediate reduction, fine. But better if there was a one-step reduction. The traveling salesman problem?**Relative merits of a dedicated cloud quantum computer vs. on-premise quantum computer.**Is it truly significant or merely a logistical detail or organizational preference.**Should we be focusing more effort on boosting high-end simulators in order to enable much more sophisticated algorithms? Rather than stunting algorithm ambitions by focusing them on noisy low-qubit current machines?**Make 32–40 qubits the sweet spot for algorithms. And emphasize the need for 32–40 qubit algorithms that are likely to scale as real hardware becomes available that exceeds even the best simulators.**How many wave functions are needed to define the full state of a quantum computer?**One for each isolated system, each unentangled qubit. One for each collection of entangled qubits. If two qubits are entangled by a 2-qubit gate, how does the number of wave functions change?**Chaos theory and quantum computing (or quantum information science in general).**Quantum chaos theory?**Keep a list of all papers I’ve read.**Should try to do summaries. At least the LinkedIn comments. 
Also papers that I’ve noted but haven’t read yet.**Convert my Quantum Computing Notes document to a live posted paper.**Notes all papers I’ve run across, regardless of whether I’ve read them or not.**Levels or stages of NISQ.**At a minimum: today, coming year, 1–3 years, 5 years, 10 years. Limit of NISQ before quantum error correction and flawless logical qubits. Dimensions: number of qubits, fidelity (coherence time, gate errors), connectivity.**Can a quantum computer compute general relativity?**Asking since general relativity can’t be expressed as quantum mechanics, the native science of quantum computers.**Can the Wolfram Physics Project be computed using a quantum computer?**Is Wolfram’s Fundamental Theory of Physics compatible with quantum mechanics, the native science of quantum computers? Could some variant of quantum computing be more compatible with Wolfram’s Fundamental Theory of Physics? Is there any workable synergism between Wolfram’s Fundamental Theory of Physics and quantum computing?**What game-changing big new thing can we expect or wait for next for quantum computing?****Design patterns for quantum algorithms and applications.****A quantum computer as a function, subroutine, or coprocessor.**Think of a quantum computer as a function, subroutine, or coprocessor.**A few issues for API design for quantum computing.**Three API tensions: function, performance, ease of use. Issues… Difficulty of knowing the proper set of functions in advance. Difficulty of predicting usability problems in advance — how developers may actually use the API, the use cases. Performance is commonly problematic and difficult to predict profiles for real applications or tolerance for performance by real users. 
Semantic mapping from application semantics to API semantics.**What industry or government agency would most likely have such a critical need that they could fund development of a custom quantum computer focused on that need, resulting in an ENIAC moment?****Might a big hedge fund (e.g., D. E. Shaw) be the first to achieve The ENIAC Moment with a machine tailored to their extreme need for edges in trading and asset pricing?**A very concrete need rather than abstract and generalized or general purpose. Much more focused and intense need than run of the mill banks for financial applications. Or other types of organizations which have a very focused need and the resources to fund and focus a specialized machine. Can later generalize from that specialized machine.**Could a pure quantum computer ever compete with the capabilities of a classical computer?****What is the essence of a classical computer?**Turing machine, control flow (including function calls), data structures, rich data types, functions exposed by operating system, including file systems, databases, device control, and network access.**What is the essence of quantum computation?**Superposition, entanglement, and interference.**Wall clock problems.**A quantum circuit may be really fast, but number of iterations times number of shots or circuit repetitions may be very high. Seconds vs. a few minutes vs. an hour vs. multiple hours. An organization might need a solution in a specified interval of time, whether minutes or hours for optimization problems or seconds or milliseconds for real-time problems.**Could a quantum computer simulate U.S. presidential electoral college results based on popular polls for each state?**Margin of error for each state. Granularity of results for each state — how continuous, how discretely chunked.**How big a quantum simulator do we need to prove our quantum algorithms?**How much scalability must be explicitly tested vs. 
how much can be safely extrapolated or mathematically proved.**Simulation of spin lattice and continuous-space problems.****ESP using entanglement?**Embed a quantum chip in the human brain.**Which is deterring progress the most, hardware limits or limited algorithms?****How exactly can you get n isolated qubits into 2^n quantum states?**Walk through the detailed steps, explaining the underlying theory and process, for each quantum state of some small examples such as 2, 3, and 4 qubits (4, 8, and 16 quantum states.)**Introducing the GHZ and W quantum states.**Walk through the theory, steps, and process, for each quantum state of some small examples such as 3 and 4 qubits. Bell states as well, as an introduction.**Little data — large solution space.**See *Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing*.**Advice for students contemplating a career in quantum computing.****Implications of the U.S. Army paper: The Weaponization of Quantum Mechanics: Quantum Technology in Future Warfare.****My concerns about quantum computing at this stage.**Lack of robust application examples. Lack of great algorithmic building blocks.**I’m not ready to give up on quantum computing yet, but…**Not even close to being useful. Not even just over any reasonable horizon. Serious issues even on a much longer timeline.**Quantum computing: All dressed up and no place to go.**Maybe great (or at least decent) algorithms that can be simulated, up to a degree, but no hardware to achieve true quantum advantage, for now. “Maybe next year [or the year after that or…].”**What does the merger of classical and quantum computing consist of?**Merger implying integration. Other than a mere superficial juxtaposition (or superposition?!?!). Mere “hybrid” of quantum and classical? How exactly would they blend? See *What Is a Universal Quantum Computer?***What is a complex quantum algorithm?**What’s the threshold or criteria to distinguish from a simple algorithm? 
When is a quantum algorithm no longer a trivial or simple or non-complex quantum algorithm? Raw number of gates? Degree of entanglement? Number of qubits in the largest register with all qubits in a Hadamard superposition?**Not every application which overwhelms classical computing is necessarily automatically an application for quantum computing.**Need specific or even general criteria. Are we really using classical computers as effectively as we could? What applications might overwhelm even a quantum computer?**Quantum phase considered harmful.**Or at least risky and potentially problematic, for now, although improved qubit fidelity could make it safer. What’s the maximum problem size it could be used for? See *Beware of Quantum Algorithms Dependent on Fine Granularity of Phase*.**Quantum variational methods considered harmful.**Variational methods are useful in general in physics, and seem to work modestly well on NISQ devices, but they don’t appear to offer any prospect of achieving dramatic quantum advantage. They are a mediocre substitute for true quantum parallelism on a scale capable of achieving dramatic quantum advantage.**Grover’s algorithm considered harmful.**Or at least less than helpful. Not really relevant to traditional Big Data databases. Only a quadratic speedup — not an exponential speedup. What would be a better example? What would be a realistic app to use Grover? How many qubits? What phase granularity requirement?**Quantum computing is stuck in Toy Land.**Currently only suitable for relatively trivial toy-like algorithms and applications. Not practical for real-world production-scale problems.**Adventures in Toy Land — Quantum computing is not ready for prime time.**Alternative title for *Quantum computing is stuck in Toy Land*. Prototype algorithms and applications have great appeal, but the technology is not even close to being ready for prime-time, real-world, production-scale problems. 
No viable near-term path, only promise of a more distant future, couched as if it was coming soon when it is not.**Need for novel classical processor designs to support large-scale quantum simulators.**Much smaller and simpler classical processors. Don’t need to do a lot, just manage a large number of quantum states in memory. And simple qubit operations. Need to manage connectivity. Smaller so many more can be placed on a single chip — maybe 256, 1024, 4K, or even 16K or more processors on a single chip. Maybe a goal of 10M processors for a single simulator, each with 64 GB RAM. Manage connectivity and routing of operations with a smaller set of external processors.**What are the quantum volume requirements for an algorithm or application?**Applying quantum volume to algorithms and applications. Need automated analysis. Especially tricky for dynamically-generated algorithms and applications, such as for a Python program which generates and executes quantum circuits.**How to map your application problem solution to a problem in physics.**Need for application frameworks and domain-specific mappings so elite pioneers can do all the heavy lifting to map the problem area to physics problems so that their followers can then more directly exploit (reuse) those mappings without necessarily even understanding what’s going on under the hood.**Stuck/mired in the quantum swamp.**A superposition of futures — simultaneously very bright and very discouraging! Great potential for various niches, but of little value or irrelevant for many others.**What will it take to get to The ENIAC Moment?****Fiction: A story centered on reaching The ENIAC Moment for quantum computing.**Give the topic some color, excitement, and possibly even a hint of scandal.**When can we expect practical, production-scale real-world applications with a dramatic quantum advantage?**5–7 years? 
Two years after The ENIAC Moment — the first production-scale application.**Does NISQ really offer any significant prospect for quantum advantage?**See *Can a NISQ quantum computer ever achieve dramatic quantum advantage?***Quantum advantage for current, NISQ variational methods.**Any true quantum advantage at all? Too fragmented, with no significant opportunity for dramatic quantum parallelism?**Incremental progress to quantum advantage.**Slow slog. Does it have to be this slow? How could it be faster?**How soon will algorithms hit a hard wall without quantum error correction?**Algorithms using more than 28 or 32 or 36 or 40 qubits? For usable, practical algorithms addressing real-world business problems, not specialized computer science experiments.**The coming balkanization of quantum computing?**Will quantum computing split into more-specialized research and vendors to address the differing needs of different application categories? Physics/chemistry simulation vs. business optimization, finance, etc. What are the major areas? Non-physics/chemistry science — drugs, materials, batteries? Engineering — aerospace, etc. Business process optimization. Machine learning and other AI. Along the same lines as FORTRAN, COBOL, BASIC, SQL.**What quantum volume is required for practical quantum computing applications?**Maybe 32 qubits by 32 gates = 2³² = 4 billion? 40 x 40 = 2⁴⁰. 50 x 50 = 2⁵⁰. 60 x 60 = 2⁶⁰.**Clearly define application areas.**Discrete optimization. Quantum optimization algorithms.**Quantum supply chain.**Meaningful term or just hype? What is a supply chain vs. ecosystem?**How many qubits do you need to be a NISQ device?**Technically, at least 50 — “50 to a few hundred” is the official range for *intermediate-scale* as per Preskill’s original paper. 
But, most people seem to presume that *all* current quantum computers are NISQ.**Need for quantum circuit narration.**Ascertaining the net effect of a quantum circuit or even a short snippet of a quantum circuit can be tedious, error-prone, or even beyond the reach of many users. An automated tool for *narrating* the incremental and net effect would be very helpful to answer the question *What does this quantum circuit (or snippet or portion of the circuit) do?* Alternatively, a library of common circuit patterns with narrative English descriptions of the effect and purpose of each pattern would accomplish much of the purpose of such an automated tool. Alternatively, authors of quantum circuits could carefully comment their circuits, explaining what each sequence or arrangement of gates is attempting to do — its net effect and purpose. The net effect is that quantum circuits need to be more readable by real people, not simply executable on a quantum computer.**Quantum snake oil.**Lots of hype. Lots of exaggerated promises. Very little of practical utility (production-scale) actually delivered.**Quantum snake oil and its purveyors.**Alternate title.**Wanted: True Nobel-class geniuses to propel quantum computing more than a few quantum leaps forward.**Mere incremental engineering just won’t cut it to finally make quantum computing usable and useful at production scale for real-world problems. Breathtaking advances are needed; incremental advances are simply not enough. 
Quantum computing will just be dead air until then.
- **What would Feynman say about the current state of quantum computing?** I suspect that he would be rather disappointed, since his original impetus was to be able to fully simulate physics far beyond what can be done using classical computers, but current quantum computers can simulate precious little of physics, and any serious simulation of physics still requires classical computers.
- **What might an ideal high-level quantum programming model look like?** What would the fundamental algorithmic building blocks look like? How does one proceed from problem statement to solution?
- **What might an ideal high-level quantum programming language look like?** The syntax is less interesting than how computations can be expressed in the programming model.
- **Is D-Wave more of an analog computer than a digital computer?** So unclear, but it seems so.
- **What is a post-NISQ quantum computer?** What would it look like, and what can it do that a NISQ quantum computer can’t do?
- **When should senior managers and senior executives expect that quantum computing will be capable of delivering dramatic top-line revenue and bottom-line profit — 3, 5, 7, 10, 12, 15, or 20 years?** Tough question. Significant investment will be needed well in advance of production-scale deployment, but business managers need to focus on delivering dramatic business value using mature technology, not on the availability of risky early versions of the technology.
- **The ugly truth about NISQ computers — great for testing but not for production.** Small scale is a possibility, but not production scale.
- **100 qubits = 2¹⁰⁰ = a billion billion trillion quantum states in a single computation as the sweet spot for truly dramatic quantum advantage.** 2¹⁰ = one thousand. 2²⁰ = one million. 2³⁰ = one billion. 2⁴⁰ = one trillion.
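These state counts are easy to verify; a quick sketch:

```python
# Number of basis states in a single superposition over n qubits: 2**n.
for n in (10, 20, 30, 40, 100):
    print(f"2**{n} = {2**n:,}")

# 2**100 is about 1.27e30, i.e. "a billion billion trillion":
# 10^9 * 10^9 * 10^12 = 10^30.
```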
This isn’t the total number of qubits in the quantum computer, but the number which are used in a single Hadamard transform-based computation.
- **Fill in the TBDs of my quantum computing glossary.**
- **Are variational methods a dead end?** Inability to achieve dramatic quantum advantage?
- **Qubit storage memory with a qubit bus and a single central quantum processing unit as an alternative architecture for a modular quantum computer.** Performing gate operations on each qubit in situ is a physically expensive proposition, with all of those per-qubit cables and electronics outside of the cryostat. A simplified qubit storage memory with a bus to shuttle qubits to a central quantum processing unit and then back to qubit storage would seem to be much simpler overall, and likely even much more efficient than the bulky arrangements currently used, with significant amounts of electronics outside of the cryostat. Multiple buses and multiple QPUs could also be supported. Much larger numbers of qubits could be supported.
- **Significance of quantum phase difference with entangled qubits (product states).** Quantum phase difference makes sense for the wave function of a single isolated qubit, with two terms and two phases, but what does the phase difference mean with more than two product states, such as Bell states, GHZ states, and W states?
- **Near-perfect qubits.** Much less noisy than NISQ devices, but still not as absolutely stable as full quantum error correction. Exactly how close to perfect is near-perfect? How close to perfect would many applications require? See also: *Nines of qubit fidelity*.
- **NPISQ vs. NISQ devices.** NPISQ = Near-Perfect Intermediate-Scale Quantum. Not quite as error-free as full quantum error correction, but getting close, and far less noisy than NISQ. May be good enough for many applications. And doesn’t require the sheer volume of qubits that full QEC requires.
Full QEC (FTISQ = Fault-Tolerant Intermediate-Scale Quantum) may still be needed for some applications, but NPISQ may be good enough for many.
- **Noisy, near-perfect, and fault-tolerant quantum computers.** Fault tolerance requires dramatically more qubits (redundancy). Noisy is too noisy for most applications. Near-perfect still has some minor amount of noise and isn’t as absolutely perfect as fault-tolerant QEC, but may be good enough for many applications. Prefixes: N, NP, FT.
- **Small-scale, intermediate-scale, and large-scale quantum computers.** Intermediate-scale quantum computers, as in NISQ, have 50 to a few hundred qubits. Small scale would be under 50. Large scale would be more than a few hundred, thousands, or even millions. Prefixes: SS, IS, LS. Combine with the prefixes for noisiness (N, NP, FT): NSSQ, NISQ, NLSQ, NPSSQ, NPISQ, NPLSQ, FTSSQ, FTISQ, FTLSQ.
- **We desperately need near-perfect qubits to accelerate algorithm development.** Noisy qubits are an incredible burden and impediment to designing quantum algorithms and developing quantum applications. Logical qubits based on full, automatic, and transparent quantum error correction will solve a plethora of problems, but won’t be available for quite a few years, so the intermediate steppingstone of near-perfect qubits will enable a fairly wide range of algorithms and applications. Near-perfect qubits are needed to accelerate algorithm design, application development, and deployment of quantum computing solutions.
- **Quantum processor vs. quantum computer — where does memory fit in?** Currently, qubits are the only memory for quantum information in a quantum computer, and all qubits reside in the quantum processing unit (QPU). This contrasts with a classical central processing unit (CPU), where only registers are in the CPU and all memory and storage is external to the CPU.
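The naming scheme above (a noisiness prefix crossed with a scale prefix) can be enumerated mechanically; a small sketch:

```python
# Noisiness prefixes crossed with scale prefixes, per the scheme above.
noisiness = ["N", "NP", "FT"]   # noisy, near-perfect, fault-tolerant
scale = ["SS", "IS", "LS"]      # small-, intermediate-, large-scale

labels = [n + s + "Q" for n in noisiness for s in scale]
print(labels)
# ['NSSQ', 'NISQ', 'NLSQ', 'NPSSQ', 'NPISQ', 'NPLSQ', 'FTSSQ', 'FTISQ', 'FTLSQ']
```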
Current quantum computers have only registers (qubits), with no quantum equivalent to either the main memory or mass storage of classical computers. But this raises the question of how quantum computers will evolve once the concept of true *quantum memory* enters the scene.
- **Applicability of the Church-Turing thesis to quantum computing.**
- **How fundamental is Bell’s theorem/inequality to quantum computing?**
- **Pre-1982 influence on Feynman.** Others who made contributions. Understand the fundamental principles which guided their thinking. How their thinking evolved. What fueled Shor’s thinking. Origin of the Hadamard transform and quantum parallelism.
- **Feynman’s original paper.** Carefully read and summarize Feynman’s 1982 conference paper — International Journal of Theoretical Physics, “Simulating Physics with Computers” — https://people.eecs.berkeley.edu/~christos/classics/Feynman.pdf — based on a conference keynote in 1981. 1985 magazine article — Optics News — “Quantum Mechanical Computers” — http://www.quantum-dynamic.eu/doc/feynman85_qmc_optics_letters.pdf.
- **Might an application-specific or domain-specific quantum computer lead to the greatest breakthrough in quantum computing?** We do have D-Wave, but it is algorithm-specific, without general-purpose capabilities. Classical computing leaped over analog computing with the capability of a general-purpose Turing machine.
- **OpenFermion — is it a toolkit, framework, or solution?** Seems like a toolkit, but maybe with some framework features. Definitely not a solution. Not strictly an algorithmic building block, but it may have algorithmic building blocks under the hood, and may offer algorithmic building blocks to users.
- Annual update for:
**Lingering Obstacles to My Full and Deep Understanding of Quantum Computing**. Maybe just focus on a top 10 or top 25. What would provide me with the greatest insight into quantum computing?
- **What does the CNOT gate really do?** What is its true purpose? What can you really do with it? Focusing on the cases where the probability amplitudes are neither exactly 0.0 nor 1.0. The definition “*leaves the control qubit unchanged and performs a Pauli-X gate on the target qubit when the control qubit is in state ∣1⟩; leaves the target qubit unchanged when the control qubit is in state ∣0⟩*” leaves a lot unsaid. No mention of entanglement or Bell states. The definition is too classical, insufficiently quantum. “Controlled NOT” is not a satisfying description.
- **What exactly happens when you apply a single-qubit gate to one qubit of a collection of entangled qubits?** Such as an X gate or a phase shift. What quantum mechanical insight should apply? What happens phenomenologically with real physical qubits?
- **Where is quantum computing today?** Outside chance of something useful in 2 or 2–3 years? Greater chance of something useful in 5 or 4–5 years? Nothing really useful today? Unlikely in the next 6 months? Still rather unlikely in 12 months? Or even 18 months? Is two years still a little too far out to be trying to accurately predict? Will Honeywell and IonQ break out, stall, or stumble? Toy apps — 5 to 17 qubits, no real apps with over 20–24 qubits, other than the Google quantum supremacy example? Unclear how many qubits are usable with quantum simulators — very small numbers, 20, 40 (only in theory)? Connectivity is still a limiting factor — transmon: hard limitation; trapped ion: promises any-to-any, but not yet a reality for a significant number of qubits?
- **What’s the best way to learn quantum computing?** It all depends on what your interests and objectives are.
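The CNOT question above can be probed with a tiny state-vector calculation. This is a sketch using plain NumPy rather than any particular quantum SDK; the basis ordering ∣00⟩, ∣01⟩, ∣10⟩, ∣11⟩ with the left qubit as control is an assumption of the example:

```python
import numpy as np

# Basis ordering: |00>, |01>, |10>, |11>; left qubit is the control.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)

ket00 = np.array([1, 0, 0, 0], dtype=complex)
superposed = np.kron(H, I) @ ket00   # control qubit now (|0> + |1>)/sqrt(2)
state = CNOT @ superposed            # entangles the pair

print(state.round(3))
# amplitudes ~0.707 on |00> and |11>: the Bell state (|00> + |11>)/sqrt(2)
```

With the control in superposition, CNOT produces a Bell state — exactly the entanglement that the classical-sounding “controlled NOT” definition leaves unsaid.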
Differing personas: algorithm designer, application developer, hardware engineer, software engineer, mathematician, physicist, business strategist, business analyst, chief technical officer, manager — technical, manager — non-technical, and a wide range of executives.
- **What are OKRs for quantum computing?** Objectives and Key Results.
- **What’s the single most compelling application of quantum computing?** Many possibilities, but what really stands out?
- **Three stages of deployment for quantum computing: The ENIAC Moment, configurable packaged solutions, and The FORTRAN Moment.** The initial stage of deployment — The ENIAC Moment — for quantum computing solutions relies on super-elite STEM professionals using a wide range of tricks and supreme cleverness to achieve solutions. For example, manual error mitigation. The second stage — configurable packaged solutions — also relies on similar super-elite professionals to create frameworks for solutions, which can then be configured by non-elite professionals to achieve solutions. Those non-elite professionals are able to prepare their domain-specific input data in a convenient form compatible with their non-elite capabilities, but do not have to comprehend or even touch the underlying quantum algorithms or code. The final stage — The FORTRAN Moment — relies on a much more advanced and high-level programming model, application frameworks, and libraries, as well as logical qubits based on full, automatic, and transparent quantum error correction, to enable non-elite professionals to develop solutions from scratch without the direct involvement of or dependence on super-elite professionals.
- **No, most current quantum computers are not NISQ devices.** Technically, *intermediate-scale* (the IS in NISQ) means at least 50 qubits, up to a few hundred qubits. As such, quantum computers with 5 to 32 qubits — the bulk of current quantum computers — don’t technically qualify as true *NISQ devices*.
That said, most people consider these smaller quantum computers to be NISQ devices anyway. That’s life in a young sector where hype is rampant and terminology is somewhat problematic.
- **People are jumping the gun and prematurely acting as if quantum computing technology were close to being ready for deployment in the next few years.** A belief in deployment of quantum computing in the next two or three years is clearly not currently justified. Even deployment in three to five years is currently not technically justified. Five to seven years *might* be a practical timeframe, but that’s really mere speculation at this juncture.
- **Baseline list of potential quantum applications in order of likely implementation.** Most interested in when each application will reach dramatic quantum advantage.
- Enhancements to
*What Are Quantum Effects and How Do They Enable Quantum Information Science?* The quantum hypothesis — in general. Historical role in transitioning from the classical world to the quantum world. Momentum, angular momentum, orbital momentum. Add helicity (spin, angular momentum), chirality (handedness), polarization, and detail on **creation and annihilation operators** (primarily to facilitate quantum mechanical modeling of many-particle systems). Harmonics? Localization vs. spooky action at a distance? Pure state, mixed state, density matrix, density operator. Wave packets. Nuclear magnetic resonance (NMR) — some early efforts in quantum computing relied on NMR. Nuclear electric resonance. Gravity — nominally lies in the realm of General Relativity rather than quantum mechanics, but… who knows for sure. Can a quantum computer be used to accurately simulate or model gravity and gravitational effects, including effects on time and space? Clearly quantum metrology relates to gravity and gravity waves in some way. Is friction a quantum effect? Avalanche and threshold effects as quantum effects? Correlated quantum matter and quantum materials. Mention hidden variables. Mention Hilbert space.
- **Is a quantum computer more of an analog computer?** Or at least, could it be used as an analog computer for some applications? Continuous range for probability amplitudes and phase. Instant calculation based on physical effects. Probabilistic and approximate rather than deterministic. Approximation, questionable precision, questionable determinism. Qubits are partially binary, but partially analog. Analogous to raw physics rather than the abstract mathematics of Boolean logic and Turing machines.
- **Quantum computing — how much is truth and reality and how much is folklore and dubious dogma?** Is quantum error correction really needed or not? Is it possible to achieve dramatic quantum advantage using NISQ devices? What are reasonable milestones vs. imposed impediments? Classical vs.
quantum — reality vs. myths and folklore. How much is mere wishful thinking? Is it true that quantum has to be hard? Can just anybody learn and be successful with quantum? Is Quantum Ready a real thing or just a marketing scam? Relative merits of transmon vs. trapped-ion qubits — is one clearly better, or is some entirely new or hybrid approach to qubits needed? Are cryogenic temperatures and/or extreme vacuum essential, or just near-term temporary requirements?
- **Phase kickback.** What is it? How is it used? More examples. Minor vs. big-deal feature? Theoretical basis. Treat it more formally. Practical ramifications. Scalability. Limitations? Alternatives? Update the TBD in the glossary.
- **Can a quantum computer process or create music?** Relatively small circuit, but with streaming data. Can process in either direction, in theory. Practical ramifications. Is dramatic quantum advantage a practical possibility?
- **Model for a quantum computer as a coprocessor — what computations can it compute?** In theory. Technical specification. What can’t it compute? What can it best compute compared to a classical computer?
- **What reasons for wanting to learn quantum computing can be satisfied and lead to satisfaction in the next 2 years? 3–4 years? 5 years? 6–10 years?**
- **Why do you want to learn quantum computing?** Who needs to learn quantum computing? What problems are you trying to solve, and what constraints of classical computing are getting in your way?
- **Toolkits, libraries, frameworks, and packaged solutions for quantum computing applications.** Relative value. Relative benefits. Limitations. Toolkits help and give you some, but limited, leverage: little in the way of deep, dramatic abstractions; significant knowledge of under the hood is required; work across multiple or even all domains; intimate knowledge of probabilistic and statistical solutions rather than deterministic solutions.
Frameworks give you significant leverage: many deep and meaningful abstractions, but still a lot of gaps and connections that the user must fill in; some, but limited, knowledge of under the hood may be required; the degree and depth of knowledge will vary from framework to framework and domain to domain; may work across at least some domains, but may be domain-specific; some degree of knowledge of probabilistic and statistical solutions rather than deterministic solutions. Packaged solutions: it’s all there; the user gets to work exclusively in terms of higher-order abstractions; the user can do some degree of configuration and customization, but all in terms of higher-order abstractions; no knowledge of under the hood required; usually domain-specific, although there may be some niche horizontal solutions; deterministic solutions, or maybe some degree of statistical solutions if that is appropriate for the application domain, otherwise strictly deterministic, even if that is accomplished using statistical methods. Tools vs. toolkit — standalone tools as apps vs. modules linked from libraries? Libraries separate, or simply the implementation mechanism for toolkits and frameworks? Under the hood: math, physics, Greek symbols, arcane jargon.
- **Quantum hell.** Waiting for… hardware, algorithm metaphors, and applications. Needed and critical: genius-level innovation.
- **What is the computational complexity of your quantum algorithm?**
- **How scalable is your quantum algorithm?** Or, is your quantum algorithm scalable?
- **What might come after quantum computing?** Super AI — heuristic shortcuts to offer a super-exponential advantage. Role of quantum gravity. Address problems with factorial complexity (super-exponential). What else?
- **What are quantum computers good for?**
- **What is quantum?** See *What Are Quantum Effects and How Do They Enable Quantum Information Science?*
- **What does a quantum technologist do?** What are they interested in? Capabilities, limitations, and issues.
See *My Interests in Quantum Computing: Its Capabilities, Limitations, and Issues*.
- **How do you talk about quantum to lay people?** So many different audiences, personas. So many different interests, use cases. So many different levels of sophistication. Granularity of the universe. Qubits behave communally — enormous parallelism. Chemistry is fundamentally quantum. Molecules are quantum. Much greater sensitivity. Classical mechanics is an approximation — it doesn’t capture everything. Quantum mechanics has been around for a long time. A lot of uncertainty in and about quantum technology.
- **How to create a quantum-literate workforce.**
- **Which will come next: more qubits or higher-quality qubits?** Either way, we need near-perfect qubits, even for quantum error correction. Quality would help a lot. Not clear that algorithms are capable of utilizing more qubits yet, especially if better connectivity is needed. Will ion traps reach commercialization soon, or have a slower slog towards a higher number of qubits?
- **Beyond quantum computing.** Quantum mechanics does not include gravity. Cannot simulate any theory of everything, or any phenomenon which includes any gravity-like effects.
- **Intuitive computing — mental processing using quantum effects.** How does the human mind accomplish quantum, intuitive leaps? How do quantum effects in the human mind enable intuitive leaps? How can a computer, say a quantum computer, replicate the intuitive aspects of the human mind?
- **If we had a full-scale quantum computer today, how could it help us fight the coronavirus outbreak?** Why not? Capacity? Coherence? Error rate? How might a much better quantum computer have helped? What algorithms?
- **What’s wrong with quantum volume?** Limitations. Low utility. Lack of specificity.
- **Alternatives to quantum volume for benchmarking.** Separate grade for each capability. Number of qubits. Nines of qubit fidelity for gate execution and measurement. Coherence time. Connectivity — any-to-any vs.
SWAP network fidelity? Gate execution time as well (ion traps are supposedly much slower).
- **Application-specific benchmarking and application category-specific benchmarking.**
- **The hidden shift problem.** What exactly is it useful for? Hidden subgroup problem (HSP). Given two functions f and g such that there is a shift s for which f(x) = g(x + s) for all x, the problem is to find s; s is referred to as the shift.
- **Quantum memory.** What is it really? Not comparable to classical memory. What is it needed for? What can be done with it? Quantum communications. Quantum network. Quantum storage? Flying qubit? Repeater?
- **The double-slit experiment.** What’s it really all about? What’s really going on? Its significance to quantum information science? Any significance to quantum computing? How can photons interfere if they can’t interact due to Bose-Einstein statistics (BES)? Can we simulate it with a quantum computer? Do we need to deeply and intuitively understand it to understand quantum computing, particularly interference in quantum computing? Are photons directly interfering and interacting with each other, or is the interference occurring as the photons interact with the target matter?
- **Customer journey maps for quantum computing.** How does that fit in with personas, use cases, and access patterns?
- **Eigenvectors and eigenvalues.** What are they really? And eigenfunctions and eigenstates. Is the amplitude really the eigenvalue, or is the basis state the eigenvalue? Is the basis state the eigenvector, or is the eigenvector the basis state plus the amplitude? Is an eigenvector an eigenvalue plus an amplitude?
- **Is all interaction in the universe quantum interaction and hence quantum information processing?**
- **Requirements for Hello World for quantum computing.** A single X or H gate might suffice, but my inclination is that it should be the simplest program which shows a practical example of quantum parallelism, with results that most people can understand.
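The hidden shift definition above can be pinned down with a classical brute-force toy. This is only a sketch: the modulus, the function g, and the hidden shift value are arbitrary choices for illustration.

```python
# Toy hidden shift over Z_8: given f and g with f(x) = g(x + s), recover s.
N = 8
s_hidden = 5
g = [(3 * x * x + x) % N for x in range(N)]   # arbitrary example function
f = [g[(x + s_hidden) % N] for x in range(N)]  # f is g shifted by s_hidden

def find_shift(f, g, N):
    """Classical brute force over all candidate shifts."""
    return next(s for s in range(N)
                if all(f[x] == g[(x + s) % N] for x in range(N)))

print(find_shift(f, g, N))  # 5
```

A quantum hidden shift algorithm aims to beat this exhaustive search; the toy just makes explicit what problem is being solved.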
Something functional, such as a 2-, 3-, or 4-qubit QFT. Something that most people can relate to and say “*Wow, that’s cool!*” Preferably, it effectively uses quantum parallelism to achieve quantum advantage. Should attempt to show quantum advantage, but that could take 50 qubits or more. Maybe a graduated collection of Hello World programs is best.
- **Whose hardware and programming model is more inscrutable, D-Wave or Xanadu?** D-Wave’s QUBO algorithm has a precise mathematical definition, but understanding how and to which applications it can apply — and achieve dramatic quantum advantage — remains inscrutable. How to compare QUBO and gate-based quantum computing remains inscrutable. Xanadu’s hardware and programming model seems to be constantly evolving and rather inscrutable. It’s unclear how to compare Xanadu to gate-based quantum computing, especially with qumodes vs. qubits, continuous values (CV), squeezed states of light, etc. All of that said, it’s not clear that the current gate model of gate-based quantum computing really is the best way to go anyway, especially to achieve dramatic quantum advantage.
- **Basis state vs. basis vector.** Unravel the terminology. And computational basis state and product state as well. And eigenstate while we’re at it.
- **Does a quantum computer have or need any sense of time, other than a simple ordering of events (unitary transforms)?** Granted, timing has great relevance for the implementation of gate execution and measurement, and for performance.
- **Is quantum computing just a passing fad?** Will it ever have dramatic practical utility?
- **Best practices for quantum computing.** Both general and specific.
- **Tasks to pursue to get deeper into quantum computing.** Read a lot more of the papers coming out. Read more of the older papers, the foundational material. Deeper understanding of how qubits are actually implemented. Deeper understanding of quantum mechanics. Deeper understanding of quantum chemistry.
Deeper understanding of Shor’s algorithm, and others — look at the derivative algorithms, read Miller’s ERH paper. Study number theory. Quantum error correction. Quantum communication. Post-quantum cryptography. Fill in as many of the TBDs in my glossary as possible. Consider a quantum Rip Van Winkle nap, to wake up when quantum computers finally are mainstream, with widespread production-scale applications very common — how many years should I set my alarm clock for?
- **What’s the quantum advantage of your algorithm?** See *What Is the Quantum Advantage of Your Quantum Algorithm?*
- **Quantum variational algorithms.** (Or is it variational quantum algorithms?!) Including VQE and QAOA. Limitations. Preprocessing. Optimization. Tuning parameters. Variations, iterations. Scope, range, applications. But is quantum advantage really possible?
- **The essential concepts in quantum computing.** Not all of the technical details, math, physics, and nuances, but what really matters. Quantum parallelism. Interference. Phase. Circuit repetitions (shots) for statistical significance. Probabilistic. Little data with a big solution space and little output. Also see *Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing*.
- **What really cool science fiction could be enabled by quantum computing?** Or has all of that already been done by sci-fi classical computers which just happen to have all of the capabilities of quantum computers? AI. Predict the future. Predict human behavior. Explain the past. Discover patterns. Resolve and cure human social problems. Simulate God?!
- **How are gates executed on transmon qubits?** How are X, Y, and Z rotations performed? How are two-qubit gates executed? How are the complex numbers of unitary transform matrices mapped to physical operations?
- **How are qubits reset to ∣0⟩ for transmon qubits?** How long does that take, relative to gate execution? How close to an absolutely zero probability amplitude can you really get?
What residual error for the probability amplitude is there? Does the residual error have only a real part, or an imaginary part as well? What might the imaginary part look like — constant vs. random?
- **How advantageous is it to run on a real quantum machine rather than on a high-quality simulator, which has more qubits, greater coherence, and better connectivity?** Especially for applications where a short-term proof of concept is a dead end and a longer-term, higher-quality algorithm is a better long-term goal.
- **NISQ era.** Does it have any real practical consequences, or is it simply an inconsequential steppingstone to get to a proverbial *post-NISQ era* where there actually are dramatic practical consequences? AFAICT, NISQ alone won’t enable production-scale practical applications with dramatic quantum advantage.
- Add the recent (2019) Preskill comments from Quanta to my quantum advantage paper. Preskill in Quanta:
*Why I Called It ‘Quantum Supremacy’* — https://www.quantamagazine.org/john-preskill-explains-quantum-supremacy-20191002/
- **Do we really need to use current, very limited real quantum computers to be Quantum Ready?** Aren’t quantum simulators good enough, and in fact better, since they transcend the severe limitations of NISQ devices, at least up to 40 or so qubits? What transition threshold are we trying to be ready for? Quantum supremacy — approximately 50 qubits? 72? 128? 256? 1024? 2K? 4K? 8K? 16K? Or what? Or are even many-qubit machines likely to be noisy for more than 5–10 years, so that we need to be ready for significant noise indefinitely? But can’t we effectively model such eventual noise levels on classical simulators anyway? We should.
- **Vague terms to clarify.** Quantum theory — synonym for quantum mechanics, or… what? Quantum science — umbrella over what, or what narrower meaning? Quantum technologies — super-vague.
- **Quantum parallelism as the central transform for quantum algorithms.**
- **Need for a quantum industry association.** Or more than one. Standards — NIST role, international, non-governmental. Business best practices. Community — communities. Standard, common, shared libraries — executable code, test code, test data. Benchmarks.
- **Can photons be controlled and entangled to construct qubits?**
- **Can the probability amplitude of one basis state have an imaginary component while the probability amplitude of the other basis state does not?** Maybe as the result of entanglement? If possible, how can it be used in computation?
- Update
*What Is Quantum Algorithmic Breakout and When Will It Be Achieved?* Large number of measurement shots needed — can force use of sub-optimal algorithms in a bid to dramatically reduce the number of measurements/shots needed. Is the ultimate rule for shot count polynomial or exponential? More emphasis on advanced, powerful simulators. Or is the current section enough? Maybe mention Atos, hardware accelerators. More massive memory.
- **Is the inability to directly measure probability amplitudes the fatal flaw of quantum computing?** But what about phase estimation? Takes too many qubits and requires too much coherence. Destructive read.
- **What actually happens when two qubits become entangled?** The step-by-step process. Especially if either or both were previously entangled with other qubits.
- **How exactly do two qubits become unentangled?** Step-by-step process. Final state of each qubit.
- **Fractional parallelism.** Unable to do the full desired parallel computation in one step — must break it into pieces, with optimization or other processing between the pieces. Blocks for D-Wave. Variational methods. Machine learning?
- **Status of the two-year path to initial success.** When did I first say I saw a 2-year path to a real, meaningful application of quantum computing? Later I codified it as *The ENIAC Moment*. Now, how much longer is it likely to take? Is it slipping a year every year?
- **Algorithmic design process and building blocks are still too primitive.** Still at the aspirin stage of drugs. What would be needed to simulate aspirin, in terms of qubits, gates, and connectivity?
- **Still in the 1930s of classical computing.** Current quantum computers are not even comparable to what we could do with classical computers in the early 1940s. For a timeline of historical classical computers, see *Timeline of Early Classical Computers*.
- **Timeline of Early Classical Computers.** For reference and comparison with quantum computing.
1930s to 1970s.
- **Fake Op-Ed — where is quantum computing at?** Progress. Supremacy. Hype. False hope — way too much of it. Unrealistic promises — way too many of them, too many unfulfilled, too few fulfilled. Real potential, but way too soon to herald a new era — “Coming not-quite soon”, 5–10 years, 2–5 years if you’re optimistic, 10–20 years if you’re pessimistic. Significant research needed — more raw federal money, more targeted federal projects, more targeted federal commercial projects.
- **Product states vs. entangled states.** Decomposition into tensor products of individual states vs. cannot decompose into tensor products.
- **Avoid gimmicks, shortcuts, and short-term tricks and workarounds.** Focus on longer-term utility and quality.
- **Algorithmic Qubits (AQ).** Proposed by IonQ. Alternative to IBM’s Quantum Volume. “*defined as the largest number of effectively perfect qubits you can deploy for a typical quantum program*”. “*define a typical quantum program (circuit) as one that has a size (number of fully-connected gate operations) that scales with the square of the number of algorithmic qubits.*” AQ = log2(QV), or inversely, QV = 2^AQ.
- **NSSQ is a better term for most current quantum computers than NISQ.** Only three current quantum computers fully qualify as true NISQ devices — 53 qubits from Google, and 53 and 65 qubits from IBM. Technically, intermediate-scale (the IS in NISQ) starts at 50 and runs to a few hundred qubits, so current quantum computers with fewer than 50 qubits are not technically NISQ devices. So I came up with the term NSSQ to refer to Noisy Small-Scale Quantum devices, where small-scale refers to fewer than 50 qubits.
- **Categorically distinct status of probability amplitude.** The fact that it is not measurable. Similar to phase? What actually changes physically on execution of an RY gate, which shifts between the two probability amplitudes of the two basis states?
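The AQ/QV relationship quoted above is a one-liner in each direction; a minimal sketch:

```python
import math

def aq_from_qv(qv: float) -> float:
    """Algorithmic Qubits from Quantum Volume: AQ = log2(QV)."""
    return math.log2(qv)

def qv_from_aq(aq: float) -> float:
    """Inverse relationship: QV = 2**AQ."""
    return 2.0 ** aq

print(aq_from_qv(64))  # 6.0 -- a QV-64 machine rates 6 algorithmic qubits
print(qv_from_aq(6))   # 64.0
```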
In contrast to what physically changes on execution of an RX gate, which shifts phase. **Quantum Phase Estimation (QPE) is the natural translation of the FCI procedure of quantum chemistry to quantum computers.** “*Exact simulation of quantum chemistry systems is widely regarded as one of the problems that would benefit enormously from quantum hardware. The Quantum Phase Estimation (QPE) algorithm is the natural translation of the FCI procedure to quantum computers.*” See *Quantum Chemistry in the Age of Quantum Computing* by Cao, et al. **Suggested metadata for quantum computing papers.** Qubit count, total circuit size, maximum circuit depth. Quantum volume required. Estimated Big-O — computational complexity. Estimated quantum advantage. Citation(s) for comparable classical algorithms, and estimate or quantify specific or approximate quantum advantage over classical solutions. Specify circuit repetitions (shots) required for each run of the algorithm — specify any criteria, metric, or formula used to define the repetition count, and whether parameterized by particular input values for each run. NISQ vs. FTQC for production applications. Production vs. proof of concept vs. experiment. Detail scalability of current implementation. Connectivity requirements — degree of SWAPs needed to overcome limited connectivity. SQUID vs. ion trap vs. suitable for both. Too painful and tedious — and frequently incomplete — to tease the information out of the text of most papers. **XX and ZZ interactions.** What are they? How are they used in the design of quantum algorithms? **What happens during the gap between The ENIAC Moment and The FORTRAN Moment?** The Lunatic Fringe reigns supreme, with raw machine language serving a limited, elite audience and market. Packaged solutions requiring only configuration but no algorithm coding can enable solutions to problems for a wider audience.
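The RY-versus-RX rotation question above can be probed numerically. A minimal numpy sketch (using RZ as the pure phase-shift counterpart for illustration, since RZ leaves measurement probabilities untouched while RY redistributes them):

```python
import numpy as np

def ry(theta):
    # RY mixes the |0> and |1> probability amplitudes (real rotation)
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

def rz(theta):
    # RZ only shifts the relative phase between |0> and |1>
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

zero = np.array([1.0, 0.0])                 # |0>
plus = np.array([1.0, 1.0]) / np.sqrt(2)    # (|0> + |1>)/sqrt(2)

print(np.abs(ry(np.pi / 2) @ zero) ** 2)    # probabilities change: [0.5, 0.5]
print(np.abs(rz(np.pi / 2) @ plus) ** 2)    # probabilities unchanged: [0.5, 0.5]
```

The measurable effect of the amplitude rotation shows up directly in the probabilities, while the phase rotation is invisible to a computational-basis measurement.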
A number of mini-stages before the full FORTRAN Moment is achieved — partial FORTRAN moments, such as a limited number of logical qubits. Some niche applications may be able to verge on The FORTRAN Moment even before many or most applications are enabled for the full sense of The FORTRAN Moment. **What’s next now that Google has claimed quantum supremacy?** Google says quantum error correction. Is that “next” or nX improvement in coherence and near-perfect qubits? **If the Google quantum supremacy experiment ran one million shots, which would take 10,000 years to simulate classically, what might a small number of classical simulation runs, each taking 3–4 days, accomplish?** What would a single run tell us? 4–8 runs? 10 runs? 25 runs? 50 runs? 100 runs (one year)? How many of Google’s one million runs are required due to the noisiness of their quantum hardware, so that a smaller number of perfect, noise-free simulation runs would accomplish what one million noisy runs accomplished? How much could we learn from a small fraction of the perfect runs that Google’s experiment implies — the runs needed to accommodate the probabilistic nature of the experiment, not the noise itself? 10,000 years divided by 1M = 0.01 years = 3–4 days = one simulation run. **Modern methodologies that may be applicable to quantum computing.** Any? What might be some possibilities? How can they be adapted? Problem analysis vs. design synthesis. **Prerequisite knowledge needed to use a quantum computer at a semi-expert level.** What’s common to all quantum computers? What’s special to a particular quantum computer? In what subject areas is comprehensive and deep knowledge of the entire subject needed vs. a more cursory or particularized subset which can be taught or read much more simply than an entire course (or multiple courses) and textbook? **Separate worlds of probability and statistics vs.
deterministic calculation.** How often would a quantum computer be able to add 2 + 2 and get 4, as opposed to getting 3 or 5 or who knows what else (0, 8?). **Quantum mechanics vs. quantum physics — which term is more correct?** And in what context? Should I switch or just stay the course? I use quantum mechanics to refer to the underlying science of all quantum phenomena, and quantum physics to refer to application of quantum mechanics to specific phenomena in physics — solving physics problems using quantum mechanics. **How much of quantum mechanics and quantum computing is appropriate at the high school level?** What criteria to use? How much depth? How much math? **Turing test for quantum computing.** Is a particular machine really a quantum computer? Can you really tell? Can a test for ability to generate random numbers tell? Probabilistic rather than deterministic results. And achieve quantum advantage. Maybe, if a particular machine cannot achieve quantum advantage, then it shouldn’t be considered a quantum computer. Allow for simulators, but also wish to detect a simulator vs. a real quantum computer. **Does it make sense to speak of the basis state(s) of a register, single qubit, or pair of qubits separate from the basis state(s) of all qubits in the system?** **How real are basis states and amplitudes, or are they simply a perspective on some fraction of the total state of the system?** **Why is spin so much more appealing for quantum computing?** **Derivation of Planck constant and relevance to quantum computing.** What are the quanta for probability amplitude, phase angle, and probability? What is the physical, phenomenological basis for these quanta? What are the practical implications, such as minimum and maximum angles of rotations?
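The 2 + 2 question above is really a question about shot statistics: any single shot can misfire, but repetition recovers the deterministic answer. A toy sketch (the 20% error rate and the garbage-value model are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy model: a quantum adder computes 2 + 2 = 4, but each shot is
# corrupted with probability p_err, yielding a garbage value 0..7.
p_err = 0.2
shots = 1000

ok = rng.random(shots) >= p_err
results = np.where(ok, 4, rng.integers(0, 8, size=shots))

# Majority vote over many shots recovers the deterministic answer
values, counts = np.unique(results, return_counts=True)
answer = values[np.argmax(counts)]
print(answer)   # 4
```

Any individual shot has a 20% chance of being wrong here, yet the mode over 1,000 shots is essentially certain to be 4 — the "separate worlds" meet through statistical aggregation.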
What are the quanta for errors? **Which aspects of quantum computing have reasonable prospects for scaling?** **Which aspects of quantum computing do not have reasonable prospects for scaling?** **What criteria must be met to create a qubit?** What exactly enables superposition? What exactly enables entanglement? Interference? Phase rotation? Basis amplitude rotation? What requirements must be met to enable measurement? **How exactly does a wave function collapse?** How long does it take? What factors control the time? Is energy lost? How and to where? Is there absorption or emission of photons? Is excitation to a higher intermediate state needed to get emission down to the final state? **Qubit is a hybrid combination of a storage element and a processing element.** And information as well. Can storage and processing be separated, either completely or temporally? **If a classical program is like a recipe or control of a washing machine, what metaphor works for quantum programs?** Same, just more limited? Or exactly the same, just completely linear, with no loops? Or, what? **Key to exploiting quantum computing is reduction of algorithmic complexity.** Reducing from an algorithm of higher complexity to a lower level of complexity, primarily from exponential to polynomial. Even reduction from exponential to polynomial may not be sufficient — even linear may be too intensive. Is super-polynomial still too expensive or not? **Phonon-based quantum computer?** Superimposed frequencies? “*mechanical quantum mechanical computer*”. A bunch of papers, but “far behind” other approaches. **Expectation value.** Definition. Use. Implications. See *Shots and Circuit Repetitions: Developing the Expectation Value for Results from a Quantum Computer*. **What can a D-Wave quantum computer compute and what can’t it compute?** When is it appropriate? What are its limitations? What exactly is its programming model — in detail, in plain language. **Bitcoin encryption vs.
quantum computing.** Will quantum computing break Bitcoin? Mining? Decryption? **Need for 32 to 64-qubit algorithms.** Smaller algorithms don’t really demonstrate the true potential of quantum computing — dramatic quantum advantage. 32–40 qubits can be classically simulated, so focus there first, with emphasis on scaling from 32 to 64 qubits. **The superficial basics for quantum computing.** What is quantum computing — for dummies. Magic, hype, promises. The easy stuff that everybody needs to know. But none of the deep stuff, no math, no Greek, no physics. The first layer. The first chapter of a broader and deeper introduction. **Quantum computing for dummies.** The easy stuff that everybody needs and wants to know. But none of the deep stuff, no math, no Greek, no physics. Plenty of these out there, but I know I can do it better and more simply — and more correctly. **Quantum computing without quantum.** The essence of quantum computing, but without any of the physics. In the same sense that classical computing doesn’t require any knowledge of physics or how a transistor works. Even knowledge of the function of a transistor doesn’t require any knowledge of physics. **How much knowledge of the Schrödinger equation is needed for quantum computing, if any?** Sure, it applies under the hood, and certainly for simulation of physics, but for non-physics algorithms is much or even any knowledge of it needed, or is there any discernible impact on measured results at all? The time-dependent Schrödinger equation (TDSE). The time-independent Schrödinger equation (TISE). **How much knowledge of Hamiltonians is needed for quantum computing, if any?** Sure, it applies under the hood, and certainly for simulation of physics, but is it needed at all for non-physics algorithms?
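The 32–40 qubit simulability claim above follows from simple state-vector memory arithmetic. A quick sketch (assuming a full state-vector simulator storing 16 bytes per complex amplitude, i.e. double-precision complex):

```python
# Memory needed to simulate n qubits with a full state vector:
# 2**n complex amplitudes at 16 bytes each (complex128).
BYTES_PER_AMPLITUDE = 16

for n in (32, 36, 40, 44):
    gib = (2 ** n) * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
# 32 qubits fit in 64 GiB; 40 qubits already need 16 TiB,
# which is why classical simulation stalls in the low 40s.
```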
Even if some superficial knowledge might be needed in some situations, how much depth is needed? **How much knowledge of wave functions is needed for quantum computing, if any?** Sure, it applies under the hood, and certainly for simulation of physics, but is it needed at all for non-physics algorithms? **Clearly documented canonical gate sequences.** A gate sequence of raw gates can be very cryptic to decode. It would be helpful to have clearly documented gate sequences as the starting point for algorithmic building blocks. Design patterns as well. **Quantum pseudo-code.** Informal language or even simply commenting conventions which allow algorithms to be expressed as *what* they are doing rather than *how* they are doing it. High level, but detailed enough that an average developer could translate to working quantum circuits. Would be good for published papers. **9’s roadmaps.** Progress in qubit fidelity — 90%, 95%, 99%, 99.9%, 99.99%, 99.999%, etc. For each vendor and overall. Timelines. Distinguish single and two-qubit gates, or aggressively presume that two-qubit gates should be the primary metric? **Algorithmic building blocks.** Define more clearly. Many are already built into a typical classical programming model and languages. Clearly documented canonical code sequences. Pseudo-code operations. Complete algorithmic building blocks of code. Design patterns. Application-specific modules. Application frameworks. Classical code and libraries to generate quantum circuits. Need recognizable patterns to enable rapid visual inspection of circuits. General-purpose building blocks: QFT, order-finding, phase estimation — need a lot more, but what are the app-pull needs? Libraries. Toolkits. Debug tools. Metering tools. Monitoring tools. Repetition and statistical analysis of results. Code analysis. App category-specific: finance, quantum modeling, computational chemistry, molecular modeling. **Quantum information vs. quantum information storage.** Manipulation as well.
Computing, communication, sensing, networking. **What are the key features of a quantum simulation?** Physics. Chemistry. Other? **What are the key concepts of quantum mechanics — as distinct from quantum physics?** Pure quantum mechanics vs. the applications of quantum mechanics to physics. **My focus for quantum computing.** The technology and how it can be applied, but no specific applications per se. 2–5 year timeframe. Not so interested in the next year. Likely not the second year either. Not as interested in 10+ years, other than the eventual destination and universal quantum computer — hybrid with classical. 5–10 years less an interest, but maybe 7 if delays for technology which should have happened in 5 years. But… likelihood of dramatic progress in 2–5 years is a coin flip, so expectations for 2–5 years may not transpire until 5–10 years. **Classical computers are based on quantum mechanics and quantum effects too.** Yes, they are. Quantum effects within transistors — and in wires as well. All classical bits and gates are ultimately derived solely from quantum effects. Somehow, quantum effects can be converted into non-quantum macroscopic effects — through the magic of statistical aggregation. Probability plus statistical aggregation equals approximate but practical determinism. Contrast with absolute determinism. Or does an exponential limit plus quantum threshold guarantee determinism, short of hardware failure? Every classical computer is also a quantum computer, in some sense. Simply collapses all of the quantum states. But still depends on quantum state transitions between the collapses. Technically, it may be more correct to say that classical computing is based on quantum effects rather than on the logical operations of the Bloch sphere and unitary matrices on qubits.
It would be fascinating to explore the boundary between the quantum and non-quantum worlds, possibly even producing a hybrid, mixed model of computation. **Isn’t classical computing still based on quantum computing?** Or at least based on the same underlying quantum effects as quantum computing. See *Classical computers are based on quantum mechanics and quantum effects too*. **Configurable packaged solutions are the greatest opportunity for widespread adoption of quantum computing.** Prewritten code — complete applications — addressing particular niches of quantum computing applications. User supplies the input data and a variety of application-oriented configuration parameters. Prewritten classical code will generate the necessary quantum circuits needed to implement the underlying algorithms using the user’s input data and configuration parameters. Algorithms and code for such solutions must be designed and developed by very elite professional teams, well beyond the ability of even most Fortune 500 companies, who will be the target customers of such packaged solutions. No knowledge needed of the physics or math of quantum computing. No knowledge needed of the internals of the underlying algorithms of the packaged solutions. Examples — none exist, yet. D-Wave is a step in this direction, but falls short. **Roadmap for molecular modeling of chemistry.** Define a wide range of molecules, from simplest to most complex, as the milestones on the roadmap. Such as salt, caffeine, sugar, gasoline, etc. Estimate qubits, circuit depth, and coherence needed for each milestone. **What’s really going on with superposition.** How does it really work? What underlying phenomenon enables superposition? A step-by-step account of how superposition is initiated — at the raw physics level, and ended as well. A step-by-step account of what happens to the superposition during measurement. Is more energy required for superposition?
Does the energy vary based on probability amplitudes or phase difference? How is superposition controlled — specific details for lasers, microwaves, etc.? **What’s really going on with entanglement.** How does it really work? What underlying phenomenon enables entanglement? A step-by-step account of how entanglement is initiated — at the raw physics level, and ended as well. A step-by-step account of what happens to the entanglement during measurement. Is more energy required for entanglement? Does the energy vary based on probability amplitudes or phase difference? How is entanglement controlled — specific details for lasers, microwaves, etc.? **Even though squares of probability amplitudes sum to 1.0, shouldn’t raw probability amplitudes “sum” to something specific or at least balance in some way as well?** I just find it curious. Maybe some geometric relationship between the raw amplitudes? **Might a quantum computer permit capture and processing of holographic images?** Holographic cameras. Holographic displays. Holographic image representation — need full quantum state, or at least phase? “*The interference pattern recorded on the hologram is a Fourier transform of the object. When illuminated by a laser beam similar to the one which has produced it, the hologram reproduces the appearance of the object by inverse Fourier transformation.*” **Quantum image processing.** Sensors, cameras. Immediate processing — of raw quantum pixels. Structure — static images, dynamic video. Image storage. Display. Load and manipulate. Navigate image data — program control API — sequential, random access — by position, by time, search. Image characteristics, image objects, features, tagged features. UX during display — motion, paths, time jumps and rates, keyword search — tags, image characteristics, image features, image objects, match fragments of other images. Audience size — single user, two users, small group of users, larger group of users, very large and open group of users.
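One concrete handle on the amplitude "sum" question above: the only constraint the Born rule imposes on the raw amplitudes is unit norm, which is itself a geometric relationship — the amplitudes lie on the unit sphere in complex space. A minimal numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(7)

# A random single-qubit state: two complex amplitudes, normalized
amps = rng.normal(size=2) + 1j * rng.normal(size=2)
amps /= np.linalg.norm(amps)

# Squared magnitudes (probabilities) always sum to 1.0 ...
print(np.abs(amps[0]) ** 2 + np.abs(amps[1]) ** 2)   # 1.0 (within rounding)

# ... but the raw amplitudes obey no separate "sum" rule;
# unit norm <psi|psi> = 1 is the whole constraint.
print(np.isclose(np.vdot(amps, amps).real, 1.0))     # True
```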
Issues — colors, lighting, resolution — what is a quantum pixel? Quantum alternative to pixels. 3D or even 4D cells. Image quanta. What could sensors capture? Match to the sensor — could convert to other or neutral formats later. Want quantum state, which implies more info than can be classically measured. Transitions between pixels. Relation to the human brain and mind and how they process images (imagery). Stereoscopic views — two, a la human vision; more than two — enhanced beyond human vision, greater ability for machine processing, may not be fully viewable by human senses. Audio to go with images — what would quantum audio be like, in contrast with normal analog? **Measurement in other than the computational basis.** What does that really mean? What basis would that be? How to specify? What criteria would be used to judge this? Would it allow access to phase state? **Quantum SPAM is not a bad thing.** What is quantum SPAM? State Preparation And Measurement. Quantum computing requires SPAM and that’s a good thing. **Was 1994/1995 the essential watershed moment for quantum computing, at least theoretically?** 1981 was the key kickoff moment, but 1994/1995 was when fruit was borne. **Does phase affect probability amplitude and measurements?** Or does it get normalized away? **4-bit adder and multiplier.** Fully elaborate all details, all computational basis states. Give an intuitive sense of how it all works. Maybe start with the 2-bit case first for simplicity. **4-bit quantum Fourier transform.** Fully elaborate all details, all computational basis states. Give an intuitive sense of how it all works. Maybe start with the 2-bit case first for simplicity. Issue: what input data or states? **The non-destructive control and manipulation of single quantum particles.** The heart and essence of quantum computing. Elaborate on this essence.
“*The non-destructive control and manipulation of single quantum particles*” … “*strongly motivated by the prospect of exploiting these systems in order to develop new ways to process quantum information.*” Serge Haroche, 2012 Nobel Physics lecture — *Controlling photons in a box and exploring the quantum to classical boundary*. **Potential for quantum tomography to detect and measure more aspects of the quantum state of a qubit.** More than just a probabilistic collapsed wave function. **What is magnetic moment?** Or moment in general? **When will an application be appropriate for a quantum computer?** Solving a substantial, production-scale, real-world problem using a quantum computer. **Superposition, entanglement, and raising Schrödinger’s cat.** Dave Wineland — 2012 Nobel Physics prize lecture — *Superposition, entanglement, and raising Schrödinger’s cat*. Elaborate on Wineland’s perspective on the early history of quantum computing. **Basic concepts in quantum computation.** Seminal paper on quantum computing. Vintage 2000. Discuss the basic concepts — codified long before the hype set in. *Basic concepts in quantum computation* — Artur Ekert, Patrick Hayden, Hitoshi Inamori. Ekert had a 1995 paper as well. **Using quantum computing to analyze, discover, and characterize chemical reactions.** Moving well beyond ground state energies. Researchers seem too stuck or bogged down in the basics to get to full-blown chemistry simulation. Is this a lack of qubits, a lack of coherence, a lack of qubit connectivity, or a lack of theory in a form that can be directly transformed into quantum algorithms? **Is there a sense of “work” being performed during execution of quantum logic gates?** Work being a transfer of energy. If a substantial number of qubits are entangled, how would the “work” of a single gate be characterized? How much “work” is performed by a Hadamard gate that creates a superposition?
How much “work” is performed by a Hadamard gate that undoes a superposition? **What are the prerequisites for quantum computing?** Many different levels and areas of quantum computing. Range from physicists to algorithm designers to application developers to users. A master set of prerequisites and numerous defined subsets. The starting point is who has to know how much about linear algebra. And who has to know what about Hamiltonians, wave functions, and differential equations. How much of classical computer science is required? And who has to know how much about number theory (primarily for cryptography, factoring, order finding, Shor’s algorithm)? **Why is pi sprinkled everywhere in quantum computing?** Pi, two pi, and pi over 2 are most common. Gates are essentially just rotations by angles, with two pi radians in a full circle. Rotation is periodic, with two pi radians of rotation being one period. **What are the consequences in quantum computing of pi being irrational?** Since pi has an infinity of digits, what precision actually matters in quantum computing? Is there some limit, some Planck-level quantum unit, to precision of gates? How many digits of pi should we use, in general? Is 3.14 good enough? 3.1416? 3.14159? 3.1415926535? Or what? **Deep dive on bra-ket notation.** More than just the basics. What exactly goes between the | and the >? What does |x>|y> mean? What does <x|y> mean? **How much of bra-ket notation is needed by the average quantum algorithm designer and application developer?** Not much, I think. But where’s the line? **Levels of use.** Related to personas and use cases, but a simpler formulation. 1) Direct use of qubits by algorithm designers. 2) Application developers use high-level libraries which generate quantum circuits and convert qubit results to application results. 3) STEM managers and executives who have teams to design, implement, deploy, and operate quantum-based solutions.
4) Non-STEM business and operational managers and executives who rely on the results of quantum-based applications. 5) Executives of organizations which rely on quantum-based computations, such as drug discovery, material design, business process optimization, etc. — better or more efficient operations, better or more efficient financial results. 6) Customers and users who use the products and services of organizations which rely on quantum computations. **Operations (gates) per second vs. quantum states processed per second as the proper measure of throughput.** m gates on n qubits vs. m gates on 2^n quantum states. **How is phase represented physically in a typical qubit implementation?** For example, when spin is the representation of 0 and 1. **Quantum computers apply to any application which can be reduced to a physics problem.** Or any application which can be reduced to a sub-application which can in turn be reduced to a physics problem. And so on, ad infinitum. What about these sub-applications, quantum algorithm building blocks: Hadamard transform, quantum parallelism, quantum Fourier transform, order-finding — but is it really, since it can’t be completely computed by solely a quantum logic circuit? **Quantum insight from Planck’s 1901 paper.** Was there an earlier paper as well? **Will artificial general intelligence (AGI) require quantum computing?** Quantum effects. Turing u-machine — functions which are not computable by a classical Turing machine. **What can a gate-based quantum computer compute that the D-Wave system can’t?** Which applications for a gate-based quantum computer can be solved better on the D-Wave system? Classes of applications — short list w/ brief summaries. **Not ready for Quantum Ready.** Because quantum is not ready for us. Not just a need for much more advanced hardware, but algorithms and programming models as well. Algorithmic building blocks — richer set of levels of abstraction. Standardized primitive operations. Programming language.
Standardized APIs. Design patterns. Clearly explained examples — fully commented. **Framework for Implementation Specification for a Quantum Computer.** The *Implementation Specification* document for a particular model of quantum computer details all of the gory details of the hardware implementation, beyond those high-level details given in the *Principles of Operation* document for that quantum computer, including performance, electrical, physical, mechanical, and environmental details. This paper details a framework or rough template for the kinds of information to be included in an *Implementation Specification*. See the *Implementation Specification* section of *Framework for Principles of Operation for a Quantum Computer* for some preliminary thoughts. In contrast, the *Principles of Operation* document for a quantum computer details everything an algorithm designer or application developer needs to know to effectively use a particular quantum computer or family of compatible quantum computers. Beyond what is already listed in the *Implementation Specification* section of the *Framework for Principles of Operation for a Quantum Computer* paper, additional details include… Qubit hardware technology, the science and engineering. Cardinality of qubit quantum basis states: two — qubit, three — qutrit, ten — qudit, or whatever. Phase… resolution: minimum phase, maximum phase other than exact multiples of pi, minimum difference detectable between two phases, technology for implementing phase. Timing… Gate execution: single gate — any differences between gates, multiple gates. Maximum rate for large number of gates. Block size — how many gates can be sent to the quantum computer in a single macro operation. Time for 10 gates. Time for 100 gates. Time for 1,000 gates. Time for one million gates. Time for ten million gates. Time for one hundred million gates. Decoherence times — various forms? Time to reset to |0> state. Measurement time: single qubit, bulk: n qubits, all qubits. Method of entanglement. Implementation of connectivity between qubits. Details related to tomography. Pulse-level details.
- Update to *Framework for Principles of Operation for a Quantum Computer*. Specification of both principles and implementation should be in a machine-readable form which can be semantically processed by application code as well as fed through a template to generate the human-readable text of both. Various end-user formats could be supported.

**What was the first quantum computer?** Some capabilities were demonstrated in the lab, but not as a fully functional quantum computer. Maybe IBM’s 5-qubit machine was first? Maybe D-Wave, but it is more of an analog computer and a specialized machine rather than a general-purpose computer. Was there an early 2, 3, or 4-qubit quantum computer before IBM’s 5-qubit machine? What about early NMR-based efforts? **What early specialized classical computers were comparable to the D-Wave machines?** Maybe some of the relay-based machines? Most were general purpose, including in terms of arithmetic, which current quantum computers don’t have, but did not have stored programs until Manchester and EDVAC. **How real are any of the machine learning or optimization problems to date?** In terms of practical use cases that address actual real-world problems, and… can clearly be scaled to production-scale usage, and… achieve dramatic quantum advantage. **Does SWAP work for an entangled qubit?** Is entanglement preserved in the swap process and transferred between the swapped qubits? **Does SWAP work for two qubits which are entangled but not with each other?** Scenario: entangle A and B, entangle C and D, then SWAP B and C. Will A and C be entangled, and B and D? If A-B and C-D were different Bell states, what would the A-C and B-D Bell states be?
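The SWAP scenario above (entangle A-B and C-D, then SWAP B and C) can be answered with a small state-vector check. A numpy sketch, assuming for illustration that both pairs start in the same Bell state |Φ+> — in that case entanglement is simply transferred, leaving A-C and B-D each in |Φ+>:

```python
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)        # |Phi+> = (|00> + |11>)/sqrt(2)

# Four qubits A, B, C, D with A-B entangled and C-D entangled
psi = np.kron(bell, bell).reshape(2, 2, 2, 2)     # axes = (A, B, C, D)

# A SWAP gate on qubits B and C is just an exchange of those two axes
psi = psi.swapaxes(1, 2)

# Reduced density matrix of the pair (A, C): trace out B and D
m = psi.transpose(0, 2, 1, 3).reshape(4, 4)       # rows = (A, C), cols = (B, D)
rho_ac = m @ m.conj().T

# A and C are now in the pure Bell state |Phi+>: entanglement transferred
print(np.allclose(rho_ac, np.outer(bell, bell)))  # True
```

The same axis-permutation trick answers the SWAP A-C, A-D, and B-D variants; mixing different initial Bell states just permutes which Bell state each new pair ends up in.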
Similar for SWAP A and C, or SWAP A and D, or SWAP B and D. **What computational tasks are possible using quantum computing?** Not applications or application problems per se, but specific types of computations. **Where is China on quantum computing?** Have they done much of anything on quantum computing proper — rather than quantum communications? How many qubits have they produced? Recent report of achieving so-called quantum advantage, but apparently not with a general-purpose quantum computer. **Will quantum computing ever… cure cancer, solve inequality, cure poverty, cure the climate crisis, or enable nuclear fusion as an energy source?** How can we properly characterize what quantum computing can actually do, what problems it can address or solve? **How many more qubits are needed before we can do anything significant with a quantum computer?** Maybe that’s The ENIAC Moment, or maybe something short of the full ENIAC Moment. **What is the simplest quantum computer?** That can still be considered an actual, general-purpose quantum computer? Two or three qubits? Maybe at least four qubits are needed? Maybe five are needed to just prove the point convincingly. But not a single qubit alone — entanglement not possible, quantum parallelism not possible. **Use of SWAP gate to set a qubit to an arbitrary quantum state.** Generally, only incremental rotations can be applied to qubits and there is no provision to directly set a qubit to a specific quantum state. But if a sufficient number of additional qubits can be kept in reserve, they can be swapped with any qubit, resetting the qubit to |0>, which then allows an arbitrary quantum state to be set for that qubit. IBM has since added a conditional reset feature to their systems. **Deep dive on phase estimation.** What makes it work? How does phase actually get captured? Why isn’t this a primitive operation of the firmware? How many 9’s of qubit fidelity are needed to capture each bit of phase?
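The phase-capture question above can be made concrete by working the counting-register half of textbook phase estimation with plain matrices. A minimal sketch (the 3 counting qubits and the exactly-representable phase of 3/8 are assumptions chosen for illustration):

```python
import numpy as np

n = 3                       # counting qubits
N = 2 ** n
phi = 3 / 8                 # phase to estimate; exactly representable in 3 bits

# State of the counting register after the phase-kickback stage of QPE:
# (1/sqrt(N)) * sum_j exp(2*pi*i*phi*j) |j>
psi = np.exp(2j * np.pi * phi * np.arange(N)) / np.sqrt(N)

# Inverse quantum Fourier transform as an explicit N x N unitary
j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
iqft = np.exp(-2j * np.pi * j * k / N) / np.sqrt(N)

out = iqft @ psi
probs = np.abs(out) ** 2
print(np.argmax(probs) / N)   # 0.375 — the measured register encodes phi
```

When phi is not an exact multiple of 1/N, the probability instead spreads over neighboring basis states, which is exactly where fidelity and shot-count questions begin to bite.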
Need solid, real-world examples. **Fully worked out examples of phase estimation.** Four, eight, twelve, and sixteen qubits. Quantum state preparation of a qubit, quantum Fourier transform for result qubits. Maybe only four, five, or six bit results fully detailed, but examples for twelve and sixteen bits would be helpful. What is the practical limit of precision for near-term machines (estimated for a two-year horizon)? **Requirements for a high-level quantum programming language.** Levels of abstraction between raw gates and full applications. And major algorithms such as QFT, phase estimation, and order-finding as primitive operations. **Reusable components for quantum computing.** How to enable and facilitate reusability to minimize unnecessary reinvention of the wheel. **How much quantum entanglement exists in the wild?** Common or rare? How did it come into existence? By what natural processes? **How much natural quantum entanglement exists at the macroscopic scale — beyond individual atoms and molecules?** By what natural processes does macroscopic quantum entanglement occur? By what natural processes is it destroyed or dissipated? **How much quantum entanglement existed at t=0 just before the Big Bang?** What form(s) did it take? **How much quantum entanglement existed shortly after t=0 just after the Big Bang?** What forms did it take? By what processes did it occur? How common or rare was it? **How much of the quantum entanglement that existed shortly after t=0 just after the Big Bang still exists?** What forms does it take? By what processes did it occur? How common or rare is it? **How long did the quantum entanglement that existed shortly after t=0 just after the Big Bang persist?** How and why did it dissipate — by what natural processes? **What is the oldest quantum entanglement that still exists in the universe?** How common or rare is it?
By what natural processes was it created? **How distant are the quantum systems which are entangled with quantum systems we can find locally?** **How much quantum entanglement persists after a supernova?** **What happens to quantum entanglement when one or more portions of an entangled quantum system is consumed by a black hole?** **Can quantum entanglement survive a black hole?** **How can quantum information science exploit naturally-occurring quantum entanglement?** Even beyond quantum computing. **Can we identify which other qubits are entangled with a particular qubit?** For debugging purposes, it would be nice to be able to validate whether a qubit is entangled as we expected. We should be able to do this using a classical quantum simulator. Whether we can do this using tomography is a thornier question. **What next big breakthrough is needed most for quantum computing?** More qubits vs. more coherence vs. more connectivity vs. higher-level programming model vs. much more impressive algorithms … what? **Bookshelf for quantum computing.** Mostly for programmers, that is, but for others as well. What is the best intro paper and book? List of Top 100 papers. Top 10? **Is quantum computing on the verge of a breakout, or a breakdown into a quantum computing winter?** **Could a quantum computer compute pi more efficiently than classical methods?** Or would it need n qubits to compute n bits of pi, so over three trillion qubits would be needed to compute pi to one trillion decimal places? Put simply, is computing pi just phase estimation where the phase is… pi? But does that essentially mean needing the value of pi to create a phase of one pi? Or is there some way to perform a rotation of exactly one half of a full cycle without knowing pi in advance? Could an incremental approach be taken, one bit at a time? Is asin(1.0)*2 exactly pi? What is the series expansion for asin? Would quantum parallelism help at all?
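The arcsine questions above check out classically: in IEEE double precision asin(1.0)·2 equals math.pi, and the Maclaurin series for asin converges painfully slowly at x = 1, which is why it is a poor route to pi (pure-stdlib sketch; asin_series is my own illustrative helper):

```python
import math

# In IEEE double precision this equality holds: asin(1.0) is pi/2.
print(math.asin(1.0) * 2 == math.pi)  # True

# Maclaurin series: asin(x) = sum_n C(2n,n)/(4^n (2n+1)) x^(2n+1).
def asin_series(x, terms):
    total, coeff = 0.0, 1.0          # coeff = C(2n,n)/4^n, starting at n=0
    for n in range(terms):
        total += coeff * x ** (2 * n + 1) / (2 * n + 1)
        coeff *= (2 * n + 1) / (2 * n + 2)   # ratio of successive coefficients
    return total

# At x = 1 the tail shrinks only like 1/sqrt(terms):
print(2 * asin_series(1.0, 10_000))  # ≈ 3.13, still visibly short of pi
```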
Can’t give a result with more bits of precision than qubits, since a quantum computer has no I/O or access to mass storage. **Need for robust commenting conventions for quantum circuits.** The “why” rather than the “what” of circuits. Intentions and motivations. Intended purpose. Intended outcome. Pseudo-code as well. **How many probability amplitudes exist for n qubits after performing an H gate on each to put each into a superposition?** No entanglement. Is it just n * 2 probability amplitudes or 2 ^ n probability amplitudes? How exactly can n qubits be placed into 2^n quantum states? **What is a reasonable high level of abstraction for quantum computing?** Hide: raw physics, quantum mechanics, linear algebra, Hilbert space. Reversibility? Emphasize quantum parallelism. How to model superposition — amplitude and probability, entanglement, interference, measurement, rich data types — integers, fractional values, arithmetic, quantum Fourier transform, phase estimation. **Potential to implement a classical computer using only qubits.** Transistors. Memory cells. Logic gates. Flip flops. **Device characterization and modeling for each model of quantum computer.** Gordon Bell-style CPU model: K — processor, M — memory, but it gets confusing since a qubit is both memory and processing, still… **How specifically is quantum computing better than classical computing?** Quantum parallelism: exponential speedup. Trivial to generate true random numbers, rather than merely pseudo-random numbers or needing specialized hardware. Anything else? Does superposition yield any advantage other than enabling quantum parallelism? Does phase offer anything special? **Review and revise definition of quantum computer/computation.** Quantum system. Basis states. Quantum state. Product states. Computational basis states. Qubit. Superposition. Entanglement. Interference. Phase.
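The n·2 vs. 2^n question above has a concrete answer: the joint state vector after H on each of n unentangled qubits has 2^n amplitudes (all equal), even though a product state is fully described by just 2n single-qubit amplitudes. A minimal NumPy sketch of my own:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

n = 4
state = np.array([1.0])                  # build |+>^n starting from a scalar
for _ in range(n):
    state = np.kron(state, H @ np.array([1.0, 0.0]))

print(state.size)                        # 2**n = 16 amplitudes...
print(np.allclose(state, 1 / np.sqrt(2 ** n)))  # ...all equal to 1/sqrt(2^n)
```

The state vector genuinely spans 2^n basis states, but without entanglement it factors into n independent qubit pairs — which is why superposition alone, without entanglement, doesn’t deliver the full exponential resource.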
Quantum parallelism. **Is phase comparable to qumodes of Xanadu?** How does computing using qumodes compare to computing using qubit phase? **Beware of quantum algorithms which don’t discuss and fully characterize scalability.** Discuss, demonstrate, and even prove scalability, if possible. Fine to run on a small real quantum computer, but should accurately simulate on larger classical quantum simulators (up to 40 to 45 qubits). Simulation should use a realistic noise model comparable to expected real machines in a target timeframe (one year, two years, five years, seven years, ten years). Identify and discuss any limiting factors, such as phase granularity. **What might a quantum software methodology look like?** Contrast with classical software methodologies. Any special opportunities of quantum computing that can be exploited? **XML and JSON for running a quantum program.** <circuit>, <measure> — selected or all gates, but also allow <measure> as a gate, <repeat> — how many times to rerun the same program. Option to specify an initial state other than all zeroes — for all runs, for specific runs, or have <repeat> specified for each alternative initial state. Option for randomized initial state — same as H gate, other choices for randomization. May not be as useful for quantum circuits which must be dynamically generated by classical code, but maybe still useful to record any generated quantum circuit. **Quantum Amplitude Estimation (QAE).** Ala quantum phase estimation (QPE). See *Iterative Quantum Amplitude Estimation* by Grinko, Gacon, Zoufal, and Woerner. **Can probability amplitudes (or probabilities) of the basis states be determined from the Bloch sphere?** I haven’t seen any geometric representation of them, particularly for superposition.
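For a single unentangled qubit the answer to the Bloch-sphere question above is yes: the z coordinate alone encodes the measurement probabilities, P(0) = (1 + z)/2. A small NumPy check using the standard definitions (for an entangled qubit the reduced state sits strictly inside the sphere and only mixed-state probabilities survive):

```python
import numpy as np

# Arbitrary normalized single-qubit pure state a|0> + b|1>.
a, b = 0.6, 0.8 * np.exp(1j * np.pi / 5)

# Standard Bloch coordinates: expectation values of X, Y, Z.
x = 2 * (np.conj(a) * b).real
y = 2 * (np.conj(a) * b).imag
z = abs(a) ** 2 - abs(b) ** 2

# Measurement probabilities recovered from z alone:
p0 = (1 + z) / 2
p1 = (1 - z) / 2
print(round(p0, 6), round(p1, 6))  # matches |a|^2 = 0.36 and |b|^2 = 0.64
```

The x and y coordinates carry the relative phase, so the full amplitudes (up to global phase) really are geometrically visible on the sphere — for pure single-qubit states only.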
Certainly not possible for entangled qubits. **Is there any fundamental distinction between interaction of two quantum systems (two individual qubits) and observation of a qubit?** Can an interaction do something special that an observation cannot do? **What is the simplest example of quantum parallelism?** **How can quantum parallelism work for an n-bit integer if the n qubits are not entangled?** **Is quantum computer science distinct from classical computer science?** How much overlap? Or maybe quantum computer science is a module to add to classical computer science? **Quantum computing needs a distinction comparable to the one between classical boolean logic and integer arithmetic on the one hand and the physics of transistors on the other.** Classical developers need not understand the physics of transistors. Quantum developers need a comparable semantic distance from the underlying physics. **Levels of quantum computing comparable to levels of classical computing.** Classical computing hardware: electrons, transistors, basic logic gates — boolean logic, flip flops and memory cells, boolean operations, arithmetic — integer and real, bytes, byte strings, character strings, instructions and instruction sequences, testing and comparisons and branching, looping, stacks and function calls. Classical computing software: operating systems, runtime systems, libraries, I/O, file systems, databases, network services, application libraries, applications. Quantum computing: TBD. **Quantum iota.** Smallest unit of anything related to quantum computing. Including phase shifts and amplitude magnitudes. Is it a unit of energy, or… what? **What is really happening at the physics level for a conditional gate, such as CNOT?** And at the control electronics level as well. What really constitutes a “1” — especially since the quantum state could be a superposition? What must the amplitude of the |1> basis state be, again, especially since the quantum state could be a superposition?
What’s the math for alpha and beta of the target qubit? **Is self-modifying code possible on a quantum computer?** Quantum code as data, qubits as code? Currently, only on the classical side, where quantum circuits are dynamically constructed. Does it have any potential? What could you do with it? How would it work? All unknown at present, but intriguing to contemplate. **When will quantum computing start to become interesting?** 128 vs. 256 vs. 1K qubits? 100 vs. 1K vs. 10K gates? 100 vs. 250 vs. 500 vs. 1K circuit depth? (coherence) What algorithms, besides Shor’s? Only when production-scale is reached? Short of production scale, but sufficient to demonstrate that production-scale is close to being in reach? Maybe it’s The ENIAC Moment, by definition? **What does “*up to a factor*” or “*equivalent up to global phase*” really mean?** A phase shift that has no impact on the outcome of measurement? Change in amplitude that leaves probability unchanged? What exactly is the point or purpose or effect of saying it? Other phrases… Up to a relative phase shift. Up to an unobservable global phase factor. (What is a phase factor vs. a phase shift?) Up to an unimportant constant factor. Up to an unimportant global phase. Up to an unimportant global phase factor. Up to an unimportant global phase shift. Up to a global phase. Up to a global phase shift. Up to an irrelevant overall phase. Up to an irrelevant overall phase factor. Up to an irrelevant global phase. Up to a normalization factor. Up to an overall multiplicative factor. Up to an error on a single qubit of the output. Up to a constant factor. Up to a unitary transformation. “*Equivalent up to global phase*” — “*Two quantum states |ψ⟩ and |ϕ⟩ are equivalent up to global phase if |ϕ⟩ = e^iθ |ψ⟩, where θ ∈ R. The phase e^iθ will not be observed upon measurement of either state.*” — https://arxiv.org/abs/0705.0017. **What exactly is a *root of unity* and how is it used in quantum computing?** https://en.wikipedia.org/wiki/Root_of_unity http://mathworld.wolfram.com/RootofUnity.html **Is quantum computing Quantum Ready?** No, but a lot of people act as if it were. A lot of people are ready to believe that they are ready for quantum computing to be ready. **Quantum computing is not Quantum Ready.** Stating the reality, even if many people are ready to believe that they are ready for quantum computing to be ready. **What can a quantum computer actually do?** Problems. Applications. Functions. Mathematical functions. Computable functions. Tasks. Logic gates. Absolutely, and relative to what a classical computer can accomplish. **What can a quantum computer really do?** Alternative title for *What can a quantum computer actually do?* **When is quantum computing projected to be able to do anything interesting or practical?** Maybe 2–3 years. Maybe 5 years. Maybe 7 years. Maybe 10 years. Maybe 15 years. When might The ENIAC Moment arrive? **When is quantum computing forecast to be able to do anything interesting or practical?** Alternative title for *When is quantum computing projected to be able to do anything interesting or practical?* **Track progress on semiprime factorization.** Implementations and redesigns of Shor’s algorithm or alternative algorithms. Anything above 15 or 21? Limit of classical quantum simulation. Limits of current, near-term, and projected real quantum computers. 6-bit, 7-bit, 8-bit, 9–12 bits, 16-bit, 20-bit, 24-bit, 32-bit, 48-bit, 64-bit, 72-bit, 96-bit, 128-bit. The basics, even before large numbers can be factored. **Quantum aware and quantum curious.** Alternatives to Quantum Ready. **Intuition for quantum computing.** Starting with intuition for quantum mechanics. Generalizing to intuition for all of quantum information science.
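The global-vs.-relative phase distinction discussed above is easy to verify numerically: a global phase changes no measurement statistics, while a relative phase becomes visible after interference (a Hadamard). A small NumPy sketch of my own:

```python
import numpy as np

psi = np.array([0.6, 0.8])                          # a|0> + b|1>
phi_global = np.exp(1j * 1.234) * psi               # global phase e^(i*theta)
phi_relative = np.array([0.6, 0.8 * np.exp(1j * 1.234)])  # relative phase

probs = lambda s: np.abs(s) ** 2

# Global phase: measurement probabilities identical in every basis.
print(np.allclose(probs(psi), probs(phi_global)))         # True

# Relative phase: identical in the computational basis, but an
# interfering gate (Hadamard) exposes the difference.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(np.allclose(probs(H @ psi), probs(H @ phi_relative)))  # False
```

That is the whole content of “equivalent up to global phase”: no experiment, before or after any unitary, can distinguish the two states.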
Specialization for quantum computing?**Model for scalability of algorithms.**General approach and need for discussion of scalability. Demonstrating algorithms on smaller machines, with details for how the algorithm would scale to larger machines. Demonstrating algorithms on classical quantum simulators with noise models that match actual or projected real machines.**Staged model for scaling of algorithms.**Initially prove algorithm at 4 and 8 qubits using both classical quantum simulators and actual quantum hardware. Then incrementally scale in stages, testing on both classical quantum simulators and actual quantum hardware, the stages being 4 qubits, 8, 12, 16, 20, 24, 28, 32, 40, 44, 48, 56, 64, 72, 80, 96, 128, 192, 256 qubits. Prove scaling from 4 to 40 qubits using classical quantum simulators as well as actual quantum hardware. Expectation that algorithms working at one stage will scale reliably to subsequent stages. If scaling works for each stage from 4 to 40 qubits, expectation is that scaling will work for stages from 40 to 64 qubits, and beyond as higher capacity and higher fidelity hardware becomes available. At least this is the model, in theory. Whether real quantum hardware does indeed scale in this way remains to be seen. Actually, I’m reasonably confident that it*doesn’t*scale this way at present, but if we have the scaling and testing framework in place, we can test and prove advances in scalability as they become possible as the hardware progresses. The goal is to provide algorithm designers with a model for development and testing, and to provide hardware vendors with a model for evaluating their progress on the scaling fronts — both hardware and algorithms using that hardware.**Exploiting quantum parallelism.**Principles. Guidance. 
Restrictions?**Deep dive on definition of quantum parallelism.**What are all of the elements of quantum parallelism?**What are the various forms of quantum parallelism?****Thinking beyond quantum parallelism.**What are the limits of current models of quantum parallelism? What could be done for problems which exceed those limits?**Am I overthinking quantum computing?**But is that exactly what is needed — much more depth and precision on nuance?**Deep dive on noise in quantum computers.**What is it really? What are all of the types? What mitigation measures are effective? Interference from adjacent qubits. Does use of a qubit as a control degrade its state? Subtle manufacturing variations or even outright defects that are not easily detected. Variations between qubits even on the same machine. If you endlessly and repeatedly flipped the state of a qubit, would it decohere at all or any slower or faster?**Chunking as an approach to processing more data than a quantum computer can directly handle.**An essential challenge. Multiple runs with variations of seams — look for statistical balancing. Major post-processing — using a quantum computer simply to guide a classical computer to where to focus effort. Gridding as an essential tool — using a quantum computer simply to guide a classical computer to where to focus effort.**Chunking and machine learning.**How much data can a quantum computer handle in a single chunk? How much data for each complete problem to be chunked? How to handle the seams between chunks?**Limits of quantum computing.**What about problems and solutions which are of super-exponential complexity? How can complexity be reduced from super-exponential to exponential? Techniques and heuristics for complexity reduction, such as Shor reducing factoring to order-finding.**Techniques and heuristics for complexity reduction.**Such as Shor reducing factoring to order-finding.**What is an order-finding algorithm (OFA)?**Ala Shor’s factoring algorithm. 
What’s the point of order-finding? What does it accomplish? What can be done with it?**What are applications of order-finding algorithms (OFA)?**Beyond Shor’s factoring algorithm.**90/10 model for hybrid mode of quantum computing — 90% classical, 10% quantum.**90% classical: original data, data preparation, post-processing, chunking data for limited qubits and large data. 10% quantum: or maybe even less — 2–5%, or even 1%.**Remote job entry, service bureaus, and time-sharing models for quantum computing.**Cloud access. Utility computing. Processing batches of circuits. Shades of traditional mainframe batch processing and remote access.**Is a Quantum Winter coming?**Based on what criteria? What would it look like? How might it end and transition to a Quantum Spring and Quantum Summer?**Intuition for complex numbers.**Motivation. Purpose. Function. Why bother with them at all? Periodic or cyclic phenomena.**How much data can a quantum computer process?**Not really suited for Big Data per se in a classical sense. Volume. Velocity. Variety. As opposed to the number of operations per second per qubit. Measure in bits per qubit per second of input processed. Aggregate for number of qubits. Focus on quantum states per second since that’s the source for quantum advantage — 2^n for n qubits. Must take shot count or circuit repetitions into account as well.**If we advance to a computer being defined as a universal quantum computer, is there a better term for a classical computer?****What’s the difference between a quantum logic gate and a qubit?**Short piece. There’s a clear difference, but it needs to be expressed clearly for a broader audience.**Quantum computers of useful size are not yet available.**What size will generally be considered useful — 32, 48, 64, 92, 128, 192, 256, or more qubits? What algorithm should be used as the benchmark for a useful quantum computer? 
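Order-finding, identified a few items above as the heart of Shor’s algorithm, can at least be illustrated classically for tiny numbers (a brute-force stdlib sketch of my own; the quantum algorithm’s only job is to find r exponentially faster):

```python
from math import gcd

def order(a, N):
    """Smallest r > 0 with a^r ≡ 1 (mod N) — brute force.
    Shor's quantum speedup comes entirely from doing this step fast."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
r = order(a, N)                      # r = 4: 7^4 = 2401 ≡ 1 (mod 15)
# Classical post-processing: if r is even and a^(r/2) is not -1 mod N,
# then gcd(a^(r/2) ± 1, N) yields nontrivial factors.
f1 = gcd(pow(a, r // 2) - 1, N)
f2 = gcd(pow(a, r // 2) + 1, N)
print(r, f1, f2)                     # 4 3 5
```

This makes the reduction concrete: factoring is easy once the order is known; everything quantum in Shor’s algorithm is aimed at that one subroutine.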
If not full Shor’s factoring algorithm for large public encryption keys, maybe simply to factor numbers as an interesting, intensive computation, rather than focusing on cracking high-end cryptography. Or maybe Grover? No others come quickly to mind. But none of the existing common algorithms seem appropriate. **Reflections on Shor’s factoring algorithm.** Key obstacles. Opportunities. Limits. Can it ever really work for non-trivial semiprimes, such as even 128 bits, on a real quantum computer? I’ve already written on this, but a simpler summary is needed. See also: *Ingredients for Shor’s Algorithm for Cracking Strong Encryption Using a Quantum Computer*, *Some Preliminary Questions About Shor’s Algorithm for Cracking Strong Encryption Using a Quantum Computer*, and *References for Shor’s Algorithm for Cracking Strong Encryption Using a Quantum Computer*. **The materials science and physics for how qubits actually work.** What their limits are. Qubit gate timing. Minimum size, maximum density. **Does Shor’s factoring algorithm suffer from the halting problem?** Not guaranteed to finish (halt), even though probabilistically it may be very likely to finish (halt) in most cases. Roughly 50% chance that a randomly chosen integer will produce a viable order that results in a factor. And if the quantum computer doesn’t have a low enough error rate, that 50% chance could be reduced enough to make non-halting a possible issue in some cases, even if reasonably infrequent. **How can we best make progress towards a universal quantum computer which combines and fully integrates quantum processing and classical processing?** Can we expand quantum parallelism to include the full capabilities of a classical Turing machine, or not, or what subset of a Turing machine? **How likely is it that we will eventually see a Y2K-like moment for using a quantum computer to crack strong encryption?** What are the possible trajectories and timeframes?
Or is the necessary level of qubit fidelity and phase precision, even with full quantum error correction, simply going to be forever beyond our technical and technological abilities — a theoretical possibility in an ideal sense, but simply not practical in any real sense? What might be the largest integer that a quantum computer could factor with certainty in a practical sense, even if well short of thousands of bits? Maybe this could be explored in the form of a fictional novel? **Deep dive into Miller’s 1976 paper which forms the basis for Shor’s factoring algorithm.** Miller, G. L., *Riemann’s hypothesis and tests for primality*, J. Comput. System Sci., 13, (1976), pp 300–317, http://www.cs.cmu.edu/~./glmiller/Publications/Papers/Mi75.pdf, https://www.cs.cmu.edu/~glmiller/Publications/Papers/Mi76.pdf. **Quantum algorithm for calculating factorial.** Just curious. Can a quantum computer actually do it? How exactly? No looping available. And can it do it faster than a classical computer? **Can a quantum algorithm compute a transcendental function?** These normally require an infinite series expansion on a classical computer. No looping available. Sine, cosine, exponential, logarithm. **Can a quantum computer compute a simple exponential?** X to the Y power. Where X and Y might be either integer or real numbers. No looping available. **Can a quantum computer compute compound interest?** Requires a simple exponential. No looping available. **Transition from each new machine being relatively unique to a long and broad range of a family with a compatible architecture.** Ala IBM System 360 or Intel x86. Still way out in the future — 1964 vs. 1944. But maybe 20 years can be compressed to 5, 7, or 10 years. Currently, each new machine is either very unique or an incremental enhancement over its predecessors, but with no sense of an overall, long-term compatible family. **Prospects for QRAM — Quantum Random Access Memory.** See section in *What Is a Universal Quantum Computer?* Interleave bits for connectivity — “registers”. Adjacent DRAM columns — distinct memory range. Model memory organization as “registers”. Parallel bit connectivity. Standard register size? 128 qubits? Need 8192 for Shor’s on 4096-bit keys. Maybe multiple 128-qubit sub-registers. But if we need to string them together anyway, maybe 16-qubit or 8-qubit is fine, if there is no cost to stringing n sub-registers together. No-cloning theorem — cannot copy “memory”; well, you can copy, but copying implicitly measures and collapses the quantum state. Support a “swap” operation more efficiently in raw hardware. **Does Shor’s factoring algorithm break bitcoin or blockchain?** No — one-way hash? Ahhh… mining might get “broken” (made too easy) by being accelerated? **Is there any merit to executing multiple copies of the same quantum circuit in parallel on the same quantum processor?** Or is that no better than serially executing the same circuit? For example, if the quantum processor has 53 qubits but a quantum volume of only 64 to 256 (6 to 8 qubits), why not execute six copies in parallel — and compare the results? Any advantage to running those six copies serially instead? Maybe crosstalk and interference between qubits. Can the six (or five or four) sets of 6–8 qubits be physically isolated sufficiently? **What exactly does quantum entanglement accomplish?** Not what it is or how it works, but what logical function can it perform for a user? **How are Bell states actually created?** Not the gate sequence, but what actually happens under the hood in terms of quantum mechanics, and what actually makes the states persist and behave the way they do. In plain language, not the math per se. Ditto for GHZ and W states. **Bell’s inequality explained.** In plain language, not the math or quantum mechanics per se. What makes it work as it does? How exactly does it offer a benefit to quantum computing? **Hilbert space explained.** In plain language, not the math or quantum mechanics per se. What makes it work as it does?
How exactly does it offer a benefit to quantum computing? **Complex numbers explained.** In plain language, not the math or quantum mechanics per se. What makes them work as they do? How exactly do they offer a benefit to quantum computing? **Paul Benioff’s seminal contributions to quantum computing.** Such as *Quantum Mechanical Models of Turing Machines That Dissipate No Energy* (or here). Wikipedia page for Paul Benioff. Connect to Feynman and others active in the early 1980s. How much of Benioff’s influence is seen in current quantum computers? How much of Benioff’s influence might be seen in future quantum computers? **How to translate real-world non-physics problems into physics problems that can be solved on a quantum computer?** Translating problems to the physics of quantum effects. **Quantum computations are comparable to algebraic expressions.** A limited subset of what a classical Turing machine can compute. No conditional branching, looping, function calls, mass storage, or I/O. And missing integer and real numbers at that. **Risk of bulking up quantum computing infrastructure without a coherent high-level programming model.** **Quantum storage.** Roughly comparable to classical mass storage, but what is it really? What does it mean to have long-term storage of superpositions and entangled qubits? Alternatively, is it more like classical main memory? Or maybe more like modern flash storage? **Intellectual property in quantum computing.** Who has what patents — IBM, Microsoft, Intel, Google, Rigetti? How important are the patents? How freely and cheaply are patents licensed? Who has what copyrights — code, algorithms, books, papers, training materials? Benefits and roles of open source — pure free licensing, semi-limited licensing. **Criteria for when quantum computing is appropriate for financial applications.** Generally, financial applications, especially transactions, must be to-the-penny accurate. Generally, they must be deterministic.
And generally must complete or not, can’t be maybe it completed. Generally need audit trails. Generally records must remain intact indefinitely, can’t dissipate unexpectedly or… ever. So, where does quantum fit into finance?**What types of problems do not require determinism (deterministic results) and which do?**Good enough even if not the most optimal — optimization, selection. Learning — is it really okay to learn the wrong lessons or to learn part of the material? Consumer interests — no clear deterministic solution anyway. Problem diagnosis — want options, possibilities which can then be further evaluated. List of possibilities vs. single solution which has some probability of being wrong. Financial applications which require strict deterministic results. Immediate results vs. storage and persistence of results.**Photonic quantum computing.**How does it really work? How is it different from non-photonic quantum computing? How are gate operations performed if photons do not interact (since they obey BES — Bose-Einstein Statistics)?**Is photonic computing the only truly viable path to large-scale general purpose quantum computing?**Room temperature operation. Simple, reliable fabrication. Resistance to errors and noise. Application to Turing machines as well — one technology for both. Lower power — much lower power.**Where are all of the 40-qubit algorithms?**Toy algorithms (using less than 28 qubits) are great to prove that basic hardware features of a quantum computer function properly, but they don’t even come close to demonstrating scalability or potential for dramatic quantum advantage. We need to see algorithms using at least 32 to 44 qubits in a single Hadamard transform for quantum parallelism to show that we are on the cusp of dramatic quantum advantage. Show algorithms that scale cleanly and smoothly from 24 to 28 to 32 to 40 to 44 to prove that scaling to 48 and 50 and 56 and 64 and 72 are credibly (and provably) within reach. 
Simulating algorithms for 32 to 44 qubits on a classical quantum simulator should be practically doable at this time, so… why aren’t we seeing algorithms in that range? Counterpoint: No real need for 40-qubit algorithms until qubit fidelity, connectivity, and phase precision are sufficient for 40-qubit quantum phase estimation (QPE) or quantum Fourier transform (QFT). **What is the optimal classical computer architecture for simulating a quantum computer?** Presumably the key issue is dealing with exponential states — 2^n states for n qubits. Multiple cores per system — how many might be optimal? Would a much larger number of much simpler processor cores give the most bang for the buck? Which processor operations need to be optimized the most to achieve both greater performance and greater capacity? Optimal encoding of individual quantum states. Potential for a custom hardware accelerator for encoding and decoding quantum states in memory storage. Using only main memory for quantum states vs. mass storage. Optimal size of memory and mass storage per processor core. Massive distributed systems vs. largest local area networked systems. How many systems before network overhead becomes too onerous? How many quantum states could realistically be achieved? 2⁴⁵, 2⁵⁰, 2⁵⁵, 2⁶⁰, 2⁶⁴, 2⁷²? **Service level agreements (SLA) for quantum computing.** How frequently jobs must be run — explicit timing vs. delay between runs. Repetitions required for each job. Permissible tolerance on timing and frequency. Expected duration of jobs. Resource requirements per job. Audit trails for runs.
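The exponential-state bottleneck raised above for classical simulators reduces to simple arithmetic: a dense double-precision statevector needs one complex amplitude (16 bytes) per basis state, i.e. 16·2^n bytes for n qubits — ignoring sparse, distributed, or tensor-network simulators that can sometimes do better:

```python
# Memory for a dense double-precision statevector: 16 bytes per amplitude.
for n in (30, 40, 45, 50):
    gib = 16 * 2 ** n / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
# 30 qubits fit on a laptop (16 GiB); 40 need ~16 TiB; 45 need ~512 TiB;
# 50 need ~16 PiB — which is why ~45 qubits is the oft-quoted practical ceiling.
```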
Capturing and archiving results.**Need for a critical mass of powerful tools — ala C and UNIX.**Algorithm designers and application developers need to feel productive, that they can rapidly go from a raw glimmer of an idea to a working implementation.**Need for a lone physicist with a single great idea.**Achieving The ENIAC Moment for quantum computing might be less about some big corporate push with a large team, and more about a solitary elite scientist or engineer identifying a kernel of a simple but great idea and pushing forward to a breakthrough implementation that dramatically blows away everyone and everything else. Possibly even developing new hardware, new tools, and a new programming model focused on achieving great results for that single great idea. Minimal resources, but maximal impact. Even if that one implementation is specialized for that one idea, it could provide the foundation for generalizing to a much broader class of problems. Are there any young Feynmans, Fermis, or Oppenheimers out there poised for the challenge?**No, we’re not yet on the cusp of… anything in quantum computing.**The only thing immediately in front of us in quantum computing is a lot more basic research, hardware, software, tools, algorithms, everything, all of it. 
Anything else is just an illusion. **Quantum computing: Magic elixir or… snake oil?** Sorry, but right now there’s a lot more snake oil than magic elixir. **What is the physical process to flip the spin of a qubit?** Or even to shift the probability amplitude between the two spins. **The Mantra: Quantum computing must deliver substantial and dramatic enterprise value far beyond what classical computing can deliver.** In the form of quantum advantage — *dramatic* quantum advantage. **Can quantum computing address these applications?** List application areas which are compute-intensive, but where it may not be obvious whether or how quantum computing can deliver a substantial and dramatic quantum advantage over classical computing solutions. Such as a large amount of input data or a large amount of complex logic which may not be reducible to realistic quantum circuits. Or real-time sensor processing. Or generation of large amounts of output data. **Potential for eventual quantum exhaustion.** Frustration and lack of satisfaction despite best efforts. Too many people, too many projects, too many managers, too many executives, and too many organizations get too tired of too many projects with too few results to show for all of the expenditure of time, energy, money, and attention drawn from alternative investments. **How much data can you put into a quantum computer?** Input data must be represented in the gate structure of the quantum circuit. Intermediate results can be computed on a colossal scale using quantum parallelism. But the real bottom line is that only a very modest amount of data can be input directly into a quantum computer. **Operating systems for quantum computers.** Currently, there is no need for an operating system on a quantum processing unit (QPU), but the overall quantum computing system certainly has an operating system for communication via a network, running jobs, and returning results. As well as support functions such as calibration and tuning.
Unclear what other operating system-type functions current quantum computing systems perform.**Aren’t all quantum computers photonic?**Using either microwaves or lasers for controlling qubits. Microwave resonators used to control, connect and read superconducting transmon qubits. Lasers used to control trapped ion qubits. Lasers used to control diamond nitrogen vacancy center qubits. But… the qubits themselves are not photons.**How large and complex a molecule would have to be simulated to have a dramatic impact on chemists?**True, dramatic quantum advantage, if not outright quantum supremacy.**Preliminary thoughts on a roadmap for quantum computational chemistry.**Range or spectrum of sizes and complexity of molecules and reactions. Qubit fidelity (nines) needed for various milestones. At what stage would full-blown quantum error correction be needed? How many stages are feasible using only near-perfect qubits, for a range of nines of qubit fidelity? Triage stages according to qubit fidelity requirements: relatively low nines, medium to high nines, very high nines, perfect logical qubits. What’s the largest and most complex molecule which can be simulated using classical computing? What’s the smallest and least complex molecule which is beyond the range of any classical computer?**How long will the quantum honeymoon last?**So much of the abundant buzz about quantum computing is exciting, thrilling, and outright exhilarating, but how long will the honeymoon last before reality sets in? Another two years? Any chance at all that it will last more than another five years?**Open source should be the rule for quantum computing.**Open everything. Okay to charge for consulting services and premium or production-scale access to hardware, but all code, algorithms, libraries, papers, tutorials, documentation, and training materials should be strictly open source and freely available online. 
Free access to cloud-based hardware for education, training, and experimentation.

**Potential for quantum sensors and quantum effectors.** Super-fine sensing and robotics. Integration of quantum sensing and quantum computing.

**Levels of Quantum Ready.** Not one-size-fits-all. Vary by persona and use case.

**Quantum folklore.** The good, the bad, the great, and the ugly. How much of it is actually beneficial? How much is innocent, and possibly even helpful?

**Quantum pearls.** Concise nuggets of wisdom and knowledge about quantum computing, quantum information science, and quantum mechanics which convey a lot with minimal effort.

**The bleeding edge of quantum computing.** Tasks which require excessive effort and frequently don't work out. Beyond the leading edge. Some people and projects desperately need the potential benefits of the bleeding edge. Many or most people and projects are best advised to stay well clear of the bleeding edge and stick to the leading edge, unless absolutely necessary.

**How best to frame quantum computing.** Especially for different audiences or personas. Will evolve over time as the technology, use cases, and access patterns evolve.

**QISC — CISC vs. RISC for quantum computing.** QISC = Quantum Instruction Set Computer. How simple or complex should quantum operations be? QFT as a single operation?

**Data modeling for quantum computing.** Generally, how to think about data and computational results for quantum computing. How to organize input data. How to organize and process result data. How to organize complex data sets for quantum computing. Dealing with data types. Working with shot count or circuit repetitions and distributions of results and development of expectation values.

**Working with real numbers on a quantum computer.** When all you have are raw qubits. How to process real numbers on a quantum computer.
How to get real numbers for the results of quantum computations.

**Risk of a quantum age of false enlightenment.** Risk of confusion about real and useful results. Hype. Dubious promises. Distraction from deep science and addressing real-world problems. A lot of movement and expenditure of energy and resources, but little durable and meaningful progress.

**Waiting for the day when a world-class quantum computer can be built using only off-the-shelf technology — no custom hardware.** When many more companies can rapidly introduce quantum computers. Waiting for the quantum equivalent of an Intel microprocessor. Waiting for the MITS Altair of quantum computing.

**How can quantum computing be dramatically simplified?** Need a higher-order paradigm and conceptual framework. Fewer but more powerful concepts. How best to exploit quantum parallelism. A higher-order programming model and algorithmic building blocks, and a programming language to go with them.

**Does measurement of a single qubit cause full collapse of all entangled qubits participating in the wave function in which the measured qubit is participating?** The full semantics of measurement and collapse need to be more fully and accurately articulated. Do all of the qubits participating in the wave function collapse to exactly pure states — with an absolute guarantee of no mixed states? Or is it just the qubit(s) being measured? Should the order in which entangled qubits are measured matter? At least for Bell states and quantum communication, it seems as though both qubits collapse at the same time, but this needs to be articulated more clearly.
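As a purely classical sketch of the measurement semantics discussed in the topic above (the standard state-vector bookkeeping, not the underlying physics), here is a toy simulation showing that measuring one qubit of a Bell state collapses the other qubit as well. The helper names are my own, and real-valued amplitudes are assumed for simplicity:

```python
import random
from math import sqrt

def measure_qubit(amplitudes, qubit, n_qubits, rng=random.random):
    # Measure one qubit of an n-qubit state vector (a list of 2^n real
    # amplitudes, with qubit 0 as the most significant bit). Returns the
    # outcome and the collapsed, renormalized state: every amplitude
    # inconsistent with the outcome is zeroed.
    bit = n_qubits - 1 - qubit
    p1 = sum(a * a for i, a in enumerate(amplitudes) if (i >> bit) & 1)
    outcome = 1 if rng() < p1 else 0
    norm = sqrt(p1 if outcome else 1.0 - p1)
    return outcome, [a / norm if ((i >> bit) & 1) == outcome else 0.0
                     for i, a in enumerate(amplitudes)]

# Bell state (|00> + |11>) / sqrt(2): measuring qubit 0 collapses qubit 1 too.
bell = [1 / sqrt(2), 0.0, 0.0, 1 / sqrt(2)]
outcome, state = measure_qubit(bell, 0, 2)
print(outcome, state)  # either 0 with state |00>, or 1 with state |11>
```

In this bookkeeping, both qubits end up in pure basis states no matter which qubit is measured first, matching the "both qubits collapse at the same time" intuition for Bell states; whether that bookkeeping faithfully reflects the physics is exactly the open question of the topic.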
Curious about GHZ and W states.

**Does measurement of a qubit guarantee collapse to an absolutely pure state?** Are all other qubits participating in the same wave function also guaranteed to collapse to exact pure states rather than mixed states?

**How stable are pure states for qubits?** Are pure states any more or less stable than mixed states?

**What is the Chinese remainder theorem and how can it be used in quantum computing?** Does Shor's factoring algorithm directly or indirectly rely on it?

**What is the largest semiprime number which a quantum computer can reasonably be expected to factor within the next 2–3 years?** Or are the precision and fidelity requirements for the quantum Fourier transform still too great for near-term machines? Can we achieve enough nines of qubit fidelity for near-perfect qubits, or are even more nines or full quantum error correction required?

**Can the classical outer loop of Shor's factoring algorithm be fully implemented as a quantum circuit?** But it is essentially a loop. Question of the halting problem. Need for repetitions of the quantum inner circuit (order finding). How to implement GCD as a quantum circuit. Shor's paper seems to imply that the tail end of order finding (deducing r from s/r) can be implemented as a quantum circuit (even though the paper presumes a classical implementation), but that section seems like too much of a hand wave.

**How does an X gate actually change a basis state?** From |0> to |1> or vice versa.

**How does an H gate actually create a second basis state?** Not the matrix math, but at the physical, phenomenological level. And how does it give the two basis states equal probability amplitudes?
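The X and H gate topics above ask about the physical process, not the math, but for reference here is the matrix-level bookkeeping in a few lines of plain Python (a sketch only, with amplitudes kept as plain lists and no quantum library assumed):

```python
from math import sqrt

def apply(gate, state):
    # Multiply a 2x2 gate matrix by a one-qubit state [a0, a1],
    # where a0 and a1 are the amplitudes of |0> and |1>.
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

X = [[0, 1],
     [1, 0]]                        # bit flip: swaps the |0> and |1> amplitudes
H = [[1 / sqrt(2), 1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]   # Hadamard: equal-amplitude superposition

ket0 = [1.0, 0.0]                   # |0>
print(apply(X, ket0))               # [0.0, 1.0] -- the |1> state
print(apply(H, ket0))               # [0.707..., 0.707...] -- both basis states
print(apply(H, apply(H, ket0)))     # back to ~[1.0, 0.0] -- H undoes itself,
                                    # "destroying" the second basis state
```

The last line is the matrix-level answer to how H can also destroy a basis state: H is its own inverse, so applying it to an equal superposition returns a single basis state.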
And how does it do the reverse — destroy one of the two basis states?

**What are Ising coupling and Ising gates?** And how can they be used in quantum computing?

**IBM 709 moment for quantum computing.** The ENIAC Moment was the first significant demonstration of a realistic application on a classical computer, but the IBM 709 was the advent of beefy and capable hardware that enabled more substantial applications, commercial applications — and in fact enabled The FORTRAN Moment.

**What aspects of quantum computing are in fact truly deterministic?** Are there any? Or is everything quantum inherently probabilistic and uncertain? Is entanglement partially deterministic in the sense of the relationships between states, such as Bell states — always the same or always opposite? Unitarity — the sum of probabilities is 1.0 — is deterministic, in some sense.

**Will breaking the 100-qubit barrier be a watershed moment for quantum computing?** Possibly enabling dramatic quantum advantage — enough extra qubits to find ways to arrange qubits to achieve 50-qubit computations for interesting, practical applications? Flip a coin whether this moment coincides with enough nines of qubit fidelity to constitute near-perfect qubits.

**What might the Lunatic Fringe do next with a quantum computer?** No telling what those guys will think of next!

**General intuition and guidance for how to solve a real-world problem on a quantum computer.** No clarity whatsoever, at present. The best I can say currently is to convert your real-world problem to a physics problem, but even that is not terribly helpful — even for solving physics problems. In any case, this is an urgent need.

**The distinction between a quantum calculation and a quantum computation.** If any. A calculation is limited to a basic algebraic formula. A computation can include logic, control flow, conditional execution, etc. Where does entanglement fit in?
How does quantum parallelism fit in — a single computation of many calculations?

**How to get past the Lunatic Fringe for quantum computing?** The first step is for the lunatic fringe to eventually deliver a whole series of production-scale solutions to real-world problems. We're not even close to that precondition yet. Configurable packaged solutions and The FORTRAN Moment may be the final, ultimate answer(s). See *When Will Quantum Computing Be Ready to Move Beyond the Lunatic Fringe?*

**Can it ever be reasonable for a quantum computation to presume or require two probability amplitudes to be absolutely identical?** Given the inherent uncertainty of quantum mechanics, I presume the answer is or should be no, but… maybe there can or should be the notion of *close enough*, within some delta window. Should it be considered a *bug* if a quantum circuit or classical code examining quantum results appears to depend on absolute equality?

**Might it be helpful for a quantum algorithm designer or application developer to know how many wave functions their algorithm requires?** An indication of complexity. Maybe an indication of a problem if the count is too high or too low. Show a time series of wave function count as circuit execution progresses — where does it peak, how does it end? A single metric would be the peak count of wave functions.

**If simulating physics, does it ever really make sense for the circuit to always start in the ground state?** After all, in the real world, there is always something going on before an experiment is run. Maybe the initial state can or should be constrained to a relatively narrow range of state values, but not an absolutely pure ground state. Ditto for chemistry and chemical reactions — always some relatively noisy background. That said, for purposes of debugging anomalous cases, it can be helpful to capture and initialize to some known noisy state, to reproduce problematic test cases.
Or, maybe classical code should always do the selection and recording of a particular initial state, even if intending it to be random.

**Extreme computing — just how extreme is quantum computing?** How to put computing performance and capacity in perspective. Other metrics?

**When exactly can we legitimately say that quantum computing is at the dawn of a new era?** What specific criteria must quantum computing meet? Right now, we're still before the dawn's early light.

**How exactly is a qubit different from a bit?** All of the details, not the traditional brief hand wave. Gate vs. information. Probability. Phase. Basis states. Superposition. Entanglement and product states. Measurement and wave functions. No-cloning theorem. No fan-out. No fan-in. Correction?

**How real is quantum computing?** Beyond current toy algorithms and applications, how likely is quantum advantage for production-scale applications?

**Requirements for a mature quantum industry.** Would it be a distinct industry or sector, or just dissolve into the existing computing industry/sector? Well past The FORTRAN Moment. Transcend all of the hype.

**The siren song of quantum computing: The lure and promise of quantum parallelism.** Making the promise a reality.

**Is everything a quantum computer; is God a quantum computer?** How can something not be a quantum computer?

**Is quantum computing an emergent phenomenon?** Is it a natural consequence of the quantum effects of everything?

**What are the 50 to 100 concepts needed to comprehend quantum computing?**

**What do senior executives need to know about quantum computing?** Right now, just that it is a research area — fund much more research before expecting that it can be deployed to solve real-world problems.
Less about the raw technology and what's under the hood, and more about applications, benefits, and limitations.

**Quantum computing for senior executives.** See *What do senior executives need to know about quantum computing?*

**Can you really understand quantum computing without a deep comprehension of the firmware and control logic?** How exactly are unitary matrices turned into control of qubits? Full and transparent access. What's really going on. What are the real limitations?

**How exactly are superposition and entanglement achieved?** Not the matrix math, but the underlying physics and electronic and photonic control.

**When will off-the-shelf commodity qubits be commonly available?** A la the PC industry with microprocessors and memory chips. When will anybody be able to build a quantum computer without having to design and build qubits?

**Fatal flaw of quantum computing: Intermediate results are unobservable.** Makes debugging problematic. Makes explainability problematic. Makes audit trails problematic. Makes verification problematic. Makes testing problematic. But see the *Trick for debugging on a real quantum computer* topic elsewhere in this document.

**What mindset is needed to excel at quantum computing?** How to analyze real-world problems. How to couch problems in quantum terms. How to architect and synthesize quantum solution approaches. How to design quantum algorithms. How to develop applications which use quantum algorithms.

**Can AI be applied to simplifying quantum computing?** Parsing problem statements and recasting in quantum terms. Or is there some key, magical insight and intuition needed, some tacit knowledge, which cannot readily be encoded into an AI algorithm? In theory, there shouldn't be, but… maybe that simply means that AI has a lot further to go.

**Is co-design of quantum computer hardware and quantum algorithms a good thing or a bad thing?** Maybe just a sign of how immature both quantum hardware and quantum algorithms are.
Maybe a needed stepping stone until both hardware and algorithms become much more mature. But eventually, algorithm design should be hardware-independent, and hardware design should be independent of the algorithms and applications which will run on that hardware. Of course, there will be occasional, rare exceptions, such as when dramatic advances are made in either hardware or algorithms which impact the other, hopefully in a very positive manner.

**Characterize and quantify the computational leverage of each instance of quantum parallelism.** n qubits in a single Hadamard transform = 2^n copies of a computation executed in parallel. But then you need to reduce that *gross leverage* by however many circuit repetitions may be needed to overcome noise and the probabilistic nature of quantum computing. A single algorithm or circuit may have multiple parallel computations embedded within it, in series, or in parallel, or both.

**Fractional quantum advantage or partial quantum advantage or degrees of quantum advantage.** There is no generally accepted numeric metric for how much of a performance advantage of a quantum solution over a classical solution constitutes *true quantum advantage*, but I generally refer to *dramatic quantum advantage* to emphasize that the advantage is much more than a relatively minor advantage. Generally, I would say that *true, dramatic quantum advantage* is more than a few *orders of magnitude* (powers of ten) of advantage. Thousands or millions or more would clearly be a dramatic advantage. If one can match or exceed the advantage of a quantum solution simply by adding ten, a hundred, or even 1,000 classical machines, that's *not* what we mean by a true, dramatic quantum advantage, since those are tasks easily accomplished by any competent IT staff today — no quantum computer required.
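A minimal sketch of the arithmetic behind the two topics above: net leverage as the gross 2^n leverage divided by the shot count, and fractional advantage as a measured speedup divided by a chosen full-advantage target (one million is used here, one of the candidate targets; the function names and the particular normalization are my own illustration, not an established metric):

```python
def net_leverage(n_qubits, shots):
    # Gross leverage of one n-qubit Hadamard transform is 2^n parallel
    # evaluations; net leverage divides by the circuit repetitions
    # (shots) needed to overcome noise and probabilistic results.
    return (2 ** n_qubits) / shots

def fractional_advantage(speedup, target=1_000_000):
    # The fraction of a chosen "full advantage" target that a measured
    # speedup over a single classical machine actually delivers.
    return speedup / target

print(net_leverage(20, 1000))          # 1048.576
print(fractional_advantage(500_000))   # 0.5
print(fractional_advantage(1_000))     # 0.001
```

So a 20-qubit Hadamard transform needing 1,000 shots nets only about a thousandfold leverage despite a million-way gross parallelism, which is the kind of accounting the topic is asking for.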
That said, even a solution which is only ten or twenty times faster, or even four or five times faster, is still interesting in many niches, but I would call such quantum solutions *fractional quantum advantage* or *partial quantum advantage* since they aren't delivering the full, promised potential of quantum computing. I don't have particular metrics in mind, but one million and ten thousand are two candidates, among others. So if a quantum solution could do the work of 500,000 (or 5,000) classical computers, that could be considered a fractional quantum advantage of 0.5. Or, doing the work of 1,000 (or 10) classical computers could be considered a fractional quantum advantage of 0.001. Alternatively, consider degrees of quantum advantage: 1) below classical, 2) near parity with classical, 3) modestly better than classical, 4) better than even a massively parallel and distributed classical solution, 5) degrees of parallel/distributed — 2x, 4x, 8x, 16x, 64x, 256x, 1024x, 4Kx, 16Kx, 64Kx, 256Kx or 10x, 25x, 50x, 100x, 500x, 1Kx, 10Kx, 25Kx, 50Kx, 250Kx, 6) moderately better than classical, 7) well above classical, and 8) amazingly above classical — quantum supremacy — classical can't even get the job done with any realistic level of resources and time.

**Degrees of quantum advantage.** See *Fractional quantum advantage or partial quantum advantage.*

**What are global properties and how are they beneficial in quantum computing?**

**Need for domain-specific algorithmic building blocks.** Provide dramatic intellectual leverage compared to more general-purpose algorithmic building blocks. Much higher level of abstraction.

**Bespoke algorithms vs. reusable algorithms.** Tradeoff between algorithms custom-designed and tailored for particular applications, and general-purpose algorithms usable across many applications. Advantages and disadvantages to both.
Bespoke algorithms can be more efficient, run faster, and use fewer resources, but require dramatically more intellectual effort, and can be much more difficult to maintain. General-purpose algorithms can be slower and use more resources, but are much easier to understand and use. Generalities, but many exceptions.

**Challenges of modeling complex systems for quantum computing.** Other than physics and chemistry, which are naturally modeled by quantum mechanics. Telling people to simply model their problem as a physics problem is too simplistic and not so helpful for non-physicists and non-chemists.

**GIGO — Garbage In, Garbage Out — How will your quantum algorithm or application behave if the input data is problematic?** Detecting errors. Rules for good, clean data. Issues with quantum circuits. Validate data before generating a quantum circuit.

**Are there any aspects of quantum computing which cannot be adequately simulated on a classical computer?** Other than performance and capacity if too many qubits (>40–55).

**Checklist for documentation and quality assurance (QA) of quantum algorithms.**

**What does it mean to be quantum?** In general, but simplified. See also: *What Are Quantum Effects and How Do They Enable Quantum Information Science?*

**Necessity is the mother of invention: Let application requirements drive the design of the quantum computer.** The designs of all of the early classical computers were driven by the intense needs of particular high-value applications. Bell Labs, Harvard, Zuse, Atanasoff, Colossus, ENIAC, EDVAC, EDSAC, SEAC, Whirlwind, MANIAC, SAGE, et al. But somehow there was still an intense interest in building a fairly general-purpose machine.

**Is quantum computing still inherently binary?** Granted, there are qutrits and qudits, but qubits are still modeled around bits — binary digits. Phase is not binary; it is more of a continuous, analog value, in theory.
A rather strange hybrid.

**Is the concept of a bit still central to quantum computing?** And quantum information science in general?

**Does phase make a qubit a miniature analog computer?**

**My pet peeves about quantum computing.** So many! Where to start?!

**Quantum computing media literacy.** The many pitfalls. The rampant hype. Unwarranted promises. Outright false claims. Literacy for readers. Literacy for nontechnical writers. Literacy for non-quantum writers. Literacy for quantum writers! Watch the jargon!

**Quantum manifesto.** This emergent field deserves one. Not sure what it should say, or what the central focus should be.

**How might the concept of governance apply to quantum computing?** As in *data governance*.

**Issues with the talent pool for quantum computing.** For example, comments by Google's Hartmut Neven at CSIS on January 29, 2020: *American Innovation in the Quantum Future*.

**How exactly can quantum computing compete with Monte Carlo simulation?** Where approximate solutions are acceptable. Potential for a hybrid?

**Computing for gravity-like problems.**

**Goal of quantum computing is breakthrough solutions, not cost savings or mere optimization.** Goal is to do what can't be done today or can't be done in a required timeframe, not merely to do it somewhat faster.

**What are the most problematic aspects of quantum computing today?** Which are preventing production-scale applications which deliver substantial business value not achievable using classical computers. Capacity and fidelity are the first two.

**Considerations for composing a quantum computing application team.** Criteria. Roles. Process. Budget. QA. Planning. Deployment. Customer support. Sales support.

**Debunking quantum hype.** Where to start! Never-ending!
Not much left?

**What might a BASIC language for quantum computing look like?** What types of problems can be solved very easily?

**Quantum computing needs a high-level algebra.** Classical computing directly paralleled algebraic expressions and simple logic. What might a high-level algebra for quantum computing look like? It certainly isn't linear algebra. Needs to relate to the terms of application domains, in the same way that application concepts related to traditional algebra, but… in a completely different way.

**Need for multiple ENIAC Moments, one for each major application category.** The needs of the various application categories are too different for a solution in one to automatically translate into a solution in any of the others. But maybe the others follow in fairly rapid succession from the first one, which blazes the trail.

**No need to start learning about quantum computing until 40-qubit algorithms are the commonplace norm.** Scalability is one of the more important and urgent early lessons — it's okay to code and test an 8 to 12-qubit algorithm, but you must not presume that it works until you scale it up to 32 to 48 qubits. And this is just the starting point — quantum computing won't be in full swing until 64 to 80-qubit algorithms are the commonplace norm. IOW, dramatic quantum advantage is the norm.

**Computable functions for quantum computing.** What exactly can a quantum computer compute or not compute, in the formal mathematical sense? Presumably quantum computers can compute more than classical computers, but how can that difference be formalized? All operations of a Turing machine are computable in that mathematical sense, but operations of a Turing u-machine are not required to be computable. True random numbers are not computable in the classical sense of a classical Turing machine, but can readily be computed by a quantum computer.

**How an LC circuit works.** L = inductor, C = capacitor. A resonant circuit.
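For the LC circuit topic above, the one formula worth keeping at hand is the resonant frequency, f = 1/(2π√(LC)). A quick sanity check in plain Python; the component values below are merely illustrative round numbers in the general range used for superconducting circuits, not taken from any particular transmon design:

```python
from math import pi, sqrt

def lc_resonant_frequency_hz(L_henries, C_farads):
    # Resonant frequency of an ideal LC circuit: f = 1 / (2 * pi * sqrt(L * C)).
    return 1.0 / (2.0 * pi * sqrt(L_henries * C_farads))

# Illustrative values (assumed, not from a specific device): 10 nH and 50 fF.
f = lc_resonant_frequency_hz(10e-9, 50e-15)
print(f"{f / 1e9:.2f} GHz")  # 7.12 GHz, in the microwave band
```

This is why superconducting qubits are controlled with microwave pulses: nanohenry and femtofarad scale components resonate at a few gigahertz.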
The essence of how a superconducting transmon qubit works. How a quantum state is created, controlled, and maintained.

**Hurry up and wait — for quantum computing.** What's the hurry? The wait will be quite long.

**Quantum computing is much ado about… very little, so far.** Quantum computing has a bright future and great promise, but not yet: not in the here and now, not today, not soon.

**Quantum computing remains a great big science sandbox project.** Still a research topic. Still a laboratory curiosity.

**Is phase the Achilles heel of quantum computing?** Continuous value, but with unclear precision or sense of determinism. An intense need for clarity — too vague at the moment.

**What's common across all quantum computing application categories?** Opportunities for sharing and reuse vs. need for unique and specialized approaches.

**What is the largest problem solvable with a single quantum circuit?** Of reasonable length.

**Quantum analogs to the rich data and control structures of classical computing.** Reality "computes" quite well with energy, matter, mass, charge, atoms, subatomic particles, molecules, forces, fields, chemical reactions, crystal lattices, liquids, gases, and plasmas. Not sure how any of this can be applied to quantum computing, but reality certainly does "compute" a lot of interesting phenomena based solely on quantum effects and these other emergent phenomena. What data and control structures are really needed to "compute" or simulate reality at the level of quantum effects?

**Essence of quantum computing.** In one sentence, one paragraph, one page, two pages, four pages, ten pages, twenty pages, fifty pages, and 100 pages. What gets added at each stage, and what gets left out to get to the previous stage.

**What might it mean to hack a quantum computer?** Doesn't the No-Cloning Theorem preclude copying or stealing quantum state?
Maybe the QPU itself is unhackable, but the classical software layers that control the QPU are just as hackable as any other classical code.

**Summarize the references to quantum effects and quantum technologies from Michael Crichton's book *Timeline*.**

**Naked numbers considered harmful.** Need for dimensional analysis of all numbers. Be explicit about units. Annotate. To wit, what are the units (dimensions) of Quantum Volume? Or of a wave function?

**Can a NISQ quantum computer ever achieve dramatic quantum advantage?** Even if some advantage, any prospect of it being dramatic? Or is the noisiness too much of an impediment? Depends on how strictly or loosely quantum advantage is defined, but for purposes here, strict, true, dramatic quantum advantage is the target. What criteria would have to be met to achieve true, dramatic quantum advantage using a NISQ device? Maybe based on the product of qubit fidelity and circuit depth — shallow circuits allow more noise, deep circuits allow much less noise. Presume 64 qubits in a single Hadamard transform as the target algorithm width. 50 qubits might be okay, or maybe 53–55 as the minimum.

**The great seduction of quantum computing.** How can we reclaim our dignity?

**Should FPGAs and transistors be considered classical computing per se?** Far below the level of Turing machines and even Boolean logic. Not even necessarily binary. Closer to analog computing. Granted, FPGA libraries include digital components and effectively support Boolean logic, but the raw FPGA itself is more primitive.

**Is this really quantum computing's moment?** Or are we jumping the gun, and 2–5 years from now would be a much better moment?

**Levels of cleverness.** Hardware, firmware, software, algorithms, applications.

**Caveats and cautions for quantum applications.** What to watch out for.
Criteria for success.

**My obsession with energy and photons.** How are entangled product states represented physically?

**No point to quantum computing until and unless dramatic quantum advantage is achieved.** Laudable to develop algorithms on simulators, but there needs to be evidence that these algorithms will be scalable to intermediate-scale quantum computers (50 to hundreds of qubits) as that more-capable quantum hardware becomes available.

**Quantum advantage matters.** Emphasize the same point.

**Quantum advantage is everything.** Emphasize the same point.

**Quantum advantage is mandatory.** Emphasize the same point.

**Ideal qubit technology has not yet been invented.** Ideal as in good enough to achieve dramatic quantum advantage. Still in the early days. May be another 5–10 years before we see qubit technology which will enable dramatic quantum advantage for a wide range of applications.

**Criteria for when it's time to start getting ready for quantum computing.** Distinguish between those organizations which will likely be developing their own algorithms and those organizations looking for off the shelf packaged solutions. Clear distinction from organizations intending to be vendors in the quantum computing sector. And organizations seeking to offer quantum computing-related consulting services.

**Consciousness based on quantum effects.** Human mind. Animal brains. AI systems. Robots. Still a speculative research area.

**Never underestimate the boundless cleverness which can propel classical computing to great new levels of capability.** Clever heuristics and intuitive leaps can surmount even extremely daunting obstacles.

**How many quantum states does your algorithm use?** Not just how many qubits, but 2^k quantum states for each k qubits used in a single Hadamard transform. And times the circuit depth. A better metric of resource requirements.

**Is quantum computing imminent?** No, not really. Plenty of basic research is still needed.
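The quantum state count metric in the topic above is trivial to compute but sobering; a sketch (the formula is the one the topic suggests, and the function name is my own):

```python
def quantum_state_count(k_qubits, circuit_depth):
    # Metric from the topic above: 2^k simultaneous quantum states for
    # k qubits in a single Hadamard transform, times the circuit depth.
    return (2 ** k_qubits) * circuit_depth

print(quantum_state_count(10, 100))  # 102400 for a modest circuit
print(quantum_state_count(50, 100))  # ~10^17, far beyond brute-force tracking
```

The jump from 10 to 50 qubits at the same depth multiplies the count by 2^40, which is why state count, rather than raw qubit count, is the better gauge of resource requirements.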
Still not at 32- and 40-qubit algorithms, let alone the 50-qubit algorithms and beyond which are needed to achieve dramatic quantum advantage.

**Things to keep in mind when contemplating and discussing quantum computing.** Nuances. Limitations. Uncertainty about the pace and trajectory of future quantum technology developments. Risk of proprietary IP. Portability of algorithms between disparate quantum computers. Talent pool risks. Maintainability of algorithms and applications. Time to market.

**Proof points for quantum computing to achieve widespread adoption.** Milestones to quantum advantage and beyond. The ENIAC Moment and The FORTRAN Moment would be two. Various numbers of qubits would be others — 72, 80, 96, 128, 192, 256, 512, 1024, etc. Various nines of qubit fidelity would be others — two nines, three nines, four nines, five nines, six nines, etc. Full quantum error correction would be another. Various circuit depths of coherence would be others — 10, 20, 50, 75, 100, 150, 250, 500, 1,000, etc. Fractions of full, dramatic quantum advantage would be others — 10%, 25%, 50%, 75%, 90%, etc., although different application categories might have different requirements.

**Symbolic quantum computing.** For quantum AI. A good use case for more than a mega-qubit. But what would it really look like? How to map symbols to qubits?

**Is a quantum program a game of chance?** To some extent. It depends.

**What is the key to quantum parallelism and quantum advantage?**

**Quantum control vs. determinism.** Probabilistic vs. deterministic.

**The basic, traditional quantum algorithms.** But what value do they have? What do they really show? Do they tell us anything useful for practical applications? But at least provide decent plain-language descriptions.

**QASM with physical qubits will still be useful even once logical qubits become available.** Limited capacity for early logical qubits.
Niche applications may require heavy QASM-level operation, possibly with manual error mitigation or near-perfect qubits. QASM is used here as a metaphor for gate-level quantum programming a la classical assembly language programming, as opposed to using high-level programming approaches and general-purpose libraries which may not have the fine control required for some niche applications.

**Need to distinguish devices and information.** A classical bit is simply abstract information, a value, a representation, not hardware. A classical gate is hardware, not a software operation. A qubit is a hardware device. A quantum logic gate is a software operation.

**How long before we can move beyond sub-exponential performance for search algorithms?** Grover's search algorithm offers only a quadratic speedup. We need exponential speedup to unlock the full potential of quantum computers.

**What does it mean to measure individual qubits of entangled product states?** Should it matter in what order the qubits of a product state are measured? Other nuances for measuring non-isolated qubits.

**Using classical AI to facilitate quantum computational chemistry.** Avoid tedious manual preparation of initial states and circuits.

**Application frameworks for the seven main application categories.** Simulating physics, chemistry, material design, drug design, business optimization, finance, machine learning. Note IBM's recent introduction of *Qiskit Nature*.

**Things I like to say about quantum computing.** Wisdom. Fundamental general knowledge. Like, the ideal qubit technology has not yet been invented. Summarize the field. Simplify the field. Characterize the limits of the field. Cut through all of the hype.

**Is quantum computing simply a parlor trick, a sleight of hand?** It can certainly seem that way — the hype makes it look that way.
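To put a number on the Grover search topic above: the optimal number of Grover iterations for an unstructured search over N items is about (π/4)·√N, versus roughly N/2 expected classical probes. A back-of-the-envelope comparison in plain Python (the quadratic-vs-exponential framing is from the topic; the helper names are my own):

```python
from math import pi, sqrt

def grover_iterations(n_items):
    # Optimal Grover iteration count for one marked item: ~ (pi/4) * sqrt(N).
    return round((pi / 4) * sqrt(n_items))

def classical_expected_probes(n_items):
    # Expected probes for a classical linear/random search: ~ N / 2.
    return n_items // 2

for n in (1_000_000, 1_000_000_000):
    print(n, classical_expected_probes(n), grover_iterations(n))
```

For a million items that is roughly 500,000 classical probes versus about 785 Grover iterations: a real but "only" quadratic win, which is the topic's complaint.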
There is fundamental substance, but it's hard to see through all of the hype.

**How much can a quantum computer compute without recursion?** Seems like a rather severe limitation.

**How do you do anything useful on a quantum computer — like what?** Tends to be too opaque and inscrutable. Need some direct, plain-language description.

**The power of randomness in a deterministic world.** Using randomness, probability, and statistical aggregation to approximate determinism. Many real events have a random character, so coping with randomness is a good first step. Being overly reliant and overly dependent on absolute determinism can cause problems when so many random factors are in play. Randomness can average out to an approximate determinism. But, ultimately, computation (and reality) needs some level or levels of quasi-determinism, otherwise the result is pure chaos. How to achieve apparent stability or equilibrium even when starting with randomness. What principle or principles are at work here?

**How to calculate the n-bit square root of an integer on a quantum computer.** No floating point, so have a composite of the bits of the integer square root and the bits of the fractional result.

**Top 10 or 25 pieces of hype about quantum computing to beware of.**

**How does phase work for entangled product states?** Unlike a superposition of a single qubit, where phase is a simple difference between two probability amplitudes, what does it mean if there are more than two product states? Is there only a single phase angle difference, or are there more phase angles? Is phase the same for Bell states as for a superposition of a single qubit, since there are only two probability amplitudes?
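A purely classical reference for the fixed-point representation described in the square root topic above, integer bits plus fractional bits with no floating point anywhere (a sketch of the numeric format only, not a quantum circuit):

```python
from math import isqrt

def fixed_point_sqrt(x, frac_bits):
    # Return sqrt(x) scaled by 2**frac_bits: the integer square root
    # plus frac_bits bits of the fractional part, all in integers.
    # Works because sqrt(x * 4**f) == sqrt(x) * 2**f.
    return isqrt(x << (2 * frac_bits))

# sqrt(2) with 8 fractional bits: 362, i.e. 362 / 256 = 1.4140625
r = fixed_point_sqrt(2, 8)
print(r, r / 2**8)
```

The single scaled integer is exactly the "composite of bits" the topic describes: the high bits are the integer part of the root and the low `frac_bits` bits are the fractional part.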
Need to better understand the intuition behind phase (and maybe spin states as well) to directly answer this question.

**Need for a better and more detailed report for Quantum Volume for a quantum computer to give algorithm designers and application developers better insight into the performance capabilities of the quantum computer.** Circuit depth vs. qubits. SWAP network overhead. Some applications may need more or less than the nominal two-thirds threshold for quality of results — the test should be parameterized.

**My function as doing due diligence for quantum computing.**

**Quantum algorithms need to be provably scalable.** Generally scalable via parameterized dynamic generation of quantum circuits. Test on both real machines and simulators for smaller sizes, simulator-only for medium sizes (32 to 50 qubits), and rely on proof of scalability for the largest sizes, beyond 50 or so qubits.

**Goal of achieving the lowest energy for quantum computing.**

**Importance of focusing classical quantum simulators on near-perfect qubits.** Although the definition and specification of qubit fidelity for near-perfect qubits will vary over time, it would be very beneficial to configure classical quantum simulators to precisely match the qubit fidelity of real near-perfect qubits, so that simulation runs closely match the results of execution on a real quantum computer constructed of near-perfect qubits meeting those fidelity specifications. In addition to precisely matching existing machines, qubit specifications could be tuned to match proposed qubits, to see how they are likely to perform before the machine is actually built. A wide range of possible near-perfect qubit specifications could be simulated to get a sense of expected results even before real machines exist. Specify the number of nines of qubit fidelity. Focus algorithm designers and application developers on how many nines of qubit fidelity they are working with.
Focusing on both NISQ and perfect qubits is an unproductive distraction. Granted, the initial focus will be on very low nines (1 to 2 — 90% to 99%), but an explicit focus on nines is a more productive mindset.

**Importance of focusing on near-perfect qubits.** Especially as a near-term research target. Full-blown quantum error correction (QEC) is too far out. Near-perfect qubits are needed for QEC anyway. NISQ won’t get to quantum advantage. Range of possible nines — generally, 3 to 6. Nines and fractional nines.

**Am I on the verge of becoming an apostate for quantum computing?** Yes, I’m getting more gloomy about the short-term prospects for quantum computing, but I still haven’t given up on the longer-term prospects, yet.

**How might string theory impact quantum computing or quantum information science in general?**

**How much of quantum mechanics is visible in quantum computing?** How much quantum mechanics do you need to know to understand quantum computing?

**What is the quantum computing equivalent of the tape of a Turing machine?** None, yet?

**Lack of rigorous proof of the superiority of quantum computers.**

**The truths you need to know about quantum computing.**

**What if you wanted to teach a robot about quantum computing?** Where to start? How much, and where to end? What level of detail? What granularity? What path through the material? How to test knowledge? A never-ending process as the field evolves.

**Quantum workforce development.** For research. For product development. For applications. Depends on personas and use cases.

**Should quantum technology be restricted under export controls?** Interesting question. Or is the horse already out of the barn — too many countries already deep into quantum?

**Why is quantum computing such a big deal?** Quantum parallelism provides an exponential speedup which gives a dramatic quantum advantage over classical computing.
In theory.

**What makes quantum computing such a big deal?** Quantum parallelism provides an exponential speedup which gives a dramatic quantum advantage over classical computing.

**Why bother with quantum computing?** Quantum parallelism provides an exponential speedup which gives a dramatic quantum advantage over classical computing.

**How do you know whether a quantum computer has done anything quantum at all?** Work by grad student Urmila Mahadev — see *Graduate Student Solves Quantum Verification Problem*.

**Order finding using an NMR-based quantum computer.** A subset of Shor’s factoring algorithm. See *Experimental Realization of an Order-Finding Algorithm with an NMR Quantum Computer*. Was this work an absolute dead end, or can it lead to something further?

**How accurate and reliable can a classical quantum simulator be if entries in a unitary matrix are irrational?** Such as an H gate with entries of 1 divided by the square root of 2, an irrational number. How much precision is needed? Will it ever be enough?

**How accurate and reliable can a quantum computer be if entries in a unitary matrix are irrational?** Such as an H gate with entries of 1 divided by the square root of 2, an irrational number. How much precision is needed? Will it ever be enough?

**Is quantum now?** “*Quantum is Now!*” is one of the common hype themes/memes infecting the quantum computing sector.

**Quantum computers won’t break data encryption anytime soon.**

**Can any reasonable introduction to quantum computing be gentle?** See *Quantum Computing — A Gentle Introduction* by Eleanor Rieffel and Wolfgang Polak.

**What compute-intensive problems do not require a quantum computer?** Any problem with a classical computing solution whose algorithmic complexity is less than exponential (e.g., polynomial).
And maybe somewhat less than that.

**Is quantum computing poised to upend entire industries, from telecommunications and cybersecurity to advanced manufacturing, finance, medicine, and beyond?** No, not really, not yet, and not anytime soon. More hype, but worth responding to.

**Are quantum computers poised to kick-start a new computing revolution?** No, not really, not yet, and not anytime soon. More hype, but worth responding to.

**Other approaches to quantum computing than gate-based qubits.** Adiabatic quantum computing. One-way or measurement-based quantum computing. Continuous-variable quantum computing. Worthy of significant research funding.

**Is quantum computing beginning a completely new era of computing that will change computing more in the next 10 years than it has changed in its entire history?** Written two years ago (February 2018). No, not really, not yet, and not anytime soon. More hype, but worth responding to.

**Is variational quantum factoring (VQF) a viable alternative to Shor’s algorithm for factoring large integers?** See *Variational Quantum Factoring* by Eric R. Anschuetz, Jonathan P. Olson, Alán Aspuru-Guzik, and Yudong Cao. Or is it only suitable for relatively small numbers, like 128, 64, or even only 32 bits? Is it dependent on the granularity of phase or probability amplitudes? What is its scalability — what do they say, and what have they proved?

**Potential for digital/analog hybrid quantum algorithms.** Best of all three worlds. Especially when approximate results are acceptable. Worthy of significant research funding.

**How many of the next generation of quantum computers will be the quantum equivalent of Ford Mustangs and how many will be Ford Edsels?** Great marketing successes and total flops.
Great and grand ideas frequently fail to become great marketing successes, even if they look great on paper and maybe even in the lab.

**Quantum talent pool strategies and tactics for organizations large, medium, and small.** Talent has two key aspects here: 1) competition for new-hire talent and a very limited talent pool, and 2) the need for a roadmap as to how enterprises should ramp up talent and what talent “personas” they need at each stage of the ramp-up. A map of talent stages is needed. And the degree to which elite-level new-hire talent is needed, the degree to which current in-house talent can be trained for quantum, and eventually larger staffing of non-elite quantum talent. In the earlier stages I refer to talent as The Lunatic Fringe. In-house vs. outside consulting talent. Long-term vs. project-specific efforts. Make vs. buy strategies. Deep, basic research vs. application-oriented research.

**Bitter truths about quantum computing.** Tough to accept.

**How should academia be positioning itself?** Research: deep, basic research as well as algorithm- and application-oriented research. Introductory survey courses. Targeted grad-level courses. Quantum computing and quantum information science as a minor — still too early to have quantum-native degree programs. Internship opportunities for students.

**Is interference the main quantum computing advantage?** See *Is Interference The Main Quantum Computing Advantage?* by Constantin Gonciulea, Distinguished Engineer & Executive Director, JPMorgan Chase.

**Will quantum computers truly serve humanity?** Interesting question, worth exploring in greater depth. Very concise clarity of vocabulary is needed — how should “serve” be defined, and by whom? How much of humanity must be “served” to justify a claim that humanity has been served? Should service be judged by accomplishment of a critical amount of useful tasks, or by mere absence of harm, or the ability to prevent harm?
For a baseline comparison, have classical computers truly served humanity? Raised by Ilyas Khan on February 17, 2021 in *Will Quantum Computers Truly Serve Humanity?*

**Could quantum computers crack Bitcoin by 2022?** In a word, no. Very unlikely. We’re not on a technological path toward that outcome. Of course, it’s always possible that someone could come up with some incredibly elegant mathematical hack that accomplishes what we previously did not believe could be accomplished. See *Quantum computers could crack Bitcoin by 2022* by Robert Stevens, May 12, 2020.

**Key takeaways from QC40: Physics of Computation Conference — 40th Anniversary.** Conference planned for May 6, 2021. See *QC40: Physics of Computation Conference — 40th Anniversary* — “*Celebrate 40 years of quantum — Keynotes, contributed talks, and more bridging the 1981 Physics of Computation conference with current research*.”

**The 10 most important things I can say about quantum computing.** Reference my papers for details, but provide concise summaries. Focus on the here and now as well as the future, but not solely on ultimate promise.

**Thoughts on probability, stochastic processes, statistical aggregation, and determinism.**

**Integrating quantum computing with real-time quantum sensing.** I’m not sure exactly what this would really look like, but the potential is… awesome. Need some sort of real-time loop — snapshot quantum-sensed data, process, optionally output classically, rinse and repeat. Actual quantum processing would have to be rethought from the ground up — not limited by or based on classical real-time processing. A quantum version of a CCD image sensor would be an obvious use case.

**What is the secret sauce (or sauces) of quantum computing?** What really makes quantum parallelism tick… and roar? How is quantum advantage actually achieved?

**Why NISQ starts at 50 qubits.** Actually, it’s *intermediate-scale* that starts at 50 qubits.
I don’t know with certainty, but my informed speculation is that a few dozen qubits was viewed as too few to do anything useful, 2⁵⁰ quantum states was viewed as the point where quantum advantage over classical computing could be achieved, and a number of machines had been announced at the time (2018) with 50 or more qubits — Intel at 49 qubits, IBM at 50 qubits, Google at 72 qubits, and Rigetti at 128 — so there wouldn’t have seemed to be any need to pay attention to machines with fewer than 50 qubits. Or so it seems.

**Are quantum Fourier transform (QFT) and quantum phase estimation (QPE) the best we can do?** They are impressive, but somewhat limited. But since we can’t use either to any significant degree today, there’s no rush to go further. Still, we should be a lot more forward-looking.

**What circuit depth could we expect to require to achieve dramatic quantum advantage?** Of course it will vary a lot, but some ballpark estimate would be helpful.

**How many of our current algorithms are truly and fully scalable?** To 30, 40, 50, 60, 70, 80 qubits and beyond. Without redesign, as currently written and currently implemented. I fear that the current answer is few or even none.

**How many 9’s of qubit fidelity does your quantum algorithm require?** How do you know, and have you confirmed or proved it?

**When will we get to see the emperor’s clothes?** The tailors have made many wildly extravagant promises, and we’re all presuming that the reality will be even more extravagant than the promises, but… here we are, waiting patiently (okay, *very* impatiently!) for the emperor and his super-extravagant clothes to make their appearance. We won’t be disappointed… will we? Seriously, how much of the hype should be treated as fact, or even understatement of fact?

**What is the truth about quantum computing?** Beyond the hype.
What are the strongest statements we can make which are actually grounded in fact?

**Quantum work.** How much work is a quantum computer — or, actually, a particular quantum circuit — performing? Maybe the sum of quantum states for each step in the quantum circuit. Factor in quantum parallelism — a Hadamard transform on n qubits produces 2^n quantum states. Compare the work of the quantum circuit with the work of a classical program which calculates the same results.

**Quantum computing is (currently) far too complicated for all but the most elite of professionals.** To do anything useful, that is. When will that change? Maybe, or in theory, at The FORTRAN Moment.

**What are quantum computers good for?** What problems are they most applicable to? Key benefits — besides raw performance, solving problems previously thought unsolvable using classical computers.

**What quantum magic allows the SWAP gate to work — it seems like a sleight of hand.** Okay, it’s three CNOT gates, but how does that get around the no-cloning theorem? Interesting analogy to using three XOR instructions to swap two values in classical computing. What is the quantum physics that makes SWAP work? Especially if the two qubits might be entangled with two or more other qubits.

**What application is likely to reach dramatic quantum advantage first?** Or at least, which application category? I’m personally rooting for quantum computational chemistry. Business process optimization would seem to be a slam dunk. What factors will make the choice?
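As a classical sanity check of the three-CNOT SWAP topic above: on computational basis states, a CNOT reduces to XOR-ing the control into the target, so the three-CNOT sequence mirrors the classic three-XOR swap trick (the full quantum case follows by linearity, though the no-cloning and entanglement questions remain). A minimal sketch; the helper names are illustrative, not from any particular library:

```python
def cnot(control, target):
    # On classical basis states, CNOT XORs the control bit into the target bit.
    return control, target ^ control

def swap_via_three_cnots(a, b):
    # CNOT(a->b), CNOT(b->a), CNOT(a->b) -- the same pattern as the
    # classical three-XOR swap: b ^= a; a ^= b; b ^= a.
    a, b = cnot(a, b)
    b, a = cnot(b, a)
    a, b = cnot(a, b)
    return a, b

# Check all four basis states: the two bit values are exchanged.
for a in (0, 1):
    for b in (0, 1):
        assert swap_via_three_cnots(a, b) == (b, a)
```

The check only covers basis states; it says nothing about why the construction is consistent with no-cloning for entangled qubits, which is exactly the open question in the topic.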
Could the ability to tolerate an approximate solution be a key factor?

**What application is likely to reach The ENIAC Moment first?** This could be the same as *What application is likely to reach dramatic quantum advantage first?*, but a true dramatic quantum advantage is not required, provided that enough of a *partial quantum advantage* is achieved to really impress people.

**Would moonshot projects or Manhattan Project-style projects really help quantum computing make a quantum leap to its full potential?** Timing is everything — attempting either Project Apollo or the Manhattan Project even a mere five years earlier would have been an absolute disaster. A critical mass of the critical technologies, coupled with a critical mass of resources and a critical mass of commitment, is essential. Quantum computing is still basically stumbling around in the dark. Besides, look how well classical computing developed without a moonshot project. But also look at how its long stream of critical developments occurred over an extended period of time.

**Scaling of algorithms and applications won’t be easy, automatic, and free.** Too many odd factors and tight limits. Someday it may get easier, more automatic, and relatively cheap, but not anytime soon. For the foreseeable future it will be tedious, difficult, and outright problematic. Quantum Fourier transform, quantum phase estimation, and anything dependent on fine granularity of phase will be especially problematic.

**What Is Dramatic Quantum Advantage?** Emphasis on much more than a mere minor advantage over classical computing solutions. How much more? See *Fractional quantum advantage or partial quantum advantage or degrees of quantum advantage*. But a briefer, lighter treatment suitable for a very broad audience, including non-technical professionals and managers.

**Quantum algorithm design, quantum application development, and quantum application deployment.** Overall architectural and methodology model for quantum computation.
Core quantum algorithms. Hybrid classical code to bridge the gap between application data and quantum circuits. Classical code for applications which use those quantum algorithms. Real-world deployment of those applications.

**Might Fortran make a comeback for quantum computing?** Purely speculative, but based on scientific computing being a primary focus for quantum computing. My notion of The FORTRAN Moment merely used the original FORTRAN programming language as a metaphor for a transition to mass adoption by non-elite professionals, as occurred in the 1950s for classical computing when the original FORTRAN was introduced; but if scientists flock to quantum computing, many of them are already — and still — using Fortran. Note: FORTRAN (all caps) became Fortran (capitalized) beginning with the Fortran 90 standard. Many existing scientific applications may have been developed using the FORTRAN 77 standard, or even earlier versions of FORTRAN (FORTRAN IV, FORTRAN 66).

**Role of NIST in quantum computing and quantum information science in general.** Research and standards. Basic research and applied research. Early research which influenced the current state of the art. Role in adoption throughout the U.S. government. Interaction with the commercial sector. Interaction internationally as well as domestically. Focus on quantum computing, but other aspects of quantum information science as well — quantum communication, quantum networking, quantum metrology, and quantum sensing.

**What metrics to report for any purported advance on the algorithmic front?** Preferably part of every algorithm paper, or at least part of a summary of the algorithm and its impact, part of an algorithm announcement. Total number of qubits used. Largest Hadamard transform register size in qubits. Connectivity requirements or SWAP network complexity.
Relative or fractional quantum advantage — is it really true, full, and very dramatic quantum advantage?

**Will quantum computing facilitate human-level reasoning for AI?** A combination of faster and better? May have to wait until a quantum computer can have hundreds of billions of qubits, or more. Maybe even hundreds of trillions of qubits might be needed to offer a true and dramatic quantum advantage over both classical computers and the human mind.

**Should we speak of quantum computing as a field, sector, industry, or what?** Field makes sense for research. Computing overall is an industry. Quantum computing is a sector of the overall computing industry. I lean towards quantum computing being a sector, for now.

**Are more qubits, better qubits, and higher qubit fidelity basic research problems or mere engineering issues?** Personally, I think they are basic research problems — new technology is needed, not merely engineering of existing technology. I recall someone claiming that “*The physics of quantum computing is known; all we need now is engineering.*”

**Beware of commercial projects which rely on additional research — best to rely primarily on off-the-shelf technology.** I’m all in favor of additional research efforts, but research belongs in the realm of researchers working in research labs. Commercial product developers are best advised to shy away from basing new products on research which has not yet been performed and proven. Playing researcher is a recipe for commercial disaster. Call it what it is — a research project — and don’t call it a commercial product project until all the required research is completed.
Research can run into unexpected obstacles which may require years or even decades of additional research to resolve.

**Proposal for a criterion for quantum advantage — can your quantum algorithm do in no more than an hour what would take 50,000 classical computers more than eight hours?** The whole point of quantum advantage is not just that a quantum computer is faster than a classical computer, but that a complex problem can be solved promptly using a quantum computer, while it would take an incredible level of classical resources to achieve the same solution — and even then, likely not in a timely manner. As a prototypical example, suppose a business had a thorny optimization problem, such as route scheduling, which it needed to solve once a day, in much less than eight hours to be acceptable. An hour or two might be acceptable, but more than a few hours would simply not be viable for the business requirements. So, 50,000 classical computers working for eight hours seems like a good metric for comparison. These are not precise numbers, just a rough ballpark to express the conceptual comparison. Alternatives: what a quantum computer can do in one second, ten seconds, one minute, ten minutes, or thirty minutes versus what a large number of classical computers can do in one day, 48 hours, 72 hours, one week, one month, three months, or one year.

**Will quantum computing be effectively useless until quantum phase estimation and quantum Fourier transform are practical at production scale?** I strongly suspect so. Variational methods can work, in a fashion, but are unlikely to deliver dramatic quantum advantage. Dramatic quantum advantage is the threshold minimum for declaring that a quantum computer is useful, in my mind. Until applications can regularly utilize quantum phase estimation (QPE) and quantum Fourier transform (QFT) on qubit registers at least 50, if not 55 or more, qubits wide, any advantage over classical computers will be minimal at best.
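The proposed advantage criterion above implies a concrete equivalent speedup factor, and the alternative time pairings can be tabulated the same way. A quick back-of-the-envelope sketch; the function name and the one-week pairing are illustrative, while the 50,000-machines-for-eight-hours figure is the topic's own rough ballpark:

```python
def equivalent_speedup(classical_machines, classical_hours, quantum_hours):
    # Ratio of total classical machine-hours to quantum wall-clock hours.
    return classical_machines * classical_hours / quantum_hours

# The headline criterion: one quantum hour vs. 50,000 machines for 8 hours.
print(equivalent_speedup(50_000, 8, 1))  # 400000.0

# An illustrative alternative pairing: one quantum minute vs. 1,000 machines
# running for a full week.
print(equivalent_speedup(1_000, 24 * 7, 1 / 60))
```

The point of the sketch is that the criterion is really a statement about a roughly six-orders-of-magnitude gap in deliverable compute, not merely "faster."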
QPE and QFT on 32 to 40 qubits would be a great stepping stone on the path to 50, 55, and 60 qubits. All of this is predicated on QPE and QFT being the most powerful algorithmic building blocks currently known. Further research might produce even more powerful building-block algorithms, but QPE and QFT are our best bets today — and they really aren’t even practical today or in the near term. They’re unlikely in the next two years. They’ll have to wait until qubit fidelity is up into three to five nines (99.9%, 99.99%, or 99.999%). Five years might be a safe bet, but even that is pure speculation.

**Preliminary thoughts on an ontology for quantum computing.** And overall quantum information science as well. My glossary of terms for quantum computing might be a good starting point for terms and concepts. The goal for this particular topic is not the full ontology, but more of a framework or outline for the full ontology. Criteria. Goals. In theory an ontology should be machine-readable, with templates and automated processes for producing human-readable versions of the machine-readable ontology. Not clear who or what would use the ontology, although it should be a definitive map of the landscape of quantum computing.

**Quantum teleportation.** Does it really apply to quantum computing, as opposed to quantum communication? Explain it more clearly — and more plainly. Are some people simply using it as a synonym for quantum entanglement?

**Waiting anxiously for the first 100-qubit quantum computer.** At present, only IBM has a near-term roadmap to 100 qubits, with a 127-qubit machine targeted for later in 2021. What exactly will anybody be able to do with it that they can’t already do with IBM’s 27, 53, and 65-qubit machines, especially given limited connectivity and mediocre qubit fidelity? Until we have algorithms which can effectively exploit over 24 qubits, more qubits is not the gating factor. Greater qubit fidelity seems to be the primary gating factor.
So the real question is: what are the nines of the next 100-qubit quantum computer?

**Potential for application-specific qubit topology.** Connectivity requirements may vary significantly between application categories. It might be that particular topologies (spatial and connectivity arrangements) of qubits benefit some applications more than others. General purpose may be suboptimal for too many. Consider optimizing for a few, at first.

**Understanding the theory of multipartite entanglement.** It gets so complicated. What is the actual theory, and what are the ramifications for practical applications? What are the capabilities, what are the limitations, and what are the issues?

**Better terms for superposition and entanglement that make more sense to non-physicist algorithm designers and application developers.** Terms need to refer to concepts and technologies in a way that makes sense to non-physicists and relates more closely to how real-world problems are conceptualized and formulated. If the real focus is quantum parallelism, terms should reflect that.

**How many nines of qubit fidelity does your quantum computer have?** 90% = 1 nine = 10% chance of error = 0.1 error rate; 99% = 2 nines = 1% chance of error = 0.01 error rate; 99.9% = 3 nines = 0.1% chance of error = 0.001 error rate; etc. Coherence time, gate errors, measurement errors. Any maximum of nines, like 9 — one error per billion gates, 12 — one error per trillion gates, or 15 — one error per quadrillion gates? Fractional nines — quarters, tenths, hundredths — 93% = 1.3 nines, 99.5% = 2.5 nines, 99.975% = 3.75 nines. What to say if less than 90% correct (80%, 70%, 60%)? Variability between qubits even on the same device — per qubit, average, exclude outliers below some threshold. Document it in the implementation specification for a quantum computer. Maybe also in the Principles of Operation if not negligible, since algorithm designers and application developers must be very cognizant of the error rate if it is not very low.
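The digit-reading convention just given (93% → 1.3 nines, 99.5% → 2.5 nines, 99.975% → 3.75 nines) can be sketched in a few lines. `nines_of_fidelity` is a hypothetical helper name, and the sketch assumes a fidelity of at least 90% and strictly below 100%:

```python
def nines_of_fidelity(fidelity_pct: float) -> float:
    # Count the leading 9 digits of the percentage, then read any remaining
    # digits as a decimal fraction of a nine: 93 -> 1.3, 99.975 -> 3.75.
    digits = f"{fidelity_pct:.6f}".replace(".", "").rstrip("0")
    n = 0
    while n < len(digits) and digits[n] == "9":
        n += 1
    rest = digits[n:]
    return n + (int(rest) / 10 ** len(rest) if rest else 0.0)

print(nines_of_fidelity(99.9))    # 3.0
print(nines_of_fidelity(93))     # 1.3
print(nines_of_fidelity(99.975))  # 3.75
```

Note that this digit-reading convention differs slightly from the log-based definition (−log₁₀ of the error rate), which would give 93% about 1.15 nines rather than 1.3.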
What constitutes “negligible” or near-perfect?

**No, most current quantum computers are technically not true NISQ devices.** Technically, per Preskill’s original paper, the NISQ intermediate scale starts at 50 qubits, so devices with 5, 8, 12, 16, 20, 24, 28, or 32 qubits are not intermediate scale. That said, give a nod to marketing, hype, and generally loose terminology in the sector: any noisy quantum computer is currently considered a NISQ device, even if not technically intermediate scale.

**What is a near-perfect qubit?** See *Near-perfect qubits*.

**Beyond NISQ — proposed terms for quantum computers based on noisy, near-perfect, and fault-tolerant qubits of various sizes.** NISQ for noisy qubits, NPISQ for near-perfect qubits, and FTISQ for fault-tolerant qubits. Change IS to SS — Small Scale — for devices with fewer than 50 qubits, and change IS to LS — Large Scale — for more than a few hundred qubits.

**SWAP networks versus any-to-any connectivity — who will win?** Will qubit fidelity ever get high enough for larger SWAP networks to have a low enough error rate? How rapidly will the size of SWAP networks rise as the number of qubits rises? Will the algorithmic complexity of SWAP networks eventually become an insurmountable barrier? What qubit technologies will support true any-to-any connectivity?

**What should people expect to see in quantum computing in five years?** Goldman Sachs announcement: see *Goldman Sachs Predicts Quantum Computing is 5 Years Away from Use in Markets*. Milestones for 2–5 years. Research vs. commercial product offerings. Number of qubits… maximum, what is common — high, average, smallest commonly used — and what’s the typical range for algorithms? Vendors… what commercial offerings might Intel and Google have by then, will Microsoft have an offering, will Apple have an offering or internal use, possible new entrants? New qubit technologies? Nines of qubit fidelity… highest, common range, lowest in common use. Progress on quantum error correction?

**Can two particles or qubits be entangled at different times?** Entangle them, then move and accelerate or decelerate one of them so that its “clock” slows or speeds up, so that the two particles are at different times. Can different bits of an entangled product state be at different times? Could a quantum computer or communications link “see” the future or the past?

**What is the quantum chasm?** The transition from quantum computers with hundreds of qubits to machines with millions of qubits has been referred to as the *quantum chasm*. See Alan Ho of Google in *Crossing the Quantum Chasm* and John Preskill in *Quantum Computing in the NISQ era and beyond*.

**Is there anything that a quantum computer can do that can’t be simulated accurately on a classical quantum simulator?** Subject to capacity limits, of course — the 40 to 55-qubit limit of a classical quantum simulator. Need to define “accurately”… more accurately. Whether classical noise models can “accurately” model the noise and probability of a true, real quantum computer is an interesting question.

**Can a quantum computer compute cellular automata?** Heavy output requirements — the need to observe transitions rather than simply a final configuration.

**Great quotes about quantum computing.** Such as “Almost anything becomes a quantum computer if you shine the right kind of light on it.”

**How many quantum states and wave functions are there in the entire universe?**

**Three sources of probability in quantum computing: probability amplitudes, errors, and quantum uncertainty.**

**Absolute theoretical limit of quantum uncertainty vs. the practical uncertainty for a particular quantum computer (or even a particular qubit).**

**How can quantum computing compete with the magnificent high-level abstractions of classical computing?** Other than raw performance.
See also: *My journey into quantum computing has given me a newfound appreciation for the incredible intellectual power of classical computing* — alternative title.

**Genetic programming and genetic algorithms for quantum computing.** How well do the concepts translate, and what happens to performance? Any special opportunities on quantum computers that suggest a radically different approach?

**Nines of qubit fidelity.** Qubit fidelity includes gate fidelity — including qubit coherence, gate errors (single- and two-qubit), and measurement errors. Error rate as a raw decimal — 0.01 — or a percentage — 1%. Fidelity of 99.9% — three nines — corresponds to an error rate of 0.001, or 0.1%. Nines and fractional nines. The ENIAC Moment — no QEC, elite staff, but needs more nines of qubit fidelity. The FORTRAN Moment — full QEC doesn’t require as many nines of raw qubit fidelity; non-elite staff.

**Need to focus on near-perfect qubits.** See *Importance of focusing on near-perfect qubits*. Alternative title.

**Reality of quantum computing now and in the near term.** No good reason to use a quantum computer now or in the near term, since there is no real quantum advantage. Production-scale use for real-world business problems is out of the question now and for the near term (the next two years). Too few qubits. Qubit fidelity is too low. Too few algorithms (none?!) capable of achieving quantum advantage even if the hardware were ready. Maybe in two years? Not likely. Maybe in five years? Maybe, but only for a few niche applications. 5–7 years? A good chance for a fair number of application categories, but still not easy or for the faint of heart. 10 years? A better chance of widespread use for many application categories.

**What to look for in writing about quantum computing.** Includes academic papers, white papers, technical documentation, blog posts, web pages, marketing material and press releases, technical media, and general media. Cover serious technical content, vague promises, and outright hype.
Timeframe: current, near-term, medium-term, long-term — or indeterminate. Real machine or simulator? Number of qubits. Connectivity. Shot count or circuit repetitions. Whether it is production scale. Whether it is a practical real-world problem. Whether they discuss scalability and can demonstrate or prove a range of scaling, as well as scalability into the range of quantum advantage. Is quantum advantage being claimed, and are such claims warranted and technically justified? Or is fractional quantum advantage claimed? An estimate of Big-O algorithmic complexity. Etc.

**Will quantum computing really change the world [dramatically]?** Maybe, or maybe not. Sure, *some* change is likely, but will it be truly *mind-boggling* and *dramatic*? So far, most of the promises have been more in the way of *improvements*, rather than outright, dramatic changes. What criteria should be used to judge how dramatic a change is? How do we judge the ramifications of any particular change? How do we filter out this kind of hype and get to legitimate forecasts?

**Do classical application developers really need to know about bits?** For the most part, a plethora of high-level programming features has enabled most developers of classical applications to rarely, if ever, need to know about bits. Mostly, concern with bits is an archaic holdover from decades past, when details of the bit layout of data items were a more urgent matter. Now, not so much, if ever. Some software developers still do need to know about bits, such as systems-level developers and those implementing data conversions. And many of the times people think they need to know about bits, it’s usually really to know about or specify the capacity of various number formats.

**Do quantum application developers really need to know about qubits?** Currently, yes, no question about it, but as advanced higher-level programming models are developed, this could change — and *should* change. It’s all about higher-level abstractions.
See also: *Do classical application developers really need to know about bits?*

**Are logical qubits analogous to virtual memory on a classical computer?** No, not exactly, and not really close, other than the simple fact that both models map a logical concept to a physical concept. The two models are actually diametrical opposites — there are more physical qubits than logical qubits, but more virtual pages than physical pages. Still, it’s an interesting analogy. And maybe someday we may have *quantum storage*, so that we could have *virtual qubits* which are *shuttled* between quantum storage and the actual qubits which can be operated on.

**The ugly truths about quantum computing.** There are so many — where to begin! Grover’s algorithm won’t beat the most sophisticated searches on classical computers. Grover’s algorithm can’t search databases. Shor’s algorithm won’t be practical for a long time, and in fact may never be practical for large numbers. NISQ will likely never achieve quantum advantage. Limited gradations of phase limit quantum Fourier transforms and quantum phase estimation — and hence limit Shor’s algorithm. Quantum computing is still a laboratory curiosity and will be for years to come. And so much more!

**How close are we to the ENIAC moment of quantum computing?**

**Limited connectivity limits D-Wave.** The lack of general-purpose gates precludes the use of arbitrary SWAP networks to transcend limited direct qubit connectivity. The latest D-Wave machine (Pegasus) has 15-qubit connectivity, but that won’t be sufficient to achieve quantum advantage for production-scale data.

**A quantum computer with a single logical qubit would be completely useless, right? Or would it?!** It would be a great testbed to prove out the feasibility of a logical qubit and to test all single-qubit operations, including superposition, even if it is not sufficient to test two-qubit gates or entanglement.
It will still be some time before a machine has enough physical qubits to build two logical qubits.

**Will quantum computing change any of our lives?** Not likely in any dramatic way. Just lots of little, minor ways, hardly discernible from the myriad incremental changes from classical technology improvements. Maybe there might be some true quantum leap(s) eventually, but nothing proposed or on the horizon.

**Early classical computers were both amazing and atrocious.** The 1940s. Hints of the awesome future to come. Far better than nothing. Very limited. Very hard to use. Very unreliable. Very difficult to program. Compare and contrast with the state of quantum computing.

**Heuristics for quantum computing.** Shortcuts to get to solutions. We don’t have a clear body of them at the present moment. Rules of thumb as well.

**Quantum computing needs higher-level logical operations rather than raw physical operations.** Analogous to classical computing, where higher-level abstractions were close enough to the way mathematicians and scientists worked that programming was not a monumentally overwhelming task, as it is with quantum computing. Programmers of classical computers never needed to know the physics of how transistors worked, or even digital logic gates. Operations for high-level capabilities such as Boolean logic, integer arithmetic, real arithmetic, characters (bytes), arrays, subscripts, algebraic expressions, etc. greatly facilitated classical programming.
Quantum computing needs analogous intellectual leverage.

**No, you can’t become Quantum Ready for what quantum computing will look like five years from now.** The advances needed to make quantum computing usable — to accomplish needed tasks, to achieve quantum advantage, and to make quantum computing approachable by less than elite staff — will result in such a dramatic change in what quantum computing looks like that everything you can learn and do today with quantum computing will be hopelessly obsolete and distinctly undesirable in five years. It is much better for most people to wait 2–3 or maybe even 4–5 years.

**Is quantum computing a chimera?** Chimera: “*a thing that is hoped or wished for but in fact is illusory or impossible to achieve.*” It’s still quite possibly a chimera. Yes, there’s a lot of tantalizing progress, but so much of the promised future still has the feel of an illusion. Quantum advantage is still not a slam-dunk. Overall, quantum computing does still have a lot of the feel of being a chimera. It will likely be another 5–7 years before we can even hope to answer this question, and even then the answer may still be elusive.

**The mythology of quantum computing.** So many claims (myths?) are made about quantum computing, but so few of them have become a reality. There does seem to be much more hype, hyperbole, and wild promising than actual fact. How to sort out the myths from the facts. Finding the few needles in the vast haystacks of spin. There has to be a little bit of nutrition in all of that cotton candy, right?

**Some resources for people who are approaching quantum computing for the first time.** Before even getting into a tutorial on quantum computing, I have five informal papers:

1. What Is Quantum Information Science?

https://jackkrupansky.medium.com/what-is-quantum-information-science-1bbeff565847.

2. What Applications Are Suitable for a Quantum Computer?

https://jackkrupansky.medium.com/what-applications-are-suitable-for-a-quantum-computer-5584ef62c38a.

3. What Can’t a Quantum Computer Compute?

https://jackkrupansky.medium.com/what-cant-a-quantum-computer-compute-53bf3945d419.

4. What Are Quantum Effects and How Do They Enable Quantum Information Science?

https://jackkrupansky.medium.com/what-are-quantum-effects-and-how-do-they-enable-quantum-information-science-75981368d059.

5. Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing.

https://jackkrupansky.medium.com/little-data-with-a-big-solution-space-the-sweet-spot-for-quantum-computing-b6c6d76ada6d.

**Should GE get into the quantum computing business?** They may not have Honeywell’s specific technical background in quantum effects, but both companies were once in the computer business. In fact, GE sold its computer business to Honeywell, which later spun out and sold its own computer business to a joint venture of a Japanese company and a French company. GE is a prime candidate for quantum computing applications — that could drive their interest and be a key competitive advantage. Whether they should acquire an existing quantum computing vendor, partner with one, or make a strategic investment in one (or more), there are plenty of paths GE could take. Personally, I’d like to see GE take a major stake in IonQ, which is on the verge of becoming a public company as a result of a SPAC acquisition. But there are other attractive possibilities as well. There is also the question of timing — it’s still the early stage and a lot will shake out, so waiting 2–3 years might be advisable — the early birds are not necessarily the ultimate survivors.

**Should other industrial companies get into the quantum computing business?** Besides Honeywell and GE, quite a few industrial companies got into the computer business from the 1950s into the 1980s, including Ford (Philco) and Goodyear with a pair of supercomputers. These industrial companies are prime candidates for applications of quantum computers. They could focus on co-design to assure that the hardware, algorithms, and applications are optimally matched to each other. It would be a good idea for the federal government to provide grants and contracts to seed such efforts, since the federal government may be the biggest single beneficiary of advances in quantum computing and applications. The Pentagon may be the biggest beneficiary. Aerospace companies could be a major focus.
See also: *Should GE get into the quantum computing business?*

**How many qubits could you simulate on a classical quantum simulator the size of a football field?** Purely a thought experiment — no serious proposal intended, with the football field as a metaphor for a decent-sized data center. The current belief is that you couldn’t simulate a quantum circuit larger than about 50 qubits on even the largest classical supercomputers, but it would be extremely beneficial to be able to simulate larger quantum circuits — for debugging, if for no other reason. Without such a simulator, it won’t be possible to validate many results for algorithms using 50 to 80 qubits, especially if such algorithms are designed to achieve quantum advantage. Sure, even these mega-simulators would be of no help for algorithms using 100 or more qubits, but we really do need to validate algorithms of substantial size, not just a few dozen qubits — 38 to 41 qubits seems to be about the current practical limit. Three scenarios: first, a supercomputer or distributed cluster of high-end classical processors; second, a large distributed cluster of small, cheap, commodity classical processors; third, a custom-designed processor focused solely on simulation of quantum circuits, with the goal of much smaller processors so that many more can be packed into that football field. Could 55 qubits be simulated? 60? 65? 70? 75? 80? How much RAM would be needed? How much flash storage? How much rotating storage?

**Will necessity be the mother of invention for quantum computing?** What necessities, or technical or market-driven needs, will guide the development of quantum computing capabilities in the years ahead?
What technical innovations will be driven from outside of the laboratory, as opposed to being driven by academic ideas, theory, and research in the lab?

**What will be the smallest quantum computer that achieves dramatic quantum advantage?** Both in terms of physical form factor and number of qubits.

**What might a quantum minicomputer look like?** In terms of capabilities, cost, and physical packaging. Adopt, adapt, and upgrade classical minicomputer pioneer Gordon Bell’s terminology: “*1. It is the minimum computer (or very near it) that can be built with the state of the art technology. 2. It is that computer that can be purchased for a given, relatively minimal, fixed cost (e.g., $10K in 1970).*” It would definitely need to be general purpose. I lean towards requiring that it be capable of achieving dramatic quantum advantage — there’s no real need for a quantum computer which has no advantage over a classical computer. I wouldn’t worry too much about cost at this stage — the primary concern and limiting factor at this stage is to achieve dramatic quantum advantage, even at any cost. It is worth noting that classical minicomputers arrived on the scene in the 1960s — two decades after their larger and more expensive brethren began arriving.

**Fundamental organizing principles and key insights for quantum computing.** What really makes quantum computing tick? What is quantum parallelism really all about? How does the product state of entangled qubits really work its magic? Not all the gory low-level technical details, but the key abstractions and functional elements of quantum computing. See also: *What Are Quantum Effects and How Do They Enable Quantum Information Science?*

**Why is quantum computing so hard to explain?** Maybe because nobody has done anything useful, exciting, and dramatic with it yet! We understand a technology best by what real-world achievements have been accomplished using it.
We haven’t used it to go to the moon, design new rocket propulsion, design a new drug, design a new battery, solve any great science or math mysteries, etc. Only once we have accomplished such achievements can we explain what magic sauce enabled the quantum solution. And why can’t we achieve anything yet? Not enough qubits. Not enough qubit fidelity. Insufficient connectivity. No decent programming model. No toolkit of sufficiently useful algorithmic building blocks that pave the path to solutions to big real-world problems. The difficulty of converting classical algorithms to quantum. The inability to simulate quantum circuits of a meaningful size. A lack of research focus on even the largest circuits that we currently can simulate (where are all the 40-qubit algorithms?!). Other than all of that… we’re making great progress — we just can’t explain it… yet!

**What is exponential speedup?** Put simply, a quantum algorithm offers *polynomial computational complexity* (better) while a comparable classical algorithm offers *exponential computational complexity* (worse). For input size n, the classical algorithm might take O(2^n) time while the quantum algorithm might take only O(n²) time. For example, for n = 40, the classical algorithm might take 2⁴⁰ ≈ one trillion seconds, while the quantum algorithm might take only 40² = 1,600 seconds — that’s an exponential speedup. And the quantum advantage only grows as the input size increases. See also: *What Is Quantum Advantage and What Is Quantum Supremacy?* and *What Is the Quantum Advantage of Your Quantum Algorithm?*

**Is it too early for students to become Quantum Natives?** Yes, it’s too early, definitely — unless you are a quantum physicist, in which case there is an endless amount of research needed. First, no quantum applications are practical for production-scale deployment at this time or over the next few years, mostly because the necessary hardware is not available.
Second, many quantum applications will be hybrid anyway, such as variational methods, so technical professionals must be skilled in classical techniques as well as quantum techniques. Third, it’s always advisable to have a backup plan in case some hot new field (quantum algorithms) doesn’t pan out, or doesn’t pan out soon enough to provide you with a decent income stream. Fourth, quantum advantage requires comparison of quantum solutions to classical solutions, so you should have at least a fair degree of knowledge of and skill with classical methods to perform an adequate comparison. Fifth, there will probably be high demand for professionals equally skilled in both quantum and classical technology for years to come. Best to be dual-technology for now, maybe 25% quantum and 75% classical, or 75% quantum and 25% classical at the extreme, if you really are at the hardcore super-elite level.

**Every large organization should write an RFP with specs for what they need or require in a quantum computer.** RFP = Request For Proposal. And if they don’t know what they need, they have a lot of work to do to scope out… their needs. Number of qubits. Connectivity. Qubit fidelity — nines. Coherence and circuit depth. Macro operations built into firmware. Granularity of phase and probability amplitudes. Algorithmic building blocks needed. Timeframe needed. Data capacity, or the size of n for algorithms based on n. Applications which they intend to run. Larger organizations or consortiums of organizations should commission development of hardware and any needed research. Vendors should have two parallel sets of teams, one focused on general-purpose machines and one focused on specific customer-driven specs.

**My comments on Preskill’s “*Quantum computing 40 years later*”.** Especially his cautions. https://arxiv.org/abs/2106.10522.

**What qubit fidelity is needed to do anything useful with a quantum computer?** Define useful! Maybe three nines minimal. Depends on whether full QEC is needed.
Are near-perfect qubits (four to six nines) required? Even if only generating random numbers?

**Need for a standardized set of benchmark tests for quantum computing.** Tests of the functional capabilities needed, as well as performance requirements. They should closely mimic what real applications will need to do. Needed for vendors to run tests. Needed for users to confirm that a machine does what they need. Also needed by users to study and learn from, such as for best practices. It would be nice if users could use them as templates to modify and adapt to their own needs.

**Calibration of qubits and qubit processing.** What it is, what it costs, how often and when to do it, and the fact that its cost grows exponentially as the number of qubits grows. How many qubits can we add before the calibration cost becomes too excessive? Is a more modular QPU design needed, so that modules can be smaller and calibrated independently?

**Generative coding of quantum circuits.** Rather than hand-coding quantum circuits, classical application code should generate quantum circuits dynamically, especially for parameterized design patterns (e.g., the quantum Fourier transform). Reduce the chances for errors. Reuse such parameterized generative coding to enhance productivity and reduce errors. Increased incentive to optimize for commonly used design patterns which are used in a number of instances.

**Key questions about quantum advantage to ask when reviewing algorithms, applications, papers, projects, and products.** Mostly just making sure the matter is discussed fully and thoroughly, and in specific detail. How scalable is it as well? See the section in *What Is Dramatic Quantum Advantage?*

**Will quantum error correction necessarily reduce or eliminate measurement errors?** I haven’t seen anything to suggest that. Maybe there is some plan or expectation of some new approach to measurement?
Already in my FTQC paper as Top question #8, but maybe worth a separate paper to highlight it.

**Have proponents of quantum computing grown too fat, happy, and complacent with a false sense of supreme accomplishment?** And still not near dramatic quantum advantage. Still just a *laboratory curiosity*. Still not even close to being ready for prime time or production-scale applications. Spending a lot of money on premature commercialization when so much more basic research is needed.

**How relevant is Hilbert space when designing and using quantum algorithms?** What specific details of Hilbert space are required for quantum algorithm design? Or are just a few details needed (e.g., probability amplitudes are complex numbers), while all of the rest of the conceptual framework of a Hilbert space can be ignored, for the most part, in most situations, by most quantum algorithm designers and most users of quantum algorithms? I would simply note that many published papers and articles related to quantum computing don’t even mention Hilbert space. I would concede that the concept of a Hilbert space is central to quantum mechanics and hence to quantum computing and quantum information science in general, but whether it is central to the level at which most quantum algorithm designers and quantum application developers work is another matter.

**Will quantum computing be ready to satisfy your organization’s needs within the next two years?** No. Even 5–7 years is being optimistic.

**Is the fatal flaw of Quantum Volume that it succeeds even if only slightly more than 67% of tests succeed?** That seems to be the case. According to the IBM Qiskit web page for Quantum Volume, it checks whether more than 2/3 of the tests got the correct result bitstring. Maybe the test needs to be parameterized for the needs of particular applications. This seems like a rather low threshold in any case.
It may have been a decent result in 2019 for very noisy hardware, but we should be setting our sights higher — much higher.

**What is cross-entropy benchmarking (XEB)?** See the Google paper, *Characterizing quantum supremacy in near-term devices*, in *Nature*. But explain it more simply, and describe its relevance and utility.

**Is order finding an essential technique for quantum algorithms?** AKA period finding. Used for Shor’s factoring algorithm. In the early days it was talked about as having significant utility, but I rarely see it mentioned these days. It has no utility for current NISQ devices since it cannot be practically implemented with low qubit fidelity. The question is whether it will warrant renewed interest once qubits gain substantial fidelity, such as with near-perfect qubits and quantum error correction.

**Can quantum advantage be achieved without quantum Fourier transform?** I strongly suspect not, in general. There may be niche cases where minimal quantum advantage or fractional quantum advantage can be achieved, but likely not significant or dramatic quantum advantage, in general.

**Is quantum Fourier transform essential to achieving quantum advantage?** See *Can quantum advantage be achieved without quantum Fourier transform?*

**Can dramatic quantum advantage be achieved without quantum Fourier transform?** I strongly suspect not, in general. There may be niche cases where minimal quantum advantage or fractional quantum advantage can be achieved, but likely not dramatic quantum advantage, in general.

**Is quantum Fourier transform essential to achieving dramatic quantum advantage?** See *Can dramatic quantum advantage be achieved without quantum Fourier transform?*

**Need a roadmap for quantum advantage.** What are all of the milestones for hardware, algorithms, and applications to get to quantum advantage? What are the basic research milestones, as distinct from the engineering milestones?
And milestones for levels of quantum advantage.

**How many nines of qubit fidelity are needed for various sizes of quantum Fourier transform?** Is there a fixed formula, in both directions — nines for QFT size n, and QFT size for k nines? Is there a minimum (for, say, a 4-bit or 8-bit QFT)? What is the largest QFT for, say, 6, 9, and 12 nines of qubit fidelity? Is there any limit on QFT size under quantum error correction (QEC)?

**What will quantum computing be?** What will it eventually be once it has evolved and matured to the stage where it can support interesting production-scale applications and achieve dramatic quantum advantage? What will it be, as opposed to what we currently have in current and near-term quantum computing, and not constrained by current limitations?

**What will quantum computing be like in 5 or 10 years?** See *What will quantum computing be?* Maybe a roadmap for the capabilities which are missing or not fully functional in current quantum computers.

**What’s the point of quantum computing as it exists today?** It doesn’t really solve any real, practical, production-scale problems today, let alone with a dramatic quantum advantage. It is simply an interesting laboratory curiosity, and shows the current, too-limited state of the art. The only real benefit is to justify the need for funding of additional research. And to keep the dream alive — in the hope that additional research will eventually enable construction of quantum computers and development of quantum algorithms and applications which actually do solve some real, practical, production-scale problems, and with a dramatic quantum advantage.

**Need for an alternative to Shor’s factoring algorithm as the exemplar of the best that quantum computing has to offer.** We need a use case that is on the short list of the most promising applications of quantum computing, such as drug discovery, developing new materials, developing new batteries, business process optimization, finance, etc.
Cracking 4096-bit encryption keys is not on anybody’s list of why we’re interested in quantum computing. We need a roadmap of incremental and scalable progress from where we are today to actually solving a real, production-scale business problem. It needs to demonstrate dramatic quantum advantage.

**Quantum computing won’t have earned its keep until we achieve dramatic quantum advantage across a broad range of applications.**

**Is there any quantum circuit which can’t be simulated (other than by size)?** Granted, quantum circuits larger than about 50 qubits become problematic to simulate due to resource capacity requirements. But functionally, are there any quantum logic gates or combinations of gates which cannot be functionally simulated on a classical quantum simulator? None that I know of, at this time, but… is there some constraint that I don’t know about?

**Would there be any significant performance or capacity benefits from a classical quantum simulator which can simulate a designated fraction or subset of a large, complex quantum circuit?** Mostly based on the number of qubits as the primary limiting factor. Substitute fixed values for any references to qubits and their quantum states which would be outside of the designated subset of the full circuit. Also make it easy to cut or partition circuits — snapshot the simulated qubit quantum states at the outer edge of a cut and be able to feed them in as fixed values for the quantum states at the inner edges of the cut for the adjacent fractional subset of the whole circuit.
When might this work, and when might it either not work at all or be significantly suboptimal?

**Would there be any significant performance or capacity benefits from a classical quantum simulator based on much more limited precision for the entries of unitary matrices (quantum logic gates) and for qubit probability amplitudes as well?** Or is there some lower limit for precision below which the simulator results would be of too-low fidelity to be acceptable or useful? Could scaled integers be used as an alternative to floating-point arithmetic?

**Incremental progress is great, but a lot of quantum leaps are desperately needed to achieve dramatic quantum advantage for production-scale practical applications.** How can we accelerate the process? Dramatically ramp up funding of basic research, for one. Put more emphasis on bold, high-risk research projects.

**Prescription for forward movement in quantum computing: dramatically dial down commercialization efforts and dramatically dial up basic research in hardware, simulation, programming models, and algorithms.** Commercialization of current technology will not lead to dramatic quantum advantage. Little if any of the current technology will be relevant in 5–10 years. Better to focus algorithm research on the hardware expected 2–7 years out. Better to focus on simulating 40-qubit algorithms using higher-fidelity qubits than on smaller NISQ algorithms, to be ready to exploit future hardware as it becomes available. Better to push for higher-performance and higher-capacity classical quantum simulators to enable current simulation of 32-qubit, 40-qubit, and even 48-qubit algorithms with deeper circuits. The current programming model is grossly insufficient for commercial consumption.

**Contemplate the value of quantum computing for AI.** Beyond current conceptualizations of quantum machine learning (QML). What exactly can be done with quantum parallelism?
Exactly what quantum computational mechanisms do we need to fully simulate the human brain and mind, or at least a significant fraction of them?

**Quantum computing for health care.** What are the possibilities, besides drug discovery? Anything with 2^n possibilities to evaluate, for n >= 50.

**The BASIC Moment for quantum computing.** Is this actually meaningful? A la the FORTRAN Moment, but quantum computing for the masses, as occurred with the BASIC programming language for classical computers in the 1960s and 1970s, even for average high school students or non-elite and non-STEM college students.
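Several topics in this list lean on the same back-of-envelope arithmetic: converting a fidelity to "nines," sizing the RAM for a full statevector simulation, and comparing exponential to polynomial runtimes. A minimal Python sketch of all three, where the function names are my own and 16 bytes per amplitude assumes double-precision complex values:

```python
import math

def nines(fidelity):
    """Convert a fidelity such as 0.999 to 'nines' (3.0); fractional nines fall out naturally."""
    return -math.log10(1.0 - fidelity)

def statevector_tib(n_qubits):
    """RAM in TiB for a full 2^n statevector at 16 bytes (complex128) per amplitude."""
    return (2 ** n_qubits) * 16 / 2 ** 40

# Nines of qubit fidelity: 99.9% is three nines, 99.99% is four.
print(round(nines(0.999), 2))   # 3.0
print(round(nines(0.9999), 2))  # 4.0

# Why roughly 50 qubits is the practical ceiling for full statevector simulation.
print(statevector_tib(40))      # 16.0 TiB
print(statevector_tib(50))      # 16384.0 TiB

# Exponential vs. polynomial: O(2^n) classical steps vs. O(n^2) quantum steps at n = 40.
print(2 ** 40)                  # 1099511627776, about one trillion
print(40 ** 2)                  # 1600
```

The memory figures cover only a brute-force statevector simulator; tensor-network and other clever simulation methods can sometimes do better for shallow or structured circuits, which is why the practical limit is a range rather than a hard wall.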

Again, this list will be updated on a fairly frequent basis.

And for topics previously on this list, see my list of current writing on quantum computing:

Comments welcome.

# Topics not under consideration

My intention is to remain focused on capabilities, ideas, limitations, issues, principles, and theory, so here are the topics that I have no intention of writing about:

- **Hands-on tutorials.**
- **Hands-on details of particular quantum computers.**
- **Hands-on details of particular algorithms.** With some rare exceptions.
- **Hands-on details of particular quantum circuits.** With some rare exceptions.
- **Hands-on details of particular class libraries and APIs.**
- **Hands-on details of use of particular programming languages.**
- **Common, popular, and favorite circuit patterns and quantum programming pearls.** I’ll discuss the need, but stay away from the details.
- **Details of specific algorithmic building blocks.** I’ll discuss the need, but stay away from the details. With some rare exceptions.
- **Physical details of particular quantum computers.** Including specifications, timing, and performance.
- **Details of particular unitary transforms or particular quantum logic gates.** With some rare exceptions.
- **Details of support software.**
- **“How to…”** In general. With some rare exceptions.
- **Funding and grant announcements.** Unless something especially noteworthy.
- **Announcements of partnerships and deals.**
- **Press releases.** Unless something especially noteworthy.
- **Company news.** Unless something especially noteworthy.
- **Marketing.**
- **Market studies and projections.**
- **Competitive comparisons of vendors.**

# Book(s)?

Every once in a while I contemplate writing a book about quantum computing. Some of my “informal” papers are literally book-length. And there are probably small collections of related papers which conceivably could be of value if published as books.

Some topic areas that might make sense for me as books:

- Introduction to quantum computing.
- What is quantum computing?
- Introduction to quantum information science and quantum effects.
- The Greatest Challenges for Quantum Computing Are Hardware and Algorithms.
- Quantum computing for managers and executives.

So far, reason has prevailed and I have refrained from attempting to turn any of my writing about quantum computing into one or more books.

Still, the thought persists in the darker recesses at the back of my mind.

One of these days… but not any day soon… hopefully.

For more of my writing: **List of My Papers on Quantum Computing**.