# Essential and Urgent Research Areas for Quantum Computing

Despite the great progress in the development of quantum computing in recent years, much research is still needed on all fronts of quantum computing, and quantum information science overall, before the field is truly ready for commercialization and deployment of production-scale, production-quality, practical real-world applications that address real-world problems and opportunities. This informal paper attempts to highlight and summarize many of the areas of quantum computing which are in most desperate need of research, including theory, basic and pure research, experimental research, and applied research.

These research areas are described informally here. They are not intended to be directly suitable for formal research grant applications, but simply to support general discussion of how much research quantum computing needs before the technology is really suitable for prime-time commercial deployment — or even for development of prototype experimental applications that at least approximate production-scale, production-quality solutions to practical real-world problems.

My goal with this paper is to highlight and summarize areas where quantum computing is in need of research so that in other papers I can reference this paper to outline what I mean by research in quantum computing.

Just a few sample areas of needed research to give a sense of the landscape:

- **More reliable qubits.**
- **Longer qubit coherence.** Handle deeper quantum circuits.
- **More qubits. Greater density.**
- **Greater qubit connectivity.**
- **Fault-tolerant quantum computing (FTQC).**
- **Quantum error correction (QEC) and logical qubits.**
- **Near-perfect qubits.**
- **Finer granularity of phase and probability amplitude.** Fully support quantum Fourier transform (QFT) and quantum phase estimation (QPE) for a relatively large number of qubits.
- **New qubit technologies.** But specific technologies are beyond the scope of this informal paper.
- **More advanced quantum computing architectures.**
- **Modular quantum computers.**
- **Universal quantum computers.** Full integration of classical and quantum processing.
- **Greater capacity and performance for classical quantum simulators.**
- **More realistic classical quantum simulators.** Need for preconfigured noise models to match current and expected quantum computers, especially two to five years out, for algorithm researchers.
- **Debugging capabilities.** How to debug algorithms and applications for more than the qubit and quantum state capacity of even the largest classical quantum simulators is an interesting challenge.
- **Better programming models.**
- **Richer set of algorithmic building blocks.**
- **Automatically scalable quantum algorithms.**
- **Problem statement language suitable for automatically mapping application problems into quantum solutions.**
- **Quantum programming languages.**
- **Algorithm validation tools.** Especially since more than 50 qubits cannot be fully simulated.
- **Quantum networking.**
- **Quantum memory.**
- **Quantum storage.** Persistent quantum memory.
- **Quantum sensing integrated with quantum computing.** Real-time processing with no analog or digital conversion needed.
- **Quantum computer science.** A whole new field.
- **Quantum software engineering.** A whole new field.
- **Cybersecurity.**

And that’s just for starters.

**Topics covered in this informal paper:**

- QSTEM research — setting the context
- How much research is needed before commercialization of quantum computing is advisable?
- Criteria for completion of research sufficient for commercialization
- Research is done when quantum computing is no longer a mere laboratory curiosity
- Reaching The ENIAC Moment for quantum computing — first real application
- Configurable packaged quantum solutions
- Reaching The FORTRAN Moment for quantum computing — widespread applications
- Programming models, programming languages, algorithmic building blocks, and application problem statement languages
- Commercial prospects for quantum computing? Beyond the scope of this paper
- Suggestive rather than exhaustive
- My own questions as well
- Critical gaps
- Areas of greatest opportunity
- Hardware and algorithms are the two greatest challenges for quantum computing
- What big breakthrough would open up the floodgates for quantum computing?
- The ideal qubit technology has not yet been discovered or invented
- We need the quantum equivalent of classical CMOS semiconductors
- Fields of research
- Summary of areas of QSTEM research
- Types of research organizations
- Joint projects and joint programs
- Research vs. engineering
- Engineering and product development
- Research for technology vs. research for products
- Commercial projects should rely on off the shelf technology and research results rather than engaging in fresh or deep research
- Types of research
- Basic research vs. applied research
- Pure research is basic research
- Theory is research
- Theory vs. experimental research
- Fundamental research
- One person’s engineering is another person’s applied research
- Corporate research
- Research to create conceptualizations, engineering to implement them
- Research for conceptualizations which are not readily implemented
- Researchers shouldn’t feel any of the pressures associated with commercialization and product development
- Quantum computing technology vs. products
- Need for reference implementations of all research technology
- The seven main application domains
- Research is needed on all fronts, all areas of QSTEM
- Areas of QSTEM research
- Physics
- Hardware
- Firmware (see: Hardware)
- Need for finer granularity of phase and probability amplitude to support quantum Fourier transform (QFT) and quantum phase estimation (QPE)
- Alternative hardware architectures may be required to support high connectivity for over 64 to 128 qubits
- Hardware support
- Debugging quantum algorithms and applications
- Classical quantum simulators
- Quantum information science in general
- Software
- Quantum software engineering
- Quantum computer science
- Cybersecurity
- Quantum algorithm support
- Special emphasis on NPISQ quantum algorithms
- Quantum algorithms
- Quantum application support
- Quantum applications
- Quantum application solutions
- Quantum general artificial intelligence
- Quantum advantage and quantum supremacy
- Other areas of QSTEM research
- Need for a comprehensive glossary for quantum computing
- Need roadmaps for all areas of QSTEM research
- Need reverse roadmaps for all areas of QSTEM research
- Need roadmaps for all application domains
- Preprints of all papers on arXiv
- Open access to all research papers
- All code, test programs, test data, sample test output, and full documentation on GitHub
- Priority criteria
- Timing — sprint vs. marathon
- Budget — 10X to 50X, for a start
- Need to retain and grow research talent
- Need for a lone physicist with a single great idea
- Calling all grad students and PhD candidates — peruse my list of topics and questions
- Sources to pursue for future additions to this document
- Living document
- Summary and conclusions

# QSTEM research — setting the context

For the purposes of this paper I will use the contrived term *QSTEM research* to refer to STEM (Science, Technology, Engineering, and Mathematics) research related to quantum computing and the other areas of quantum information science (QIS) — quantum metrology, quantum sensing, quantum communication, and quantum networking.

For an overview of quantum information science (QIS), see my informal paper:

*What Is Quantum Information Science?*

- https://jackkrupansky.medium.com/what-is-quantum-information-science-1bbeff565847

**Note:** QIST — short for *quantum information science and technology* — is generally a synonym for QIS — quantum information science. Generally, there is no intention to exclude technology from QIS. Essentially, QIST is redundant. But, in some contexts QIS may be intended to focus on *research* and QIST is intended to emphasize the practical applications of the more theoretical and research aspects of QIS. When in doubt, just use QIS and read QIST and QIS as synonyms.

**Also note:** Some people use the term *quantum science* as a synonym or shorthand for quantum information science (QIS). Similarly, some people use the term *quantum science and technology* as a synonym or shorthand for quantum information science and technology (QIST). Generally, it is better to use the full terms, and to read the shorthand terms as intending to refer to the full terms.

Traditionally, STEM has referred to all of science, technology, engineering, and mathematics.

QSTEM research simultaneously supplements and narrows the scope of research for all of traditional STEM in the following senses:

- **Technically all of quantum computing and the other areas of quantum information science are included in STEM, by definition.** In that sense, all quantum research is included in STEM research, by definition.
- **QSTEM narrows STEM to only those areas which directly relate to quantum computing and the rest of quantum information science.** Hence, QSTEM research narrows STEM research to only those areas which directly relate to quantum computing and the rest of quantum information science.
- **QSTEM broadens or supplements STEM to include more recent work in quantum computing and the rest of quantum information science that might not have even existed in STEM 20 years ago or in some cases up to 40 years ago.**
- **Finally, most simply, QSTEM can refer to that more recent work in quantum computing and the rest of quantum information science which occurred primarily in the past 20 years as well as some up to 40 years ago.**
- **So, in that last sense, QSTEM research refers to the more recent quantum research of STEM which is occurring at present and in recent years and up to 20 years ago and in some cases up to 40 years ago, as well as quantum research — and related STEM research which may occur in future years.**

That may be a little vague and ambiguous, but it does highlight the issues and boundaries of quantum computing and the rest of quantum information science and the rest of STEM.

In short, the focus and context of this paper is QSTEM research, in all of the above listed senses.

# How much research is needed before commercialization of quantum computing is advisable?

How much more research does commercialization of quantum computing require? That’s a very open question to be pondered. The short answer is that a lot of research is still needed, on all fronts.

This informal paper seeks only to highlight relevant research areas, not to define how much research is actually required before work on commercialization is warranted.

# Criteria for completion of research sufficient for commercialization

Research is a never-ending process, but there is a question of how much research is needed before a solid foundation for development and engineering of commercial products is feasible. Or what criteria should be used to assess the viability of that transition.

Overall, the criteria include:

- **The vast majority of open fundamental research questions have been answered.**
- **No big questions of fundamental research are looming.**
- **Engineering of quantum computers as commercial products is a methodical process with little in the way of surprises.**
- **Algorithm design and application development becomes a methodical process with little in the way of surprises.**
- **Engineering and development are no longer trial and error processes.** Or at least there is a drastic and dramatic reduction in trial and error.

# Research is done when quantum computing is no longer a mere laboratory curiosity

Granted, research is a never-ending process, but the great push for more research for quantum computing can at least take a break on the day that enough research is in place that quantum computing is no longer *a mere laboratory curiosity* — when it’s ready for production-scale production-quality real-world practical applications.

For more on laboratory curiosities, read my paper:

*When Will Quantum Computing Advance Beyond Mere Laboratory Curiosity?*

- https://jackkrupansky.medium.com/when-will-quantum-computing-advance-beyond-mere-laboratory-curiosity-2e1b88329136

But until that day has clearly arrived, three goals must be pursued with intensive vigor:

- **Research.**
- **More research.**
- **Even more research.**

# Reaching The ENIAC Moment for quantum computing — first real application

A key milestone for quantum computing — and QSTEM research — is when engineers are able to reach The ENIAC Moment for quantum computing — the first instance of designing and building a quantum computer which supports a production-scale production-quality practical real-world application. Both hardware and software, including algorithms and the application.

Much additional research will still be required since only the most elite teams will be capable of achieving The ENIAC Moment. The FORTRAN Moment will have to be reached before quantum computing applications can be readily developed and deployed by non-elite teams.

For more on The ENIAC Moment for quantum computing:

*When Will Quantum Computing Have Its ENIAC Moment?*

- https://jackkrupansky.medium.com/when-will-quantum-computing-have-its-eniac-moment-8769c6ba450d

# Configurable packaged quantum solutions

It will be a number of years before non-elite teams will be able to design and develop sophisticated quantum applications. In the interim, but after The ENIAC Moment has been reached, *configurable packaged quantum solutions* will enable more non-elite organizations to utilize quantum applications in their operations even though they are technically unable to design and develop such algorithms and applications in-house.

Elite quantum technical teams will develop sophisticated algorithms and applications, but in a way that can be easily configured to adapt them to a wide range of input data and operating environments.

A non-elite team can then customize the algorithms and applications strictly using configuration data, with no need to modify or even look at the underlying quantum algorithms or application code.

These configurable packaged quantum solutions will automatically generate the needed quantum algorithm circuits as needed based on the input data and configuration settings.

This approach will work well for applications which are common across an industry or across many companies and organizations of similar size.

Sure, many organizations will eventually wish to design and develop their own quantum algorithms and applications, but it will be many years before that is feasible. Configurable packaged quantum solutions will offer an important and valuable intermediate stepping stone until the day when non-elite teams eventually can easily design and develop custom quantum algorithms and applications on their own.
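To make the idea of a configurable packaged quantum solution concrete, here is a minimal sketch in plain Python of circuit generation driven entirely by configuration data. All of the names here (`SolutionConfig`, `generate_circuit`, the gate vocabulary, the "assets" domain framing) are hypothetical illustrations of my own, not any vendor's API; a real packaged solution would emit circuits for a specific quantum SDK rather than plain strings.

```python
from dataclasses import dataclass

# Hypothetical sketch: a "packaged quantum solution" whose circuit is
# generated entirely from configuration data supplied by a non-elite team,
# who never modify or even look at the underlying algorithm code.

@dataclass
class SolutionConfig:
    num_assets: int        # domain-level setting supplied via configuration
    risk_iterations: int   # domain-level setting controlling circuit depth

def generate_circuit(config: SolutionConfig) -> list[str]:
    """Translate domain-level configuration into an abstract gate list.
    Gates are plain strings here to keep the sketch self-contained."""
    # Uniform superposition over one qubit per asset.
    circuit = [f"H q{i}" for i in range(config.num_assets)]
    # Repeated entangling layers, with depth set by configuration.
    for _ in range(config.risk_iterations):
        circuit += [f"CZ q{i} q{i + 1}" for i in range(config.num_assets - 1)]
    # Measure every qubit.
    circuit += [f"MEASURE q{i}" for i in range(config.num_assets)]
    return circuit

print(generate_circuit(SolutionConfig(num_assets=3, risk_iterations=1)))
```

The point of the sketch is the division of labor: the elite team writes and maintains `generate_circuit`; the customer team touches only `SolutionConfig`.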

# Reaching The FORTRAN Moment for quantum computing — widespread applications

A key milestone for quantum computing — and QSTEM research — is when engineers are able to reach The FORTRAN Moment for quantum computing — non-elite quantum application developers can design, develop, and deploy production-scale production-quality practical real-world applications with relative ease.

Higher-level programming models and programming languages will be needed, including much richer libraries of algorithmic building blocks than currently conceived.

Much additional research will still be required, in much the same sense that much research in classical computing still occurred after the mid-1960s — classical computing had achieved tremendous success by then, but much more was to come.

For more on The FORTRAN Moment for quantum computing:

*When Will Quantum Computing Have Its FORTRAN Moment?*

- https://jackkrupansky.medium.com/when-will-quantum-computing-have-its-fortran-moment-995f510605cd

# Programming models, programming languages, algorithmic building blocks, and application problem statement languages

A significant collection of research tasks will revolve around programming models:

- **Quantum programming models.** Plural — different levels, different needs. Higher level for greater intellectual leverage. Lower level for greater performance and control. Common concepts and elements across the various programming models.
- **Quantum programming languages.** Plural — different levels, different needs. Their conceptual, syntactic, and semantic capabilities. Higher level for greater intellectual leverage. Lower level for greater performance and control. Common concepts and elements across the various programming languages.
- **Quantum algorithmic building blocks.** High-level primitives which mask much of the complexity of low-level quantum logic gates.
- **Quantum application problem statement languages.** Allow application developers to describe applications in domain-specific terms which can then be automatically transformed into quantum algorithms.

The essential issue is how the programmer conceptualizes application problems and transforms them into working quantum algorithms and applications.

These four main areas of QSTEM research will largely determine how easy or hard it is to design and develop quantum algorithms and applications.

See the *Quantum computer science* section for many of these research tasks.
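As a rough illustration of what algorithmic building blocks above the gate level might look like, here is a hypothetical sketch in plain Python. The building-block names and the trivial "compiler" are my own inventions for discussion, not any existing quantum programming framework; the point is that the programmer composes named high-level primitives while something else expands them into low-level gates.

```python
# Hypothetical sketch of algorithmic building blocks: named high-level
# primitives that a compiler could expand into low-level gate sequences.
# Gates are represented as plain tuples to keep the sketch self-contained.

def superposition(qubits):
    """Building block: put every qubit into uniform superposition."""
    return [("H", q) for q in qubits]

def entangle_chain(qubits):
    """Building block: entangle neighboring qubits in a chain."""
    return [("CX", a, b) for a, b in zip(qubits, qubits[1:])]

def measure_all(qubits):
    """Building block: measure every qubit."""
    return [("MEASURE", q) for q in qubits]

def compile_program(blocks):
    """A 'program' is a list of building-block invocations; 'compilation'
    here is simple concatenation, standing in for real circuit synthesis."""
    return [gate for block in blocks for gate in block]

qubits = list(range(4))
program = compile_program([
    superposition(qubits),
    entangle_chain(qubits),
    measure_all(qubits),
])
```

The programmer's mental model is three named steps; the low-level gate list is a compilation artifact, which is exactly the intellectual leverage the higher-level programming models above are meant to provide.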

# Commercial prospects for quantum computing? Beyond the scope of this paper

What are the commercial prospects for quantum computing? That’s beyond the scope of this paper, which focuses on research.

It is a fair question, but so much really depends on the pace and progress of research, which is inherently unpredictable.

Some interesting timeframes:

- **Right now and the next few months.**
- **A year from now.**
- **Two years from now.**
- **Three to four years from now.**
- **Five years from now.**
- **Seven years from now.**
- **Ten years from now.**
- **Fifteen years from now.**
- **Twenty years from now.**
- **Or whenever research has reached the stage where quantum computing is no longer a mere laboratory curiosity.**

I’d suggest that we might see some interesting research results five to seven years from now, but it’s all a great gamble. Just roll the dice… and keep on rolling them.

# Suggestive rather than exhaustive

This informal paper seeks only to highlight and suggest representative areas of research for quantum computing, rather than attempt to be exhaustive.

It’s everything of which I am aware, but there may be many areas of research or potential research of which I am unaware.

# My own questions as well

In addition to generally recognized areas of research interest, this paper also includes a lot of my own concerns and questions which may need deeper research to answer.

Many of my own questions and issues may already have answers, somewhere, but just need to be made accessible and stated in plain language.

After all, the whole point of research is that the results should be readily accessible by engineers and product developers seeking to turn research into products and services.

# Critical gaps

These are the most critical areas where deeper research is most desperately needed for quantum computing:

- **More qubits.** Hundreds, thousands. And more.
- **Greater qubit fidelity.**
- **Longer coherence time.** Support much deeper circuits.
- **More reliable gate execution.** Needed for greater qubit fidelity.
- **Need for greater qubit connectivity.** Need for full any-to-any connectivity.
- **More reliable qubit measurement.**
- **Need for finer granularity of phase and probability amplitude.** Support quantum Fourier transform (QFT) and quantum phase estimation (QPE).
- **Larger simulations.** Current 50-qubit limit (actually 40 qubits).
- **More realistic simulations.** Simulators aren't generally configured to closely match realistic quantum computer operating conditions — they're too ideal or too arbitrary with noise models. Developers should be able to easily run simulations of quantum computers expected to become available in two to five years — or to closely match existing quantum computers or those expected over the coming year.
- **Lack of debugging capabilities.** How to debug algorithms and applications for more than the qubit and quantum state capacity of even the largest classical quantum simulators is an interesting challenge.
- **Quantum computing is (currently) far too complicated for all but the most elite of professionals.** To do anything useful, that is. When will that change? Maybe, or in theory, at The FORTRAN Moment.
- **Quantum computing is too different from classical computing.** Difficult to transfer knowledge, expertise, and data between the two domains.
- **Difficulty of transforming application requirements into quantum algorithms.**
- **Lack of high-level programming models.**
- **Lack of a robust collection of algorithmic building blocks.**
- **Need for much more powerful algorithmic building blocks.**
- **Need for high-level quantum programming languages.**
- **Whether order-finding (AKA period-finding) can be made practical.**
- **Whether Shor's factoring algorithm is practical even for much more moderate size integers (64–512 bits).**
- **Whether Grover's search algorithm has any practical utility which can ever deliver dramatic quantum advantage.**
- **Dearth of 40-qubit algorithms.**
- **Dearth of scalable algorithms.** Automatically scalable.
- **Whether dramatic quantum advantage can ever be achieved without quantum Fourier transform (QFT) and quantum phase estimation (QPE).**
- **Whether variational methods can ever achieve dramatic quantum advantage.**
- **Need a better collection of example algorithms.** That are clearly scalable and could achieve dramatic quantum advantage when the needed hardware is available.
- **Practical test case for scaling to dramatic quantum advantage.** Need to identify a practical application with a relatively simple algorithm which is scalable and could achieve dramatic quantum advantage when the needed hardware is available.

To be clear, these are gaps related only to research, not education, training, or attracting and retaining talent, etc.
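To make the phase-granularity gap concrete: the textbook quantum Fourier transform on n qubits uses controlled-phase rotations with angles as small as 2π/2^n, so the required angular resolution shrinks exponentially with qubit count. A quick plain-Python calculation (the function name is mine):

```python
import math

def smallest_qft_rotation(n_qubits: int) -> float:
    """Smallest controlled-phase rotation angle (in radians) used by the
    textbook QFT on n_qubits qubits: 2*pi / 2**n_qubits."""
    return 2.0 * math.pi / (2 ** n_qubits)

# The angular resolution the hardware must support shrinks exponentially.
for n in (10, 20, 50):
    print(f"{n:2d} qubits: smallest rotation = {smallest_qft_rotation(n):.3e} rad")
```

For 50 qubits that smallest angle is on the order of 10^-15 radians — far finer than any current hardware can control, which is why finer granularity of phase and probability amplitude appears above as a critical gap.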

# Areas of greatest opportunity

- **Discovery of new qubit technologies.** The ideal qubit technology has not yet been discovered or invented.
- **Modular quantum computers.** For greater flexibility and greater capacity.
- **Ongoing research in basic physics.** At the atomic, subatomic, molecular, and condensed matter levels. No telling what discoveries might lead to better qubit technology.
- **Development of methods for transforming non-quantum application problems into quantum algorithms.**
- **Development of high-level application problem statement languages to which transformation rules can be applied to generate quantum algorithms.**
- **Development of higher-level programming models.**
- **Development of quantum-native high-level quantum programming languages.** Syntax is irrelevant — it's the semantics which matters.
- **Development of more powerful algorithmic building blocks.**
- **Better classical quantum simulators.** More realistic. More efficient. Higher capacity. Aim to reach the 50-qubit barrier. Or even the 60-qubit barrier. Closely match hardware expected in two to five years.

# Hardware and algorithms are the two greatest challenges for quantum computing

These are the two greatest challenges to the future of quantum computing: hardware and algorithms. We need much better qubits and much better algorithms to exploit those qubits.

This paper will summarize quite a few of the challenges for both hardware and algorithms, from the research perspective, but for more on those twin challenges in general, see my informal paper:

*The Greatest Challenges for Quantum Computing Are Hardware and Algorithms*

- https://jackkrupansky.medium.com/the-greatest-challenges-for-quantum-computing-are-hardware-and-algorithms-c61061fa1210

# What big breakthrough would open up the floodgates for quantum computing?

That’s the really big question. Other than more of everything and better everything.

Generally it would have to be a much better qubit technology. Much more reliable. Full connectivity between qubits. And some reasonable number of near-perfect qubits — hundreds or thousands.

And a high-level programming model is needed. The current low-level programming model is not suitable for use by the masses of average software developers.

# The ideal qubit technology has not yet been discovered or invented

I just wanted to call out and emphasize this most central of realities and opportunities of quantum computing at present:

*The ideal qubit technology has not yet been discovered or invented.*

Fine tuning or incrementally enhancing *any* of the existing qubit hardware technologies just isn’t going to cut it.

We need *entirely new* qubit technologies.

Focusing attention on what we have now is only an unwarranted distraction and a waste of time, energy, money, and human talent.

That said, I can understand the interest in incremental improvement in the near-term so that we have some technology to experiment with while we await the discovery or invention of the ideal qubit technology — *the quantum equivalent of classical CMOS semiconductors.*

All of that said, specific qubit technologies, such as topological qubits, are beyond the scope of this informal paper — they would fall under the topics of *Better materials for constructing qubits*, *Better qubit technologies*, and *Identify and exploit many additional qubit technologies* under *Hardware*.

# We need the quantum equivalent of classical CMOS semiconductors

That’s the simplest way of putting it — after many generations of hardware evolution, classical computing finally really hit its stride with CMOS semiconductors. Maybe there is something better, but CMOS is good enough for most classical computing tasks.

The holy grail of quantum computing hardware research should be the quantum equivalent of classical CMOS semiconductors.

I’m not suggesting a qubit technology that has anything to do with CMOS per se, but one that has the same qualities of reliability, performance, power, heat, cost, size, scaling, capacity, flexibility, and ease of design.

# Fields of research

Quite a few professional fields have an interest in quantum computing:

- **Fundamental physics.** At the atomic, subatomic, molecular, and condensed matter levels.
- **Materials research.**
- **Electrical and electronic components.**
- **Insulators, conductors, connectors, couplers.**
- **Mathematics.**
- **Computer science.**
- **Electrical engineering.**
- **Computer engineering.**
- **Software development.**

# Summary of areas of QSTEM research

A very brief summary of the areas of QSTEM research:

- **Quantum states.**
- **Qubit technologies.**
- **Qubit programming model.**
- **Qubit control.**
- **Qubit entanglement.**
- **Algorithm building blocks.**
- **Algorithms.**
- **Application building blocks.**
- **Applications.**
- **Solution building blocks.**
- **Solutions.**
- **Areas of quantum information science other than quantum computing.** Including sensors, metrology, networking, and communication.

# Types of research organizations

A variety of organizations are involved in research.

- **Academic.** Major universities.
- **Government.** Government labs, such as the Department of Energy National Laboratories and NIST. Sponsoring agencies such as DOD, DARPA, NSF.
- **Large corporation — technology orientation.**
- **Large corporation — non-technology orientation.**
- **Nonprofit research organization.** Such as Mitre and Battelle.
- **Embedded within commercial projects.** Filling gaps, focused on the specific needs of the application project. For example, electric and autonomous vehicles and batteries. Some smaller companies, but larger companies as well.
- **Technology startups.** Generally venture capital is more focused on development of commercial products and services, but it is not uncommon for such efforts to include a fair amount of research.
- **Joint projects and joint programs.** Two or more partners from any of the above.
- **SBIR grants.** Small Business Innovation Research program of the U.S. government. Smaller technology firms. More focused on transition from basic research to commercialization, but not necessarily actual product development and sales.

# Joint projects and joint programs

Research programs can be quite expensive. And not every research organization can afford or attract all of the necessary talent. Joint projects and joint programs serve two distinct purposes:

- **Broader scope.** Allow a given organization to participate in more research projects than it could afford or staff on its own.
- **Division of labor.** Leverage resources and talent across organizations to build a larger critical mass of resources and talent to support larger and more ambitious projects. Even if a given organization could tackle a project on its own, a joint effort allows the joint project to capitalize on a more optimal pool of talent — different organizations may have different specialties or areas of interest, so the joint project can tap higher quality talent than if the talent came from only one of the organizations.

Some of the common combinations:

- **Academic — academic.** Different universities collaborate for larger projects.
- **Corporate — academic.** Companies may have applied research or application expertise that helps to focus academic research expertise.
- **Academic — corporate.** Universities may have intellectual property from research that they seek to commercialize.
- **Government — academic.** Government interests, such as defense, intelligence, and energy, can tap academic research expertise.
- **Government — joint academic.** Government explicitly requests proposals for multiple universities to collaborate to submit applications for joint project grants.
- **Corporate — corporate.** Two or more companies collaborate on research to leverage their respective areas of expertise.

# Research vs. engineering

There isn’t always a clear distinction and boundary between research and engineering. I would put it simply as:

- Research doesn’t have a focus on products and commercial offerings.
- Engineering has an intense focus on products and commercial offerings.

# Engineering and product development

Engineering is more closely associated with *product development* — development of *commercial products* — than research.

# Research for technology vs. research for products

Research directed at a specific outcome in the form of a commercial product is what I call *research for products* — and it is not the primary emphasis of this informal paper. Rather, the focus here is *research for technology*, which is independent of specific commercial products and focused on generic technology that could apply to a wide range of products.

That said, all research is inherently good, whatever its intended use, and hence encouraged by this paper.

# Commercial projects should rely on off the shelf technology and research results rather than engaging in fresh or deep research

Doing research for a specific product is very risky — any given research project is not guaranteed to achieve a successful outcome or complete in a reasonably short timeframe. Commercial projects should rely primarily on off the shelf technology that is proven and can be relied upon, and is reasonably predictable, especially in terms of time and resource requirements.

Very small, focused, and niche research mini-projects can sometimes be appropriate for commercial projects, but the risk has to be managed very carefully, with adequate allowances for backup plans if the research fails.

Commercial projects should endeavor to be 99% based on off the shelf technology and no more than 1% on executing new research projects.

Especially when fast-paced professional venture capital is involved, since VCs tend to be very impatient, preferring to see projects be completed within a year or two max.

Generally, trial and error tends to be used for purported research in commercial product projects — try something and see if it sort-of works, if not, try something else, rinse and repeat — more madness than method. True, methodical research is generally too time-consuming and resource-intensive for commercial projects.

# Types of research

Research is not a one-size-fits-all endeavor — there are several distinct types of research:

- **Theoretical research.** Theory. No physical solutions or lab work.
- **Pure research.** Not directed at solving any particular problems. Focus is on understanding phenomena.
- **Experimental research.** Lab work, as distinct from theory.
- **Basic research.** Distinguished from applied research.
- **Fundamental research.** Foundation for engineering and design and development of solutions suitable for use in the real world.
- **Applied research.** Focus on solving specific problems of interest for real-world applications.

# Basic research vs. applied research

There isn’t always a clear distinction and boundary between *basic research* and *applied research*. This paper treats them as *synonymous*, even though in theory there should be some distinction.

The technical distinction is the same as the research for technology vs. research for products distinction:

- **Basic research** is more *technology-oriented*.
- **Applied research** is more *product-oriented*.

# Pure research is basic research

There can be some confusion as to the distinction between *pure research* and *basic research*. This paper treats them as *synonymous*, even though in theory there should be some distinction.

The only distinction I can think of that’s relevant is that *pure research* has no prescribed end goal or target results, while *basic research* might be driven by a particular problem which requires a solution.

Either way, the actual *work* would be the same. They both serve as a foundation, upon which further research or product engineering can occur.

# Theory is research

There can be some confusion as to the distinction between *theory* and *research*. This paper treats them as *synonymous*, even though in theory there should be some distinction.

The important aspect of theory is that it’s about ideas and concepts, as well as mathematical and possibly computational models, but not focused on putting theory into *action*, which occurs as experimental research, and later engineering and product development.

# Theory vs. experimental research

Much of theory is simply *on paper* or in *computer models*, while *experimental research* is more clearly work in the *lab* using *experimental apparatus* coupled with *computational support*, possibly including some degree of *computer models*.

# Fundamental research

Whether considered theory, pure research, basic research, or applied research, the effect is that *fundamental research* is any research that is needed as a *foundation* for engineering and design and development of commercial products.

# One person’s engineering is another person’s applied research

The distinction between engineering and applied research can sometimes be quite murky and difficult to discern. It’s no surprise that the two can get confused and even used interchangeably.

I would draw a pair of distinctions:

- **Engineering requires a clear, straight path forward.** No great mysteries, just lots of hard work.
- **Applied research builds on theory and on basic, pure, and fundamental research, and seeks to find pathways to productive use of that research.** This may ultimately lead to engineering and product development, but the focus is on identifying and characterizing mysteries in a way that ultimately reveals opportunities for solutions, where the solutions are really guides for further research rather than the end result of the research itself.

But sometimes a little applied research is needed for advanced engineering projects. Just as long as care is taken to not let the research become the tail wagging the dog.

Similarly, care needs to be taken that applied research projects don’t get distorted into product engineering projects. Let researchers focus on research rather than engineering and development, and let engineers and developers focus on engineering and development rather than become distracted by research questions.

All of that said, the exact same technical task can be labeled as engineering if it’s performed by an engineer or developer, but labeled as research if performed by a researcher.

So we end up with this unfortunate pair of truths:

- **A research task is considered engineering if performed by an engineer or developer.**
- **An engineering or development task is considered research if performed by a researcher.**

In any case, the point of this paper is to emphasize the work of researchers, even if it occasionally veers into engineering and product development, as long as the intended purpose is to further a research project.

After all, many or most complex research projects inevitably involve some substantial degree of engineering and development. For example, the Large Hadron Collider at CERN, clearly a research project, involved a vast amount of intensive mechanical engineering, as well as much software development. CERN's research needs also led to the development of the World Wide Web by Tim Berners-Lee; clearly the Web had practical applications, but it was still focused on delivering value to CERN's research programs.

# Corporate research

Generally, research refers to academic and government labs, and some private and nonprofit labs. Sure, quite a few large corporations do research as well, but it tends to be more focused on applied science and oriented towards products rather than more basic, pure, and theoretical research.

Still, some amount of basic research of relevance to quantum computing does occur within large corporations.

# Research to create conceptualizations, engineering to implement them

The real goal of any robust research project is to develop a conceptualization or model for some phenomenon which is detailed and robust enough to be ready to hand off to an engineering team to be realized or turned into technology suitable for a product, an implementation of the conceptualization.

As such, a fair amount of the research areas listed in this informal paper are focused on creating conceptualizations.

# Research for conceptualizations which are not readily implemented

In some cases the implementation of a conceptualization is reasonably straightforward, requiring no more than traditional engineering processes and methods. Research is complete.

But in other cases there is no reasonably straightforward implementation that has acceptable performance — speed and capacity — and minimal complexity. In such cases, it will be necessary to perform an additional stage of research focused on understanding and addressing implementation issues. It may be that an intermediate level of technology is required before the underlying overall conceptualization can be implemented using the technology produced by that intermediate research.

And this process may unfold for any number of levels, a daisy chain of research projects, or a layered onion if you prefer, especially for very complex projects.

# Researchers shouldn’t feel any of the pressures associated with commercialization and product development

Researchers should be able to focus solely on their research, not needing to worry in the slightest about what might be happening in markets or commercial products.

Engineering and development of commercial products can entail a lot of pressure, none of which should be experienced by professional researchers.

Engineering and product development tends to have a much shorter timeline, typically measured in months or a year or two. Researchers should be able to focus on longer-term goals, three to five to ten or even twenty years, without any of the pressures of short-term commercialization efforts.

# Quantum computing technology vs. products

It is important to distinguish the raw technical capabilities of quantum computing technology versus the features and brands of products that are delivering the capabilities of the technology to users, or simply happen to use quantum computing technology under the hood and may not even be billed as quantum per se.

The focus of this paper is on the research which contributes to the raw technology and its capabilities, distinct from the use or marketing of that technology.

# Need for reference implementations of all research technology

A key result of virtually every research effort should be to deliver a *reference implementation* of the technology, preferably on GitHub. The intent is not to deliver production-quality code and hardware designs, but to supply a *baseline implementation* which production designers and developers can use as a guide for their own work.

This should include both code and clear and complete documentation.

And test data and sample test results as well.

This will ensure that research results really are *ready to go* — ready for design and development of production quantum technology — hardware, software, algorithms, tools, algorithmic building blocks, applications, and solutions.

# The seven main application domains

Although much of QSTEM research applies uniformly across all application domains, algorithm and application research will frequently be focused on specific application domains. At present, the seven major application domains are:

- **Simulating physics.**
- **Simulating chemistry.**
- **Material design.**
- **Drug design.**
- **Business optimization.**
- **Finance.**
- **Machine learning.**

There will undoubtedly be additional domains as research and implementation progress.

There is the potential for optimizing hardware around specific application domains or even particular applications as well.

Firmware optimization is possible as well.

# Research is needed on all fronts, all areas of QSTEM

Despite the great progress in development of quantum computing in recent years, much research is still very much needed on *all fronts* of quantum computing, and quantum information science overall, before the field is truly ready for commercialization and deployment of production-scale production-quality practical real-world applications to address production-scale real-world problems and opportunities.

Now to visit those fronts, the frontlines…

# Areas of QSTEM research

Here are the specific areas or categories of areas of research for quantum computing covered by the rest of this paper:

- **Physics.**
- **Hardware.**
- **Firmware (see: Hardware).**
- **Hardware support.**
- **Debugging.**
- **Classical quantum simulators.**
- **Quantum information science in general.**
- **Software.**
- **Quantum software engineering.**
- **Quantum computer science.**
- **Cybersecurity.**
- **Quantum algorithm support.**
- **Quantum algorithms.**
- **Quantum application support.**
- **Quantum applications.**
- **Quantum application solutions.**
- **Quantum general artificial intelligence.**
- **Quantum advantage and quantum supremacy.**
- **Other areas of QSTEM research.**

**Note:** There may be some duplication of research efforts across research areas since some efforts fit into multiple areas.

# Physics

Research efforts for physicists:

- **Condensed matter research leading to materials useful for qubits.**
- **Planck-level limits on precision or granularity of probability amplitude and phase angle.**
- **Is there a Planck-type limit to the amplitude of a state in the wave function of a quantum system, such that there is a maximum number of states in a quantum system?**
- **Are real and complex numbers a sound basis for physics at the quantum level?** Or even at the Planck level? Real numbers don't seem to accurately capture the essence of quanta. Real numbers work well for macroscopic approximations, but not for discrete, quantized values, such as a single photon (currently being considered for photonic quantum computing). What is the appropriate number system for quantum mechanics and the Planck level? What number system is appropriate for probability amplitudes and phase angles, again focusing on their discrete, quantum character?
- **Minimum size of a qubit.**
- **Maximum size of a qubit.**
- **Minimum separation between qubits.**
- **How might string theory impact quantum computing or quantum information science in general?**
- **How accurate and reliable can a quantum computer be if entries in a unitary matrix are irrational?** Such as an H gate with entries of 1 divided by the square root of 2, an irrational number. How much precision is needed? Will it ever be enough?
- **Physical representation or manifestation of product states — the underlying physical phenomenon.** The assertion is that the product states of entangled qubits cannot be represented as simply the sum of the quantum states of the individual qubits, so how are product states represented in physics — what is the underlying physical phenomenon that supports entangled product states? How does entanglement occur? How can entangled quantum systems (qubits) be un-entangled?
- **Does quantum computing fully support all computational requirements for simulating physics?** If not, what subset is fully supported, and what subset is not? After all, simulation of physics was Feynman's original motivation for pursuing quantum computing.
- **What is the physical process to flip the spin of a qubit?** Or even to shift the probability amplitude between the two spins. A little helpful explanation of what's really going on under the hood might make it a lot easier to understand and utilize quantum computing.
- **Is quantum computing an emergent phenomenon?** Is it a natural consequence of the quantum effects of everything?
- **How does quantum interference really work, under the hood?** What is the underlying physics?
- **Can we control a photon's quantum state?** Or can we capture enough of its state with a matter interaction, do the control, and then regenerate a fresh photon with the updated state?
- **Squeezed states of light.** What are they and how can they be used for quantum computing?
- **How much energy is needed to represent a single qubit?**
- **How much energy is needed to represent each basis state of a superposition in a qubit?**
- **How much energy is needed to represent each product state of a collection of entangled qubits (each in a superposition)?**
- **Is superposition essentially free and cheap, or inherently costly?**
- **Why is spin so much more appealing for quantum computing?** Are there more appealing alternatives?
- **Do the ambitions of quantum computer scientists exceed the available physics?** For example, the expectation of very fine granularity of probability amplitudes and phase angles to enable quantum Fourier transform (QFT) and quantum phase estimation (QPE). Also, a very large number of simultaneous product states — 2^n, where n could be 20, 50, 100, 500, 1000, or more — far more simultaneous product states than the number of particles in the universe.
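The scale gap in that last question can be checked with a few lines of arithmetic. A minimal sketch (my own illustration; the ~10^80 particle count is a common rough estimate, not a figure from this paper):

```python
import math

# Rough, commonly cited estimate of the number of particles in the
# observable universe -- an assumption used here only for scale.
PARTICLES_IN_UNIVERSE = 10 ** 80

# Smallest n for which n entangled qubits have more product states
# (2^n) than that particle count: n > 80 / log10(2).
n_cross = math.ceil(80 * math.log2(10))

print(n_cross)                                      # 266
print(2 ** n_cross > PARTICLES_IN_UNIVERSE)         # True
print(2 ** (n_cross - 1) > PARTICLES_IN_UNIVERSE)   # False
```

So somewhere around 266 fully entangled qubits already implies more simultaneous product states than particles in the universe, which is why the physics question above matters.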

For reference, Feynman’s original vision of quantum computing:

- Feynman, R.P. Simulating physics with computers. Int J Theor Phys 21, 467–488 (1982).
- https://doi.org/10.1007/BF02650179
- Received 07 May 1981, Issue June 1982

# Hardware

Includes raw physical hardware as well as firmware and control logic, plus the basic programming model supported by the hardware, analogous to the instruction set of a classical computer. Includes *quantum networking*. Includes all research to support *quantum computer engineering* and *quantum computer science* as well.

- **How specifically is quantum computing better than classical computing?** Quantum parallelism: exponential speedup. Trivial to generate true random numbers, rather than merely pseudo-random numbers or the need for specialized hardware. Anything else? Does superposition yield any advantage other than enabling quantum parallelism? Does phase offer anything special?
- **How much physics do you need to know to understand the basics of quantum computing?**
- **How much physics do you need to know to master quantum computing?**
- **How much quantum mechanics do you need to know to understand the basics of quantum computing?**
- **How much quantum mechanics do you need to know to master quantum computing?**
- **How much linear algebra do you need to know to understand the basics of quantum computing?**
- **How much linear algebra do you need to know to master quantum computing?**
- **How can knowledge of physics, quantum mechanics, and linear algebra be minimized for quantum computing?**
- **What are pure states and mixed states?** And density matrix and density operator. Do they relate to superposition, entanglement, or both? What's the point? Why should we care? What are they good for? How do we use them? FWIW, I rarely see mention of them. What are the proper terms to use when the probability amplitudes of the basis states of a single, unentangled qubit are 0.0 and 1.0 vs. neither 0.0 nor 1.0 — pure and mixed states, or some other terms? See *Introduction to Quantum Information Science Lecture Notes* by Scott Aaronson.
- **What are the theoretical limits of quantum computing?** Even with ideal hardware.
- **What are the practical limits of quantum computing?** How far can the hardware go in the relatively near future (5–10–20 years)?
- **Models for quantum computing hardware.** Simple bare hardware. Complex software and optimization over simple hardware. Complex firmware over simple hardware. Simple software over complex hardware, with macro operations in firmware and hardware. Multiple parallel hardware — multiple shots of the same circuit in parallel (2, 3, 4, 16, 64, 256, 1024), possibly using simple voting to select the dominant result. More reliable qubits vs. error correction/mitigation. Etc.
- **Better programming models.** Higher-level. Or possibly even lower-level if that facilitates compilation of macro operations (a la classical RISC instruction sets).
- **Modest improvements in programming models.** Even if truly high-level programming models are substantially more difficult to achieve.
- **How much can a quantum computer compute without recursion?** Seems like a rather severe limitation.
- **Quantum computing needs higher-level logical operations rather than raw physical operations.**
- **More qubits.**
- **Better qubits overall.**
- **Which will come next: more qubits or higher quality qubits?** Either way, we need near-perfect qubits, even for quantum error correction. Quality would help a lot. It's not clear that algorithms are capable of utilizing more qubits yet, especially if better connectivity is needed. Will ion traps reach commercialization soon or have a slower slog towards a higher number of qubits?
- **Better materials for constructing qubits.**
- **Greater qubit fidelity.**
- **What qubit fidelity is needed to do anything useful with a quantum computer?**
- **Faster gates.**
- **Longer coherence time.** Support much deeper circuits. Simply extending coherence time vs. innovative approaches, such as a repeater or amplifier to boost coherence time. But how does that work for multi-qubit product states of entangled qubits?
- **Issues related to various circuit depths.** It may simply be coherence time, but various depths of circuits might encounter distinct limitations. Circuit depth ranges to be considered include 25, 50, 75, and 100 gates; 200, 500, 750, and 1,000 gates; thousands, tens of thousands, hundreds of thousands, and even millions or tens of millions of gates.
- **Better couplers.** Better isolation. Better coupling.
- **Better connectivity in general.**
- **Is nearest-neighbor connectivity a major limitation, just an annoyance, or a non-problem?**
- **Need for novel qubit connectivity schemes.** Anything to move towards full any-to-any connectivity.
- **How can we get past nearest-neighbor connectivity?** Without SWAP networks. Some sort of bus for quantum state?
- **Are there alternatives to SWAP networks to overcome limited connectivity?**
- **Are there limits to SWAP networks?** Especially for high qubit count quantum computers. And especially for high-connectivity algorithms, such as those using quantum Fourier transform (QFT) and quantum phase estimation (QPE) for a large number of qubits.
- **Feasibility and benefits of implementing SWAP networks in firmware.** Can they be optimized for the particular hardware?
- **Implement quantum phase estimation (QPE) and quantum Fourier transform (QFT) as discrete (parameterized) firmware operations.** Permit algorithm designers and application developers to use QPE and QFT as simple, atomic black boxes rather than a blizzard of gates. Optimize any SWAP networks needed for connectivity within the firmware and hardware.
- **Implement higher-level operations in firmware.** Such as quantum Fourier transform (QFT) and quantum phase estimation (QPE). Optimize higher-level operations for the specific hardware, without the need for a blizzard of discrete quantum logic gates at the source level.
- **More reliable measurement of qubits.** Lower error rate.
- **Are topological qubits practical?** Or will they continue to be elusive for longer than most people are willing to wait?
- **Better qubit technologies.** Topological qubits, etc. Detailed technologies would be beyond the scope of this informal paper. We need at least 15 or 25 projects for alternative qubit technologies — qubits which are higher reliability, more accurate and precise, smaller, cheaper, and faster.
- **Identify and exploit many additional qubit technologies.** The ideal qubit technology has not yet been discovered or invented.
- **Is there any material or substance that can't be used as a qubit?** A grain of salt, sugar, or sand? A drop of water? A small ball bearing, BB pellet, or lead shot? A snippet of copper wire? A capacitor? In theory, anything could be used to build a quantum computer (or qubit) — everything is based on the same fundamental quantum mechanics. Isolation, control, coherence, and measurement are the gating factors. Since the natural state of the world is quantum, "*Almost anything becomes a quantum computer if you shine the right kind of light on it.*" — MIT physicist Seth Lloyd
- **What criteria must be met to create a qubit?** What exactly enables superposition? What exactly enables entanglement? Interference? Phase rotation? Basis amplitude rotation? What requirements must be met to enable measurement?
- **Whether qutrits, qudits, qumodes, or other variations on qubits have any significant value.**
- **Advances in quantum materials.** Conductors. Insulators. Resonators.
- **Faster qubits.**
- **Greater coherence for qubits.**
- **More reliable qubits in general.**
- **Incremental evolution of existing qubit technologies.** Until we have dramatically superior technologies, this is all we can do.
- **More reliable quantum logic gate execution.**
- **Handle deeper quantum circuits.**
- **Fault-tolerant quantum computing (FTQC).**
- **Quantum error correction (QEC).** More approaches. More novel, out-of-the-box thinking. More stages or phases or levels of support, rather than one massive, long-delayed, boil-the-ocean solution.
- **More efficient quantum error correction.** Fewer physical qubits.
- **More accurate and precise quantum error correction for phase and probability amplitude.** Support very fine granularity of phase and probability amplitude, which is needed for quantum Fourier transform (QFT), quantum phase estimation (QPE), and quantum amplitude estimation (QAE) for a larger number of bits. Theoretically and practically, how many gradations can be supported?
- **Near-perfect qubits.** Rather than wait for and depend only on full quantum error correction and logical qubits. Between three and five nines of reliability — 99.9%, 99.99%, or 99.999%. See my paper below on *Fault-Tolerant Quantum Computing, Quantum Error Correction, and Logical Qubits*.
- **Will quantum error correction necessarily reduce or eliminate measurement errors?**
- **Should quantum error correction really be called quantum error reduction since there will always likely be at least some tiny residual error rate?**
- **What might lurk beyond surface codes for quantum error correction?**
- **When will algorithm designers and application developers no longer be working directly with physical qubits?** What will be technically necessary to build enough logical qubits that a large number of near-perfect physical qubits is no longer more appealing?
- **Better, more refined, simpler, and more sophisticated error mitigation techniques to use with near-perfect qubits before full quantum error correction is fully available at an acceptable logical qubit capacity.** Enable The ENIAC Moment without quantum error correction. What can be done automatically in firmware, or via smart compilation?
- **Fault tolerance issues in general, aside from pure quantum error correction.**
- **What is the technical definition of a quantum processor?** How does that differ from a full quantum computer? Block diagram, please.
- **Are quantum processor and quantum processing unit (QPU) identical terms or is there some distinction?**
- **Greater capacity for a QPU.** More qubits for a single QPU. Possibility that there may be multiple QPUs in a single quantum computer.
- **Modular QPU.**
- **Are the ideas for modular quantum computers from Kielpinski, Monroe, and Wineland, et al (2002, 2012) still valid, and should they be pursued or updated?** See papers (2002, 2012) below.
- **Separate the storage and processing aspects of qubits.** Difficult to do all aspects of processing each qubit in situ, especially entanglement and measurement.
- **Central QPU (CQPU).** With a quantum bus to shuttle quantum state from qubit quantum memory to the central QPU. Much less wiring and logic for each qubit. Could be multiple central QPUs and multiple buses.
- **Quantum minicomputer.** In the spirit of classical minicomputers, microcomputers, and UNIX, research and develop the simplest and cheapest implementation of a minimal set of qubits (say 8–12, 16, or 24–32 qubits max, for now, although five years from now 40, 64, or even 128 or 256 or 1024 qubits might be considered minimal). May be slower and less reliable, but much simpler and cheaper. See *central QPU*.
- **What will be the smallest quantum computer that achieves dramatic quantum advantage?** Both in terms of physical form factor and number of qubits.
- **Quantum networking.** Between quantum computers and QPUs. Networked quantum states.
- **Cybersecurity for quantum networking.** Vulnerabilities. Inherent defenses. How to defend. How to attack.
- **Would it make sense to have a multiprocessor quantum computer?** Could run multiple quantum programs simultaneously, or multiple variations (ansatze) simultaneously for variational methods. How many parallel processors? 4, 8, 16, 64, 128, 256, 1024, 4096?
- **Tightly-coupled QPUs.** Multiprocessor with higher performance, but more difficult to manage and coordinate.
- **Loosely-coupled QPUs.** Multiprocessor but not as high performance, but easier to manage.
- **Potential for systolic array processing.** May overlap with or replace CQPU — see above.
- **Would the D-Wave quantum annealing quantum computer be considered a systolic array processor?** What would that mean in terms of potential applications? How can application problems be mapped to systolic computations?
- **Quantum area networks (QAN).** Goal is modular quantum computing systems. Limit to how many qubits can fit in a micro-area (1 mm or 1 cm?). Need the ability to "shuttle" quantum state across relatively short distances, a few mm, a cm, maybe a few cm, or maybe a few feet, or even 10–20 feet, to enable two or more modular quantum systems to be combined into a single quantum computer. Maybe some way to daisy chain QPUs to allow even a large data center complex to act as a single quantum computer system. Distinguish "moving" the entirety of a qubit's quantum state vs. enabling two-qubit gates for entanglement between two adjacent modules.
- **Quantum local area networks (QLAN).** Somewhat greater distance than a quantum area network (QAN), such as an entire building and even adjacent buildings.
- **Quantum Internet.** What exactly is it? Is it simply a special instance or application of quantum networking, or a distinct concept? Regular Internet using quantum links? Quantum channels? Quantum hubs — store and forward? Distinguish key distribution and fully-networked quantum state.
- **How consequential is the length of interconnections between qubits?** Does packing them much more densely yield a large benefit? Does separating them a modest amount have a large penalty?
- **Distance scales for interconnecting quantum processing elements.** We need decent terminology and scales of reference when discussing the interconnection of quantum processing elements, whether they be individual qubits, modules of qubits, or distinct quantum computers, from 1 angstrom to 50,000 miles. This covers *quantum networking* as well as individual modular quantum computer systems. Both transfer of quantum state and use of entanglement, and physical shuttling of qubits.
- **Physical shuttling of qubits.**
- **Shuttling of quantum state.** Whether shuttling physical qubits or transporting quantum state between physical qubits over some distance.
- **Might wafer-scale integration (WSI) have any significant benefit, performance or otherwise?**
- **Quantum memory.** Larger volume of qubits.
- **Quantum storage.** Much larger volume of qubits.
- **Persistent quantum memory.** Persists for extended periods of time, even without power.
- **Quantum processor vs. quantum computer — where does memory fit in?** Currently, qubits are the only memory for quantum information in a quantum computer, and all qubits reside in the quantum processing unit (QPU). This contrasts with a classical central processing unit (CPU), where only registers are in the CPU and all memory and storage is external to the CPU. Current quantum computers have only registers (qubits), with no quantum equivalent to either the main memory or mass storage of classical computers. But this begs the question of how quantum computers will evolve once the concept of true *quantum memory* enters the scene.
- **Quantum tomography.** Greater access to nondestructive reading of quantum state. How this relates to measurement.
- **Tomography for entangled qubit product states.** Reading individual qubits does not indicate the overall quantum state for entangled qubits.
- **Quantum sensors.** Quantum sensing.
- **Integration of qubits and quantum sensors, integrating quantum computing and quantum sensing.** Integrating quantum computing with real-time quantum sensing. Quantum image processing. Quantum audio processing.
- **Quantum effectors.**
- **Quantum robotics.** Quantum sensors, quantum computing, quantum AI, and quantum effectors.
- **Other aspects of quantum information science.**
- **Alternative methods of quantum computing, beyond the pure gate model.** Adiabatic quantum computing. One-way or measurement-based quantum computing. Continuous-variable quantum computing. Photonic approaches.
- **Potential for digital/analog hybrid quantum algorithms.** Best of all three worlds. Especially when approximate results are acceptable.
- **Photonic quantum computing.** How does it really work? How is it different from non-photonic quantum computing? How are gate operations performed if photons do not interact (since they obey BES — Bose-Einstein Statistics)?
- **Is photonic computing the only truly viable path to large-scale general purpose quantum computing?** Room temperature operation. Simple, reliable fabrication. Resistance to errors and noise. Application to Turing machines as well — one technology for both. Lower power — much lower power.
- **Hybrid computing.** Including blending of classical computing and quantum computing.
- **Better conceptualization of a quantum computer as a coprocessor.**
- **Universal quantum computer.** Supersede and combine both classical computing and the current conception of quantum computing. See my paper, below.
- **How much of Turing machine capabilities can be effectively integrated with quantum computing, particularly quantum parallelism?**
- **What problems can a quantum computer compute that a Turing machine cannot compute — at all?** True random numbers. Possibly some aspects of human intelligence — creativity (based on true random numbers?)
- **Can finite state automata and pushdown automata be simulated or just implemented on a quantum computer?**
- **Need technical criteria for what computations are possible on a quantum computer.** See my paper, below.
- **Consider expansion of what computations are possible on a quantum computer.**
- **Consider possibilities of adding capabilities for I/O, file systems, database access, and network access for quantum computing.**
- **Quantum computer/processor as a feature of a universal computer rather than as a separate computer.** Coprocessor vs. just additional instructions. Special data types vs. an attribute of existing data types. QRAM as a superset of DRAM — binary 0/1 plus quantum states and quantum operations.
- **QRAM.** A combined hybrid superset of DRAM and qubits. A single memory which can be read and written classically as binary 0 or 1, plus operated on with quantum logic gates to build quantum states.
- **QISC — CISC vs. RISC for quantum computing.** *QISC = Quantum Instruction Set Computer.* How simple or complex should quantum operations be? QFT as a single operation — or even simpler qubit gates to decompose QFT faster and more efficiently?
- **How can we best make progress towards a universal quantum computer which combines and fully integrates quantum processing and classical processing?** Can we expand quantum parallelism to include the full capabilities of a classical Turing machine, or not, or what subset of a Turing machine?
- **Support for a variety of data types at the hardware or firmware level.**
- **Could you implement processes and threads on a quantum computer?**
- **Can quantum computers ever completely replace classical computers?** Prospects for a universal quantum computer which does exactly that.
- **Can quantum computing ever expand beyond being a mere coprocessor for narrow computations?** Complex logic, rich data types, complex data structures, Big Data, functions, classes/objects, I/O, databases — all while under quantum parallelism.
- **Future of quantum computer/processor as a feature of a universal computer rather than as a separate computer.** Coprocessor vs. just additional instructions. Special data types vs. an attribute of existing data types.
- **How might quantum computing fit in with Kurzweil's Singularity?**
- **Should quantum applications even need to be aware of qubits, even logical qubits, or should higher-level abstractions be much more appropriate for an application-level quantum programming model?** And supported fully and efficiently in firmware, even if the underlying hardware is qubits.
- **Support for finer granularity of phase and probability amplitude.** Fully support quantum Fourier transform (QFT) and quantum phase estimation (QPE) for a relatively large number of qubits. See paper below.
- **How precisely can a probability amplitude or probability or phase be estimated?** Can 0.0 or 1.0 or even 0.50 be precisely achieved? Or is there always a tiny epsilon of uncertainty down at the Planck level — 0.0 plus epsilon, 1.0 minus epsilon, 0.50 +/- epsilon? Same or different epsilon for probability amplitude and probability? Separate epsilons for the real and imaginary parts of a probability amplitude? Epsilon as a Planck probability? How tiny? Also for the minimum difference between two probability amplitudes. Even for 0.50, how close can two probabilities be? Has anyone else written about this?
- **Room temperature operation.**
- **Increasing qubit density within a given volume of cryostat.**
- **Reducing volume of cryostat needed for a given qubit density.**
- **Pushing more of the control electronics into the cryostat to decrease volume of wiring to external control circuits and net reduction of volume of cryostat.** Such as Intel's efforts.
- **Net decrease of cryostat volume while driving up qubit capacity.**
- **Improved shielding to reduce environmental interference.**
- **Better classical quantum simulators.** Higher capacity, higher performance, and more realistic — more closely matched to the operating conditions of current and expected real quantum computers. See the separate section, *Classical quantum simulators*.
- **Handle classical simulation of much larger quantum circuits.** More qubits. Deeper circuits. Even up to a data center the size of a football field.
- **Co-design of hardware and algorithms or applications.** Good thing, great thing, or a distraction? What are the relative merits vs. the relative downsides? A cooperative, collaborative effort, but the hardware team may have competing and conflicting requests and even commitments, so that the application doesn't have final say.
- **Potential for application domain-specific hardware design.** General-purpose hardware has its benefits, but specialization and optimization for high-value application domains can have significant value, especially while general-purpose hardware capabilities are somewhat limited.
- **Application-specific hardware.** Beyond optimizing for an entire application domain, optimize for a specific application.
Might an application-specific quantum computer lead to the greatest breakthrough in quantum computing? The available hardware is not up to the task — does not meet the needs of the high-value application. Make hardware captive to the needs of the application, not a mere negotiation between two parties at arm's length.**Potential for application-specific qubit topology.**Custom connectivity.**Necessity is the mother of invention: Let application requirements drive the design of the quantum computer.**The designs of all of the early classical computers were driven by the intense needs of particular high-value applications. Bell Labs, Harvard, Zuse, Atanasoff, Colossus, ENIAC, EDVAC, EDSAC, SEAC, Whirlwind, MANIAC, SAGE, et al. But somehow there was still an intense interest in building a fairly general purpose machine. Similar in spirit to co-design.**Automated calibration.**Eliminate user concerns about calibration and drift.**Focus on*The ENIAC Moment for quantum computing*as to what hardware capabilities are needed as a top priority.**What hardware capabilities are in fact needed?**Are current conceptions of quantum computing and programming models sufficient for applying quantum effects to general AI and simulation of the human brain/mind?****Debugging capabilities.**Besides resorting to classical quantum simulators. How to debug algorithms and applications for more than the qubit and quantum state capacity of even the largest classical quantum simulators is an interesting challenge.**What are the theoretical limits of quantum computing?****What are the practical limits of quantum computing?****How perfectly random are quantum random numbers?**Are the probability amplitudes actually exactly equal, especially since they are based on the square root of 2, which is an irrational number?**How does phase work for entangled product states?**Maybe a simple definition problem, but then it needs to be defined. Does each qubit have a phase? 
Does each product state have a phase?**How accurate and reliable can a quantum computer be if entries in a unitary matrix are irrational?**Such as an H gate with entries which are 1 divided by the square root of 2, an irrational number. How much precision is needed? Will it ever be enough?**What quantum magic allows the SWAP gate to work — it seems like a sleight of hand?**Okay, it’s three CNOT gates, but how does that get around the no-cloning theorem? Interesting analogy to using three XOR instructions to swap two values in classical computing. What is the quantum physics that makes SWAP work? Especially if the two qubits might be entangled with two or more other qubits.**How does an X gate actually change a basis state?**From |0> to |1> or vice versa. What exactly is happening at the phenomenological, physics level?**How does an H gate actually create a second basis state?**Not the matrix math, but at the physical phenomenological, physics level. And how does it give the two basis states equal probability amplitudes? And how does it do the reverse — destroy one of the two basis states?**What are Ising coupling and Ising gates?**And how can they be used in quantum computing? In plain language. What higher-level operations can they facilitate.**What aspects of quantum computing are in fact truly deterministic?**Are there any? Or is everything quantum inherently probabilistic and uncertain? Is entanglement partially deterministic in the sense of the relationships between states, such as Bell states — always the same or always opposite? Unitarity — sum of probabilities is 1.0 — is deterministic, in some sense.**Can it ever be reasonable for a quantum computation to presume or require two probability amplitudes to be absolutely identical?**Given the inherent uncertainty of quantum mechanics, I presume the answer is or should be No, but… maybe there can or should be the notion of*close enough*, within some delta window. 
Should it be considered a*bug*if a quantum circuit or classical code examining quantum results appears to depend on absolute equality?**Is quantum computing still inherently binary?**Granted, there are qutrits, qudits, and qumodes, which are not binary, but qubits are still modeled around bits — binary digits. Phase is not binary, more of an analog continuous value, in theory. A rather strange hybrid.**Is the concept of a bit still central to quantum computing?**And quantum information science in general?**Does phase make a qubit a miniature analog computer?**Continuous value.**Could a classical computer be implemented solely with qubits?**Are qubits truly functionally complete? Is there a better common component for both classical and quantum computing?**Quantum iota.**Smallest unit of anything related to quantum computing. Including phase shifts and amplitude magnitudes. Is it a unit of energy, or… what?**The material science physics for how qubits actually work.**What their limits are. Qubit gate timing. Minimum size, maximum density.**Are there any open source quantum computer or QPU designs?****Are there any open source qubit designs?**Both the actual qubit and the control circuitry and firmware needed to control the qubit (quantum logic gates, measurement, and reset.)**Open source designs for quantum computers.**All of the technical details required to create or replicate a particular quantum computer. Full transparency.**What are the technical requirements for resetting a qubit to the 0 basis state?****Need simple plain text notation for even complex quantum states.**No Greek letters, math symbols, or other obscure notation. No complex or obscure graphics. 0 and 1 are obvious. 0/1 for superposition of 0 and 1. Etc.**How many pieces of information can a qubit represent?**Phase — imaginary portion of complex probability amplitude, in addition to the difference between the probability of |0> and |1>. Is that three? Or still just two? Or… what?**Synchronous vs. 
asynchronous processing.****Breakthroughs vs. incremental progress.**Are much more dramatic hardware breakthroughs needed? How much of necessary progress can be accomplished simply with lots of small steps over an extended period of time? What is needed for The ENIAC Moment, or The FORTRAN Moment? Or dramatic quantum advantage. What sort of a combination of breakthrough leaps and incremental slogging?**Initial thoughts for quantum computing abstraction.**How to abstract away from quantum mechanics. Criteria. Opportunities. Obstacles. Hardware resources vs. software resources. Is interference a quantum mechanical resource? It’s just the imaginary part of probability amplitude. Is probability amplitude even a quantum mechanical resource? Is phase a quantum mechanical resource? Does it have an existence distinct from merely probability amplitude? RFP for algorithm abstraction. Application domain: concepts, processes, goals, inputs, data structures.**The fundamental tension between isolation, control, interaction and entanglement at the quantum level.**How the conflicts can be managed and what are the limits to coping with the tension.**Does information have a state, or do only devices and media have states?****What can we learn from the timelines of the early classical computers — ABC, ENIAC, EDVAC, EDSAC, et al, to UNIVAC I and IBM 701?**Mistakes to avoid. Opportunities missed. Failure to push more aggressively with research. Failures due to engineering getting ahead of the required research. Balancing the benefits of incremental advances and bold, dramatic leaps. See my classical computing timeline paper below.**What does the CNOT gate really do?**What is its true purpose? Conditional NOT may be its raw function, but for what purpose? What can you really do with it? In plain language, please. Focusing on the cases where the probability amplitudes are neither exactly 0.0 nor 1.0. 
The definition “*leaves the control qubit unchanged and performs a Pauli-X gate on the target qubit when the control qubit is in state ∣1⟩; leaves the target qubit unchanged when the control qubit is in state ∣0⟩*” leaves a lot unsaid. No mention of entanglement or Bell states. Definition is too classical, insufficiently quantum. “Controlled NOT” is not a satisfying or productive description. Using three CNOT gates to implement a qubit SWAP operation, as well as Bell states, illustrate the opaqueness of this gate.**Do the controlled logic gates require that the control qubit be in a pure |1> state, or some threshold of amplitude, or is control strictly probabilistic, such as equal probability of control if, for example, a Hadamard gate was just executed on the control qubit?**A plain language description of the full function and purpose of the CNOT gate would help a lot.**What exactly happens when you apply a single-qubit gate to one qubit of a collection of entangled qubits?**Such as an X gate or a phase shift. What quantum mechanical insight should apply? What happens phenomenologically with real physical qubits? How should unitary matrices be constructed when n qubits are entangled, especially where n is a fairly large number, like 8, 64, 256, or 1024?**Is the inability to directly measure probability amplitudes the fatal flaw of quantum computing?**But what about phase estimation? Takes too many qubits and requires too much coherence. Destructive read.**Limits of fractional parallelism.**Unable to do full desired parallel computation in one step — must break it into pieces with optimization or other processing between the pieces. Blocks for D-Wave. Variational methods. 
Machine learning?**What precisely constitutes a Clifford gate and Clifford group?**In plain English.**Are rotation angles quantized or not?**What is the quantum unit?**Are phase shift angles quantized or not?**Are phase and probability amplitudes true continuous values or quantized at some level?**How precise a value of pi, pi/2, and pi/4 is needed to achieve correct results for logic gates?**Number of decimal digits or bits.**Is there a Planck-type limit to the amplitude of a state in the wave function of a quantum system, such that there is a maximum number of states in a quantum system?****Are the initial quantum states of qubits always |0> by definition, forced by the hardware or firmware, or are they unpredictable or undefined, by definition, or does this vary depending on the whim of the designer of the machine, and is there a recommended best practice for initializing qubit state?**Or is it up to the operating/control software or control firmware to artificially force the initial state before beginning execution of a new quantum program or circuit, and is there a standard convention and predictable expectation of initial state for a new program, especially for a quantum cloud service?**Is there a logic gate which dis-entangles two qubits?**Do they start with the exact same state, or each take part of the state, or does the state collapse into two distinct states, or what? Is there a different answer for each of the four Bell states?**What are the standard, recommended logic gates for setting (preparing) a qubit for |0> and |1>?****What does*universal*mean in quantum computing?**What does it mean for a gate set to be universal? What other meanings does universal have?**What is the precise role of interference in quantum computation?**What interference is used in each type of gate? Which gates use interference and which do not? Is it only in 2-qubit vs. not in 1-qubit gates? How is it used in the design and coding of quantum algorithms? 
Do quantum programmers need to understand interference per se, or is that more of an under-the-hood implementation detail, analogous to how logic gates are implemented in a classical computer? Is interference used in the resonators used to couple (entangle) two qubits — is that the only or main usage of interference? Does superposition require interference?**What exactly happens if a unitary matrix is not unitary?**How to check if a unitary matrix actually is unitary? Multiply by its conjugate transpose and make sure the result is the identity matrix? Can there be small errors or variations from absolute unitarity? How large can they be? How many of them can there be?**Since quantum computing is represented as being*probabilistic*rather than*deterministic*, it seems reasonable to assume that*any*quantum computation must be*repeated*some significant number of times and then*some sort of statistical analysis*performed on that set of results to*decide*which of the results is*the*result, but I don’t find a detailed treatment of this*repetition and evaluation process*anywhere in the literature.**In particular, is there any*magic formula*or*guidance*for*calculating*how many repetitions are required to achieve some designated percentage of accuracy (80%, 90%, 95%, 98%, 99%, 99.9%, 99.9999%, etc.)? What statistical criteria should be used to evaluate the results? For some computations it might be sufficient to sequentially evaluate*all*results since a*correct*result would be a solution to an equation (e.g., order of a modular exponentiation), but other applications might require detailed statistical analysis of the results, such as looking for the top k peaks in a statistical distribution. A lot more discussion and guidance is needed. 
And*every*presentation (especially formal publication) of a quantum algorithm should be accompanied by such a discussion and guidance for calculating the number of repetitions and what specific statistical analysis is required to evaluate the results. What might typical repetition counts be? 10? 20? 50? 100? 250? 500? 1,000? 10,000? 50,000? 1 million? 10 million? 100 billion?? And, in the context of NISQ quantum computers, how to validate whether the quantum computation may be completely failing every single time due to exceeding the coherence of the particular quantum computer, so that it will*never*return a valid result (except maybe by accident or chance.) How many failures of particular runs will indicate that the particular NISQ computer is*incapable*of delivering*any*correct result? Again, some formula, rule, or other guidance is needed for every algorithm and every machine. See my paper on shots and circuit repetitions below.**What is a degree of freedom?**In terms of quantum computing. In plain language.**Why is spin so much more appealing for quantum computing?**Are there more appealing alternatives?**Comparing and contrasting the D-Wave quantum annealing quantum computers with gate model quantum computers.**What forms of computations are appropriate for each. What are their relative futures? Can the two approaches be combined or even merged? Can they work together somehow? Can or should D-Wave be considered a systolic array processor?**Is the D-Wave quantum annealing quantum computer more of an analog computer than a discrete digital computer?****Evolving hardware to the stage where a quantum computer can be built by almost anyone using off the shelf components.**As was the case with mainframe computers in the 1950’s and 1960’s with vacuum tubes, discrete transistors, and then integrated circuits, then minicomputers in the 1960’s, then microcomputers in the 1970’s, and then personal computers in the 1980’s. 
Who will be the Intel of quantum computing in ten to fifteen years? Intel?!**Pulse-level control of qubits.**Ala IBM Qiskit Pulse. How useful is it really? What features to support generically on all quantum computers. How to generalize for qubits other than superconducting transmons. Can it be made more generic for all types of qubit technologies? Raw low-level access vs. high-level programming models. Conceptual representation vs. low-level details. Interoperability and portability issues.
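Two of the questions above, whether three CNOT gates really compose to a SWAP and how to check that a matrix is actually unitary, can at least be verified numerically. A minimal numpy sketch (pure matrix algebra, saying nothing about the physical qubits):

```python
import numpy as np

# CNOT with qubit 0 as control, qubit 1 as target (basis order |00>,|01>,|10>,|11>)
CNOT_01 = np.array([[1, 0, 0, 0],
                    [0, 1, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0]], dtype=complex)

# CNOT with qubit 1 as control, qubit 0 as target
CNOT_10 = np.array([[1, 0, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0],
                    [0, 1, 0, 0]], dtype=complex)

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

# Three alternating CNOTs compose to a SWAP -- no cloning involved, just algebra
assert np.allclose(CNOT_01 @ CNOT_10 @ CNOT_01, SWAP)

# Unitarity check: U times its conjugate transpose should be the identity.
# With the irrational 1/sqrt(2) entries of the H gate, equality holds only
# up to floating-point epsilon -- hence np.allclose, not exact comparison.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
assert np.allclose(H @ H.conj().T, np.eye(2))
```

The need for np.allclose rather than exact equality is itself a small echo of the precision questions above: with irrational entries like 1/sqrt(2), floating-point arithmetic never achieves exact identity.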
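On the repetition-count question above: one generic statistical tool, not specific to quantum computing and no substitute for per-algorithm guidance, is the Hoeffding bound, which caps how many independent shots are needed to estimate a measurement probability to within plus-or-minus epsilon at confidence 1 - delta. A sketch (the function name is my own illustration):

```python
import math

def shots_needed(epsilon: float, delta: float) -> int:
    """Hoeffding bound: repetitions sufficient to estimate a probability
    to within +/- epsilon, with confidence at least 1 - delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

# Estimating a result probability to +/- 1% at various confidence levels:
for confidence in (0.90, 0.95, 0.99, 0.999):
    print(confidence, shots_needed(0.01, 1.0 - confidence))
```

For plus-or-minus 1% accuracy the bound lands in the tens of thousands of shots. It is distribution-free and conservative, and it says nothing about the NISQ failure mode noted above, where every repetition fails because coherence was exceeded.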

Early papers on modular quantum computing:

*Architecture for a large-scale ion-trap quantum computer*- Kielpinski, Monroe, and Wineland
- July 2002
- https://www.researchgate.net/publication/11308739_Architecture_for_a_large-scale_ion-trap_quantum_computer
*Large Scale Modular Quantum Computer Architecture with Atomic Memory and Photonic Interconnects*- Monroe, Raussendorf, Ruthven, Brown, Maunz, Duan, Kim
- August 2012

For reference, my summary of the timeline of early classical computers:

*Timeline of Early Classical Computers*- https://jackkrupansky.medium.com/timeline-of-early-classical-computers-9539d5bf82d8

For reference, see my paper on universal quantum computers:

*What Is a Universal Quantum Computer?*- https://jackkrupansky.medium.com/what-is-a-universal-quantum-computer-db183fd1f15a

My paper on what quantum computers are not suitable for:

*What Can’t a Quantum Computer Compute?*- https://jackkrupansky.medium.com/what-cant-a-quantum-computer-compute-53bf3945d419

For completeness, a summary of what applications and application categories quantum computers are suitable for:

*What Applications Are Suitable for a Quantum Computer?*- https://jackkrupansky.medium.com/what-applications-are-suitable-for-a-quantum-computer-5584ef62c38a

For my thoughts on distance scales:

*Distance Scales for Interconnecting Quantum Computing Elements*- https://jackkrupansky.medium.com/distance-scales-for-interconnecting-quantum-computing-elements-f534254095b8

For my thoughts on quantum error correction, fault tolerance, and logical qubits — and near-perfect qubits:

*Preliminary Thoughts on Fault-Tolerant Quantum Computing, Quantum Error Correction, and Logical Qubits*- https://jackkrupansky.medium.com/preliminary-thoughts-on-fault-tolerant-quantum-computing-quantum-error-correction-and-logical-1f9e3f122e71

For more on working with probabilistic quantum results and development of expectation values, and shots or circuit repetitions, see my informal paper:

*Shots and Circuit Repetitions: Developing the Expectation Value for Results from a Quantum Computer*- https://jackkrupansky.medium.com/shots-and-circuit-repetitions-developing-the-expectation-value-for-results-from-a-quantum-computer-3d1f8eed398

For details on issues related to relying on fine granularity of phase, such as for QFT, QPE, and Shor’s factoring algorithm, see my paper:

*Beware of Quantum Algorithms Dependent on Fine Granularity of Phase*- https://jackkrupansky.medium.com/beware-of-quantum-algorithms-dependent-on-fine-granularity-of-phase-525bde2642d8

# Firmware (see: Hardware)

Although maybe firmware (such as FPGAs or low-level software controlling hardware) should have its own category, this paper has arbitrarily chosen to lump firmware research under the *Hardware* section.

In truth, the distinction between firmware and hardware can be debatable and arbitrary — and confusing, so I felt it made sense to consider the two together, at least for now.

In fact, part of the necessary research needs to be to examine specific features or capabilities of the quantum processing unit (QPU) and to decide and experiment with whether implementation should be in hardware or in firmware. There are plenty of tradeoffs to be considered — and researched as methodically as possible.

# Need for finer granularity of phase and probability amplitude to support quantum Fourier transform (QFT) and quantum phase estimation (QPE)

It is worth calling out the urgent need for quantum Fourier transform (QFT) and quantum phase estimation (QPE) to power more complex algorithms, including quantum computational chemistry, but unfortunately they depend critically on fine granularity of phase and probability amplitude, especially for transforms of a nontrivial number of qubits. Without much finer granularity, QFT and QPE simply won’t be practical for data of nontrivial size. And Shor’s factoring algorithm simply won’t work at all for larger integers.

It’s likely that quantum computers will see dramatic advances in the number of qubits and qubit fidelity in the coming years, but without dramatic advances in phase and probability amplitude granularity, even 1,000 logical qubits would be essentially worthless.
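The granularity concern reduces to simple arithmetic: a textbook n-qubit QFT includes controlled phase rotations as small as 2*pi/2^n radians. A quick sketch (the function name is my own) of how fast that angle shrinks:

```python
import math

def smallest_qft_rotation(n_qubits: int) -> float:
    """Smallest controlled-phase rotation angle (radians) appearing
    in a textbook n-qubit quantum Fourier transform: 2*pi / 2^n."""
    return 2.0 * math.pi / (2 ** n_qubits)

for n in (10, 20, 30, 50):
    print(n, smallest_qft_rotation(n))
```

Whether hardware can apply, and keep coherent, rotations that small, against gate error rates that are orders of magnitude larger, is precisely the open research question.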

For details on issues related to relying on fine granularity of phase, such as for QFT, QPE, and Shor’s factoring algorithm, see my paper:

*Beware of Quantum Algorithms Dependent on Fine Granularity of Phase*- https://jackkrupansky.medium.com/beware-of-quantum-algorithms-dependent-on-fine-granularity-of-phase-525bde2642d8

# Alternative hardware architectures may be required to support high connectivity for over 64 to 128 qubits

Current quantum computer architectures may be sufficient to support algorithms in the 48 to 64-qubit range, but absent full qubit connectivity, SWAP networks may become untenable when working with complex algorithms utilizing more than 128 qubits and possibly even 64 qubits — or even fewer.

It may be necessary to have research programs focus specifically on high connectivity for algorithms with more than 64 to 128 qubits. Even 64 qubits is really pushing it.

There may be clever tricks to achieve high connectivity for 64 to 128 qubits, but those tricks may run out of steam in the 160 to 200-qubit range, let alone the 256 to 1K range. Each range may require its own clever tricks. Or maybe not; only deep research can settle these matters.
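A back-of-the-envelope sketch shows why SWAP networks become untenable. Assuming a linear qubit chain, each SWAP decomposing into three CNOTs, and routing the moved qubit back after the interaction (all assumptions of this illustration, not a statement about any particular machine):

```python
def cnots_to_interact(distance: int) -> int:
    """CNOTs to let two qubits 'distance' apart on a linear chain interact:
    (distance - 1) SWAPs there, (distance - 1) SWAPs back, at three CNOTs
    per SWAP, plus the one CNOT for the interaction itself."""
    swaps = 2 * (distance - 1)
    return 3 * swaps + 1

for n in (64, 128, 256):
    worst = n - 1  # qubits at opposite ends of the chain
    print(n, cnots_to_interact(worst))
```

At 128 qubits the worst case is 757 CNOTs for a single two-qubit interaction; with two-qubit gate error rates around 1%, hundreds of CNOTs per logical interaction reduce the computation to noise.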

# Hardware support

A summary of areas of research related to supporting quantum computing hardware, to characterize and facilitate using the hardware more effectively:

**Device characterization and modeling for each model of quantum computer.**Gordon Bell-style cpu model: K — processor, M — memory, but it gets confused since qubit is both memory and processing, still…**Better classical quantum simulators.**Higher capacity, higher performance, and more realistic — more closely matched to the operating conditions of current and expected real quantum computers. See the separate section,*Classical quantum simulators*.**Better benchmarking tools.****Need for a standardized set of benchmark tests for quantum computing.****Successor to Quantum Volume for benchmarking quantum computers with more than 50 (or even 40 or 32) qubits.**See my paper below for some thoughts.**Algorithmic Qubits (AQ).**Proposed by IonQ. Alternative to IBM’s Quantum Volume. “*defined as the largest number of effectively perfect qubits you can deploy for a typical quantum program*”. “*define a typical quantum program (circuit) as one that has a size (number of fully-connected gate operations) that scales with the square of the number of algorithmic qubits.*” AQ = log2(QV), or inversely, QV = 2^AQ.**Debugging capabilities.**How to debug algorithms and applications for more than the qubit and quantum state capacity of even the largest classical quantum simulators is an interesting challenge.**Can you really understand quantum computing without a deep comprehension of the firmware and control logic?**How exactly are unitary matrices turned into control of qubits? Full and transparent access. What’s really going on? What are the real limitations?**Need simple plain text notation for even complex quantum states.**No Greek letters, math symbols, or other obscure notation. No complex or obscure graphics. 0 and 1 are obvious. 0/1 for superposition of 0 and 1. Etc.**Operations (gates) per second vs. quantum states processed per second as proper measure of throughput.**m gates on n qubits vs. 
m gates on 2^n quantum states.**How large a program can a quantum computer accommodate?**How many quantum logic gates? Coherence time limits circuit depth for many machines, but for more advanced machines with much longer coherence time is there some other limit than coherence time?**How complex can a quantum algorithm be?**Besides raw gate count.
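The AQ/QV relation quoted above is simple enough to capture in code. The function names are my own, and the rounding guards against floating-point log2:

```python
import math

def algorithmic_qubits(quantum_volume: int) -> int:
    """AQ = log2(QV), per IonQ's stated relation."""
    return round(math.log2(quantum_volume))

def quantum_volume_from_aq(aq: int) -> int:
    """Inverse relation: QV = 2^AQ."""
    return 2 ** aq

assert algorithmic_qubits(64) == 6
assert quantum_volume_from_aq(6) == 64
```

The exponential inverse also illustrates why a linear successor metric is sought: a machine with 100 algorithmic qubits would have a Quantum Volume of 2^100, a uselessly astronomical number.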

My thoughts on IBM’s Quantum Volume metric for the capacity and performance of a quantum computer:

*Why Is IBM’s Notion of Quantum Volume Only Valid up to About 50 Qubits?*- https://jackkrupansky.medium.com/why-is-ibms-notion-of-quantum-volume-only-valid-up-to-about-50-qubits-7a780453e32c

# Debugging quantum algorithms and applications

Research for debugging capabilities is already listed in multiple areas, but it needs to be highlighted in its own right. Elite expert algorithm designers may not need debugging capabilities, but mere mortal professionals will, especially as quantum algorithms and applications get more complex and qubit counts rise.

And debugging of quantum algorithms is especially problematic since quantum state is by definition not visible while a quantum circuit is being executed.

So, special attention needs to be given to research for both debugging capabilities and ensuring that quantum algorithms and applications are inherently easy to debug, which they currently are not.

How to debug algorithms and applications for more than the qubit and quantum state capacity of even the largest classical quantum simulators is an interesting challenge.

# Classical quantum simulators

At present, classical quantum simulators are relatively primitive and not terribly efficient. Areas needing research and improvement include:

**Larger simulations.**Current 50-qubit limit (actually 40 qubits.) Limited circuit depth.**More efficient classical simulation of quantum circuits.****Handle classical simulation of much larger quantum circuits.**More qubits. Deeper circuits. Even up to a data center the size of a football field.**How many qubits could you simulate on a classical quantum simulator the size of a football field?**A full data center dedicated to and optimized for classical quantum simulation. Can we get to 55, 60, 64, 70, or even 72 qubits?**Use cases for classical quantum simulators.**See my paper below.**More realistic simulations.**Simulators aren’t generally configured to closely match realistic quantum computer operating conditions — they’re too ideal or too arbitrary with noise models. Developers should be able to easily run simulations of quantum computers expected to become available in two to five years — or to closely match existing quantum computers or those expected over the coming year. Emphasis on “easily” — select from preconfigured noise models for particular quantum computers rather than each developer needing to hand-tune the noise model themselves. See my paper below.**Simulation so real that no human or application could tell that a simulator was being used rather than a real quantum computer.**Other than the fact that it runs much slower, that is. 
No need for absolute perfect match to real machines, just “close enough” so that algorithms and applications can achieve virtually identical results from quantum computations.**Debugging capabilities.**How to debug algorithms and applications for more than the qubit and quantum state capacity of even the largest classical quantum simulators is an interesting challenge.**Capturing and analyzing quantum state for a simulation run.****Is there any quantum circuit which can’t be simulated (other than size)?****Would there be any significant performance or capacity benefits from a classical quantum simulator which can simulate a designated fraction or subset of a large, complex quantum circuit?**See*Algorithm knitting or circuit knitting*below.**Would there be any significant performance or capacity benefits from a classical quantum simulator based on much more limited precision for the entries of unitary matrices (quantum logic gates) and for qubit probability amplitudes as well?**How many bits (or decimal digits) of precision are required?**How accurate and reliable can a classical quantum simulator be if entries in a unitary matrix are irrational?**Such as an H gate with entries which are 1 divided by the square root of 2, an irrational number. How much precision is needed? Will it ever be enough?**Are there any aspects of quantum computing which cannot be adequately simulated on a classical computer?**Other than performance and capacity if too many qubits (>40–55).**Need for much-higher performance and much-higher accuracy in quantum computer simulators.**Possibly even using massively parallel classical supercomputers, with thousands of processors, to get as far as we can, especially until we have more powerful quantum computers with enough qubits and long enough coherence. 
Possibly using GPUs and FPGAs, or even full-custom hardware.**What is the optimal classical computer architecture for simulating a quantum computer?**Presumably the key issue is dealing with exponential states — 2^n states for n qubits. Multiple cores per system — how many might be optimal? Would a much larger number of much simpler processor cores give the most bang for the buck? Which processor operations need to be optimized the most to achieve both greater performance and greater capacity? Optimal encoding of individual quantum states. Potential for custom hardware accelerator for encoding and decoding quantum states in memory storage. Using only main memory for quantum states vs. mass storage. Optimal size of memory and mass storage per processor core. Massive distributed systems vs. largest local area networked systems. How many systems before network overhead becomes too onerous? How many quantum states could realistically be achieved? 2⁴⁵, 2⁵⁰, 2⁵⁵, 2⁶⁰, 2⁶⁴, 2⁷²?**Need for novel classical processor designs to support large-scale quantum simulators.**Much smaller and simpler classical processors. Don’t need to do a lot, just manage a large number of quantum states in memory. And simple qubit operations. Need to manage connectivity. Smaller so many more can be placed on a single chip — maybe 256, 1024, 4K, or even 16K or more processors on a single chip. Maybe a goal of 10M processors for a single simulator, each with 64 GB of RAM. Manage connectivity and routing of operations with a smaller set of external processors.**Potential for distributed quantum simulator.**Is it easy or hard? Theoretical limits. Practical limits. Using specialized processors, memory, and connectivity. Using commodity hardware and networks only.**How large a quantum program can be simulated essentially instantaneously — in less than a single second?**Maximum circuit depth. Maximum number of gates. Maximum number of qubits. Formula for combining all three — qubits, gates, depth.
Impact of entanglement — minor (20%), insignificant (5%), major (50–75%), or dramatic (2x or more).**Should we be focusing more effort on boosting high-end simulators in order to enable much more sophisticated algorithms? Rather than stunting algorithm ambitions by focusing them on noisy low-qubit current machines?**Make 32–40 qubits the sweet spot for algorithms. And emphasize the need for 32–40 qubit algorithms that are likely to scale as real hardware becomes available that exceeds even the best simulators. See my papers below on a staged model for scalable quantum algorithms and scalable 40-qubit algorithms.**Nuances of behavior which may exist on a simulator but not on a real quantum computer, and vice versa.****Pulse-level control of qubits.**Ala IBM Qiskit Pulse. How to simulate it. How to make it generic across all types of qubit technologies. How to observe its effects during simulation. How to debug unexpected behaviors or results. Detecting and reporting improper or dubious usages.**Algorithm knitting or circuit knitting.**As promoted by IBM, classical code cleverly figures out how to split a larger algorithm or quantum circuit into smaller algorithms or quantum circuits which can then be executed or simulated more efficiently and correctly, especially on near-term quantum computers or simulators with limited qubit capacity, and then classical post-processing code “knits” the results together for the full algorithm or circuit.
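As a back-of-the-envelope illustration of why brute-force simulation stalls in the mid-50-qubit range, here is a sketch of the memory footprint of a dense statevector, assuming 16 bytes (complex128) per amplitude with no compression or clever state encoding; real simulators may do somewhat better, but the doubling per qubit is unavoidable for a dense representation:

```python
# Memory footprint of brute-force statevector simulation: n qubits
# require 2^n complex amplitudes. Assume 16 bytes per amplitude
# (complex128). Each additional qubit doubles the footprint.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (40, 45, 50, 55, 60):
    print(f"{n} qubits: {statevector_bytes(n) / 2**40:,.0f} TiB")
```

At 40 qubits the dense statevector alone is 16 TiB; at 50 qubits it is 16 PiB, which is already football-field data center territory, and 60 qubits is a thousand times that again.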

See my paper on use cases for classical quantum simulators, including more realistic simulations to match quantum computers expected in two to five years, as well as current quantum computers and those expected over the coming year:

*Five Major Use Cases for a Classical Quantum Simulator*- https://jackkrupansky.medium.com/five-major-use-cases-for-a-classical-quantum-simulator-368b9cb10639

My paper on the staged model for scalable quantum algorithms:

*Staged Model for Scaling of Quantum Algorithms*- https://jackkrupansky.medium.com/staged-model-for-scaling-of-quantum-algorithms-d1070056907f

My paper on the merits of focusing on scalable 40-qubit algorithms:

*Where Are All of the 40-qubit Quantum Algorithms?*- https://jackkrupansky.medium.com/where-are-all-of-the-40-qubit-quantum-algorithms-14b711017086

# Quantum information science in general

Quantum computing is only one of the areas of the larger field of *quantum information science (QIS)*. The other areas are equally worthy of continued and increased research:

**Quantum communication.****Quantum sensing.****Quantum metrology.****Quantum networking.**

The immediate goal here is not to detail all interesting research areas of QIS, but to focus on quantum computing, mentioning the other areas of QIS only in passing, noting areas of overlap where research in one area can help the others, as well as fundamental research areas common to all areas of QIS.

Some interesting areas for QIS research which may or may not have any relevance to quantum computing:

**What is quantum information?**Tricky. Lots of hype. Qubits vs. quantum state vs. subset of quantum state. No definitive definition. Somewhere between qubits and quantum state. Information vs. representation. Are probabilities “information” per se? Is phase “information” per se? Does phase represent separate information from the information implied by probability amplitude?**Is the wave function the unit of quantum information?**Combines 0 and 1 — superposition. Represents entanglement. Wave function volume = total number of terms of all wave functions for a system with non-zero probability amplitudes. Basis states may be fractions of a unit of quantum information, but not a complete unit of quantum information.**How many pieces of information can a qubit represent?**Phase — imaginary portion of complex probability amplitude, in addition to the difference between the probability of |0> and |1>. Is that three? Or still just two? Or… what?**Does information have a state, or do only devices and media have states?****Quantum information vs. quantum information storage.**Manipulation as well. Computing, communication, sensing, networking.**Quantum information theory.**New concept, or at least a new term. No clear definition. See*From Classical to Quantum Shannon Theory*by Wilde.**What is a quantum resource?**Superposition, entanglement, interference, quantum parallelism. See*What Are Quantum Effects and How Do They Enable Quantum Information Science?***Quantum time crystals.****Consciousness based on quantum effects.****Is the concept of a bit still central to quantum computing?**And quantum information science in general? But then there are qutrits, qudits, and qumodes as well, which are not strictly binary.**What other quantum effects could be exploited for quantum computing?****Quantum effects in biological life.**Quantum effects in human senses — sight, hearing, and touch. Smell and taste? Quantum effects in human (and animal) memories.
Quantum effects in insects (and birds). Quantum effects in plant life. Quantum effects… everywhere. Any quantum effects peculiar to biological life?**Probability plus statistics equals approximate determinism.**Statistical aggregation of probabilistic data can approximate determinism. Is all apparent determinism really simply approximate determinism? Is there any absolute determinism in a universe based on quantum mechanics? Dampening of quantum mechanical effects, oscillation at the Planck level.**How many wave functions are needed to define the full state of a quantum computer?**One for each isolated system, each unentangled qubit. One for each collection of entangled qubits. If two qubits are entangled by a 2-qubit gate, how does the number of wave functions change?**Chaos theory and quantum computing (or quantum information science in general).**Quantum chaos theory?**Can the Wolfram Physics Project be computed using a quantum computer?**Is Wolfram’s Fundamental Theory of Physics compatible with quantum mechanics, the native science of quantum computers? Could some variant of quantum computing be more compatible with Wolfram’s Fundamental Theory of Physics? Is there any workable synergism between Wolfram’s Fundamental Theory of Physics and quantum computing?**Is all interaction in the universe quantum interaction and hence quantum information processing?****Derivation of Planck constant and relevance to quantum computing.**What is the quanta for probability amplitude, phase angle, and probability? What is the physical, phenomenological basis for these quanta? What are the practical implications, such as minimum and maximum angles of rotations? What are the quanta for errors?**Classical computers are based on quantum mechanics and quantum effects too.**Yes, they are. Quantum effects within transistors — and in wires as well. All classical bits and gates are ultimately derived solely from quantum effects.
Somehow, quantum effects can be converted into non-quantum macroscopic effects — through the magic of statistical aggregation. Probability plus statistical aggregation equals approximate but practical determinism. Contrast with absolute determinism. Or does an exponential limit plus quantum threshold guarantee determinism, short of hardware failure? Every classical computer is also a quantum computer, in some sense. Simply collapses all of the quantum states. But it still depends on quantum state transitions between the collapses. Technically, it may be more correct to say that classical computing is based on quantum effects rather than on the logical operations of the Bloch sphere and unitary matrices on qubits. It would be fascinating to explore the boundary between the quantum and non-quantum worlds, possibly even producing a hybrid, mixed model of computation.**Might a quantum computer permit capture and processing of holographic images?**Holographic cameras. Holographic displays. Holographic image representation — need full quantum state, or at least phase? “*The interference pattern recorded on the hologram is a Fourier transform of the object. When illuminated by a laser beam similar to the one which has produced it, the hologram reproduces the appearance of the object by inverse Fourier transformation.*” Hybrid of quantum sensing, quantum computing, and quantum effectors.**Quantum image processing.**Sensors, cameras. Immediate processing — of raw quantum pixels. Structure — static images, dynamic video. Image storage. Display. Load and manipulate. Navigate image data — program control API — sequential, random access — by position, by time, search — Image characteristics, image objects, features, tagged features. UX during display — motion, paths, time jumps and rates, keyword search — tags, image characteristics, image features, image objects, match fragments of other images.
Audience size — single user, two users, small group of users, larger group of users, very large and open group of users. Issues — colors, lighting, resolution — what is a quantum pixel? Quantum alternative to pixels. 3D or even 4D cells. Image quanta. What could sensors capture? Match to the sensor — could convert to other or neutral formats later. Want quantum state, which implies more info than can be classically measured. Transitions between pixels. Relation to human brain and mind and how they process images (imagery.) Stereoscopic views — two — ala human vision, more than two — enhanced beyond human vision, greater ability for machine processing, may not be fully viewable by human senses. Audio to go with images — what would quantum audio be like, contrast with normal analog?**What is a degree of freedom?**Does it relate to the number of basis states? In terms of quantum information science. In plain language.**Why is spin so much more appealing for quantum computing?**Are there more appealing alternatives?
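One way to make the “how many wave functions” question above concrete: a product state of unentangled qubits factors into separate per-qubit wave functions, while entangled qubits share a single joint wave function. For two qubits this is exactly the Schmidt rank of the reshaped amplitude vector. A minimal numpy sketch, purely illustrative and not tied to any particular formalism in the literature:

```python
import numpy as np

# A two-qubit product state factors into two one-qubit wave functions
# (Schmidt rank 1); an entangled Bell state does not (Schmidt rank 2).
def schmidt_rank(state_4, tol=1e-12):
    # Reshape the 4 amplitudes into a 2x2 matrix and count the
    # singular values above a numerical tolerance.
    s = np.linalg.svd(np.asarray(state_4).reshape(2, 2), compute_uv=False)
    return int(np.sum(s > tol))

plus = np.array([1, 1]) / np.sqrt(2)
product = np.kron(plus, plus)               # (|0>+|1>)(|0>+|1>)/2
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # (|00>+|11>)/sqrt(2)
print(schmidt_rank(product))  # 1 -> separable, two wave functions
print(schmidt_rank(bell))     # 2 -> entangled, one joint wave function
```

So when a 2-qubit gate entangles two previously independent qubits, two wave functions merge into one joint wave function — the count goes down even as the joint state gets more complex.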

For more on quantum information science, see my summary paper:

*What Is Quantum Information Science?*- https://jackkrupansky.medium.com/what-is-quantum-information-science-1bbeff565847

For more on quantum effects and QIS, see my summary paper:

*What Are Quantum Effects and How Do They Enable Quantum Information Science?*- https://jackkrupansky.medium.com/what-are-quantum-effects-and-how-do-they-enable-quantum-information-science-75981368d059

# Software

Here, software refers to software to support the use of quantum computer hardware, as distinct from quantum algorithms and application software (quantum applications and quantum solutions.)

**Support software.****Software tools.**What would be useful?**Compilers, translators, and language processors.**Accommodate various programming languages, specification languages, and notations.**Debugging features.**Visualizing activity of an algorithm or quantum circuit. Visualizing both final and intermediate results.**Need for a critical mass of powerful tools — ala C and UNIX.**Algorithm designers and application developers need to feel productive, that they can rapidly go from a raw glimmer of an idea to a working implementation.**Operating systems for quantum computers.**Currently, there is no need for an operating system on a quantum processing unit (QPU), but the overall quantum computing system certainly has an operating system for communication via a network, running jobs, and returning results. As well as support functions such as calibration and tuning. Unclear what other operating system-type functions current quantum computing systems perform.**Might it be helpful for a quantum algorithm designer or application developer to know how many wave functions their algorithm requires?**An indication of complexity. Maybe an indication of a problem if the count is too high or too low. Show a time series of wave function count as circuit execution progresses — where does it peak, how does it end. Single metric would be the peak count of wave functions.**How many quantum states does your algorithm use?**Not just how many qubits, but 2^k quantum states for each k qubits used in a single Hadamard transform. And times the circuit depth. A better metric of resource requirements.**XML and JSON for running a quantum program.**<circuit>, <measure> — selected or all gates, but also allow <measure> as a gate, <repeat> — how many times to rerun the same program. Option to specify an initial state other than all zeroes — for all runs, for specific runs, or have <repeat> specified for each alternative initial state. 
Option for randomized initial state — same as H gate, other choices for randomization. May not be as useful for quantum circuits which must be dynamically generated by classical code (*generative coding*), but maybe still useful to record any generated quantum circuit.
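To make the XML/JSON idea concrete, here is one hypothetical JSON encoding of a run request, mirroring the &lt;circuit&gt;, &lt;measure&gt;, and &lt;repeat&gt; elements sketched above. Every field name here is invented for illustration — no real toolchain defines this schema:

```python
import json

# Hypothetical JSON encoding of a quantum program run request.
# All field names are invented for illustration only.
run = {
    "circuit": [
        {"gate": "h", "qubits": [0]},      # Hadamard on qubit 0
        {"gate": "cx", "qubits": [0, 1]},  # CNOT, control 0, target 1
    ],
    "measure": "all",          # or an explicit list of qubit indices
    "repeat": 1000,            # shot count for this program
    "initial_state": "zeros",  # or alternative initial basis states
}
print(json.dumps(run, indent=2))
```

A dynamically generated circuit (generative coding) could still be recorded in this form after generation, preserving a replayable record of what actually ran.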

# Quantum software engineering

Quantum software engineering doesn’t exist as a field today per se. Nobody is discussing it or even recognizes the need for it. But it seems inevitable as quantum algorithms and applications become more sophisticated and more in need of a formal engineering discipline than simply folklore and craft. See also *Quantum computer science*.

**What is it?**What does it consist of? How much of classical software engineering applies to quantum computing?**What might a quantum software methodology look like?**Contrast with classical software methodologies. Any special opportunities of quantum computing that can be exploited?**Does the concept of the software development life cycle (SDLC) (or systems software development life cycle) still apply to quantum computing?**How might it differ from traditional digital computing?**Need for algorithm design guidelines.**What factors to keep in mind when designing quantum algorithms. And guidelines for criteria for reviewing quantum algorithms.**Need for guidelines for problem decomposition to develop a quantum algorithm.**What opportunities to look for to exploit the capabilities of quantum computing, primarily quantum parallelism.**Quantum application problem statement languages.**Allow application developers to describe applications in domain-specific terms which can then be automatically transformed into quantum algorithms.**Design and development process and tasks for quantum algorithms.****Data modeling for quantum computing.**Generally, how to think about data and computational results for quantum computing. How to organize input data. How to organize and process result data. How to organize complex data sets for quantum computing. Dealing with data types. Working with shot count or circuit repetitions and distributions of results and development of expectation values. See my paper below for the latter.**Data structures for quantum computing.**How much, if any, of classical data structures make any sense in the context of quantum computing? What data structures or data structuring methods might make sense for quantum computing?**Coding and commenting conventions for quantum algorithms.**Algorithms need to be human readable. And maintainable.**Need for robust commenting conventions for quantum circuits.**The “why” rather than the “what” of circuits. 
Intentions and motivations. Intended purpose. Intended outcome. Pseudo-code as well.**Quantum pseudo-code.**Informal language or even simply commenting conventions which allow algorithms to be expressed as*what*they are doing rather than*how*they are doing it. High level, but detailed enough that an average developer could translate to working quantum circuits. Would be good for published papers as well.**Hints for algorithm steps to permit more effective automatic optimization of quantum algorithms and circuits.**Allow the algorithm designer to fully express their intentions.**Characteristics of algorithms to document.**Qubit requirements. Qubit fidelity requirements. Connectivity requirements — is nearest-neighbor enough, big-O for swaps needed based on input size. Coherence needed — big-O for gates based on input size.**Quantum algorithms should clearly document their required minimum qubit fidelity.**No point trying to run an algorithm on a machine incapable of supporting the algorithm.**What are the key performance indicators (KPI) or metrics for an algorithm?**And how do different algorithms compare? Ditto for overall quantum applications.**Getting a handle on algorithmic complexity (computational complexity.)**See paper in next section.**Is SQL and the relational database model equally relevant for quantum computing, less relevant, or even more relevant?**Currently not relevant at all, but I suspect that there are opportunities.**Little data — large solution space.**See*Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing*. When might it not be the best choice?**Need for quantum circuit narration.**Ascertaining the net effect of a quantum circuit or even a short snippet of a quantum circuit can be tedious, error-prone, or even beyond the reach of many users.
An automated tool for*narrating*the incremental and net effect would be very helpful to answer the question*What does this quantum circuit (or snippet or portion of the circuit) do?*Alternatively, a library of common circuit patterns with narrative English descriptions of the effect and purpose of each pattern would accomplish much of the purpose of such an automated tool. Alternatively, authors of quantum circuits could carefully comment their circuits, explaining what each sequence or arrangement of gates is attempting to do — its net effect and purpose. The net effect is that quantum circuits need to be more readable by real people, not simply executable on a quantum computer.**Methodology and techniques for working with circuit repetitions (shot count) and developing expected value for a probabilistic quantum algorithm.**Formalize calculation and estimation of shot count based on the quantum computation. See my paper below. Researchers need to come up with formulas, rules of thumb, and automated processes for deducing the shot count needed in a particular situation, rather than the user engaging in guesswork and trial and error.**Open source should be the rule for quantum computing.**Open everything. Okay to charge for consulting services and premium or production-scale production-quality access to hardware, but all code, algorithms, libraries, papers, tutorials, documentation, and training materials should be strictly open source and freely available online. Free access to cloud-based hardware for education, training, and experimentation.**Cybersecurity.**Quantum algorithms and applications which aid cybersecurity, as well as quantum algorithms and applications which may interfere with or undermine cybersecurity. See also references to Shor’s factoring algorithm. See the separate *Cybersecurity* section for the various aspects.
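As one example of the kind of shot-count formula being called for here, a standard Hoeffding-bound rule of thumb from classical statistics (a sketch, not anything any quantum framework currently mandates): to estimate an outcome probability to within ±ε with confidence 1−δ, run at least N = ln(2/δ) / (2ε²) shots.

```python
import math

# Hoeffding-bound rule of thumb for shot count: to estimate an outcome
# probability within +/- eps with confidence 1 - delta, the shot count
# N must satisfy N >= ln(2/delta) / (2 * eps^2).
def shots_needed(eps: float, delta: float) -> int:
    return math.ceil(math.log(2 / delta) / (2 * eps * eps))

# Roughly 1% accuracy at 95% confidence -> tens of thousands of shots.
print(shots_needed(0.01, 0.05))
```

The quadratic blow-up in 1/ε is the key practical takeaway: each extra decimal digit of accuracy costs a hundredfold more shots, which is exactly why formalized guidance beats guesswork and trial and error.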

For more on working with probabilistic quantum results and development of expectation values, see my informal paper:

*Shots and Circuit Repetitions: Developing the Expectation Value for Results from a Quantum Computer*- https://jackkrupansky.medium.com/shots-and-circuit-repetitions-developing-the-expectation-value-for-results-from-a-quantum-computer-3d1f8eed398

# Quantum computer science

Quantum computer science doesn’t exist as a field today per se. Nobody is discussing it or even recognizes the need for it. But it seems inevitable as quantum algorithms and applications become more sophisticated and more in need of a formal science than simply folklore and craft. See also *Quantum software engineering*.

**What is it?**What does it consist of? How much of classical computer science applies to quantum computing?**Is quantum computer science distinct from classical computer science?**How much overlap? Or maybe quantum computer science is a module to add to classical computer science?**Fault-tolerant quantum computing (FTQC).****Quantum error correction (QEC).****Logical qubits.****Manual error mitigation.****How to work with quantum results — development of expectation values.**Results can be probabilistic and possibly in error. Working with shot count or circuit repetitions and distributions of results and development of expectation values. See my paper below. Researchers need to come up with formulas, rules of thumb, and automated processes for deducing the shot count needed in a particular situation, rather than the user engaging in guesswork and trial and error.**Quantum computing needs a distinction comparable to that between classical Boolean logic and integer arithmetic and the physics of transistors.**Classical developers need not understand the physics of transistors. Quantum developers — and quantum computer scientists — need a comparable semantic distance from the underlying physics.**Does Shor’s factoring algorithm suffer from the halting problem?**Not guaranteed to finish (halt), even though probabilistically it may be very likely to finish (halt) in most cases. Roughly 50% chance that a randomly chosen integer will produce a viable order that results in a factor. And if the quantum computer doesn’t have a low enough error rate, that 50% chance could be reduced enough to make non-halting a possible issue in some cases, even if reasonably infrequent.**The halting problem in general.**Especially for variational methods. Even Shor’s factoring algorithm since it is based on random numbers.**Initial thoughts for quantum computing abstraction.**How to abstract away from quantum mechanics. Criteria. Opportunities. Obstacles. Hardware resources vs. software resources.
Is interference a quantum mechanical resource? It’s just the imaginary part of probability amplitude. Is probability amplitude even a quantum mechanical resource? Is phase a quantum mechanical resource? Does it have an existence distinct from merely probability amplitude? RFP for algorithm abstraction. Application domain: concepts, processes, goals, inputs, data structures.**Significance of PSPACE.**How exactly to characterize algorithmic complexity for quantum computing. What qualities from problem space to map to qubits vs. computational basis states.**Algorithmic or computational complexity of quantum algorithms and applications.**See my paper below.**Quantum programming models.**Plural — different levels, different needs. Higher level for greater intellectual leverage. Lower level for greater performance and control. Common concepts and elements across the various programming models.**What might an ideal high-level quantum programming model look like?**What would the fundamental algorithmic building blocks look like? How does one proceed from problem statement to solution?**Quantum programming languages.**Plural — different levels, different needs. Their conceptual, syntactic, and semantic capabilities. Higher level for greater intellectual leverage. Lower level for greater performance and control. Common concepts and elements across the various programming languages.**What might a BASIC language for quantum computing look like?**What types of problems can be solved very easily?**Might FORTRAN make a comeback for quantum computing?****What might an ideal high-level quantum programming language look like?**The syntax is less interesting than how computations can be expressed in the programming model. Plural — different levels, different needs. Higher level for greater intellectual leverage. Lower level for greater performance and control.**Possible abstractions for a quantum programming model and language.**What entities should a quantum programmer be aware of? 
Semantics of entities. Semantics of operations on and between entities.**Quantum computing as analog computing.**Phase and probability amplitude as a continuous value (albeit ultimately discrete.)**Applicability of Church-Turing thesis to quantum computing.****Basis state vs. basis vector.**Unravel the terminology. And computational basis state and product state as well. And eigenstate while we’re at it.**What constitutes a *computational basis*?****Are the |0> and |1> basis states supposed to be orthonormal?**In the Bloch sphere they are opposing vectors, |0> pointing to the north pole, and |1> pointing to the south pole, but NOT at right angles to each other, so are they really an “orthonormal basis”?**Turing test for quantum computing.**Is a particular machine really a quantum computer? Can you really tell simply by running an algorithm and examining the results? Can a test for ability to generate random numbers tell? Probabilistic rather than deterministic results. And achieve quantum advantage. Maybe, if a particular machine cannot achieve quantum advantage, then it shouldn’t be considered a quantum computer. Allow for simulators, but also wish to detect a simulator vs. a real quantum computer.**Derivation of Planck constant and relevance to quantum computing.**What is the quanta for probability amplitude, phase angle, and probability? What is the physical, phenomenological basis for these quanta? What are the practical implications, such as minimum and maximum angles of rotations? What are the quanta for errors?**If a classical program is like a recipe or control of a washing machine, what metaphor works for quantum programs?**Same, just more limited? Or exactly the same, just completely linear, with no loops? Or, what?**Is there a sense of “work” being performed during execution of quantum logic gates?**Work being a transfer of energy. If a substantial number of qubits are entangled, how would the “work” of a single gate be characterized?
How much “work” is performed by a Hadamard gate that creates a superposition? How much “work” is performed by a Hadamard gate that undoes a superposition?**Why is pi sprinkled everywhere in quantum computing?**Pi, two pi, and pi over 2 are most common. Gates are essentially just rotations by angles, with two pi radians in a full circle. Rotation is periodic, with two pi radians of rotation being one period.**What are the consequences in quantum computing of pi being irrational?**Since pi has an infinity of digits, what precision actually matters in quantum computing? Is there some limit, some Planck-level quantum unit, to precision of gates? How many digits of pi should we use, in general? Is 3.14 good enough? 3.1416? 3.14159? 3.1415926535? Or what?**Can finite state automata and pushdown automata be simulated or just implemented on a quantum computer?****Is SQL and the relational database model equally relevant for quantum computing, less relevant, or even more relevant?**Currently not relevant at all, but I suspect that there are opportunities.**Since quantum computing is represented as being*probabilistic*rather than*deterministic*, it seems reasonable to assume that*any*quantum computation must be*repeated*some significant number of times and then*some sort of statistical analysis*performed on that set of results to*decide*which of the results is*the*result, but I don’t find a detailed treatment of this*repetition and evaluation process*anywhere in the literature.**In particular, is there any*magic formula*or*guidance*for*calculating*how many repetitions are required to achieve some designated percentage of accuracy (80%, 90%, 95%, 98%, 99%, 99.9%, 99.9999%, etc.)? What statistical criteria should be used to evaluate the results?
For some computations it might be sufficient to sequentially evaluate*all*results since a*correct*result would be a solution to an equation (e.g., order of a modular exponentiation), but other applications might require detailed statistical analysis of the results, such as looking for the top k peaks in a statistical distribution. A lot more discussion and guidance is needed. And*every*presentation (especially formal publication) of a quantum algorithm should be accompanied by such a discussion and guidance for calculating the number of repetitions and what specific statistical analysis is required to evaluate the results. What might typical repetition counts be? 10? 20? 50? 100? 250? 500? 1,000? 10,000? 50,000? 1 million? 10 million? 100 billion? And, in the context of NISQ quantum computers, how to validate whether the quantum computation may be completely failing every single time due to exceeding the coherence of the particular quantum computer, so that it will*never*return a valid result (except maybe by accident or chance.) How many failures of particular runs will indicate that the particular NISQ computer is*incapable*of delivering*any*correct result? Again, some formula, rule, or other guidance is needed for every algorithm and every machine. See my paper on shots and circuit repetitions below.**Conceptualization of approaches for quantum-inspired classical algorithms.****What is a degree of freedom?**In terms of quantum computer science.
In plain language.**How can quantum computing compete with the magnificent high-level abstractions of classical computing?**Other than raw performance.**Can dramatic quantum advantage be achieved without quantum Fourier transform?**Is there some other path?**Need for an alternative to Shor’s factoring algorithm as the exemplar of the best that quantum computing has to offer.****Data modeling for quantum computing.**Generally, how to think about data and computational results for quantum computing. How to organize input data. How to organize and process result data. How to organize complex data sets for quantum computing. Dealing with data types. Working with shot count or circuit repetitions and distributions of results and development of expectation values. See my paper below for the latter.**Data structures for quantum computing.**How much, if any, of classical data structures make any sense in the context of quantum computing? What data structures or data structuring methods might make sense for quantum computing?**What is the quantum computing equivalent of the tape of a Turing machine?****The need for reversible algorithms.**Early discussions of quantum computing insisted that every quantum computation needs to be fully reversible, but more recently that requirement has been absent. What changed? Is reversibility essential, mandated, desirable, optional, irrelevant, or what? More clarity is needed as to whether or when or under what conditions full reversibility is required.**What circuit depth could we expect to require to achieve dramatic quantum advantage?**Of course it will vary a lot, but some ballpark estimate would be helpful.**Quantum work.**How much work is a quantum computer or actually a particular quantum circuit performing? Maybe the sum of quantum states for each step in the quantum circuit. Factoring in quantum parallelism — a Hadamard transform on n qubits producing 2^n quantum states. 
Compare the work of the quantum circuit with the work of a classical program which calculates the same results.**Working with real numbers on a quantum computer.**When all you have are raw qubits. How to process real numbers on a quantum computer. How to get real numbers for the results of quantum computations.**Does measurement of a single qubit cause the full collapse for all entangled qubits participating in the wave function in which the measured qubit is participating?**The full semantics of measurement and collapse needs to be more fully and accurately articulated. Do all of the qubits participating in the wave function collapse to exactly pure states — with an absolute guarantee of no mixed states? Or is it just the qubit(s) being measured. Should the order in which entangled qubits are measured matter? At least for Bell states and quantum communication, it seems as though both qubits collapse at the same time, but this needs to be articulated more clearly. Curious about GHZ and W states.**How exactly can quantum computing compete with Monte Carlo simulation on a classical computer?**Where approximate solutions are acceptable. Where exactly is the quantum advantage? Potential for a hybrid?**Hybrid Monte Carlo methods.**Quantum-specific enhancements. Keep up with improvements in classical MCS.**Can probability amplitudes (or probabilities) of the basis states be determined from the Bloch sphere?**I haven’t seen any geometric representation of them, particularly for superposition. Certainly not possible for entangled qubits.**What does “**A phase shift that has no impact on the outcome of measurement? Change in amplitude that leaves probability unchanged? What exactly is the point or purpose or effect of saying it? Other phrases… Up to a relative phase shift. Up to an unobservable global phase factor. (What is a phase factor vs. phase shift?) Up to an unimportant constant factor. Up to an unimportant global phase T. Up to an unimportant global phase. 
Up to an unimportant global phase factor. Up to an unimportant global phase shift. Up to a global phase. Up to a global phase shift. Up to an irrelevant overall phase. Up to an irrelevant overall phase factor. Up to an irrelevant global phase. Up to a normalization factor. Up to an overall multiplicative factor. Up to an error on a single qubit of the output. Up to a constant factor. Up to a unitary transformation. Up to a relative phase shift. “*up to a factor*” or “*equivalent up to global phase*” really mean?*equivalent up to global phase*” — “*Two quantum states |ψi and |ϕi are equivalent up to global phase if |ϕi = e^iθ |ψi, where θ ∈ R. The phase e^iθ will not be observed upon measurement of either state.*” — https://arxiv.org/abs/0705.0017.**Deeper understanding and conceptualization of global properties and how they can be used in quantum computations.****What exactly is a**https://en.wikipedia.org/wiki/Root_of_unity http://mathworld.wolfram.com/RootofUnity.html*root of unity*and how is it used in quantum computing?**What are the various forms of quantum parallelism?**Enumerate and describe them.**Thinking beyond quantum parallelism.**What are the limits of current models of quantum parallelism? What could be done for problems which exceed those limits?**Chunking as an approach to processing more data than a quantum computer can directly handle.**An essential challenge. Multiple runs with variations of seams — look for statistical balancing. Major post-processing — using a quantum computer simply to guide a classical computer to where to focus effort. Gridding as an essential tool — using a quantum computer simply to guide a classical computer to where to focus effort.**Limits of quantum computing.**What about problems and solutions which are of super-exponential complexity? How can complexity be reduced from super-exponential to exponential? 
Techniques and heuristics for complexity reduction, such as Shor reducing factoring to order-finding.**Techniques and heuristics for complexity reduction.**Such as Shor reducing factoring to order-finding.**Can a quantum algorithm compute a transcendental function?**Normally requires an infinite series expansion on a classical computer. No looping available. Sine, cosine, exponential, logarithm.**Little data — large solution space.**See. When might it not be the best choice?*Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing***Key to exploiting quantum computing is reduction of algorithmic complexity.**Reducing from an algorithm of higher complexity to a lower level of complexity, primarily from exponential to polynomial. Even reduction from exponential to polynomial may not be sufficient — even linear may be too intensive. Is super-polynomial still too expensive or not?**What is the largest problem solvable with a single quantum circuit?****What’s the largest single quantum computation we can imagine at this time?****Is there any theoretical limit to the complexity of a single quantum calculation?**I suspect that there might be. There are undoubtedly practical limits for particular qubit technology implementations, but very possibly there might be theoretical limitations as well. And since every implementation has to use some technology, there is likely some best and most ideal qubit technology whose practical limit is then the theoretical limit.**What is the simplest example of quantum parallelism?****Deeper and higher-level conceptualization of Hilbert space and how it can be directly or indirectly exploited in quantum computation.****Extent to which number theory can be exploited for quantum computation.**Use of order-finding (AKA period-finding) for Shor’ factoring algorithm is one example, but is it a freak aberration, or the tip of the iceberg for even more dramatic applications. 
What other aspects of number theory can be exploited for quantum computing? My hunch is that there’s plenty of opportunity to mine here.**Comparing and contrasting the D-Wave quantum annealing quantum computers with gate model quantum computers.**What forms of computations are appropriate for each. What are their relative futures? Can the two approaches be combined or even merged? Can they work together somehow?**What compute-intensive problems do not require a quantum computer?**Any problem with a classical computing solution whose algorithmic complexity is less than exponential (e.g., polynomial.) And maybe somewhat less than that.**Do the ambitions of quantum computer scientists exceed the available physics?**For example, expectation of very fine granularity of probability amplitudes and phase angles to enable quantum Fourier transform (QFT) and quantum phase estimation (QPE). Also, a very large number of simultaneous product states — 2^n, where n could be 20, 50, 100, 500, 1000, or more — far more simultaneous product states than the number of particles in the universe.**Cybersecurity.**Quantum algorithms and applications which aid cybersecurity, as well as quantum algorithms and applications which may interfere with or undermine cybersecurity. See also references to Shor’s factoring algorithm. See the separatesection for the various aspects.*Cybersecurity*
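The question about what “equivalent up to global phase” really means has a crisp numerical test: for normalized states, |ϕ⟩ = e^(iθ)|ψ⟩ holds exactly when |⟨ψ|ϕ⟩| = 1, while any relative phase pushes that inner product below 1. A minimal sketch (the function name is mine, not a standard API):

```python
import numpy as np

def equivalent_up_to_global_phase(psi, phi, tol=1e-9):
    """Normalized states |psi> and |phi> are equivalent up to a global
    phase iff |phi> = e^{i theta}|psi>, i.e. iff |<psi|phi>| == 1."""
    psi = np.asarray(psi, dtype=complex)
    phi = np.asarray(phi, dtype=complex)
    psi = psi / np.linalg.norm(psi)
    phi = phi / np.linalg.norm(phi)
    # np.vdot conjugates its first argument, giving <psi|phi>
    return abs(np.vdot(psi, phi)) > 1 - tol

# |+> and e^{i pi/4}|+> differ only by a global phase: equivalent.
plus = np.array([1, 1]) / np.sqrt(2)
assert equivalent_up_to_global_phase(plus, np.exp(1j * np.pi / 4) * plus)

# A *relative* phase changes statistics in other bases: not equivalent.
relative = np.array([1, np.exp(1j * np.pi / 4)]) / np.sqrt(2)
assert not equivalent_up_to_global_phase(plus, relative)
```

This directly mirrors the arXiv definition quoted above: the global phase e^(iθ) drops out of every measurement probability, whereas a relative phase does not.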

Also see the *quantum algorithm support* section for research areas which overlap with *quantum computer science*.

For more on algorithmic complexity (computational complexity), see my paper:

*What Is Algorithmic Complexity (or Computational Complexity) and Big-O Notation?*
- https://jackkrupansky.medium.com/what-is-algorithmic-complexity-or-computational-complexity-and-big-o-notation-9c1e5eb6ad48
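As a concrete illustration of the growth rates behind these Big-O discussions, and of why the exponential-to-polynomial reductions urged throughout this paper matter so much, here is a toy tabulation (nothing more than arithmetic):

```python
import math

# Toy illustration of growth rates: why reducing an algorithm from
# factorial or exponential to polynomial complexity matters so much.
for n in (10, 20, 30, 40, 50):
    print(f"n={n:3d}  n^3={n**3:>8d}  2^n={2**n:>16d}  "
          f"n!={float(math.factorial(n)):.3e}")
```

At n = 50, a polynomial n^3 algorithm takes 125,000 steps while 2^n is already over 10^15, which is the whole case for complexity reduction in one table.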

For more on working with probabilistic quantum results and development of expectation values, see my informal paper:

*Shots and Circuit Repetitions: Developing the Expectation Value for Results from a Quantum Computer*
- https://jackkrupansky.medium.com/shots-and-circuit-repetitions-developing-the-expectation-value-for-results-from-a-quantum-computer-3d1f8eed398
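As one example of the kind of formula or rule of thumb called for above, Hoeffding’s inequality gives a standard statistical bound: roughly ln(2/δ)/(2ε²) shots suffice to estimate a single outcome probability to within ±ε with confidence 1 − δ. A hedged sketch (my own function name, one possible rule, not a prescription from the paper above):

```python
import math

def shots_needed(epsilon, delta):
    """Hoeffding bound: number of circuit repetitions (shots) so that
    the estimated probability of one measurement outcome is within
    +/- epsilon of the true value with confidence at least 1 - delta."""
    return math.ceil(math.log(2 / delta) / (2 * epsilon ** 2))

# e.g. estimate a probability to +/-1% with 99% confidence:
print(shots_needed(0.01, 0.01))  # on the order of tens of thousands of shots
```

Note this bounds a *single* outcome probability; resolving the top k peaks of a distribution, as discussed above, generally requires more careful analysis.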

# Cybersecurity

Cybersecurity is a broad area. It comes up in all of the major areas of research, but is summarized here:

- **True random number generation.** Random numbers have a lot of application in cybersecurity. Unlike classical computers, which can only generate pseudo-random numbers unless they have special hardware, quantum computers can trivially generate true random numbers. See my paper below.
- **Security of quantum computers.** Nothing quantum per se — a quantum computer is still a computer, with all of the security issues of classical computers.
- **Quantum communications.** Such as QKD. Beyond the scope of this paper.
- **Quantum networking.** Vulnerabilities. Inherent defenses. How to defend. How to attack.
- **Quantum algorithms.** Quantum algorithms which aid cybersecurity, as well as quantum algorithms which may interfere with or undermine cybersecurity. See also references to Shor’s factoring algorithm.
- **Quantum applications.** Quantum applications which aid cybersecurity, as well as quantum applications which may interfere with or undermine cybersecurity.
- **Shor’s factoring algorithm.** Will it ever work? Will it break public-key encryption? Will it break bitcoin, cryptocurrencies in general, distributed ledger technology, or blockchain in general?
- **Post-quantum cryptography.** Cryptography in light of the risk that maybe some day Shor’s factoring algorithm will become practical. The technical alternatives, today and tomorrow. What more can be done?
- **Quantum hacking.** Quantum algorithms and quantum applications which could dramatically enhance computer hacking. This could present significant cybersecurity risks for criminal or state-sponsored hacking, but also present significant opportunities for national security cyberwarfare.
- **Cyberwarfare.** Both defensive and offensive capabilities at the national security level. Can be both tactical and strategic. Can be a passive deterrent as well as active measures.

My paper which discusses generation of true random numbers, both as an adjunct to classical computers and directly on a quantum computer:

*Quantum Advantage Now: Generation of True Random Numbers*
- https://jackkrupansky.medium.com/quantum-advantage-now-generation-of-true-random-numbers-237d89f8a7f2
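To make the true-randomness point concrete: on hardware, applying H to |0⟩ and measuring yields a physically random bit with probability 1/2 for each outcome. The sketch below only *simulates* those Born-rule statistics classically with a pseudo-random generator, so it illustrates the mechanism but cannot itself produce true randomness; only a real quantum device can. All names are my own:

```python
import numpy as np

rng = np.random.default_rng(0)  # classical PRNG, for simulation only

def simulated_h_measure(n_bits):
    """Simulate measuring H|0> = (|0> + |1>)/sqrt(2), n_bits times.
    On real hardware each measurement is physically random; here we
    merely reproduce the Born-rule statistics classically."""
    amps = np.array([1, 1]) / np.sqrt(2)   # state after H on |0>
    p0 = abs(amps[0]) ** 2                 # Born rule: p(0) = |amplitude|^2
    return (rng.random(n_bits) >= p0).astype(int)

bits = simulated_h_measure(1000)
print(bits[:16], bits.mean())  # mean should hover near 0.5
```

On a real machine the same one-gate circuit (H, then measure) is about the simplest possible demonstration of quantum advantage, which is the thesis of the paper above.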

# Quantum algorithm support

These are research areas which apply across all algorithms generally.

Also, see the *quantum computer science* section for areas of overlap with algorithm support.

- **Better understanding of criteria for what applications or problems are not amenable to implementation as quantum algorithms.** See my paper below — what can’t a quantum computer compute.
- **Clearly document the required minimum qubit fidelity for an algorithm.** No point trying to run an algorithm on a machine incapable of supporting the algorithm. Rules of thumb and automated tools to support determining that minimum requirement.
- **Distinguish derivative algorithms from algorithm implementations.** An algorithm is essentially a specification for what the algorithm does. The code or circuit which implements the algorithm is an *implementation* of the algorithm (specification). Variations or deviations from the original algorithm (specification) are more properly termed *derivative algorithms*. For example, there are a number of derivative algorithms based on Shor’s factoring algorithm — they each have their own implementation, but none of them is an implementation of Shor’s original factoring algorithm per se.
- **The art of crafting a quantum algorithm.** Document accumulated wisdom for how to go about putting together an algorithm, starting with a statement of the problem to be solved. Also, given a rough algorithm, how to refine, optimize, and polish it into a well-crafted algorithm.
- **Conceptualization of generative coding of quantum circuits.** Strategies, methods, and techniques for generative coding of concrete quantum circuits from abstract quantum algorithms.
- **Libraries for generative coding patterns.**
- **Better programming models.** Higher-level. Or possibly even lower-level if that facilitates compilation of macro operations (a la classical RISC instruction sets).
- **Modest improvements in programming models.** Even if truly high-level programming models are substantially more difficult to achieve.
- **Development of quantum-native high-level quantum programming languages.** Syntax is irrelevant — it’s the semantics which matters. Plural — different levels, different needs. Higher level for greater intellectual leverage. Lower level for greater performance and control.
- **Quantum computing needs a high-level algebra.** Classical computing directly paralleled algebraic expressions and simple logic. What might a high-level algebra for quantum computing look like? It certainly isn’t linear algebra. Needs to relate to the terms of application domains, in the same way that application concepts related to traditional algebra, but… in a completely different way.
- **Computable functions for quantum computing.** What exactly can a quantum computer compute or not compute, in the formal mathematical sense? Presumably quantum computers can compute more than classical computers, but how can that difference be formalized? All operations of a Turing machine are computable in that mathematical sense, but operations of a Turing u-machine are not required to be computable. True random numbers are not computable in the classical sense of a classical Turing machine, but can readily be computed by a quantum computer.
- **Quantum analogs to the rich data and control structures of classical computing.** Reality “computes” quite well with energy, matter, mass, charge, atoms, subatomic particles, molecules, forces, fields, chemical reactions, crystal lattices, liquids, gases, and plasmas. Not sure how any of this can be applied to quantum computing, but reality certainly does “compute” a lot of interesting phenomena based solely on quantum effects and these other emergent phenomena. What data and control structures are really needed to “compute” or simulate reality at the level of quantum effects?
- **What might a BASIC language for quantum computing look like?** What types of problems can be solved very easily?
- **Might FORTRAN make a comeback for quantum computing?**
- **Conceptualization of methodologies for analyzing application problems and transforming them into physics problems which can be implemented as quantum algorithms.** Principles of transformation. Sufficient to develop how-to guides for mere mortals, average application developers, not elite experts.
- **Application and domain-specific methodologies for analyzing application problems and transforming them into physics problems which can be implemented as quantum algorithms.**
- **Quantum application problem statement languages.** Allow application developers to describe applications in domain-specific terms which can then be automatically transformed into quantum algorithms.
- **Quantum algorithm frameworks.**
- **Quantum algorithm metaphors.**
- **Quantum algorithm design patterns.**
- **What are the more common *design patterns* for algorithms for quantum computing?**
- **Quantum algorithmic building blocks.** Permit high-level operations. Substantially less effort to code complex algorithms.
- **Modest improvements in quantum algorithmic building blocks.** Even if truly high-level operations prove to be much more difficult to achieve.
- **What are the most important algorithmic building blocks for quantum applications?**
- **More sophisticated algorithmic approaches, beyond quantum Fourier transform (QFT) and quantum phase estimation (QPE).**
- **Were quantum phase estimation (QPE) and quantum Fourier transform (QFT) ever really viable practical visions for quantum computing and achieving quantum advantage — given their dependence on precise and fine granularity of phase and probability amplitude?** See my paper below.
- **Are quantum Fourier transform (QFT) and quantum phase estimation (QPE) the best we can do?** They are impressive, but somewhat limited. But since we can’t use either to any significant degree today, there’s no rush to go further. But we should still be a lot more forward-looking.
- **Is period-finding (such as used by Shor’s factoring algorithm) as useful as had been claimed?**
- **Is order-finding (AKA period-finding) an essential technique for quantum algorithms?**
- **What are applications of order-finding (AKA period-finding) algorithms (OFA, PFA, or QPF)?** Beyond Shor’s factoring algorithm. Practical uses, where OFA is the best and optimal approach.
- **Quantum amplitude estimation (QAE).** Analogous to quantum phase estimation (QPE) but applied to probability amplitudes for basis states. Questions about granularity of amplitude, as well as circuits for performing the estimation. Or, should this be an atomic operation built into the firmware and optimized for the particular hardware?
- **Implement common and high-value operations in firmware.** Such as SWAP networks, quantum Fourier transform (QFT), quantum phase estimation (QPE), etc., to dramatically minimize the number of quantum logic gates. Optimize for the particular hardware. Dramatically simplify algorithms. But… what’s practical and yields a truly dramatic benefit?
- **Working with fine granularity of phase and probability amplitude.** See my paper below.
- **Methodology and techniques for working with circuit repetitions (shot count) and developing the expected value for a probabilistic quantum algorithm.** Formalize calculation and estimation of shot count based on the quantum computation. See my paper below. Researchers need to come up with formulas, rules of thumb, and automated processes for deducing the shot count needed in a particular situation, rather than the user engaging in guesswork and trial and error.
- **Methods and techniques for quantum parallelism.**
- **Conceptualization of approaches for quantum-inspired classical algorithms.**
- **Conceptualization of heuristic approaches to quantum algorithm design.**
- **Conceptualization of rules of thumb for quantum algorithm design.**
- **Conceptualization of scalable algorithms.** Models for how to approach scalability. Automatic scalability.
- **How can we automatically validate scalability of a quantum algorithm?** Automated tools. Criteria and code patterns to detect.
- **Quantum algorithms need to be provably scalable.** How can this be done? Is it possible?
- **Can quantum variational methods ever achieve quantum advantage?** Are variational methods a dead end? Sure, variational methods are a great way of performing computational chemistry calculations on NISQ devices, but not with any apparent, let alone dramatic, quantum advantage. Too fragmented. Each invocation is too limited. Need to use 50 or more qubits in a single Hadamard transform computation to achieve quantum advantage.
- **Focus on near-perfect qubits rather than depend only on full quantum error correction and logical qubits.** See my paper below on *Fault-Tolerant Quantum Computing, Quantum Error Correction, and Logical Qubits* — and near-perfect qubits.
- **Impact of fault tolerance, quantum error correction, and logical qubits on quantum algorithms.** Contrast with noisy qubits, near-perfect qubits, and manual error mitigation.
- **Bespoke (custom) algorithms vs. reusable algorithms.** Tradeoff between algorithms custom-designed and tailored for particular applications, and general-purpose algorithms usable across many applications. Extra focus on the latter would be most beneficial, but criteria for characterizing both are needed.
- **Challenges of modeling complex systems for quantum computing.** Other than physics and chemistry, which are naturally modeled by quantum mechanics. Telling people to simply model their problem as a physics problem is too simplistic and not so helpful for non-physicists and non-chemists.
- **Debugging capabilities.** How to debug algorithms and applications for more than the qubit and quantum state capacity of even the largest classical quantum simulators is an interesting challenge.
- **How large a quantum Fourier transform can be supported?** What are the limits to the granularity of phase?
- **What can we do with 128 qubits?**
- **What can we do with 256 qubits?**
- **What can we do with 512 qubits?**
- **What can we do with 1,000 qubits?**
- **What can we do with 1,000,000 qubits?**
- **Dealing with the quantum chasm.** The transition from quantum computers with hundreds of qubits to machines with millions of qubits has been referred to as the *quantum chasm*. See Alan Ho of Google in *Crossing the Quantum Chasm* and John Preskill in *Quantum Computing in the NISQ era and beyond*.
- **Can a quantum computer compute cellular automata?**
- **Genetic programming and genetic algorithms for quantum computing.**
- **How can quantum computing compete with the magnificent high-level abstractions of classical computing?** Other than raw performance.
- **Can dramatic quantum advantage be achieved without quantum Fourier transform?** Is there some other path?
- **Need for an alternative to Shor’s factoring algorithm as the exemplar of the best that quantum computing has to offer.**
- **Data structures for quantum computing.** How much, if any, of classical data structures make any sense in the context of quantum computing? What data structures or data structuring methods might make sense for quantum computing?
- **What’s common across all quantum computing application categories?**
- **Proper commenting and documentation for quantum algorithms.**
- **Using classical AI to facilitate quantum computational chemistry.** Avoid tedious manual preparation of initial states and circuits.
- **Methods and criteria for validation of results of quantum computations.**
- **How much knowledge of quantum mechanics do you need to understand quantum computing?** How much can be hidden, at least from most algorithm designers?
- **What is the quantum computing equivalent of the tape of a Turing machine?**
- **How accurate and reliable can a quantum computer be if entries in a unitary matrix are irrational?** Such as an H gate with entries which are 1 divided by the square root of 2, an irrational number. How much precision is needed? Will it ever be enough? Would symbolic math for pi, square roots, and exponentials lead to greater accuracy and closer alignment with the reality of quantum mechanics?
- **General support for production-scale production-quality algorithms, whether larger amounts of data, larger input parameters, or larger solution spaces.**
- **The need for reversible algorithms.** Early discussions of quantum computing insisted that every quantum computation needs to be fully reversible, but more recently that requirement has been absent. What changed? Is reversibility essential, mandated, desirable, optional, irrelevant, or what? More clarity is needed as to whether or when or under what conditions full reversibility is required.
- **What circuit depth could we expect to require to achieve dramatic quantum advantage?** Of course it will vary a lot, but some ballpark estimate would be helpful.
- **How many of our current algorithms are truly and fully scalable?** To 30, 40, 50, 60, 70, 80 qubits and beyond. Without redesign, as currently written and currently implemented. I fear that the current answer is few or even none.
- **Quantum work.** How much work is a quantum computer, or actually a particular quantum circuit, performing? Maybe the sum of quantum states for each step in the quantum circuit. Factoring in quantum parallelism — a Hadamard transform on n qubits producing 2^n quantum states. Compare the work of the quantum circuit with the work of a classical program which calculates the same results.
- **How large and complex a molecule would have to be simulated to have a dramatic impact on chemists?** True, dramatic quantum advantage, if not outright quantum supremacy.
- **Working with real numbers on a quantum computer.** When all you have are raw qubits. How to process real numbers on a quantum computer. How to get real numbers for the results of quantum computations.
- **Does measurement of a single qubit cause the full collapse for all entangled qubits participating in the wave function in which the measured qubit is participating?** The full semantics of measurement and collapse needs to be more fully and accurately articulated. Do all of the qubits participating in the wave function collapse to exactly pure states — with an absolute guarantee of no mixed states? Or is it just the qubit(s) being measured? Should the order in which entangled qubits are measured matter? At least for Bell states and quantum communication, it seems as though both qubits collapse at the same time, but this needs to be articulated more clearly. Curious about GHZ and W states.
- **Does measurement of a qubit guarantee collapse to an absolutely pure state?** Are all other qubits participating in the same wave function also guaranteed to collapse to exact pure states rather than mixed states?
- **How stable are pure states for qubits?** Are pure states any more or less stable than mixed states?
- **If simulating physics, does it ever really make sense for the circuit to always start in the ground state?** After all, in the real world, there is always something going on before an experiment is run. Maybe the initial state can or should be constrained to a relatively narrow range of state values, but not the absolutely pure ground state. Ditto for chemistry and chemical reactions — always some relatively noisy background. That said, for purposes of debugging anomalous cases, it can be helpful to capture and initiate to some known noisy state, to reproduce problematic test cases. Or, maybe classical code should always do the selection and recording of a particular initial state, even if intending it to be random.
- **How exactly can quantum computing compete with Monte Carlo simulation on a classical computer?** Where approximate solutions are acceptable. Where exactly is the quantum advantage? Potential for a hybrid?
- **Hybrid Monte Carlo methods.** Quantum-specific enhancements. Keep up with improvements in classical MCS.
- **Can probability amplitudes (or probabilities) of the basis states be determined from the Bloch sphere?** I haven’t seen any geometric representation of them, particularly for superposition. Certainly not possible for entangled qubits.
- **What does “*up to a factor*” or “*equivalent up to global phase*” really mean?** A phase shift that has no impact on the outcome of measurement? A change in amplitude that leaves probability unchanged? What exactly is the point or purpose or effect of saying it? Other phrases: up to a relative phase shift. Up to an unobservable global phase factor. (What is a phase factor vs. a phase shift?) Up to an unimportant constant factor. Up to an unimportant global phase T. Up to an unimportant global phase. Up to an unimportant global phase factor. Up to an unimportant global phase shift. Up to a global phase. Up to a global phase shift. Up to an irrelevant overall phase. Up to an irrelevant overall phase factor. Up to an irrelevant global phase. Up to a normalization factor. Up to an overall multiplicative factor. Up to an error on a single qubit of the output. Up to a constant factor. Up to a unitary transformation. One definition of “*equivalent up to global phase*”: “*Two quantum states |ψ⟩ and |ϕ⟩ are equivalent up to global phase if |ϕ⟩ = e^iθ |ψ⟩, where θ ∈ R. The phase e^iθ will not be observed upon measurement of either state.*” — https://arxiv.org/abs/0705.0017.
- **What exactly is a *root of unity* and how is it used in quantum computing?** See https://en.wikipedia.org/wiki/Root_of_unity and http://mathworld.wolfram.com/RootofUnity.html.
- **What are the various forms of quantum parallelism?** Enumerate and describe them.
- **Thinking beyond quantum parallelism.** What are the limits of current models of quantum parallelism? What could be done for problems which exceed those limits?
- **Chunking as an approach to processing more data than a quantum computer can directly handle.** An essential challenge. Multiple runs with variations of seams — look for statistical balancing. Major post-processing — using a quantum computer simply to guide a classical computer to where to focus effort. Gridding as an essential tool — using a quantum computer simply to guide a classical computer to where to focus effort.
- **Chunking and machine learning.** How much data can a quantum computer handle in a single chunk? How much data for each complete problem to be chunked? How to handle the seams between chunks?
- **Limits of quantum computing.** What about problems and solutions which are of super-exponential complexity? How can complexity be reduced from super-exponential to exponential? Techniques and heuristics for complexity reduction, such as Shor reducing factoring to order-finding (AKA period-finding).
- **Techniques and heuristics for complexity reduction.** Such as Shor reducing factoring to order-finding (AKA period-finding).
- **Can a quantum algorithm compute a transcendental function?** Normally this requires an infinite series expansion on a classical computer. No looping available. Sine, cosine, exponential, logarithm.
- **Is there any merit to executing multiple copies of the same quantum circuit in parallel on the same quantum processor?** Or is that no better than serially executing the same circuit? For example, if the quantum processor has 53 qubits but a quantum volume of only 64 to 256 (6 to 8 qubits), why not execute six copies in parallel — and compare the results? Any advantage to running those six copies serially instead? Maybe crosstalk and interference between qubits. Can the six (or five or four) sets of 6–8 qubits be physically isolated sufficiently?
- **What types of problems do not require determinism (deterministic results) and which do?** Good enough even if not the most optimal — optimization, selection. Learning — is it really okay to learn the wrong lessons or to learn only part of the material? Consumer interests — no clear deterministic solution anyway. Problem diagnosis — want options, possibilities which can then be further evaluated. A list of possibilities vs. a single solution which has some probability of being wrong. Financial applications which require strict deterministic results. Immediate results vs. storage and persistence of results.
- **How much number theory do you need to understand to understand advanced quantum algorithms?** Such as those using order-finding (AKA period-finding), including Shor’s factoring algorithm.
- **Need for technical guidelines for when quantum computing is appropriate for a computation.** See my paper below.
- **Need simple plain text notation for even complex quantum states.** No Greek letters, math symbols, or other obscure notation. No complex or obscure graphics. 0 and 1 are obvious. 0/1 for a superposition of 0 and 1. Etc.
- **Anatomy of a quantum algorithm.** What are the pieces? How do they relate or interconnect? How many distinct design patterns? SPAM — State Preparation And Measurement. Hybrid models. Clever reduction of algorithmic complexity.
- **Where is the value in quantum machine learning if there is no support for Big Data?**
- **Is quantum computing appropriate for problems of factorial complexity?** Or must they first be reduced to exponential complexity? Well, they do have to be reduced to polynomial complexity anyway. If it takes two steps with an intermediate reduction, fine. But better if there was a one-step reduction. The traveling salesman problem?
- **Can the *Wolfram Physics Project* be computed using a quantum computer?** Is Wolfram’s Fundamental Theory of Physics compatible with quantum mechanics, the native science of quantum computers? Could some variant of quantum computing be more compatible with Wolfram’s Fundamental Theory of Physics? Is there any workable synergism between Wolfram’s Fundamental Theory of Physics and quantum computing?
- **Little data — large solution space.** See *Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing*. When might it not be the best choice?
- **What is a complex quantum algorithm?** What’s the threshold or criteria to distinguish it from a simple algorithm? When is a quantum algorithm no longer a trivial or simple or non-complex quantum algorithm? Raw number of gates? Degree of entanglement? Number of qubits in the largest register with all qubits in a Hadamard superposition?
- **How soon will algorithms hit a hard wall without quantum error correction?** Algorithms using more than 28 or 32 or 36 or 40 qubits? For usable, practical algorithms addressing real-world business problems, not specialized computer science experiments.
- **Need for quantum circuit narration.** Ascertaining the net effect of a quantum circuit, or even a short snippet of a quantum circuit, can be tedious, error-prone, or even beyond the reach of many users. An automated tool for *narrating* the incremental and net effect would be very helpful to answer the question *What does this quantum circuit (or snippet or portion of the circuit) do?* Alternatively, a library of common circuit patterns with narrative English descriptions of the effect and purpose of each pattern would accomplish much of the purpose of such an automated tool. Alternatively, authors of quantum circuits could carefully comment their circuits, explaining what each sequence or arrangement of gates is attempting to do — its net effect and purpose. The net effect is that quantum circuits need to be more readable by real people, not simply executable on a quantum computer.
- **What might an ideal high-level quantum programming model look like?** What would the fundamental algorithmic building blocks look like? How does one proceed from problem statement to solution?
- **What might an ideal high-level quantum programming language look like?** The syntax is less interesting than how computations can be expressed in the programming model. Plural — different levels, different needs. Higher level for greater intellectual leverage. Lower level for greater performance and control.
- **Quantum algorithmic breakout.** When will quantum computing hardware and algorithms reach the stage where real-world, practical, production-quality, production-capacity applications can be readily produced without heroic levels of effort? Quantum algorithmic breakout is the critical mass moment when there is a confluence of the hardware for quantum computers and our ability to develop effective algorithms which fully exploit that hardware — when critical mass has been reached for development of relatively sophisticated quantum algorithms suitable for real-world, practical, production-quality, production-capacity applications, so that their development becomes commonplace, even easy, as opposed to the heroic and severely limited efforts which are common today. It will be the moment of liberation from the tedium of agonizingly slow progress. No longer constrained by very limited hardware. No longer constrained by limited algorithmic building blocks. See *What Is Quantum Algorithmic Breakout and When Will It Be Achieved?*
- **Is the inability to directly measure probability amplitudes the fatal flaw of quantum computing?** But what about phase estimation? Takes too many qubits and requires too much coherence. Destructive read.
- **Fractional parallelism.** Unable to do the full desired parallel computation in one step — must break it into pieces with optimization or other processing between the pieces. Blocks for D-Wave. Variational methods. Machine learning?
- **Key to exploiting quantum computing is reduction of algorithmic complexity.** Reducing from an algorithm of higher complexity to a lower level of complexity, primarily from exponential to polynomial. Even reduction from exponential to polynomial may not be sufficient — even linear may be too intensive. Is super-polynomial still too expensive or not?
- **Need for 32 to 64-qubit algorithms.** Smaller algorithms don’t really demonstrate the true potential of quantum computing — dramatic quantum advantage.
32–40 qubits can be classically simulated, so focus there first, with emphasis on scaling from 32 to 64 qubits.**Quantum pseudo-code.**Informal language or even simply commenting conventions which allow algorithms to be expressed as*what*they are doing rather than*how*they are doing it. High level, but detailed enough that an average developer could translate to working quantum circuits. Would be good for published papers.**How might a quantum computer work with color?**In contrast to the various color models used in classical computing. Color of what? Beyond images? Would classical pixels make sense on a quantum computer? Could quantum pixels be entangled, or always entangled?**How would audio be represented and processed in a quantum computer?****Modes of execution of quantum algorithms: for full effect, debugging, to test or evaluate the algorithm.**May affect shot count (circuit repetitions) and how results of multiple executions are processed and characterized.**Is there any theoretical limit to the complexity of a single quantum calculation?**I suspect that there might be. There are undoubtedly practical limits for particular qubit technology implementations, but very possibly there might be theoretical limitations as well. And since every implementation has to use some technology, there is likely some best and most ideal qubit technology whose practical limit is then the theoretical limit.**How long before we can move beyond sub-exponential performance for search algorithms?**Grover’s search algorithm offers only quadratic speedup. 
We need exponential speedup to unlock the full potential of quantum computers.**Potential for algorithm-specific hardware design.**General-purpose hardware has its benefits, but specialization and optimization for high-value applications and algorithms can have significant value, especially while general-purpose hardware capabilities are somewhat limited.**Do the Deutsch–Jozsa and Simon algorithms have any practical application as algorithm design patterns?**Can they be revised or extended to give them any significant utility as design patterns?**Algorithm issues that may crop up as various thresholds of algorithm size (qubits) are exceeded.**Such as granularity of phase and probability amplitude, and quantum Fourier transform (QFT) and quantum phase estimation (QPE), not to mention circuit depth and coherence time. Qubit threshold such as 16, 20, 24, 28, 32, 40, 44, 48, 50, 55, 60, 64, 72, 80, 96, 128, 192, 256, 384, 512, 768, 1024, 2048, 4096, etc.**Special emphasis on NPISQ quantum algorithms.**40-qubit algorithms are very interesting and have great potential, but from a research perspective the intermediate-scale of qubits — 50 to a few hundred — has very great potential, but only if they are near-perfect rather than noisy — NPISQ rather than NISQ. There should be distinct research programs for different ranges of qubit capacity since the nature of the physical limits may vary. Obvious ranges to focus on include 50–96, 96–128, 128–160, 160–256, 256–1K, and >1K. 
For example, it may be possible to perfect algorithms in the 64 to 80-qubit range but not be able to scale those techniques directly to the 150 to 200-qubit range or the 256 to 1K-range.**Algorithm knitting or circuit knitting.**As promoted by IBM, classical code cleverly figures out how to split a larger algorithm or quantum circuit into smaller algorithms or quantum circuits which can then be executed or simulated more efficiently and correctly, especially on near-term quantum computers or simulators with limited qubit capacity, and then classical post-processing code “knits” the results together for the full algorithm or circuit.
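As a point of reference for the transcendental-function question above, this is what the classical infinite-series approach looks like — a truncated Taylor series for sine, whose data-dependent loop is exactly what a fixed quantum circuit cannot directly express. A minimal sketch in plain Python; the function name is my own, not from any library:

```python
import math

def sin_series(x, terms=12):
    """Approximate sin(x) by truncating its Taylor series:
    sin(x) = x - x^3/3! + x^5/5! - ...
    The iterative loop below is the kind of classical control
    flow that has no direct quantum-circuit analog."""
    total, term = 0.0, x  # first term is x^1 / 1!
    for k in range(terms):
        total += term
        # next term: multiply by -x^2 / ((2k+2)(2k+3))
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))
    return total

# The paper also asks whether asin(1.0) * 2 is exactly pi:
# in exact arithmetic yes, since sin(pi/2) = 1.
print(abs(sin_series(1.0) - math.sin(1.0)) < 1e-12)  # → True
```

The same truncation-and-loop structure applies to cosine, exponential, and logarithm, which is why these functions are awkward to express directly as quantum gates.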

My paper on what quantum computers are not suitable for:

*What Can’t a Quantum Computer Compute?*- https://jackkrupansky.medium.com/what-cant-a-quantum-computer-compute-53bf3945d419

For more on working with probabilistic quantum results and development of expectation values, see my informal paper:

*Shots and Circuit Repetitions: Developing the Expectation Value for Results from a Quantum Computer*

For details on issues related to relying on fine granularity of phase, such as for QFT, QPE, and Shor’s factoring algorithm, see my paper:

*Beware of Quantum Algorithms Dependent on Fine Granularity of Phase*- https://jackkrupansky.medium.com/beware-of-quantum-algorithms-dependent-on-fine-granularity-of-phase-525bde2642d8

For my thoughts on quantum error correction, fault tolerance, and logical qubits — and near-perfect qubits:

*Preliminary Thoughts on Fault-Tolerant Quantum Computing, Quantum Error Correction, and Logical Qubits*- https://jackkrupansky.medium.com/preliminary-thoughts-on-fault-tolerant-quantum-computing-quantum-error-correction-and-logical-1f9e3f122e71

# Special emphasis on NPISQ quantum algorithms

40-qubit algorithms are very interesting and have great potential, but from a research perspective the *intermediate-scale* of qubits — 50 to a few hundred — has very great potential, but only if they are near-perfect rather than noisy — NPISQ rather than NISQ. Eventually the focus will be on fault-tolerant logical qubits — FTISQ — but that’s even further out.

There should be distinct research programs for different ranges of qubit capacity since the nature of the physical limits may vary.

Obvious ranges to focus on include 50–96, 96–128, 128–160, 160–256, 256–1K, and >1K.

For example, it may be possible to perfect algorithms in the 64 to 80-qubit range but not be able to scale those techniques directly to the 150 to 200-qubit range or the 256 to 1K-range.

In the short-term, algorithms in the 100 to 160-qubit range are likely to have great practical value but are likely to encounter challenges that might not be so relevant or visible when pursuing algorithms in the 50 to 64 or maybe even to 72 to 80-qubit range.

A specialized research program targeting 256-qubit algorithms (or 150 to 250-qubits) could yield very promising results within a very few years — IBM has their 433-qubit Osprey processor roadmapped for 2022, although that hardware may or may not still offer only noisy qubits (NISQ) rather than the near-perfect qubits (NPISQ) needed for more complex algorithms.

For more on my proposal to upgrade terms from NISQ to NPISQ and beyond, see my paper:

*Beyond NISQ — Terms for Quantum Computers Based on Noisy, Near-perfect, and Fault-tolerant Qubits*- https://jackkrupansky.medium.com/beyond-nisq-terms-for-quantum-computers-based-on-noisy-near-perfect-and-fault-tolerant-qubits-d02049fa4c93

# Quantum algorithms

Generally these areas of research relate to specific algorithms or at least categories of algorithms. Research areas that cross all algorithms should be categorized under *quantum algorithm support*.

- **Algorithms to apply to any of the application categories identified to date.** See my paper below.
- **Generic algorithms.** Don’t relate to any particular type of domain or application.
- **Domain-specific algorithms.**
- **Application-specific algorithms.**
- **Cross-application algorithms.**
- **A better collection of example algorithms.** Which demonstrate quantum parallelism, are clearly scalable, and could achieve dramatic quantum advantage when the needed hardware is available.
- **What’s the most sensible, practical, simple, and scalable Hello World quantum example algorithm?** One which demonstrates quantum parallelism, is clearly scalable, and could achieve dramatic quantum advantage when the needed hardware is available. And whose effect is readily understood by all audiences.
- **Address the dearth of 28-qubit algorithms.** Via simulation, especially if hardware isn’t up to the task.
- **Address the dearth of 32-qubit algorithms.** Via simulation, especially if hardware isn’t up to the task.
- **Address the dearth of 40-qubit algorithms.** Via simulation, especially if hardware isn’t up to the task.
- **Address the dearth of scalable algorithms.** Even if real quantum computers aren’t up to the task, at least simulation scaling up to 40 qubits.
- **64-qubit algorithms.**
- **72-qubit algorithms.**
- **80-qubit algorithms.**
- **128-qubit algorithms.**
- **256-qubit algorithms.**
- **Revisit and review the Quantum Algorithm Zoo to assess its relevance to quantum computing as it exists today and in the future.** Can the algorithms be adapted? Can any of them actually be implemented on near-term hardware? Are they automatically scalable?
- **Reference implementation of Shor’s factoring algorithm.** Scalable. Even if it still won’t run on near-term quantum computers.
- **Focus on near-perfect qubits rather than depending only on full quantum error correction and logical qubits.** See my paper below on *Fault-Tolerant Quantum Computing, Quantum Error Correction, and Logical Qubits* — and near-perfect qubits.
- **Identify a practical application with a relatively simple algorithm which is scalable and could achieve dramatic quantum advantage when the needed hardware is available.** Beyond a mere quantum Hello World program — a practical application.
- **Contemplate algorithms for problems even much harder than Shor’s factoring algorithm.**
- **Applying quantum effects to general AI and simulation of the human brain/mind.**
- **Can the classical outer loop of Shor’s factoring algorithm be fully implemented as a quantum circuit?** It is essentially a loop — a question of the halting problem. Need for repetitions of the quantum inner circuit (order-finding). How to implement GCD as a quantum circuit? Shor’s paper seems to imply that the tail end of order-finding (deducing r from s/r) can be implemented as a quantum circuit (even though the paper presumes a classical implementation), but that section seems like too much of a hand wave.
- **Need a baseline implementation of Shor’s factoring algorithm.** There can be and are many derivative algorithms, implementations, and improvements, but there should be a single agreed-upon starting point, even if suboptimal. A baseline for technical comparison.
- **Need a baseline classical implementation of Shor’s factoring algorithm.** Something to compare against, at least for smaller numbers. Maybe up to 16 or even 32 bits? Simply coding the quantum order-finding subroutine in classical code. Three levels of implementation: 1) simple, naive, single-processor, trying each order candidate sequentially; 2) multi-processor, testing each candidate order in parallel; 3) parallel multiprocessor computations for a batch of trial random numbers in parallel.
- **Will Shor’s factoring algorithm even work at all for larger or even moderate-sized integers?** Due to its reliance on the quantum Fourier transform (QFT), which relies on very fine granularity of phase, which may not be practical for real qubits. See my paper below.
- **Need for exponential speedup for search.** Rather than Grover’s merely quadratic speedup.
- **Computing for gravity-like problems.** Since quantum mechanics doesn’t include gravity and General Relativity.
- **What is the computational complexity of protein folding?** Is it super-exponential, and hence not reducible to polynomial complexity even on a quantum computer?
- **Could a quantum computer compute pi more efficiently than classical methods?** Or would it need n qubits to compute n bits of pi, so that over three trillion qubits would be needed to compute pi to one trillion decimal places? Simply put, is computing pi simply phase estimation where the phase is… pi? But does that essentially mean needing the value of pi to create a phase of one pi? Or is there some way to perform a rotation of exactly one half of a full cycle without knowing pi in advance? Could an incremental approach be taken, one bit at a time? Is asin(1.0) * 2 exactly pi? What is the series expansion for asin? Would quantum parallelism help at all? A quantum computer can’t give a result with more bits of precision than qubits, since it has no I/O or access to mass storage.
- **What is the simplest example of quantum parallelism?**
- **Track progress on semiprime factorization.** Implementations and redesigns of Shor’s factoring algorithm, or alternative algorithms. Anything above factoring 15 or 21? The limit of classical quantum simulation. The limits of current, near-term, and projected real quantum computers. 6-bit, 7-bit, 8-bit, 9–12-bit, 16-bit, 20-bit, 24-bit, 32-bit, 48-bit, 64-bit, 72-bit, 96-bit, 128-bit. The basics, even before large numbers can be factored.
- **Need for practical guidelines for when quantum computing is appropriate for a particular computation.**
- **When will a quantum computer be able to calculate the ground-state energy of aspirin?** C9H8O4. How many qubits? How many gates? What connectivity? How many repetitions (“shots”)? If drug design is a key future for quantum computing, it seems as if aspirin is a good test as a “starting gate” or baseline. Or caffeine. Or sugar. Or salt. Or gasoline. A question of chemical accuracy. What accuracy is really desired — and needed to have a dramatic advantage over classical computers?
- **Could a quantum computer simulate U.S. presidential electoral college results based on popular polls for each state?** Margin of error for each state. Granularity of results for each state — how continuous, how discretely chunked.
- **Simulation of spin lattice and continuous-space problems.**
- **Using quantum computing to analyze, discover, and characterize chemical reactions.** Moving well beyond ground-state energies. Researchers seem too stuck or bogged down in the basics to get to full-blown chemistry simulation. Is this a lack of qubits, a lack of coherence, a lack of qubit connectivity, or a lack of theory in a form that can be directly transformed into quantum algorithms?
- **Need some examples of simulation of physics.** Something that Feynman would appreciate.
- **How might string theory be used to approach the design of quantum algorithms?**
- **How might quantum computing be applied to string theory?** After all, a major part of the motivation for quantum computing was simulation of physics.
- **Do the Deutsch–Jozsa and Simon algorithms have any practical application?** Can they be revised or extended to give them any significant utility? They are traditional examples of basic quantum algorithms, but examples without practical value as they are. Can they be turned into useful design patterns?
- **Need for reference implementations of all quantum algorithms.** Production algorithms may require more sophisticated and optimized implementations, but published algorithms should each provide a freely-accessible and clearly-documented *reference implementation* (on GitHub) which production designers and developers can use as a *baseline reference* for their own work.
- **Opportunities for quantum-inspired classical algorithms.** Technically they are not quantum algorithms per se, but they are functionally equivalent, just with classical implementations.
- **Cybersecurity.** Quantum algorithms which aid cybersecurity, as well as quantum algorithms which may interfere with or undermine cybersecurity. See also the references to Shor’s factoring algorithm.
- **Post-quantum cryptography.** Cryptography in light of the risk that maybe some day Shor’s factoring algorithm will become practical. The technical alternatives, today and tomorrow. What more can be done?
- **Quantum gaming.** Quantum algorithms which could dramatically enhance computer gaming.
- **Quantum hacking.** Quantum algorithms which could dramatically enhance computer hacking. This could present significant cybersecurity risks for criminal or state-sponsored hacking, but also significant opportunities for national security cyberwarfare.
- **Cyberwarfare.** Both defensive and offensive capabilities at the national security level. Can be both tactical and strategic. Can be a passive deterrent as well as active measures.
- **Machine-assisted scientific discovery.** Unsupervised computational discovery and learning of physical laws from experimental data without prior knowledge of the systems in question. See the paper *Learning quantum dynamics with latent neural ODEs* by Choi, Flam-Shepherd, Kyaw, and Aspuru-Guzik.
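To make the “baseline classical implementation of Shor’s factoring algorithm” item concrete, here is a minimal sketch of level 1 — simple, naive, single-processor, with the quantum order-finding subroutine coded classically by sequential trial. All names are my own hypothetical choices; this is an illustration of the structure, not a reference implementation:

```python
import math
import random

def classical_order(a, n):
    """Naive classical stand-in for quantum order-finding:
    the smallest r > 0 with a^r mod n == 1, found by sequential
    trial. Requires gcd(a, n) == 1 so that the order exists."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_baseline(n, max_attempts=50):
    """Classical skeleton of Shor's outer loop: pick a random a,
    find the order r of a mod n, then derive factors via
    gcd(a^(r/2) +/- 1, n). Only the order-finding step would be
    quantum in the real algorithm."""
    for _ in range(max_attempts):
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g, n // g   # lucky: a already shares a factor
        r = classical_order(a, n)
        if r % 2 != 0:
            continue           # odd order: try another a
        x = pow(a, r // 2, n)
        if x == n - 1:
            continue           # trivial square root: try another a
        p = math.gcd(x - 1, n)
        if 1 < p < n:
            return p, n // p
    return None

print(shor_classical_baseline(15))  # e.g. (3, 5) or (5, 3)
```

The sequential `while` loop in `classical_order` is the part that makes this exponential classically; levels 2 and 3 from the item above would parallelize that loop across candidate orders and across trial values of a.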

For reference see my paper on applications and application categories:

*What Applications Are Suitable for a Quantum Computer?*- https://jackkrupansky.medium.com/what-applications-are-suitable-for-a-quantum-computer-5584ef62c38a

As well as my paper on what quantum computers are not suitable for:

*What Can’t a Quantum Computer Compute?*- https://jackkrupansky.medium.com/what-cant-a-quantum-computer-compute-53bf3945d419

For details on issues related to relying on fine granularity of phase, see my paper:

*Beware of Quantum Algorithms Dependent on Fine Granularity of Phase* - https://jackkrupansky.medium.com/beware-of-quantum-algorithms-dependent-on-fine-granularity-of-phase-525bde2642d8

For my thoughts on quantum error correction, fault tolerance, and logical qubits — and near-perfect qubits:

*Preliminary Thoughts on Fault-Tolerant Quantum Computing, Quantum Error Correction, and Logical Qubits*- https://jackkrupansky.medium.com/preliminary-thoughts-on-fault-tolerant-quantum-computing-quantum-error-correction-and-logical-1f9e3f122e71

# Quantum application support

Research areas relevant across applications generally, though some may be specific to particular application categories.

- **Overall focus on toolkits, libraries, frameworks, and packaged solutions for quantum computing applications.**
- **Application-specific frameworks.**
- **Application category-specific frameworks.**
- **Application frameworks for the seven main application categories.** Simulating physics, chemistry, material design, drug design, business optimization, finance, and machine learning. Note IBM’s recent introduction of *Qiskit Nature*.
- **Application-specific support libraries.**
- **Application category-specific support libraries.**
- **Example applications.** Graduated examples, from very simple to more complex.
- **Configurable packaged quantum solutions.** Prewritten code — complete applications — addressing particular niches of quantum computing applications. Users never see the actual quantum algorithms. Instead, they code high-level configuration parameters which are input to the solution software, which then generates the optimal quantum algorithm and even the full quantum application.
- **Focus research to deliver support for three stages of deployment for quantum computing: The ENIAC Moment, configurable packaged quantum solutions, and The FORTRAN Moment.**
- **Quantum application debugging capabilities.** Part classical debugging, part debugging of quantum algorithms.
- **General support for production-scale production-quality quantum applications and algorithms, whether larger amounts of data, larger input parameters, or larger solution spaces.**
- **General intuition and guidance for how to solve a real-world problem on a quantum computer.** No clarity whatsoever, at present. The best I can say currently is to convert your real-world problem to a physics problem, but even that is not terribly helpful — even for solving physics problems. In any case, this is an urgent need.
- **Need for technical guidelines for when quantum computing is appropriate for an application.** See my papers below.
- **Should quantum applications even need to be aware of qubits, even logical qubits, or would higher-level abstractions be much more appropriate for an application-level quantum programming model?** Classical programming doesn’t require knowledge of bits: integers, real numbers, floating point, Booleans — logical true and false, binary, but not necessarily implemented as a single bit — text, strings, characters, character codes, structures, objects, arrays, trees, maps, graphs, media — audio, video — structured data, semi-structured data.
- **What practical, production-scale real-world problems can quantum computing solve — and deliver dramatic quantum advantage and dramatic real business value?** And be clearly beyond what classical computing can deliver.
- **Not every application which overwhelms classical computing is necessarily automatically an application for quantum computing.** Need specific or even general criteria. Are we really using classical computers as effectively as we could? What applications might overwhelm even a quantum computer? See my papers below.
- **What quantum volume is required for practical quantum computing applications?** Maybe 32 qubits by 32 gates = 2³² = 4 billion? 40 x 40 = 2⁴⁰. 50 x 50 = 2⁵⁰. 60 x 60 = 2⁶⁰.
- **Need for practical guidelines for when quantum computing is appropriate for a particular application.** See my papers below.
- **Application-specific benchmarking and application category-specific benchmarking.**
- **The hidden shift problem.** What exactly is it useful for? Related to the hidden subgroup problem (HSP). Given two functions f and g such that there is a shift s for which f(x) = g(x + s) for all x, the problem is to find s.
- **Will quantum computing ever… cure cancer, solve inequality, cure poverty, cure the climate crisis, or enable nuclear fusion as an energy source?** How can we properly characterize what quantum computing can actually do — what problems it can address or solve? See my papers below.
- **Potential for application domain-specific hardware design.** And possibly even for particular applications. General-purpose hardware has its benefits, but specialization and optimization for high-value applications can have significant value, especially while general-purpose hardware capabilities are somewhat limited.
- **Post-quantum cryptography.** Cryptography in light of the risk that maybe some day Shor’s factoring algorithm will become practical. The technical alternatives, today and tomorrow. What more can be done?
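To pin down the hidden-shift definition above, here is a toy classical brute-force search — a sketch with hypothetical names, just to fix the problem statement; a quantum hidden-shift algorithm would aim to find s with far fewer evaluations of f and g:

```python
def find_shift(f, g, n):
    """Brute-force the hidden shift: return s in [0, n) such that
    f(x) == g((x + s) % n) for all x, or None if no such s exists.
    Classically this costs O(n^2) evaluations of f and g."""
    for s in range(n):
        if all(f(x) == g((x + s) % n) for x in range(n)):
            return s
    return None

# Toy instance over Z_8 with a hidden shift of 3.
n = 8
table = [0, 1, 1, 0, 1, 0, 0, 1]   # arbitrary Boolean function f
f = lambda x: table[x % n]
g = lambda x: table[(x - 3) % n]   # g(x) = f(x - 3), so f(x) = g(x + 3)
print(find_shift(f, g, n))  # → 3
```

The quadratic classical cost here is the structure that makes the problem interesting as a candidate for quantum speedup.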

For reference see my paper on applications and application categories:

*What Applications Are Suitable for a Quantum Computer?*- https://jackkrupansky.medium.com/what-applications-are-suitable-for-a-quantum-computer-5584ef62c38a

As well as my paper on what quantum computers are not suitable for:

*What Can’t a Quantum Computer Compute?*- https://jackkrupansky.medium.com/what-cant-a-quantum-computer-compute-53bf3945d419

# Quantum applications

Research for specific applications.

- **Generic applications.** Don’t relate to any particular type of domain.
- **Domain-specific applications.**
- **Cross-domain applications.**
- **Better characterization of what applications are and are not suitable for quantum computers.** It’s not about applications per se, but forms of computations. See my papers below.
- **Roadmap for molecular modeling of chemistry.** Define a wide range of molecules, from simplest to most complex, as the milestones on the roadmap — such as salt, caffeine, sugar, gasoline, etc. Estimate the qubits, circuit depth, coherence, and qubit fidelity needed for each milestone.
- **Quantum computing for health care.** What are the possibilities — from a realistic perspective, what specific approaches?
- **What application is likely to reach dramatic quantum advantage first?** Or at least which application category? I’m personally rooting for quantum computational chemistry. Business process optimization would seem to be a slam dunk. What factors will make the choice? Could the ability to tolerate an approximate solution be a key factor?
- **What application is likely to reach The ENIAC Moment first?** This could be the same as *What application is likely to reach dramatic quantum advantage first?*, but a true dramatic quantum advantage is not required provided that enough of a *partial quantum advantage* is achieved that it really impresses people, and is actually useful.
- **Would moonshot projects or Manhattan Project-style projects really help quantum computing make a quantum leap to its full potential?** Timing is everything — doing either Project Apollo or the Manhattan Project even a mere five years earlier would have been an absolute disaster — a critical mass of the critical technologies, coupled with a critical mass of resources and a critical mass of commitment, is essential. Quantum computing is still basically stumbling around in the dark. Besides, look how well classical computing developed without a moonshot project. But also look at how the long stream of critical developments occurred over an extended period of time.
- **If we had a full-scale quantum computer today, how could it help us fight the coronavirus outbreak?** Why not? Capacity? Coherence? Error rate? How might a much better quantum computer have helped? What algorithms?
- **What are some examples of what a quantum computer could do given Hubble space telescope data (or data from other space telescopes)?** Would it work best with the raw data, or would some preprocessing be needed? Would one or more quantum computers be needed for all processing of space telescope data?
- **Could you implement Mathematica on a quantum computer?**
- **How much of Mathematica could you implement on a quantum computer?**
- **Need for reference implementations of all quantum applications.** Production application deployment may require more sophisticated and optimized implementations, but published applications should each provide a freely-accessible and clearly-documented *reference implementation* (on GitHub) which production designers and developers can use as a *baseline reference* for their own work.
- **Quantum machine learning.**
- **Artificial intelligence.** Besides machine learning.
- **Quantum general artificial intelligence.** Simulating higher-order human general intelligence. See the separate section *Quantum general artificial intelligence*.
- **Quantum computational chemistry database.** Calculate as much as possible about atoms, molecules, and chemical reactions, and store this information in a database so that subsequent quantum computational chemistry computations can utilize this precomputed information to simplify even more complex quantum chemistry computations.
- **Cybersecurity.** Quantum applications which aid cybersecurity, as well as quantum applications which may interfere with or undermine cybersecurity. See also the references to Shor’s factoring algorithm.
- **Quantum gaming.** Quantum applications which could dramatically enhance computer gaming.
- **Quantum hacking.** Quantum applications which could dramatically enhance computer hacking. This could present significant cybersecurity risks for criminal or state-sponsored hacking, but also significant opportunities for national security cyberwarfare.
- **Cyberwarfare.** Both defensive and offensive capabilities at the national security level. Can be both tactical and strategic. Can be a passive deterrent as well as active measures.
- **Machine-assisted scientific discovery.** Unsupervised computational discovery and learning of physical laws from experimental data without prior knowledge of the systems in question. See the paper *Learning quantum dynamics with latent neural ODEs* by Choi, Flam-Shepherd, Kyaw, and Aspuru-Guzik.

For reference see my paper on applications and application categories:

*What Applications Are Suitable for a Quantum Computer?*- https://jackkrupansky.medium.com/what-applications-are-suitable-for-a-quantum-computer-5584ef62c38a

As well as my paper on what quantum computers are not suitable for:

*What Can’t a Quantum Computer Compute?*- https://jackkrupansky.medium.com/what-cant-a-quantum-computer-compute-53bf3945d419

# Quantum application solutions

The essence of a *solution* is that no coding is required — everything is already there, it’s a complete solution to an application problem. All that is needed is to supply the input data and configuration parameters. At least that’s the ideal. In practice, some coding may be required.

- **Configurable packaged quantum solutions.** Configurable prepackaged quantum algorithms and applications which can be tailored and customized via configuration parameters rather than by directly modifying code or quantum algorithms.
- **Conceptualization of configurable packaged quantum solutions.** Architecture and framework, so everybody doesn’t need to reinvent the quantum wheel.
- **OpenFermion — is it a toolkit, framework, or solution?** Seems like a toolkit, but maybe with some framework features. Definitely not a solution. Not strictly an algorithmic building block, but it may have algorithmic building blocks under the hood, and may offer algorithmic building blocks to users.
- **Potential for a cross between OpenFermion and Mathematica.** A configurable packaged quantum solution. Easy to set up molecules in terms that a scientist understands. No need to develop quantum circuits. Easy to interpret results.

# Quantum general artificial intelligence

Not merely machine learning (ML) or other forms of traditional AI or even quantum machine learning (QML), but quantum applied to the problem of general AI — simulating the higher-order capabilities of the human mind.

**Will quantum computing facilitate human-level reasoning for AI?****Will artificial general intelligence (AGI) require quantum computing?**Quantum effects. Turing u-machine — functions which are not computable by a classical Turing machine.**Are current conceptions of quantum computing and programming models sufficient for applying quantum effects to general AI and simulation of the human brain/mind?****Consciousness based on quantum effects.**Human mind. Animal brains. AI systems. Robots. Still a speculative research area.**Symbolic quantum computing.**For quantum AI. A good use case for more than a mega-qubit. But, what would it really look like? How to map symbols to qubits?**What would robots be able to do if they had access to a quantum computer?****What would robots be able to do if they were based on a quantum computer?**

# Quantum advantage and quantum supremacy

General issues relating to pursuing and achieving quantum advantage.

**Fully characterize quantum advantage.****Methods and techniques for pursuing and achieving quantum advantage.****Automated assessment of quantum advantage.****Automated measurement of quantum advantage.****Can a NISQ quantum computer ever achieve dramatic quantum advantage?**Even if there is some advantage, is there any prospect of it being dramatic? Or is the noisiness too much of an impediment? Depends on how strict or loose quantum advantage is defined, but for purposes here, strict, true, dramatic quantum advantage is the target. What criteria would have to be met to achieve true, dramatic quantum advantage using a NISQ device? Maybe based on the product of qubit fidelity and circuit depth — shallow circuits allow more noise, deep circuits allow much less noise. Presume 64 qubits in a single Hadamard transform as the target algorithm width. 50 qubits might be okay, or maybe 53–55 as the minimum.**Potential for fractional quantum advantage.**Even if full dramatic quantum advantage cannot be attained. See my paper below.**Quantum supremacy for practical applications.**Achieving quantum supremacy for contrived computer science experiments is not so relevant to real-world applications.
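The fidelity-times-depth intuition above can be made concrete with a back-of-the-envelope estimate: if each gate succeeds with probability f, a circuit of d gates runs error-free with probability roughly f^d (assuming independent, uniform gate errors — a simplification, not a full noise model).

```python
# Rough model: probability that a circuit of d gates runs with no gate error,
# assuming independent errors and uniform per-gate fidelity f. This is a
# back-of-the-envelope illustration, not a full noise model.

def circuit_success_probability(fidelity: float, gate_count: int) -> float:
    return fidelity ** gate_count

# Three nines of gate fidelity (0.999) across circuits of increasing depth:
for depth in (10, 100, 1000):
    p = circuit_success_probability(0.999, depth)
    print(f"depth {depth:>5}: success ~ {p:.3f}")

# A shallow 10-gate circuit survives ~99% of the time, but at 1,000 gates
# the success rate falls to roughly 37% -- deep circuits demand far better qubits.
```

This is exactly why shallow circuits tolerate more noise and deep circuits tolerate much less: success probability decays exponentially in depth.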

Some of my papers on quantum advantage:

*What Is Quantum Advantage and What Is Quantum Supremacy?* - https://medium.com/@jackkrupansky/what-is-quantum-advantage-and-what-is-quantum-supremacy-3e63d7c18f5b

*What Is the Quantum Advantage of Your Quantum Algorithm?* - https://medium.com/@jackkrupansky/what-is-the-quantum-advantage-of-your-quantum-algorithm-8f481aefe8d0

*What Is Dramatic Quantum Advantage?* - https://jackkrupansky.medium.com/what-is-dramatic-quantum-advantage-e21b5ffce48c

*Fractional Quantum Advantage — Stepping Stones to Dramatic Quantum Advantage* - https://jackkrupansky.medium.com/fractional-quantum-advantage-stepping-stones-to-dramatic-quantum-advantage-6c8014700c61

# Other areas of QSTEM research

**Need a comprehensive glossary for quantum computing terminology.**And for quantum information science in general, and quantum mechanics as well. I started one — over 3,500 entries — but it still needs a lot of work, and a proper home.**Need roadmaps for all categories of research.**There should be some sense of ordering of the research efforts. See section below.**Need reverse roadmaps for all categories of research.**Start with desired end goals (or intermediate goals) and then detail the precursor work that must be accomplished to enable that end goal. See section below.**Personas, use cases, and access patterns for quantum computing.**Who, what, and how for the use of quantum computers. Including managers, executives, non-quantum technical staff, IT service staff, scientists, engineers, operations staff (e.g., those seeking optimization), marketing, sales, technical support, etc. I’ve written such a categorization for databases and cybersecurity; every field could use one. Important for focusing on audiences for writing, products, and services. And for judging the impact of research.**Requirements for Hello World for quantum computing.**A single X or H gate might suffice, but my inclination is that it should be the simplest program which shows a practical example of quantum parallelism with results that most people can understand. Something functional, such as a 2-, 3-, or 4-qubit quantum Fourier transform (QFT) — preferably something that most people can relate to and say “*Wow, that’s cool!*” Preferably, effectively uses quantum parallelism to achieve quantum advantage. Should attempt to show quantum advantage, but that could take 50 qubits or more. Maybe a graduated collection of Hello World programs is best. Or a scalable program that would work with four to eight qubits but scale to 32, 40, 50, 64, 72, 80, 96, and more qubits.**Ontology for quantum computing.**Ontology of a quantum computation. 
What are all of the elements, all of the pieces, all of the moving parts?**Fundamental organizing principles and key insights for quantum computing.****Clarify and distinguish qubit as information vs. qubit as a storage and processing hardware device.**Comparable to the distinction between information, representation, and storage in classical computing.**Quantum information theory.**A la Shannon information theory. But what is it really?**Formalization and terminology for quantum computing which don’t rely on physics, quantum mechanics, and linear algebra.**Higher-level abstractions, comparable to some degree with classical computing abstractions. How should phase be modeled?**What are the theoretical limits of quantum computing?****What are the practical limits of quantum computing?****Is a quantum computer merely a quantum calculator?****Beware of commercial projects which rely on additional research — best to rely primarily on off the shelf technology.****Is there any potential for quantum computing with Big Data?****Little data — large solution space.**A general model for processing data on a quantum computer. See my paper below. When might it not be the best choice?**How much data can you put into a quantum computer?**Input data must be represented in the gate structure of the quantum circuit. Intermediate results can be computed on a colossal scale using quantum parallelism. But the real bottom line is that only a very modest amount of data can be input directly into a quantum computer, and only a very modest amount of data can be returned as the results of the quantum computation. See my little data with a large solution space paper below.**How much data can a quantum computer process?**Not really suited for Big Data per se in a classical sense (Volume, Velocity, and Variety). As opposed to the number of operations per second per qubit. Measure in bits per qubit per second of input processed. Aggregate for number of qubits. 
Focus on quantum states per second since that’s the source for quantum advantage — 2^n product states for n qubits. Must take shot count or circuit repetitions into account as well. See my little data with a large solution space paper below.**Key questions about quantum advantage to ask when reviewing algorithms, applications, papers, projects, and products.****Proposed new role for NIST.**Promote and support research in open source quantum algorithms, algorithmic building blocks, quantum applications, and quantum reference data.**Security of quantum computers.**What might it mean to hack a quantum computer? Other than simply access controls and usage patterns for network and cloud-based quantum computers. Nothing quantum per se — a quantum computer is still a computer with all of the security issues of classical computers.**Will quantum computing mean the end of strong encryption?**When, if ever, will quantum computers break strong encryption for 2048 and 4096-bit public keys? Will Shor’s factoring algorithm actually work?**What compute-intensive problems do not require a quantum computer?**Any problem with a classical computing solution whose algorithmic complexity is less than exponential (e.g., polynomial). And maybe somewhat less than that.**Replace Bloch sphere with a better visualization for entangled product states, as well as for superposition in terms of visualizing probability amplitudes.****Key takeaways from QC40: Physics of Computation Conference — 40th Anniversary.**Conference planned for May 6, 2021. See “*QC40: Physics of Computation Conference — 40th Anniversary — Celebrate 40 years of quantum — Keynotes, contributed talks, and more bridging the 1981 Physics of Computation conference with current research*.”**Open source should be the rule for quantum computing.**Open everything. 
Okay to charge for consulting services and premium or production-scale production-quality access to hardware, but all code, algorithms, libraries, papers, tutorials, documentation, and training materials should be strictly open source and freely available online. Free access to cloud-based hardware for education, training, and experimentation.**Criteria for when it’s time to start getting ready for quantum computing.**Distinguish between those organizations which will likely be developing their own algorithms and those organizations looking for off the shelf packaged solutions. Clear distinction from organizations intending to be vendors in the quantum computing sector. And organizations seeking to offer quantum computing-related consulting services.**Proof points for quantum computing to achieve widespread adoption.**Milestones to quantum advantage and beyond. The ENIAC Moment and The FORTRAN Moment would be two. Various numbers of qubits would be others — 72, 80, 96, 128, 192, 256, 512, 1024, etc. Various nines of qubit fidelity would be others — two nines, three nines, four nines, five nines, six nines, etc. Full quantum error correction would be another. Various circuit depths of coherence would be others — 10, 20, 50, 75, 100, 150, 250, 500, 1,000 gates, etc. Fractions of full, dramatic quantum advantage would be others — 10%, 25%, 50%, 75%, 90%, etc., although different application categories might have different requirements.**Does Shor’s factoring algorithm break bitcoin, cryptocurrencies in general, distributed ledger technology, or blockchain in general?**No, since they rely on one-way hashes? Ahhh… mining might get “broken” (made too easy) by being accelerated? But maybe Shor’s factoring algorithm might not be feasible for very large keys due to the lack of fine granularity of quantum phase. Or at least not for quite a few years.**Intellectual property in quantum computing.**Who has what patents — IBM, Microsoft, Intel, Google, Rigetti, IonQ, Honeywell? 
How important are the patents? How freely and cheaply are patents licensed? Who has what copyrights — code, algorithms, books, papers, training materials? Benefits and roles of open source — pure free licensing, semi-limited licensing.**Criteria for when quantum computing is appropriate for financial applications.**Generally, financial applications, especially transactions, must be to-the-penny accurate. Generally, they must be deterministic. And generally they must either complete or not complete — never “maybe it completed.” Generally need audit trails. Generally records must remain intact indefinitely, can’t dissipate unexpectedly or… ever. So, where does quantum fit into finance?**How to get quantum computing beyond the mere laboratory curiosity stage.**Actually able to deploy production-scale production-quality practical real-world applications. See my paper below.**What might post-quantum computing look like?**Speculation. Just for fun. Or maybe a great science fiction story. Make use of wormholes for networking? Make use of time travel (e.g., send data into the past, compute, and then it is complete in the present, or retrieve data or results from the future)? Spiritual computing? Extrasensory perception for UX using quantum sensing? Or, might the purported Singularity technically be post-quantum computing? Maybe qubits which can do everything a neuron can? True, human-level AI?**Breakthroughs vs. incremental progress.**Will quantum error correction be enough? Are more dramatic hardware breakthroughs needed? Is a dramatic algorithmic breakthrough needed? Need a much higher-level programming model, with logical operations. What is needed for The ENIAC Moment, or The FORTRAN Moment?**What technological, theoretical, and physical issues are constraining quantum computing at this stage?****Standardized syllabus for introduction to quantum computing.**But there may be multiple levels of introduction. Various end states. General awareness, no technical depth. 
Ready for sophisticated applications. Each application area has different depth requirements. Also introduction to quantum and quantum information science as well as quantum computing itself. Different programming languages and runtime environments.**Three stages of deployment for quantum computing: The ENIAC Moment, configurable packaged quantum solutions, and The FORTRAN Moment.**The initial stage of deployment — The ENIAC Moment — for quantum computing solutions relies on super-elite STEM professionals using a wide range of tricks and supreme cleverness to achieve solutions. For example, manual error mitigation. The second stage — configurable packaged quantum solutions — also relies on similar super-elite professionals to create frameworks for solutions which can then be configured by non-elite professionals to achieve solutions. Those non-elite professionals are able to prepare their domain-specific input data in a convenient form compatible with their non-elite capabilities, but do not have to comprehend or even touch the underlying quantum algorithms or code. The final stage — The FORTRAN Moment — relies on a much more advanced and high-level programming model, application frameworks, and libraries, as well as logical qubits based on full, automatic, and transparent quantum error correction to enable non-elite professionals to develop solutions from scratch without direct involvement of or dependence on super-elite professionals.**Customer journey maps for quantum computing.**How does that fit in with personas, use cases, and access patterns?**NISQ era.**Does it have any real practical consequences, or is it simply an inconsequential stepping stone to get to a proverbial *post-NISQ era* where there actually are dramatic practical consequences? 
AFAICT, NISQ alone won’t enable production-scale production-quality practical applications with dramatic quantum advantage.**Suggested metadata for quantum computing papers.**Qubit count, total circuit size, maximum circuit depth. Quantum volume required. Estimated Big-O — computational complexity. Estimated quantum advantage. Citation(s) for comparable classical algorithms and estimate or quantify specific or approximate quantum advantage over classical solutions. Specify circuit repetitions (shots) required for each run of the algorithm — specify any criteria, metric, or formula used to define the repetition count, and whether parameterized by particular input values for each run. NISQ vs. FTQC for production applications. Production vs. proof of concept vs. experiment. Detail scalability of current implementation. Connectivity requirements — degree of SWAPs needed to overcome limited connectivity. SQUID vs. ion trap vs. suitable for both. It is too painful and tedious — and frequently incomplete — to tease the information out of the text of most papers.**What can we learn from the timeline of early classical computers?**Mistakes to avoid. Opportunities missed. Failure to push more aggressively with research. Failures due to engineering getting ahead of the required research. Balancing the benefits of incremental advances and bold, dramatic leaps. See my classical computing timeline paper below.**Operational, planning, and other IT issues.**Highlight technical and logistical issues that operational and IT staff will have to deal with.**Evolving the industry to the stage where a quantum computer can be built by almost anyone using off the shelf components.**As was the case with mainframe computers in the 1950’s and 1960’s with vacuum tubes, discrete transistors, and then integrated circuits, then minicomputers in the 1960’s, then microcomputers in the 1970’s, and then personal computers in the 1980’s. Who will be the Intel of quantum computing in ten to fifteen years? 
Intel?!**Development of international and industry standards for all areas of QSTEM.**Interoperability and portability are imperative. Proprietary vendor lock-in is to be avoided at all costs. The earlier that standards can be developed, the better, although evolution needs to be facilitated as the science and technology evolve.**Could classical computing be extended by borrowing from quantum computing?**Could classical logic gates be made much smaller and faster but still fully deterministic using cryogenics and/or isolation techniques borrowed from quantum computing? In other words, might everything needed to perfect quantum computing be applied to classical computing as well? Including alternative architectures for processing units, buses, memories, and communication.**Quantum gaming.**Quantum algorithms and quantum applications which could dramatically enhance computer gaming.**Quantum hacking.**Quantum algorithms and quantum applications which could dramatically enhance computer hacking. This could present significant cybersecurity risks for criminal or state-sponsored hacking, but also present significant opportunities for national security cyberwarfare.**Cybersecurity.**Quantum algorithms and applications which aid cybersecurity, as well as quantum algorithms which may interfere with or undermine cybersecurity. See also references to Shor’s factoring algorithm. See the separate *Cybersecurity* section for the various aspects.**Cyberwarfare.**Both defensive and offensive capabilities at the national security level. Can be both tactical and strategic. Can be a passive deterrent as well as active measures.**Machine-assisted scientific discovery.**Unsupervised computational discovery and learning of physical laws from experimental data without prior knowledge of the systems in question. See the paper *Learning quantum dynamics with latent neural ODEs* by Choi, Flam-Shepherd, Kyaw, and Aspuru-Guzik.**Frictionless quantum computing.**AKA frictionless quantum software. 
IBM’s metaphor for making it much easier to develop quantum applications: “*our vision for creating a programming environment where the intricacies of the underlying technology are no longer a concern to users.*” In my terms, higher-level programming models, higher-level algorithmic building blocks, high-level programming languages, The FORTRAN Moment, metaphors, design patterns, application frameworks, libraries, configurable packaged quantum solutions, application problem statement languages, etc.
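Returning to the Hello World requirements discussed above — the simplest program showing a practical example of quantum parallelism — here is a hedged, pure-Python statevector sketch that assumes no quantum SDK at all: applying a Hadamard gate to each of n qubits turns |00…0⟩ into an equal superposition over all 2^n basis states.

```python
# A candidate "Hello World" sketch: a tiny pure-Python statevector simulation
# (no quantum SDK assumed). A Hadamard on each of n qubits turns |00...0> into
# an equal superposition over all 2^n basis states -- the simplest visible
# example of quantum parallelism. Illustrative only; a real program would
# also measure and report counts.

from math import sqrt

def apply_hadamard(state, target):
    """Apply a Hadamard gate to the target qubit of a statevector."""
    h = 1 / sqrt(2)
    new = state[:]
    for i in range(len(state)):
        if (i >> target) & 1 == 0:       # visit each pair of basis states once
            j = i | (1 << target)        # partner state with target bit set
            a, b = state[i], state[j]
            new[i] = h * (a + b)
            new[j] = h * (a - b)
    return new

n = 3
state = [0.0] * (2 ** n)
state[0] = 1.0                           # start in |000>
for qubit in range(n):
    state = apply_hadamard(state, qubit)

# All 8 amplitudes are now 1/sqrt(8), an equal superposition over 2^3 states.
print([round(a, 4) for a in state])
```

A graduated Hello World collection, as suggested above, could start here and scale the same loop from 3 qubits toward the limits of classical simulation.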

For the general model of processing data on a quantum computer:

*Little Data With a Big Solution Space — the Sweet Spot for Quantum Computing* - https://jackkrupansky.medium.com/little-data-with-a-big-solution-space-the-sweet-spot-for-quantum-computing-b6c6d76ada6d
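The arithmetic behind the little-data/large-solution-space model is worth a quick sketch: n qubits span 2^n basis states, but each run (shot) collapses to a single n-bit measurement result, so the total classical output is only n bits times the shot count. The numbers below are purely illustrative, not claims about any particular machine.

```python
# Back-of-the-envelope numbers behind "little data with a large solution
# space": n qubits span 2^n basis states, but each shot returns only n
# classical bits, so total classical output is n bits times the shot count.

def solution_space_states(n_qubits: int) -> int:
    """Number of basis states an n-qubit register can superpose over."""
    return 2 ** n_qubits

def classical_bits_out(n_qubits: int, shots: int) -> int:
    """Total classical bits returned by measuring all qubits over many shots."""
    return n_qubits * shots

n, shots = 64, 1000
print(f"{n} qubits span {solution_space_states(n):.3e} basis states,")
print(f"but {shots} shots return only {classical_bits_out(n, shots):,} classical bits")
```

The colossal intermediate solution space is squeezed through a very narrow classical input/output channel — which is why only "little data" problems fit.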

For reference, my summary of the timeline of early classical computers:

*Timeline of Early Classical Computers* - https://jackkrupansky.medium.com/timeline-of-early-classical-computers-9539d5bf82d8

For my thoughts on getting beyond the stage where a quantum computer is still a mere laboratory curiosity:

*When Will Quantum Computing Advance Beyond Mere Laboratory Curiosity?* - https://jackkrupansky.medium.com/when-will-quantum-computing-advance-beyond-mere-laboratory-curiosity-2e1b88329136

# Need for a comprehensive glossary for quantum computing

I’ve personally accumulated a rough glossary of terms relevant to quantum computing, quantum information science in general, and quantum mechanics as well. And it covers a fair fraction of classical computing as well.

As of this writing, it has over 3,500 entries, although quite a few are TBD — To Be Determined.

My informal project needs to be formalized and given a proper home.

Currently, it is here — in six sections since it is too large for a single Medium document:

*Quantum Computing Glossary* - https://jackkrupansky.medium.com/quantum-computing-glossary-introduction-2414aa510854

# Need roadmaps for all areas of QSTEM research

Blindly pursuing research in a haphazard manner is likely to lead to suboptimal results. Rather, there should be some sense of ordering of the research efforts. The overall goal is practical quantum computing. The big question is how to get there. That’s where the overall roadmap comes into play. And then roadmaps for the individual research areas.

Researchers and their supervisors should strive to create and adhere to research roadmaps.

That said, unstructured, undirected *pure research* without particular end goals is still reasonable, at least to some extent. Discovery of the unknown is still an urgent priority.

# Need reverse roadmaps for all areas of QSTEM research

A roadmap guides forward progress. A *reverse roadmap* starts not at the beginning, but with some desired end goal or intermediate goal and then details the precursor work that must be accomplished to enable that end goal.

Both normal roadmaps and reverse roadmaps are quite useful for planning and performing research.

# Need roadmaps for all application domains

Each of the major application domains should have a roadmap for algorithms and applications, detailing both functional milestones and complexity milestones.

For example, for quantum computational chemistry, list example molecules of increasing complexity.

Roadmaps with clear milestones are needed for all seven of the main application domains for quantum computing:

**Simulating physics.****Simulating chemistry.****Material design.****Drug design.****Business optimization.****Finance.****Machine learning.**

In addition to guiding research for the application domains, this would provide guidance to hardware researchers as to what capabilities their hardware would need to achieve over a range of timeframes.
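As an illustration of the complexity milestones such a domain roadmap might contain, here is a hedged sketch for quantum computational chemistry. The molecules are standard small benchmarks, and the qubit counts are rough spin-orbital counts in a minimal (STO-3G) basis — illustrative assumptions, not authoritative hardware requirements.

```python
# Illustrative sketch of a domain roadmap for quantum computational chemistry:
# example molecules of increasing complexity, per the suggestion above. Qubit
# counts are rough minimal-basis spin-orbital counts, stated as illustrative
# assumptions rather than authoritative requirements.

roadmap = [
    # (milestone molecule, approximate qubits, note)
    ("H2",   4,  "textbook starting point"),
    ("LiH",  12, "small heteronuclear molecule"),
    ("BeH2", 14, "early hardware demonstration territory"),
    ("H2O",  14, "first chemically familiar target"),
    ("N2",   20, "strong-correlation stress test"),
]

for molecule, qubits, note in roadmap:
    print(f"{molecule:<5} ~{qubits:>2} qubits  ({note})")
```

Pairing each functional milestone with an approximate resource estimate like this is what would let hardware researchers read application roadmaps as capability targets over a range of timeframes.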

# Preprints of all papers on arXiv

It almost goes without saying these days, but the expectation is that preprints of all published research papers should be posted on arXiv.

# Open access to all research papers

No QSTEM research paper or support materials should require a paywall or even registration.

Posting preprints on arXiv is one solution, the preferred solution, but if that approach is not used, papers should at least be published as *open access* — freely available to all parties, online. Thankfully, that’s happening a lot these days, but I still occasionally run into paywalls for research papers that I wish to read.

# All code, test programs, test data, sample test output, and full documentation on GitHub

All source code — both classical and quantum — configuration data, scripts, test programs, test data, and sample test result text should be maintained in a GitHub repository to facilitate shared access and reproduction of research results.

Full and clear documentation should be maintained in GitHub as well.

# Priority criteria

There are a lot of ideas mentioned in this informal paper. Which should have the highest priority to get the most attention? That’s beyond the scope of this paper. Some criteria for higher priority:

**Precursors for desired goals.****What is currently needed in the short term.**Practical utility.**Long-term needs which have a long lead time.****What is easy, the low-hanging fruit.****What is needed most to prove the viability and practicality of quantum computing.****Big ambitious goals.**Go for the gold. Reach for the stars. Requires great ambition, discipline and perseverance — and backers with very deep pockets and great patience.**The most intriguing open questions.**The greatest opportunity for discovery.**What interests researchers the most.**The default criteria.**What interests funding sources the most.**Worst case.

# Timing — sprint vs. marathon

It’s unclear if research in quantum computing is more of a sprint or a marathon — will a 5-year push get us where we need to be, or will it be a 10-, 15-, or 20-year marathon?

# Budget — 10X to 50X, for a start

This paper alone covers a vast landscape of research. But this may just be the tip of a much larger iceberg. Research isn’t free, so that’s a lot of money. How much might all of this research cost? Anybody’s guess at this stage, but I’d say at least *an order of magnitude* more than current expenditures, and that’s for starters. It could be as much as five times that much, in a range of a 10X to 50X multiple of current QSTEM research budgets.

But actual or forecast cost is beyond the scope of this paper, which is focused on simply identifying the areas of research.

# Need to retain and grow research talent

Right now, there is a great and growing demand for quantum technical talent in existing and startup tech companies, and even in old-line companies (e.g., big banks and industrial firms) seeking to exploit quantum computing. Unfortunately, the largest pool of quantum technical talent is researchers in academia and other research labs. That’s a great source for commercial companies to mine, but it does not bode well for ongoing theory, pure, and basic research.

We’re facing the prospect of a *brain drain* from academia and other research labs as commercial companies hire like crazy, offering top dollar — and bonuses and stock options, far more than academia and research labs can offer.

The research establishment needs to come up with novel and innovative approaches to retaining, attracting, and growing research talent.

Some of the research efforts outlined in this paper can indeed occur within commercial companies, where top-dollar salaries, bonuses, and stock options are more readily available, but even then, in those cases, the concern becomes whether those commercial researchers can remain focused on theory, pure, and basic research as outlined in this paper, rather than getting sucked into superficial applied research projects which are essentially product engineering in disguise rather than true research.

I don’t (yet) have any great ideas for solutions at this stage for the problem of retaining and growing research talent — I’m merely trying to highlight the looming and ongoing problem.

# Need for a lone physicist with a single great idea

Achieving The ENIAC Moment for quantum computing might be less about some big corporate push with a large team and a vast budget, and more about a solitary elite scientist or engineer identifying a kernel of a simple but great idea and pushing forward to a breakthrough implementation that dramatically blows away everyone and everything else.

Possibly even developing new hardware, new tools, and a new programming model focused on achieving great results for that single great idea.

Minimal resources, but maximal impact.

Even if that one implementation is specialized for that one idea, it could provide the foundation for generalizing to a much broader class of problems.

Are there any young Feynmans, Fermis, or Oppenheimers out there poised for the challenge?

Or I should say: who are they, where are they, and what are they currently up to?

And from a research perspective, how can research programs be preparing the next two generations of great innovators?

# Calling all grad students and PhD candidates — peruse my list of topics and questions

It would be very helpful for grad students and PhD candidates to peruse this informal paper for potential project, thesis, and research ideas.

Or maybe just to answer some of my questions which may already have answers that are simply not yet readily accessible or need to be translated to plain language.

# Sources to pursue for future additions to this document

This informal paper includes everything I can think of and am aware of as far as QSTEM research goes, but I’m sure there is more out there. Some sources I haven’t yet pursued are:

**NSF grants, programs, and funding opportunities.****Google searches for quantum research in general and more specific quantum areas.****Monitoring news from contacts on LinkedIn.**Including notices of new research papers and arXiv preprints.**Reviewing some of my own older informal papers for all references to research.**

# Living document

These are just the research areas I know about at this moment. This document will be updated on a fairly frequent basis as I identify more areas in which research is needed.

# Summary and conclusions

- Lots of research is needed, on all fronts, in all areas of QSTEM. There are many gaps.
- Hardware and algorithms are the two greatest challenges for quantum computing. We need much better qubits and much better algorithms to exploit those qubits.
- Although I think we’re most desperate for better and more qubits, I think the larger issue is a need for a much higher-level programming model and high-level quantum programming language and a significant library of powerful algorithmic building blocks so that much more advanced algorithms can be more readily cobbled together — in a scalable manner. Multiple models and multiple languages are probably appropriate for a variety of levels and needs.
- As I wrote in a recent paper, *where are all of the 40-qubit quantum algorithms?* Much research is needed to adequately answer that challenge.
- Commercial projects should rely on off the shelf technology and research results rather than engaging in fresh or deep research.
- We need to see The ENIAC Moment for quantum computing, where an elite team finally cobbles together a quantum application capable of solving a real-world problem at production scale. Obviously we need enough qubits and reliable qubits for that to happen.
- Configurable packaged quantum solutions for various application domains are needed. No coding needed. Just prepare the input data and input parameters and let the packaged solution do the rest, generating optimal and efficient quantum circuits as needed. An important and valuable intermediate stepping stone until the day when non-elite teams eventually can easily design and develop custom quantum algorithms and applications on their own.
- And finally we need to see The FORTRAN Moment for quantum computing where we have the combination of a higher-level programming model, a high-level quantum programming language, a substantial library of algorithmic building blocks, and full quantum error correction (QEC) with logical qubits, so that average, non-elite teams can design, develop, and deploy production-scale production-quality quantum applications to solve real-world problems — without breaking a sweat.
- At least another solid decade of research for sure.
- Probably between ten and twenty years to get to the stage where quantum computing is ready for widespread use and can easily achieve dramatic quantum advantage.
- Maybe some preliminary elite usage in as little as five to seven years, what I call The ENIAC Moment, even though still not ready for widespread usage and not reaching its true potential.
- Continuing searches and monitoring for fresh research ideas.
- Revisit every two years to assess both progress and how much more is still needed.

For more of my writing: **List of My Papers on Quantum Computing**.