When Will Quantum Computing Have Its FORTRAN Moment?

Jack Krupansky
May 14, 2019

Quantum computing currently lacks a high-level programming language and associated high-level programming model. Everything is currently done in terms of raw qubits and raw quantum logic gates, not unlike working in machine language or assembly language on the early classical computers of the 1940’s and 1950’s before high-level languages such as FORTRAN, COBOL, LISP, PL/1, and BASIC were invented. Programmers did amazing things back in those days, and will likely do some amazing things with raw qubits and quantum logic gates over the next few years, but as with early classical computing, the audience and potential market are greatly limited by such a primitive programming environment. A high-level language analogous to FORTRAN would allow quantum computing to turn the corner and leap from a fairly narrow niche of very elite experts to a much broader, mainstream market. This informal paper will explore the need and potential for such a FORTRAN moment for quantum computing.

FORTRAN is short for Formula Translation. It is a programming language focused on numerical calculations such as commonly found in scientific and engineering computation. It was developed in the mid-1950’s by John Backus and others at IBM to dramatically reduce the level of effort required to develop scientific and engineering applications for classical computers.

Note: Technically FORTRAN is no longer written in all caps as I do here. FORTRAN 77 was the last release with the name in all caps. Fortran 90 changed the name to mixed case. I use the original all caps name since I’m referring to the original introduction of FORTRAN rather than the eventual, modern usage.

There were some earlier attempts at higher-level languages — see the Wikipedia History of programming languages article — and they actually had some limited success, but FORTRAN made a much bigger and more durable splash, comparable to the ENIAC moment on the hardware front of classical computing.

The key benefit was that the programmer could focus on their algorithm and logic in terms much more familiar to mathematicians and (non-computer) scientists and engineers — most notably algebraic expressions and control flow (particularly loops) — rather than the bits, words, registers, and instructions of the bare machine. Input, output, and formatting of data were another key contribution of FORTRAN. Programs became much simpler and much easier to write.

Even non-computer scientists and non-computer engineers could now master programming. Actually, they could master it before FORTRAN, but only with a great expenditure of time and energy. Now, they could master programming much more easily.

And the audience and market capable of assimilating computer systems greatly expanded, no longer limited to the very few, the very elite, and only the most motivated.

Some of the more complex and sophisticated applications and underlying system software continued to require mastery of machine language, but that quickly became an increasingly minor subset of the total audience and total market.

The FORTRAN moment was not a precise moment or specific event, but more of a process, beginning with its conception, its design, its initial implementation, its initial use by its creators, and culminating with wider usage. It’s difficult to say when FORTRAN achieved a critical mass of usage, but the initial compiler was delivered in 1957, according to the Wikipedia Fortran article. The initial concept was first proposed internally at IBM in 1953, the initial specification in 1954, and the first manual in 1956. The full detail of the early history can be found in Backus’ paper The History of FORTRAN I, II, and III. The paper indicates that Backus et al. gave five or six talks to IBM customers in late 1954 and early 1955. This was probably the initial public “splash” except for two facts: 1) with only one exception, customers were uniformly skeptical, and 2) FORTRAN still existed only as an idea on paper with no software to support it. Still, that one customer may have been key to the ultimate success of FORTRAN, so that fateful presentation in January 1955 might reasonably be considered at least one prospect for the initial FORTRAN moment, although the actual release of the compiler in April 1957 may be the sounder dating of the FORTRAN moment.

The initial release of FORTRAN was far from the full embodiment of FORTRAN. Many releases followed over the subsequent years and decades, and even into the following century. The point is that the FORTRAN moment did not represent the complete arrival of the full promise of the concept of FORTRAN, but did represent the turning of a corner, the opening of the door to a whole new level of ability to exploit computing hardware. That’s the kind of turning point we need for quantum computing.

FORTRAN marked the transition from a very limited, elite audience to a much wider audience and a great leap in productivity. Many applications either wouldn’t have been developed at all or would have been delayed on a much longer timeline if every programming project required scarce elite machine language programmers.

FORTRAN alone was not enough to be the end-all for programming of classical computers, just the moment of real breakout from the tedium of machine language programming. It was focused on science and engineering, not commercial data processing, general-purpose programming, or artificial intelligence. But FORTRAN did set the stage and ushered in the new world of high-level languages. COBOL opened the floodgates for commercial data processing, beginning in 1960. LISP enabled artificial intelligence, beginning in 1958. BASIC enabled anybody to program, beginning in 1964. Many other programming languages followed — and there were other precursor languages as well, but FORTRAN, COBOL, LISP, and BASIC set the overall tone for development of applications for classical computers. ALGOL and PL/1 also made important contributions to advancing the art of programming. And later C and PASCAL. And many others.

FORTRAN, COBOL, LISP, and BASIC are now quickly becoming forgotten languages, but the point is that quantum computing has yet to reach even the stage in its evolution comparable to FORTRAN. We’re still waiting for quantum computing to have its FORTRAN moment.

A variety of interactive, visual, and graphical tools and libraries exist for composing quantum programs — such as IBM QISKit and its graphical Quantum Composer, or Rigetti’s Forest and Grove libraries — but the programmer is still forced to conceptualize both problem and solution in terms of raw qubits and raw quantum logic gates. This is the current programming model for quantum computing. A large part of what FORTRAN and other programming languages brought to the market was a much richer programming model, closer to the language of science and engineering, so that scientists and engineers could continue to conceptualize their problems and solutions in science and engineering terms rather than the bits, words, registers, and instructions of the bare machine.
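To make the gap concrete, here is a minimal sketch of what that gate-level programming model looks like in practice, using Qiskit’s Python interface to build a trivial two-qubit Bell-state circuit. The specific calls reflect my reading of the Qiskit API and are purely illustrative; the point is that every step is an individual gate applied to an individual qubit, with nothing resembling the algebraic expressions or loops that FORTRAN gave scientists and engineers.

    # Illustrative sketch, assuming Qiskit's Python circuit API:
    # even a trivial program is expressed as raw gates on raw qubits.
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(2, 2)   # two qubits, two classical bits
    qc.h(0)                     # Hadamard: put qubit 0 into superposition
    qc.cx(0, 1)                 # CNOT: entangle qubit 0 with qubit 1
    qc.measure([0, 1], [0, 1])  # measure both qubits into classical bits
    print(qc.draw())            # show the circuit as a sequence of raw gates

A FORTRAN-style abstraction would let the programmer state what is being computed and leave the gate sequencing to a compiler.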

The key obstacle to achieving a FORTRAN moment is that quantum scientists and quantum engineers are still overwhelmed with the fundamental task of perfecting qubits and scaling up machines to have enough qubits that FORTRAN-like programming models are in fact technically feasible.

I’m certainly not suggesting that FORTRAN itself — or any other traditional classical programming language, for that matter — would be appropriate for the FORTRAN moment of quantum computing; I’m simply offering FORTRAN as an analogy for a phase transition in the evolution of any technology.

It’s an open question at this stage exactly what would be an appropriate programming model for quantum computing. Simulations and optimization need quantum techniques which are currently beyond the reach of simple algebraic expressions and classical control structures. But something will pop up. After all, necessity is the mother of invention.

As important as a programming model and programming language are, they cannot be effectively conceptualized and realized in a vacuum without first gaining significant insight by actually attempting to design and implement applications on bare machines without the benefits of such abstractions. That’s the way programming models and programming languages emerged, evolved, and progressed with classical computers — as exemplified by the history of the development of FORTRAN. Sure, we can imagine that we know so much more now that we can shortcut the process, but ultimately there is no credible shortcut to actually slogging through development on a bare machine. That’s how we arrived at the ENIAC moment of classical computing, which was the necessary precursor of the FORTRAN moment of classical computing.

A great quote from John Backus, the inventor of FORTRAN: “Much of my work has come from being lazy. I didn’t like writing programs, and so, when I was working on the IBM 701, writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs for the 701.” That experience working on a bare machine with no high-level programming model set the stage for his conceptualization and implementation of FORTRAN.

I’m personally still not at the stage of being able to propose a halfway-decent programming model or programming language for quantum computing, but such a model and language need to provide a higher-level abstraction which encompasses and exploits the essential features and resources of quantum computing, including but not limited to:

  • Superposition
  • Entanglement
  • Interference
  • Unitary transformations
  • Quantum parallelism
  • Quantum Fourier transforms
  • Phase estimation
  • Amplitude amplification

I am a little surprised — or maybe disappointed is a better word — that nobody has yet come up with a halfway-decent high-level programming model for quantum computing, but as noted, the scientists and designers are still overwhelmed with simply perfecting qubits and scaling to the level where scientists, engineers, and commercial software developers have enough hardware to actually support non-trivial applications — and programming models.

For more discussion of the many challenges that quantum programmers face on the algorithms front, see my paper The Greatest Challenges for Quantum Computing Are Hardware and Algorithms.

When might we finally see the FORTRAN moment for quantum computing? Well, first we will need to see the ENIAC moment — availability of hardware and some adventurous souls willing to do a substantial real application on the raw hardware. Then, after a number of such applications, produced quite arduously, finally a Backus-like character will step forward and propose and develop a high-level programming model and a language to go with it.

What I wrote in 2019:

  • Two years? I doubt it. Four to five years? I can definitely see it. Three years? I wouldn’t be completely surprised, but still surprised. More than five years? I’d be disappointed, but it’s possible.
  • I’d bet that we’ll see a high-level programming model and associated high-level programming language for quantum computing in the same timeframe as we see 512 to 1024-qubit machines. Using a Moore’s Law-like model of doubling qubits every twelve to eighteen months, and assuming we see a 128-qubit machine this year, we could see a 512-qubit machine in two to three years and a 1024-qubit machine in three to five years (the simple doubling arithmetic is sketched just after this list). Three to four years seems like a reasonable expectation — at least on the hardware front.
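The doubling arithmetic behind that 2019 estimate is simple enough to spell out. This toy sketch (plain Python, purely illustrative) projects how many years it takes to go from 128 qubits to a target count if the count doubles every twelve to eighteen months.

    # Toy sketch of the Moore's-Law-like doubling arithmetic from 2019:
    # start at 128 qubits and double every 12 to 18 months.
    def years_to_reach(target, start=128, months_low=12, months_high=18):
        doublings = 0
        qubits = start
        while qubits < target:
            qubits *= 2
            doublings += 1
        return (doublings * months_low / 12, doublings * months_high / 12)

    print(years_to_reach(512))   # (2.0, 3.0)  -> two to three years
    print(years_to_reach(1024))  # (3.0, 4.5)  -> roughly three to five years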

My update in 2020:

  • If, as given in the ENIAC moment paper, it does indeed take five to seven years to reach the ENIAC moment, it may take another two to five years to reach the FORTRAN moment. That’s 5+2 to 7+5 or 7 to 12 years, with 9–10 years as the average. Yikes, that’s a depressingly long wait from today!
  • Although there was an expectation of seeing a 128-qubit machine in 2019, that didn’t happen and we ended up with 53 qubits. If we’re lucky, we may see 64 qubits in 2020, which means we wouldn’t see 512 qubits for another three to five years and 1024 qubits in four to six years.
  • Some of the work to develop the concepts, algorithms, and software for a quantum programming language could indeed occur in parallel with development of the hardware, but sometimes lack of hardware ends up slowing software efforts, if for no other reason than management deciding that the priority and resources for software can be lowered if the hardware isn’t going to be ready for the software to exploit.

Stay tuned for further developments. But don’t hold your breath.

Meanwhile, read a previous paper, When Will Quantum Computing Have Its ENIAC Moment?, which discusses a precursor milestone which will have to take place long before we get to the FORTRAN Moment.
