
What Apple and Microsoft Owe to Turing


In 1936, at the age of just 23, Alan Turing invented the fundamental logical principles of the modern computer -- almost by accident. No one could have guessed that anything of practical value would emerge from his highly abstract research into the foundations of mathematics, let alone a machine that would change all our lives. Turing called his invention the 'universal computing machine'.

As everyone knows, the way to make a computer do the job we want (word-processing, say) is simply to locate the appropriate program in memory and start it running. This is the 'stored-program' concept, and it was Turing's invention in 1936.

His fabulous idea, dreamed up by pure thought, was of a single processor--a single slab of hardware--that could change itself from a machine dedicated to one type of work into a machine dedicated to a completely different job--from calculator to word-processor to chess opponent, for example. It did this by making use of programs--sequences of coded instructions--stored inside its memory.
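The idea can be sketched in a few lines of modern code: one fixed 'processor' loop whose behaviour is determined entirely by the instruction sequence it finds in memory. This is only an illustrative toy, not Turing's 1936 formalism, and the instruction names are invented for the example.

```python
# Toy illustration of the stored-program idea: a single fixed
# mechanism (the loop below) whose behaviour is set entirely by the
# program held in memory. Instruction names are invented.

def run(memory):
    """One fixed piece of 'hardware': fetch, decode, execute."""
    acc, pc, out = 0, 0, []
    while pc < len(memory):
        op, arg = memory[pc]
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            out.append(acc)
        pc += 1
    return out

# The same 'hardware' becomes a different machine by swapping programs:
adder   = [("LOAD", 2), ("ADD", 3), ("PRINT", None)]
doubler = [("LOAD", 21), ("ADD", 21), ("PRINT", None)]

print(run(adder))    # [5]
print(run(doubler))  # [42]
```

Swapping the contents of `memory` turns the one mechanism from an adder into a doubler -- the essence of Turing's single slab of hardware changing itself from one dedicated machine into another.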

Nowadays, when nearly everyone owns a physical realization of Turing's universal machine, his idea of a one-stop-shop computing machine is apt to seem as obvious as the wheel and the arch. But in 1936, when engineers thought in terms of building different machines for different purposes, Turing's vision of a single universal machine was revolutionary.

For the next few years Turing's revolutionary ideas existed only on paper. A crucial moment came in 1944, when he set eyes on Tommy Flowers' racks of high-speed electronic code-cracking equipment at Bletchley Park. Turing realized that the technology Flowers was pioneering, large-scale digital electronics, was the way to build a miraculously fast universal computer.

Nevertheless, another four years elapsed before the first universal Turing machine in hardware ran the first stored program, on Monday 21 June 1948. It was the first day of the modern computer age. Based on Turing's ideas, and almost big enough to fill a room, this distant ancestor of our laptops was named simply the 'Baby' computer. Thereafter electronic stored-program universal digital computers became ever smaller and ever faster, until now they fit into coat pockets and school satchels, linking each of us to the whole wide world.

In my opinion Turing's greatest peacetime achievement was inventing the two closely related technological ideas on which modern computing is based, his twin concepts of the universal computer and the stored program. Yet historians of the computer have often found Turing's contributions hard to place, and many histories of computing written during the six decades since his death sadly do not so much as mention him.

Even today there is still no real consensus on Turing's place in computing history. Earlier this year an opinion piece by the editor of the Association for Computing Machinery's flagship journal objected to the claim that Turing invented the stored-program concept ['Who Begat Computing?' Communications of the ACM, January 2013]. The article's author, Moshe Vardi, dismissed the claim as 'simply ahistorical'.

Vardi emphasized that it was not Turing but the Hungarian-American mathematician John von Neumann who, in 1945, 'offered the first explicit exposition of the stored-program computer'. This is true, but the point does not support Vardi's charge of historical inaccuracy. Although von Neumann did write the first paper explaining how to convert Turing's ideas into electronic form, the fundamental conception of the stored-program universal computer was nevertheless Turing's.

Von Neumann was actually very clear in attributing credit to Turing, both in private and in public. It is unfortunate that his statements are not more widely known. He explained in 1946 that Turing's 'great positive contribution' was to show that 'one, definite mechanism can be "universal"'; and in a 1949 lecture he emphasized the crucial importance of Turing's research, which lay, he said, in Turing's 1936 demonstration that a single appropriately designed machine 'can, when given suitable instructions, do anything that can be done by automata at all'.

Von Neumann's friend and scientific colleague Stanley Frankel recollected that von Neumann 'firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing'.

Von Neumann, then, set out the electronic basis for a practical version of the universal Turing machine (with considerable assistance, on the engineering side, from his associates Presper Eckert and John Mauchly). In von Neumann's design the picturesque 'scanner' and 'memory-tape' of Turing's 1936 machine were replaced with electronic equipment, and he also replaced Turing's pioneering programming code with what he later described as a 'practical' code for high-speed computing.

Von Neumann's design went on to become an industry standard, yet among his many contributions to the development of the computer, perhaps the greatest of all was simply informing America's electronic engineers about Turing's concept of the stored-program universal computer.

During 1945, hard on the heels of von Neumann's groundbreaking paper reporting his (and Eckert and Mauchly's) design, Turing designed his own electronic version of his universal machine. This was radically different from von Neumann's; Turing sacrificed everything to speed, launching a 1940s version of what today's computer architects call RISC (Reduced Instruction Set Computing).

In 1947 Turing gave a clear statement of the connection, as he saw it, between the universal computing machine of 1936 and the electronic stored-program universal digital computer:

'Some years ago I was researching on what might now be described as an investigation of the theoretical possibilities and limitations of digital computing machines. I considered a type of machine which had a central mechanism, and an infinite memory which was contained on an infinite tape. ... [D]igital computing machines ... are in fact practical versions of the universal machine. There is a certain central pool of electronic equipment, and a large memory, [and] the appropriate instructions for the computing process involved are stored in the memory.'
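Turing's description -- a central mechanism plus a memory tape -- translates almost directly into code. Below is a minimal sketch; the transition table shown (a machine that appends a 1 to a unary number) is my own invented example, and the encoding is illustrative rather than Turing's.

```python
# Minimal Turing-machine sketch: a 'central mechanism' driven by a
# transition table, acting on a memory tape. The example machine is
# invented: it appends a 1 to a string of 1s (unary increment).

def run_tm(table, tape, state="start", blank="_", max_steps=1000):
    """Simulate a Turing machine until it halts (or runs out of steps)."""
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        scanned = cells.get(head, blank)
        write, move, state = table[(state, scanned)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# (state, scanned symbol) -> (symbol to write, head move, next state)
increment = {
    ("start", "1"): ("1", "R", "start"),  # skip over the 1s
    ("start", "_"): ("1", "R", "halt"),   # append a 1, then stop
}

print(run_tm(increment, "111"))  # "1111"
```

Note that `run_tm` itself plays the role of the universal machine: one fixed mechanism which, handed a different table stored in memory, behaves as a completely different machine.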

To return to Moshe Vardi's effort to refute the claim that Turing invented the stored-program concept: Vardi states, defending von Neumann's corner, that 'we should not confuse a mathematical idea with an engineering design'. So at best Turing deserves the credit for an abstract mathematical idea? Not so fast. Vardi is ignoring the fact that some inventions belong equally to the realms of mathematics and engineering. The universal Turing machine was one such, and this is part of its brilliance.

What Turing described in 1936 was not an abstract mathematical notion but a solid three-dimensional machine (containing, as he said, wheels, levers, and paper tape); and the cardinal problem in electronic computing's pioneering years was just this: How best to build practical electronic forms of the universal Turing machine?