Alan Turing: The Enigma
by Andrew Hodges

The point of this complexity was that it increased the speed of operation. Speed took a slightly higher priority than simplicity. This was reflected also in the fact that Alan planned the pulse rate of the ACE to be a million a second, straining electronic technology to the full.*
His emphasis on speed was natural enough, granted his Bletchley experience, in which speed was of the very essence, a few hours marking the difference between usefulness and pointlessness. It was also related to the universality of the electronic computer. In 1942 they had tried to get faster Bombes to cope with the fourth rotor; they had been saved by the German slip-up with the weather reporting signal system, but without this stroke of fortune they would have had to wait over a year for the machinery to match the problem. One virtue of a universal machine would lie in its ability to take on any new problem immediately – but this meant that it must be as fast as possible from the start. It would hardly be desirable to re-engineer a universal machine, for the sake of a special problem. The whole point was to get the engineering out of the way once and for all, so that all the work thereafter could go into the devising of instruction tables.

Although the ACE was based upon the idea of the Universal Turing Machine, in one way it departed from it. The departure lay in the feature, at first sight an extraordinary one, that it had no facility for conditional branching. In this respect it might seem to lack the crucial idea that Babbage had introduced a hundred years earlier. For the ‘scanner’, or Logical Control, of the ACE could only hold one ‘address’, or position on the tape, at a time. It had no way of holding more than one, and then of selecting a next destination according to some criterion.

The omission, however, was only on the surface. The reason for it was that it was a case where the hardware could be simplified, at the cost of more stored instructions. Alan set out a way in which conditional branching could be done without needing the logical control to hold more than one ‘address’ at a time. It was not the best technical solution, but it had the merit of a brutal simplicity. Suppose that it were wished to execute instruction 50 if some digit D were 1, and to execute instruction 33 if it were 0. His idea was to ‘pretend that the instructions were really numbers and calculate D × (Instruction 50) + (1-D) × (Instruction 33).’ The result of this calculation would be an instruction which would have the effect that was required. The ‘IF’ was effected not by hardware, but by extra programming. It was a device which had led him to mix up data (the digit D) with instructions. This in itself was of great significance, for he had allowed himself to modify the stored program. But this was only the beginning.
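In modern terms the trick can be sketched as follows; the instruction encodings below are invented for the illustration and are not from the ACE report. Treating the two candidate instructions as numbers, the expression D × (Instruction 50) + (1-D) × (Instruction 33) yields whichever instruction the digit D selects, so the branch is carried out by ordinary arithmetic rather than by special hardware.

    # Illustration only: conditional branching done by arithmetic on instruction codes.
    # The instructions are treated as numbers; which one is obtained depends on the digit D.

    def select_instruction(d, instr_if_one, instr_if_zero):
        """Return instr_if_one when d is 1 and instr_if_zero when d is 0,
        using only multiplication and addition (no hardware 'IF')."""
        return d * instr_if_one + (1 - d) * instr_if_zero

    # Made-up encodings standing in for 'Instruction 50' and 'Instruction 33'.
    INSTRUCTION_50 = 0b10100001
    INSTRUCTION_33 = 0b01100010

    for d in (0, 1):
        chosen = select_instruction(d, INSTRUCTION_50, INSTRUCTION_33)
        print(d, bin(chosen))   # d = 0 gives instruction 33, d = 1 gives instruction 50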

Von Neumann had also seen that it was possible to interfere with the stored instructions, but had done so in only one very particular way. If a stored instruction had the effect of ‘taking the number at address 786’, then he had noticed that it would be convenient to be able to add 1 into the 786, so that it gave the effect of ‘taking the number at address 787’. This was just what was needed for working along a long list of numbers, stored in locations 786, 787, 788, 789 and so forth, as would so frequently occur in large calculations. He had programmed the idea of going to the ‘next’ address, so that it did not have to be spelt out in explicit form. But von Neumann went no further than this. In fact, he actually proposed a way of ensuring that instructions could not be modified in any other way than this.
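A toy sketch, in modern notation, of the one kind of modification von Neumann did allow; the addresses and the little instruction record are invented for the illustration. The address field of a ‘take the number at address N’ instruction is increased by one on each pass, so a loop works its way along consecutive storage locations without every address having to be spelt out.

    # Illustration only: the sole permitted self-modification is adding 1 to the
    # address field of a 'load' instruction, in order to step along a list.

    memory = {786: 3.0, 787: 1.5, 788: 4.25, 789: 2.0}   # numbers stored at consecutive addresses

    instruction = {"op": "LOAD", "address": 786}          # 'take the number at address 786'

    total = 0.0
    for _ in range(len(memory)):
        total += memory[instruction["address"]]           # execute the instruction as it now stands
        instruction["address"] += 1                       # modify it: next time, 'take the number at 787'

    print(total)   # the whole list has been worked through with a single stored instruction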

The Turing approach was very different. Commenting on this feature of modifying instructions, he wrote in the report: ‘This gives the machine the possibility of constructing its own orders. …This can be very powerful.’ In 1945 both he and the ENIAC team had hit upon the idea of storing the instructions inside the machine. But this in itself said nothing about the next step, that of exploiting the fact that the instructions could now themselves be changed in the course of running the machine. This was what he now went on to explain.

It was an idea that had arrived somewhat by chance. On the American side, they had thought of storing instructions internally because it was the only way of supplying instructions quickly enough. On the Turing side, he had simply taken over the single tape of the old Universal Turing Machine. But neither of these reasons for adopting stored instructions said anything about the possibility of interfering with the instructions in the course of a computation. On the American side, it was not pointed out as a feature of the new design until 1947.⁴ Equally, the Universal Turing Machine, in its paper operation, was not designed to change the ‘description number’ that it worked upon. It was designed to read, decode, and execute the instruction table stored upon its tape. It would never change these instructions. The Universal Turing Machine of 1936 was like the Babbage machine in the way that it would operate with a fixed stock of instructions. (It differed in that that stock was stored in exactly the same medium as the working and the input and output.) And so Alan Turing’s own ‘universality’ argument showed that a Babbage-like machine was enough. In principle there was nothing that could be achieved through modifying the instructions in the course of an operation that could not be achieved by a universal machine without this facility. The faculty of program modification could only economise on instructions, and would not enlarge upon the theoretical scope of operations. But that economy could, as Alan said, be ‘very powerful’.

This original perception flowed from the very universality of the machine, which might be used for any kind of ‘definite method’, not necessarily arithmetical. That being so, the pulses ‘1101’ stored in a delay line might well not refer in any way to the number ‘thirteen’, but might represent a chess move or a piece of cipher. Or, even if the machine were engaged upon arithmetic, the ‘1101’ might be representing not ‘thirteen’, but indicating perhaps a possible error of thirteen units, or a thirteen in the floating-point representation* of numbers, or something else altogether, at the choice of the user of the machine. As he saw it from the start, there would be far more to adding and multiplying than putting pulses through the hardware adder and multiplier. The pulses would have to be organised, interpreted, chopped up, and put together again, according to the particular way in which they were being used. He dwelt in particular upon the question of doing arithmetic in floating-point form, and showed how the mere addition of two floating-point numbers required a whole instruction table. He wrote some tables of this kind. MULTIP, for instance, would have the effect of multiplying two numbers which had been encoded and stored in floating-point form, encoding and storing the result. His tables made use of the ‘very powerful’ feature of the machine, for he had it assemble bits of the necessary instructions for itself, and then execute them.
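Why the ‘mere addition’ of two floating-point numbers needs a whole instruction table can be sketched in modern terms; the encoding below is invented for the illustration and is not the ACE’s. A number is held as a mantissa and an exponent, and adding two such numbers means aligning the exponents, adding the mantissas with the ordinary adder, and renormalising the result, each step being routine work that the program has to organise.

    # Illustration only: a number is a pair (mantissa, exponent), standing for mantissa * 2**exponent.
    # Adding two of them is not one hardware operation but a small programmed routine.

    def fp_add(a, b):
        (ma, ea), (mb, eb) = a, b
        # 1. Align the exponents: shift the mantissa that has the smaller exponent.
        if ea < eb:
            ma >>= (eb - ea)
            e = eb
        else:
            mb >>= (ea - eb)
            e = ea
        # 2. Add the aligned mantissas with the ordinary integer adder.
        m = ma + mb
        # 3. Renormalise so the mantissa stays within its fixed width (here, 16 bits).
        while m >= 1 << 16:
            m >>= 1
            e += 1
        return (m, e)

    x = (0b1100000000000000, -14)   # encodes 3.0
    y = (0b1010000000000000, -15)   # encodes 1.25
    print(fp_add(x, y))             # a pair encoding 4.25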

But if a simple operation like multiplying floating-point numbers would require a set of instructions, then a procedure of any useful scale would involve putting many such sets of instructions together. He envisaged this not as a stringing together of tables, but as a hierarchy, in which subsidiary tables like MULTIP would serve a ‘master’ table. He gave a specific example of a master table called CALPOL which was to calculate the value of a fifteenth-order polynomial in floating-point. Every time a multiplication or addition was required, it had to call upon the services of a subsidiary table. The business of doing this calling up and sending back of subsidiary tables was something which itself required instructions, as he saw:

 

When we wish to start on a subsidiary operation we need only make a note of where we left off the major operation and then apply the first instruction of the subsidiary. When the subsidiary is over we look up the note and continue with the major operation. Each subsidiary operation can end with instructions for the recovery of the note. How is the burying and disinterring of the note to be done? There are of course many ways. One is to keep a list of these notes in one or more standard size delay lines …with the most recent last. The position of the most recent of these will be kept in a fixed TS [short delay line] and this reference will be modified every time a subsidiary is started or finished. The burying and disinterring processes are fairly elaborate, but there is fortunately no need to repeat the instructions involved each time, the burying being done through a standard instruction table BURY, and the disinterring by the table UNBURY.

Perhaps he drew the imagery of burying and unburying from the silver bar story.* This was an entirely new idea. Von Neumann had thought only in terms of working through a sequence of instructions.
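In modern terms, BURY and UNBURY amount to pushing and popping return links on a list kept in storage with the most recent note last; that is, a stack of subroutine return addresses. A minimal sketch, with the names and the toy calling sequence invented for the illustration:

    # Illustration only: a list of 'notes' (places to resume), most recent last,
    # in the spirit of Turing's BURY and UNBURY tables.

    notes = []                      # the list kept in a delay line

    def bury(return_note):
        """Record where the major operation left off before starting a subsidiary."""
        notes.append(return_note)

    def unbury():
        """Recover the most recent note when the subsidiary operation is over."""
        return notes.pop()

    def multip(a, b):               # standing in for the subsidiary table MULTIP
        return a * b

    def calpol_step(x):             # a fragment standing in for the master table CALPOL
        bury("resume CALPOL at the next step")   # make a note of where we left off
        product = multip(x, x)                   # hand control to the subsidiary
        unbury()                                 # disinter the note and carry on from it
        return product

    print(calpol_step(3.0))         # 9.0, with the note duly buried and disinterred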

The concept of a hierarchy of tables brought in further applications of program modification. Thus he imagined ‘keeping the instruction tables in an abbreviated form, and expanding each table whenever we want it’ – the work being done by the machine itself, using a table called EXPAND. The further he progressed with this idea, the more he saw that the ACE could be used to prepare, collate, and organise its own programs. He wrote:

 

Instruction tables will have to be made up by mathematicians with computing experience and perhaps a certain puzzle-solving ability. There will probably be a good deal of work of this kind to be done, for every known process has got to be translated into instruction table form at some stage. This work will go on whilst the machine is being built, in order to avoid some of the delay between the delivery of the machine and the production of results. Delay there must be, due to the virtually inevitable snags, for up to a point it is better to let the snags be there than to spend such time in design that there are none (how many decades would this course take?) This process of constructing instruction tables should be very fascinating. There need be no real danger of it ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.

It is not surprising that he looked forward to the process of writing instruction tables as ‘very fascinating’. For he had created something quite original, and something all of his own. He had invented the art of computer programming.

It was a complete break with the old-style calculators – of which, in any case, he knew little. They assembled adding and multiplying mechanisms, and then had the job of feeding in a paper tape to make them work in the right order. They were machines to do arithmetic, in which the logical organisation was a rather tiresome burden. The ACE was quite different. It was to be a machine to play out programs ‘for every known process’. The emphasis was placed on the logical organisation of the work, with the hardware arithmetic added only to short-cut the more frequently used constituent operations.

On desk calculators, the figures 0 to 9 would appear visibly in the registers and the keyboard, and the operator would be led to feel that in some way the calculator had the numbers themselves stored within it. In reality, it had nothing but wheels and levers, but the illusion would be strong. The illusion carried over to the big relay calculators, the Aiken and Stibitz machines, and to the ENIAC. Even the EDVAC proposals carried the feeling that the pulses in the delay lines would somehow actually be numbers. But the Turing conception was rather different, and took a more abstract view. In the ACE, one might regard pulses as representing numbers, or as representing instructions. But it was really all in the mind of the beholder. The machine acted, as he put it, ‘without understanding’, and in fact operated not on numbers nor on instructions, but on electronic pulses. One could ‘pretend that the instruction was really a number’, because the machine itself knew nothing about either. Accordingly, he was free in his mind to think about mixing data and instructions, about operating on instructions, about tables of instructions being inserted by other instructions of ‘higher authority’.

There was a reason for his facility to take this liberated approach. Ever since he first thought about mathematical logic, he was aware of mathematics as a game played with marks on paper, to be manipulated by chess-like rules, regardless of their ‘meaning’. This was the outlook that the Hilbert approach encouraged. Gödel’s theorem had cheerfully mixed up ‘numbers’ and ‘theorems’, and Computable Numbers had represented instruction tables as ‘description numbers’. His proof of the existence of unsolvable problems depended upon this mixing up of numbers and instructions, by treating them all alike as abstract symbols.* It was therefore a small step to regard instructions, and instruction tables, as grist to the ACE’s own mill. Indeed, since so much of his war work had depended upon indicator systems, in which instructions were deliberately disguised as data, there was no step for him to make at all. He saw as obvious what to others was a jump into confusion and illegality.

This vision of the ACE’s function was also tied in with the imitation argument. The ACE would never really be ‘doing arithmetic’, in the way that a human being would. It would only be imitating arithmetic, in the sense that an input representing ‘67 + 45’ could be made to guarantee an output representing ‘112’. But there were no ‘numbers’ inside the machine, only pulses. When it came to floating-point numbers, this was an insight of practical significance. The whole point of his development was that the operator of the ACE would be able to use a ‘subsidiary table’ like MULTIP, as if it were a single instruction to ‘multiply’. In actual fact, it would have the effect of much shunting and assembling of pulses inside the machine. But that would not matter to the user, who could work as if the machine worked directly on ‘floating-point numbers’. As he wrote, ‘We have only to think how this is to be done once, and then forget how it is done.’ The same would apply if the machine were programmed to play chess: it would be used as if it were playing chess. At any stage it would only be outwardly imitating the effect of the brain. But then, who knew how the brain did it? The only fair use of language, in Alan’s view, was to apply the same standards, the standards of outward appearance, to the machine as to the brain. In practice, people said quite nonchalantly that it was ‘doing arithmetic’; they should also say it was playing chess, learning, or thinking, if it could likewise imitate the function of the brain, quite regardless of what was ‘really’ happening inside. Even in his technical proposals, therefore, there lay a philosophical vision which was utterly beyond the ambition of building a machine to do large and difficult sums. This did not help him to communicate with other people.
