Lecture
Linear Programming and all that
I first "got involved in computers" in the spring of 1959. My
employer, J Bibby & Sons, had two successive Zebras, which I
have designated in this story as Zebra one and Zebra two.
(Their actual production numbers are unknown to me, but they
were not the first two produced by many months.)
I lived and breathed the Zebra for nearly three years. The
lessons I learned then have stood me in good stead for over 35
years and I guess they will stay with me until I stop
programming computers and take up some less testing way of
living, like retiring.
J Bibby & Sons was an old-established manufacturer of animal
foodstuffs, cooking fats and soap. There were 17 men surnamed
Bibby on the Board of Directors and one who was not, called M
Morrice, who was the Company Secretary.
I was employed as an assistant statistician in the Development
Department, although my formal exposure to statistics was a
short course for scientists at university as part of my (very
ordinary) degree in natural sciences and a short evening
course at the Ashley Down Technical College in Bristol (now
the University of Bath!).
I had heard about computers at school - I read "Discovery" and
"Endeavour" - and had seen the special purpose computer that
played noughts and crosses at the Festival of Britain in 1951
- I think it was made by Ferranti. I was not impressed.
Computers were mentioned once or twice at university, but we
students never saw one.
After the man who interviewed me, David Spurrell, had left
Bibbys, I found his notes of my interview. He was more
impressed with my farming background than my knowledge of
statistics. This was appropriate enough for a job in which the
nutrition of pigs loomed very large indeed. David had
inherited the Company Statistician job at Bibbys from Andrew
Muir, who had persuaded the company to buy the Stantec Zebra
from STC for £30,000. To put that in context, my starting
salary at Bibbys was £850, so a Zebra would be about half a
million pounds at current prices.
The justification for the purchase of the Zebra was Linear
Programming (LP), an iterative technique to enable you to
calculate the least cost mixture of a number of components,
each with different prices, each component containing
different proportions of sub-components, subject to
constraints. The classical example given in the original
papers was to make a least cost mixture of nuts using several
different sorts of nuts, with different prices, subject to
minimum and maximum constraints on the amount of protein,
fibre and starch.
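The flavour of the problem can be shown with a toy version: blend two ingredients, in fractions summing to one, to meet a minimum-protein constraint at least cost. All prices and nutrient figures below are invented, and this brute-force scan is only a sketch of the problem statement, not the simplex method the Zebra program used.

```python
# Toy least-cost mix: two ingredients, fractions summing to one, with a
# minimum-protein constraint. All figures are invented for illustration;
# the real program used Dantzig's simplex method on many ingredients.
def least_cost_mix(price, protein, min_protein, steps=10000):
    """Brute-force scan over blend fractions of two ingredients."""
    best = None
    for i in range(steps + 1):
        x = i / steps                              # fraction of ingredient 0
        blend_protein = protein[0] * x + protein[1] * (1 - x)
        if blend_protein < min_protein:            # constraint violated
            continue
        cost = price[0] * x + price[1] * (1 - x)
        if best is None or cost < best[0]:
            best = (cost, x)
    return best

# Ingredient 0: cheap, low protein; ingredient 1: dear, high protein.
# Prices per ton and protein percentages are invented.
cost, frac = least_cost_mix(price=(10, 30), protein=(8, 40), min_protein=16)
```

The scan settles on as much of the cheap ingredient as the protein constraint allows; the simplex method reaches the same kind of vertex solution directly, and scales to dozens of ingredients and constraints.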
Before the Zebra was purchased Andrew Muir had carried out a
long series of feasibility studies. To simulate the LP
algorithm running on the Zebra, two comptometer operators were
employed. These delightful "comp-girls" could enter data, add
and subtract using touch only, on comptometers. These were
fast desktop electric calculators, virtually silent and often
used behind the scenes in financial organisations.
Comp-girls knew nothing of mathematics and had to be provided
with scripts just like computer programs, which they obeyed
faithfully and with impressive speed, copying down results
from the single register onto worksheets at the end of each
iteration. The profession of comptometer operator and the
comptometer itself have both long since passed into history.
Andrew Muir had written the LP program for the Zebra in Normal
Code. This used the simplex method and the product form of the
inverse. The basis for all this was a research report and
several papers by GB Dantzig. LP was used routinely to
calculate least cost mixes for a hundred or so different
products ranging from layers' pellets (for chickens, to
persuade them to lay an egg a day) to elephant nuts (for
maintenance of the main attraction at a travelling circus).
Each case of animal feed took several minutes to solve. Being
an iterative technique it was not possible to forecast how
long it would take - the better the approximation you started
with the quicker the program converged. The solutions were
passed to the Nutritionist, a veterinary surgeon, who in turn
passed them (or not) to the shift production foremen, who were
responsible for the machine settings.
There was a problem here; the whole purpose of the operation
was to de-skill the nutritionist post which had always had
considerable power, and salary, within Bibbys. However closely
the specifications were set, there were many, many occasions
when the LP formulation was rejected on non-quantifiable
grounds - colour, texture, feel and so on. I believe that, by
the time I arrived, it had been decided not to use LP as a
production process, and that was why Andrew Muir had left.
Nonetheless the Zebra ran overnight on LP on the eighth floor
of the office building in Liverpool in 1959, 100 yards from
the Head Office and directly connected to the factory itself
- a noisy, dusty, smelly place where khaki-clad operatives
laboured in conditions that had not changed significantly for
50 years. That is not to say that Bibbys was a backward-looking
company. It had one of the first non-contributory pension
schemes for all employees which contributed not a little to
the tradition of long service in the company at all levels. To
work at Bibbys in Liverpool had somewhat of the stability of a
job in the civil service in a community where the depression
of the 1930s seemed only yesterday.
The LP program was not without its problems. In particular the
use of fixed point arithmetic on 32-bit numbers could cause
over- or under-flow in certain cases. Since the LP algorithm
used was independent of the scale of the intermediate results
such problems could be resolved when signalled by rescaling,
multiplying or dividing the offending row of the matrix by a
factor of 100, and remembering to rescale the result. This was
not automatic. If the LP program detected a need for a rescale
it would stop. If an operator was present he would load the
rescale paper tape program and try again!
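The manual procedure amounted to something like the following sketch. The word size, threshold and values here are illustrative; the real rescale was a separate paper-tape program applied to Zebra fixed-point words, not Python integers.

```python
# Sketch of the manual rescale: when a matrix row threatens to overflow
# the 32-bit fixed-point word, divide it by 100 and remember the factor
# so the final result can be corrected. All values are illustrative.
LIMIT = 2**31 - 1          # magnitude limit of a signed 32-bit word

def rescale_row(row, factor=100):
    """Divide every element of a row by `factor`; return row and factor."""
    return [v // factor for v in row], factor

row = [2_200_000_000, 1_500_000_000]   # one element past the limit
scale = 1
if any(abs(v) > LIMIT for v in row):   # overflow signalled - stop and rescale
    row, scale = rescale_row(row)
```

On the Zebra nothing of this was automatic: the program simply stopped, and an operator, if present, loaded the rescale tape by hand.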
As a new recruit who had read about floating point, I asked
why this was not used; it seemed to me to be obvious. I
learned about the restricted speed of the machine - 312
microseconds for a basic operation, but, with no hardware to
help, 20-40 milliseconds for the floating point operations.
The eight-hour shift would have stretched to nearly a month!
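The arithmetic behind that claim checks out, taking the figures quoted above and assuming, for the sake of the estimate, that every basic operation in an eight-hour shift would have become a software floating-point operation:

```python
# Back-of-envelope check: how long would an eight-hour fixed-point run
# take if every basic operation became a software floating-point one?
basic = 312e-6                  # seconds per basic fixed-point operation
fp_lo, fp_hi = 20e-3, 40e-3     # software floating point, in seconds
shift = 8 * 3600                # eight-hour shift, in seconds

ops = shift / basic             # operations performed in one shift
days_lo = ops * fp_lo / 86400   # roughly three weeks
days_hi = ops * fp_hi / 86400   # roughly six weeks
```

A 64- to 128-fold slowdown turns eight hours into three to six weeks, which is why fixed point, with its manual rescales, was the only practical choice.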
The other main application was the forecasting of sales of the
various animal feeds. These were seasonal and subject to
variations due to price fluctuations in the products such as
milk, bacon and eggs (although in the late fifties these were
smoothed out by massive government intervention through
product marketing boards). Sales forecasting used exponential
smoothing, a method that minimised the amount of historical
data that had to be kept.
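In Brown's simple form the new forecast is a weighted average of the latest observation and the previous forecast, so only a single number need be carried forward between runs. A minimal sketch follows; the smoothing constant here is invented, as I do not know what EXPS actually used.

```python
# Simple exponential smoothing after RG Brown: each week's forecast is
# alpha * (latest sales figure) + (1 - alpha) * (previous forecast).
# Only `forecast` survives between runs - no long history is needed.
def smooth(observations, alpha=0.2):
    forecast = observations[0]             # seed with the first observation
    for y in observations[1:]:
        forecast = alpha * y + (1 - alpha) * forecast
    return forecast
```

Older observations decay geometrically in weight, which is exactly what made the method attractive when all data lived on 5-channel paper tape.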
I am not sure who wrote our exponential smoothing program,
which was called EXPS and was based on an algorithm due to RG
Brown of Arthur D Little. I suspect that the version used when
I was at Bibbys was a joint production by Andrew Muir and
David Spurrell. Again it was written in Normal Code for speed
and space considerations. For each run the program and data
were read in on 5-channel paper tape.
As I said earlier, I had never seen a real computer before,
except in photographs. Again I was less than impressed. It
seemed to me that it had little relevance to the work I was
supposed to do, the design and analysis of animal nutrition
experiments. I was soon to be re-evaluating my position.
In fact the design of the experiments was the least of my
worries. How to lay out the experiments was well understood -
it was heavily constrained by the animal houses, long since
built, and the carefully bred strains of animals, genetically
similar to each other and hence capable of displaying small
differences in weight, milk yield or backfat when fed slightly
different rations.
It was the analysis that took the time. This was done using a
technique due to RA Fisher called "Analysis of Variance",
designed to pick up differences in response to a series of
treatments in the presence of background variability, such as
is always found in measuring any parameter of a living
population.
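The technique really is mostly adding and squaring: the total sum of squares is split into a between-treatments part and a within-treatments part, and the ratio of their mean squares gives the F statistic. Here is a minimal one-way sketch with invented data; my Anova program handled far more elaborate layouts.

```python
# One-way analysis of variance: partition the sum of squares and form
# the F ratio. Data and layout are invented for illustration.
def one_way_anova(groups):
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    # "Adding and squaring": between-treatments and within-treatments parts.
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    df_between = len(groups) - 1
    df_within = n - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Two invented treatment groups of three animals each.
f = one_way_anova([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0]])
```

A large F relative to its distribution under the null hypothesis signals a real treatment effect above the background variability.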
Analysis of variance on a typical experiment could take days.
The algorithm was not self-checking and although a desk
calculator was used the entering of the data and its
manipulation were severe tests of my accuracy. A full analysis
could easily take a week as I found it impossible to calculate
for more than about two hours at a stretch without introducing
errors.
My first few months at Bibbys were a nightmare of calculation.
I worked in a room with four engineers driven mad by the noise
of my electromechanical desk calculator. On several occasions
I inadvertently divided by zero, causing the mechanism to
enter a noisy loop which could only be broken by disconnecting
the mains, which jammed the machine. The manufacturer's
maintenance engineer had to be called to disassemble the cog
wheels and perform a hasty rebuild. Meanwhile the computer sat
upstairs for long periods waiting for the price information to
enable it to run the LP program or the final week's sales
figures to allow the EXPS program to run.
I discussed with the Company Statistician whether it would be
possible to write a program for analysis of variance. He was
discouraging. I had still not cleared up the backlog of
experiments to be analysed that he and Andrew Muir had allowed
to build up during the writing of the LP and EXPS programs.
They had regarded these programs as essential to the future of
the company: a 1% or 2% saving in raw material prices would
have an immediate effect on the profitability of the company.
It was difficult to see how detecting the significance of a
small increase in the effect of layers' mash could be
translated into profits.
Farmers were notoriously reluctant to change the feed they
used for their animals and certainly would be unlikely to pay
more for something with only a "scientifically" proven effect.
The major increases in yield in farming were a joint effect of
genetic programs and better formulations. Farmers could see
the effects of the genetics and could understand well the
effect of using a better breed of boar or bull, but the feeds
all looked the same, just hessian bags full of something
compounded by the miller.
During a rest from adding and squaring on the Muldiva I went
up to the eighth floor and talked to the third member of the
computing section, Tom Corless. Tom was an acknowledged
character known throughout the factory. He looked and sounded
like a Liverpudlian burglar and always wore black boots with
aggressive external steel toe-caps.
I asked Tom if the computer could add and square, the basic
operations for the analysis of variance. He reassured me. I
asked him if he could teach me how to program - Tom was
clearly not overjoyed with the idea of spending time with this
university graduate who did not, he quickly discovered, have
anything like his acquaintance with Chrystal's Algebra (1865),
his preferred lunchtime reading.
Both the Nutritionist and the Commodity Buyers were putting up
a spirited resistance to any further incursion of the Zebra
into their empires. The Bibby Zebra had always had a bad
record for reliability. Often the night shift, already subject
to stalling for rescaling, was aborted due to hardware error.
Suddenly a computer consultant appeared, called RH Williams.
He had been a computer salesman for Univac in the United
States, returned to his native Wales and set up a one-man
computer consulting company. As a salesman he was no doubt
successful but his knowledge of computers was restricted to
the Univac and he didn't feel it necessary to have any
technical expertise at all. As he explained it to me, he was
involved in strategy, not tactics. Rumour had it that he had
met one of the director Bibbys in a London club and was
invited in to resolve the computer problems we were suffering
at the time.
Considerable pressure was brought to bear on STC behind the
scenes and a replacement Zebra was supplied. It was hoped that
better reliability would overcome the objections of the user
groups.
RH Williams was in charge of "Acceptance Tests" for the Zebra
two and quickly displayed his lack of technical knowledge,
instructing Tom to get it to multiply two numbers together
after having removed the software multiplication routines. I
spent some time talking to him and was underwhelmed, to such
an extent that I went to "my" Director, Ben Bibby, and
complained bitterly.
With the replacement machine installed, Williams displayed
some ingenuity in producing a new scheme for least-cost animal
feeds which could be run in parallel with the LP program. This
consisted of simply costing all previously made formulations
using current daily prices and printing off the 10 cheapest.
Since these existing recipes had all been previously accepted
by the Nutritionist there could be no dispute with him and he
could simply select the one to be made in the factory from the
candidates suggested by the computer. But, of course, the
formulations might not be the cheapest possible: the
Nutritionist could still explore other possibilities and
suggest another recipe that could be added to the history
file.
The problem with this method was the volatility of the
commodity prices and the appearance of new raw materials to
use in animal feedstuffs. This naturally generated new recipes
and the history file, again kept on 5-hole paper tape, grew
and grew. Eventually it exceeded three feet in diameter and
had so much inertia that it had to be spun by hand at the
start of the program to avoid being torn as the first recipe
was read. The size of the cartwheel of paper tape was so great
that even the company directors could see that the whole thing
had got out of hand.
For whatever reason, RH Williams' contract was terminated. He
later came to public notice when he sued the clearing banks
unsuccessfully for royalties on the OCR numbers printed on the
bottom of cheques. David Spurrell resigned at around the same
time, and I succeeded him as Company Statistician.
I had my own interest in Zebras one and two. Although I had
not quite cleared the backlog of experimental analyses, I
persuaded the Chief Engineer to let me go on a programming
course at STC in Newport.
This was a personal disaster at first. There was no real
attempt to explain the basic ideas behind programming: we
were, instead, led through the coding of what would now be
described as the operating system! This was definitely not
what I wanted nor could I understand what was going on.
In the second week we were introduced to "Simple Code", the
interpreter which caused the Zebra to execute something very
close to Edsac code. This was more what I was after and I
began to see how to do real work on the machine. What is more,
the default arithmetic type in Simple Code was floating point!
(The Analysis of Variance, being concerned very largely with
squaring and adding, caused overflow very quickly in fixed
point).
I spent many evenings at home struggling with programming and
eventually got the hang of it. I started off with the simpler
things and wrote programs for mean, standard deviation, linear
regression and Pearson correlation coefficient.
Analysis of Variance was my target and eventually, after
attending a conference and getting some clues from JC Gower of
Rothamsted, I managed to get the algorithm straight and
programmed and debugged it. This was my first major program,
called Anova. I was immensely helped by discussions with Tom
Corless, whose capacity for abstract thought while lying full
length on the computer room floor never ceased to amaze me.
This business of lying full length was often resorted to. A
print-out of the program had to be done on a telex machine
which used rolls of unperforated paper. These curled up unless
weighted down and for programs of any size, say eight feet of
print-out, horizontal on the floor was the best way to read
them.
Like telex rolls, 5-hole paper tape was not a friendly medium.
Apart from the fact that you could easily cut yourself on the
edges of the tape and that short lengths would undulate snake-
like along the floor until absorbed by the fans revolving
silently inside the operator's desk pedestal, a mistake in
punching a tape was a disaster.
Did one punch the tape again? No, one did not. One cut out the
offending frame(s) and spliced in corrected frames, using a
jig, transparent tape and emery boards - the sort used by
comp-girls to file their finger nails. The emery boards were
used to rough up the sellotape to enable the friction rollers
to get a grip as the reader took on the spliced join. The
cylindrical knives that cut out the confetti from the tapes -
in demand for weddings - sometimes, after prolonged use, would
produce the dreaded "trap-door" fault, where a hole would be
opened or closed randomly on successive reads, the chad
remaining connected to the edge of the hole by a tenuous but
strong fibre.
I was, with startling suddenness, put in charge of Zebra two,
the comp-girls and Tom Corless, a transition I was quite
unprepared for. What was worse was that the position of the
full scale implementation of the results of the daily LP runs
and the weekly EXPS run was now untenable. The combined forces
of the Nutritionist and the Commodity Buyers, who had no
intention of allowing their grip on the company to be loosened
by a computer, had forced a stalemate. I did not understand
either the LP or the EXPS programs, which Tom now ran on the
Zebra - and I wasn't interested in them either. It seemed to
me a lost cause, far too difficult to explain to the company
directors.
I can't remember much of the year I spent running Zebra two. I
wrote a program which worked out targets for salesmen from the
Ministry of Agriculture's census returns of animals in the
various counties of England and Wales. All this did was cause
alarm and despondency in the sales force; it made no allowance
for the density of the populations or the average size of
herds. Salesmen visited farms, not the animals themselves. I
comforted myself with the thought that at least I hadn't
specified the problem, only programmed a solution dictated by
the sales manager.
It was in this period that I became interested in nomograms
and simple devices for analogue computation, circular and
spiral sliderules. With the aid of Zebra two I wrote a program
to design circular sliderules for special purposes and this
was converted into a "give-away" circular slide-rule which
worked out the cost of producing a dozen eggs given a few
parameters, including the cost of a ton of Bibby's Layers'
Pellets. This was a great success and several thousand were
distributed. I have always regretted not having kept one!
I ran the Anova program regularly and extended it to handle
covariance. My work on analysis of experiments now took 40
minutes on a Monday morning and I had the rest of the week to
think of other things. I ran all the old, hand-calculated,
analyses through to check out the program and found several
discrepancies in old reports. In every case these proved to be
mistakes in the human computations, usually slips in decimal
point location.
By now I had discovered the literature of computing, such as
it was, and had decided that there was a need for something
easier than Normal Code and more flexible than Simple Code
which would cope with commercial problems. (Simple Code, like
the Edsac machine code on which it was modelled, had been
designed to facilitate matrix computation.)
I found literature on Algol and went on a Zebra Algol one-day
course. We were promised an Algol interpreter for the Zebra,
which I believe was being written in the Netherlands. The
instructor refused to answer, or even discuss my only question
- how do you read and print one character? I had discovered
the theology of computer languages! In Algol, of course,
input-output is never mentioned; it is a bit like discussing
republicanism at a Royal Garden Party.
It was a case of back to the library, where the Bibbys
librarian helped me to obtain various research reports from
America. This led me to the early work on Cobol - I think it
must have been, at this stage, without a year suffix, but the
report I have is dated 1958. This seemed to be what I wanted
but I was daunted by its size.
After some struggling I decided to subset it and write a
translator for Zebra. I christened my new language Intrinsic,
which was Bibby's telex answerback. After some further reading
I decided to write it in its own language and bootstrap it up
after the model of Halstead's Neliac, which was a quick and
dirty attempt to produce an Algol clone. Halstead worked at
the Naval Electronics Laboratory in the USA and the "iac"
stood, I think, for International Algebraic Compiler. (The
original name for what became known as Algol was IAL -
International Algebraic Language.)
It was later that summer that the Chief Engineer found out
what I was doing and gently told me to stop. This was too much
and I started applying for other jobs - after all I could now
call myself a compiler writer and talk about dynamic own
arrays. I was sorry to say goodbye to Zebra two - like a first
love I still have fond memories.
What happened at Bibbys? Years later, I tried to recruit Tom
Corless to work for me at Trent Polytechnic, Nottingham. He
reminded me about the non-contributory pension - I think he had
about seven years to go at Bibbys. And he told me that he had
translated my Anova program for the IBM that succeeded the
Zebra. I was very pleased: by then the algorithm was 20 years
old. I asked him about Linear Programming. "They don't do that
anymore", he said.