17 - RESEARCH & STRUCTURED PROGRAMMING

Biographical notes
by BERNARD A HODSON

Those without a technical or computer background may wish to skip this chapter, as it describes, in more detail than earlier chapters, some of the research projects conducted at the University and elsewhere.

One of the persons I hired from the UK was Richard Morgan, who came with his wife Pamela, who also worked for us. Richard is now actively involved with the Parliamentary computer systems in London, England. Richard had a degree in Classics and decided he would like to investigate the Minoan Seals to see if he could interpret them, something which had escaped Classicists up to that time. The Seals were all that seemed to be left of a civilisation which had existed in the Mediterranean. He was able to get documentation on the Seals, including pictures, and then created a database of the Seal features to see if, by searching for patterns and linkages, he might obtain an idea of what they meant. That was about as far as the research went.

Richard and Pamela also worked on a computer-assisted learning project with two medical doctors (pathologists) from the Faculty of Medicine, Drs. Wyatt and Bowden, on how students might use a computer in the study of haemoglobin. The doctors spent close to a year developing the material, and Richard and Pamela developed the on-line program for the students to use. The two doctors wrote a paper concluding that the technology available at that time did not lend itself well to on-line education. I had reached the same conclusion in developing a version of the Hodson-Turing approach for the teaching of engineering.

The two doctors were expert in emphysema, a disease of the lungs often caused by smoking, which blackens the lungs and reduces their capacity. I worked with them to try to build a three-dimensional model of the lung from slices of the lung. Again the technology was not ripe for this activity, which can now be done routinely with software and graphical capability that was not available to us at the time.

Another interesting project, a forerunner of desktop publishing, was carried out by Steven Skelly. Steven was with the Faculty of Law and during his two-year probation had become so interested in computers that the Faculty said he was more interested in computers than in law and did not renew his contract at the end of the probationary period. Knowing his potential I hired him as a Research Associate, and was soundly criticised by the Vice President Academic for hiring someone who had been let go by another group; fortunately, he could do nothing about it.

Steven had been looking at the possibility of computerising the Statutes of Manitoba, and persuaded the Provincial authorities to let him put new Statutes on the computer as they were being developed. In this way the whole legislative process could be speeded up. The Province agreed and we proceeded. As a Statute was debated, each amendment was computerised and the latest version, annotated with the changes, was given to the legislators. It worked admirably. Not only that, but we were able to get funding from the Federal Canada Council equal to about three times the salary I was paying Steven, so it was a profitable venture as well.

Steven then started putting the existing Statutes on the computer so that searches could be made to find out whether a proposed new Statute might adversely affect any of the existing ones. He arranged to co-operate with a software company in Florida that was experimenting with desktop publishing.

A couple of years into the project I was in Toronto airport and started talking to a couple of people from the Department of Justice in Ottawa. They told me they were going to the USA to find out what was being done in the computerisation of legal documents. I told them they were going to the wrong place. They eventually hired Steven, who finished up as Senior Assistant Deputy Minister (i.e. second in command) of the Department of Justice.

On the University computer we had, at the time, only a limited amount of memory, initially 128k (k=1024) and later 256k words, a very small memory by today's standards. We had, therefore, to be able to flip applications in and out to maximise the use of the computer. Carol Abraham, a Romanian who had joined us from CSIRO, the Australian research organisation, developed one of the world's first roll-in/roll-out systems, something needed by any system shared between many applications. There are large differences between the time a computer takes to do computations and the time it takes to write something to disc or other output media. Carol's work checked an application to see if it might be waiting for data to be input (another slow process) or was placing data on an output device. If either was in progress he rolled out the application and rolled in another. At a later time he would check the status of the rolled-out application and, depending on its priority, roll it back in if it was ready to do further computation. This could be done with several applications in progress; it was an essential need of early computers and is still a technique used in today's systems. He later did further work on time-sharing systems.
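
The flavour of Carol's scheme can be shown in a latter-day sketch (Python is my choice here purely for illustration; the job names, priorities and two-slot core are invented, and the original was of course written for the machine itself):

    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Job:
        priority: int                      # lower number = more urgent
        name: str = field(compare=False)
        waiting_on_io: bool = field(default=False, compare=False)

    def schedule(in_core, rolled_out, core_slots=2):
        """One pass: roll out jobs stalled on slow I/O, then roll ready
        jobs back in, best priority first, while core space allows."""
        still_in_core = []
        for job in in_core:
            if job.waiting_on_io:
                rolled_out.append(job)     # roll out: its core is freed
            else:
                still_in_core.append(job)  # keep computing
        for job in sorted(j for j in rolled_out if not j.waiting_on_io):
            if len(still_in_core) >= core_slots:
                break
            rolled_out.remove(job)
            still_in_core.append(job)      # roll back in to compute
        return still_in_core, rolled_out

    in_core = [Job(2, "payroll", waiting_on_io=True), Job(1, "squares")]
    out = [Job(0, "statutes")]
    in_core, out = schedule(in_core, out)
    print([job.name for job in in_core])   # ['squares', 'statutes']

Each pass frees the core held by jobs stalled on slow input or output and hands it to the highest-priority job that is ready to compute, which is the essence of roll-in/roll-out.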

Working with the Head of the Mathematics Department I developed a program to calculate non-cyclic Latin Squares of high order. A cyclic Latin Square is easy to develop, a typical one being:
1 2 3 4
2 3 4 1
3 4 1 2
4 1 2 3

where you notice that each number appears exactly once in every row and column.

A non-cyclic one is more difficult to find, a simple example being:
1 3 4 2
3 1 2 4
2 4 3 1
4 2 1 3
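
Both notions are mechanical enough to sketch (my illustration in Python, not the original program): build a cyclic square of any order by rotation, and check the Latin property of any candidate square.

    def cyclic_square(n):
        """Row i is the sequence 1..n rotated left by i places."""
        return [[(i + j) % n + 1 for j in range(n)] for i in range(n)]

    def is_latin(square):
        """True if each of 1..n appears exactly once per row and column."""
        n = len(square)
        want = set(range(1, n + 1))
        return (all(set(row) == want for row in square) and
                all(set(col) == want for col in zip(*square)))

    for row in cyclic_square(4):           # reproduces the cyclic example
        print(*row)
    non_cyclic = [[1, 3, 4, 2], [3, 1, 2, 4], [2, 4, 3, 1], [4, 2, 1, 3]]
    print(is_latin(non_cyclic))            # True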

For some reason the US Air Force were interested in high-order non-cyclic Latin Squares, with upwards of 25 rows and columns, and gave the Department of Mathematics research funds to examine this problem.

The hard way is to do it solely by computer, but even today, if the problem were tackled without any insight, it would take scores of hours to compute. My colleague, with his mathematical knowledge, had developed the knack of knowing roughly what might be a useful starting point. Even with this knowledge it took a considerable time for the non-cyclic Latin Squares to be computed, and a run would last for hours on the fastest computer in Canada at that time, an IBM 7090 at the University of Toronto. I had to print out a message after every few million iterations, telling the computer operator that it was still running satisfactorily. As it happened I made an error in the program (which did not affect the results, which were self-evident) that proved the Fortran compiler being used did not follow specifications. This was actually to our advantage, as we had inadvertently generated a negative index on a DO loop, which at that time was unacceptable under the Fortran standard.
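
In present-day terms the program was a backtracking search. The sketch below is my reconstruction of the idea in Python (the original was Fortran on the 7090): it completes a partially filled square cell by cell, undoing choices that lead to dead ends, and prints the periodic reassurance mentioned above. The seed row stands in for my colleague's mathematically chosen starting point, and the rejection of cyclic solutions is omitted.

    def complete(square, n, counter):
        """Fill the first empty cell with each legal value in turn,
        recursing; undo the choice and back up when nothing fits."""
        for r in range(n):
            for c in range(n):
                if square[r][c] == 0:                  # first empty cell
                    used = set(square[r]) | {square[k][c] for k in range(n)}
                    for v in range(1, n + 1):
                        if v in used:
                            continue
                        counter[0] += 1
                        if counter[0] % 5_000_000 == 0:
                            print("still running satisfactorily...")
                        square[r][c] = v
                        if complete(square, n, counter):
                            return True
                        square[r][c] = 0               # backtrack
                    return False                       # dead end
        return True                                    # square is full

    n = 5
    seed = [[1, 3, 5, 2, 4]] + [[0] * n for _ in range(n - 1)]
    if complete(seed, n, counter=[0]):
        for row in seed:
            print(*row)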

While at Imperial in Calgary I had been asked to predict how much more oil and gas might be found in Canada, given the successes to date. This involved establishing all the information we considered valuable in the search for oil and creating a set of equations in many variables. Up to that time most statistical work had been done with what are known as linear statistics, but for the problem I was investigating we had to use non-linear statistics. By manipulating the variables we could come up with what was known as a local optimum result. In three dimensions you can think of an optimal solution as lying at the bottom of a large soup bowl, though it is difficult to picture this in more than three dimensions. The problem is that there might be another and better soup bowl some distance away, so I had to "step out" the variables I had used to see if a better soup bowl existed. I solved this problem by brute force on the computer and considered publishing it as the "Hodson soup bowl" method, but thought it too crude. It is unfortunate that I didn't, because a few years later a well-known non-linear statistician called Box published the exact same method of handling such problems.
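
In modern language the "soup bowl" method is multi-start local optimisation: descend to the bottom of the nearest bowl, then step each variable out by some distance and descend again, keeping the deepest bowl found. A minimal sketch on a made-up two-bowl surface (the function, step sizes and crude coordinate descent are illustrative only, not the statistics we actually used):

    import itertools

    def local_descent(f, x, step=0.1, tol=1e-6):
        """Crude coordinate descent: nudge one variable at a time
        downhill, halving the step when no nudge helps."""
        x = list(x)
        while step > tol:
            improved = False
            for i in range(len(x)):
                for d in (step, -step):
                    trial = x[:]
                    trial[i] += d
                    if f(trial) < f(x):
                        x, improved = trial, True
            if not improved:
                step /= 2
        return x

    def soup_bowl(f, x0, step_out=2.0):
        """Descend, then re-start from points 'stepped out' in every
        direction, keeping the deepest bowl found."""
        best = local_descent(f, x0)
        for offset in itertools.product((-step_out, 0.0, step_out),
                                        repeat=len(x0)):
            candidate = local_descent(f, [b + o for b, o in zip(best, offset)])
            if f(candidate) < f(best):
                best = candidate
        return best

    # Two bowls; the deeper one is centred on (3, 3).
    f = lambda x: min(x[0]**2 + x[1]**2,
                      (x[0] - 3)**2 + (x[1] - 3)**2 - 1)
    print([round(v, 3) for v in soup_bowl(f, [0.0, 0.0])])  # near [3.0, 3.0]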

Shortly after arriving at the University I was asked if I would be willing to act as an expert witness in a possible lawsuit in which one group of oil companies in Alberta was accusing another group of stealing their oil by directional drilling. In this situation an oil well can be drilled from outside the boundaries of a known oil field to tap into the oil in that field, in the same sense that electricity can be stolen by a neighbour tapping into your mains connection. The potential expert witnesses for the other side were located at the University of Alberta in Edmonton. It would have been an interesting experience, but the lawsuit was dropped for some reason.

Another project in Calgary was to optimise our selection of leaseholds. At that time a company could get a permit to explore for oil over a number of square miles. If they were successful in finding hydrocarbons they could apply for permits to drill for oil on a production basis, but only on half the land they had previously leased. Moreover there were strict rules governing this allowable fifty percent. You could establish a lease on certain rectangular pieces (e.g. a rectangle two miles by three miles, or a square one mile by one mile). These areas could abut at a corner but could not be side by side unless there was a one-mile separation between them. The principal object of these rules was to allow bidding by other oil companies for the sections you were relinquishing.

The technique was to assign values to the different segments being considered for production leases and then use a computer program to assess the best strategy. At first we tried linear programming, but its results would say to take a fraction of this area and another fraction of that area; the trouble was that you had to take a complete area, not just a bit of it. Initially, therefore, we used Monte Carlo techniques (where the computer throws dice randomly for several million throws) to arrive at a probable best strategy. A mathematician called Dantzig became interested in the problem we had and developed a technique known as "integer programming", in some ways similar to the linear programs we were using to optimise refinery operations, but allowing integer rather than fractional solutions.
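
The Monte Carlo idea can be shown on a toy example; everything below is illustrative (a small made-up grid, random section values, and a simplified rule that kept sections may touch only at corners), not the real lease regulations:

    import random

    N = 6                                          # toy 6 x 6 township grid
    random.seed(1)
    value = [[random.random() for _ in range(N)] for _ in range(N)]

    def random_legal_selection():
        """Shuffle the sections, then greedily keep any that is not side
        by side with one already kept, up to the fifty percent limit."""
        cells = [(r, c) for r in range(N) for c in range(N)]
        random.shuffle(cells)
        picked = set()
        for r, c in cells:
            if len(picked) == N * N // 2:          # half-the-land rule
                break
            if {(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)} & picked:
                continue                           # side by side: rejected
            picked.add((r, c))
        return picked

    best, best_value = None, -1.0
    for _ in range(50_000):                        # throw the dice
        selection = random_legal_selection()
        worth = sum(value[r][c] for r, c in selection)
        if worth > best_value:
            best, best_value = selection, worth
    print(round(best_value, 3), sorted(best))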

Various members of my staff did research in the half day a week they were allowed for it. Don Costin did some work on digitising voice from tape, Alistair Nichol developed a French thesaurus, Karl Schmidt worked on the two-dimensional presentation of three variables, and I aided a Japanese scientist in the study of mites and other insects in grain storage bins. Another staff member developed techniques of cluster analysis, renting a rowing boat to do his study on Lake Winnipeg (I had difficulty getting that through the Comptroller). Yet another did a simulation of the grain drying operation at the Lakehead, at the northern edge of Lake Superior (grain bins are very dusty, and I had further difficulty justifying the individual's cleaning bills with the Comptroller). Much more was done; suffice it to say that we all had a lot of useful fun.

The computer was on the sixth floor of the engineering building. Nearby was a 750,000-volt transformer, and on the roof above us was a radar laboratory. Both could potentially have affected computer operations with their radiation, but they never did. One experiment conducted above us was to see if radar towers would affect animals, in which a number of chickens were subjected to radar waves. This was a concern of many people at the time, but I do not believe any effect of radar or microwaves on animals was ever established.