The job I accepted was that of "Head of Systems", and our computers were mainly located in an old factory that had previously been used for the manufacture of brassieres. The hardware was primarily from Digital Equipment (DEC 10s and smaller units), with some satellite imaging hardware from General Electric called the GE 100. There were also a variety of peripheral devices, such as an early film scanner: you placed a photograph on it, and the photograph was scanned and converted to "pixels", digitised spots from the image, each a numerical value of what was seen. A scanned image would have many tens of thousands of pixels, with each number representing a scanned grey level (in later scanners there was a number for each of the primary colours).
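
The idea of a digitised image can be sketched in a few lines of code. The grid below is a made-up 4x4 "scan", with each pixel holding a grey level from 0 (black) to 255 (white); a real scan, as noted above, would hold many tens of thousands of such values.

```python
# A scanned photograph becomes a grid of numbers: one grey level per pixel.
# This tiny 4x4 "image" is a hypothetical illustration (0 = black, 255 = white).
image = [
    [ 12,  40,  40,  12],
    [ 40, 200, 200,  40],
    [ 40, 200, 255,  40],
    [ 12,  40,  40,  12],
]

# Each pixel is just a value, so questions about the image become arithmetic.
total_pixels = sum(len(row) for row in image)
brightest = max(max(row) for row in image)
print(total_pixels)   # 16 pixels in this toy example
print(brightest)      # 255
```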

The data we had to process was primarily from satellites in what was known as the Landsat series, but we also processed data from the atmospheric NOAA series of satellites. Landsat data was received at a satellite receiving dish in Prince Albert, in the province of Saskatchewan. As the satellite passed overhead, approximately every 90 minutes, it would transmit the data it had gathered on that orbit. This dish could accept data gathered over most of Canada, with the exception of a small area in eastern Canada. At a later stage a dish was erected at Shoe Cove, in Newfoundland, to remedy this situation; it also covered a greater part of the Atlantic Ocean. I visited Shoe Cove on occasion.

There is one major difference between data generated in commercial enterprises and data generated by satellites. Commercial data is usually separated into manageable chunks, with data separation markers between chunks, making it relatively easy to process. Satellite data comes down as a continuous stream of "bits" describing the image as seen by the satellite. If the receiving hardware misses one of these bits, the entire image can be distorted. To counter this, special computer programs were developed to check whether any bits appeared to be missing, but I will not describe the technique in this section.
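
One simple way such a check can work (a sketch only, not necessarily the technique actually used) is to precede each fixed-length scan line with a known synchronisation pattern: if a bit is dropped anywhere, every later sync word lands in the wrong place and the damage is detected. The sync word and line length below are invented for illustration.

```python
# Hypothetical frame layout: an 8-bit sync word followed by a 16-bit scan line.
SYNC = "10101011"
LINE_BITS = 16

def check_stream(bits):
    """Return the indices of scan lines whose sync word is out of place,
    which indicates that bits were lost somewhere upstream."""
    frame = len(SYNC) + LINE_BITS
    bad = []
    for i in range(0, len(bits) - frame + 1, frame):
        if bits[i:i + len(SYNC)] != SYNC:
            bad.append(i // frame)
    return bad

good = (SYNC + "0" * LINE_BITS) * 3        # three intact frames
print(check_stream(good))                  # [] -> nothing flagged

damaged = good[:30] + good[31:]            # one bit dropped mid-stream
print(check_stream(damaged))               # [1] -> frame 1 flagged
```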

Any picture can be digitised to bits and, once digitised, the image can be manipulated. As one rather mundane example, if you take a picture in a dark garage, or of a submarine in the ocean depths, the resulting picture will in most cases appear black. By converting it to bits you can then manipulate those bits so that you can "see" the contents of the garage as if it were daytime, or the submarine as if it were on the surface. Today the son of one of my former bosses has developed a patented technique by which any black and white image can be converted to the colours it originally had. To date no one has exploited this technique of his, which could be used to generate colour movies from black and white film stock, or colour photos from black and white images.
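
A minimal sketch of the kind of manipulation described (illustrative only, not the patented technique mentioned above) is a linear contrast stretch: the narrow range of grey levels in a nearly black image is spread across the full 0-255 scale, making the hidden detail visible.

```python
def stretch(pixels, out_max=255):
    """Linearly rescale grey levels so the darkest pixel becomes 0
    and the brightest becomes out_max."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return pixels[:]          # flat image: nothing to stretch
    return [round((p - lo) * out_max / (hi - lo)) for p in pixels]

dark_garage = [2, 5, 3, 8, 4, 6]  # nearly black: values crowded into 2..8
print(stretch(dark_garage))       # [0, 128, 42, 255, 85, 170]
```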

A Landsat satellite image consists of several million pixels representing reflection from the earth in different spectral bands (a spectral band is energy reflected at a specific wavelength of interest). The image represents the accumulation of data gathered from small areas on the earth's surface. Originally the area scanned was divided into fairly large squares, so that you could only see gross patterns on the earth, such as fields and parking lots. As the technology developed these squares became smaller and smaller (and in consequence the amount of data gathered became larger and larger), until the satellite received an image from squares just a few metres across, enabling it to identify buildings, cars, boats and even people. Today even this has been reduced. Depending on the values from the spectral bands (the number of bands increased through the years), maps could be drawn, pollution identified, forest fires monitored, desert growth plotted, and crop health determined. The satellite data became strategic, in that countries such as Canada could determine the expected wheat crop around the world and plan their sales strategy accordingly.
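
A well-known example of how crop health can be determined from spectral bands is the normalised difference vegetation index (NDVI), which contrasts near-infrared and red reflectance per pixel: healthy vegetation reflects strongly in the near-infrared, so higher values suggest healthier crops. The reflectance values below are made up for illustration.

```python
def ndvi(nir, red):
    """Normalised difference vegetation index from two spectral bands:
    near-infrared (nir) and visible red (red) reflectance."""
    return (nir - red) / (nir + red)

healthy_field = ndvi(nir=0.50, red=0.08)   # strong NIR, weak red
bare_soil     = ndvi(nir=0.30, red=0.25)   # bands nearly equal

print(round(healthy_field, 2))   # 0.72 -> vigorous vegetation
print(round(bare_soil, 2))       # 0.09 -> little or no vegetation
```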

In the early days Landsat was generating about 10 billion bits of data each day, all of which had to be stored somewhere; nowadays the number is much larger. It was initially stored on magnetic tape, but magnetic tape deteriorates over time, the deterioration being particularly acute if the tapes are not rotated periodically, about every three months. It is important that the data be stored, because many groups wish to see how the earth has changed over a period of years (e.g. how deserts are growing in size worldwide). Another problem is that spectral imagery is only useful when there is no cloud cover, so much of the data on tape is worthless unless you are interested in seeing the tops of clouds.
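
Some rough arithmetic puts the stated volume in perspective. Ten billion bits is about 1.25 gigabytes per day; a 2400-foot 9-track tape of that era held very roughly 40 megabytes, so a single day's data would have filled on the order of 30 tapes (the tape capacity is an approximation on my part, not a figure from the text).

```python
bits_per_day = 10_000_000_000          # "about 10 billion bits of data each day"
bytes_per_day = bits_per_day / 8
gigabytes_per_day = bytes_per_day / 1e9
print(gigabytes_per_day)               # 1.25 GB per day

tape_capacity_bytes = 40e6             # assumed ~40 MB per 9-track tape
tapes_per_day = bytes_per_day / tape_capacity_bytes
print(round(tapes_per_day))            # ~31 tapes per day
```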

For this reason a cataloguing system was developed which indicated not only the area surveyed and the date of the observation, but also the degree of cloud cover. Such data and images are available worldwide to those who have the necessary image processing capability. The data can now be transferred from magnetic tape to CD-ROM, but again there is the problem of deterioration over the years. CD-ROM is expected to last longer, but there remains the problem of frequent software change by the vendors: will the software of tomorrow be capable of processing the CD-ROM data of today?
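
The essence of such a catalogue can be sketched in a few lines: each scene records where and when it was taken and how cloudy it was, so a user can filter out the useless imagery before ordering tapes. The field names and records below are invented for illustration.

```python
# A hypothetical scene catalogue: area, observation date, and cloud cover.
catalogue = [
    {"scene": "A1", "area": "Prairies",  "date": "1976-07-02", "cloud_pct": 5},
    {"scene": "A2", "area": "Prairies",  "date": "1976-07-20", "cloud_pct": 90},
    {"scene": "B1", "area": "Maritimes", "date": "1976-07-11", "cloud_pct": 10},
]

def usable_scenes(area, max_cloud=20):
    """Scenes over the given area with acceptably little cloud cover."""
    return [s["scene"] for s in catalogue
            if s["area"] == area and s["cloud_pct"] <= max_cloud]

print(usable_scenes("Prairies"))   # ['A1'] -- A2 is mostly cloud tops
```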

The great need in image processing is the ability to process reams of data in parallel; doing it sequentially takes forever and a day. This was partially alleviated by the GE 100 previously referred to, but the need became more acute as the volume of data increased. To see if things could be improved, I was asked to investigate a new technology then being introduced: array processing. In the image processing world an array is a two-dimensional collection of data values, and the idea of the "array processor" was that part or all of an array could be processed simultaneously (in parallel, as we say) rather than sequentially, as is done in most computer systems. A company called CSPI, in Massachusetts, had developed such a technology, and I was asked to evaluate what they had produced. It turned out to work well and we purchased one, following which I wrote some computer programs to use it, including one to perform what are known as Fast Fourier Transforms (FFTs), a technique used for processing digitised images such as we obtained from the orbiting satellites.
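
For the curious, the Fast Fourier Transform can be shown in compact form. Below is a standard radix-2 Cooley-Tukey FFT, a sketch of the underlying algorithm rather than the actual array-processor code: it splits the input into even and odd halves and recombines them, reducing the work from order n-squared to order n log n.

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])           # transform the even-indexed samples
    odd = fft(x[1::2])            # transform the odd-indexed samples
    result = [0] * n
    for k in range(n // 2):
        # Combine the halves with the complex "twiddle factor".
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        result[k] = even[k] + twiddle
        result[k + n // 2] = even[k] - twiddle
    return result

# A constant signal transforms to a single spike at frequency zero.
spectrum = fft([1, 1, 1, 1])
print([round(abs(c), 6) for c in spectrum])   # [4.0, 0.0, 0.0, 0.0]
```

In an image context the same transform is applied along rows and then columns, which is exactly the kind of regular, repetitive arithmetic an array processor does well.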

The results were so interesting that I prepared a paper on the development, which was presented at the Digital Equipment Computer Users Society (DECUS) conference in the USA. It was well received, and I was invited to Socorro, New Mexico, to discuss the topic of array processors. The site was the location of the United States Very Large Array (VLA), which consisted of 27 antennas spread over several miles in the shape of a letter Y. With these antennas they scanned the heavens to pick up a variety of signals. Each antenna was connected to a central computer by what is known as a waveguide (the theory of which I had studied at university, though I had never seen one before). I did not expect to see ice in New Mexico, but the swimming pool at the motel where I stayed was covered with ice during the entire visit. Many years later I revisited the area, having another look at the VLA and also visiting the not-too-far-away severe storm research centre, where they tried to fathom the secrets of thunderstorms. Getting there involved traversing a road (if you could call it that) that was like a boulder-strewn pathway up a mountain.

I became interested in two topics of concern to the remote sensing community: one was compressing the satellite data received; the other was experimenting with different colour combinations in which the data could be portrayed, to achieve a better understanding of it. In addition to satellite data we also flew remote sensing aircraft with various sensors attached, to match what was seen at low level with what was seen from the satellite. I insisted that every one of my staff fly in the remote sensing planes, so they could see what difficulties had to be overcome in gathering the data. The planes flew at low level, often through heated air rising from the ground, making the journey quite turbulent, while operators had to use a variety of computer equipment on board to gather the data. One benefit of this work, as it happened, was to pinpoint potential new areas for growing grapes in the Niagara wine growing belt of Ontario. We prided ourselves that only one member of staff ever became airsick during this activity.
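
One simple way to portray single-band data in different colours (a sketch of the general idea, not necessarily the method I used) is "density slicing": each range of grey levels is assigned a display colour from a lookup table, so subtle differences in reflectance become vivid colour boundaries. The slice boundaries and colours below are invented for illustration.

```python
# Hypothetical lookup table: (upper grey level, RGB colour) pairs.
# Dark values display as blue, mid values as green, bright values as red.
TABLE = [(85, (0, 0, 255)), (170, (0, 255, 0)), (255, (255, 0, 0))]

def colourise(grey, table):
    """Map a grey level (0-255) to the colour of its density slice."""
    for upper, colour in table:
        if grey <= upper:
            return colour
    return table[-1][1]           # clamp anything above the last slice

scan_line = [12, 130, 240]
print([colourise(g, TABLE) for g in scan_line])
# [(0, 0, 255), (0, 255, 0), (255, 0, 0)]
```

Changing the table redraws the whole image in a new colour scheme, which is what makes it easy to try different combinations until the features of interest stand out.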

I was approached by the National Research Council to see if I could colour-process some radio frequency data they were receiving from a supernova. I agreed to do this for them, with very interesting results: the processing identified an area of star generation, as well as the shock wave interaction between the supernova explosion and the surrounding interstellar material. I was at first reprimanded by my boss for doing this, but I asked him what the purpose of our organisation was. When he replied "remote sensing" I asked him whether he considered a supernova to be sufficiently remote, at which point he shut up. The generated supernova image was almost three-dimensional, so much so that a member of the NRC staff produced a "petit point" rendering of the data, which was hung at the NRC offices.

I was, however, somewhat bored with what was mainly a hardware-oriented research activity and decided, after four years of this type of work, to get back to a more software-oriented environment.