3 - THE GENETIX CONCEPT
by BERNARD A HODSON
One of the objectives of these sections is to trace the history of a concept which could replace the computer as we know it today, and which might well have been adopted at the very beginning of the industry had it not been for the development needs of the atomic bomb. The concept has been maturing over a long period of time, first being proposed in hardware by Alan Turing in 1936, and then being developed in software by Hodson from 1955 onwards.
Since the beginning, our approach to computers for application development has been based on the concept of a stored program (a set of instructions placed within the computer memory by a programmer). These sets of instructions are generated for each and every application, and are rarely compatible from one application to the next. Databases, spreadsheets and other packages cover a spectrum of specifically oriented applications, but are still based on the now obsolete software technology of the von Neumann perspective. These packages, along with the operating systems that run them, demand ever more resources: memory, disc space and chip speed. Current software is so complex and uncoordinated in its structure that it is highly unreliable: subject to frequent crashes, insecure, frustrating to the everyday computer user, and vulnerable to viruses, worms, identity theft, spam and other nefarious acts.
GENETIX was the name given to the software equivalent of Turing's ideas on hardware, and it is an entirely different concept from that which grew out of von Neumann's work. It holds that any and all applications can be created and run with a single piece of hardware (Turing, via an infinite paper tape) or with a single small piece of software (Hodson). Current technology, based on von Neumann concepts, needs hundreds of millions of computer instructions to do what Hodson's concepts can do with a few dozen.
The GENETIX concept also says that this small piece of software could replace all current operating systems (the traffic policemen of computers). With this approach it becomes possible to replace the computer as we know it today with a small chip containing just a small amount of conventional code. This chip, when developed, will fulfil Turing's 1936 dream, replacing the infinite tape he envisaged with a very small segment of machine code. Ultimately this tiny code will be all that any organisation requires to develop and run all its applications. Acceptance of the concept will save organisations millions in computer hardware and software purchases, as well as in the costly development and maintenance of applications. Moreover it will save the IT industry more than $46B annually (2004 figures), the estimated current cost of spam, viruses, hackers and other nefarious activity.
The ideas presented are not theoretical: such a system has been built to demonstrate the concept, and it works quite effectively. Amongst many early applications, it was used to run a pilot operation for a credit system similar to that of Dun and Bradstreet (using a pilot file of 40,000 records), to develop a pilot hospital information system similar to that developed by Lockheed Aircraft and Missile Systems for the Mayo Clinic, and to build a document storage and retrieval system similar to that used worldwide by Exxon in the USA. These early experimental versions showed the concepts to be viable. They also proved an ideal tool for teaching the concepts of computers and systems analysis, not only to students in universities and colleges but also to senior executive staff on advanced management courses in both Canada and the United States.
The first part of the concept involves defining what is known as a virtual machine with a very limited instruction set, which performs basic operations such as adding or subtracting numbers, comparing strings of text, moving strings of text about within the computer, drawing a line on a graphics screen, communicating with a telephone, sending an email, opening a gate, sounding an alarm, and so on.
The second part of the concept is that it is very easy to create the equivalent of this virtual machine on any real machine. This means that any application developed for the virtual machine should run successfully on all computers.
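The two ideas above can be sketched in a few lines of conventional code. The instruction names and the tuple encoding below are invented for illustration, not drawn from any actual GENETIX implementation; the point is only that a very limited instruction set can be interpreted by a small piece of host software, and that the same program then runs wherever that interpreter runs.

```python
# Illustrative sketch: a virtual machine with a tiny, fixed instruction set,
# interpreted by a few dozen lines of host code. Porting the "machine" to a
# new computer means rewriting only this interpreter, never the programs.

def run(program, memory):
    """Interpret a list of (opcode, operands) tuples against a dict memory."""
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "ADD":        # add two cells, store the result in a third
            a, b, dest = args
            memory[dest] = memory[a] + memory[b]
        elif op == "MOVE":     # copy one cell to another
            src, dest = args
            memory[dest] = memory[src]
        elif op == "CMP":      # compare two cells, store True/False
            a, b, dest = args
            memory[dest] = (memory[a] == memory[b])
        elif op == "HALT":
            break
        pc += 1
    return memory

# The same program text runs unchanged on any host with this interpreter:
program = [
    ("ADD", "x", "y", "sum"),
    ("MOVE", "sum", "total"),
    ("HALT",),
]
state = run(program, {"x": 2, "y": 3})
print(state["total"])   # 5
```

An interpreter this small is straightforward to recreate on any real machine, which is what makes an application written once for the virtual machine portable to all of them.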
The third aspect of the concept is that any application, no matter how complex, can be divided into a relatively small set of basic functions, which we call 'software genes'. This small set of software genes, which can be considered as building blocks, can be found in every application (however sophisticated). By linking these building blocks in different patterns a user can create a home accounting system, a communications package, a word processor or an airline reservation system operating on a worldwide network. To build these applications with current technology requires hundreds of millions of computer instructions, but no machine code is ever generated with GENETIX, the single computer program doing it all.
The building blocks or software genes are simply data elements stored in the memory bank of the computer, each of which is given a name. The number of software genes is quite small, numbering just a few hundred. With this small set of building blocks most of the applications currently running on the world's computers can be built. Once these software genes have been created they can be used on any computer that has an emulation of the virtual machine. The virtual machine is not dependent on any current computer architecture or any particular operating system, so will be able to run on any future computer that may be built. In fact an examination of even the most sophisticated computer applications shows they can be constructed with just a few software genes.
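The linking idea can be illustrated as follows. The gene names and the registry scheme here are hypothetical, invented purely for this sketch: genes live as named entries in a table, and an "application" is just a particular way of linking them together by name, with no new machine code generated for either application.

```python
# Illustrative sketch only: "software genes" as named building blocks in a
# shared table, linked in different patterns to form different applications.

GENES = {
    "store":  lambda mem, key, val: mem.__setitem__(key, val),
    "fetch":  lambda mem, key: mem[key],
    "add":    lambda a, b: a + b,
    "render": lambda label, val: f"{label}: {val}",
}

def gene(name):
    """Look a building block up by name in the shared gene table."""
    return GENES[name]

# Two different "applications" assembled from the same four genes:
mem = {}
gene("store")(mem, "rent", 900)
gene("store")(mem, "food", 250)

# A home-accounting total...
total = gene("add")(gene("fetch")(mem, "rent"), gene("fetch")(mem, "food"))
print(gene("render")("monthly outgoings", total))   # monthly outgoings: 1150

# ...and a one-line report, from the same building blocks linked differently.
print(gene("render")("rent", gene("fetch")(mem, "rent")))   # rent: 900
```

Because the genes are named data, not compiled code, the same table can travel to any computer that carries an emulation of the virtual machine.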
The future impact of GENETIX will be the elimination of the computer as we know it today and its replacement by plug-compatible peripheral devices, each containing copies of the software genes. A user will simply buy the peripherals needed and plug them together. Future applications will then be delivered on some type of credit card, also containing the same set of software genes.
In summary, just as an almost infinite variety of human beings is created from a large but finite number of genes and chromosomes, so future computing will be achieved, by analogy, with software genes, although the number of software genes will be considerably smaller than the number of human genes.
The first versions of GENETIX defined the genes in terms of small conventional machine language modules. Recent developments of the concept have replaced those early ideas with a numerically structured system using almost no machine code. This new architecture, even more powerful and versatile than GENETIX, is described in a paper called 'A New Type of Computing', which can be found at the beginning of this web site.