Ontogeny Recapitulates Phylogeny

After Charles Darwin's On the Origin of Species was published, the German zoologist Ernst Haeckel stated that "ontogeny recapitulates phylogeny." By this he meant that the development of an embryo (ontogeny) repeats (i.e., recapitulates) the evolution of its species (phylogeny). In other words, after fertilization, a human egg goes through stages of being a fish, a pig, and so on before turning into a human baby. Modern biologists regard this as a gross simplification, but it still contains a kernel of truth.

Something vaguely similar has happened in the computer industry. Each new species (mainframe, minicomputer, personal computer, handheld, embedded computer, smart card, etc.) seems to go through the development that its ancestors did, both in hardware and in software. We often forget that much of what happens in the computer business, and in many other fields, is technology driven. The reason the ancient Romans lacked cars is not that they liked walking so much; it is that they did not know how to build them. Personal computers exist not because millions of people had a centuries-old pent-up desire to own a computer, but because it is now possible to manufacture them cheaply. We often forget how much technology affects our view of systems, and it is worth reflecting on this point from time to time.

In particular, it often happens that a change in technology renders some idea obsolete, and it quickly disappears. However, another change in technology may revive it. This is especially true when the change concerns the relative performance of different parts of the system. For example, when CPUs became much faster than memories, caches became important for speeding up the "slow" memory. If new memory technology someday makes memories much faster than CPUs, caches will disappear. And if a new CPU technology makes CPUs faster than memories again, caches will reappear. In biology, extinction is forever, but in computer science, it is sometimes only for a few years.
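To make the cache point concrete, here is a minimal C sketch; the 64 MiB buffer size and the particular strides are illustrative assumptions, not something from the discussion above. It performs the same number of memory touches with two different strides. On a typical machine, the stride that defeats cache-line reuse runs noticeably slower per access, although hardware prefetching can narrow the gap.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N        (1UL << 26)   /* 64 MiB buffer, larger than typical caches */
#define ACCESSES (1UL << 24)   /* same number of touches for every stride   */

static double ns_per_access(const char *a, size_t stride) {
    volatile char sink = 0;    /* volatile keeps the loads from being optimized away */
    size_t idx = 0;
    clock_t start = clock();
    for (size_t i = 0; i < ACCESSES; i++) {
        sink += a[idx];
        idx = (idx + stride) & (N - 1);   /* wrap around inside the buffer */
    }
    (void)sink;
    return 1e9 * (double)(clock() - start) / CLOCKS_PER_SEC / ACCESSES;
}

int main(void) {
    char *a = calloc(N, 1);
    if (!a) return 1;
    /* Stride 1 reuses each 64-byte cache line ~64 times in a row;
     * stride 64 needs a fresh line on every single access. */
    printf("stride  1: %6.2f ns/access\n", ns_per_access(a, 1));
    printf("stride 64: %6.2f ns/access\n", ns_per_access(a, 64));
    free(a);
    return 0;
}

If the "slow memory" this program exercises were ever as fast as the CPU itself, the two strides would time out identically and the whole motivation for the cache would vanish, which is exactly the pendulum described above.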

As a result of this impermanence, in this blog we will occasionally look at "obsolete" concepts, that is, ideas that are not optimal with current technology. Changes in technology may yet bring some of them back, so it is important to understand why a concept became obsolete and what changes in the environment might make it relevant again.

To make this point clearer, let us consider a simple example. Early computers had hardwired instruction sets. The instructions were carried out directly by hardware and could not be changed. Then came microprogramming (first introduced on a large scale with the IBM System/360), in which an underlying interpreter carried out the "hardware instructions" in software. Hardwired execution became obsolete: it was not flexible enough. Then RISC computers were invented, and microprogramming (i.e., interpreted execution) became obsolete because direct execution was faster. Now we are seeing the revival of interpretation in the form of Java applets that are sent over the Internet and interpreted upon arrival. Execution speed is not always essential, because network delays are so great that they tend to dominate. Thus the pendulum has already swung several times between direct execution and interpretation, and it may yet swing again in the future.
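To illustrate what "interpreted execution" means here, the following is a minimal C sketch of a bytecode interpreter; the opcode names and the tiny stack machine are invented for illustration. A software loop fetches and decodes each instruction, which is precisely the per-instruction overhead that hardwired and RISC direct execution avoid.

#include <stdio.h>

enum op { PUSH, ADD, MUL, PRINT, HALT };

int main(void) {
    /* Program: push 6, push 7, multiply, print the result, stop. */
    int code[] = { PUSH, 6, PUSH, 7, MUL, PRINT, HALT };
    int stack[64], *sp = stack, *pc = code;

    for (;;) {
        switch (*pc++) {                      /* decode done in software */
        case PUSH:  *sp++ = *pc++;           break;  /* operand follows opcode */
        case ADD:   sp--; sp[-1] += sp[0];   break;
        case MUL:   sp--; sp[-1] *= sp[0];   break;
        case PRINT: printf("%d\n", sp[-1]);  break;
        case HALT:  return 0;
        }
    }
}

Executed directly, the multiply would be a single hardware instruction; interpreted, it costs a fetch, a dispatch branch, and stack traffic on top of the multiply itself. Whether that overhead matters depends on what else dominates, which for applets was the network delay.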



Tags

memory, microprogramming, software