The concept of the gene as a symbolic representation of the organism – a code script – is a fundamental feature of the living world and must form the kernel of biological theory — Sydney Brenner, 2012 (.5)

Computer Models of Evolution

[Photo: Christopher G. Langton]
What's the difference between the process of evolution in a computer and the process of evolution outside the computer? The entities that are being evolved are made of different stuff, but the process is identical.... These abstract computer processes make it possible to pose and answer questions about evolution that are not answerable if all one has to work with is the fossil record and fruit flies. — Christopher G. Langton (1)

Any theory holding that life originates on Earth de novo from nonliving chemicals has problems for which computers provide a good metaphor. The problems fall into two categories: hardware and software. For life as for computers, both are required, together.

After eukaryotic cellular life has become established, in this metaphor, the machinery for creating new biological hardware is in place, and the remaining problem is one of software only. How does the genetic programming for new evolutionary features get written and installed?

This aspect of the problem of evolution is a good one to focus on because computers are everywhere and can be readily observed. We can ask the same question about real computers: how do new computer programs get written and installed? Of course, the answer is that computer programmers write the programs and computer users install them.

But Darwinism holds that during the course of evolution there were no programmers for genetic programs: the process was blind, self-driven. An analogous process in the world of computers would cause new computer programs or subroutines to appear spontaneously in the traffic of computer code being copied and transferred. If a spontaneous new computer program or subroutine somehow became able to replicate itself, it would have taken a significant step toward "life." If, subsequently, it accrued other advantages, like concealment, it would have a "survival" advantage in the world of computer traffic. From there, by analogy with Darwinism, it could grow and multiply and have properties similar to life. Does this ever happen?

Alternatively, it should be possible for scientists to artificially create a computer "environment" in which the evolution of computer programs could occur. Parameters governing the mutation and recombination rates could be optimized for the evolution of new programs. At the lightning speed of modern computers, jillions of trials could be run to see if randomness coupled with any nonteleological iterative process can ever write computer programs with genuinely new functions. Has this been done?
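
What would such an artificial environment look like? Here is a minimal sketch in Python, offered as illustration only: candidate programs are random strings in a made-up four-instruction stack language, the mutation rate is a tunable parameter (recombination is omitted), and fitness asks whether a program computes a simple target function, doubling its input. Every name and number in it is an assumption, not a description of any published experiment.

    import random

    # Toy "environment" for evolving programs: a made-up stack language,
    # a tunable mutation rate, and selection for the function f(x) = 2x.
    OPS = ['dup', 'add', 'one', 'pop']
    POP_SIZE, PROG_LEN, MUT_RATE, GENERATIONS = 200, 6, 0.05, 500

    def run(prog, x):
        """Execute on a tiny stack machine; impossible ops are skipped."""
        stack = [x]
        for op in prog:
            if op == 'dup' and stack:
                stack.append(stack[-1])
            elif op == 'add' and len(stack) >= 2:
                stack.append(stack.pop() + stack.pop())
            elif op == 'one':
                stack.append(1)
            elif op == 'pop' and stack:
                stack.pop()
        return stack[-1] if stack else None

    def fitness(prog):
        """Count the test inputs for which the program returns 2x."""
        return sum(run(prog, x) == 2 * x for x in range(1, 6))

    def mutate(prog):
        return [random.choice(OPS) if random.random() < MUT_RATE else op
                for op in prog]

    population = [[random.choice(OPS) for _ in range(PROG_LEN)]
                  for _ in range(POP_SIZE)]
    for gen in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == 5:         # all five test cases pass
            print(f"generation {gen}: {population[0]}")
            break
        survivors = population[:POP_SIZE // 2]  # the better half survives
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in survivors]

A search this small usually succeeds (the two-instruction program dup, add suffices), but notice that the experimenter has still specified the instruction set, the fitness test, and the selection rule in advance. The open question in the paragraph above is whether any new function ever emerges without that scaffolding.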

The Blind Watchmaker

[Image: a biomorph]
[Photo: Richard Dawkins]
Nothing in my biologist's intuition, nothing in my 20 years' experience of programming computers, and nothing in my wildest dreams prepared me for what emerged on the screen.... With a wild surmise, I began to breed, generation after generation, from whichever child looked most like an insect. My incredulity grew in parallel with the evolving resemblance.... I still cannot conceal from you my feeling of exultation as I first watched these exquisite creatures emerging before my eyes. I distinctly heard the triumphal opening chords of Also sprach Zarathustra (the '2001 theme') in my mind. — Richard Dawkins (2)

Richard Dawkins has written several computer programs that, he says, function like evolution. Modern computers are powerful and can run the programs very quickly, so Dawkins can compress a single generation, or computer trial, into a fraction of a second. In his widely acclaimed 1987 book The Blind Watchmaker, from which the quote above comes, he describes several such programs.

One of Dawkins's programs begins with a random string of letters and evolves it into a target sentence. The sentence is from Shakespeare, "METHINKS IT IS LIKE A WEASEL." The evolution takes only 64, 43, or 41 generations in different computer trials. Dawkins acknowledges that the chance of that short sentence getting produced in a single random trial is "about 1 in 10,000 million million million million million million." But, he continues, with "cumulative selection," the thing becomes doable. Each time a random computer trial happens to produce a correct letter in a slot, that letter is preserved by cumulative selection (p 46-50).
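
Dawkins does not print his code, but the cumulative-selection scheme he describes is easy to reconstruct. A minimal Python sketch, assuming the simplest reading of his description (a letter that happens to match the target is frozen, and only the still-wrong slots keep mutating):

    import random
    import string

    TARGET = "METHINKS IT IS LIKE A WEASEL"
    ALPHABET = string.ascii_uppercase + " "

    def weasel():
        """Cumulative selection: correct letters, once hit, are preserved."""
        current = [random.choice(ALPHABET) for _ in TARGET]
        generation = 0
        while "".join(current) != TARGET:
            generation += 1
            for i, letter in enumerate(TARGET):
                if current[i] != letter:          # wrong slots keep mutating
                    current[i] = random.choice(ALPHABET)
                # matching slots are left alone -- the "preservation" step
        return generation

    print(weasel())   # this locking variant typically finishes in roughly a
                      # hundred generations; Dawkins reports 41 to 64

Note that the test current[i] != letter is exactly where the target sentence enters the program, a point taken up below.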

There is a problem with using Dawkins's scheme as an analogy for evolution. In order for there to be such a thing as a correct letter, the complete sentence has to already exist. In real life, this would require evolution to be teleological, to have a prescribed goal. Teleology in nature is the very thing Darwinists abhor. Random mutations cannot have any prescribed goal. For life to evolve this way, what preexisting model is it emulating?

Alternatively, if there is no model in Dawkins's computer, how is the sentence that is only 61 percent wrong favored over the one that is 86 percent wrong? How is "MDLDMNLS ITJISWHRZREZ MECS P" better than "WDLTMNLT DTJBSWIRZREZLMQCO P"? After presenting this idea, Dawkins then admits, "Life isn't like that" (p 50).
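
The two percentages can be checked in a few lines (a verification of the arithmetic, not code from the book):

    TARGET = "METHINKS IT IS LIKE A WEASEL"

    def percent_wrong(s):
        """Percentage of character positions that differ from the target."""
        return 100 * sum(a != b for a, b in zip(s, TARGET)) / len(TARGET)

    print(round(percent_wrong("MDLDMNLS ITJISWHRZREZ MECS P")))   # 61
    print(round(percent_wrong("WDLTMNLT DTJBSWIRZREZLMQCO P")))   # 86

Seventeen of 28 characters are wrong in the first string and 24 of 28 in the second, but measuring that requires holding the finished sentence up against both.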

Dawkins also uses the computer to generate artificial creatures he calls "Biomorphs." He begins with some stick figures on the computer screen. He allows a few variables to change at random, within prescribed parameters. This changes the shapes of the stick figures. The resulting creatures show some variety, and Dawkins is good at naming them. The evolution they undergo is analogous to biological evolution, he says. So he offers his creatures as further evidence that chance can write new genetic programs. His enthusiasm for the Biomorph program is evident in the quotation at the beginning of this section.

Dawkins acknowledges that he uses artificial selection to guide the process. His creatures are tightly constrained by his Biomorph software and completely dependent on the computer's operating software. Deleterious mutations are not possible: every possible gene sequence makes a viable creature, so the ratio of viable to possible creatures in his genetic scheme is one to one. He achieves the "evolution" by adjusting only a few variables within narrow ranges, so the only changes that occur in the creatures are those whose potential was already built into the program. The evolution he is able to simulate is, at most, microevolution.
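
As described, the Biomorph genetic scheme is small enough to sketch in full. A toy version, assuming nine integer genes confined to a seven-value range (the counts are illustrative; Dawkins's actual parameters differ in detail):

    import random

    GENES, LO, HI = 9, -3, 3   # every genome in this box is a viable creature

    def children(genome, brood_size=8):
        """Each child changes exactly one gene by +/-1, clamped to its range."""
        brood = []
        for _ in range(brood_size):
            child = list(genome)
            i = random.randrange(GENES)
            child[i] = max(LO, min(HI, child[i] + random.choice((-1, 1))))
            brood.append(child)
        return brood

    genome = [0] * GENES
    for generation in range(20):
        genome = random.choice(children(genome))  # stand-in for the breeder's eye

    print(genome)   # still inside the prescribed gene space

The whole search space here is 7^9, about 40 million genomes, every one of them drawable. Nothing the loop does can step outside the ranges the programmer wrote down, which is the point of the paragraph above.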

At the back of The Blind Watchmaker is an order form for a copy of Dawkins's program, "The Blind Watchmaker Evolution Simulation Software," for $10.95. Do you think the installation instructions say, "Don't be careful when you install this program, and don't make a backup copy of your system first, because mistakes are the way things get better"?

No. If chance were able to write improvements to computer programs, we would know about it by now. "Hey, thanks for the freeware spreadsheet program you copied and gave me. By the way, due to some chance errors in the copying, I ended up with a version 3.1 copy, from your version 3.0 original. I'll be able to do mortgage tables. Isn't it great?" "Yeah, I've heard of that before."

No. The best outcome that ever follows the exchange of computer programs is that the programs work as expected. And what often happens is more like, "Hey, thanks for the freeware spreadsheet program you copied and gave me. But after I loaded it, everything crashed. Can you come over and help me?" Maybe another piece of software is required for the spreadsheet to work on the new computer. Or maybe the other necessary software is already loaded, but locked up by some previous programming. Tinkering is often required.

How would Richard Dawkins's artificial creatures work if you allowed chance mutations to affect the "Blind Watchmaker" software, or the computer's operating programs? Would you guess that Dawkins himself carefully makes backup copies of his programs, lest something should, by chance, alter them? If Dawkins had to conduct evolutionary simulations with operating and applications programs that had been slightly randomized, what music would he hear then?

The Santa Fe Institute

There is a theory that deals with the application of computers to evolution. It is sometimes called the theory of complex systems, or complexity theory. Ilya Prigogine is considered one of its founders. The theory is hard to summarize; it is perhaps misleading even to describe complexity theory as a single theory. However it is characterized, complexity theory has had an institutional home since 1984 at the Santa Fe Institute in Santa Fe, New Mexico.

[Photo: Murray Gell-Mann]
Physicist Murray Gell-Mann (right), one of the founders of the Santa Fe Institute and cochair of its Science Board, describes complexity theory by saying that people in this field work from the top down, whereas scientists usually work by "reductionism," from the bottom up. Gell-Mann thinks both approaches are needed. The hallmark of complexity theory is the extensive use of information theory and computers to model notoriously complex systems such as economic markets, protein folding, and evolution. In 1994 Gell-Mann published a book largely about this field, The Quark and the Jaguar (3). One might expect the book to describe some examples, if there are any, of computer programs that can independently evolve to more organized forms. He briefly mentions Richard Dawkins's Biomorphs program without endorsing it: "In real biological evolution there is no designer in the loop" (p 318).

The closest possibility Gell-Mann discusses is a program called Tierra, written by Thomas S. Ray. In this program there is a standard "ancestor," an "organism" consisting of eighty computer instructions, from which other organisms descend. Mutations are introduced into the descendants at a rate about a million times higher than the average mutation rate in eukaryotes. When the computer's memory is full, the older or more defective organisms are "killed off." The outcome of this artificial process about which Gell-Mann has the most to say is the evolution of a more compressed version of the original ancestor, one with only 36 instructions instead of the original eighty (p 315). He mentions no new features evolving in Tierra. Yet the noteworthy outcome of biological evolution is not the reduction of the instruction set but its growth, and the emergence of new features. As real life evolves, new genes with new meaning are added to the genome.
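
The bookkeeping Gell-Mann describes is simple to caricature. A toy soup in the spirit of Tierra (not Ray's code; the sizes and rates are invented): organisms descend from an eighty-instruction ancestor, copies suffer point mutations and occasional deletions, and a "reaper" kills the oldest organisms when memory fills.

    import random

    ANCESTOR = ['instr'] * 80    # stand-in for Ray's 80-instruction ancestor
    MEMORY_LIMIT = 1000          # total instructions the soup may hold
    MUT_RATE = 0.01              # far above real eukaryotic mutation rates

    def replicate(org):
        """Copy with point mutations; half of all mutations delete an instruction."""
        child = []
        for instr in org:
            r = random.random()
            if r < MUT_RATE / 2:
                continue                         # deletion: programs can shrink
            child.append('instr*' if r < MUT_RATE else instr)
        return child

    soup = [list(ANCESTOR)]
    for step in range(2000):
        soup.append(replicate(random.choice(soup)))
        while sum(len(org) for org in soup) > MEMORY_LIMIT:
            soup.pop(0)                          # the reaper: oldest die first

    print(min(len(org) for org in soup))   # shorter variants accumulate

Even this caricature exhibits the asymmetry noted above: nothing in the loop writes a new, meaningful instruction, so the programs can only drift toward shrinkage and shuffling.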

[Photo: Stuart Kauffman]
The current champion of the theory of complex systems, as it pertains to biology, is Stuart Kauffman (left), also of the Santa Fe Institute. "Kauffman has spent decades trying to show—through elaborate computer simulations—that Darwinian theory alone cannot account for the origin or subsequent evolution of life" (4). Kauffman proposes that a process called "autocatalysis" is necessary and sufficient to create life from nonliving chemicals. Autocatalysis is a chemical process that is enhanced by one of its own products. For example, imagine a pair of complementary nucleotides that could, as a unit, enhance the pairing rate of the same nucleotides, unpaired. If unpaired nucleotides were constantly supplied, like "food" for the reaction, autocatalysis would produce many nucleotide pairs. This concept has attracted some attention, but so far its applicability to biology is tenuous.
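
The kinetics just described fit in a few lines. A sketch with assumed rate constants (all numbers invented for illustration): free nucleotides F pair at a base rate plus a term proportional to the existing pairs P, while "food" flows in at a constant rate.

    # dP/dt = (k_base + k_auto * P) * F, integrated with crude Euler steps
    k_base, k_auto, food_in, dt = 0.001, 0.05, 1.0, 0.1

    F, P = 10.0, 0.01        # free nucleotides, plus a trace of initial pairs
    for step in range(500):
        rate = (k_base + k_auto * P) * F   # pairing accelerates as P grows
        made = min(rate * dt, F)           # cannot pair more food than exists
        F += food_in * dt - made
        P += made

    print(P)   # growth accelerates until the food supply becomes limiting

The self-accelerating term k_auto * P is the whole idea of autocatalysis; whether anything like it scales up to the molecular diversity of a cell is the open question the paragraph above calls tenuous.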

In his 1995 book, At Home in the Universe (5), Kauffman also mentions Tierra. To Kauffman, what is interesting about Tierra is that the organisms become extinct: "They disappear from the stage and are heard from no more" (p 238). From this behavior he draws a lesson about the size and frequency of extinctions in real life, not about evolutionary advances. In sum, the evolution in Tierra that he describes is all copying and shuffling; it generates no new programs or subroutines.

Kauffman also discusses another attempt to duplicate evolution in a computer model. The result is negative (p 276-277):

Since the Turing machine and its programs can themselves be specified by a sequence of binary digits, one string of symbols is essentially manipulating another string. Thus the operation of a Turing machine is a bit like the operation of an enzyme on a substrate, snipping a few atoms out, adding a few atoms here and there.

What would happen, McCaskill had wondered, if one made a soup of Turing machines and let them collide; one collision partner would act as the machine, and the other partner in the collision would act as the input tape. The soup of programs would act on itself, rewriting each other's programs until... Until what?

Well, it didn't work. Many Turing machine programs are able to enter an infinite loop and "hang." In such a case, the collision partners become locked in a mutual embrace that never ends, yielding no "product" programs. The attempt to create a silicon self-reproducing spaghetti of programs failed. Oh well.
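
The failure mode in this passage is easy to reproduce. A toy of the colliding-programs idea (not McCaskill's actual system; the three-instruction language is invented): one random program acts as the machine, another as the tape, and because halting cannot be predicted in general, the only defense is a step budget.

    import random

    def collide(machine, tape, max_steps=1000):
        """Run one program over another; return the product, or None on a hang."""
        tape = list(tape)
        pc = 0
        for _ in range(max_steps):
            if not 0 <= pc < len(machine):
                return tape                     # halted: a "product" program
            op = machine[pc]
            if op == 'flip' and tape:
                tape[0] = 1 - tape[0]
            elif op == 'rot' and tape:
                tape.append(tape.pop(0))
            elif op == 'back' and tape and tape[0]:
                pc = -1                         # conditional jump to the start
            pc += 1
        return None                 # the mutual embrace that never ends

    soup = [[random.choice(['flip', 'rot', 'back']) for _ in range(5)]
            for _ in range(100)]
    tapes = [[random.randint(0, 1) for _ in range(4)] for _ in range(100)]
    hangs = sum(collide(m, t) is None for m, t in zip(soup, tapes))
    print(f"{hangs} of 100 collisions hung")

Run it and a fair fraction of the collisions lock up, just as Kauffman reports; the survivors yield only reshuffled tapes, not new machines.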

Chris Langton, External Professor of the Santa Fe Institute, clearly believes that it is reasonable to ask for a computer model that emulates life (6):
"...If a programmer creates a world of "molecules" that—by following rules such as those of chemistry—spontaneously organize themselves into entities that eat, reproduce and evolve, Langton would consider those entities to be alive "even if it's in a computer."
In September 1995, Langton was asked: if chance can write the new genetic code behind evolutionary progress, should it not also be able to write new computer code? Langton answered yes (7). His example was a string of computer code two bits long, used to specify one strategy in a computer game simulating evolution (the Prisoner's Dilemma). Because mutations are allowed, the string of code can occasionally double to four bits, and sometimes this doubling leads to a superior strategy.
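
What might that two-bit strategy look like? One reading, offered as an assumption since Langton's encoding is not recorded in the interview: each bit gives the move to play after one possible last move by the opponent, and a duplication doubles the genome so that it indexes the opponent's last two moves, a strictly richer space of strategies.

    def move(genome, history):
        """history lists the opponent's past moves: 1 = cooperate, 0 = defect."""
        memory = len(genome).bit_length() - 1   # 2 bits -> 1 move, 4 bits -> 2
        index = 0
        for m in history[-memory:]:
            index = index * 2 + m
        return genome[index]

    tit_for_tat = [0, 1]                     # defect after D, cooperate after C
    duplicated = tit_for_tat + tit_for_tat   # 4 bits: same behavior, new room
    duplicated[0] = 1   # one later point mutation: forgive a defection that
                        # followed a defection -- inexpressible in 2 bits

A duplication followed by a point mutation is real variation, but, as the next paragraph argues, it amounts to adding one letter to the text, not writing a new subroutine.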

This example seems weak. If any random process can write computer programs, there should be examples more impressive than the duplication of two bits, the equivalent of inserting one nucleotide into a real genome. One possible excuse for this weakness is that not enough time has gone by for self-generated computer programs to emerge. But evolution is a robust process. If new genetic programs can be created without input in the biological world, wouldn't there be some convincing indication of an analogous process in the computer world by now?

Dramatization of Tierra

[Image from Thomas Ray's Tierra photoessay]

A hyper-parasite (red, three-piece object) steals the CPU from a parasite (blue sphere). Using the stolen CPU, and its own CPU (red sphere), it is able to produce two daughters (wire-frame objects on left and right) simultaneously. — Thomas S. Ray (8)

[Photo: Thomas S. Ray]
The author of the Tierra program, Tom Ray (left), sees much drama in its evolution. However, he is apparently aware that the evolution it has achieved so far falls well short of biology's: "It is hoped that with the help of some impulse towards greater complexity, this dynamic can lead to a large spiraling upwards in complexity." He thinks the needed impulse may be supplied by running Tierra on a worldwide network of computers, a digital analog of a multicelled creature.

Ray's advocacy of the proposed network sounds a bit like a venture capital prospectus, complete with disclaimers: "Eventually the product can be neutered and sold to the end user.... it is a venture into the unknown for which we can not estimate the likelihood of success." Research of this sort deserves encouragement, but we should be realistic about what it has accomplished so far. As for claims that computers will duplicate the progress evident in biological evolution, we should maintain a healthy skepticism and wait for some convincing results before we buy in.


References

.5. Sydney Brenner, "Life's code script," Nature, 22 Feb 2012.
1. Christopher G. Langton, interviewed by John Brockman in The Third Culture, Touchstone, 1995, p 353.
2. Richard Dawkins, The Blind Watchmaker, W. W. Norton and Company, 1987.
3. Murray Gell-Mann, The Quark and the Jaguar: Adventures in the Simple and the Complex, W. H. Freeman and Company, 1994.
4. John Horgan, "From Complexity to Perplexity," Scientific American, June 1995, p 104-109.
5. Stuart Kauffman, At Home in the Universe: The Search for the Laws of Self-Organization and Complexity, Oxford University Press, 1995.
6. John Horgan, "From Complexity to Perplexity," Scientific American, June 1995, p 104-109.
7. Chris Langton, interviewed by Brig Klyce, Santa Fe, NM, 29 September 1995.
8. Thomas S. Ray, Tierra Photoessay and Tierra home page.

Related Reading

Adami, Christoph, Introduction to Artificial Life, Telos (Springer-Verlag), 1998.
Adami, Christoph, et al., "Evolution of Biological Complexity" [abstract], p 4463-4468 v 97 PNAS, 2000.
Bennett, Charles H., "How to Define Complexity in Physics, and Why" p 137-148 Complexity, Entropy and the Physics of Information: The Proceedings of the Workshop on Complexity, Physics and the Physics of Information held May-June 1989. Wojciech H. Zurek, ed. Addison-Wesley Publishing Company, 1990.
Casti, John L., Complexification. HarperCollins Publishers, 1994.
Colasanti, Ricardo and Tash Loder, "The Great Complexity Debate" p 24-25 v 10 no 1 The Bulletin of the Santa Fe Institute, Spring 1995.
Dawkins, Richard, "The eye in a twinkling" p 690-691 v 368 Nature, 21 April 1994.
Dyson, George B., Darwin Among the Machines; or, The Origins of [Artificial] Life, Addison-Wesley, 1997.
Eigen, Manfred, "New Concepts for Dealing with the Evolution of Nucleic Acids" p 307-320 Cold Spring Harbor Symposia on Quantitative Biology, Volume LII: Evolution of Catalytic Function, Cold Spring Harbor Laboratory, 1987.
Eigen, Manfred, "The origin of biological information" p 443-454, Astronomical and Biochemical Origins and the Search for Life in the Universe, Cristiano Batalli Cosmovici, Stuart Bowyer and Dan Werthimer, eds., Editrice Compositori, 1997.
Frauenfelder, Hans and Peter Wolynes, "Biomolecules: Where the Physics of Complexity and Simplicity Meet" p 58-64 Physics Today, February 1994.
Holland, John H., Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, The MIT Press, 1992. (First edition, 1975, The University of Michigan.)
Holland, John H., Hidden Order: How Adaptation Builds Complexity, Addison-Wesley Publishing Company, Inc., 1995.
Holland, John H., Emergence: From Chaos to Order, Addison-Wesley Publishing Company, Inc., 1998.
Horgan, John, The End of Science, Addison-Wesley Publishing Company, Inc. 1996.
Kauffman, Stuart A., Investigations: The Nature of Autonomous Agents and the Worlds They Mutually Create, Santa Fe Institute, September 13, 1996.
Kauffman, Stuart A., The Origins of Order: Self-Organization and Selection in Evolution, Oxford University Press, 1993.
Kauffman, Stuart A. "'What is life?': was Schrödinger right?" p 83-114 What is Life? The Next Fifty Years, Michael P. Murphy and Luke A. O'Neill, eds. Cambridge University Press, 1995.
Koza, John R., "Architecture-Altering Operations for Evolving the Architecture of a Multi-Part Program in Genetic Programming." Report No. STAN-CS-TR-94-1528, Department of Computer Science, Stanford University, 1994.
Koza, John R. and David Andre, "Parallel Genetic Programming on a Network of Transputers," Report No. STAN-CS-TR-95-1542. Department of Computer Science, Stanford University, 1995.
Koza, John R. and James P. Rice, Genetic Programming: The Movie, MIT Press, 1992.
Koza, John R., "Genetic Evolution and Co-Evolution of Computer Programs," p 603-630, Artificial Life II, Christopher G. Langton et al,, eds., Addison-Wesley Publishing Company, 1992.
Küppers, Bernd-Olaf, Information and the Origin of Life. (Originally published in German: R. Piper GmbH and Co., 1986.) English translation: The MIT Press, 1990.
Langton, Christopher G.; Charles Taylor; J. Doyne Farmer and Steen Rasmussen, editors., Artificial Life II: Proceedings of the Workshop on Artificial Life Held February, 1990 in Santa Fe, New Mexico, Addison-Wesley Publishing Company, 1992.
Maddox, John, "Polite row about models in biology" p 555 v 373 Nature, 16 February 1995.
McMullin, Barry, "Code McMullin: The Case of the Independent Test" p 18-25 v 12 n 2 SFI Bulletin, Summer 1997.
McShea, Daniel W., "Mechanisms of Large-Scale Evolutionary Trends," p 1747-1763 n 48(6) Evolution, 1994.
McShea, Daniel W., "Metazoan Complexity and Evolution: Is There a Trend?" p 477-492 n 50(2) Evolution, 1996.
McShea, Daniel W., "Complexity and Homoplasy" in submission, 1996.
Michalewicz, Zbigniew, Genetic Algorithms + Data Structures = Evolution Programs (Second, Extended Edition). Springer-Verlag, 1994.
Prigogine, Ilya, From Being To Becoming. New York: W. H. Freeman and Company, 1980.
Ray, Thomas S., "An approach to the Synthesis of Life" p 371-408, Artificial Life II, Langton et al., eds., 1992.
Zoretich, Frank, "Does Evolution Lead to More Complexity?" (re: Dan McShea) p 3-7 v 11 n 2 SFI Bulletin, Summer 1996.
Zurek, Wojciech H., ed., Complexity, Entropy and the Physics of Information: The Proceedings of the Workshop ...Held May-June, 1989 in Santa Fe, New Mexico, Addison-Wesley Publishing Company, 1990.