Computer Models of Evolution
Any theory holding that life originates on Earth de novo from nonliving chemicals faces problems for which computers provide a good metaphor. The problems fall into two categories, hardware and software. For life as for computers, both are required, and they are required together.
After eukaryotic cellular life has become established, in this metaphor, the machinery for creating new biological hardware is in place, and the remaining problem is one of software only. How does the genetic programming for new evolutionary features get written and installed?
This aspect of the problem of evolution is a good one to focus on because computers are everywhere and can be readily observed. We can ask the same question about real computers: how do new computer programs get written and installed? Of course, the answer is that computer programmers write the programs and computer users install them.
But Darwinism holds that during the course of evolution there were no programmers for genetic programs: the process was blind and self-driven. An analogous process in the world of computers would cause new computer programs or subroutines to appear spontaneously in the traffic of computer code being copied and transferred. If a spontaneous new computer program or subroutine somehow became able to replicate itself, it would take a significant step toward "life." If it subsequently accrued other advantages, such as concealment, it would have a "survival" advantage in the world of computer traffic. From there, by analogy with Darwinism, it could grow and multiply and take on properties similar to life. Does this ever happen?
Alternatively, it should be possible for scientists to artificially create a computer "environment" in which the evolution of computer programs could occur. Parameters governing the mutation and recombination rates could be optimized for the evolution of new programs. At the lightning speed of modern computers, jillions of trials could be run to see if randomness coupled with any nonteleological iterative process can ever write computer programs with genuinely new functions. Has this been done?
The Blind Watchmaker
Richard Dawkins has written several computer programs which function, he says, like evolution. Computers today are powerful and can run the programs very quickly. This speed enables Dawkins to compress a single generation, or computer trial, into a fraction of a second. In his widely acclaimed 1987 book The Blind Watchmaker, from which the above quote comes, he tells about several such programs.
One of Dawkins's programs begins with a random string of letters and evolves it into a target sentence. The sentence is from Shakespeare: "METHINKS IT IS LIKE A WEASEL." The evolution takes only 64, 43, or 41 generations in different computer trials. Dawkins acknowledges that the chance of that short sentence being produced in a single random trial is "about 1 in 10,000 million million million million million million." But, he continues, with "cumulative selection," the thing becomes doable. Each time a random computer trial happens to produce a correct letter in a slot, that letter is preserved by cumulative selection (p 46-50).
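The cumulative-selection procedure, as this paragraph describes it, can be sketched in a few lines of Python. This is the common simplification in which a correct letter, once hit, is locked in; Dawkins's actual program breeds a population of mutant copies each generation and keeps the closest match, but the effect on the search is the same. The 27-character alphabet (26 letters plus a space) is an assumption.

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # 26 letters plus space (assumed)

def cumulative_selection():
    # Begin with a random string the same length as the target.
    current = [random.choice(ALPHABET) for _ in TARGET]
    generations = 0
    while "".join(current) != TARGET:
        generations += 1
        # Re-randomize only the slots that are still wrong; a letter
        # that matches the target is preserved from then on. This is
        # only possible because the target sentence already exists.
        for i, letter in enumerate(current):
            if letter != TARGET[i]:
                current[i] = random.choice(ALPHABET)
    return generations
```

Because each of the 28 slots is frozen as soon as it matches, the program converges in on the order of a hundred generations, whereas a single all-at-once random trial would need about 27^28 (roughly 10^40) attempts, the figure Dawkins quotes.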
There is a problem with using Dawkins's scheme as an analogy for evolution. In order for there to be such a thing as a correct letter, the complete sentence has to already exist. In real life, this would require evolution to be teleological, to have a prescribed goal. Teleology in nature is the very thing Darwinists abhor. Random mutations cannot have any prescribed goal. For life to evolve this way, what preexisting model is it emulating?
Alternatively, if there is no model in Dawkins's computer, how is the sentence that is only 61 percent wrong favored over the one that is 86 percent wrong? How is "MDLDMNLS ITJISWHRZREZ MECS P" better than "WDLTMNLT DTJBSWIRZREZLMQCO P"? After presenting this idea, Dawkins admits, "Life isn't like that" (p 50).
Dawkins also uses the computer to generate artificial creatures he calls "Biomorphs." He begins with some stick figures on the computer screen. He allows a few variables to change at random, within prescribed parameters. This changes the shapes of the stick figures. The resulting creatures show some variety, and Dawkins is good at naming them. The evolution they undergo is analogous to biological evolution, he says. So he offers his creatures as further evidence that chance can write new genetic programs. His enthusiasm for the Biomorph program is evident in the quotation at the beginning of this section.
Dawkins acknowledges that he uses artificial selection to guide the process. His creatures are tightly constrained by his Biomorph software and completely dependent on the computer's operating software. Deleterious mutations are not possible. The ratio of actual to possible creatures in his genetic scheme is one to one: every sequence makes a viable creature. He achieves the "evolution" by adjusting only a few variables within narrow ranges. So the only changes that occur in the creatures are those whose potential is already available in the program originally. The evolution that he is able to simulate is, at most, microevolution.
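The genome structure this paragraph criticizes can be sketched as follows. The gene count and ranges here are illustrative (Dawkins's program uses nine genes controlling a recursive stick-figure drawing routine, with ranges of its own); the point is that mutation only nudges a gene within a prescribed range, so every possible genome is viable and the whole search space is fixed before any "evolution" begins.

```python
import random

GENE_COUNT = 9               # Dawkins's program also uses nine genes
GENE_MIN, GENE_MAX = -9, 9   # prescribed range for each gene (assumed)

def mutate(genome):
    # A mutation nudges one gene by +/-1, clamped to the allowed range.
    # By construction there is no such thing as a deleterious or lethal
    # mutant: every sequence of genes draws *some* creature.
    child = list(genome)
    i = random.randrange(GENE_COUNT)
    child[i] = max(GENE_MIN, min(GENE_MAX, child[i] + random.choice([-1, 1])))
    return child

parent = [0] * GENE_COUNT
child = mutate(parent)

# The space of creatures is enumerable before the program ever runs:
total_genomes = (GENE_MAX - GENE_MIN + 1) ** GENE_COUNT   # 19 ** 9
```

Nothing outside this fixed space of 19^9 genomes can ever appear on the screen, which is why the text calls the result microevolution at most.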
At the back of The Blind Watchmaker is an order form that can be used to obtain a copy of Dawkins's program, "The Blind Watchmaker Evolution Simulation Software," for $10.95. Do you think the installation instructions say, "Don't be careful when you install this program, and don't make a backup copy of your system first, because mistakes are the way things get better"?
No. If chance were able to write improvements to computer programs we would know about it by now. "Hey, thanks for the freeware spreadsheet program you copied and gave me. By the way, due to some chance errors in the copying, I ended up with a version 3.1 copy, from your version 3.0 original. I'll be able to do mortgage tables. Isn't it great?" "Yeah, I've heard of that before."
No. The best outcome that ever follows the exchange of computer programs is that the programs work as expected. And what often happens is more like, "Hey, thanks for the freeware spreadsheet program you copied and gave me. But after I loaded it, everything crashed. Can you come over and help me?" Maybe another piece of software is required for the spreadsheet to work on the new computer. Or maybe the other necessary software is already loaded, but locked up by some previous programming. Tinkering is often required.
How would Richard Dawkins's artificial creatures work if you allowed chance mutations to affect the "Blind Watchmaker" software, or the computer's operating programs? Would you guess that Dawkins himself carefully makes backup copies of his programs, lest something should, by chance, alter them? If Dawkins had to conduct evolutionary simulations with operating and applications programs that had been slightly randomized, what music would he hear then?
The closest possibility Gell-Mann discusses is a program called Tierra, written by Thomas S. Ray. In this program there is a standard "ancestor," an "organism" consisting of eighty computer instructions. From it other organisms descend. Mutations are introduced into these descendants at a rate about a million times higher than the average mutation rate in eukaryotes. When the computer's memory is full, the older or more defective organisms are "killed off." The outcome of this artificial process about which Gell-Mann has the most to say is the evolution of a more compressed version of the original ancestor, one with only 36 instructions instead of the original eighty (p 315). He mentions no evolution of new features in Tierra. Yet the noteworthy outcome of biological evolution is not the reduction of the instruction set but the growth of it, and the emergence of new features. As real life evolves, new genes with new meaning are added to the genome.
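The selection pressure that produced the 36-instruction organism can be illustrated with a toy soup. Everything here is a stand-in: real Tierra organisms are self-replicating machine-code programs, and its mutations flip bits rather than delete or duplicate whole instructions. The sketch only shows why the trend runs toward compression: shorter programs finish copying themselves sooner, so they reproduce more often.

```python
import random

random.seed(1)
ANCESTOR = list(range(80))   # stand-in for the eighty-instruction ancestor
CAPACITY = 60                # soup size at which the "reaper" starts killing

def mutate(org):
    # Delete or duplicate one instruction at random. Neither operation
    # composes a new feature; it only removes or repeats instructions
    # that are already there.
    child = list(org)
    i = random.randrange(len(child))
    if len(child) > 1 and random.random() < 0.5:
        del child[i]
    else:
        child.insert(i, child[i])
    return child

soup = [list(ANCESTOR) for _ in range(10)]
for _ in range(2000):
    # Shorter organisms complete a copy of themselves sooner, so weight
    # the choice of parent by the reciprocal of length.
    parent = random.choices(soup, weights=[1.0 / len(o) for o in soup])[0]
    soup.append(mutate(parent))
    if len(soup) > CAPACITY:
        soup.pop(0)          # the reaper kills the oldest organism

average_length = sum(len(o) for o in soup) / len(soup)
```

Run long enough, the average length drifts below the ancestor's eighty instructions; the soup gets better at being short, not at doing anything new.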
In his 1995 book, At Home in the Universe (5), Kauffman also mentions Tierra. To Kauffman what's interesting about Tierra is the fact that the organisms become extinct. "They disappear from the stage and are heard from no more" (p 238). From this behavior he draws a lesson about the size and frequency of extinctions in real life, not about evolutionary advances. The evolution in Tierra which he describes is all copying and shuffling; the process does not generate new programs or subroutines.
Kauffman also discusses another attempt to duplicate evolution in a computer model. The result is negative (p 276-277):
Since the Turing machine and its programs can themselves be specified by a sequence of binary digits, one string of symbols is essentially manipulating another string. Thus the operation of a Turing machine is a bit like the operation of an enzyme on a substrate, snipping a few atoms out, adding a few atoms here and there.

Chris Langton, External Professor of the Santa Fe Institute, clearly believes that it is reasonable to ask for a computer model that emulates life (6):
What would happen, McCaskill had wondered, if one made a soup of Turing machines and let them collide; one collision partner would act as the machine, and the other partner in the collision would act as the input tape. The soup of programs would act on itself, rewriting each other's programs until... Until what?
Well, it didn't work. Many Turing machine programs are able to enter an infinite loop and "hang." In such a case, the collision partners become locked in a mutual embrace that never ends, yielding no "product" programs. The attempt to create a silicon self-reproducing spaghetti of programs failed. Oh well.
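The hang described above is easy to reproduce in miniature. In the sketch below, each "machine" is just a string-rewriting rule applied repeatedly to its collision partner; this is a drastic simplification of McCaskill's Turing machine soup, and the step budget is an assumption, standing in for the fact that non-halting cannot be detected in general.

```python
def collide(rule, tape, max_steps=100):
    """Apply a (find, replace) rewrite rule to a partner string until
    the rule no longer matches, or the step budget runs out."""
    find, replace = rule
    for _ in range(max_steps):
        if find not in tape:
            return tape      # the collision yields a "product" string
        tape = tape.replace(find, replace, 1)
    return None              # the mutual embrace that never ends

collide(("ab", "b"), "aaab")   # terminates, producing "b"
collide(("a", "aa"), "a")      # grows forever; the budget calls it a hang
```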
"...If a programmer creates a world of "molecules" that—by following rules such as those of chemistry—spontaneously organize themselves into entities that eat, reproduce and evolve, Langton would consider those entities to be alive "even if it's in a computer."In September, 1995, Langton was asked: if chance can write the new genetic code behind evolutionary progress, then it should be able to write new computer code. Can it? Langton answered yes (7). His example was a string of computer code two bits long used to specify one strategy in a computer game simulating evolution (the "Prisoner's Dilemma"). Because mutations are allowed, the string of code can occasionally double to four bits; sometimes this doubling can lead to a superior strategy.
This example seems weak. If any random process can write computer programs, there should be examples more impressive than the duplication of two bits. This is equivalent to the insertion of one nucleotide in a real genome. One possible excuse for this weakness is that not enough time has gone by for self-generated computer programs to emerge. But evolution is a robust process. If new genetic programs can be created without input in the biological world, wouldn't there be some convincing indication of an analogous process in the computer world by now?
Ray's advocacy of the proposed network sounds a bit like a venture capital prospectus, complete with disclaimers. "Eventually the product can be neutered and sold to the end user.... it is a venture into the unknown for which we can not estimate the likelihood of success." While research of this sort should be encouraged, we should be realistic about what it has accomplished so far. As for claims that computers will duplicate the progress evident in biological evolution, we should maintain a healthy skepticism and wait for some convincing results before we buy in.