Wednesday, June 17, 2009

The Explanatory Power of Natural Selection

Below is a paper I wrote about a year ago on the explanatory power of natural selection to produce the information-rich micro-structures required for biological life. I'm not an expert, and I am sure that those who are would find much to take issue with in what follows. But, nevertheless, here it is.



THE EXPLANATORY POWER OF NATURAL SELECTION:
A RESPONSE TO RICHARD DAWKINS



A Paper
Submitted to Dr. Jeremy Evans
of the
New Orleans Baptist Theological Seminary





In Partial Fulfillment
of the Requirements for the Course
Introduction to Christian Apologetics: PHIL5301
in the Division of Theological and Historical Studies


Benjamin K. Kimmell
B.S., Florida State University, 2001
June 5, 2008

Introduction

“There has probably never been a more devastating rout of popular belief by clever reasoning than Charles Darwin’s destruction of the argument from design.”[1] So says the famous Oxford evolutionary biologist Richard Dawkins in his best-selling book The God Delusion. In chapter 3, Dawkins declares that Darwin’s concept of natural selection represents a devastating defeat of the teleological argument for the existence of God. “Thanks to Darwin, it is no longer true to say that nothing that we know looks designed unless it is designed.”[2]
In this paper, I will examine Dawkins’ rationale for rejecting the design hypothesis. Specifically, I will argue that natural selection does not possess the explanatory resources needed to explain the appearance of design in the natural world. This examination will include an assessment of the problem of biological information. I will devote particular attention to the theoretical inadequacy of natural process (law, chance, or the combination of law and chance) to account for the existence of complex, specified information. I will include an analysis of one specific claim from experimental data that allegedly confirms the spontaneous emergence of biological information based on Qb virus replication. A further section will be included that raises some basic conceptual questions regarding the potential for natural selection to originate information. Finally, I will explore some of the claims regarding the relevance of computer simulations (Artificial Life) to the question of biological information and function.
It is my purpose to expose the basis for Dawkins’ affirmation that “the mature Darwin blew it [the design argument] out of the water.”[3] The origin of biological information has not been settled in favor of naturalism. I intend to show that the design argument, which stands or falls on the basis of information, remains viable.

Defining Natural Selection

Natural selection is the primary evolutionary mover. It is allegedly responsible for the ability of nature to overcome the fantastically prohibitive odds against the emergence of complex biological organisms. Natural selection is the preservation of traits that aid in reproduction or traits that perform some function that is advantageous in the presence of selection pressure. It is a cumulative, step-wise process that leverages prior selected variation.[4] Natural selection was originally conceived as a process whereby pre-existing biological function diversifies. “Natural Selection,” Darwin wrote, “acts exclusively by the preservation and accumulation of variations, which are beneficial under the organic and inorganic conditions to which each creature is exposed at all periods of life.”[5] Darwin believed the mechanism of natural selection to be a sufficient cause for the fullness of biological diversity that we observe. Small changes within a population, when preserved, will accumulate over time until distinct species and even higher taxonomic categories emerge.[6]
The notion that natural selection is capable of producing the extent of actual biological diversity on earth has recently been challenged.
Information theorists have questioned the ability of natural processes to create the quantity and quality of information required for biological function. William Dembski quoted Nobel Laureate David Baltimore as stating that “Modern biology is a science of information.”[7] Other notable evolutionary scientists have concurred. According to Dembski, a vast informational gulf separates the organic from the inorganic world.[8] On Dembski’s view, it is a myth of modern evolutionary biology to suppose that complex specified information (CSI) can be generated without intelligence.[9]

Generating CSI

Some evolutionists have suggested that natural laws and algorithms are sufficient causes to bring about the effect of CSI. In fact, they assume that CSI is generated by algorithms operating in conjunction with natural law without considering whether these are, even in principle, capable of explaining the existence of CSI at all. Naturalistic accounts of information typically focus on its flow, not its origin.[10] Dembski explains that algorithms and laws equate to the mathematical concept of a function. A function is a mapping that assigns to each element of one set of data (the domain) exactly one element of another (the range). The important point to note is that mathematical functions are purely deterministic; there is no statistical variance whatever. The initial and boundary conditions of natural laws constitute the domain, and the range is the physical state at time t. Mathematically, we might posit some item of CSI j, and some natural law or algorithm f that constitutes the origin of j. To obtain j we must posit some domain element i, so that f(i) = j. But the origin of the information has now merely been transferred to another variable: in order to explain the origin of j, i must now be explained.[11]

The problem of information has not been solved. The expression above illustrates simply that natural processes and algorithms are well suited to manipulate information, but they cannot create it. In fact, information seems to be subject to degrading influences, much as ordered energy is subject to entropy. Unless explicitly preserved by external forces, information will degrade along with the medium used to convey it. In the end, a function creates no more information than was initially present in its data variables and in the function itself. In order to account for the information inherent in the function itself, Dembski deploys the universal composition function m, so that m(i, f) = f(i) = j. Utilizing natural law or algorithms to explain the existence of CSI is much like filling one hole by digging another.[12]
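
Dembski's formal point can be made concrete in a few lines. The particular "law" f below, uppercasing a string, is purely an illustrative assumption of mine; it is not an example Dembski uses:

```python
def f(i):
    # a toy deterministic "law": the same domain element always
    # yields the same output, with no statistical variance
    return i.upper()

def m(i, f):
    # the universal composition function: m(i, f) = f(i), making
    # explicit that the output is fixed jointly by the input AND the law
    return f(i)

i = "methinks it is like a weasel"
j = f(i)

# everything in j traces back to i and f; re-running the law
# reproduces j exactly and contributes nothing new
assert m(i, f) == j == "METHINKS IT IS LIKE A WEASEL"
```

Whatever structure appears in j was already fixed by the choice of i and f, which is the sense in which the function merely transmits what it was given.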

The primary alternative to natural law for generating CSI in nature would be chance. The problem with this option is that it can only generate non-complex specificity or complex unspecified information. Randomly typing letters on a keyboard would result in a long string of complex information. The string itself would be highly improbable. But there is no specificity, no independently recognizable pattern that orders the data into meaningful information. Alternatively, the typist may type out the correct character string for a simple word, perhaps “t-h-e.” The typist may get lucky in typing an intelligible word here or there, but the specified sequence will maintain a relatively low level of complexity. Therefore, chance cannot be said to account for the complex, specified information that is obviously intrinsic to putatively designed artifacts.[13]
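
The typist's predicament can be quantified with a short sketch. The 27-key alphabet here is an assumption for illustration, not a figure from Dembski:

```python
import random

ALPHABET = "abcdefghijklmnopqrstuvwxyz "  # assumed 27-key layout

def random_keystrokes(n, rng):
    """A 'random typist': n independent, uniformly chosen keystrokes."""
    return "".join(rng.choice(ALPHABET) for _ in range(n))

rng = random.Random(0)
# a long random string is complex (highly improbable) yet unspecified
long_string = random_keystrokes(100, rng)

# a specified hit like "the" is easy to quantify, and it is simple:
p_the = (1 / len(ALPHABET)) ** 3
print(f"P('the' in 3 keystrokes) = {p_the:.2e}")  # 5.08e-05
```

A three-letter specified word is likely enough to turn up by luck, but longer specified sequences become exponentially less probable, which is the complexity/specificity trade-off the paragraph describes.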

A universal probability bound (UPB) is the statistical point beyond which chance cannot be said to be a causal explanation for an effect. Liberal estimates place the UPB at 1 chance in 10^50. Dembski offers a more conservative estimate of the UPB based on the number of elementary particles in the universe and the duration between the Planck time and the heat death of the universe. Dembski’s UPB comes to 1 chance in 10^150. Any event with a probability below the UPB cannot be attributed to chance. Many biological structures display a degree of complexity whose improbability far exceeds even Dembski’s conservative UPB; thus they cannot be said to be products of chance.[14]
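
As a check on the arithmetic, Dembski's bound can be reproduced from three round figures. The specific exponents below are those commonly cited for his derivation and are taken here as assumptions rather than quoted from this paper's sources:

```python
# the three quantities behind Dembski's conservative bound
particles = 10 ** 80             # elementary particles in the observable universe
planck_times_per_sec = 10 ** 45  # Planck-time intervals per second (rounded up)
seconds = 10 ** 25               # generous upper bound on the universe's lifetime

# maximum number of elementary events the universe could ever host
max_events = particles * planck_times_per_sec * seconds
print(max_events == 10 ** 150)  # True: hence the 1-in-10^150 bound
```

Any specified event less probable than 1 in 10^150 would not be expected to occur even once in the entire history of the universe, on these figures.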

We have seen that neither law nor chance by themselves can explain CSI. It seems safe to say that their combination cannot explain CSI either. Laws can only transmit or degrade information, and the data that chance contributes to law would certainly not be complex specified information. Thus law and chance combined cannot create CSI.[15] The Law of Conservation of Information (LCI) simply entails that if any item of CSI x is explained in terms of antecedent naturalistic circumstances (events or states of affairs) y, then the CSI of x was already present in y.[16]

There are a number of important corollaries to the LCI. The most important is that any closed, finite system of natural causes that exhibits CSI must have received its CSI prior to becoming closed. This corollary directly contradicts the philosophy of science proffered by materialists like Richard Dawkins and Daniel Dennett. Their view of scientific explanation is purely reductive: complex beings arise from less complex beings. But if the LCI is true, then reductive, naturalistic explanations for the existence of CSI cannot be true, even in principle.[17]

In his newly revised advanced evolutionary biology textbook, Graham Bell agrees that the origin of biological complexity is difficult to explain (presumably the explanation is difficult on any naturalistic model). “Nevertheless,” Bell states, “it has been found that self-replicating RNA molecules will appear in the culture tubes, even if the cultures are not inoculated with Qb RNA. It seems that they evolve from very short RNA sequences that form spontaneously in solutions of single nucleotides...”[18] Bell is referring to the Qb virus that infects bacteria and uses its genetic materials to encode various proteins, one of which is the Qb replicase, an enzyme that catalyses the replication of Qb RNA.[19] According to Bell, this phenomenon is an instance of molecular evolution de novo. New biological information arises from the concentration of replicase.

Bell’s conclusion, however, is controversial. William Dembski states that the case described above is not an instance of true replication, since the virus needs to leverage the cell’s “genetic machinery.” In order for the RNA molecules to replicate, the replicase enzyme must be present. So even if the above scenario did describe an authentic instance of replication, we would still need to explain the origin of the replicase.[20]

Prominent Intelligent Design advocate and biochemist Michael Behe calls Bell’s discussion of the Qb RNA replication “grossly misleading.” He does not deny that RNA strands will emerge in the scenario that Bell describes, but their emergence is due to the prior presence of the replicase enzyme as well as the presence of high concentrations of nucleotides. If the two are combined, then after an unspecified temporal delay, RNA molecules will form and replicate (this is not surprising, since the replication enzyme is present). But the RNA molecules will consist of randomly strung-together nucleotides that contain no specified information. The unspecified string of nucleotides will then replicate. “But,” Behe insists, “the string codes for nothing at all. It is not the generation of information; it’s just stringing together nonsense letters.”[21] The unspecified string of nucleotides Behe describes is the informational equivalent of allowing a monkey to execute a series of keystrokes on a computer. The result is meaningless, and Bell’s scenario utterly fails to explain the origin of complex, specified biological information. Behe concludes that “Darwinists must be pretty desperate to keep plying this old chestnut.”[22]

Other Considerations

According to Richard Dawkins, natural selection is the answer to biological diversity. It is the creative and unifying force in nature. In some significant ways, however, Dawkins’ affirmation is rather uninteresting. No rational person denies that natural selection occurs. The disagreement arises when the process of natural selection is attributed a creative efficacy that it simply does not possess. Phillip Johnson has argued that rather than being a diversifying force in nature, natural selection is instead a conserving force. Those organisms that are at the genetic or morphological extremes for their species are generally less fit to survive. Thus, variations from the norm are selected out of the population, preserving the core population according to its kind.[23]
The basic tenet of evolution is that organisms generally reproduce at exponential rates. A limited environment causes organisms to compete for scarce resources. Genetic mutations (as they are now known under the neo-Darwinian synthesis) may occasionally confer reproductive advantage. Those organisms that reproduce the most are statistically better positioned for survival. Traits that confer reproductive advantages may spread throughout the population and ultimately become the basis for further variation and selection.[24]

Natural selection predicts that those organisms that reproduce the most are the most fit for survival. Yet those most fit for survival are defined as those organisms that reproduce the most. Success or advantage in evolutionary terms “has no inherent meaning other than actual success in reproduction.”[25] The tautological nature of natural selection causes one to wonder about the explanatory power of a theory that adds nothing to direct observation. If natural selection is so patently obvious, one may wonder why an allegedly sophisticated scientific theory is needed to explain it.[26] Johnson wryly observes, “When I want to know how a fish can become a man, I am not enlightened by being told that the organisms that leave the most offspring are the ones that leave the most offspring.”[27]

Artificial Life and Natural Selection

Despite claims by intelligent design advocates that novel biological information cannot be generated by unintelligent processes, many advocates of evolution argue that computer simulations of artificial life conclusively show that self-replicating processes evolve new information. Graham Bell cites University of Delaware researcher Thomas Ray’s work on the “Tierran” experiment.[28] Bell affirms that Ray’s experiments in artificial life show that self-replicating “processes” with only minimal pre-specified variation are capable of exhibiting parasitic features.[29] The “system is something more than an eternally recurring set of computer instructions...”[30] It copies itself, and numerous iterations show impressive results in terms of modeling some important features of evolution.

One major issue with Ray’s Tierran experiment is that, since the program copies itself, there should be some accounting for the information in the first copy. The program is rather complex, 80 bytes in length, and the project completely ignores the problem of getting those 80 bytes together into a useful configuration in the first place. Robert Newman, New Testament scholar and theoretical astrophysicist, explains:[31]

“Each byte in Tierran has 5 bits or 32 combinations, so there are 32^80 combinations for an 80-byte program, which is 2 x 10^120. Following Ludwig's scheme of using all the earth's 100 million PCs to generate 1000 80-byte combinations per second, we would need 7 x 10^100 years for the job. If all 10^90 elementary particles were turned into computers to generate combinations, it would still take 7 x 10^10 years, several times the age of the universe. Not a likely scenario, but one might hope a shorter program that could permanently start reproduction might kick in much earlier.”[32]
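
Newman's combinatorics can be rechecked in a few lines. The trials-per-second scenario is his; the seconds-per-year constant is an approximation of mine, and the recomputed total may differ from his figure by an order of magnitude depending on rounding:

```python
import math

# each Tierran instruction is one of 32 opcodes (5 bits); a program is
# 80 instructions ("bytes" in Newman's description)
search_space = 32 ** 80
print(f"search space ~ 10^{math.log10(search_space):.1f}")  # ~ 10^120.4

# Newman's scenario: 100 million PCs, each trying 1000 programs per second
trials_per_second = 10 ** 8 * 1000
seconds_per_year = 3.156e7  # ~365.25 days
years = search_space / (trials_per_second * seconds_per_year)
print(f"years to exhaust the space ~ 10^{math.log10(years):.0f}")
```

Either way the search runs to roughly 10^100 years or more, which is the order-of-magnitude point Newman is making.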

Other significant issues include the rate of mutation programmed into Tierran. The established rate is something like 1 mutation per 5,000 copies. Dawkins recognizes that mutations, or variance in DNA replication, occur at a much lower frequency, something like 1 variant per 1 billion copies.[33] It is an open question whether copying variance at such a low rate could account for the biological diversity we see in the natural world, given current estimates of the age of the earth. It seems likely, however, that the entire universe has not existed long enough to allow for the observed diversity in biological information based on mutation and Dawkins’ cumulative selection approach. Moreover, in Tierran, the discarded programs make their “innards” (usable information content) available for assimilation into newly produced copies. This is utterly unlike the way the biological world works.

Richard Dawkins created a program that he believes relevantly simulates natural selection. He readily acknowledges that single-step selection is statistically incapable of producing the features that we see in the natural world. He believes, however, that natural selection is a cumulative mechanism that reduces otherwise prohibitive odds.[34] In his computer program, the computer is instructed to generate random sequences of 28 characters. The program then selects those sequences that most closely conform to a target sequence (METHINKS IT IS LIKE A WEASEL), discarding the others. In short order Dawkins’ program arrives at the target sequence.
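
A minimal sketch of a Weasel-style program follows; the mutation rate and population size are illustrative assumptions, not Dawkins' exact parameters. Note that the target string is hard-coded into the fitness function, which is precisely the feature criticized below:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # 26 letters plus space

def weasel(offspring=100, rate=0.05, seed=0, max_gens=10_000):
    """Cumulative selection toward a fixed target string."""
    rng = random.Random(seed)

    def mutate(s):
        # copy the parent, occasionally substituting a random character
        return "".join(rng.choice(ALPHABET) if rng.random() < rate else c
                       for c in s)

    def score(s):
        # fitness = number of positions agreeing with the target; this
        # is where the goal is smuggled in
        return sum(a == b for a, b in zip(s, TARGET))

    # a purely random starting string (single-step selection would have
    # to produce the whole target this way in one shot)
    parent = "".join(rng.choice(ALPHABET) for _ in TARGET)
    for generation in range(1, max_gens + 1):
        # cumulative step: keep only the best of many mutated copies
        parent = max((mutate(parent) for _ in range(offspring)), key=score)
        if parent == TARGET:
            return generation
    raise RuntimeError("did not converge")
```

With these settings the program typically reaches the target in well under a thousand generations, illustrating how dramatically cumulative selection outperforms single-step search; but only because the target is specified in advance.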

The obvious flaw in Dawkinsian Weasel programs is that actual molecules in situ do not have a target in mind.[35] Hence, there can be no statistical selection toward any goal, since no goal exists. Dawkins acknowledges this point but basically dismisses it, assaulting what he considers to be a vain human construct “that cherishes the absurd notion that our species is the final goal of evolution.”[36] He quickly turns to a different program that selects based on imagery. As Meyer rightly points out, natural selection can only determine function “if considerations of proximity to possible future function are allowed, but this requires foresight that molecules do not have.”[37] Attributing such foresight or rationale to inanimate molecules is grossly anthropomorphic.

Conclusion

Prior to Darwin, selection was regarded as a mental process confined to intelligent agents who had the ability to reflect and deliberate on various possibilities. Darwin argued that unintelligent processes possess the capacity to discriminate in a meaningful way among alternatives. Dembski writes that “Darwin perpetrated the greatest intellectual swindle in the history of ideas.”[38] Information is holistic in nature; it is not the “mereological sum of its constituent parts.”[39] Natural processes cannot account for the requisite information in biological organisms and the appearance of design that such information entails. We have seen that neither natural law, nor chance, nor chance combined with law is capable of creating CSI. Qb virus replication does not represent an instance of de novo biological information emerging spontaneously from a highly concentrated organic environment. Major conceptual questions have been raised regarding the explanatory value of natural selection, given that it has often been formulated as a tautology. And major research on Artificial Life has been shown to lack relevance to natural processes in the actual world.

As stated in the introduction, the origin of biological information has not been settled in favor of naturalism. Naturalism cannot account for it; therefore Dawkins’ confident assertion that Darwin’s theory blew Paley’s design argument out of the water is perfectly absurd.
Philosopher Dan Dennett has credited Darwin’s theory of evolution as the best idea anyone has ever had. But it is much like a magician performing tricks at a distance. Modern biochemistry, molecular biology, and information theory have handed the scientific community a pair of binoculars. Science can now see past Darwin’s neat trick. “It’s time,” Dembski declares, “to lay aside the tricks – the smoke screens and the handwaving, the just-so stories and the stonewalling, the bluster and the bluffing – and to explain scientifically what people have known all along, namely, why you can’t get design without a designer. That’s where intelligent design comes in.”[40]


WORKS CITED

Books
Behe, Michael. Personal email correspondence dated June 4, 2008.

Bell, Graham. Selection: The Mechanism of Evolution. Oxford: Oxford University Press, 2008.

Berlinski, David. The Devil’s Delusion: Atheism and Its Scientific Pretensions. New York:
Crown Forum, 2008.

Darwin, Charles, The Origin of the Species by Means of Natural Selection or, The Preservation
of Favored Races in the Struggle for Life. vol. 1 of 2. Akron, OH: The Werner Company Book Manufacturers, 1904.

Dawkins, Richard. The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe
Without Design. Latest paperback edition. New York & London: W.W. Norton & Company, 1987.

________. The God Delusion. New York: Mariner Books, 2008.

Dembski, William. Intelligent Design: The Bridge Between Science and Theology. Downers
Grove: InterVarsity Press, 1999.

________. The Design Revolution: Answering the Toughest Questions about Intelligent
Design. Downers Grove: InterVarsity Press, 2004.

________. Personal email correspondence dated June 4, 2008.

Johnson, Phillip. Darwin on Trial. 2nd ed. Downers Grove: InterVarsity Press, 1993.

Meyer, Stephen C. “The Explanatory Power of Design: DNA and the Origin of Information,”
Mere Creation: Science, Faith, and Intelligent Design. William Dembski ed. Downers Grove: InterVarsity Press, 1998.
Electronic Sources

“Dr. Robert Newman,” Access Research Network, Web page: available from
http://www.arn.org/authors/newman.html; Internet; accessed June 6, 2008.

Newman Robert. “Artificial Life and Cellular Automata,” Access Research Network, Robert C.
Newman Files, March 15, 2000. Web page: available from http://www.arn.org/docs/newman/rn_artificiallife.htm; Internet; accessed June 5, 2008.



[1]Richard Dawkins, The God Delusion. (New York: Mariner Books, 2008), 103.

[2]Ibid.

[3]Ibid.
[4]Graham Bell, Selection: The Mechanism of Evolution. (Oxford: Oxford University Press, 2008), 12-15.

[5]Charles Darwin, The Origin of the Species by Means of Natural Selection or, The Preservation of Favored Races in the Struggle for Life. vol. 1 of 2. (Akron, OH: The Werner Company Book Manufacturers, 1904).

[6]Darwin, 161.

[7]William Dembski, The Design Revolution: Answering the Toughest Questions about Intelligent Design. (Downers Grove: InterVarsity Press, 2004), 139.

[8]Ibid.

[9]William Dembski, Intelligent Design: The Bridge Between Science and Theology. (Downers Grove: InterVarsity Press, 1999), 153. A detailed definition of CSI is beyond the scope of this paper. Put simply, CSI is the defining feature of intelligent agency. CSI exhibits complexity coupled with specificity to trigger a “design inference.” For more details on what constitutes CSI and criteria for its detection see section 5.3 of Intelligent Design “The Complexity-Specification Criterion.”

[10]Dembski, Intelligent Design, 292. See his footnote 19.
[11]Ibid., 160-161.

[12]Dembski, Intelligent Design, 162.
[13]Ibid. 165-166.

[14]Ibid., 166.

[15]Dembski provides a theoretical justification for this intuition. I will not belabor the point here. For more information see Dembski, Intelligent Design, 168-169. Trial and error is a widely affirmed approach to scientific experimentation which is based on the combination of law and chance. Trial and error are the conceptual bases for computer programs that allegedly simulate evolution by natural selection. The trials are executed and less desirable outcomes are discarded. The problem with these programs is that they often contain a goal or standard by which the selections are made. This is blatant telos, the very thing that evolution by natural selection attempts to avoid. I will address the issue of computer programs in more detail below.

[16]Dembski, The Design Revolution, 162.

[17]Ibid., 162-163.

[18]Bell, 10.

[19]Ibid., 1.
[20]William Dembski. The source is a personal email correspondence dated June 4, 2008.

[21]Michael Behe. The source is a personal email correspondence dated June 4, 2008. The quote and the remainder of Behe’s position on the Qb replicase issue are taken from this email.

[22]Ibid.
[23]Phillip Johnson, Darwin on Trial. 2nd ed. (Downers Grove: InterVarsity Press, 1993), 16.

[24]Ibid., 17.
[25]Ibid., 21.

[26]David Berlinski made a similar point in The Devil’s Delusion: Atheism and Its Scientific Pretensions. (New York: Crown Forum, 2008). I was unable to verify the actual reference page.

[27]Johnson, 22.

[28]Bell, 17-19.
[29]Among experts, there is a debate concerning the proper way to correctly define “life.” Since computer programs like Tierra can replicate themselves, this seems to satisfy at least one important feature of the traditional view of what constitutes life. Hence, we may now be reluctant to call self-replicating programs like Tierra merely “processes.” Should we call them “beings?” Of course these programs exhibit no self-awareness, but few people regard self-awareness as an essential property of life. Further discussion of this matter is beyond the scope of this paper.

[30]Bell, 17.

[31]“Dr. Robert Newman,” Access Research Network, Web page: available from http://www.arn.org/authors/newman.html; Internet; accessed June 6, 2008.

[32]Robert Newman, “Artificial Life and Cellular Automata,” Access Research Network, Robert C. Newman Files, March 15, 2000. Web page: available from http://www.arn.org/docs/newman/rn_artificiallife.htm; Internet; accessed June 5, 2008.

[33]Dawkins, The Blind Watchmaker, 124.

[34]Richard Dawkins, The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe Without Design. Latest paperback edition (New York & London: W.W. Norton & Company, 1987), 49.
[35]Stephen C. Meyer. “The Explanatory Power of Design: DNA and the Origin of Information,” Mere Creation: Science, Faith, and Intelligent Design. William Dembski ed. (Downers Grove: InterVarsity Press, 1998), 128.

[36]Dawkins, The Blind Watchmaker, 50.

[37]Meyer, 128.

[38]Dembski, The Design Revolution, 263.

[39]Dembski, Intelligent Design, 173.

[40]Dembski, The Design Revolution, 263.

2 comments:

David Jedziniak said...

Ben, this is a great explanation of the shortcomings of Darwin's theory of evolution. It does seem a bit harsh in certain ways, though. Darwin's theory is just that, a theory. Darwin knew this and withheld his work from the public domain for over 20 years of his life. When he finally released the information he was stricken by multiple diseases that ultimately killed him before the scientific and theological community came to terms with his hypotheses.

While I do not subscribe to the camp that holds up the theory of evolution as law, I do believe that Darwin made significant contributions to both science and theology. For instance, the law of natural selection is part of this body of work, and is acknowledged as plausible by both sides. Also, by defining the requirements for evolutionary theory, he laid the groundwork for its disconfirmation as an explanation for origin. I think many people are quick to condemn the man for the actions of his successors, rather than acknowledging his contribution. It could be argued that if he had done a less thorough job in defining this theory, it would be much more difficult to debunk it.

Ben said...

Thanks for the feedback, Dave. I would point out, however, that my contentions are with his followers, as you say; specifically Dawkins and his ilk. Of course, Darwinism is a "theory," but many power brokers in the academy and society want to marginalize those who do not subscribe to it.

My purpose was to examine just what natural selection is capable of doing. I conclude: not much, with respect to generating the information-rich biomolecular structures required for biological life.