Wednesday, June 17, 2009

The Explanatory Power of Natural Selection

Below is a paper I wrote about a year ago on the explanatory power of natural selection to produce the information-rich micro-structures required for biological life. I'm not an expert, and I am sure that those who are would find much to take issue with in what follows. But, nevertheless, here it is.



THE EXPLANATORY POWER OF NATURAL SELECTION:
A RESPONSE TO RICHARD DAWKINS



A Paper
Submitted to Dr. Jeremy Evans
of the
New Orleans Baptist Theological Seminary





In Partial Fulfillment
of the Requirements for the Course
Introduction to Christian Apologetics: PHIL5301
in the Division of Theological and Historical Studies


Benjamin K. Kimmell
B.S., Florida State University, 2001
June 5, 2008

Introduction

“There has probably never been a more devastating rout of popular belief by clever reasoning than Charles Darwin’s destruction of the argument from design.”[1] So says the famous Oxford evolutionary biologist Richard Dawkins in his best-selling book The God Delusion. In chapter 3, Dawkins declares that Darwin’s concept of natural selection represents a devastating defeat of the teleological argument for the existence of God. “Thanks to Darwin, it is no longer true to say that nothing that we know looks designed unless it is designed.”[2]
In this paper, I will examine Dawkins’ rationale for rejecting the design hypothesis. Specifically, I will argue that natural selection does not possess the explanatory resources needed to explain the appearance of design in the natural world. This examination will include an assessment of the problem of biological information. I will devote particular attention to the theoretical inadequacy of natural processes (law, chance, or the combination of the two) to account for the existence of complex, specified information. I will include an analysis of one specific claim from experimental data, based on Qb virus replication, that allegedly confirms the spontaneous emergence of biological information. A further section raises some basic conceptual questions regarding the potential for natural selection to originate information. Finally, I will explore some of the claims regarding the relevance of computer simulations (Artificial Life) to the question of biological information and function.
It is my purpose to expose the basis for Dawkins’ affirmation that “the mature Darwin blew it [the design argument] out of the water.”[3] The origin of biological information has not been settled in favor of naturalism. I intend to show that the design argument, which stands or falls on the basis of information, remains viable.

Defining Natural Selection

Natural selection is the primary evolutionary mover. It is allegedly responsible for the ability of nature to overcome the fantastically prohibitive odds against the emergence of complex biological organisms. Natural selection is the preservation of traits that aid in reproduction or traits that perform some function that is advantageous in the presence of selection pressure. It is a cumulative, step-wise process that leverages prior selected variation.[4] Natural selection was originally conceived as a process whereby pre-existing biological function diversifies. “Natural Selection,” Darwin wrote, “acts exclusively by the preservation and accumulation of variations, which are beneficial under the organic and inorganic conditions to which each creature is exposed at all periods of life.”[5] Darwin believed the mechanism of natural selection to be a sufficient cause for the fullness of biological diversity that we observe. Small changes within a population, when preserved, will accumulate over time until distinct species and even higher taxonomic categories emerge.[6]
The notion that natural selection is capable of producing the extent of actual biological diversity on earth has recently been challenged.
Information theorists have questioned the ability of natural processes to create the quantity and quality of information required for biological function. William Dembski quoted Nobel Laureate David Baltimore as stating that “Modern biology is a science of information.”[7] Other notable evolutionary scientists have concurred. According to Dembski, a vast informational gulf separates the organic from the inorganic world.[8] It is a myth of modern evolutionary biology to suppose that complex specified information (CSI) can be generated without intelligence.[9]

Generating CSI

Some evolutionists have suggested that natural laws and algorithms are sufficient causes to bring about the effect of CSI. In fact, they assume that CSI is generated using algorithms in conjunction with natural law without considering whether or not these are, in principle, capable of explaining the existence of CSI at all. Naturalistic accounts of information typically focus on its flow, not its origin.[10] Dembski explains that algorithms and laws equate to the mathematical concept of a function. A function is a mapping that assigns to each element of one set of data (the domain) exactly one element of another (the range). The important point to note is that mathematical functions are purely deterministic: there is no statistical variance whatsoever. The initial and boundary conditions of natural laws constitute the domain; the range is the physical state at time t. Mathematically, we might posit some item of CSI j, and some natural law or algorithm f that constitutes the origin of j. To obtain j we must posit some domain element i, so that f(i) = j. But the origin of the information has now merely been relocated to another variable: in order to explain the origin of j, i must now be explained.[11]
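Dembski's claim that deterministic functions transmit rather than create information parallels a standard result in information theory: for any deterministic function f, the Shannon entropy of f(X) never exceeds that of X. The sketch below is my own illustration of that result, not Dembski's; the toy data and function names are hypothetical.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of the empirical distribution of `values`."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A toy random source: the domain elements i.
inputs = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3]

# Two deterministic "laws" f: a bijection and a many-to-one collapse.
bijection = lambda i: (i + 1) % 4   # relabels states, preserves entropy
collapse = lambda i: i % 2          # merges states, destroys entropy

h_in = entropy(inputs)
h_bij = entropy([bijection(i) for i in inputs])
h_col = entropy([collapse(i) for i in inputs])

print(h_in, h_bij, h_col)  # 2.0 2.0 1.0 -- H(f(X)) <= H(X) in both cases
```

The bijection merely relabels the data (entropy unchanged); the collapse loses a bit. Neither deterministic map produces more information than was in its input, which is the formal intuition behind Dembski's argument.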

The problem of information has not been solved. The expression above illustrates simply that natural processes and algorithms are well suited to manipulate information, but they cannot create it. In fact, information seems to be subject to degrading influences much like energy is to entropy. Unless explicitly preserved by external forces, information will degrade along with the medium used to convey it. In the end, functions do not create any more information than was initially present in the data variables and function itself. In order to account for the information inherent in the function itself, Dembski deploys the universal composition function m. Hence m(i, f) = f(i) = j. Utilizing natural law or algorithms to explain the existence of CSI is much like filling one hole by digging another.[12]

The primary alternative to natural law for generating CSI in nature would be chance. The problem with this option is that it can only generate non-complex specificity or complex unspecified information. Randomly typing letters on a keyboard would result in a long string of complex information. The string itself would be highly improbable. But there is no specificity, no independently recognizable pattern that orders the data into meaningful information. Alternatively, the typist may type out the correct character string for a simple word, perhaps “t-h-e.” The typist may get lucky in typing an intelligible word here or there, but the specified sequence will maintain a relatively low level of complexity. Therefore, chance cannot be said to account for the complex, specified information that is obviously intrinsic to putatively designed artifacts.[13]
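The complexity/specificity trade-off in the typing illustration can be put in numbers. Treating the keyboard as a 27-character alphabet (26 letters plus a space, a simplification of mine), any one particular long random string is astronomically improbable, while a short specified word is quite probable:

```python
ALPHABET_SIZE = 27  # assumed: 26 letters plus a space

# Any one *particular* 28-character string is complex (highly improbable)...
p_specific_long = ALPHABET_SIZE ** -28

# ...while a specified short word like "the" is far more probable,
# hence specified but not complex.
p_the = ALPHABET_SIZE ** -3

print(f"{p_specific_long:.1e}")  # about 8e-41
print(f"{p_the:.1e}")            # about 5e-5
```

On Dembski's scheme, chance comfortably produces either complexity without specificity (the long gibberish string) or specificity without complexity ("the"), but not both at once.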

A universal probability bound (UPB) is the statistical point beyond which chance cannot be said to be a causal explanation for an effect. Liberal estimates place the UPB at 1 part in 10^50. Dembski offers a more conservative estimate of the UPB based on the number of elementary particles in the universe and the duration between Planck time and the heat death of the universe. Dembski’s UPB comes to 1 part in 10^150. Any event with a probability lower than the UPB cannot be attributed to chance. Many biological organisms display a degree of complexity that would far exceed even Dembski’s conservative UPB; thus they cannot be said to be products of chance.[14]
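Dembski's figure is the product of three upper bounds: elementary particles in the observable universe, the maximum number of state transitions per second (the inverse of the Planck time), and a generous lifetime for the universe. The arithmetic below follows his published derivation; whether these are the right bounds is of course part of the debate.

```python
# Dembski's three estimates (as reported in The Design Revolution):
particles = 10**80            # elementary particles in the observable universe
transitions_per_sec = 10**45  # ~ inverse Planck time: max state changes/sec
seconds = 10**25              # generous upper bound on the universe's duration

# Maximum number of elementary events the universe could ever host:
max_events = particles * transitions_per_sec * seconds
print(max_events == 10**150)  # prints True: hence a bound of 1 in 10^150
```

Any specified event less probable than one in this number of opportunities is, on Dembski's argument, beyond the reach of chance.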

We have seen that neither law nor chance by themselves can explain CSI. It seems safe to say that their combination cannot explain CSI either. Laws can only transmit or degrade information. The data contributions to law by chance would certainly not be complex specified information. Thus law and chance combined cannot create CSI.[15] The Law of Conservation of Information (LCI) simply entails that for any CSI x to be explained in terms of antecedent naturalistic circumstances (events or states of affairs) y, then the CSI of x was previously present already in y.[16]

There are a number of important corollaries to the LCI, the most important of which is that any closed, finite system of natural causes that exhibits CSI must have received its CSI prior to becoming closed. This corollary directly contradicts the philosophy of science proffered by materialists like Richard Dawkins and Daniel Dennett. Their view of scientific explanation is purely reductive: complex beings arise from less complex beings. But if the LCI is true, then reductive, naturalistic explanations for the existence of CSI cannot be true, even in principle.[17]

In his newly revised advanced evolutionary biology textbook, Graham Bell agrees that the origin of biological complexity is difficult to explain (presumably the explanation is difficult on any naturalistic model). “Nevertheless,” Bell states, “it has been found that self-replicating RNA molecules will appear in the culture tubes, even if the cultures are not inoculated with Qb RNA. It seems that they evolve from very short RNA sequences that form spontaneously in solutions of single nucleotides...”[18] Bell is referring to the Qb virus that infects bacteria and uses its genetic materials to encode various proteins, one of which is the Qb replicase, an enzyme that catalyses the replication of Qb RNA.[19] According to Bell, this phenomenon is an instance of molecular evolution de novo. New biological information arises from the concentration of replicase.

Bell’s conclusion, however, is controversial. William Dembski states that the case described above is not an instance of true replication since the virus needs to leverage the cell’s “genetic machinery.” In order for the RNA molecules to replicate, the replicase enzyme must be present. So even if the above scenario did describe an authentic instance of replication, we would still need to explain the origin of the replicase.[20]

Prominent Intelligent Design advocate and biochemist Michael Behe calls Bell’s discussion of the Qb RNA replication “grossly misleading.” He does not deny that RNA strands will emerge in the scenario that Bell describes, but their emergence is due to the prior presence of the replicase enzyme as well as the presence of high concentrations of nucleotides. If the two are combined, then after an unspecified temporal delay, RNA molecules will form and they will replicate (this is not surprising since the replication enzyme is present). But the RNA molecule will consist of randomly strung together nucleotides that contain no specified information. The unspecified string of nucleotides will then replicate. “But,” Behe insists, “the string codes for nothing at all. It is not the generation of information; it’s just stringing together nonsense letters.”[21] Behe’s description of the unspecified string of nucleotides is the informational equivalent of allowing a monkey to execute a series of keystrokes on a computer. The result is meaningless, and Bell’s scenario utterly fails to explain the origin of complex, specified biological information. Behe concludes that “Darwinists must be pretty desperate to keep plying this old chestnut.”[22]

Other Considerations

According to Richard Dawkins, natural selection is the answer to biological diversity. It is the creative and unifying force in nature. In some significant ways, however, Dawkins’ affirmation is rather uninteresting. No rational person denies that natural selection occurs. The disagreement arises when the process of natural selection is attributed a creative efficacy that it simply does not possess. Phillip Johnson has argued that rather than being a diversifying force in nature, natural selection is instead a conserving force. Those organisms that are at the genetic or morphological extremes for their species are generally less fit to survive. Thus, variations from the norm are selected out of the population to preserve the core population according to its kind.[23]
The basic tenet of evolution is that organisms generally reproduce at exponential rates. A limited environment causes organisms to compete for scarce resources. Genetic mutations (as they are now known under the neo-Darwinian synthesis) may occasionally confer reproductive advantage. Those organisms that reproduce the most are statistically better positioned for survival. Traits that confer reproductive advantages may spread throughout the population and ultimately become the basis for further variation and selection.[24]

Natural selection predicts that those organisms that reproduce the most are the most fit for survival; those most fit for survival are in turn defined as those organisms that reproduce the most. Success or advantage in evolutionary terms “has no inherent meaning other than actual success in reproduction.”[25] The tautological nature of natural selection causes one to wonder about the explanatory power of a theory that adds nothing to direct observation. If natural selection is so patently obvious, one may wonder why an allegedly sophisticated scientific theory is needed to explain it.[26] Johnson wryly observes, “When I want to know how a fish can become a man, I am not enlightened by being told that the organisms that leave the most offspring are the ones that leave the most offspring.”[27]

Artificial Life and Natural Selection

Despite claims by intelligent design advocates that novel biological information cannot be generated by unintelligent processes, many advocates of evolution argue that computer simulations of artificial life conclusively show that self-replicating processes evolve new information. Graham Bell cites University of Delaware researcher Thomas Ray’s work on the “Tierran” experiment.[28] Bell affirms that Ray’s experiments in artificial life show that self-replicating “processes” with a minimal pre-specified variation are capable of exhibiting parasitic features.[29] The “system is something more than an eternally recurring set of computer instructions...”[30] It copies itself, and numerous iterations show impressive results in terms of modeling some important features of evolution.

One major issue with Ray’s Tierran experiment is that, since the program copies itself, there should be some accounting for the information in the first copy. The program is rather complex: 80 bytes in length. The project completely ignores the problem of getting 80 bytes together in a useful collection in the first place. Robert Newman, New Testament scholar and theoretical astrophysicist, explains:[31]

“Each byte in Tierran has 5 bits or 32 combinations, so there are 32^80 combinations for an 80-byte program, which is 2 x 10^120. Following Ludwig's scheme of using all the earth's 100 million PCs to generate 1000 80-byte combinations per second, we would need 7 x 10^100 years for the job. If all 10^90 elementary particles were turned into computers to generate combinations, it would still take 7 x 10^10 years, several times the age of the universe. Not a likely scenario, but one might hope a shorter program that could permanently start reproduction might kick in much earlier.”[32]
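Newman's search-space figure is easy to verify; the short check below is my own, not Newman's (the time estimates additionally depend on rate assumptions he does not fully spell out, so only the space size is computed here).

```python
# Tierran instructions are 5 bits wide: 2^5 = 32 possible "opcodes".
opcodes = 2 ** 5
program_length = 80  # instructions ("bytes") in Ray's ancestor program

# All possible 80-instruction programs:
search_space = opcodes ** program_length

print(f"{search_space:.2e}")  # ~2.6e120, matching Newman's "2 x 10^120"
```

Equivalently, an 80-byte program at 5 bits per byte is a 400-bit object, so the space is 2^400, roughly 2.6 x 10^120 candidate programs.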

Other significant issues include the rate of mutation programmed into Tierran. The established rate is something like 1 mutation per 5,000 copies. Dawkins recognizes that mutations, or variance in DNA replication, occur at a much lower frequency, something like 1 variant per 1 billion copies.[33] It is an open question whether copying variance at such a low rate could account for the biological diversity we see in the natural world given current estimates of the age of the earth. It seems likely, however, that the entire universe has not existed long enough to allow for the observed diversity in biological information based on mutation and Dawkins’ cumulative selection approach. Moreover, in Tierran, the discarded programs make their “innards” (usable information content) available for assimilation into newly produced copies. This is utterly unlike the way the biological world works.

Richard Dawkins created a program that he believes relevantly simulates natural selection. He readily acknowledges that single-step selection is basically statistically impossible for the features that we see in the natural world. He believes, however, that natural selection is a cumulative mechanism that reduces otherwise prohibitive odds.[34] In his computer program, he specifies a scenario such that the computer is instructed to generate a random sequence of 28 characters. The program then selects those sequences that statistically conform to a target sequence (METHINKS IT IS LIKE A WEASEL), discarding the others. In short order Dawkins’ program is able to select the target sequence.
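Dawkins describes the idea of the program but not its exact parameters; the sketch below is a common minimal reconstruction, with a mutation rate and brood size of my own choosing.

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # 26 letters plus the space
MUTATION_RATE = 0.05  # illustrative guess, not Dawkins' published figure
BROOD_SIZE = 100      # likewise illustrative

def score(phrase):
    # Cumulative selection's criterion: count letters matching the target.
    return sum(a == b for a, b in zip(phrase, TARGET))

def mutate(parent):
    # Each character has a small chance of being replaced at random.
    return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE
                   else c for c in parent)

random.seed(0)
phrase = "".join(random.choice(ALPHABET) for _ in TARGET)  # random start
generations = 0
while phrase != TARGET:
    # Keep the best of the parent and its brood: cumulative, not single-step.
    brood = [phrase] + [mutate(phrase) for _ in range(BROOD_SIZE)]
    phrase = max(brood, key=score)
    generations += 1

print(generations)  # converges in tens of generations, not ~10^40 attempts
```

The code also makes the point at issue in the text visible: `score` compares every candidate against a target known in advance, so the selection is explicitly goal-directed from the outset.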

The obvious flaw in Dawkinsian Weasel programs is that actual molecules in situ do not have a target in mind.[35] Hence, there can be no statistical selection toward any goal, since no goal exists. Dawkins acknowledges this point but basically dismisses it, assaulting what he considers to be a vain human construct “that cherishes the absurd notion that our species is the final goal of evolution.”[36] He quickly turns to a different program that selects based on imagery. As Meyer rightly points out, natural selection can only determine function “if considerations of proximity to possible future function are allowed, but this requires foresight that molecules do not have.”[37] Attributing such foresight or rationale to inanimate molecules is grossly anthropomorphic.

Conclusion

Prior to Darwin, selection was regarded as a mental process confined to intelligent agents who had the ability to reflect and deliberate on various possibilities. Darwin argued that unintelligent processes possess the capacity to discriminate in a meaningful way among alternatives. Dembski writes that “Darwin perpetrated the greatest intellectual swindle in the history of ideas.”[38] Information is holistic in nature; it is not the “mereological sum of its constituent” parts.[39] Natural processes cannot account for the requisite information in biological organisms and the appearance of design that such information entails. We have seen that neither natural law, nor chance, nor chance combined with law is capable of creating CSI. Qb virus replication does not represent an instance of de novo biological information emerging spontaneously from a highly concentrated organic environment. Major conceptual questions have been raised regarding the explanatory value of natural selection given that it has often been formulated as a tautology. And major research on Artificial Life has been shown to lack relevance to natural processes in the actual world.

As stated in the introduction, the origin of biological information has not been settled in favor of naturalism. Naturalism cannot account for it; therefore Dawkins’ confident assertion that Darwin’s theory blew Paley’s design argument out of the water is perfectly absurd.
Philosopher Dan Dennett has credited Darwin’s theory of evolution with being the best idea anyone has ever had. But it is much like a magician performing tricks at a distance. Modern biochemistry, molecular biology, and information theory have been handed to the scientific community as binoculars. Science can see past Darwin’s neat trick. “It’s time,” Dembski declares, “to lay aside the tricks – the smoke screens and the handwaving, the just-so stories and the stonewalling, the bluster and the bluffing – and to explain scientifically what people have known all along, namely, why you can’t get design without a designer. That’s where intelligent design comes in.”[40]


WORKS CITED

Books
Behe, Michael. Personal email correspondence dated June 4, 2008.

Bell, Graham. Selection: The Mechanism of Evolution. Oxford: Oxford University Press, 2008.

Berlinski, David. The Devil’s Delusion: Atheism and Its Scientific Pretensions. New York:
Crown Forum, 2008.

Darwin, Charles. The Origin of Species by Means of Natural Selection or, The Preservation
of Favored Races in the Struggle for Life. Vol. 1 of 2. Akron, OH: The Werner Company Book Manufacturers, 1904.

Dawkins, Richard. The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe
Without Design. Latest paperback edition. New York & London: W.W. Norton & Company, 1987.

________. The God Delusion. New York: Mariner Books, 2008.

Dembski, William. Intelligent Design: The Bridge Between Science and Theology. Downers
Grove: InterVarsity Press, 1999.

________. The Design Revolution: Answering the Toughest Questions about Intelligent
Design. Downers Grove: InterVarsity Press, 2004.

________. Personal email correspondence dated June 4, 2008.

Johnson, Phillip. Darwin on Trial. 2nd ed. Downers Grove: InterVarsity Press, 1993.

Meyer, Stephen C. “The Explanatory Power of Design: DNA and the Origin of Information,”
Mere Creation: Science, Faith, and Intelligent Design. William Dembski ed. Downers Grove: InterVarsity Press, 1998.
Electronic Sources

“Dr. Robert Newman,” Access Research Network, Web page: available from
http://www.arn.org/authors/newman.html; Internet; accessed June 6, 2008.

Newman, Robert. “Artificial Life and Cellular Automata,” Access Research Network, Robert C.
Newman Files, March 15, 2000. Web page: available from http://www.arn.org/docs/newman/rn_artificiallife.htm; Internet; accessed June 5, 2008.



[1]Richard Dawkins, The God Delusion. (New York: Mariner Books, 2008), 103.

[2]Ibid.

[3]Ibid.
[4]Graham Bell, Selection: The Mechanism of Evolution. (Oxford: Oxford University Press, 2008), 12-15.

[5]Charles Darwin, The Origin of Species by Means of Natural Selection or, The Preservation of Favored Races in the Struggle for Life, vol. 1 of 2 (Akron, OH: The Werner Company Book Manufacturers, 1904).

[6]Darwin, 161.

[7]William Dembski, The Design Revolution: Answering the Toughest Questions about Intelligent Design. (Downers Grove: InterVarsity Press, 2004), 139.

[8]Ibid.

[9]William Dembski, Intelligent Design: The Bridge Between Science and Theology. (Downers Grove: InterVarsity Press, 1999), 153. A detailed definition of CSI is beyond the scope of this paper. Put simply, CSI is the defining feature of intelligent agency. CSI exhibits complexity coupled with specificity to trigger a “design inference.” For more details on what constitutes CSI and criteria for its detection see section 5.3 of Intelligent Design “The Complexity-Specification Criterion.”

[10]Dembski, Intelligent Design, 292. See his footnote 19.
[11]Ibid., 160-161.

[12]Dembski, Intelligent Design, 162.
[13]Ibid. 165-166.

[14]Ibid., 166.

[15]Dembski provides a theoretical justification for this intuition. I will not belabor the point here. For more information see Dembski, Intelligent Design, 168-169. Trial and error is a widely affirmed approach to scientific experimentation which is based on the combination of law and chance. Trial and error are the conceptual bases for computer programs that allegedly simulate evolution by natural selection. The trials are executed and less desirable outcomes are discarded. The problem with these programs is that they often contain a goal or standard by which the selections are made. This is blatant telos, the very thing that evolution by natural selection attempts to avoid. I will address the issue of computer programs in more detail below.

[16]Dembski, The Design Revolution, 162.

[17]Ibid., 162-163.

[18]Bell, 10.

[19]Ibid., 1.
[20]William Dembski. The source is a personal email correspondence dated June 4, 2008.

[21]Michael Behe. The source is a personal email correspondence dated June 4, 2008. The quote and the remainder of Behe’s position on the Qb replicase issue are taken from this email.

[22]Ibid.
[23]Phillip Johnson, Darwin on Trial. 2nd ed. (Downers Grove: InterVarsity Press, 1993), 16.

[24]Ibid., 17.
[25]Ibid., 21.

[26]David Berlinski made a similar point in The Devil’s Delusion: Atheism and Its Scientific Pretensions. (New York: Crown Forum, 2008). I was unable to verify the actual reference page.

[27]Johnson, 22.

[28]Bell, 17-19.
[29]Among experts, there is a debate concerning the proper way to correctly define “life.” Since computer programs like Tierra can replicate themselves, this seems to satisfy at least one important feature of the traditional view of what constitutes life. Hence, we may now be reluctant to call self-replicating programs like Tierra merely “processes.” Should we call them “beings?” Of course these programs exhibit no self-awareness, but few people regard self-awareness as an essential property of life. Further discussion of this matter is beyond the scope of this paper.

[30]Bell, 17.

[31]“Dr. Robert Newman,” Access Research Network, Web page: available from http://www.arn.org/authors/newman.html; Internet; accessed June 6, 2008.

[32]Robert Newman, “Artificial Life and Cellular Automata,” Access Research Network, Robert C. Newman Files, March 15, 2000. Web page: available from http://www.arn.org/docs/newman/rn_artificiallife.htm; Internet; accessed June 5, 2008.

[33]Dawkins, The Blind Watchmaker, 124.

[34]Richard Dawkins, The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe Without Design. Latest paperback edition (New York & London: W.W. Norton & Company, 1987), 49.
[35]Stephen C. Meyer. “The Explanatory Power of Design: DNA and the Origin of Information,” Mere Creation: Science, Faith, and Intelligent Design. William Dembski ed. (Downers Grove: InterVarsity Press, 1998), 128.

[36]Dawkins, The Blind Watchmaker, 50.

[37]Meyer, 128.

[38]Dembski, The Design Revolution, 263.

[39]Dembski, Intelligent Design, 173.

[40]Dembski, The Design Revolution, 263.

Thursday, June 11, 2009

Some thoughts on Old vs. Young Earth Creationism

The following is a contribution I recently made to an internal debate among Christian apologists regarding the proper interpretation of the creation narratives of Genesis and the age of the earth/universe:

I'll just add this one (rather lengthy) comment to this conversation and then happily retire from the discussion. Now, I've studied Greek and Hebrew enough to know that there is a lot I don't know. But I've talked to, and read some of the works of, highly regarded evangelical Hebrew scholars and have found that there is quite possibly much more that went into the writing of the creation narratives than a casual reading might indicate.

First, I don't know that a straightforward "literal" 24 hours is not anachronistic, much like reports of Jesus' sayings in the New Testament are not likely word for word. (Such was not the authors' intent and would have been quite foreign to the first-century gospel writers' cognitive processes and intentions.) There are numerous scholars who agree that the Genesis account of creation was originally written for polemical purposes against other ANE creation myths. Perhaps there is something of exegetical significance in this, perhaps not.

I think that a "literal" 24 hours is a gross oversimplification of the issue given what we know about time and relativity (i.e., reference frames, etc.). See Gerald Schroeder's book Genesis and the Big Bang (although I would not rely on the Jewish mystics and numerous other aspects of his hermeneutics). It should be noted that serious defenders of Christian orthodoxy have approached the creation texts allegorically. Some even thought that six days would actually be an insult to a perfect being who could do all creating in an instant.

Bruce Waltke is about as conservative as they come and one of the leading OT scholars alive today. In his celebrated An Old Testament Theology he explains that liberal theologians stand above the Bible holding up higher criticism and their "assured results." Neo-orthodox theologians stand before the Bible such that through preaching the words of the Bible become the word of God (a canon within a canon which places authority with the audience). Traditionalists place confessions/traditions alongside the Bible which often end up nullifying the biblical witness. Fundamentalists stand on the Bible. I will now quote Waltke at length:

"By 'fundamentalist' I mean here those who presume the Bible does not stray from their standards of accuracy, especially in matters of science and historiography. They presume their interpretive horizon represents truth and that the biblical writers, though writing in an ancient environment, will not stray from the "accuracy" of their modern horizon. But the ancient standards do not necessarily conform to modern standards. The only legitimate human standard by which the Bible can be measured is the logic of noncontradiction. Paradox may be incomprehensible, but contradiction is 'non-sense.' What I have in mind here is that the fundamentalists do not "stand under" the Bible long enough to "understand" it. Sometimes they, though well intentioned, advertise 'the Bible as it is for men as they are,' but they neglect the prior question of whether 'men as they are are fit for the Bible as it is." (Waltke, 77).

I will also note that many of the contemporary liberal New Testament radicals come from brittle fundamentalist backgrounds. "Show me one error in the Bible and I throw out the whole thing." Craig Evans suggested that that was what happened in the case of Bart Ehrman. (Note: Ehrman clarified that his move to agnosticism was based on his inability to reconcile the concept of God with the existence of evil. Prior to that, he stated at the 2007 Greer-Heard debate with Daniel Wallace, he knew of textual variants and had shifted to a more mainline, non-evangelical Christianity.)

Waltke continues:

"Many Christians subconsciously maintain a naivete that in fact is a studied neglect toward the Bible. They resist learning about critical issues, such as the existence of differing Hebrew texts and versions of the biblical text, the need for textual critics to choose among the variant forms, the uncertainty of the meaning of some Hebrew words in the Old Testament, and so on. These types of questions make us uncomfortable because answering them requires that we place ourselves above the text. It forces us to play the role of the critic, making judgments about the history, social situations, and literary forms. This role is spiritually and psychologically difficult for the pious, but in the exegetical process, these and other types of judgments have to be made. To back away from these questions in the name of piety is to flee the responsibility God has given us. On the other hand, some sophomoric students, having cast off the original naiveté, retain a suspicious stance toward the Bible. This is a spiritually impoverishing position because being above the text means that we cease to hear the text speaking directly to us. Consequently, we are cut off from the life-giving power of the word of God.

The correct balance is to first cast off our original naiveté, prayerfully tackle difficult exegetical questions, and then reassume a stance in subjection to the text--what Paul Ricoeur calls a 'second naivete,' a childlike acceptance in faith of the text's message. In practice this means that having done our critical work on the text, we insist on submitting ourselves to it, accepting its truth and its authority in our lives. This is a difficult balance to achieve, but God's grace through the Holy Spirit will generate this stance in those who pray for it." (Waltke, 82-83, italics mine).

The sense I get from some of [our exchanges] is that perhaps we have presumed to stand on the Bible before we have properly stood under it. This is a huge hermeneutical question we are dealing with here.

My point in all of this is that I am getting a sense [from some contributors] that the hermeneutical questions regarding Genesis have been thoroughly settled. I have studied this issue enough to suspect that they have not been. I am trying to "stand under" the Bible in order to understand it well enough to stand on my convictions regarding the proper hermeneutic once I have done due diligence. But this is a time-consuming task. Those who claim to have listened to a few sermons from high-profile evangelical leaders and to have the answers simply have not done the work. Let us be workmen who rightly divide the word. I trust that is what we are all attempting to do. Let us do so with charity and intellectual humility.

I’ll now close with Waltke’s own position, which I endorse:

“I label my own position as ‘evangelical’ for lack of a better term. I accept the inerrancy of Scripture as to its Source and its infallibility as to its authority. My spiritual conviction is intellectually defensible. The finite mind is incapable of coming to infinite truth and moreover is depraved. To live wisely I need the inspired revelation of the divine reality by which I can judge the wisdom or the folly, the right or the wrong, of my thoughts and actions. But I dare not presume to understand how or what this revelation means before coming to it on its own terms. I must allow the Bible to dictate how it seeks to reveal God’s truth. I study how it writes history; I examine and learn to recognize the different forms of literature: poetry, narrative, prophecy, and so on. I consider the Bible utterly trustworthy, and I commit my life to it, but I do not presume to know beforehand the exact nature of its parts. With this posture, I continue to learn and allow myself to be taught and corrected by the Bible.” (Waltke, 77)

I do lean toward an old earth/universe position, both because I think there is sufficient hermeneutical room to allow for it and because it accords well with what are generally taken to be highly verified scientific theories. Such theories, fallible though they are, are products of general revelation and of rationality, an important aspect of the imago dei that remains, I am persuaded, even in the unregenerate. I don't know what to think about evolution, other than to say that the origin of life certainly did not occur by chance; neither do irreducibly complex molecular machines just magically assemble themselves all at once. Gradual transition, as necessitated by Darwin's theory as I understand it, has been falsified by the fossil record.

But what about any kind of death or pain before original sin? I'm not sure what to say. There are, however, some natural law theodicies that may shed light on the issue of pain and suffering as necessary for a rationally intelligible and morally significant world (cf. Bruce Reichenbach's "Natural Evils and Natural Law: A Theodicy for Natural Evils," 1976). Would Adam have felt pain if he'd stubbed his toe against a stone prior to original sin? I tend to think that he would have, and that it would have been a good thing. And there was death in the plant kingdom prior to original sin, inasmuch as Adam and Eve ate the fruits of the garden. These are some of the issues that float around in my head, and I have so much other reading to do that I've not had time to sort them all out. But the one thing I will say is that the issue of young vs. old creationism is not as simple as some...would seem to suggest. I concur...that if this discussion is going to degenerate into name-calling or outright demagoguery, then we should move on to another subject that is more edifying for the saints and will enable us to be obedient to Paul's injunction to redeem the time, for the days are evil.


Fides Quaerens Intellectum

Tuesday, June 02, 2009

My aspiring new philosopher


Check out Luke...contemplative? I'll invite readers to suggest captions for this photo in the comments section.