Evolution: Searching for Truth in All The Wrong Places

[Image: the bacterial flagellum]

Prepared by Ed Hopkins

A compilation of the work of five scientists in the areas of:
Intelligent Design, irreducible complexity, complex specified information, and information and chance.

Neo-Darwinians: Searching for Truth in All the Wrong Places

Purpose of this paper

This paper draws on five authors and a selection of their writings, seven books in all. These five authors contribute significant concepts, ideas, and facts regarding the need for design, or for that matter a designer, to be involved in science, particularly in biology and the origin of life. Science from the materialistic and evolutionary point of view is shown to be severely limited in its search for scientific truth because it imposes self-administered limitations on the way it looks for truth. These authors expose those limitations and collectively present an unshakeable array of logic and facts that leads to the recognition of design, of the source of information, and of the complexity of life as we know it.

Rather than wading through all seven books, perhaps with limited time to do so, the reader can find much of the pertinent information from those books distilled into this thirty-eight-page paper. For further development and analysis of the subject, it is recommended that the reader, if so led, read the books themselves, or at least the sections of them that appeal to him the most.

Darwin's Black Box (1996) - Michael Behe - See biography in appendix


Molecular Machines
Machines turn cellular switches on and off, sometimes killing the cell or causing it to grow. Solar-powered machines capture the energy of photons and store it in chemicals. Electrical machines allow current to flow through nerves. Manufacturing machines build other molecular machines, as well as themselves. Cells swim using machines, copy themselves with machinery, and ingest food with machinery. In short, highly sophisticated molecular machines control every cellular process. Thus the details of life are finely calibrated, and the machinery of life enormously complex.

If you focus your search on the question of how molecular machines - the basis of life - developed, you find an eerie and complete silence. The complexity of life's foundation has paralyzed science's attempt to account for it; molecular machines raise an as-yet-impenetrable barrier to Darwinism's universal reach. (Behe, Darwin's Black Box, DBB p. 4-5)


Early Controversy over Darwinism
Mathematicians over the years have complained that Darwinism's numbers just do not add up. Information theorist Hubert Yockey argues that the information needed to begin life could not have developed by chance. In 1966 leading mathematicians and evolutionary biologists held a symposium at the Wistar Institute in Philadelphia because the organizer, Martin Kaplan, had overheard "a rather weird discussion between four mathematicians….on mathematical doubts concerning the Darwinian theory of evolution." A mathematician who claimed that there was insufficient time for the number of mutations apparently needed to make an eye was told by the biologists that his figures must be wrong. The mathematicians were not persuaded that the fault was theirs. As one said:

There is a considerable gap in the neo-Darwinian theory of evolution, and we believe this gap to be of such a nature that it cannot be bridged with the current conception of biology. (Behe, DBB p. 29)

Darwin's theory has generated dissent from the time it was published, and not just for theological reasons. In 1871 one of Darwin's critics, St. George Mivart, listed his objections to the theory, many of which are surprisingly similar to those raised by modern critics.

What is to be brought forward (against Darwinism) may be summed up as follows: That "Natural Selection" is incompetent to account for the incipient stages of useful structures. That it does not harmonize with the co-existence of closely similar structures of diverse origin. (Behe, DBB p. 30)

Darwin knew that his theory of gradual evolution by natural selection carried a heavy burden:
"If it could be demonstrated that any complex organ existed which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down." (Behe DBB p. 39)


Irreducible Complexity
What is an irreducibly complex system? According to Behe:
By irreducibly complex I mean a single system composed of several well-matched, interacting parts that contribute to the basic function, wherein the removal of any one of the parts causes the system to effectively cease functioning. An irreducibly complex system cannot be produced directly (that is, by continuously improving the initial function, which continues to work by the same mechanism) by slight, successive modifications of a precursor system, because any precursor to an irreducibly complex system that is missing a part is by definition nonfunctional.

Since natural selection can only choose systems that are already working, then if a biological system cannot be produced gradually it would have to arise as an integrated unit, in one fell swoop, for natural selection to have anything to act on. (Behe, DBB p. 39)

In order to be a candidate for natural selection a system must have minimal function: the ability to accomplish a task in physically realistic circumstances. As an example, suppose the world's first outboard motor had been designed and was being marketed. The motor functioned smoothly, burning gasoline at a controlled rate and causing the propeller to turn at one revolution per hour. Few people, if any, would buy such a machine, because it fails to perform at a level suitable for its purpose. Minimal function is critical in an evolutionary process. One of the roadblocks for evolution is reaching minimal function immediately: evolution cannot accumulate proteins and less-than-minimal processes, or natural selection will reject them. (Behe, DBB p. 45-46)

Let us now take a closer look at one of these machines. Behe describes in some detail the bacterial flagellum.
The bacterial flagellum is a long, hair-like filament embedded in the cell membrane. The external filament consists of a single type of protein, called "flagellin." The flagellin filament is the paddle surface that contacts the liquid during swimming. At the end of the flagellin filament, near the surface of the cell, there is a bulge in the thickness of the flagellum. It is here that the filament attaches to the rotor drive. The attachment material is comprised of something called "hook protein." The filament of a bacterial flagellum, unlike a cilium, contains no motor protein; if it is broken off, the filament just floats stiffly in the water. Therefore the motor that rotates the filament-propeller must be located somewhere else. Experiments have demonstrated that it is located at the base of the flagellum, where electron microscopy shows several ring structures occur. The rotary nature of the flagellum has clear, unavoidable consequences, as noted in a popular biochemistry textbook.

The bacterial rotary motor must have the same mechanical elements as other rotary devices: a rotor (the rotating element) and a stator (the stationary element). Other elements mentioned in the diagram accompanying Behe's description include bushings, studs, and a C ring as part of the stator; a rod or drive shaft; and a rotor made up of an S ring and an M ring. All these structures are proteins required for the efficient operation of the flagellum. About forty other proteins are necessary for function. The exact roles of most of these proteins are not known, but they include signals to turn the motor on and off; "bushing" proteins to allow the flagellum to penetrate through the cell membrane and cell wall; proteins to assist in the assembly of the structure; and proteins to regulate the production of the proteins that make up the flagellum. (Behe, DBB p. 72-73)


Evidence for Molecular Evolution
Molecular evolution is not based on scientific authority. There is no publication in the scientific literature - in prestigious journals, specialty journals, or books - that describes how molecular evolution of any real, complex, biochemical system either did occur or even might have occurred. There are assertions that such evolution occurred, but absolutely none are supported by pertinent experiments or calculations. Since no one knows molecular evolution by direct experience, and since there is no authority on which to base claims of knowledge, it can truly be said that - like the contention that the Eagles will win the Super Bowl this year - the assertion of Darwinian molecular evolution is merely bluster. (Behe, DBB p. 185-186)

Design is evident when a number of separate, interacting components are ordered in such a way as to accomplish a function beyond the individual components. The greater the specificity of the interacting components required to produce the function, the greater is our confidence in the conclusion of design. (Behe DBB p. 194)

The conclusion that something was designed can be made quite independently of knowledge of the designer. As a matter of procedure, the design must first be apprehended before there can be any further question about the designer. The inference to design can be held with all the firmness that is possible in this world, without knowing anything about the designer. (Behe, DBB p. 197)

The designing that is currently going on in biochemistry laboratories throughout the world - the activity that is required to plan a new plasminogen that can be cleaved by thrombin, or a cow that gives growth hormone in its milk, or a bacterium that secretes human insulin - is analogous to the designing that preceded the bacterial flagellum. (Behe DBB p. 205)
The fact that biochemical systems can be designed by intelligent agents for their own purposes is conceded by all scientists, even Richard Dawkins. Since Dawkins agrees that biochemical systems can be designed, and that people who did not see or hear about the designing can nonetheless detect it, then the question of whether a given biochemical system was designed boils down simply to adducing evidence to support design. (Behe DBB p. 203)


Behe - Major Points
1. Highly sophisticated molecular machines control much of cellular activity.
2. The origin of these machines can be found nowhere in scientific literature.
3. Many mathematicians have agreed that there is a considerable gap in neo-Darwinian evolutionary theory that cannot be bridged with biology as we know it today.
4. The problems identified over a hundred years ago are still present today.
5. Darwin recognized that his theory would fail if a complex organ could not be formed by numerous slight modifications.
6. Irreducible complexity identifies just such organs or structures, and many of them.
7. Natural Selection appears not to be a viable method to account for these machines.
8. Design is evident when separate individual components are ordered to create a function beyond the ability of the components themselves.
9. Design is a detectable entity.

Not By Chance - Shattering the Modern Theory of Evolution (1998)
By Dr. Lee Spetner - See biography in the appendix


Adaptive Mutations
Dr. Lee Spetner in his book Not By Chance (NBC) has researched the area of adaptive mutations. Following are some of his findings.

- Barbara McClintock, who received the Nobel Prize in 1983 for her work on genetic rearrangements, noted that there are indications that these genetic modifications occur in response to stress.
- Barry Wanner of Emory University has suggested that genomic rearrangements could be part of a control system in bacteria that would produce heritable changes in response to environmental cues.
- John Cairns and his team at the School of Public Health at Harvard University described other experiments with bacteria and concluded:

The cells may have mechanisms for choosing which mutation will occur…. Bacteria apparently have an extensive armory of such 'cryptic' genes that can be called upon for the metabolism of unusual substrates. (Spetner NBC p. 187-189)

Dr. Spetner suggests that these experiments, which indicate that adaptive mutations are stimulated by the environment, contradict the basic dogma of neo-Darwinism, according to which mutations are random and should occur independently of the environment. He further suggests that other organisms, apart from bacteria, may also have latent parts of their genome dedicated to adapting to a certain set of environmental conditions that may arise. (Spetner NBC p.190-192)

A parallel process appears to take place during embryological development. Animal embryos develop under the joint influence of their genetic program and the environment. Signals from both sources act together in the development process. Different cells in an embryo, even though they have identical DNA, take different developmental pathways because their signal inputs are different. Indeed, that is what differentiation in the embryo is all about.

Essentially, we can see different phenotypes of animals and plants depending upon how the environment, along with the individual's genetic makeup, affects the developmental pathways. Dr. Spetner wonders how much of the fossil record might be the result of the direct influence of environment on the phenotype without any change in the genotype. (Spetner, NBC p.195-196)


Chance
Spetner has a few things to say about the probability of mutations in living things. For cumulative selection to work, a lot of good mutations have to occur by chance. At each step, a mutant with a positive selective value has to appear. It also has to be lucky enough to survive and eventually to take over the population. Then another good mutation has to appear for the next step, and so on. The neo-Darwinians seem to think the chance of all this happening is large enough to make evolution work. But no one has ever shown that to be so. No one has ever shown that such a thing is likely - or even possible.

Richard Dawkins believes that macroevolution takes place gradually through long chains of many small steps, a process he calls "cumulative selection." Dawkins talked about chance, but he never calculated the chance of anything. (Spetner, NBC p.162)

The rarity of copying errors is a problem for the neo-Darwinian theory. In bacteria the mutation rate per nucleotide is between 0.1 and 10 per billion transcriptions, but in all other forms of life the rate is smaller, between 0.01 and 1 per billion. Error rates are low only because the cell has a proofreading mechanism that corrects most of the errors made in transcription. The geometric mean mutation rate works out to about one in ten billion. (Spetner, NBC pp. 91-93)
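As a rough check on that figure (my own illustration, not Spetner's), the geometric mean of a range is the square root of the product of its endpoints; applying it to the rates quoted above gives about one error in ten billion for non-bacterial life:

```python
import math

# Per-nucleotide copying-error rates quoted above (errors per transcription).
bacteria = (0.1e-9, 10e-9)      # 0.1 to 10 per billion
other_life = (0.01e-9, 1e-9)    # 0.01 to 1 per billion

def geometric_mean(low, high):
    """Geometric mean of a range: the square root of low * high."""
    return math.sqrt(low * high)

print(geometric_mean(*bacteria))    # 1e-09 -> about one error per billion copies
print(geometric_mean(*other_life))  # 1e-10 -> about one error per ten billion copies
```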

Some other elements that are necessary for cumulative selection to work include the following:
- Each step must have a selective advantage, and each must, on average, add a small amount of information to the genome. (Spetner, NBC p.163)
- Each improvement must be significant. Most "slight" improvements will not be the basis for anything. Unless they occur in large numbers they are likely to disappear.
- The mutation that leads to the improvement must also be dominant. If it were recessive, the mutation would have to appear in both a male and a female, and they would have to find each other.
- A mutation, even if favorable, will have only a very small chance of establishing itself in the species if it occurs once only. (Spetner, NBC p.101-102)
- Information must be added in small steps. Each step cannot add much more than one bit, roughly the change that would affect one nucleotide. If a change seems to carry much more than one bit of information, it cannot be a step of cumulative selection.
- No mutations that have selective value are known to satisfy these conditions. They either reduce the information in the genome, or they seem to add too much. (Spetner, NBC p. 106)


Spetner - Major Points
1. Adaptive mutations are stimulated by the environment. This contradicts the basic tenets of evolution.
2. Animal embryos develop according to a dual process involving the influence of their genetic program and their environment.
3. Cumulative selection, whereby a series of positive mutations, each of which must survive, leads to a new organism, has never been demonstrated.
4. Low mutation rates are a problem for evolution due to a proofreading process that corrects most of the errors in transcription.
5. Information must be added incrementally. Gains of new information cannot be accounted for; losses of information during mutations can.

Werner Gitt - In the Beginning was Information (IBI) (2001)
See biography in the appendix.


Information
The first question Dr. Gitt focuses on is whether information is a material or a mental quantity. The essential aspect of each and every piece of information is its mental content, and not the number of letters used. (Gitt, IBI p. 45) Gitt goes on to formulate one of his theorems.

Theorem 1: The fundamental quantity information is a non-material (mental) entity. It is not a property of matter, so that purely material processes are fundamentally precluded as sources of information. (Gitt IBI p. 47)

Gitt devised five levels of information briefly summarized as follows:

Level One - Statistics (Gitt IBI p. 54)
At this level it is immaterial whether we are dealing with actual meaningful text, with pure nonsense, or with random sequences of symbols or words. Examples of statistical questions are: how many letters, numbers, and words make up the entire text, and how frequently do certain letters and words occur?


Level Two - Syntax (Gitt IBI p. 57)


At this level, letters in a book do not appear in random sequences; only certain combinations of letters are allowed, such as English words. Neither is it a random process when words are arranged in sentences; the rules of grammar must be adhered to. The arrangement of words into sentences to form information-bearing sequences of symbols is subject to specific rules based on deliberate conventions for the particular language.

Syntax is meant to include all structural properties of the process of setting up information. At this level we are only concerned with the actual sets of symbols or codes and the rules governing the way they are assembled into sequences, regarding grammar and vocabulary, independent of any meaning they may or may not have.

This second level includes Dr. Gitt's Theorems 6, 7, and 9. Since this is a summary of the five levels of information, not all of Dr. Gitt's theorems are listed.

Theorem 6: A code is an essential requirement for establishing information.
Theorem 7: The allocation of meanings to the set of available symbols is a mental process depending on convention.
Theorem 9: If the information is to be understood, the particular code must be known to both the sender and the recipient.
Matter as such is unable to generate any code. All experience indicates that a thinking being, voluntarily exercising his own free will, cognition, and creativity, is required.


Level Three - Semantics (Gitt IBI p. 69)
At this level we are interested in the meaning of the code. It is the message being conveyed, the conclusions and the semantics or meanings. It is the meaning sent by the sender and received by the recipient that changes a sequence of symbols into information.

Typical semantic questions are:

Concerning the sender:
- What are the thoughts in the sender's mind?
- What meaning is contained in the information being formulated?

Concerning the recipient:
- Does the recipient understand the information?
- Is the message true or false?

Theorem 13: Any piece of information has been transmitted by somebody and is meant for somebody. A sender and a recipient are always involved whenever and wherever information is concerned.


Level Four - Pragmatics (Gitt IBI p. 73)
Every transmission of information indicates that the sender has some purpose in mind for the recipient. In order to achieve the intended result, the sender describes the actions required of the recipient to bring him to implement the desired purpose.

Theorem 18: Information is able to cause the recipient to take some action (stimulate, initialize, or implement). This reactive functioning of information is valid both for inanimate systems (e.g., a computer or an automatic car wash) and for living organisms (e.g., activities in cells, actions of animals, and activities of human beings).


Level Five - Apobetics (Gitt IBI p. 75)
Apobetics is the teleological aspect of the information or the question of the purpose. For every result on the side of the recipient there is a corresponding conceptual purpose, plan, or representation in the mind of the sender. The teleological aspect of information is the most important, because it concerns the premeditated purpose of the sender. Any piece of information involves the question: "Why does the sender communicate this information, and what result does he want to achieve for or in the recipient?"

Examples could include:
- The male bird calls a mate by means of his song, or establishes his territory.
- Computer programs are written with a purpose.
- God gives us a purpose in life through the Bible.

Theorem 19: Every piece of information is intentional.
Theorem 21: The five aspects of information are valid for both the sender and the recipient. The five levels are involved in a continuous interplay between the two.
Theorem 22: The separate aspects of information are interlinked in such a way that every lower level is a necessary prerequisite for the realization of the next one above it.
Theorem 23: There is no known natural law through which matter can give rise to information, nor is any physical process or material phenomenon known that can do this.
All the theorems proposed by Dr. Gitt are based on empirical reality. They may thus be regarded as natural laws, since they exhibit the characteristics of natural laws. Any natural law can be rejected the moment a single counterexample is found, and this also holds for these information theorems. Until one or more of the theorems can be nullified, they should be valuable in assessing the nature and origin of information. (Gitt IBI p. 79) This is a challenge to the materialistic view of the post-modernist: how does a non-intelligent source in the material world produce information that meets the requirements found in all five levels?


Characteristics of Information - (Gitt IBI p. 85)
Dr. Gitt goes on to elaborate on the nature of information.

Information is not the thing itself, neither is it a condition; it is an abstract representation of material realities or conceptual relationships, such as problem formulations, ideas, programs, or algorithms. The representation is in a suitable coding system, and the realities could be objects or physical, chemical, or biological conditions. The reality being represented is usually not present at the time and place of the transfer of information, neither can it be observed or measured at that moment. (Gitt IBI p. 84)

Dr. Gitt proposes another theorem that is appropriate at this time.
Theorem 24 - Information requires a material medium for storage. (Gitt IBI p. 85)

Some of the obvious means by which humans store information are books, newspapers, computer programs, etc. Consider, though, how the information used to produce living things got stored. The DNA molecule stores information that describes how to put an organism together. The DNA code seems to parallel human methods of storing information, and thus we look for the "one who stored this information," someone who obviously is not human.

Theorem 25 - Biological information is not an exceptional kind of information, but it differs from other systems in that it has a very high storage density and that it obviously employs extremely ingenious concepts. (Gitt IBI p. 97)
Theorem 26 - The information present in living beings must have had a mental source.
Theorem 27 - Any model for the origin of life (and of information) based solely on physical and/or chemical processes, is inherently false. (Gitt IBI p. 98-99)

Creative information is the highest level of transmitted information: something new is produced. It does not involve copied or reproduced information (much as I am doing in this narrative). This kind of information always requires a personal mind, exercising its own free will, as its original source.

Theorem 29 - Every piece of creative information represents some mental effort and can be traced to a personal idea-giver who exercised his own free will, and who is endowed with an intelligent mind. (Gitt IBI p.113)

Dr. Gitt concludes this section with the following:

It should now be clear where the follies of evolutionary views lie. If someone presents a model for explaining the origin of life, but he cannot say where the creative information characteristic of all life forms came from, then the crucial question remains unanswered. Somebody who looks for the origin of information only in physical matter ignores the fundamental natural laws about information; what is more, he scorns them. It is clear from the history of science that one can ignore the laws of nature for a limited time only. (Gitt IBI p.114)

The Sender of biological information is not accessible for scientific research. Since the Sender cannot be investigated by human means, many people erroneously conclude that He does not exist, and thus they contravene the information theorems. (Gitt IBI p.135)


Effects of mutations and loss of information
All living organisms contain a great deal of information. We can appreciate how much information is present from the size of an organism's genome and, through the expression of that genome, from how complex the organism is, as seen in its structure and function. Information and complexity are related: the more information, the more complexity. If we recast the level of information as a level of specificity, we can see how information is affected by even minor alterations. Lee Spetner provides the following example:

A protein whose performance would be affected by a change in any one of its amino acids is very specific. If its performance would not be affected by a change in some of them, the protein would be less specific. Often, an enzyme is very specific: a change of any one of its amino acids results in some sort of change in the enzyme's performance. (Spetner, NBC p. 137)

A change in the genome is called a mutation. According to Spetner, all point mutations on the molecular level turn out to reduce the genetic information and not to increase it. (Spetner, NBC p.138) A point mutation that makes a bacterium resistant to streptomycin does so by losing information. If a mutation in the bacterium should happen to change the ribosome site where the streptomycin attaches, the drug will no longer have a place to which it can attach. The mutation reduces the specificity of the ribosome protein, and that means losing genetic information. The loss of information leads to a loss of sensitivity to the drug and hence to resistance. (Spetner, NBC p.141) Other mutations in organisms, that appear to be of benefit to us, such as a greater food yield, do so at a loss of information to the organism. The organism's fine machinery has been upset and the change does not improve or add information. The organism is not evolving even though it appears to be.

Here is a further example which illustrates how bacteria can alter their genome by mutation, or loss of information, and seemingly benefit themselves. Wild-type bacteria normally feed on the sugar ribitol. Xylitol is a sugar not found in nature; the two sugars are very similar in chemical structure. The enzyme ribitol dehydrogenase (RDH) acts on ribitol and only very weakly on xylitol. A mutation occurs that destroys a repressor protein which regulates the production of RDH. Now RDH is produced in such large quantities that, in spite of its normally low activity on xylitol, the rate of xylitol metabolism is increased dramatically. The strain of bacteria that has this ability to use xylitol appears to have gained genetic information. The enzyme, however, has become less specific due to a loss of information. (Spetner NBC p.150-154)


Gitt & Spetner - Major points


1. Information is a non-material mental entity.
2. A code is an essential requirement for the establishment of information.
3. Coded information must be understood by both a sender and recipient.
4. Every transmission of information indicates that the sender has some purpose in mind for the recipient.
5. Information is able to cause the recipient to take some action.
6. Regarding evolution: How does a non-intelligent source in the material world produce information?
7. Information requires a material medium for storage such as DNA.
8. Biological information has a very high storage density employing extremely ingenious concepts.
9. Every piece of creative information represents some mental effort traced to an intelligent mind.
10. Some proteins are very specific; a single change in one amino acid will change the behavior of the protein.
11. All point mutations on the molecular level reduce information.
12. The loss of information may benefit mankind, but it does not improve the organism that loses the information; the activity of the organism becomes less specific.

Books by William Dembski - See biography in appendix.

Intelligent Design (ID) - The Bridge Between Science and Theology (1999).
No Free Lunch - (NFL) - Why Specified Complexity Cannot Be Purchased without Intelligence (2002).
The Design Revolution (DR)- Answering the Toughest Questions about Intelligent Design (2004).

The following sections are quotations from Dembski's three books.

The Design Inference
In this section we will take a look at the evidences for design. William Dembski is an associate research professor at Baylor University and a senior fellow of the Discovery Institute's Center for the Renewal of Science and Culture. In his book Intelligent Design Dembski quotes Princeton theologian Charles Hodge:

There are in the animal and vegetable worlds innumerable instances of at least apparent contrivance (evidence of the mental), which have excited the admiration of men in all ages. There are three ways of accounting for them. The first looks to an intelligent agent….In the external world there is always and everywhere indisputable evidence of the activity of two kinds of force: the one physical, the other mental. The physical belongs to matter and is due to the properties with which it has been endowed; the other is the … mind of God.

The second method of accounting for contrivances in nature admits that they were foreseen and purposed by God, and that He endowed matter with forces which He foresaw and intended should produce such results. But here His agency stops. He never interferes to guide the operation of physical causes….

The third method is that which refers them to the blind operation of natural causes. This is the doctrine of the Materialists. (Dembski ID p. 87)

The problem with science is its insistence on empiricism: everything has to be measured, analyzed, and accounted for. How do you measure God, or what God has done, or might do in the future? Because of this, science and an intelligent designer part ways. There are very definite advantages to severing the world from God. Thomas Huxley, for instance, found great comfort in not having to account for his sins to a creator. Naturalism promises to free humanity from the weight of sin by dissolving the very concept of sin. (Dembski ID p. 100)

The fact remains, that intelligent causes have played, are playing and will continue to play an important role in science. Entire industries, economic and scientific, depend crucially on such notions as intelligence, intentionality and information. Included here are forensic science, intellectual property law, insurance claims investigation, cryptography, random number generation, archaeology and the search for extraterrestrial intelligence (SETI). (Dembski ID p. 91)

Can distinctions be made between physical and intelligent causes? Are there reliable marks of intelligence that signal the activity of an intelligent cause? Finding a reliable criterion for detecting the activity of intelligent causes has to date constituted the key obstacle facing Hodge's first method, determining the mind of God. (Dembski ID p. 93)

If we prescribe in advance that science must be limited to strictly natural causes, then science will necessarily be incapable of investigating God's interaction with the world. But if we permit science to investigate intelligent causes, as it already does in the earlier example of forensic science, then God's interaction with the world, insofar as it manifests the characteristic features of intelligent causation, becomes a legitimate domain for scientific investigation. (Dembski ID p. 105)


Design as a scientific theory
Scientists are beginning to realize that design can be rigorously formulated as a scientific theory. What has kept design outside the scientific mainstream these last hundred and forty years is the absence of precise methods for distinguishing intelligently caused objects from unintelligently caused ones.

What has emerged is a new program for scientific research known as intelligent design. Within biology, intelligent design is a theory of biological origins and development. Its fundamental claim is that intelligent causes are necessary to explain the complex, information-rich structures of biology and that these causes are empirically detectable. There exist well-defined methods that, on the basis of observational features of the world, are capable of reliably distinguishing intelligent causes from undirected natural causes. Such methods are found in already existing sciences such as those mentioned earlier.

Whenever these methods detect intelligent causation, the underlying entity they uncover is information. Information becomes a reliable indicator of intelligent causation as well as a proper object for scientific investigation. Intelligent design is therefore not the study of intelligent causes per se but of informational pathways induced by intelligent causes. Intelligent design presupposes neither a creator nor miracles. Intelligent design is theologically minimalist. It detects intelligence without speculating about the nature of the intelligence. (Dembski ID p. 106-107) Intelligent design does not try to get into the mind of a designer and figure out what a designer is thinking. The designer's thought processes lie outside the scope of intelligent design. As a scientific research program, intelligent design investigates the effects of intelligence and not intelligence as such. (Dembski DR p. 33)

There's a joke that clarifies the difference between intelligent design and creation. Scientists come to God and claim they can do everything God can do. "Like what?" asks God. "Like creating human beings," say the scientists. "Show me," says God. The scientists say, "Well, we start with some dust and then" - God interrupts, "Wait a second. Get your own dust." Creation asks for the ultimate resting place of explanation: the source of being of the world. Intelligent design, by contrast, inquires not into the ultimate source of matter and energy but into the cause of their present arrangements. (Dembski DR p. 38-39) Scientific creationism's reliance on narrowly held prior assumptions undercuts its status as a scientific theory. Intelligent design's reliance on widely accepted scientific principles, on the other hand, ensures its legitimacy as a scientific theory. (Dembski DR p. 43)

What will science look like once intelligent causes are readmitted to full scientific status? The worry is that intelligent design will stultify scientific inquiry. Suppose Paley was right about the mammalian eye exhibiting sure marks of intelligent causation. How would this recognition help us understand the eye any better as scientists? Actually it would help quite a bit. It would put a stop to all those unsubstantiated just-so-stories that evolutionists spin out in trying to account for the eye through a gradual succession of undirected natural causes. It would preclude certain types of scientific explanations. This is a contribution to science. Now science becomes a process whereby one intelligence is determining, what another intelligence has done. (Dembski ID p. 108-109)


The Designer
The physical world of science is silent about the revelation of Christ in Scripture. Nothing prevents the physical world from independently testifying to the God revealed in the Scripture. Now intelligent design does just this - it puts our native intellect to work and thereby confirms that a designer of remarkable talents is responsible for the physical world. How this designer connects with the God of Scripture is then left for theology to determine. (Dembski ID p. 111)

Why should anyone want to reinstate design into science? Chance and necessity have proven too thin an explanatory soup on which to nourish a robust science. In fact, by dogmatically excluding design from science, scientists are themselves stifling scientific inquiry. Richard Dawkins begins his book The Blind Watchmaker by stating, "Biology is the study of complicated things that give the appearance of having been designed for a purpose." In What Mad Pursuit, Francis Crick, Nobel laureate and co-discoverer of the structure of DNA, writes, "Biologists must constantly keep in mind that what they see was not designed, but rather evolved." (Dembski ID p. 125)


The Complexity-Specification Criterion
Whenever design is inferred, three things must be established: contingency, complexity and specification. Contingency ensures that the object in question is not the result of an automatic and therefore unintelligent process that had no choice in its production. Complexity ensures that the object is not so simple that it can readily be explained by chance. Finally, specification ensures that the object exhibits the type of pattern characteristic of intelligence.

The concept of contingency is further understood as an object, event, or structure being irreducible to any underlying physical necessity. The sequencing of DNA bases, for example, is irreducible to the bonding affinities between the bases. (Dembski ID p. 128)


The Explanatory Filter
William Dembski has devised what he calls the "explanatory filter" to determine whether design is present or not. The first question is whether the situation or object is contingent; if not, the situation is attributable to necessity. To say something is necessary is to say that it has to happen and that it can happen in one and only one way. Consider a biological structure which results from necessity: it would have to form as reliably as water freezes when its temperature is suitably lowered. (Dembski DR p. 140) The opposite of necessity is contingency. To say something is contingent is to say that it can happen in more than one way. Contingency presupposes a range of possibilities, such as the possible results of spinning a roulette wheel. To get a handle on those possibilities, scientists typically assign them probabilities. (Dembski DR p. 78) Either the contingency is a blind, purposeless contingency - which is chance (whether pure chance or chance constrained by necessity) - or it is a guided, purposeful contingency - which is intelligent causation. (Dembski NFL p. 155) Second, if something is determined to be contingent, the next question is whether it is complex; if it is not complex, the situation is attributable to chance. Third, if something is determined to be complex, is it specified? If it is not specified, then the situation is again attributable to chance. But if specificity is established, the situation is determined to be designed. According to the complexity-specification criterion, once the improbabilities become too vast and the specifications too tight, chance is eliminated and design is implicated. (Dembski ID p. 166)
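The filter can be summarized as a short decision procedure. The sketch below is my own paraphrase of the three questions just described; the predicates and the probability threshold are illustrative placeholders, not Dembski's formal definitions.

```python
def explanatory_filter(is_contingent, probability, is_specified,
                       chance_bound=1e-150):
    """Paraphrase of the Explanatory Filter as described above.

    is_contingent: could the event have happened in more than one way?
    probability:   estimated probability of the event arising by chance.
    is_specified:  does the event match an independently given pattern?
    chance_bound:  probability cutoff (Dembski's universal bound is 1 in 10^150).
    """
    if not is_contingent:
        return "necessity"      # happens in one and only one way, by law
    if probability >= chance_bound:
        return "chance"         # not improbable (complex) enough to rule out chance
    if not is_specified:
        return "chance"         # complex but matching no independent pattern
    return "design"             # contingent, complex, and specified

print(explanatory_filter(False, 1.0, False))    # necessity
print(explanatory_filter(True, 0.5, False))     # chance
print(explanatory_filter(True, 1e-200, True))   # design
```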

Whenever this criterion attributes design, it does so correctly. In every instance where the complexity-specification criterion attributes design and where the underlying causal story is known, it turns out design actually is present. It has the same logical status as concluding that all ravens are black given that all ravens observed to date have been found to be black. (Dembski ID p. 142)

William Dembski in his book The Design Revolution provides an example from the movie Contact that illustrates how intelligent design can be detected. After years of receiving apparently meaningless "random" signals, the Contact researchers discovered a pattern of beats and pauses that corresponds to the sequence of all the prime numbers between 2 and 101. That grabbed their attention, and they immediately detected intelligent design. When a sequence begins with two beats and then a pause, three beats and then a pause, and continues through each prime number all the way to 101 beats, researchers must infer the presence of an extraterrestrial intelligence.

Here's why. Nothing in the laws of physics requires radio signals to take one form or another, so the prime sequence is contingent rather than necessary. Also, the prime sequence is a long sequence and therefore complex. Finally, it was not just complex; it also exhibited an independently given pattern or specification. (It was not just any old sequence of numbers but a mathematically significant one - the prime numbers.) (Dembski DR p. 34-35)
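To make the specification concrete, the short script below (my own sketch, not taken from the movie or from Dembski) reconstructs the beat-and-pause pattern for the primes from 2 through 101; the pattern is independently given by mathematics, which is what makes the signal specified rather than merely improbable.

```python
def is_prime(n):
    """True if n is a prime number."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n**0.5) + 1))

primes = [n for n in range(2, 102) if is_prime(n)]
# Encode each prime as that many beats (dots) followed by a pause (bar).
signal = " | ".join("." * p for p in primes)

print(primes)       # [2, 3, 5, 7, ..., 101] -- 26 primes in all
print(signal[:40])  # the start of the beat-and-pause pattern
```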

A second application of the Explanatory Filter is seen in the workings of a safe's combination lock. The safe's lock is marked with a hundred numbers ranging from 00 to 99, and five turns in alternating directions are required to open the lock. We assume that one and only one sequence of five numbers opens the lock (e.g., 34-98-25-09-71). There are thus 10 billion possible combinations, of which precisely one opens the lock.
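The count of ten billion follows directly from the setup: 100 dial positions raised to the power of five turns. A one-line check, added here only as illustration:

```python
positions_per_turn = 100        # dial numbers 00 through 99
turns = 5                       # five turns in alternating directions
combinations = positions_per_turn ** turns

print(combinations)             # 10000000000, i.e. 10 billion
print(1 / combinations)         # 1e-10: the chance of opening it with one random try
```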

Feeding this situation into the Explanatory Filter, we note first that no regularity or law of nature requires the combination lock to turn to the combination that opens it; the opening of the bank's safe is therefore contingent. Second, random twirling of the combination lock's dial is exceedingly unlikely to open the lock. This makes the opening of the safe complex. Is the opening of the safe specified? If not, the opening of the safe could be attributed to chance. But since exactly one of the 10 billion possibilities opens the lock, the opening of the safe is also specified. This moves the problem into the area of design. Any sane bank worker would instantly recognize: somebody knew the combination and chose to open the lock using the prescribed numbers in proper rotation. (Dembski DR 87-89)

Notice the word "chose" in the preceding sentence. With natural selection there is the concept of choice: to "select" is to choose. In ascribing the power to choose to unintelligent natural forces, Darwin perpetrated the greatest intellectual swindle in the history of ideas. Nature has no power to choose. All natural selection does is narrow the variability of incidental change by weeding out the less fit. It acts on the spur of the moment, based solely on what the environment at the present time deems fit, and thus without any prevision of future possibilities. This blind process, when coupled with another blind process, namely incidental change, is supposed to produce designs that exceed the capacity of any designers in our experience. No wonder Daniel Dennett, in Darwin's Dangerous Idea, credits Darwin with "the single best idea anyone has ever had." Getting design without a designer is a good trick indeed. But with advances in technology as well as the information and life sciences, the Darwinian gig is now up. It's time to lay aside the tricks - the smokescreens and the hand-waving, the just-so-stories and the stonewalling, the bluster and the bluffing - and to explain scientifically what people have known all along, namely, why you can't get design without a designer. That's where intelligent design comes in. (Dembski DR p. 263)


Why the Criterion Works
What makes intelligent agents detectable? The principal characteristic of intelligent agency is choice; intelligence consists in choosing between alternatives. How do we recognize that an intelligent agent has made a choice? A random ink blot is unspecified; a message written with ink on paper is specified. The exact message recorded may not be known in advance, but the characteristics of written language will nonetheless specify it. This is how we detect intelligent agency.
A psychologist who observes a rat making no erroneous turns and exiting a maze in short order will be convinced that the rat has indeed learned how to exit the maze and that this was not dumb luck. The more complex the maze and the more specific the turns, the more evidence the psychologist has that the rat did not accomplish the feat by chance. This general scheme for recognizing intelligent agency is but a thinly disguised form of the complexity-specification criterion. In general, to recognize intelligent agency we must observe an actualization of one among several competing possibilities, note which possibilities were ruled out, and then be able to specify the possibility that was actualized. (Dembski ID p. 144-146)

Therefore there exists a reliable criterion for detecting design. This criterion detects design strictly from observational features of the world. Moreover, it belongs to probability and complexity theory, not to metaphysics and theology. And although it cannot achieve logical demonstration, it does achieve statistical justification so compelling as to demand assent. When this criterion is applied to biology, it detects design. In particular, it shows that Michael Behe's irreducibly complex biochemical systems are designed. (Dembski ID p. 150)

Information can be both complex and specified. Information that is both complex and specified will be called complex specified information, or CSI. The sixteen-digit number on your VISA card is an example of CSI. The complexity of this number ensures that a would-be thief cannot randomly pick a number and have it turn out to be a valid VISA number.

Algorithms (mathematical procedures for solving problems) and natural laws are in principle incapable of explaining the origin of information. They can explain the flow of information. Indeed, algorithms and natural laws are ideally suited for transmitting already existing information. What they cannot do, however, is originate information. Instead of explaining the origin of CSI, algorithms and natural laws shift the problem elsewhere - in fact, to a place where the origin of CSI will be at least as difficult to explain as before. (Dembski ID p. 159-161)

Take, for example, a computer algorithm that performs addition. The algorithm has a correctness proof, so it performs its additions correctly. Given the input data 2 + 2, can the algorithm output anything other than 4? Computer algorithms are wholly deterministic. They allow for no contingency (no other option) and thus cannot generate new information. Without contingency, laws cannot generate information, to say nothing of complex specified information. Time, chance, and natural processes have limitations. If not by means of laws, how then does contingency - and hence information - arise? Two possibilities present themselves: either the contingency is a blind, purposeless contingency, which is chance; or it is a guided, purposeful contingency, which is intelligent causation.
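In code, the point about determinism looks like this (a trivial sketch of my own, added only for illustration): a correct addition routine maps a given input to one and only one output, so it supplies no contingency of its own.

```python
def add(a, b):
    """A deterministic algorithm: each input has exactly one possible output."""
    return a + b

# However many times it runs, the input 2 + 2 can yield nothing but 4.
outputs = {add(2, 2) for _ in range(1000)}
print(outputs)   # {4} -- no contingency, hence (on Dembski's argument) no new information
```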


Can chance generate Complex Specified Information? (CSI)
Chance can generate complex unspecified information, and chance can generate noncomplex specified information. What chance cannot generate is information that is both complex and specified.

A typist randomly typing a long sequence of letters will generate complex unspecified information: the precise sequence of letters typed constitutes a highly improbable, unspecified event. Even though a meaningful word might appear here and there, random typing cannot produce an extended meaningful text, which would be information that is both complex and specified.

Why can't this happen by chance? Once the improbabilities become too vast and the specifications too tight, chance is eliminated and design is implicated. Just where the probabilistic cutoff lies can be debated, but that there is a probabilistic cutoff beyond which chance becomes an unacceptable explanation is clear. The universe will experience heat death before random typing at a keyboard produces a Shakespearean sonnet. (Dembski, ID p.165-166)
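A rough calculation of my own (assuming a 27-key keyboard of 26 letters plus a space bar) shows how quickly the improbabilities outrun any probabilistic cutoff: one specific line of about 100 characters already has a probability near 1 in 10^143, and a text the length of a sonnet is astronomically beyond the 1-in-10^150 bound discussed below.

```python
import math

keys = 27   # assumption: 26 letters plus the space bar

def log10_prob_of_typing(length):
    """log10 of the probability of randomly typing one specific text of this length."""
    return -length * math.log10(keys)

print(log10_prob_of_typing(100))   # about -143: one specific 100-character line
print(log10_prob_of_typing(600))   # about -859: a text roughly the length of a sonnet
```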

Any output of specified complexity requires a prior input of specified complexity. In the case of evolutionary algorithms, they can yield specified complexity only if they themselves are carefully front-loaded with the right information and thus carefully adapted to the problem at hand. Evolutionary algorithms therefore do not generate or create specified complexity, but merely harness already existing specified complexity. There is only one known generator of specified complexity, and that is intelligence. (Dembski, NFL p. 207)


The Probability Factor
The French mathematician Emile Borel proposed 1 in 10 to the 50th power as a universal probability bound below which chance could definitely be precluded. Borel's probability bound translates into 166 bits of information. William Dembski, in his book The Design Inference, describes a more stringent probability bound which takes into consideration the number of elementary particles in the observable universe, the duration of the observable universe until its heat death, and the Planck time. A probability bound of 1 in 10 to the 150th power results, which translates into roughly 500 bits of information. Dembski chooses this more stringent value. If we now define CSI as any specified information whose complexity exceeds 500 bits of information, it follows immediately that chance cannot generate CSI. (Dembski, ID p.166) Any specified event of probability less than 1 in 10 to the 150th power will remain improbable even after all conceivable probabilistic resources from the observable universe have been factored in. It thus becomes a universal probability bound. (Dembski, DR p. 85)
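The conversion from a probability bound to bits is a base-2 logarithm; the check below (illustrative arithmetic only) reproduces the 166-bit and roughly 500-bit figures quoted above.

```python
import math

def bits(exponent):
    """Information, in bits, corresponding to a probability of 1 in 10^exponent."""
    return exponent * math.log2(10)

print(round(bits(50)))    # 166 -- Borel's bound of 1 in 10^50
print(round(bits(150)))   # 498 -- Dembski's bound of 1 in 10^150, roughly 500 bits
```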

To suppose that the specific sequence of the nucleotides in the DNA molecule of the first organism came about by a purely random process in the early history of the earth is to ask too much of chance: CSI cries out for explanation, and pure chance won't do it. Richard Dawkins makes this point eloquently:

"We can accept a certain amount of luck in our explanations, but not too much…this ration has, as its upper limit, the number of eligible planets in the universe … We therefore have at our disposal, if we want to use it, odds of 1 in 100 billion-billion as an upper limit to spend on our theory of the origin of life. Suppose we want to suggest, for instance, that life began when both DNA and its protein-based replication machinery spontaneously chanced to come into existence. We can allow ourselves the luxury of such an extravagant theory, provided that the odds against this coincidence occurring on a planet do not exceed 100 billion-billion to one." (Dembski DR p. 167)


Unlimited Probabilistic Resources
Probabilistic resources address the number of opportunities an event has to occur. Unlimited probabilistic resources include not only probabilities that may be calculated and known in the present scientific context, but also resources that go beyond what is presently known. Evolutionists resort to this method when their backs are to the wall: they appeal to resources that are not within our purview at the present time and look to some future set of conditions that might help their position. It is important to deal with the here and now, and the reality of the present. If the present methods of applying probabilities to an occurrence, such as the origin of life, yield probabilities that are effectively zero, why proceed into the unknown unless out of sheer desperation?

William Dembski illustrates the following concept. What if the known universe is but one of many possible universes, each of which is as real as the known universe but causally inaccessible to it? If so, are not the probabilistic resources needed to eliminate chance vastly increased and is not the validity of 10 to the 150th power as a universal probability bound thrown into question? This line of reasoning has gained widespread currency among scientists and philosophers in recent years. Is it not illegitimate to rescue chance by invoking probabilistic resources from outside the known universe? Should there not be independent evidence to invoke a resource? (Dembski NFL p. 85)

Was Arthur Rubinstein a great pianist, or was it just that whenever he sat at the piano, he happened by chance to put his fingers on the right keys to produce beautiful music? It could happen by chance, and there is some possible world where everything is exactly as it is in this world except that the counterpart to Arthur Rubinstein cannot read music and happens to be incredibly lucky whenever he sits at the piano.

Perhaps Shakespeare was an imbecile who just by chance happened to string together a long sequence of apt phrases. Unlimited probabilistic resources ensure that we will never know. (Dembski NFL p. 93)

Are not the probabilities on our side that Rubinstein and Shakespeare were consummate pianists and writers? How can we know for sure that one is listening to Arthur Rubinstein the musical genius and not the lucky poseur? The mark of Rubinstein's musical skill (design) is that he was following a pre-specified concert program, and in this instance that he was playing a particular piece listed in the program note for note. His performance exhibited specified complexity. Specified complexity is how we eliminate bizarre possibilities in which chance is made to account for things that we would ordinarily attribute to design. (Dembski NFL p. 93-95)

There is an advantage for science in limiting probabilistic resources. Limiting probabilistic resources opens possibilities for knowledge and discovery that would otherwise be closed. Limits enable us to detect design where otherwise it would elude us. Limits also protect us from the unwarranted confidence in natural causes that unlimited probabilistic resources invariably seem to engender. (Dembski NFL p. 100)


The Law of Conservation of Information
If chance has no chance of producing complex specified information, what about natural causes? Natural causes are incapable of generating CSI. Dembski calls this result the law of conservation of information or LCI.

Dembski, who notes that Peter Medawar proposed a similar conservation law in his book The Limits of Science, offers several corollaries:
(1) The CSI in a closed system of natural causes remains constant or decreases.
(2) CSI cannot be generated spontaneously, originate endogenously or organize itself.
(3) The CSI in a closed system of natural causes either has been in the system eternally or was at some point added exogenously (implying that the system, though now closed, was not always closed).
(4) In particular any closed system of natural causes that is also of finite duration received whatever CSI it contains before it became a closed system.

To explain the origin of information in a closed system requires what is called a reductive explanation. Richard Dawkins, Daniel Dennett and many scientists and philosophers are convinced that proper scientific explanations must be reductive, moving from the complex to the simple. The law of conservation of information (LCI) cannot be explained reductively. To explain an instance of CSI requires at least as much CSI as we started with. A pencil-making machine is more complicated than the pencils it makes. (Dembski ID p. 170-171)

The most interesting application of the law of conservation of information is the reproduction of organisms. In reproduction one organism transmits its CSI to the next generation. Most evolutionists would argue that the Darwinian mechanism of mutation and selection introduces novel CSI into an organism, supplementing the CSI of the parents with CSI from the environment. However, there is a feature of CSI that counts decisively against generating CSI from the environment via mutation and selection: CSI is holistic. To say that CSI is holistic means that individual items of information cannot simply be added together to form a new item of complex specified information. CSI requires not only having the right collection of parts but also having the parts in proper relation. Adding random information to an already present body of information will distort or reduce the information already present. Even if two coherent bodies of information are combined, the result, unless specified in some way, will not be useful to the organism. A sentence with its words scrambled is nonsensical and contains no information. Likewise, two sentences that have no relationship to one another do not add to the information already present. The specification that identifies METHINKS IT IS LIKE A WEASEL and the specification that identifies IN THE BEGINNING GOD CREATED do not form a joint, juxtaposed line of information. CSI is not obtained by merely aggregating component parts, nor by arbitrarily stitching items of information together. (Dembski ID p.173-174)

The best thing that can happen to a book on a library shelf is that it remains as it was when originally published and thus preserves the CSI inherent in its text. Over time, however, what usually happens is that a book gets old, pages fall apart, and the information on the pages disintegrates. The Law of Conservation of Information is therefore more like a law of thermodynamics governing entropy than a conservation law governing energy, with the focus on degradation rather than conservation. The Law of Conservation of Information says that natural causes can at best preserve CSI, may degrade it, but cannot generate it. Natural causes are ideally suited to act as conduits for CSI. It is in this sense, then, that natural causes can be said to "produce CSI." But natural causes never produce things de novo or ex nihilo. When natural causes produce things, they do so by reworking other things. (Dembski NFL p. 162)

A classic example whereby information is degraded over time is seen in an experiment by Spiegelman in 1967. The experiment allowed a molecular replicating system to proceed in a test tube without any cellular organization around it.
The replicating molecules (the nucleic acid templates) require an energy source, building blocks (i.e., nucleotide bases), and an enzyme to help the polymerization process that is involved in self-copying of the templates. Then away it goes, making more copies of the specific nucleotide sequences that define the initial templates. But the interesting result was that these initial templates did not stay the same; they were not accurately copied. They got shorter and shorter until they reached the minimal size compatible with the sequence retaining self-copying properties. And as they got shorter, the copying process went faster. So this is what happened with natural selection in a test tube: the shorter templates that copied themselves faster became more numerous, while the larger ones were gradually eliminated. This looks like Darwinian evolution in a test tube. But the interesting result was that this evolution went one way: toward greater simplicity. Actual evolution tends to go toward greater complexity, species becoming more elaborate in their structure and behavior, though the process can also go in reverse, toward simplicity. But DNA on its own can go nowhere but toward greater simplicity. In order for the evolution of complexity to occur, DNA has to be within a cellular context; the whole system evolves as a reproducing unit. (Dembski NFL p. 209)
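The drift toward simplicity that Spiegelman observed can be mimicked with a toy simulation. The sketch below is only a caricature of the dynamic described above - the lengths, copy rates and deletion step are invented for illustration, and it is not Spiegelman's actual protocol - but it shows how templates that copy faster because they are shorter quickly come to dominate a resource-limited population.

import random

MIN_LENGTH = 50                 # assumed minimal length that still self-copies
population = [4500] * 100       # start with 100 templates of an assumed initial length

for generation in range(200):
    offspring = []
    for length in population:
        copies = 1 + 10000 // length            # shorter templates replicate faster
        for _ in range(copies):
            if random.random() < 0.1:           # occasional deletion during copying
                new_length = max(MIN_LENGTH, length - random.randint(1, 200))
            else:
                new_length = length
            offspring.append(new_length)
    population = random.sample(offspring, 100)  # limited resources: only 100 survive

print("mean template length after 200 generations:",
      round(sum(population) / len(population)))

Run repeatedly, the mean length collapses toward the assumed minimum, which is the one-way simplification the passage describes.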


Application to Evolutionary Biology
How does all this apply to evolutionary biology? Complex specified information (CSI) is abundant in the universe. Natural causes are able to shift it around and possibly express it in biological systems. What we wish to know, however, is how the CSI was first introduced into the organisms we see around us. In reference to the origin of life we want to know the informational pathway that takes the CSI inherent in a lifeless universe and translates it into the first organism. There are only so many options. CSI in an organism consists of CSI acquired at birth together with whatever CSI is acquired during the course of its life. CSI acquired at birth derives from inheritance with modification (mutation). Modification occurs by chance. CSI acquired after birth involves selection along with infusion or the direct introduction of novel information from outside the organism. Therefore inheritance with modification, selection and infusion - these three account for the CSI inherent in biological systems.

Modification includes - to name but a few - point mutations, base deletions, genetic crossover, transpositions, and recombination generally. Given the law of conservation of information, it follows that inheritance with modification by itself is incapable of explaining the increase in CSI that organisms have exhibited in the course of natural history. Inheritance with modification therefore needs to be supplemented. The candidate for this supplementation is selection, which is said to introduce new information into a population. Nonetheless this view places undue restrictions on the flow of biological information, restrictions that biological systems routinely violate.

For an example we can use Michael Behe's bacterial flagellum. How does a bacterium without a flagellum evolve a flagellum by the processes so far discussed? We have already outlined the complexity of the flagellum. How does selection account for it? Selection cannot accumulate proteins, holding them in reserve until, with the passing of many generations, they are finally available to form a complete flagellum. Neither the environment nor the bacterial cell contains a prescribed plan or blueprint of the flagellum. Selection can only build on partial function, gradually, generation after generation. But a flagellum without its full complement of protein parts doesn't function at all. Consequently, if selection and inheritance with modification are going to produce the flagellum, they have to do it in one generation. The CSI of a flagellum far exceeds 500 bits. Selection confers no advantage on a bacterium carrying a useless partial assembly, and a 500-bit novelty arising in a single generation is far beyond any chance of occurring.

There remains only one source for the CSI in biological systems - infusion. Infusion becomes problematic once we start tracing backwards the informational pathways of infused information. Plasmid exchange is well known in bacteria, and it allows bacterial cells to acquire antibiotic resistance. Plasmids are small circular pieces of DNA that can be passed from one bacterial cell to another. Problems begin when we ask where the bacterium that released the plasmid in turn derived it. There is a regress here, and this regress always terminates in something non-organismal. If the plasmid is cumulatively complex, then the general evolutionary methods might apply. However, if the plasmid is irreducibly complex, whence could it have arisen? Because organisms have a finite trajectory back in time, biotic infusion must ultimately give way to abiotic infusion, and endogenous (intracellular) information must ultimately derive from exogenous (extracellular) information.

Two final questions arise. (1) How is abiotically infused CSI transmitted to an organism? And, (2) where does this information reside prior to being transmitted? The obvious alternative is and must be a theological one. The information in biological systems can be traced back to the direct intervention of God. (Dembski ID p.175-182)

As Michael Behe's irreducibly complex biochemical systems readily yield to design, so too does the fine-tuning of the universe. The complexity-specification criterion demonstrates that design pervades cosmology and biology. Moreover it is transcendent design, not reducible to the physical world. Indeed no intelligent agent who is strictly physical could have presided over the origin of the universe or the origin of life.

Just as physicists reject perpetual motion machines because of what they know about the inherent constraints on energy and matter, so too design theorists reject any naturalistic reduction of specified complexity because of what they know about the inherent constraints on natural causes. (Dembski ID p. 223)

Evolutionary biologists assert that design theorists have failed to take into account indirect Darwinian pathways by which the bacterial flagellum might have evolved through a series of intermediate systems that changed function and structure over time in ways that we do not yet understand. There is no convincing evidence for such pathways. Can the debate end with evolutionary biologists chiding design theorists for not working hard enough to discover those (unknown) indirect Darwinian pathways that lead to the emergence of irreducibly and minimally complex biological structures like the bacterial flagellum? Science must form its conclusions on the basis of available evidence, not on the possibility of future evidence. (Dembski ID p. 112-113)


The Darwinian Extrapolation
According to the Darwinian theory, organisms possess unlimited plasticity to diversify across all boundaries; moreover, natural selection is said to have the capability of exploiting that plasticity and thereby delivering the spectacular diversity of living forms that we see.

Such a theory, however, necessarily commits an extrapolation. And as with all extrapolations, there is always the worry that what we are confident about in a limited domain may not hold more generally outside that domain. In the early days of Newtonian mechanics, physicists thought Newton's laws gave a total account of the constitution and dynamics of the universe. Maxwell, Einstein, and Heisenberg each showed that the proper domain of Newtonian mechanics was far more constricted. It is therefore fair to ask whether the Darwinian mechanism may not face similar limitations. With many extrapolations there is enough of a relationship between inputs and outputs that the extrapolation is experimentally accessible. It then becomes possible to confirm or disconfirm the extrapolation. This is not true for the Darwinian extrapolation. There are too many historical contingencies and too many missing data to form an accurate picture of precisely what happened. It is not possible presently to determine how the Darwinian mechanism actually transformed, say, a reptile into a mammal over the course of natural history. (Dembski NFL p. 38)


Testability
Let's now ask: Is intelligent design refutable? Is Darwinism refutable? Yes to the first question, no to the second. Intelligent design could in principle be readily refuted. Specified complexity in general, and irreducible complexity in biology, are, within the theory of intelligent design, key markers of an intelligent agency. If it could be shown that biological systems that are wonderfully complex, elegant and integrated - such as the bacterial flagellum - could have been formed by a gradual Darwinian process, then intelligent design would be refuted on the general grounds that one does not invoke intelligent causes when undirected natural causes will do.

By contrast, Darwinism seems effectively irrefutable. The problem is that Darwinists raise the standard for refutability too high. It is certainly possible to show that no Darwinian pathway could reasonably be expected to lead to an irreducibly complex biological structure. But Darwinists want something stronger, namely, to show that no conceivable Darwinian pathway could have led to that structure. Such a demonstration requires an exhaustive search of all conceptual possibilities and is effectively impossible to carry out. (Dembski ID p. 282) What an odd set of circumstances: the methodology with the more convincing and overwhelming evidence is ignored, while the methodology with little or no evidence is in vogue and irrefutable.

Let us turn to another aspect of testability - explanatory power. Underlying explanatory power is a view of explanation known as inference to the best explanation, in which a "best explanation" always presupposes at least two competing explanations. Obviously, a "best explanation" is one that comes out on top in a competition with other explanations. Design theorists have an edge in explanatory power over natural selection. Darwinists, of course, see the matter differently.

What is the problem with adding a design-theoretic tool chest to the Darwinian tool chest? Just as some tools sit in a tool chest never to be used, design could simply sit there, possibly even proving superfluous. What is the fear of having a broad tool chest? (Dembski ID p. 288-289)

Is there any hope for the evolutionist in exploring, given an unlimited amount of time, indirect Darwinian pathways which have yet to be discovered? For the sake of clarification, an indirect Darwinian pathway is one by which a complex specified biological structure might be explained naturalistically, even though no such pathway has yet presented itself to science as a measurable entity.


William Dembski provides us with an illustration.
Johnny is certain that there are leprechauns hiding in his room. Imagine this child were so ardent and convincing that he set all of Scotland Yard onto the task of searching meticulously, tirelessly, decade after decade, for these supposed leprechauns - for any solid evidence at all of their prior habitation of the bedroom. Driven by gold fever for the leprechauns' treasure, postulating new ways of catching a glimpse of a leprechaun - a hair, a fingerprint, any clue at all - the search continues. After many decades, what should one say to the aging parents of the now aging boy? Would it be logical to shake your finger at the parents and tell them, "Absence of evidence is not evidence of absence. Step aside and let the experts get back to work"? That would be absurd. And yet that, essentially, is what evolutionary biologists are telling us concerning the utterly fruitless search for credible indirect Darwinian pathways to account for irreducible complexity. (Dembski ID p. 295-296)


Crossing the bridge - Meeting the Designer
What if the designing intelligence responsible for biological complexity cannot be confined to physical objects? Why should this burst the bounds of science? In answering this criticism, let us first of all be clear that intelligent design does not require miracles (as does scientific creationism) in the sense of violations of natural law. Just as humans do not perform miracles every time they act as intelligent agents, so too there is no reason to assume that for a designer to act as an intelligent agent requires a violation of natural laws. How much more effective could science be if it includes intelligent causes? Intelligent causes can work with natural causes and help them to accomplish things that undirected natural causes cannot. Undirected natural causes can explain how ink gets applied to paper to form a random inkblot but cannot explain an arrangement of ink on paper that spells a meaningful message. Whether an intelligent cause is located within or outside nature is a separate question from whether an intelligent cause has acted within nature. Design has no prior commitment against naturalism or for supernaturalism unless one opens that door. Consequently science can offer no principled grounds for excluding design or relegating it to the sphere of religion automatically.

Decisions on this issue should be based upon which process has the greater explanatory power, undirected natural causes or intelligent causes. Does the designer need to be defined? Cannot the designing agent be a regulative principle - a conceptually useful device for making sense of certain facts of biology - without assigning the designer any weight in reality? The status of the designer can then be taken up by philosophy and theology. The fact that the designing intelligence responsible for life can't be put under the microscope poses no obstacle to science. We learn of this intelligence as we learn of any other intelligence - not by studying it directly but through its effects. (Dembski DR p. 189-191)

All of us have identified the effects of embodied designers. Our fellow human beings constitute our best example of such designers. A designer's embodiment is of no evidential significance for determining whether something was designed in the first place. We don't get into the mind of designers and thereby attribute design. Rather, we look at effects in the physical world that exhibit clear marks of intelligence and from those marks infer a designing intelligence. (Dembski DR p. 192)

There is no principled way to argue that the work of embodied designers is detectable whereas the work of un-embodied designers isn't. Even if an un-embodied intelligence is responsible for the design displayed in some phenomenon, a science committed to the Naturalized Explanatory Filter (a filter which excludes God and thus design) will never discover it. A science that on a priori grounds refuses to consider the possibility of un-embodied designers artificially limits what it can discover. (Dembski DR p. 194-195) What happens when a God is implicated in design? The Explanatory Filter doesn't consider design but becomes naturalized and takes the process back to square one with a decision between contingency and necessity. (Dembski NFL p. 351)


The Burden Of Proof
Dembski often lectures on university campuses about intelligent design. Often, he says, a biologist in the audience will get up during the question-and-answer time to inform him that just because he doesn't know how complex biological systems might have formed by the Darwinian mechanism doesn't mean it didn't happen that way. He will then point out that the problem isn't that he personally doesn't know how such systems might have formed, but that the biologist who raised the objection doesn't know how such systems might have formed - and that despite having a fabulous education in biology, a well-funded research laboratory, decades to put it all to use, security and prestige in the form of a tenured academic appointment, and the full backing of the biological community, which has also been desperately but unsuccessfully trying to discover how such systems are formed for more than one hundred years, the biologist still doesn't know. (Dembski DR p. 214)

Many scientists have expressed their lack of knowledge of how any biochemical or cellular system could have evolved. Here are a few:

James Shapiro, a molecular biologist at the University of Chicago, conceded in National Review (September 16, 1996) that there are no detailed Darwinian accounts for the evolution of any fundamental biochemical or cellular system, only a variety of wishful speculations. David Ray Griffin is a philosopher of religion with an interest in biological origins. He writes in his book Religion and Scientific Naturalism: "There are, I am assured, evolutionists who have described how the transitions in question could have occurred. When I ask in which books I can find these discussions, however, I either get no answer or else some titles that, upon examination, do not in fact contain the promised accounts. That such accounts exist seems to be something that is widely known, but I have yet to encounter someone who knows where they exist." It is up to the Darwinists to fill in the details. (Dembski DR p. 214-215)

Let us look at the evolution of the eye as an example where we find a lack of information between evolutionary jumps. Darwinists, for instance, explain the human eye as having evolved from a light-sensitive spot that successively became more complicated as increasing visual acuity conferred increased reproductive capacity on an organism. In such a just-so story, all the historical and biological details in the eye's construction are lost. How did a spot become innervated and thereby light-sensitive? How did a lens form within a pinhole camera? What changes in embryological development are required to go from a light-sensitive sheet to a light-sensitive cup? None of these questions receives an answer in purely Darwinian terms. Darwinian just-so stories have no more scientific content than Rudyard Kipling's original just-so stories about how the elephant got its trunk or the giraffe its neck. (Dembski NFL p. 368)

Are not the Darwinists applying blind faith to their theory? Listen to the remark by Harvard biologist Richard Lewontin in The New York Review of Books:

We take the side of science in spite of the patent absurdity of some of its constructs, in spite of its failure to fulfill many of its extravagant promises of health and life, in spite of the tolerance of the scientific community for unsubstantiated just-so stories, because we have a prior commitment, a commitment to materialism (i.e., naturalism). It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counterintuitive, no matter how mystifying to the uninitiated. (Dembski NFL p. 370-371)

This raises another question: What is the responsibility of teachers in their classrooms? Teachers who are persuaded of intelligent design and yet are directed by the system to teach evolution should teach Darwinian evolution and the evidence that supports it. At the same time, however, they should candidly report problems with the theory, notably that its mechanism of transformation cannot account for the specified complexity we observe in biology.


Dembski - Major Points
1. Materialists search for contrivance through the activity of natural processes rather than contrivance as the mental process of an intelligence.
2. The option of a God's interaction with the world could be a legitimate domain for scientific investigation.
3. Intelligently caused events can be distinguished from unintelligently caused events.
4. Detection involves discovering information.
5. Excluding intelligent design from science stifles scientific inquiry.
6. The Explanatory Filter identifies information and thus a designer.
7. Choice is an important feature of intelligence.
8. Natural selection has no power to choose. It has no eye on the past or the future; it is blind.
9. Information can be both complex and specified.
10. Complex specified information cannot arise by chance.
11. There exists a "probability bound" that chance cannot overcome.
12. Existing information cannot increase on its own, but can remain stable for a period of time or be lost, as understood by the Law of Conservation of Information.
13. The origin of original information is a mystery to modern science.
14. The existence of a molecular machine such as the bacterial flagellum is beyond explanation by natural processes.
15. Darwin uses an extrapolation to make his case, but an extrapolation that has limited or no data to confirm it.
16. Darwinism, unlike Intelligent Design, is not subject to refutation.
17. Decisions about origins should be based upon which proposal has the best explanatory power.

The Limitations of Scientific Truth - Why Science Can't Answer Life's Ultimate Questions - Nigel Brush

"Deductive" and "Inductive" reasoning cannot lead to absolute truth

The most important contribution that Francis Bacon made to modern experimental science was his insistence that the "inductive method" - that is, arguing from specific instances to general/universal statements - was the only proper way to conduct scientific research. Before Bacon, Aristotle had dominated scientific thinking for nearly two thousand years. Aristotle held that "deduction" (arguing from general principles to specific instances) was the only proper method of logic.

- A deduction starts with a hypothesis without necessarily making any specific observations.
- Hypothesis: Dogs are smarter than cats.
- Confirming the hypothesis

1. My dog can retrieve a ball; my cat can't.
2. My dog can pull a wagon; my cat can't.
3. My dog can protect me; my cat can't.
4. Dogs chase cats; cats don't chase dogs.

- Negating the hypothesis
Although there may be as many (or more) instances that negate the hypothesis as confirm it, because the researcher has started out with a hypothesis that is often based on personal opinion or bias (rather than observation), it is usually far easier to confirm the hypothesis than to negate it.

Because of this asymmetry between confirmation and negation, the deductive method often tells us far more about how the scientist wants to see the world than about how the world actually is. Despite this flaw, scientists and philosophers over the next two millennia continued to follow Aristotle's deductive methodology.

During the 1600's, Bacon championed the "experimental approach" to knowledge. Although Bacon recognized some applications for the deductive approach, he argued in Novum Organum that the only proper method of reasoning for a scientist was by "induction". Bacon believed that if a scientist really wanted to understand nature, he or she should begin by observing nature. From these observations hypotheses could be formulated in an attempt to explain what had been observed. At a murder scene, investigators look for evidence that will lead to a hypothesis as to who might have committed the murder, rather than beginning with assumptions (hypotheses) and then looking for evidence that proves the preconceived opinion.

Today, inductive reasoning forms the very foundation of experimental science.
Deduction is still used in testing hypotheses, but induction is by far the more important of the two reasoning techniques. (Brush LST p. 52-56)

This philosophy became known as empiricism and was further promoted by John Locke and David Hume. Hume was interested in those things that could be apprehended only through the senses, things that could be experienced. Could you see it, smell it, hear it, touch it, taste it? Could you weigh it, measure it, dissect it? If so, Hume and other empiricists were interested. Hume, however, saw religion (knowledge through divine revelation) and metaphysics (the study of ontology or epistemology) as being essentially meaningless because they did not have these properties. Because of his love for empiricism, Hume decided to accept no knowledge that could not be proven through experience. This decision had disastrous consequences, for at the heart of the inductive method lies an assumption that cannot itself be proven by experience: that an association of events implies causation.

According to Hume, only when one has examined every rock in the universe can one justifiably make the universal statement that "all brightly colored rocks contain iron oxide." Because it is obviously impossible to make an infinite number of observations in one or many human lifetimes, inductive statements can never be absolutely proven. Hume's empiricism eventually led him to the realization that scientific truth cannot be equated with absolute truth. Because all scientific hypotheses are derived from induction, they can never be proven on the basis of empirical observations.
Hume's Problem remained unanswered, and many of the brightest minds of the next three centuries continued to wrestle with the inevitable question that arose from it: "If all scientific theories are equally unprovable, what distinguishes scientific knowledge from ignorance, science from pseudo-science?" (Brush LST p. 56-60)


Mathematics cannot lead to absolute truth
One might perhaps believe that anything (such as a scientific theorem) that could be proven mathematically would be the purest form of truth, due in part to the accuracy of mathematics itself. However, there are weak links in mathematical methodology and processes. These weak links were first discovered by the Austrian mathematician Kurt Godel (1906-1978). If mathematics was to be the final arbiter of scientific truth, Godel and other scientists wanted to prove that mathematical systems are themselves "complete" - that is, every true statement of number theory can be derived from within the system itself - and "consistent" - that is, mathematical statements contain no contradictions. (Brush LST p. 68)

Godel made the startling discovery that formal mathematical systems are incomplete - in that mathematics is not able to prove all possible truths - and that they cannot even prove their own consistency. In 1931, Godel published his findings in a seminal paper on the consistency and completeness of mathematics titled "On Formally Undecidable Propositions of Principia Mathematica and Related Systems I." What Godel's First Incompleteness Theorem did was to show that all such mathematical systems are incomplete because they are unable to encompass every possible truth. In other words, some things exist that we can recognize to be true but cannot prove within the mathematical system itself.

Godel was also able to show that no mathematical system of this kind can establish its own consistency. By substituting the idea of "proof" for "truth," Godel was able to introduce into mathematics the famous Epimenides Paradox. Epimenides was a sixth-century B.C. poet from Crete who made the paradoxical statement, "All Cretans are liars." The Epimenides Paradox forever trapped philosophers in a strange loop because they could never determine whether Epimenides' statement was true or false. If his statement is true, then Epimenides, a Cretan, is a liar, and so the statement must be false; but if it is false, then Epimenides is telling the truth, and so the statement must be true. Godel took the core out of the Epimenides Paradox and recast it as a statement of number theory: "This statement of number theory does not have any proof." By doing so, Godel was able to show that a mathematical system cannot certify its own consistency. (This limitation applies to theories and systems as wholes, not to mathematical givens such as 2 + 2 = 4.) Therefore, how can mathematics be used to validate the empirical observations of scientists if it cannot be used even to validate its own consistency? In plain language, it cannot. (Brush LST p. 70)
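A common schematic way of writing the self-referential sentence Godel constructed (supplied here as an illustration; Brush states it only in words) is

$$ G \;\leftrightarrow\; \neg\,\mathrm{Prov}_{F}(\ulcorner G \urcorner) $$

where F is the formal system and Prov_F expresses provability within F. If F is consistent, it can prove neither G nor its negation (the first theorem), and it likewise cannot prove the statement Con(F) expressing its own consistency (the second theorem).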

Based, then, on the work of both Hume and Godel, the conclusion is inescapable that absolute truth cannot be confined within the bounds of logical (inductive) or mathematical (probabilistic) systems. At best, all that can be done with induction or mathematics is to apprehend a part of the larger truth that is out there; the systems being used are simply not robust enough to capture the entirety of this truth. (Brush LST p. 71)


The Principle of Falsification - Karl Popper (1902-1994)
By 1934, Popper had concluded that the mathematical probability of all scientific theories was zero. In his work The Logic of Scientific Discovery, Popper stated, "My own view is that the various difficulties of inductive logic here sketched are insurmountable. So also, I fear, are those inherent in the doctrine, so widely current today, that inductive inference, although not 'strictly valid', can attain some degree of 'reliability' or of 'probability'." (Brush LST p. 73)

Popper's second major breakthrough was his recognition of the "asymmetry between verifiability and falsifiability." For example, based on a casual observation of swans, one might easily formulate the hypothesis, "All swans are white." The only way, of course, to verify this statement would be to examine every swan in the universe to be absolutely certain that all swans are, indeed, white. Popper, however, pointed out that an infinite number of observations would not be necessary to prove that this statement is false. A single observation of a black swan would be sufficient to falsify the statement, "All swans are white." Popper showed, therefore, that while it is forever beyond our ability to prove absolutely (verify) a universal statement, it is well within our means to disprove (falsify) such a statement. All truly scientific statements must be written such that they can be falsified - not verified. As Popper stated, "the criterion of the scientific status of a theory is its falsifiability, or refutability, or testability." (Brush LST p. 74-75)
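Popper's asymmetry can be put in standard logical notation (added here as an illustration; it is not Brush's formulation). The universal hypothesis

$$ \forall x\,(\mathrm{Swan}(x) \rightarrow \mathrm{White}(x)) $$

is never proved by any finite list of white swans, but a single counterexample, $\mathrm{Swan}(a) \wedge \neg\mathrm{White}(a)$, refutes it outright.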

In a perfect world, scientists might be willing to open up their work to criticism by pointing out the weak parts of their theories; under ideal conditions, scientists might willingly abandon pet theories as soon as they found them to be false. But in the real world things are quite different. Is Popper's falsifiability criterion the solution to the problem of demarcating science from pseudo-science? No. For Popper's criterion ignores the remarkable tenacity of scientific theories. Scientists have thick skins. They do not abandon a theory merely because facts contradict it. They normally invent some rescue hypothesis to explain what they then call a mere anomaly or, if they cannot explain the anomaly, they ignore it and direct their attention to other problems.

Popper's principle of falsification fails, then, to set science apart from pseudo-science. Scientists, naturally having a vested interest in the outcome of their work, are far more prone to justify than to falsify their theories. If neither induction, empiricism, verification, mathematical probability, nor falsification can be used to separate scientific truth from religious or metaphysical truth, what can? (Brush LST p. 79-81)

Paul Feyerabend (1924-1994), a philosopher of science, came to the long-overdue conclusion that, in reality, there is no difference between scientific, religious, and metaphysical truth. Truth is truth no matter where you find it; it is the one immutable object in the universe. As Albert Einstein concluded, "All religions, arts, and sciences are branches of the same tree." Science itself, concluded Feyerabend, is a religion. Moreover, in the search for truth, because no preferred or superior methodology exists, the human mind should simply make use of every pathway that it finds available. (Brush LST p. 82-83, 85)


Science as storytelling
Stephen Jay Gould notes in an essay titled "Literary Bias on the Slippery Slope,"
So much of science proceeds by telling stories - and we are especially vulnerable to constraints of this medium because we so rarely recognize what we are doing. We think that we are reading nature by applying rules of logic and laws of matter to our observations. But we are often telling stories - in the good sense, but stories nonetheless.

The story of human evolution has great literary appeal because we've been telling stories to our children for generations. The usual basic plots are found in folktales around the world. The appearance of common story motifs and plots in scientific accounts of human evolution should warn us that we are not being given "just the facts." The "facts" in evolutionary reconstructions have been selected and standardized from a much larger body of data and have been organized in such a way that they tell a logical, pleasing story. Discrepancies or missing data are often ignored in the interest of telling a story that is complete and that flows smoothly from one point to the next. (Brush LST p. 107-108)


Collapse of the Clockwork Universe
Newton's seventeenth-century picture of the world gave time a transcendental status. Time just passed, inexorably and uniformly, entirely unaffected by the events and contents of the universe. Einstein's picture of time was radically different. The geometry of space and the rate of flow of time were both determined by the material contents of the universe. (Brush LST p. 146) Time is affected by velocity and gravity. Time runs at different rates in different places.
Martin Gardner, in his book Relativity for the Million, discusses the problem of absolute simultaneity:
It is important to understand that this is not just a question of being unable to learn the truth of the matter. There is no actual truth of the matter. There is no absolute time throughout the universe by which absolute simultaneity can be measured. Absolute simultaneity of distant events is a meaningless concept. (Brush LST p. 146)
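The velocity effect alluded to above is usually stated with the special-relativistic time-dilation formula (supplied here for reference; it is not quoted by Brush):

$$ t' = \frac{t}{\sqrt{1 - v^{2}/c^{2}}} $$

A clock moving at velocity v relative to an observer is measured to run slow by the factor $\sqrt{1 - v^{2}/c^{2}}$; gravity produces an analogous effect in general relativity.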


Quantum Mechanics and Quantum Weirdness
At the beginning of the twentieth century, when scientists set out to delineate the atomic structure of matter, they assumed that they would find an orderly microcosm within the atom that would operate by the same principles and laws that governed the stars and the planets. Once these rules had been identified and quantified, science would have fulfilled Descartes's and Laplace's dreams of mapping the movement and interaction of the invisible particles out of which the universe was built. Science would then be at the very doorstep of absolute truth, ready to advance to that final and complete explanation for the workings of this vast, clocklike universe. Quantum mechanics, however, transformed this dream into a nightmare, and by the end of the twentieth century, most scientists had given up hope of ever fully understanding the nature of reality. (Brush LST p. 147)

Einstein's theory of relativity undercut the placid assumption that the world is as we see it. He showed that even such fundamental concepts as length, mass, and time are not absolute - they change under the influence of acceleration or gravity. Because of the relativity of time, there can be no "one moment of time" for the entire universe. Relativity thus imposes severe spatial limitations on what scientists can do or know. Planck's quantum mechanics gave a very different picture of reality, a reality best described as "quantum weirdness," where uncertainty, rather than certainty, is the ruling principle. When an electron absorbs a quantum of energy, it doesn't simply move across the space from one orbit to another orbit. Instead, the electron exists either in one shell or another but is never in transition between two shells. The logic of moving from point A to point B across the intervening space is defied. The basic rules of cause and effect that govern events in the macro-world apparently do not apply in the micro-world. Instead, things move in a disjointed or discontinuous manner. They "jump" from one place to another, seemingly without effort and without bothering to go between the two places. (Brush LST p. 152-155)


Causes of Quantum Uncertainty - The Micro-universe
Electrons behave in two ways: sometimes as particles and sometimes as waves. When attempting to determine the exact position of an electron, the focus is on the particle aspect of the electron; when attempting to determine its momentum, the focus is on its wave aspect. Consequently, the exact position of the electron can be determined only when it is treated as a particle; the exact momentum of an electron can be determined only when it is treated as a wave. Quantum theory takes away the old certainty: scientists cannot hope to discover the "real" world in infinite detail, not because there is any limit to their intellectual ingenuity or technical expertise, nor even because there are laws of physics preventing the attainment of perfect knowledge. The basis of quantum theory is more revolutionary yet: it asserts that perfect objective knowledge of the world cannot be had because there is no objective world. (Brush LST p. 159, 161)
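The position-momentum trade-off described above is conventionally expressed as Heisenberg's uncertainty relation (added here; Brush states it in words):

$$ \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2} $$

The more precisely an electron's position (uncertainty $\Delta x$) is pinned down, the less precisely its momentum (uncertainty $\Delta p$) can be known, and vice versa.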


The Macro-universe
The special-relativity rule that nothing can be accelerated to a velocity greater than that of light does not apply to galaxies in an expanding universe. That rule is true in static space, but expanding cosmic space can carry galaxies away from one another at velocities greater than that of light. (Brush LST p. 170)

The Hubble radius suggests that there might be a limit to what astronomers can learn about the physical universe. Hubble found that the farther away a galaxy is from the earth, the faster it is receding due to the expansion of space. Some galaxies are moving away from us at speeds representing a significant percentage of the speed of light. If the Hubble constant remains valid, we will reach a point in our observations at which galaxies are so distant that they are receding at the speed of light or even faster. This point will denote the edge of the observable universe. Beyond the Hubble radius, myriad other galaxies may well exist, but astronomers will never be able to see them because they are receding from earth faster than their light is traveling toward earth. If so, scientific knowledge of the universe will forever be limited to the "visible" portion of the universe within the Hubble radius. Thus, in the macro-universe - as in the micro-universe - there are severe spatial limitations to how much scientists can learn about nature. (Brush LST p. 169-171)
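The limit described above follows from Hubble's law, $v = H_0 d$. Taking an assumed present-day value of $H_0 \approx 70$ km/s/Mpc (a rough figure added here for illustration, not taken from Brush), the distance at which the recession velocity formally reaches the speed of light is

$$ R_H = \frac{c}{H_0} \approx \frac{3\times10^{5}\ \mathrm{km/s}}{70\ \mathrm{km/s/Mpc}} \approx 4300\ \mathrm{Mpc} \approx 1.4\times10^{10}\ \text{light-years} $$

which marks the edge of what Brush calls the "visible" portion of the universe.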

Modern physics therefore is dominated by two theories that arose in the early part of the twentieth century: quantum mechanics and relativity. One theory explains the micro-world of atoms and subatomic particles; the other theory explains the macro-world of stars and galaxies. Obvious questions arise. Why should physicists need two different theories to explain one universe? Why should stars and galaxies be governed by different laws than electrons and atoms? If physicists try to imagine the universe collapsing back into its initial state at the beginning of the big bang, they run into problems. In a collapsing universe, general relativity predicts that gravity will eventually compress all of the matter in the universe into a singularity. But as the universe shrinks past a certain point, we enter the realm governed by quantum mechanics, not relativity. In order to discuss the beginning of the universe (without a God), we need a theory that combines general relativity with quantum mechanics. Attempts to merge the two theories and the forces involved have been described as trying to mix fire and water. (Brush LST p. 181-182)


Empirical limitation
Although scientists can strive for objectivity in their analysis and interpretation of empirical observations, they are never entirely free from the subjective influence of their backgrounds, experiences, educations, beliefs, hopes, fears, theories, and biases.
There is an old joke about a man who believed that he was dead. To dissuade him from this belief, the man's doctor got him to agree that dead men don't bleed. The doctor then pricked the man's finger with a pin, producing blood, whereupon the man exclaimed, "Doctor, we were both wrong; dead men do bleed!" This is the problem we face when human beings search for truth apart from the absolute truth described in Scripture.


Problems with Seeing
Constraints arise from the structure of the human mind. These constraints include temporal, logical, cultural, and physical limitations on what we can know about the universe or how clearly we can understand those aspects of the universe that are knowable. Because scientific truth becomes temporal and relative rather than permanent and absolute, it can easily be molded to accommodate the philosophical orientation of its user. In his famous book A Brief History of Time: From the Big Bang to Black Holes (1988), Hawking attempted to prove that the universe has neither spatial nor temporal boundaries. If this is true, then the universe has no beginning or end. Consequently, Hawking argued that his model of quantum cosmology had eliminated the need for a God. On the other hand, scientist Banesh Hoffmann found in the workings of quantum mechanics strong support for his faith. Hoffmann stood in awe of the fact that the entire physical universe, including our bodies and minds, is constructed of subatomic particles that constantly flicker back and forth between matter and energy and can never be apprehended entirely by empirical means. Here we see two physicists, both of whom are using the facts of quantum mechanics, arriving at totally different conclusions about ultimate reality. According to Hawking, quantum mechanics can "eliminate" the need for a Creator; according to Hoffmann, quantum mechanics can "illuminate" the work of the Creator. Because science cannot provide us with absolute truth, our interpretation of the limited facts we do have available is very much subject to our preexisting desires, beliefs, and attitudes. Our senses become filters that allow certain types of information to pass into our minds but selectively screen out other types of information. If a person is given to unbelief, as soon as one barrier to faith is dismantled, he or she will erect a new one. If a person is given to belief, models and paradigms of the universe may change, but the handiwork of God is ever present and apparent. In this sense, we are all self-made individuals; we believe only what we want to believe. (Brush LST p. 203-207)


Problems with Hearing
Eleanor Arroway, the brilliant female astronomer in Carl Sagan's science fiction novel Contact, made the following statement:

…if God wanted to send us a message, and ancient writings were the only way he could think of doing it, he could have done a better job. And he hardly had to confine himself to writings. Why isn't there a monster crucifix orbiting the Earth? Why isn't the surface of the Moon covered with the Ten Commandments? Why should God be so clear in the Bible and so obscure in the world?

In the last chapter of Sagan's Contact, titled "The Artist's Signature," Arroway discovers that God has indeed left a clear message embedded in the fabric of the universe. Using a supercomputer to analyze pi, she discovers that, written in base 11 arithmetic (which can be expressed entirely in zeros and ones), "Hiding in the alternating patterns of digits, deep inside the transcendental number, was a perfect circle, its form traced out by unities in a field of zeroes." Here was clear evidence that the very geometry and mathematics that scientists use to study the universe are themselves formed by the hand of God.

If we choose to hear the message: "For since the creation of the world God's invisible qualities - his eternal power and divine nature - have been clearly seen, being understood from what has been made, so that men are without excuse" (Rom. 1:20).
Nonetheless, as Jesus said, "Though seeing, they do not see; though hearing, they do not hear or understand" (Matt. 13:13). The problem lies not with the existence or clarity of the message, but with one's willingness to hear and understand that message. (Brush LST p. 208-211)


Problems with Interpreting
The idea that the universe, the galaxy, the solar system, the earth, life upon the earth, and the human mind all arose by random chance and therefore have no real meaning staggers the human imagination - at least some human imaginations. Many ideas are initially appealing to the human mind simply because they are so foreign to common sense. Nevertheless, many scientists have prided themselves on believing the unbelievable and condemning the rest of society for not placidly following their example. As the White Queen boasted to Alice in Lewis Carroll's Through the Looking Glass, "Why, sometimes I've believed as many as six impossible things before breakfast."

In his book The Creator and the Cosmos (1993), Hugh Ross identifies no less than twenty-six physical parameters that must fall within extremely limited ranges in order for life to exist anywhere within the universe. He identifies another thirty-three parameters that must be precisely set for life to be possible on the earth. Science has found repeatedly that the statistical probabilities for life arising by chance in the universe are ridiculously low. Many scientists have looked at the evidence and have not missed the implications inherent in the fine-tuning of the universe, while others have dismissed it as "illusory." (Brush LST p. 211-213)

It is interesting, however, that when the evidence for intelligent design began to emerge from their studies of the cell, few biologists were ready to shout their discovery from the housetops. Indeed, as Behe points out, no celebration accompanied this major scientific discovery:

But no bottles have been uncorked, no hands slapped. Instead, a curious, embarrassed silence surrounds the stark complexity of the cell. When the subject comes up in public, feet start to shuffle, and breathing gets a bit labored. In private people are a bit more relaxed; many explicitly admit the obvious but then stare at the ground, shake their heads, and let it go at that. (Brush LST p. 224)


Abandonment of Absolutes
In "Confessions of a Former Cultural Relativist" (1990), Henry H. Bagish bemoans the fact that college students are losing the ability to make value judgments: "I find students generally very reluctant to judge anyone's behavior, to evaluate it in any way. Most of them resist saying that anyone else's ideas or behavior are wrong, or bad." (Brush LST p. 240-241)

There is a sense of cultural relativism sweeping the country. If science, religion, and philosophy are the primary avenues by which humans have sought truth down through the ages, what happens to these disciplines if cultural relativism continues to spread and humanity completely abandons its belief in absolute truth?

Christ expects us to follow that narrow path by keeping our eyes firmly fixed on Him, who is Truth: "Enter through the narrow gate. For wide is the gate and broad is the road that leads to destruction, and many enter through it. But small is the gate and narrow the road that leads to life, and only a few find it" (Matt. 7:13-14).

Science is but one manifestation of humanity's quest for absolute truth - not the ultimate acquisition of absolute truth. Because scientific truth is constantly changing, it cannot be absolute truth. Because modern science is not absolute truth, it must contain a mixture of truths and non-truths. Whether science can someday overcome these limitations and arrive at absolute truth is certainly open to debate. What cannot be debated is the current incomplete (non-absolute) state of current scientific understanding. (Brush LST p. 248, 254)

Scientific truth is not superior truth. The popular model of science states that scientific truth is the superior form of truth. This belief has been widely disseminated among the general public. Teachers are encouraged to teach science in their classrooms, but religion is forbidden. Science textbooks, as well as secular movies, often portray religion as the great persecutor of science and free thought. But is scientific truth really superior to biblical truth? Repeatedly in the Old Testament, when the people of Israel or Judah placed their trust in false gods, they suffered for their idolatry. Perhaps the modern world's deification of scientific truth will have similar dire consequences.

We have seen in this study that each element in the scientific process has been found to have significant problems. If the truths of science are transitory and incomplete, if the study area of science is restricted by spatial boundaries, if the methodology of science is logically flawed, and if the techniques of science are subject to personal and cultural biases - exactly what is it that makes scientific truth superior to biblical truth? The answer is, of course, nothing! (Brush LST p. 265-267)

Christians are subject to similar limitations. We are finite beings trying to understand an infinite God, but we are not in a position to dictate the terms of that relationship. We cannot know God unless He reveals Himself to us. We therefore have our own limitations in seeking absolute truth. It is unfortunate that with human understanding and interpretation of Scripture come all of the perils of human ignorance, willful blindness, and cultural bias that we have already documented regarding scientific understanding and interpretation of facts in the physical world. Thus, theological truth, like scientific truth, also has its limitations. However, the combination of biblical and scientific truth is more powerful than science's attempt to obtain truth by itself. (Brush LST p. 271, 274)


Brush - Major Points
1. The deductive method of science, due to the asymmetry between confirmation and negation, often tells us far more about how the scientist wants to see the world than about how the world actually is.
2. Empiricism or induction limits itself to what can be measured by the senses.
3. Religion and metaphysics cannot be measured and are eliminated in the process of determining truth.
4. Inductive statements cannot lead to absolute truth because they would require an infinite number of observations.
5. Mathematics cannot lead to absolute truth due to weak links in methodology and processes, which show mathematical systems to be incomplete and unable to prove their own consistency.
6. Scientists do not always abandon theories when the facts contradict them.
7. Scientists tend to justify theories rather than falsify them.
8. Scientists resort to storytelling; facts are selected and standardized from a much larger body of data.
9. Time is a relative term and has no simultaneity throughout the universe.
10. Due to quantum mechanics, most scientists have given up hope of ever fully understanding the nature of reality.
11. Concepts of length, mass, and time are not absolute.
12. The rules that govern events in the macro-world do not apply in the micro-world.
13. Because distant regions of the universe may be receding faster than the speed of light, there are spatial limitations on how much scientists can learn about nature.
14. There exist two theories to explain one universe; one that attempts to explain the macro-universe and one that attempts to explain the micro-universe.
15. There are constraints which limit scientists' ability to see, including those which are temporal, logical, cultural and physical.
16. There are constraints which limit a scientist's ability to hear: Romans 1:20, Matthew 13:13.
17. Scientists are constrained in their ability to interpret. When faced with new "facts", they do not re-interpret to account for the new facts.
18. There is a movement away from searching for absolutes towards cultural relativism.
19. Because scientific truth is constantly changing, it cannot be absolute truth.
20. Scientific truth is not superior truth.


Appendix
Michael J. Behe is Professor of Biological Sciences at Lehigh University in Pennsylvania. He received his Ph.D. in Biochemistry from the University of Pennsylvania in 1978. Behe's current research involves delineation of design and natural selection in protein structures. In addition to publishing over 35 articles in refereed biochemical journals, he has also written editorial features in Boston Review, American Spectator, and the New York Times. His book, Darwin's Black Box discusses the implications for neo-Darwinism of what he calls "irreducibly complex" biochemical systems. The book was internationally reviewed in over one hundred publications and recently named by National Review and World magazine as one of the 100 most important books of the 20th century.

Behe has presented and debated his work at major universities throughout North America and England.

Lee M. Spetner received the PhD degree in physics from MIT in 1950. He was with the Applied Physics Laboratory of the Johns Hopkins University from 1951 to 1970, where he was engaged in research and development in signal processing and the scattering of electromagnetic waves from the earth's surface. From 1958 he was a member of the principal professional staff of the laboratory. He spent the academic year 1962-63 on a fellowship in the Department of Biophysics at the Johns Hopkins University. During that time he became interested in evolution and published several papers investigating information buildup in evolution. He taught graduate courses in physics at Howard University for about five years, and for about 10 years he taught information and communication theory at the Johns Hopkins University and at the Weizmann Institute. In 1970 he moved to Israel where he took a position as technical director of Eljim Ltd., a new subsidiary of KMS Inc. of the US, engaged in research and development in military electronics. In 1984 he retired and continued his hobby of studying organic evolution, which began back in 1964.

Prof. Dr-Ing. Werner Gitt was born in Raineck/East Prussia in 1937. In 1963 he enrolled at the Technical University of Hanover and in 1968 he completed his studies as Diplom Ingenieur. Thereafter he worked as an assistant at the Institute of Control Engineering at the Technical University of Aachen. Following two years of research work, he received his doctorate summa cum laude together with the prestigious Borchers Medal, from the Technical University of Aachen, Germany, in 1970. He is now Director and Professor at the German Federal Institute of Physics and Technology. He has written numerous scientific papers in the field of information science, numerical mathematics, and control engineering, as well as several popular books written in several languages. Since 1984 he has been a regular guest lecturer at the State Independent Theological University of Basle, Switzerland, on the subject of 'The Bible and Science'. He has held lectures on related topics at numerous universities at home and abroad, as well as having spoken on the topic 'Faith and Science' in a number of different countries.

William Dembski holds a Ph.D. in mathematics from the University of Chicago and a Ph.D. in philosophy from the University of Illinois at Chicago. He also has earned degrees in statistics, theology and psychology. Currently he is associate research professor in the conceptual foundations of science at Baylor University and a senior fellow of the Discovery Institute's Center for the Renewal of Science and Culture. He has done postdoctoral work at the University of Chicago, Massachusetts Institute of Technology, Princeton University and Northwestern University. He has written numerous scholarly articles and is the author of the critically acclaimed The Design Inference (Cambridge), Intelligent Design (InterVarsity Press) and No Free Lunch: Why Specified Complexity Cannot Be Purchased without Intelligence (Rowman and Littlefield).

Nigel Brush - Ph.D., UCLA - is an assistant professor of geology at Ashland University in Ohio. A committed Christian and scientist, he has conducted archaeological, geological, and environmental fieldwork in England, Canada, New York, Ohio, and California.


Bibliography
Behe, Michael J. Darwin's Black Box: The Biochemical Challenge to Evolution. Touchstone, Simon & Schuster, 1996.
Brush, Nigel. The Limitations of Scientific Truth: Why Science Can't Answer Life's Ultimate Questions. Kregel Publications. 2005.
Dembski, William A. No Free Lunch: Why Specified Complexity Cannot Be Purchased without Intelligence. Rowman & Littlefield Publishers, Inc., 2002.
Dembski, William A. Intelligent Design: The Bridge between Science and Theology. Downers Grove, Ill.: InterVarsity Press, 1999.
Dembski, William A. The Design Revolution: Answering the Toughest Questions about Intelligent Design. Downers Grove, Ill.: InterVarsity Press, 2004.
Gitt, Werner. In The Beginning was Information. CLV - Christliche Literatur- Verbreitung e. V. 2001.
Spetner, Lee M. Not By Chance: Shattering the Modern Theory of Evolution. The Judaica Press, Inc. 1998.