Reasons to Believe

Connections 2008, Vol. 10, No. 1

Do Infinite Universes Explain the Fine-Tuning?
Jeff Zweerink, Ph.D.

A monkey randomly hitting keys on a keyboard will eventually produce the entire collection of Shakespeare's works–at least if the monkey types for an infinite amount of time. The truth of the previous statement relies on (at least) two conditions. First, the monkey must actually use all the keys in a random fashion. Second, but more important, the keyboard must contain all the necessary letters and punctuation to produce Shakespeare's works.

An analogous situation arises when scientists try to connect the numerous solutions of string theory (see sidebar) to a proper description of our universe. It is an uncontroversial statement that our universe appears designed and fine-tuned to support life. The controversy begins when scientists put forth explanations for the design and fine-tuning. RTB's creation model asserts that a supernatural Designer created and fashioned this universe for the explicit purpose of supporting human life. By contrast, some naturalistic scientists argue for a model in which an infinitude of universes (sometimes called a multiverse) exists. In this model, the apparent design simply reflects a huge selection effect. In other words, of this infinite number of universes, most do not support life. However, our existence mandates that we reside in, and therefore observe, a life-friendly universe.

What might produce this multitude of universes? Two unsolved issues lend insight into this question. The first issue relates to scientists' attempts to unify gravity with the other fundamental forces (the electromagnetic, strong, and weak nuclear forces) using string theory. In part because string theory requires six extra dimensions beyond the familiar three spatial dimensions and one time dimension, an incredibly large number of possible solutions exist in string theory. Some string models contain an infinite number of solutions. Each of these solutions describes a potential universe that would operate with different laws of physics, different fundamental constants, and even different dimensionality. In order to match our universe, any workable solution must demonstrate how these extra dimensions remain small and undetectable.

The second issue involves the mechanism that caused the faster-than-light expansion–called inflation–in the earliest moments after the creation event. Scientists' current best theoretical understanding of inflation predicts that a multitude of other universes actually exist! However, in order to explain the physical properties of our universe such as the geometry, dark energy characteristics, initial density fluctuations, and gravitational waves, the mechanism causing inflation must abide by a set of restrictive rules.

Presently, the theories describing the early universe do not specify the inflation mechanism, so scientists must insert some kind of mechanism into the models. However, many hope that a better understanding of string theory will specify the proper inflation mechanism. Thus, a significant amount of effort has been directed toward finding string theory solutions that match the physical properties of the universe.

Because of the enormous number of string theory solutions, scientists typically restrict themselves to a particular subset of solutions where the proper calculations are possible. One group of cosmologists performed such a study to find string theory solutions that produced inflation.1 Even the restricted subset contained an infinite number of solutions. Detailed numerical investigations of these solutions demonstrated that, even though infinite in number, none of them produced an inflationary epoch like that observed in our universe!

This discovery highlights two important apologetic implications. First, even if shown to be true, the multiverse may not provide an adequate explanation for the design and fine-tuning observed throughout our universe. Nothing guarantees that inflation/string theory mechanisms can produce our universe through strictly natural processes. It may be like a monkey using a keyboard with no vowels. No matter how long the monkey types, it will never produce Shakespeare's works.
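
The no-vowels keyboard makes the point mathematically precise: the probability of typing a target text is exactly zero, not merely small, when required keys are missing. A minimal sketch, using a short hypothetical target phrase standing in for Shakespeare's works:

```python
TARGET = "to be or not to be"
FULL_KEYBOARD = "abcdefghijklmnopqrstuvwxyz "
NO_VOWELS = "bcdfghjklmnpqrstvwxyz "  # vowel keys removed

def hit_probability(keyboard):
    """Chance that one random string of the target's length equals the target."""
    if any(ch not in keyboard for ch in TARGET):
        return 0.0  # a required key is absent: no number of trials can succeed
    return (1 / len(keyboard)) ** len(TARGET)

p_full = hit_probability(FULL_KEYBOARD)   # ~1.7e-26: tiny, but infinite time suffices
p_no_vowels = hit_probability(NO_VOWELS)  # exactly 0.0: 'o' and 'e' cannot appear
```

With a full alphabet the per-trial probability is positive, so infinitely many trials almost surely succeed; with missing keys it is zero, and more trials cannot help. That is the multiverse analogue of a mechanism that simply cannot produce our universe.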

Second, and more important, assuming that a multiverse scenario is correct and that the multiverse does sample all the possible universes, such a scheme brings major philosophical issues into the science arena. For example, scientists use the multiverse to explain the incredible improbability of the universe's ability to support life compared to far more abundant sterile possible universes. However, that same argument means that inhabitants of Matrix-like simulations also abound. (Recall that in the Matrix movie series sentient machines created a simulated reality.) In fact, inhabitants of these simulations would far outnumber human beings on an Earth-like planet.

Sir Martin Rees explains the consequences in this way:

All the multiverse ideas lead to a remarkable synthesis between cosmology and physics...But they also lead to the extraordinary consequence that we may not be the deepest reality, we may be a simulation. The possibility that we are creations of some supreme or super-being, blurs the boundary between physics and idealist philosophy, between the natural and the supernatural, and between the relation of mind and multiverse and the possibility that we're in the matrix rather than the physics itself.2

Given the unusual difficulties associated with multiverse solutions, perhaps the best explanation for the universe's fine-tuning is also the simplest one: the universe looks designed because a Designer fashioned it, just as the great classics of literature were written not by a tireless monkey but by a guy named Bill.

What Is String Theory?
Two remarkable theories warrant the description of "most extensively tested and verified." Quantum mechanics describes how energy and matter behave on subatomic scales. According to quantum mechanics, any measurable quantity (such as charge, mass, or energy) comes in discrete amounts. On the other hand, general relativity describes how mass and energy interact with one another and with space-time. According to Einstein, gravitational attraction arises from energy and matter warping space-time, which consequently influences how other objects move through space-time. However, general relativity requires a continuous space-time.

While these two theories have passed every experimental test scientists have thrown at them, a fundamental problem exists. The enormous mass and energy density of the early universe necessitate using general relativity to describe the dynamics. The small sizes and immense temperatures of the early universe require a quantum mechanical description. However, the discrete nature of quantum mechanics fundamentally conflicts with the continuous nature of general relativity. Scientists strongly believe that a unified theory incorporating both quantum mechanics and general relativity (which properly describes the dynamics of the universe) exists.

These unified theories generically require the existence of additional spatial dimensions beyond the three large dimensions we commonly experience. String theory represents the most popular of these unified theories. The basic concept behind the theory is that tiny one-dimensional strings make up all fundamental particles and interactions. This, in essence, imposes a fundamental smallest size on any physical thing, which, in turn, allows both general relativity and quantum mechanics to coexist.

According to scientists' best estimates, more than 10^500 different solutions to the equations of string theory exist. The challenge is to find the solution(s) that describe this universe. However, this research demonstrates the difficulty of finding such a solution, which highlights the fine-tuned nature of our universe.

1. Mark P. Hertzberg et al., "Searching for Inflation in Simple String Theory Models: An Astrophysical Perspective," Physical Review D, 76 (November 13, 2007): 103521.
2. Paul Davies, Cosmic Jackpot: Why Our Universe Is Just Right for Life (Boston: Houghton Mifflin Company, 2007), 179-90.


Pairing Up on the Dance Floor
Fazale "Fuz" R. Rana, Ph.D.

Imagine jostling and bumping your way around a crowded dance floor. The lights are low and the music is loud. You are trying to connect with a blind date. And all you have to go on is the person's name. It might take you most of the evening to find your companion.

Proteins inside the cell face a similar problem as the couple trying to meet for the first time on a congested dance floor. The cellular interior is jam-packed with a large number and variety of proteins. In this setting, proteins have a difficult time making their way to their biochemical partners.

New research, however, indicates that the structures of proteins are carefully fine-tuned so that these biomolecules can successfully find and interact with their biochemical mates.1

Many proteins must interact with other proteins to carry out their prescribed function in the cell. These protein-protein interactions (PPIs) must, of necessity, be highly specific. If the wrong proteins bind to each other, their interaction is of no use to the cell.

PPIs are complicated by the crowded cellular environment. Low concentrations of suitable mates further compound the problem. In the cellular milieu, proteins are much more likely to encounter promiscuous interaction partners than the ones necessary for them to execute their function.

Biochemists have long wondered how highly specific PPIs, so critical to the cell's operation, take place in a cellular interior teeming with myriad proteins.

Scientists from Harvard University and the Massachusetts Institute of Technology recently determined that protein structure is carefully optimized to suppress promiscuous interactions, allowing proteins to readily find their biochemical companions. Protein surfaces appear to be designed so that randomly associated proteins bind much more weakly than appropriate partners do when they encounter each other.

It's as if the proteins that are trying to meet on the cell's crowded dance floor are sporting an unusually colored boutonniere or some other distinguishing feature that makes them easy to pick out in a crowd. All the while, unsuitable partners get a quick brush-off.
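
The "boutonniere" can be stated thermodynamically: specific partners bind with substantially lower free energy than random encounters, so the correct pairing dominates even amid many decoys. The toy two-state calculation below uses hypothetical energy values (not numbers from the cited study) to illustrate the principle:

```python
import math

kT = 0.593  # kcal/mol at roughly 298 K

def bound_fraction(dG_specific, dG_decoy, n_decoys):
    """Fraction of binding events involving the correct partner,
    from simple Boltzmann weights exp(-dG / kT)."""
    w_specific = math.exp(-dG_specific / kT)
    w_decoys = n_decoys * math.exp(-dG_decoy / kT)
    return w_specific / (w_specific + w_decoys)

# Hypothetical energies: one strong specific interaction versus 10,000 weak decoys.
f = bound_fraction(dG_specific=-12.0, dG_decoy=-5.0, n_decoys=10_000)
print(round(f, 2))  # prints 0.93
```

Even with 10,000 competing decoys, a roughly 7 kcal/mol energy gap keeps over 90 percent of binding events specific; shrink the gap and promiscuous binding swamps the correct interaction.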

The last half century of research into the structure-function relationships of biochemical systems has consistently demonstrated that the function of biomolecules critically depends on an exacting location and spatial orientation of their chemical constituents. The finely tuned PPIs constitute the latest example of this precise arrangement uncovered by biochemists.

Precision and fine-tuning do not arise by happenstance. Rather, they come about only as a result of careful planning and a commitment to executing designs with the best craftsmanship possible. Fine-tuning is a hallmark of intelligent design. This feature dominates the best human designs and is often synonymous with exceptional quality. Similarly, the fine-tuning and precision of biochemical systems as exemplified by PPIs points to the work of a Divine Designer.

1. Eric J. Deeds et al., "Robust Protein-Protein Interactions in Crowded Cellular Environments," Proceedings of the National Academy of Sciences, USA 104 (September 18, 2007): 14952-57.

Designed to Live, Designed to Die
Hugh Ross, Ph.D.

"Why don't we see new species emerging now?" Charles Darwin faced this question nearly 150 years ago when he proposed the theory of evolution. His answer–all Earth's habitats are full.

Whether or not his reply made sense then, it certainly doesn't hold up today. During the human era thousands of extinctions have emptied multiple environmental niches, and these habitats remain unfilled by new genera.

Darwin was right about one thing, however. He recognized the reality of both extinction events and speciation events in the fossil record. Recent research has amplified the reasons for, as well as the benefits of, the extinction events.

Astronomers recognize that conditions in the solar system have changed dramatically since life began. For example, the sun's brightness initially decreased and then began a steady, ongoing increase, and Earth's rotation rate has slowed. Such changes impact life.

When certain species can no longer help compensate for such changes,1 those species must be removed to make way for others that can. But if this removal were random, total ecological collapse could occur. At the just-right moment, the outdated species–usually a whole cluster of species that serve together in a particular ecological role–must be taken away and quickly replaced.

These targeted extinctions require events that selectively, and with precise timing, remove the outdated species and their ecological support system. New data shows that asteroid and comet collisions ideally provide for such extinctions.2 Earth happens to be optimally located within the solar system's layout to receive the just-right number and kind of extinction-causing impacts. If Earth were much nearer to Mars' orbit, asteroid and comet collisions would be too frequent and too extreme. If Earth were much nearer to Venus' orbit, collisions would occur too seldom and be too weak to bring about the necessary extinctions.

Prior to these findings, astronomers defined "habitable" zones around stars by two criteria: 1) the distance range where surface liquid water can be present on a planet, and 2) the distance range in which the just-right quantity and kind of ultraviolet radiation can reach the planet's surface. Now a third criterion must be added: a just-right death zone, the distance range in which the just-right frequency and intensity of asteroid and comet collisions can occur.

For life to be sustained long-term, a planet must simultaneously reside in all three zones–a very restrictive region indeed.
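
Requiring simultaneous residence in all three zones amounts to intersecting three distance ranges. The ranges below are purely illustrative placeholders (the article gives no numerical boundaries), chosen only to show how each added criterion narrows the habitable region:

```python
def intersect(*zones):
    """Overlap of distance ranges (inner, outer), e.g. in AU; None if empty."""
    inner = max(z[0] for z in zones)
    outer = min(z[1] for z in zones)
    return (inner, outer) if inner < outer else None

# Hypothetical zone boundaries, for illustration only:
water_zone = (0.95, 1.40)   # surface liquid water possible
uv_zone = (0.90, 1.20)      # just-right ultraviolet flux
death_zone = (0.98, 1.10)   # just-right impact frequency and intensity

habitable = intersect(water_zone, uv_zone, death_zone)
print(habitable)  # prints (0.98, 1.1): narrower than any single zone
```

Each criterion trims the overlap further, so the region satisfying all three is smaller than any one zone alone, and with mismatched ranges it can vanish entirely.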

This analysis supports RTB's message that the more scientists learn, the more evidence accumulates for the supernatural, super-intelligent design of the cosmos and Earth for humanity's benefit. The psalmist anticipated such work by 3,000 years when he worshipped God in these words:

These [creatures] all look to you to give them their food at the proper time . . . when you take away their breath, they die and return to the dust. When you send your Spirit, they are created, and you renew the face of the earth.3


1. For exactly how this compensation works, see my book, Creation as Science (Colorado Springs: NavPress, 2006), 125-47.
2. G. Kochemasov, "On the Uniqueness of Earth as a Harbor of Steady Life: A Comparative Planetology Approach," Astrobiology 7 (June, 2007): 518.
3. Psalm 104:27-30, The Holy Bible.

God-of-the-Gaps or Best Explanation?
Kenneth Richard Samples

A common skeptical objection to Christian apologetics is that theists engage in a god-of-the-gaps form of reasoning. This charge means that when it comes to various theistic arguments, the believer typically attributes gaps in (especially) scientific knowledge to God. For example, when science can't explain how the universe came into being or how life originated on Earth, the Christian apologist is quick to point to God as a so-called cause or explanation. Thus the skeptic's accusation is that Christians do nothing more than give their ignorance a name–"God."

The naturalist (a person who believes that the physical cosmos is the ultimate reality) assumes that, given enough time, scientific exploration will discover a purely naturalistic explanation for what is now scientifically inexplicable. Oxford scientist Richard Dawkins responded in just this way to Michael Behe's argument from irreducible complexity (the idea that the complexity of some life-forms cannot be accounted for through gradual evolutionary steps). Dawkins and other naturalists think that attributing scientific mysteries to God is illegitimate and stifles scientific discovery.

Yet it is important to notice how entrenched naturalists are in their mindset and worldview. When it comes to science, only physical and material explanations are allowable (called methodological naturalism)–the supernatural is ruled out a priori (without examination). Also, some naturalists express excessive confidence that future science will explain reality. But they don't live in the future, and it is illegitimate to appeal to the expected explanations of the future to explain present reality (what is needed is evidence in the present). This faulty form of reasoning constitutes the argumentum ad futuris fallacy ("accept this because future evidence will support it"). Ironically, it might even be called "naturalism-of-the-gaps" reasoning.

Yet while modern science has been quite successful in explaining many particular aspects of the physical universe, some observers of the scientific enterprise think that it may have reached its limits when explaining the truly big questions of existence.1 Those limits may not allow answers to such profound questions as the origin of the universe, the origin of life, and the origin of consciousness. If this admittedly hotly contested perspective is even close to being true, then the grand natural explanations of science may have been exhausted. If so, naturalism as a worldview has not been able to adequately explain reality (especially its grand features). Given this pessimistic scenario (from a naturalist's perspective), it seems that appealing to the supernatural to explain reality may have some legitimacy after all.

Regardless of the course taken by naturalists, however, it should be noted that most sophisticated Christian theists don't engage in a god-of-the-gaps form of reasoning. Rather, Christian scholars appeal to God as an inference to the best explanation. In logic, this approach is known as abductive reasoning. Like inductive arguments, the abductive form of thinking yields only probable truth. Unlike induction, however, abductive arguments don't attempt to predict future possibilities. This careful thought process moves from the data, facts, evidence, and phenomena of the world to draw the most consistent and plausible explanation for these realities. This abductive form of logical reasoning is very similar to the way detectives, lawyers, historians, and scientists reason. For example, scientists sometimes postulate ideas that are unobservable in order to explain the data that is observed (consider dark matter). This approach posits the biblical God as the best explanation for all realities found in the world and in life.

One of Christian theism's greatest worldview strengths is the scope of its explanatory power. The historic Christian viewpoint accounts for the array of realities in nature and in human experience, including:2

  1. The universe–its source and singular beginning, order, regularity, and fine-tuning

  2. Abstract entities–the existence and validity of mathematics; the laws of logic; and scientific models (which include their correspondence to the time-space-matter universe as conceived in the mind of human beings)

  3. Ethics–the existence of universal, objective, and prescriptive moral values

  4. Human beings–their existence, consciousness, rationality, free agency, enigmatic nature, moral and aesthetic impulse, and their need for meaning and purpose in life

  5. Religious phenomena–humankind's spiritual nature and religious experience; the miraculous events of Christianity; and the unique character, claims, and credentials of Jesus Christ

These realities correspond to what the Bible teaches about God's creating the universe and, in particular, human beings in his own image (imago Dei). However, the Christian worldview overall does not naively assume divine activity or intervention as an explanation for whatever humans cannot yet explain, but rather offers a genuine and valid explanatory theory for the nature of life's realities. For many Christian thinkers, inference to the best explanation is a powerful and cogent approach to attempting to explain reality.

Skeptical philosophies of life such as naturalism have great difficulty explaining universal realities. On this basis, Christian theism's explanatory scope appears far superior to that of naturalism.

1. See John Horgan, The End of Science: Facing the Limits of Knowledge in the Twilight of the Scientific Age (New York: Broadway, 1996).
2. This list has been excerpted from Kenneth Richard Samples, A World of Difference: Putting Christian Truth-Claims to the Worldview Test (Grand Rapids: Baker, 2007), 270-71.

"Lucky" Pictures of Sky May Reveal Design

David H. Rogstad, Ph.D.

Scientists don't take kindly to the notion that luck plays a large role in achieving good results. It is well-thought-out experiments and plain hard work, not unpredictable luck, that produce the best repeatable science. But even scientists might admit that with a new technique for taking high-resolution pictures of the sky, luck is the whole ballgame. Dubbed Lucky Imaging,1 this method could revolutionize the capability of ground-based telescopes.

An earlier Connections article discussed the value of having telescopes operating in space, free from atmospheric distortions.2 But with the slow demise of the Hubble Space Telescope, astronomers will no longer be able to take pictures at visual wavelengths from space. The follow-on James Webb Space Telescope, scheduled to be launched in 2013, will work best at longer wavelengths. In anticipation, telescope builders have been working on a technique called adaptive optics to overcome this atmospheric problem for ground-based telescopes, but success has been limited. Enter Lucky Imaging!

The idea behind Lucky Imaging is not new. David Fried proposed the technique in 19783 and Robert Tubbs fully implemented and described it in 2003.4 A telescope in space is able to form a sharply defined image of an object where its resolution is limited only by the diameter of the telescope and the wavelength of the light. However, if the telescope is on the ground, density fluctuations in the atmosphere cause distortions in the image so that it becomes fuzzy.

Since these fluctuations typically occur on a timescale of about a hundredth of a second, it is possible, with a faster shutter speed, to snap a picture and essentially freeze the image. After capturing a series of such images, by luck, some of them will have much smaller distortions than others. Selecting those "lucky" images and averaging them together yields a final result with much higher resolution than would have been obtained using all the images.
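
The select-and-average step can be sketched numerically. The toy simulation below (a hypothetical setup, not the astronomers' actual pipeline) blurs a single "star" by a random amount per frame to mimic turbulence, keeps the sharpest 10 percent of exposures, and averages them:

```python
import numpy as np

rng = np.random.default_rng(0)

def blur(image, sigma):
    """Gaussian blur via Fourier multiplication (mimics atmospheric smearing)."""
    f = np.fft.fftfreq(image.shape[0])
    fx, fy = np.meshgrid(f, f)
    transfer = np.exp(-2 * (np.pi * sigma) ** 2 * (fx**2 + fy**2))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * transfer))

n = 64
true_image = np.zeros((n, n))
true_image[n // 2, n // 2] = 1.0  # a single star

# Snap 200 fast exposures, each smeared by a random amount of "turbulence".
frames = [blur(true_image, sigma=rng.uniform(0.5, 4.0)) for _ in range(200)]

# Score each frame's sharpness by its peak brightness; keep the best 10%.
scores = [frame.max() for frame in frames]
lucky = np.argsort(scores)[-20:]
lucky_average = np.mean([frames[i] for i in lucky], axis=0)
naive_average = np.mean(frames, axis=0)

# The lucky average retains a much sharper (higher) point-source peak.
print(lucky_average.max() > naive_average.max())  # prints True
```

Scoring sharpness by peak brightness is one common criterion; the averaged lucky frames retain a far sharper point-source image than an average over all the frames.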

The results of Lucky Imaging are stunning. A team of astronomers from Caltech and the University of Cambridge used the technique combined with adaptive optics on the 200-inch Palomar Telescope to obtain images of the Cat's Eye Nebula (NGC 6543). They achieved greater detail than even the Hubble Space Telescope.5

Astronomers have been able to use this technique only recently because improvements in the performance of CCD (charge-coupled device) detectors have achieved near perfection in efficiency and quality. These detectors are also available in off-the-shelf cameras so that amateur astronomers can do Lucky Imaging with equally stunning results.6

RTB scholars celebrate and eagerly await the results of the new technology. With any luck, more-detailed images of our own galaxy as well as external galaxies will display increasing evidence for supernatural design.

1. For more information, see the Lucky Imaging Web site.
2. David H. Rogstad, "New Telescope Promises Greater Evidence for Design," Connections Quarter 2, 2007, 8.
3. D. L. Fried, "Probability of Getting a Lucky Short-Exposure Image through Turbulence," Journal of the Optical Society of America 68, no. 12 (1978): 1651–58.
4. Robert Tubbs, "Lucky Exposures: Diffraction Limited Astronomical Imaging through the Atmosphere" (PhD diss., Cambridge University, 2003).
5. Caltech, "Caltech Astronomers Obtain Sharpest-Ever Pictures of the Heavens," press release, September 4, 2007.

The Cat's Eye Nebula (NGC6543) as imaged conventionally by the Palomar 200-inch telescope.

The Cat's Eye Nebula as imaged using The Lucky Camera together with an adaptive optics system on the Palomar 200-inch telescope.