Reasons to Believe

Connections 2004, Vol. 6, No. 2



New Date for First Aussies
Hugh Ross, Ph.D.

Australia, do we have a problem? It may seem so, or at least it did a few years ago. The RTB creation model places the creation of humanity at roughly 50,000 years ago, with the spread of peoples and civilization outward from Mesopotamia some time after that, but probably no earlier than 30,000 years ago.1 Research findings published in 1996 claimed that aborigines inhabited Australia as early as 50,000-75,000 years ago.2 RTB’s scenario appeared to contradict the data. Time to go back to the drawing board?

The answer lies in a place called the “Jinmium rock shelter” in northern Australia. Archeologists have been working there for many years. Circular engravings in the rock walls drew them to investigate the site. Initially, they applied thermoluminescence dating methods to determine the age of those engravings.

This technique simply measures the time elapsed since quartz minerals in the sand associated with the engraving process last received exposure to sunlight. The assumption is that all mineral samples were well exposed to sunlight before they were buried by winds and debris of time.

The strengths and weaknesses of thermoluminescence dating can be illustrated by familiar “glow-in-the-dark” plastics. When my sons were very young, I stuck glow-in-the-dark stars on their bedroom ceiling to teach them how to find the North Star. When I turned out the lights at bedtime, the stars would glow above their heads. A few hours later, when I’d check to see if their eyes were finally closed and their covers somewhere nearby, the stars had vanished from sight (unless, of course, the lights had been on in the interim).

In principle, I could have come into their room at any time during those few hours, measured the intensity of light emitted by the stars, and thus determined how much time had passed since the lights that irradiated them went out. Certain variables would have complicated the situation, though: how much light the stars had absorbed, and over what span of time, for instance.
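In actual luminescence dating, the “glow” translates into an age through a simple relationship: the age equals the total radiation dose the buried grains have accumulated since sunlight last reset them, divided by the annual dose delivered by their surroundings. The short sketch below merely illustrates that arithmetic with hypothetical numbers; it is not the procedure or the data from the Jinmium work.

    # Minimal sketch of the luminescence age relationship:
    #   apparent age = equivalent dose / environmental dose rate
    # The dose values below are hypothetical, chosen only for illustration.

    def luminescence_age(equivalent_dose_gy, dose_rate_gy_per_kyr):
        """Apparent burial age in thousands of years (kyr)."""
        return equivalent_dose_gy / dose_rate_gy_per_kyr

    # Suppose the trapped-charge signal corresponds to 12 grays of accumulated
    # dose and the sediment delivers about 1.5 grays per thousand years.
    age_kyr = luminescence_age(12.0, 1.5)
    print(f"Apparent age: {age_kyr:.0f} thousand years")  # prints 8

    # The catch the glow-in-the-dark analogy hints at: if the grains were only
    # partially exposed to sunlight before burial, they start with leftover
    # signal, and the apparent age overestimates the true burial age.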

Questions such as these drew eleven researchers back to the Jinmium site to apply more tests.3 Among them was the lead scientist from the group that published the 1996 results. This second team performed optically stimulated luminescence tests (subtly different from thermoluminescence) on the quartz grains and made radiocarbon measurements on charcoal fragments (from human activity) in the same sediment layer.

Quartz found near the top of the sediment appears to be 2,200 years old, while grains found at the maximum burial depth (two meters) measured roughly 22,000 years old.4 Dates obtained (by radiocarbon methods) from the charcoal samples ranged from 100 to 3,900 years ago.5 In the very layer that produced the 60,000-year thermoluminescence date, the radiometric tests showed the charcoal to be 1,000 to 3,000 years old. The team’s conclusion: the dates published in 1996 were off by “more than an order of magnitude.”6
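Spelled out with the round figures quoted above, the mismatch looks like this:

\[
\frac{60{,}000\ \text{years (thermoluminescence)}}{3{,}000\ \text{years (radiocarbon, upper end)}} = 20,
\qquad
\frac{60{,}000}{1{,}000} = 60,
\]

a discrepancy of roughly 20- to 60-fold, comfortably more than a single order of magnitude (a factor of ten).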

This reassessment does NOT rule out human occupation in Australia before three or four thousand years ago. Other sites in Australia where both radiocarbon dating and accelerator mass spectrometry have been applied yield dates somewhere in the neighborhood of 30,000 years.7 Such timing is not inconsistent with the earliest dates for human occupation of other regions of the eastern hemisphere outside Mesopotamia.8 These dates correspond with RTB’s biblical creation model to a sufficient degree that no major revision seems necessary at this time.

SIDEBAR

What about other dates for humanity that appear to challenge the Genesis chronology for creation and the spread of civilization? One comes from the Mount Carmel region of Israel, where indicators suggest the presence of “Homo sapiens” as early as 100,000 years ago.9 Another comes from Brazil, where a structure has been dated at 35,000 years.

As it turns out, both dates are based on thermoluminescence, which can only be trusted to provide upper limits. Further testing may show both the Israeli and Brazilian remains are much younger than thermoluminescence tests suggest. At this time, no substantial discrepancy exists between these findings and the Genesis creation chronology for humanity. I might add that in developing a creation chronology for The Genesis Question I used only measurements that are currently undisputed.

References:

  1. Hugh Ross, The Genesis Question, 2d ed. (Colorado Springs, CO: NavPress, 2001), 107-15, 119-25, 173-87.
  2. R. L. K. Fullagar, D. M. Price, and L. M. Head, “Early Human Occupation of Northern Australia: Archeology and Thermoluminescence Dating of Jinmium Rock Shelter, Northern Territory,” Antiquity 70 (1996): 751-73.
  3. Richard Roberts et al., “Optical and Radiocarbon Dating at Jinmium Rock Shelter in Northern Australia,” Nature 393 (1998): 358-62.
  4. Roberts et al., 361.
  5. Roberts et al., 360-61.
  6. Roberts et al., 362.
  7. B. David, R. Roberts, C. Tuniz, R. Jones, and J. Head, “New Optical and Radiocarbon Dates from Ngarrabullgan Cave, a Pleistocene Archeological Site in Australia: Implications for the Comparability of Time Clocks and for the Human Colonization of Australia,” Antiquity 71 (1997): 183-88.
  8. Tim Appenzeller, “Art: Evolution or Revolution,” Science 282 (1998): 1451-54.
  9. Anthropologists dispute whether the Mount Carmel specimens are modern humans (Homo sapiens sapiens) or archaic Homo sapiens.


New Discovery Confirms Life’s Early Appearance on Earth
Fazale (Fuz) Rana, Ph.D.

Paleontologist Niles Eldredge refers to it as “the most arresting fact that [he has] ever learned.”1 Many others in the scientific community share Eldredge’s astonishment. What causes their amazement?

Over the last decade or so, paleontologists have assembled a body of evidence indicating that life existed on Earth as far back as 3.8+ billion years ago.2 These life forms were morphologically simple, but biochemically and metabolically complex, single-celled microbes.3

Prior to 3.8+ billion years ago, life could not originate and find permanence on Earth because of hostile conditions caused primarily by frequent asteroid and cometary impacts. Around 3.9 billion years ago, the size and frequency of these impact events diminished. For the first time, oceans and a solid crust became permanent features on Earth.4 Immediately afterward, life appeared on Earth. As Eldredge puts it, “In the very oldest rocks that stand a chance of showing signs of life, we find those signs—those vestiges—of life. Life is intrinsic to the Earth!”5

This sudden appearance of metabolically sophisticated life forms poses problems for naturalistic origin-of-life scenarios. Evolutionary models stemming from naturalism predict that life should appear gradually on Earth, after a substantial percolation period. In contrast, the RTB model for life’s origin sees the sudden appearance of biochemically complex organisms as the fingerprint for God’s creative work.6

The chief evidence for early life on Earth comes from graphite deposits in the 3.8+ billion-year-old rocks found in western Greenland. The graphite’s carbon isotope composition (the ratio of carbon-12 to carbon-13) indicates that photosynthetic microbes produced it. Recently, however, paleontologists searching for ways to avoid obvious problems in evolutionary models have challenged the evidence for early life on Earth.7
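For readers unfamiliar with the convention, geochemists report this ratio as a δ13C value relative to a reference standard. The definition below is the standard formula; the typical ranges mentioned afterward are general values, not figures from the Greenland study:

\[
\delta^{13}\mathrm{C} \;=\; \left(\frac{\left(^{13}\mathrm{C}/\,^{12}\mathrm{C}\right)_{\mathrm{sample}}}{\left(^{13}\mathrm{C}/\,^{12}\mathrm{C}\right)_{\mathrm{standard}}} - 1\right) \times 1000\ \text{per mil}
\]

Because photosynthetic enzymes preferentially fix the lighter carbon-12, organic carbon typically ends up with δ13C values around −20 to −30 per mil, markedly “lighter” than inorganic carbonate carbon near 0 per mil. A strongly negative value preserved in ancient graphite is the signature that points to a biological, photosynthetic source.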

Recent work by Danish geochemists takes much of the sting out of these challenges. These researchers found independent confirmation of life’s residue in the 3.8+ billion-year-old rocks of western Greenland. Because oxidized uranium dissolves readily in seawater while thorium does not, an enrichment of uranium relative to thorium in ancient seafloor sediments points to oxygenated water, and on the early Earth the most plausible source of that oxygen is photosynthetic life. The uranium/thorium fractionation in these rocks thus compels the Danish scientists to conclude that photosynthetic microbes must have been present on early Earth.8

As investigators continue to probe Earth’s oldest rocks, the evidence for early life becomes more extensive and diverse. In 1997, paleontologist J. William Schopf marveled that “no one had foreseen that the beginning of life occurred so astonishingly early.”9 No one, that is, from a naturalistic perspective.

References:
  1. Niles Eldredge, The Triumph of Evolution and the Failure of Creationism (New York: W. H. Freeman and Company, 2000), 35.
  2. For references to the scientific literature see Fazale Rana and Hugh Ross, Origins of Life: Biblical and Evolutionary Models Face Off (Colorado Springs, CO: NavPress, 2004), 63-79.
  3. Rana and Ross, 63-79.
  4. Rana and Ross, 81-92.
  5. Eldredge, 35-36.
  6. Rana and Ross, 35-46.
  7. Rana and Ross, 63-79.
  8. Minik T. Rosing and Robert Frei, “U-Rich Archaean Sea-Floor Sediments from Greenland—Indications of >3700 Ma Oxygenic Photosynthesis,” Earth and Planetary Science Letters 217 (2004): 237-44.
  9. J. William Schopf, Cradle of Life: The Discovery of Earth’s Earliest Fossils (Princeton, NJ: Princeton University Press, 1999), 3.


Did Neanderthals and Humans Interbreed?
Fazale (Fuz) Rana, Ph.D.

Despite compelling evidence to the contrary,1 a minority of paleoanthropologists still believe (as do some Christians) that Neanderthals made a genetic contribution to modern humans through interbreeding. If Neanderthals interbred with modern humans, then by definition, they must have been human.

The case for Neanderthal-modern human interbreeding relies exclusively on morphological (structural, bodily) evidence. The first suggestion that humans and Neanderthals may have interbred came in 1999, when a team of paleoanthropologists reported a fossil find from Portugal’s Lapedo Valley dated at 24,500 years ago. Researchers recovered the complete skeletal remains of a young male child from a burial site.2 At that time, these paleoanthropologists interpreted the anatomy of the “Lagar Velho child” as a mix of modern human and Neanderthal features. From this, they concluded that these two species must have been closely related and must have regularly met and mated with one another.3

In 2003, the same team of paleoanthropologists claimed to have discovered another modern human-Neanderthal hybrid. This specimen, recovered in Romania, consists of a single lower jaw that dates to about 34,000 to 36,000 years ago, a time when modern humans and Neanderthals appear to have coexisted in Europe. Again, these researchers interpreted the jaw and dental anatomy as a mosaic of archaic, modern human, and Neanderthal features.4

Most paleoanthropologists dispute the interpretation of the Lagar Velho child and the Romanian finds as modern human-Neanderthal hybrids. Commenting on the Portuguese discovery, Ian Tattersall and Jeffrey Schwartz state, “The analysis . . . of the Lagar Velho child’s skeleton is a brave and imaginative interpretation, of which it is unlikely that a majority of paleoanthropologists will consider themselves persuaded.”5 Most researchers think that the Lagar Velho child was simply either an unusually stocky modern human child or one with a growth abnormality, and that the Romanian find represents a modern human jawbone with unusual features.

New research from the Max Planck Institute provides direct evidence that Neanderthals and modern humans did not interbreed.6 This work compared mitochondrial DNA recovered from four Neanderthals with mitochondrial DNA isolated from the remains of five early modern human fossils. The Neanderthal and modern human specimens all date between 30,000 and 40,000 years ago and were recovered from the same geographical locations. Investigators readily recovered Neanderthal-type DNA from the Neanderthal specimens but found only modern human DNA in the modern human remains. Based on statistical analysis, these workers concluded that it was unlikely that Neanderthals made any genetic contribution to the earliest modern humans. In other words, there is no conclusive evidence that Neanderthals and modern humans interbred, nor any hint of a possible evolutionary connection.
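The statistical analysis in the study rests on population-genetic modeling, but the basic sampling logic can be illustrated with a deliberately simplified sketch. Assume (purely for illustration, and not as a description of the researchers’ actual model) that some fraction of early modern human mitochondrial lineages were Neanderthal-type; the code below estimates how large that fraction would have to be before finding none of them in five sampled fossils becomes improbable.

    # Toy binomial illustration (NOT the population-genetic model used in the study):
    # if a fraction p of early modern human mtDNA lineages were Neanderthal-type,
    # the chance of finding zero such lineages in n sampled fossils is (1 - p)**n.
    # Any p that makes this chance fall below alpha can be considered ruled out.

    def excluded_contribution(n_samples: int, alpha: float = 0.05) -> float:
        """Smallest contribution fraction ruled out by an all-negative sample."""
        return 1.0 - alpha ** (1.0 / n_samples)

    p_min = excluded_contribution(5)
    print(f"Under this toy model, contributions above about {p_min:.0%} are unlikely")
    # prints roughly 45%

The published analysis drew on more sequence data and an explicit demographic model, which is why its conclusions are considerably stronger than this simple counting argument alone could justify.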

References:

  1. For references to the original scientific literature see Fazale R. Rana, “DNA Study Cuts Link with the Past,” Connections Vol. 2, No. 3 (2000): 3; Fazale R. Rana, “Neanderthal-To-Human Link Severed,” Connections Vol. 5, No. 2 (2003): 8-9.
  2. Constance Holden, “Ancient Child Burial Uncovered in Portugal,” Science 283 (1999): 169.
  3. B. Bower, “Fossil May Expose Humanity’s Hybrid Roots,” Science News 155 (1999): 295; Cidália Duarte et al., “The Early Upper Paleolithic Human Skeleton from the Abrigo do Lagar Velho (Portugal) and Modern Human Emergence in Iberia,” The Proceedings of the National Academy of Sciences, USA 96 (1999): 7604-09.
  4. Erik Trinkaus et al., “An Early Modern Human from Pestera cu Oase, Romania,” The Proceedings of the National Academy of Sciences, USA 100 (2003): 11231-36.
  5. Ian Tattersall and Jeffrey H. Schwartz, “Hominids and Hybrids: The Place of Neanderthals in Human Evolution,” The Proceedings of the National Academy of Sciences, USA 96 (1999): 7117-19.
  6. David Serre et al., “No Evidence of Neandertal mtDNA Contribution to Early Modern Humans,” PLoS Biology 2 (2004): 0313-17.


Does Ockham’s Razor Support Naturalism?
Kenneth Richard Samples

William of Ockham (c. 1285-1349), a Franciscan monk and philosopher,1 is remembered for his principle of parsimony or simplicity, popularly called “Ockham’s Razor.” He stated that “Entities are not to be multiplied without necessity,” and “What can be done with fewer [assumptions] is done in vain with more.” In other words, when confronted with two seemingly equal explanatory hypotheses, the simplest or most economical explanation should be granted logical deference.

Some atheists assert that application of Ockham’s Razor makes atheism and its accompanying worldview of naturalism a more reasonable alternative than Christian theism. They argue that atheism is to be preferred over theism because atheism posits at least one fewer entity (no God) in the inventory of ultimate reality than does theism. Atheists generally believe that their naturalistic, materialistic worldview can explain reality, truth, meaning, value, morality, beauty, and reason in one world, whereas Christian theism requires two worlds (both the physical and transcendent realms of reality).

Yet Ockham’s Razor, while meaningful in evaluating competing explanatory hypotheses of ultimate reality, cannot stand alone as the principal or final test of what is reasonable or true.2 The simplest explanation may deserve initial deference over the complex explanation, but sometimes what appears to be the simplest theory may actually be simplistically inadequate. In that case the more complex view becomes logically necessary. I suggest that Ockham’s Razor addresses only half of the necessary explanatory equation. The fuller logical perspective comes in what I stipulatively call the “mean test.” This test asserts that the worldview balanced between complexity and simplicity is a better barometer of ultimate truth and reason. Accordingly, an acceptable worldview (an interpretation of reality) will be neither too simple (the reductionistic fallacy) nor unnecessarily complex (Ockham’s Razor). The test states that the simplest, most economical, and yet fully orbed worldview is best (explanatorily superior). The mean test strives to guard against both superfluous and simplistically inadequate explanations of reality.3

Christians identify two weaknesses in naturalism when the mean test is applied. First, while naturalism may be in one sense simpler than its rival theism (by denying the existence of the transcendent world), naturalism is itself a metaphysical system, and not merely reducible to the scientific enterprise. In the end it doesn’t appear to be all that simple or precise. The idea that complex reality (the world, life, consciousness, etc.) can be reduced to or explained solely by the natural world seems a difficult and presumptuous claim. I remain unconvinced that naturalism is truly a simpler or more economical explanation of reality than is Christian theism.

Second, naturalism holds that such meaningful realities as life, the mind, personhood, and reason came from an accidental natural mechanism (e.g., phylogenetic evolution). But any such mechanism would lack these realities. The effect appears to be profoundly greater than the cause. How does this inconceivably improbable cause-effect anomaly square with the foundational scientific principle of causality? How can such rational and objective enterprises as logic, mathematics, and science be the result of an unguided, accidental, purely mindless natural process?

The mean test strikes me as a more balanced and reasonable calibrator of worldview truth-claims, and naturalism doesn’t score any higher on this test than does Christian theism. In fact, Christian theism’s robust explanatory power and scope (including helping human beings understand life and its challenges) may be one of its most probative features. The meaningful realities of life (the world, abstract entities, consciousness, morality, logic, etc.) need an adequate metaphysical ground. The God of the Bible stands not as a “god of the gaps” superstitious substitute for ignorance, but rather as a simple, economical, and yet fully orbed explanatory hypothesis for the various meaningful realities discovered in the world and in human life.

References:
  1. For a detailed discussion of William of Ockham’s philosophical and theological views, see Frederick Copleston, A History of Philosophy, vol. 3 (New York: Doubleday, 1993), 43-152; and Paul Edwards, ed., The Encyclopedia of Philosophy, vols. 7 and 8 (New York: Macmillan, 1967), s.v. “William of Ockham.”
  2. I summarize nine suggested worldview tests in Hugh Ross, Kenneth Samples, and Mark Clark, Lights in the Sky & Little Green Men (Colorado Springs, CO: NavPress, 2002), 156-58.
  3. A thoughtful and fair-minded comparison of the worldviews of naturalism and theism is provided in William J. Wainwright, Philosophy of Religion (Belmont, CA: Wadsworth, 1988), 166-75. Wainwright’s thinking has in some respects influenced my own on this topic.