It’s hard to imagine a refrigerator (at least in the U.S.) that doesn’t have a gallon or two of milk in it. Milk is a staple in the diet of humans—young and old—around the world.
But it wasn’t always that way. For much of human history, people could not digest milk as adults because the body stopped producing the digestive enzyme for the milk sugar, lactose, as they grew out of infancy.
The enzyme lactase (which breaks down lactose) is produced by all mammals, humans included, early in life. Normally, the gene for this enzyme is no longer expressed as mammals mature. In some human adults, however, the lactase gene remains expressed into adulthood. This condition, referred to as lactase persistence, appears to have evolved in humans when animal domestication became widespread about 10,000 years ago as part of the Neolithic revolution.
Lactase persistence is commonplace in Northern and Central European populations and occurs to a lesser extent in people groups from Southern and Eastern Europe. The ability to digest milk sugar into adulthood varies for Middle Eastern and African populations, correlating with pastoralist lifestyles. Interestingly, recent work indicates that lactase persistence evolved independently in European and African people groups.1 In European people groups, lactase persistence stems from a single change (or mutation) in the part of the DNA sequence that controls expression of the gene that encodes lactase. A different mutation yielded lactase persistence in African populations.
Evolutionary biologists propose two models to explain the origin of lactase persistence. The leading hypothesis argues that the mutation responsible for lactase persistence arose soon after animals were domesticated. (Most studies support this idea.) The ability to digest nutrient-rich animal milk offered an obvious advantage, so the trait took hold and spread quickly through human populations. The other model, a reverse-cause hypothesis, asserts that the mutation for lactase persistence was present in humans well before the Neolithic revolution. On this view, only those humans who could already digest lactose domesticated animals; those who couldn’t didn’t pursue that particular lifestyle.
A recent study, which explored the natural history of lactase persistence, directly evaluated these two models.2 Researchers screened ancient DNA, isolated from the fossil remains of eight humans recovered from several sites in Europe, for the mutation that imparts lactase persistence. These remains dated to between 7,000 and 8,000 years ago. The analysis revealed no evidence for lactase persistence in any of the individuals. This result gives no reason to believe that lactase persistence existed prior to the Neolithic revolution. It appears that humans evolved the ability to digest milk sugar in adulthood only recently, after animals were first domesticated.
But is evidence that humans evolved also evidence for human evolution (the notion that humans emerged from an ape-like creature over the span of 6 to 7 million years through a series of transitional forms)? Not necessarily. (There are many reasons, discussed in Who Was Adam?, to be skeptical of evolutionary explanations for the origin of humanity.) The emergence of lactase persistence is simply an example of a microevolutionary change (variation within a species) in which a single mutation alters the expression of a single gene, allowing humans to retain the ability to digest milk sugar into adulthood. In fact, it could be argued from a creationist perspective that the ability of humans (and other creatures) to adapt through microevolutionary change is evidence for God’s provision and providence.
The evolution of lactase persistence falls into the same category as: (1) the acquisition of antibiotic resistance by bacteria; (2) the development of pesticide and herbicide resistance by insects and plants, respectively; (3) the change in wing color of the peppered moth; and (4) the variation in beak shape among the finches of the Galapagos Islands. These familiar examples of evolutionary change are often cited as evidence for biological evolution. Microevolutionary changes, however, don’t necessarily extend to macroevolutionary change (the creation of biological novelty through undirected evolutionary processes).
Microevolution is a fact. On the other hand, there are plenty of reasons to be skeptical of macroevolution.