Explaining Faster Star Formation in the Past

Grilling a tasty, medium-rare hamburger requires the right kind of fire. Too much heat causes the outside to taste like charcoal. An overly cool fire cooks too slowly and leaves the inside of the burger brown and dried out. But a just-right fire sears the outside, retains the juices inside, and produces an enjoyable burger-eating experience. A similar principle applies to the universe’s habitability, particularly to how quickly stars formed throughout its history.

For more than a decade, astronomers have known that stars formed more rapidly in the past than they do today. At present, only a few stars form each year in a typical large spiral galaxy like the Milky Way, but five to ten billion years ago, the same kind of galaxy formed stars at roughly ten times that rate. This change in star-formation rate (SFR) could have many causes. The two leading ideas posit either a decrease in star-formation efficiency over time or a decrease in the gas available to form stars. Observations distinguishing between these two ideas are difficult to acquire since they require detailed images of galaxies residing billions of light-years away.

Recently, an international team of astronomers used optical and infrared images and spectra, as well as radio observations, to determine the fraction of cold molecular gas (the component of a galaxy that collapses to form stars) contained in a sample of distant galaxies. In contrast to previous work, this sample contained more “normal” galaxies (which are more difficult to detect) rather than the rarer but extremely bright objects (like quasars and merging galaxies). As the researchers point out, the more-normal galaxy sample minimizes the chance of biased results, just as a study of people around 5′8″ tall represents the general population better than a study of those 7′ tall. The use of advanced space telescopes like Hubble and Spitzer makes such observations possible.

Results from a detailed analysis of these observations1 indicate that the change in SFR results from a decreasing supply of gas to form stars. In particular, the researchers found that the average ratio of cold gas mass to total baryonic mass (i.e., the normal matter in gas and stars, excluding dark matter) in the sample of distant galaxies is three to ten times greater than the ratio in large spiral galaxies today. Additionally, the galaxies seen earliest in cosmic history (ten billion years ago) showed a higher gas fraction than those seen somewhat later (eight billion years ago). The research also suggests that ongoing star formation requires a semi-continuous replenishment of gas that declines over time.
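To make the gas-fraction comparison concrete, the short Python sketch below computes f_gas = M_gas / (M_gas + M_star), the ratio described above. The masses are illustrative placeholders chosen only to show how a several-fold contrast arises; they are not values taken from Tacconi et al.

```python
# Rough sketch: comparing molecular gas fractions, f_gas = M_gas / (M_gas + M_star).
# Masses below are illustrative placeholders (in units of 10^10 solar masses),
# not measurements from Tacconi et al. (2010).

def gas_fraction(m_gas, m_star):
    """Fraction of baryonic mass (gas + stars) held in cold molecular gas."""
    return m_gas / (m_gas + m_star)

# A present-day Milky Way-like spiral: only a few percent molecular gas.
f_today = gas_fraction(m_gas=0.3, m_star=5.0)

# A hypothetical distant star-forming galaxy with a much larger gas reservoir.
f_distant = gas_fraction(m_gas=3.0, m_star=5.0)

print(f"today:   f_gas = {f_today:.2f}")    # ~0.06
print(f"distant: f_gas = {f_distant:.2f}")  # ~0.38, roughly six times higher
```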

The fact that the SFR in galaxies starts high and decreases over time provides part of an answer to the question of why humanity does not appear until the universe is more than 13 billion years old. Before the universe could support advanced life, sufficient quantities of elements heavier than helium needed to be synthesized, and this synthesis occurred in the hearts of massive stars that eventually exploded as supernovae. While the quantities of stable elements like carbon, oxygen, and iron steadily increase as more stars form, radioactive elements like uranium and thorium build to a peak value and then decay away. The Sun formed near the peak of the uranium and thorium abundance. The heat released as these elements decay helps maintain the life-essential plate tectonics Earth experiences.
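The peak-then-decline behavior follows from simple bookkeeping: stable elements only accumulate as stars form, while radioactive ones are produced the same way but also decay (uranium-238 has a half-life of about 4.5 billion years). The toy model below assumes an arbitrary declining production rate standing in for the cosmic SFR; none of the numbers come from the cited study, and it illustrates the trend only qualitatively.

```python
import math

# Toy model: element production tracks a declining star-formation rate,
# while radioactive isotopes also decay. All numbers are illustrative.

T_HALF_U238 = 4.47                    # uranium-238 half-life in Gyr
LAMBDA_U = math.log(2) / T_HALF_U238  # corresponding decay constant

def production_rate(t):
    """Assumed production rate that falls off like the cosmic SFR (arbitrary units)."""
    return math.exp(-t / 4.0)

def abundances(t_end, dt=0.01):
    """Integrate stable and radioactive abundances from t = 0 to t_end (in Gyr)."""
    stable, radioactive, t = 0.0, 0.0, 0.0
    while t < t_end:
        produced = production_rate(t) * dt
        stable += produced                                      # stable elements only accumulate
        radioactive += produced - LAMBDA_U * radioactive * dt   # U/Th also decay away
        t += dt
    return stable, radioactive

# The stable abundance climbs steadily; the radioactive one peaks, then declines.
for t in (2, 4, 6, 8, 10, 13.8):
    s, r = abundances(t)
    print(f"t = {t:5.1f} Gyr: stable = {s:.2f}, radioactive = {r:.2f}")
```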

Our planet needed to form at the just-right time in order to possess enough uranium and thorium, yet not too much of other elements. Also, a higher SFR means more supernovae, so a planet forming earlier in cosmic history would have endured far heavier cosmic radiation bombardment. Earth formed at the just-right time to meet all of life’s requirements. Such “fine-timing” points toward an intelligent Creator purposely preparing a fit habitat for humanity.

Endnotes
  1. L. J. Tacconi et al., “High Molecular Gas Fractions in Normal Massive Star-forming Galaxies in the Young Universe,” Nature 463 (February 11, 2010): 781–84.