kw: earth, history, geology, radioactivity
It didn't require much sideways thinking, after yesterday's post about uranium and radioactive decay, to recall the history of these elements in the Earth. The planet is 4.5 billion years old (4.5 Gyr), and even these long-lived isotopes must have changed substantially over that time. One question that comes up sometimes in Freshman Earth Science courses is, "How much does radioactivity affect the Earth's temperature?" A more thoughtful student may also ask whether it was much greater in the past.
The short answer to the first question is, "Not much." In spite of a century of intensive geological exploration, the total heat flow from Earth's interior is known only to within a factor of about two: somewhere between 30 and 60 terawatts (3–6×10¹³ W). A commonly accepted figure is 40 TW. About 75% of this is thought to be radiogenic heat.
This diagram, representing a view toward the upper end (60 TW), has radiogenic heating near 52 TW, and gives the breakdown by the four isotopes that contribute significantly: ²³⁸U, ²³⁵U, ²³²Th, and ⁴⁰K. The source of the image is this Jrank article. In spite of this diagram, the article's author, David Rothery, considers the total heat flux from Earth's interior to be closer to 40 TW.
Let us first compare that to the solar influx. The Solar Constant (which varies over a narrow range) is about 1,350 W/m², or 1.35×10⁹ W/km². The Earth's nominal radius is 6,370 km, so it intercepts sunlight over a disk of about 1.275×10⁸ km², for a total of 1.72×10¹⁷ W. One-third of this is reflected outright by clouds and ice, leaving about 1.1×10¹⁷ W to reach the ground and heat the surface. Divide this by 40 TW, and we see it is roughly 2,800 times the internal heat flow.
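For anyone who wants to check the arithmetic, here is a minimal back-of-envelope sketch in Python; the constants are the rounded values used above, not precision data.

```python
import math

SOLAR_CONSTANT = 1.35e9      # W/km^2 (about 1,350 W/m^2)
EARTH_RADIUS   = 6370.0      # km, nominal
REFLECTED      = 1.0 / 3.0   # fraction bounced back by clouds and ice (rounded)
INTERNAL_HEAT  = 40e12       # W, the commonly accepted 40 TW

disk_area   = math.pi * EARTH_RADIUS ** 2      # intercepting disk, ~1.275e8 km^2
intercepted = SOLAR_CONSTANT * disk_area       # ~1.72e17 W
absorbed    = intercepted * (1.0 - REFLECTED)  # ~1.1e17 W reaches the surface

print(f"disk area   : {disk_area:.3e} km^2")
print(f"intercepted : {intercepted:.3e} W")
print(f"absorbed    : {absorbed:.3e} W")
print(f"solar / internal heat: {absorbed / INTERNAL_HEAT:,.0f}")  # ~2,870
```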
Now, whether radiogenic heating is 52 TW, as shown in this diagram, or closer to 30 TW (0.75×40 TW), it doesn't contribute much heat compared to the Sun. How about in the past? The diagram shows that when Earth first came together, radioactive heating from these four isotopes was 8× what it is today. Other, shorter-lived isotopes no doubt added significantly to this, but they didn't last long, and we have no direct evidence of how abundant they were 4.5 billion years ago (4.5 Ga).
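The 8× figure can be sanity-checked from the half-lives alone: running the decay backward multiplies each isotope's present output by 2^(t/t½). Here is a sketch; the present-day shares of the radiogenic budget are illustrative assumptions (uranium and thorium dominating, ⁴⁰K around a fifth), not measured values.

```python
# Half-life in Gyr and an assumed share of today's radiogenic heat.
# The shares are rough illustrative guesses, not measured data.
ISOTOPES = {
    "U-238":  (4.47,  0.38),
    "U-235":  (0.704, 0.02),
    "Th-232": (14.0,  0.40),
    "K-40":   (1.25,  0.20),
}

def relative_heat(t_ga):
    """Radiogenic output t_ga billion years ago, relative to today.

    Going back in time, each isotope's output grows as 2**(t / half_life).
    """
    return sum(share * 2.0 ** (t_ga / half) for half, share in ISOTOPES.values())

for t in (0.0, 2.0, 4.5):
    print(f"{t:.1f} Ga: {relative_heat(t):.1f}x today")
```

With these shares the extrapolation gives roughly five times today's output at 4.5 Ga and not quite twice at 2 Ga; the diagram's 8× evidently assigns larger shares to the short-lived ⁴⁰K and ²³⁵U, so the exact multiple is model-dependent.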
Now, the Sun was substantially fainter then, perhaps 30% below its present output, but we don't know how much of its radiation reached the surface. At least during the Hadean eon, between 4 and 4.5 Ga, when much of the planet was molten, there were likely no water clouds in the atmosphere, but we don't know what the atmosphere was like. Still, radiogenic heat probably never supplied more than about 1/400th of the solar heat. It has never been much of a factor in the planet's surface temperature.
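A crude bound (the early albedo and atmosphere are unknown, so every input here is an assumption): take the young Sun at 70% of today's intercepted 1.72×10¹⁷ W and radiogenic heat at eight times the modern 30–52 TW, and solar input still exceeds radiogenic by a factor of a few hundred.

```python
# Crude bound on early solar vs radiogenic heating; every input is
# an assumption, since the Hadean albedo and atmosphere are unknown.
INTERCEPTED_TODAY = 1.72e17           # W, from the sketch above
FAINT_SUN_FACTOR  = 0.70              # young Sun at ~70% of present luminosity
RADIOGENIC_TODAY  = (30e12, 52e12)    # W, low and high modern estimates
EARLY_MULTIPLE    = 8.0               # the diagram's factor at 4.5 Ga

early_solar = INTERCEPTED_TODAY * FAINT_SUN_FACTOR
for rad_now in RADIOGENIC_TODAY:
    ratio = early_solar / (rad_now * EARLY_MULTIPLE)
    print(f"radiogenic {rad_now / 1e12:.0f} TW today -> solar/radiogenic ~ {ratio:,.0f}")
```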
However, today it supplies about 3/4 of the energy that drives plate tectonics. The other quarter is remnant primordial heat and the heat of crystallization as the outer liquid core slowly freezes onto the solid inner core. Two billion years ago, radiogenic heat was more than twice what it is today, and I expect that plate tectonics ran at a brisker clip. There was also more volcanism than today.
Can we predict the future? I don't have a good handle on how much internal heating is required to keep plate tectonics going. Without it, the biosphere and atmosphere would change a lot, and the continents would erode down to just below sea level, leaving an ocean planet. If the critical value is half of today's amount (this is a wild guess), we can predict that such a level will be reached in about two more billion years. That's the time we have left to figure out how to live on an ocean planet, or how to leave the planet altogether. Given humanity's tendency to procrastinate, it probably isn't enough time!
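The same four-isotope decay law from the earlier sketch can be run forward to estimate when the output halves; the critical threshold itself is the wild guess above, and the present-day shares are again illustrative assumptions.

```python
# Half-life in Gyr and an assumed share of today's radiogenic heat,
# as in the earlier sketch; the shares are illustrative guesses.
ISOTOPES = {
    "U-238":  (4.47,  0.38),
    "U-235":  (0.704, 0.02),
    "Th-232": (14.0,  0.40),
    "K-40":   (1.25,  0.20),
}

def future_heat(t_gyr):
    """Radiogenic output t_gyr billion years from now, relative to today."""
    return sum(share * 2.0 ** (-t_gyr / half) for half, share in ISOTOPES.values())

t = 0.0
while future_heat(t) > 0.5:  # step forward until the output has halved
    t += 0.01
print(f"radiogenic heat halves in about {t:.1f} Gyr")
```

With these shares the halving time comes out closer to five billion years than two, because the long-lived ²³⁸U and ²³²Th dominate today's budget; either way, the clock is running.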