Since scientists first determined that atmospheric carbon dioxide (CO2) was significantly lower during ice age periods than warm phases, they have sought to discover why, theorizing that it may be a function of ocean circulation, sea ice, iron-laden dust or temperature.
Yet no computer model based on existing evidence has been able to explain why CO2 levels were as much as one-third lower when an ice age settled in.
A new study published this week in Science Advances provides compelling evidence for a solution—the combination of sea water temperature variation and iron from dust off Southern Hemisphere continents.
"Many of the past studies that analyzed ocean temperatures made the assumption that ocean temperatures cooled at the same rate over the entire globe—about 2.5 degrees (Celsius)," said Andreas Schmittner, a climate scientist at Oregon State University and co-author on the study. "When they ran their models, temperature thus accounted for only a small amount of atmospheric CO2 decrease.
"We now know that the oceans cooled much more in some regions, as much as five degrees (C) in the mid-latitudes. Since CO2 is more soluble in cold water, the ocean had the potential to soak up a lot more carbon from the atmosphere than past studies accounted for—and it realized more of that potential."
Schmittner and his colleagues estimate that colder ocean temperatures would account for about half of the decrease in CO2 during the last glacial maximum—or height of the last ice age. Another third or so, they say, was likely caused by an increase in iron-laden dust coming off the continents and "fertilizing" the surface of the Southern Ocean. An increase in iron would boost phytoplankton production, absorbing more carbon and depositing it deep in the ocean.
The researchers' models suggest that this combination accounts for more than three-quarters of the reduced amount of atmospheric CO2 during the last ice age. During the last glacial maximum, CO2 levels were about 180 parts per million, whereas levels in 1800 A.D.—just prior to the Industrial Revolution—were at about 280 parts per million.
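The figures above can be checked with some back-of-envelope arithmetic; the sketch below uses only the values quoted in the article (the attribution fractions are the study's approximate estimates, not exact numbers):

```python
# Back-of-envelope check of the numbers quoted in the article.
glacial_co2 = 180.0        # ppm, last glacial maximum
preindustrial_co2 = 280.0  # ppm, circa 1800 A.D.

drawdown = preindustrial_co2 - glacial_co2     # 100 ppm
fraction_lower = drawdown / preindustrial_co2  # ~0.36, i.e. "about one-third lower"

# Attribution estimated in the study: roughly half from cooler oceans,
# roughly a third from iron fertilization of the Southern Ocean.
temperature_share = 0.5
iron_share = 1.0 / 3.0
combined = temperature_share + iron_share      # ~0.83, "more than three-quarters"

print(round(fraction_lower, 2))  # → 0.36
print(round(combined, 2))        # → 0.83
```

This confirms that the 180 ppm versus 280 ppm figures are consistent with the "about one-third lower" description, and that the two mechanisms together leave roughly a sixth of the drawdown unexplained, matching the remainder discussed below.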
Schmittner said the remaining amount of reduced carbon may be attributable to variations in nutrient availability and/or ocean alkalinity.
"The increase in iron likely resulted from ice scouring the landscape in Patagonia, Australia and New Zealand, pulling iron out of the rocks and soil," Schmittner said. "Since it was very cold and dry, the iron would have been picked up by the wind and deposited in the ocean.
"Our three-dimensional model of the global ocean agrees well with observations from ocean sediments from the last glacial maximum, giving us a high degree of confidence in the results."
The researchers say that when the Earth cooled during the last ice age, the oceans naturally cooled as well—except near the polar regions, which already were as cold as they could get without freezing. During warm phases, the difference in ocean surface temperatures between the high latitudes and the mid-latitudes was significant.
As warmer water moves toward Antarctica and begins to cool, the lost heat goes into the atmosphere, increasing the ocean's potential to soak up CO2.
"It's like when you take a beer out of the refrigerator," Schmittner said. "As it warms, the bubbles come out. Carbon dioxide is a gas; it dissolves in water and so can enter the ocean from the atmosphere, and it is more soluble in colder water. But that process takes a while, and therefore the ocean doesn't realize all of its potential to take up CO2 in those waters around Antarctica that fill much of the deep ocean."
When the mid-latitude oceans began cooling, they began soaking up more CO2 from the atmosphere and emitting less, because CO2 is more soluble in colder water.
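The temperature sensitivity of CO2 solubility can be illustrated with a standard Henry's-law calculation. This is a simplified sketch, not taken from the study: it uses textbook constants for CO2 in fresh water (a Henry's-law constant of about 0.034 mol/(L·atm) at 25 °C and a van 't Hoff temperature coefficient of about 2400 K), and ignores salinity, pressure, and carbonate chemistry.

```python
import math

# Illustrative only: Henry's-law solubility of CO2 versus temperature,
# with a van 't Hoff correction. Constants are textbook values for
# fresh water, assumed here purely for illustration.
KH_298 = 0.034      # mol / (L*atm) at 298.15 K (25 C)
VANT_HOFF_C = 2400  # K, temperature-dependence coefficient for CO2

def henry_constant(temp_c):
    """Henry's-law constant for CO2 at a temperature given in Celsius."""
    t_kelvin = temp_c + 273.15
    return KH_298 * math.exp(VANT_HOFF_C * (1.0 / t_kelvin - 1.0 / 298.15))

# Cooling mid-latitude surface water by 5 C (e.g. from 15 C to 10 C)
# noticeably raises how much CO2 it can hold at the same partial pressure.
warm = henry_constant(15.0)
cold = henry_constant(10.0)
print(f"CO2 is {cold / warm:.2f}x more soluble after 5 C of cooling")
```

Even in this crude approximation, a 5 °C drop in surface temperature increases equilibrium CO2 solubility by well over ten percent, which gives a feel for why the stronger-than-assumed mid-latitude cooling matters so much for the glacial carbon budget.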
"It was the perfect combination that can explain almost exactly why CO2 levels were about one-third lower during ice age periods," Schmittner said.