The researchers measured the ratio of nitrogen isotopes (atoms with the same number of protons but different numbers of neutrons) preserved within the carbonate shells of a group of marine microfossils called foraminifera. The isotope ratios indicated that nitrogen concentrations indeed declined during the cold periods when iron deposition and productivity rose, consistent with the dust-borne iron fertilization theory. Ocean models, together with the strong correlation between the sediment core changes and the known changes in atmospheric CO2, suggest that this iron fertilization of Southern Ocean plankton can explain roughly half of the CO2 decline during peak ice ages.
Although Martin had proposed that purposeful iron addition to the Southern Ocean could reduce the rise in atmospheric CO2, Sigman noted that the amount of CO2 removed through iron fertilization is likely to be minor compared to the amount of CO2 that humans are now pushing into the atmosphere.
"The dramatic fertilization that we observed during ice ages should have caused a decline in atmospheric CO2 over hundreds of years, which was important for climate changes over ice age cycles," Sigman said. "But for humans to duplicate it today would require unprecedented engineering of the global environment, and it would still only compensate for less than 20 years of fossil fuel burning."
Edward Brook, a paleoclimatologist at Oregon State University who was not involved in the research, said, "This group has been doing a lot of important work in this area for quite a while and this is an important advance. It will be interesting to see if the patterns they see in this one spot are consistent with variations in other places relevant to global changes in carbon dioxide."
Contact: Catherine Zandonella