by Carolyn Gramling Thursday, January 5, 2012
EARTH’s Carolyn Gramling is in Vienna, Austria, at the European Geosciences Union meeting this week. Here are a few more sessions that she has found interesting (for more from the meeting, see her first “Dispatches”).
The Sahel, the semi-arid transition zone south of the Sahara Desert, supports about 50 million inhabitants and is particularly vulnerable to changes in precipitation. In 1975, meteorologist Jule Charney of MIT suggested that desertification in the region — the advance of the Sahara into the Sahel — might be due to overgrazing, through a biogeophysical feedback mechanism. Charney suggested that the removal of vegetative cover would increase the albedo over the region, reflecting more radiative energy back into space. That, in turn, would reinforce the sinking circulation of dry air masses in the troposphere; those dry air masses would produce less rainfall, which in turn would foster even less plant growth.
But was overgrazing or natural climate variability to blame for the drought that struck the Sahel during the 1970s and 1980s? In trying to answer this question, Ulrike Holzwarth of the University of Bremen in Germany turned to a deep-sea sediment core, drilled off Mauritania, that records dust and river inputs carried from the Sahel region into the ocean. The core represents more than 3,000 years of climate and ocean data.
To disentangle the natural variability from the impact of overgrazing, Holzwarth and her team studied the core’s record of dust, sediment, pollen and species of dinoflagellate cysts — the resting stages of algae that bloom in response to increased nutrient inputs, such as from agriculture. Each of these tells a different piece of the climate story of the region: Dust inputs began to rise from A.D. 1500 onward, and rose more dramatically in the last 70 years, related to the onset of commercial agriculture and increased erosion. Pollen from desert plants — which increases when rainfall decreases — increasingly accumulated in the core from A.D. 1500 to 1700, and even more so in the last 70 years, corresponding to a general aridification trend.
The accumulation of the dinoflagellate cysts also increased dramatically from about A.D. 1700 to 1800. But there was no corresponding signal over the last 70 years, Holzwarth announced Tuesday at the EGU meeting. Furthermore, she said, the drought of the 1970s to 1980s occurred before the measured increases in desert pollen in the last few decades. All of this, she said, suggests that overgrazing wasn’t the cause of the drought after all.
The eruption of Iceland’s Laki volcano in June 1783 had a distinct impact on the summer weather around the North Atlantic: Western Europe was unusually hot and hazy, and North America unusually cold. The following winter was also anomalous, producing record cold and snow in the northeastern United States. George Washington, James Madison and Benjamin Franklin complained of the worst winter in memory; Franklin suggested that both the chilly summer and the snowy winter were due to the Laki eruption.
But the winter, at least, might not have been Laki’s fault. Rosanne D’Arrigo, a geologist at the Tree-Ring Lab of Columbia University’s Lamont-Doherty Earth Observatory, was thinking about the most recent example of an anomalously snowy winter in the northeastern United States — the record-breaking winter of 2009-2010. That winter was caused by the rare — but natural, not anthropogenic — concurrence of two large-scale atmospheric patterns: a negative phase of the North Atlantic Oscillation (NAO) and an El Niño Southern Oscillation (ENSO) warm event. D’Arrigo wondered: Could the same one-two punch have been at work during the winter of 1783-1784?
D’Arrigo used tree rings to reconstruct the past 600 years of NAO and ENSO phases, and presented her data in a poster at the EGU conference on Wednesday (her findings have also just appeared in Geophysical Research Letters). The data were quite clear, she says: A combined, negative NAO-ENSO warm event did indeed happen in 1783-1784. In fact, D’Arrigo says, that winter was the second-most powerful confluence of those two phases in the last 600 years.
The most powerful such combination? Well, that was just a winter ago.
Winnipeg, in Manitoba, Canada, has a long history of using its chilly groundwater in the summertime as a handy source of refrigeration. But recently, city managers were becoming a bit concerned: The urban heat island effect, in which cities become anomalously warmer than their surroundings in the summer due to an increase of pavement and a decrease in vegetative cover, doesn’t just stay on the surface. It extends underground, says Grant Ferguson, a hydrogeologist at St. Francis Xavier University in Nova Scotia, Canada.
Beneath the center of a city, Ferguson says, urban heat islands can increase underground temperatures by as much as 11 degrees Celsius, which warms up the groundwater as well — and that could considerably hamper the cooling power of the groundwater in summer.
But there may be a bright side — depending on how you look at it. Doctoral student Ke Zhu of the University of Tübingen in Germany, along with Ferguson and other scientists, has been investigating whether it would be feasible to use the subsurface urban heat island effect to generate geothermal energy. In a poster presented at the EGU conference Wednesday, they assessed the geothermal energy potential of warmed-up groundwater beneath cities including Winnipeg, as well as Cologne, Germany, and Zurich, Switzerland. They found that it might be possible to supply as much as six months’ or even a year’s worth of Winnipeg’s heating needs this way. It might not be the primary source of energy for a city, Ferguson says — “but it’s a little extra resource.”
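The kind of estimate behind such an assessment can be sketched with simple heat-storage arithmetic. The numbers below are illustrative assumptions, not figures from Zhu and Ferguson’s study:

```python
# Back-of-envelope estimate of heat stored in urban-warmed groundwater.
# All parameter values are assumed for illustration, not taken from the study.

WATER_DENSITY = 1000.0        # kg per cubic meter
WATER_HEAT_CAPACITY = 4186.0  # J per (kg * K)

def recoverable_heat_joules(aquifer_volume_m3, porosity, delta_t_kelvin):
    """Heat stored in the groundwater alone (ignores the rock matrix)."""
    water_mass_kg = aquifer_volume_m3 * porosity * WATER_DENSITY
    return water_mass_kg * WATER_HEAT_CAPACITY * delta_t_kelvin

# Assumed: a 10-square-kilometer urban footprint over a 20-meter-thick
# aquifer with 20 percent porosity, warmed 5 K by the subsurface heat island.
heat_j = recoverable_heat_joules(10e6 * 20, 0.20, 5.0)
print(f"{heat_j / 3.6e9:,.0f} MWh of stored heat")  # 1 MWh = 3.6e9 J
```

Even with these modest assumptions the stored heat runs to hundreds of thousands of megawatt-hours — consistent with the idea of a meaningful, if secondary, heating resource.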
There is an entire field of scientists studying small stones that exist in a fish’s inner ear. Those calcium carbonate stones, called otoliths, tell scientists about the chemistry of the water in which the fish lived. Because the fish take up a range of elements through their gills, otoliths can contain multiple elements derived from seawater, particularly calcium, strontium and manganese. Strontium, according to ecologist Karin Limburg of the SUNY College of Environmental Science and Forestry in Syracuse, N.Y., is a favorite element for otolithologists: They use measured ratios of strontium to calcium in the otoliths as a proxy for sea temperatures and salinity (the higher the strontium-to-calcium ratio, the lower the temperature and the salinity of the water).
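The ratio calculation itself is straightforward: convert each element’s measured mass concentration to moles and divide. A minimal sketch, with made-up concentrations that are purely illustrative (not data from Limburg’s work):

```python
# Molar element-to-calcium ratios, as used in otolith proxy studies.
# The sample concentrations below are invented for illustration only.

def element_ca_ratio(element_ppm, element_molar_mass, ca_ppm, ca_molar_mass=40.08):
    """Return a molar element/Ca ratio from mass concentrations in ppm."""
    element_mol = element_ppm / element_molar_mass
    ca_mol = ca_ppm / ca_molar_mass
    return element_mol / ca_mol

# Assumed strontium (molar mass 87.62) and calcium concentrations in a sample:
sr_ca = element_ca_ratio(element_ppm=2000.0, element_molar_mass=87.62,
                         ca_ppm=380000.0)
print(f"Sr/Ca = {sr_ca * 1000:.2f} mmol/mol")
```

The same function applies to the manganese-to-calcium ratios Limburg examined; only the element concentration and molar mass change.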
But there are a lot of elements in the otoliths. What other stories might they have to tell? Limburg was particularly interested in manganese, which might be useful as a proxy for low-oxygen (hypoxic) conditions in places such as the Baltic Sea — currently the largest anthropogenic dead zone in the world.
To see if she could identify a distinct signal for hypoxia using manganese, Limburg analyzed otoliths from young codfish that lived during a particularly hypoxic event in the Baltic Sea in the early 1990s. Indeed, she found, the manganese-to-calcium ratios in otoliths from that period looked considerably different from the ratios in cod otoliths from a decade later — and from Stone Age cod otoliths as well. All of this suggests, she announced Thursday at the EGU meeting, that otolith chemistry could be used to track historic — and even prehistoric — fish interactions with dead zones.
© 2008-2021. All rights reserved. Any copying, redistribution or retransmission of any of the contents of this service without the express written permission of the American Geosciences Institute is expressly prohibited.