Archive for 2008

E. Zorita et al., How unusual is the recent series of warm years?

Citation: Zorita, E., T. F. Stocker, and H. von Storch (2008), How unusual is the recent series of warm years?, Geophys. Res. Lett., 35, L24706, doi:10.1029/2008GL036228.

How unusual is the recent series of warm years?

E. Zorita (Institute for Coastal Research, GKSS Research Centre, Geesthacht, Germany), T. F. Stocker (Physics Institute and Oeschger Centre for Climate Change Research, University of Bern, Bern, Switzerland) and H. von Storch (Institute for Coastal Research, GKSS Research Centre, Geesthacht, Germany)

Previous statistical detection methods, based partially on climate model simulations, indicate that, globally, the observed warming very probably lies outside the natural variations. We use a simpler approach to assess recent warming at different spatial scales without making explicit use of climate simulations. It considers the likelihood that the observed recent clustering of warm record-breaking mean temperatures at global, regional and local scales may occur by chance in a stationary climate. Under two statistical null hypotheses, autoregressive and long-memory, this probability turns out to be very low: lower than p = 0.001 for the global records, and even lower for some regional records. The picture for the individual long station records is not as clear, as the number of recent record years is not as large as for the spatially averaged temperatures.
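
The core of such a record-counting test is easy to sketch as a Monte Carlo simulation. The sketch below assumes an AR(1) null hypothesis with purely illustrative parameters (the series length, autocorrelation, window and record count are placeholders, not values from the paper) and is not the authors' actual procedure:

```python
import numpy as np

def count_recent_records(series, window):
    """Count record-breaking values falling in the last `window` years."""
    running_max = np.maximum.accumulate(series)
    is_record = series >= running_max   # True wherever a new running maximum is set
    return int(is_record[-window:].sum())

def p_value_ar1(n_years=150, window=17, observed_records=9,
                phi=0.6, n_sim=20_000, seed=0):
    """Chance of >= observed_records records in the final `window` years of a
    stationary AR(1) series with lag-1 autocorrelation `phi` (all assumed)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        noise = rng.standard_normal(n_years)
        x = np.empty(n_years)
        x[0] = noise[0]
        for t in range(1, n_years):
            x[t] = phi * x[t - 1] + noise[t]
        if count_recent_records(x, window) >= observed_records:
            hits += 1
    return hits / n_sim

print(p_value_ar1())   # a small fraction: record clustering is rare under the null
```

A long-memory null could be tested the same way by swapping the AR(1) generator for a fractional-Gaussian-noise generator.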

(Received 5 October 2008, accepted 18 November 2008, published 30 December 2008.)

Link to abstract: http://www.agu.org/pubs/crossref/2008/2008GL036228.shtml

No Quick Or Easy Technological Fix For Climate Change, Researchers Say


ScienceDaily (Dec. 27, 2008) — Global warming, some have argued, can be reversed with a large-scale "geoengineering" fix, such as having a giant blimp spray liquefied sulfur dioxide in the stratosphere or building tens of millions of chemical filter systems in the atmosphere to filter out carbon dioxide.

But Richard Turco, a professor in the UCLA Department of Atmospheric and Oceanic Sciences and a member and founding director of UCLA's Institute of the Environment, sees no evidence that such technological alterations of the climate system would be as quick or easy as their proponents claim and says many of them wouldn't work at all.

Turco will present his new research on geoengineering — conducted with colleague Fangqun Yu, a research professor at the State University of New York–Albany's atmospheric sciences research center — today and Thursday at the American Geophysical Union's annual meeting in San Francisco.

"We're talking about tinkering with the climate system that affects everybody on Earth," said Turco, an atmospheric chemist with expertise in the microphysics of fine particles suspended in the atmosphere. "Some of the ideas are extreme. There would certainly be winners and losers, but no one would know who until it's too late.

"If people are going to pursue geoengineering, they have to realize that it won't be quick, cheap or easy; indeed, suggestions that it might be are utter nonsense, and possibly irresponsible. Many of these ideas would require massive infrastructure and manpower commitments. For example, one concept to deliver reflective particles to the upper atmosphere on aircraft would require numerous airports, fleets of planes and a weather forecasting network dedicated only to this project. Its operation might be comparable to the world's entire commercial flight industry. And even after that massive investment, the climatic response would be highly uncertain."

Given the difficulties of reducing greenhouse gas emissions, the idea of a simple large-scale technological solution to climate change can seem very appealing.

"Global warming due to carbon dioxide emissions appears to be happening even faster than we expected," Turco said. "Carbon dioxide emissions are continuing to grow despite all of the warnings about climate change, despite all of the data showing such change is occurring and despite all of the efforts to control carbon emissions. The emissions are rising, in part, because China and India are using increasingly more energy and because fossil fuels still represent the cheapest source of energy.

"If we continue down this path, the climate is likely to change dramatically — major ice sheets could melt, sea levels could rise, it may evolve into a climate catastrophe. So it is tempting to seek an alternative response to climate change in case we can't get emissions under control. The result is that more and more geoengineering proposals are surfacing. Some of the people developing such proposals know what they're talking about; many don't."

Turco and Yu have been studying a particular geoengineering approach that involves the injection of nanoparticles, or their precursor gases — such as sulfur dioxide or hydrogen sulfide — into the stratosphere from aircraft or large balloons.

While our climate system normally involves a balance between incoming sunlight and outgoing heat radiation, excess atmospheric greenhouse gases trap additional heat and cause the Earth's temperature to rise, Turco noted. "One way to control the potential warming is to reduce the emissions of greenhouse gases," he said. "We haven't been able to get a handle on that. Another idea, instead of reducing emissions, is to somehow compensate for them."

Injecting sulfur dioxide or other toxic gases into the stratosphere in gaseous or liquefied form would mean that planes or balloons would have to fly as high as 13 miles — higher than any commercial aircraft can reach. And the amounts involved run to many millions of tons.

"Some of these proposals are preposterous, mind-boggling," Turco said. "What happens, for example, when you spray liquefied sulfur dioxide into the stratosphere? Nobody knows."

In a study published earlier this year, Turco analyzed what happens when a stream of very small particles is injected into the atmosphere. He showed that when the particles are first emitted, they are highly concentrated, collide frequently and coagulate to much larger sizes than expected.

"To create the desired climate outcomes, you would need to insert roughly 10 million tons of optimally-sized particles into the stratosphere," he said. "You would have to disperse these particles very quickly over the entire stratosphere or they would coagulate into much larger sizes. At such enhanced sizes, the particles do not have the same effect; they're much less effective in forcing climate compensation. In the end, you would have to fly thousands of high-altitude jets every day to get enough particles into the atmosphere to achieve your goal. And this activity would have to be sustained for hundreds of years."

The basic idea behind stratospheric particle injections is that the Earth's temperature depends on the reflectivity of the atmosphere. About one-third of the energy from the sun hitting the Earth is reflected back into space. That fraction is called the "albedo." If the albedo increases, the average global temperature decreases because less energy is available to warm the planet. So if we can increase the albedo sufficiently, we can compensate for global warming.
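
This is the standard zero-dimensional energy balance: the effective emission temperature is T = [S(1-a)/(4s)]^(1/4) for solar constant S, albedo a and Stefan-Boltzmann constant s. A minimal sketch (not the researchers' model; the greenhouse effect keeps the actual surface roughly 33 K warmer than this effective value):

```python
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S0    = 1361.0     # solar constant, W m^-2

def t_effective(albedo):
    """Effective emission temperature of a zero-dimensional Earth."""
    return (S0 * (1 - albedo) / (4 * SIGMA)) ** 0.25

print(t_effective(0.30))                      # ~255 K with today's albedo
print(t_effective(0.30) - t_effective(0.31))  # ~0.9 K of cooling per +0.01 albedo
```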

"The size distribution of the particles is critical," Turco said. "If the particles are too large, that will actually create a warming effect, a greenhouse warming. Small particles are not useful because they don't reflect much radiation; you need something in between, and we have shown that is hard to achieve reliably."

Turco and Yu have simulated, for the first time, the actual injection processes that might be used, focusing on the early evolution of the injection plumes created from aircraft or balloon platforms. They used an advanced computer model developed by Yu to calculate the detailed microphysical processes that ensue when reactive, particle-forming vapors are emitted into the atmosphere. They also accounted for the photochemical reactions of the injected vapors, as well as the mixing and dilution of the injection plume.

"We found that schemes to emit precursor gases in large quantities would be extremely difficult to design and implement within the constraints of a narrow tolerance for error, and in addition, the outcomes would be very sensitive to variables over which we would have little control, such as the stability and mixing conditions that occur locally," Turco said.

"Advocates of geoengineering have tried to make climate engineering sound so simple," he added. "It's not simple at all. We now know that the properties and effects of a geoengineered particle layer in the stratosphere would be far more unpredictable, for example, than the physics of global warming associated with carbon dioxide emissions. Embarking on such a project could be foolhardy."

How can global warming be combated?

"We must reduce carbon emissions," Turco said. "We need to invest big-time in alternative energy sources with minimal carbon footprints."

The research is federally funded by the National Science Foundation.


Adapted from materials provided by University of California Los Angeles.

University of California – Los Angeles (2008, December 27). No Quick Or Easy Technological Fix For Climate Change, Researchers Say. ScienceDaily. Retrieved December 27, 2008, from http://www.sciencedaily.com/releases/2008/12/081217190429.htm

Stronger Coastal Winds Due To Climate Change May Have Far-reaching Effects


ScienceDaily (Dec. 22, 2008) — Future increases in wind strength along the California coast may have far-reaching effects, including more intense upwelling of cold water along the coast early in the season and increased fire danger in Southern California, according to researchers at the Climate Change and Impacts Laboratory at the University of California, Santa Cruz.

Earth scientist Mark Snyder will present the findings in a poster titled "Future Changes in Surface Winds in the Western U.S. due to Climate Change" at the Fall Meeting of the American Geophysical Union (AGU) in San Francisco on Friday, December 19, 2008.

Snyder's group used a regional climate model to study how the climate along the U.S. West Coast might change in the future as a result of global warming driven by increasing atmospheric concentrations of greenhouse gases. The results suggest that a general increase in wind speeds along the coast is likely to accompany regional changes in climate.

"What we think is going on is that land temperatures are increasing at a faster rate than the ocean temperatures, and this thermal gradient between the land and the ocean is driving increased winds," Snyder said.

The researchers conducted multiple runs of their regional model to compare simulations of the coastal climate for two time periods: 1968 to 2000 ("modern climate") and 2038 to 2070 ("future climate"). The regional model was driven by input from the global climate models used in the most recent report of the Intergovernmental Panel on Climate Change (IPCC AR4). The future climate projections were based on a "high-growth" emissions scenario (A2) thought to provide an upper range of possible future climates, although Snyder noted that recent global carbon dioxide emissions have exceeded even the highest projections of earlier IPCC reports.

The results showed increases in wind speeds of up to 2 meters per second, which is a large change in relation to current average wind speeds of about 5 meters per second, Snyder said. One effect of these increased winds may be earlier and more intense upwelling of cold water along the coast.

Upwelling is generally a good thing, bringing up nutrient-rich deep water to support thriving coastal ecosystems. But researchers think too much upwelling may be causing the massive "dead zone" that has begun to appear with alarming regularity off the Oregon coast. According to an earlier study by Oregon researchers, intense upwelling driven by stronger, more persistent winds stimulates excessive growth of phytoplankton (microscopic algae), which ultimately sink to the bottom and decompose, sucking oxygen out of the bottom waters.

Snyder said these conditions may become more prevalent in the future, and stronger winds all along the coast may cause the Oregon dead zone to expand into California waters.

Strong winds can also create extremely hazardous fire conditions, as was seen this fall in Southern California. On the positive side, strong winds would be good for the growing wind energy industry. Snyder also noted that an enhanced sea breeze during the warm months of the year has a cooling effect along the coast. Such a cooling trend could have many ramifications, particularly for coastal species adapted to seasonal changes in temperatures and fog, he said.

Snyder's coauthors are graduate student Travis O'Brien and Lisa Sloan, professor of Earth and planetary sciences and director of the Climate Change and Impacts Lab.


Adapted from materials provided by University of California - Santa Cruz.

University of California - Santa Cruz (2008, December 22). Stronger Coastal Winds Due To Climate Change May Have Far-reaching Effects. ScienceDaily. Retrieved December 25, 2008, from http://www.sciencedaily.com/releases/2008/12/081219172037.htm

Ray Hudson: Material matters and the search for resilience: Rethinking regional and urban development strategies in the context of global environmental change

International Journal of Innovation and Sustainable Development, 2008, Vol. 3, No. 3/4, pp. 166–184.

Material matters and the search for resilience: Rethinking regional and urban development strategies in the context of global environmental change

Ray Hudson
(Durham University, DH1 3HP, U.K.)

Abstract

Recently it has become clear that environmental pollution caused by human activities is expanding on a scale previously unimagined. The growth of greenhouse gases is producing perhaps irreversible changes, with potentially apocalyptic consequences for the ecological systems on which life on earth as we know it depends. In part this growth reflects neoliberal strategies for urban and regional development that seek to maximise the global movement of people and things. This raises questions about how we think – or perhaps more accurately should think – about regional and urban development and about possible transitions to more resilient and sustainable cities and regions as a necessary element in a transition to a more resilient and sustainable planet. Can those who live in the core cities and regions of the affluent global 'north' continue to rely on the global movements of commodities and people from distant regions to sustain their lifestyles? Can they assume that the wastes produced via their activities can continue to be dumped in the global commons or exported to places in the global 'south'? What will the looming global crisis of sustainability entail for both the theory and practice of regional and urban development?

Takayuki Toyama & Alan Stainer: Cosmic Heat Emission concept to 'stop' global warming

International Journal of Global Environmental Issues, 2009, Vol. 9, No. 1/2, pp. 151–168.

Cosmic Heat Emission concept to 'stop' global warming

Takayuki Toyama (Avix Inc., 3-5-26 Miyazaki, Miyamae-ku, Kawasaki, Kanagawa 216 0033, Japan) and Alan Stainer (Middlesex University Business School, The Burroughs, Hendon, London NW4 4BT, U.K.)

Abstract

Global warming, over the last few decades, has become one of the most debated challenges facing society. There have been very few innovative remedies to combat this phenomenon, which could lead to ecological disaster. A remedial concept is proposed to 'stop', or at least minimise, global warming and cool the Earth. This can be achieved by focusing on the primary heat balance between the heat generated by human activities and the equivalent Cosmic Heat Emission. The process relates to putting Heat Reflecting Sheets on arid areas, thus combining anti-desertification measures with comparatively minimal costs as well as relatively quick realisation and positive results. A feasibility study based on this notion would provide a warning and a back-up move against radical actions that might be detrimental to the future of planet Earth and its sustainability.

Link to abstract: http://www.inderscience.com/search/index.php?action=record&rec_id=22093

Fix For Global Warming? Scientists Propose Covering Deserts With Reflective Sheeting


ScienceDaily (Dec. 23, 2008) — A radical plan to curb global warming and so reverse the climate change caused by our rampant burning of fossil fuels since the industrial revolution would involve covering parts of the world's deserts with reflective sheeting, according to researchers writing in the International Journal of Global Environmental Issues.

Engineers Takayuki Toyama of the company Avix, Inc., in Kanagawa, Japan, and Alan Stainer of Middlesex University Business School, London, UK, complain that very few innovative remedies have been discussed to combat the phenomenon of global warming caused by human activities, despite the widespread debate of the last few decades. They suggest that uncompromising proposals are now needed if we are to avert ecological disaster.

'Stopping', or at least minimising, global warming, and even cooling the Earth, can be achieved by focusing on the primary heat balance between the amount of heat produced by human activities and the loss of heat to outer space, they argue. They emphasise that efforts to reduce atmospheric concentrations of greenhouse gases, primarily carbon dioxide, are not likely to work soon enough.

"Pessimism that minimising carbon dioxide will no longer solve the problem seems to be spreading among environmental specialists," they say. As such, they propose a lateral-thinking approach that acknowledges the fact that the heat created by human activities does not amount to even 1/10,000th of the heat that the Earth receives from the sun.

Toyama and Stainer suggest that heat-reflecting sheets could be used to cover arid areas, not only reflecting the sun's heat back into space by increasing the Earth's overall reflectivity, or albedo, but also acting as an anti-desertification measure. The technology would have relatively minimal cost and lead to positive results quickly. They add that the same approach might also be used to cover areas of the oceans to increase the Earth's total heat reflectivity.

The team's calculations suggest that covering an area of a little more than 60,000 square kilometres with reflective sheet, at a cost of some $280 billion, would be adequate to offset the heat balance and lead to a net cooling without any need to reduce atmospheric carbon dioxide. However, they caution that it would be necessary to control the area covered very carefully to prevent overcooling and to continue with efforts to reduce our reliance on fossil fuels.
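
As a rough consistency check on those figures (using assumed round numbers, not the paper's own workings), an area of that size does sit in the right range to offset the direct heat released by human energy use:

```python
# Back-of-envelope check with assumed round numbers (not from the paper)
human_heat   = 1.5e13    # W, direct waste heat from world energy use (~15 TW)
sheet_area   = 6.0e10    # m^2, roughly 60,000 square kilometres of sheeting
mean_insol   = 250.0     # W/m^2, approximate 24-hour mean insolation over deserts
delta_albedo = 0.5       # assumed reflectivity gain of sheeting over bare sand

extra_reflected = sheet_area * mean_insol * delta_albedo
print(f"extra reflected: {extra_reflected:.1e} W  vs  human heat: {human_heat:.1e} W")
# -> ~7.5e12 W, the same order of magnitude as direct anthropogenic heat
```

Note that this balances only direct waste heat; the radiative forcing from accumulated greenhouse gases is far larger, which underlines the authors' own caution that the scheme is a back-up rather than a substitute for emission cuts.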


Journal reference:

  1. Toyama, T., and A. Stainer. Cosmic Heat Emission concept to 'stop' global warming. International Journal of Global Environmental Issues, 2009; 9 (1/2): 151–168. DOI: 10.1504/IJGENVI.2009.022093
Adapted from materials provided by Inderscience, via AlphaGalileo.

Inderscience (2008, December 23). Fix For Global Warming? Scientists Propose Covering Deserts With Reflective Sheeting. ScienceDaily. Retrieved December 25, 2008, from http://www.sciencedaily.com/releases/2008/12/081222114546.htm

New York Times: The E.P.A.'s Doctor No (outgoing E.P.A. Administrator Stephen Johnson)

New York Times Editorial, December 24, 2008

E.P.A.’s Doctor No

On April 2, 2007, the Supreme Court ruled that the federal Clean Air Act plainly empowered the Environmental Protection Agency to regulate greenhouse gases from cars and trucks — and, by inference, other sources like power plants.

There was great hope at the time that the decision would force President Bush to confront the issue of climate change, which he had largely ignored for six years. Instead, it became the catalyst for a campaign of scientific obfuscation, political flimflam and simple dereliction of duty — which United States Senator Barbara Boxer aptly described as a “master plan” — to ensure that the administration did as little as possible.

The guiding intelligence behind the master plan has been Vice President Dick Cheney; Mr. Cheney’s point man, in turn, has been Stephen Johnson, the administrator of the Environmental Protection Agency.

It was Mr. Johnson who refused to grant California a normally routine waiver that would have allowed it to impose its own greenhouse gas standards on cars and trucks. It was Mr. Johnson who was trotted out to explain why the administration could not possibly fulfill the Supreme Court’s mandate before leaving office.

And it was Mr. Johnson, in one final burst of negativity, who declared last week that his agency was under no obligation to even consider greenhouse gas emissions when deciding whether to allow a new coal-fired power plant to go forward.

The case involved a proposed power plant in Utah and turned on an arcane regulatory question: whether a Clean Air Act provision requiring the monitoring of carbon-dioxide emissions also meant that they could be controlled. This sort of question would normally prompt careful review. Mr. Johnson responded with a 19-page memorandum that added up to one word: No.

Senator Boxer has gone so far as to call Mr. Johnson’s peremptory judgment illegal.

The only thing saving this administration from a total wipeout on clean air issues was a timely decision on Tuesday from the Court of Appeals for the District of Columbia Circuit. It temporarily reinstated the Clean Air Interstate Rule — a 2005 regulation aimed at reducing soot and smog, and the most important clean air proposal to emerge from Mr. Bush’s E.P.A. In July, the court found the rule deficient on several counts, but on appeal it decided that a flawed rule was better than none at all.

So there we have it. One original initiative in eight years, saved at the bell. That’s a poor showing, and the Democrats are hardly alone in hoping for better under an Obama administration. Last week, two prominent moderate Republicans — William K. Reilly, who ran the E.P.A. under President George H.W. Bush, and William D. Ruckelshaus, who served as administrator under both Presidents Richard Nixon and Ronald Reagan — sent a little-noticed but eloquent letter to President-elect Barack Obama.

The gist of the letter was that the E.P.A. could be an enormously positive force in the fight against climate change and oil dependency. All it needed was someone who believed in its mission and was prepared to use the laws already on the books. Granting California its waiver, carrying out the Supreme Court decision, regulating emissions from vehicles and power plants — all this and more, they wrote, could be accomplished with the statutory tools at hand.

This exhortation from two veterans of the environmental wars was designed to encourage not only Mr. Obama, but also Lisa Jackson, the woman he has chosen to run the agency. It was also, however, an arrow aimed at the ideologues who have been running the agency for the last half-dozen years — and a lament for how little they have done with the weapons Congress gave them.

Link to the New York Times' editorial: http://www.nytimes.com/2008/12/25/opinion/25thu1.html

Washington Post: Faster Climate Change Feared -- New Report Points to Accelerated Melting, Longer Drought

Faster Climate Change Feared

New Report Points to Accelerated Melting, Longer Drought

by Juliet Eilperin, Washington Post Staff Writer, December 25, 2008; Page A02

The United States faces the possibility of much more rapid climate change by the end of the century than previous studies have suggested, according to a new report led by the U.S. Geological Survey.

The survey -- which was commissioned by the U.S. Climate Change Science Program and issued this month -- expands on the 2007 findings of the United Nations Intergovernmental Panel on Climate Change. Looking at factors such as rapid sea ice loss in the Arctic and prolonged drought in the Southwest, the new assessment suggests that earlier projections may have underestimated the climatic shifts that could take place by 2100.

However, the assessment also suggests that some other feared effects of global warming are not likely to occur by the end of the century, such as an abrupt release of methane from the seabed and permafrost or a shutdown of the Atlantic Ocean circulation system that brings warm water north and colder water south. But the report projects an amount of potential sea level rise during that period that may be greater than what other researchers have anticipated, as well as a shift to a more arid climate pattern in the Southwest by mid-century.

Thirty-two scientists from federal and non-federal institutions contributed to the report, which took nearly two years to complete. The Climate Change Science Program, which was established in 1990, coordinates the climate research of 13 different federal agencies.

Tom Armstrong, senior adviser for global change programs at USGS, said the report "shows how quickly the information is advancing" on potential climate shifts. The prospect of abrupt climate change, he said, "is one of those things that keeps people up at night, because it's a low-probability but high-risk scenario. It's unlikely to happen in our lifetimes, but if it were to occur, it would be life-changing."

In one of the report's most worrisome findings, the agency estimates that in light of recent ice sheet melting, global sea level rise could be as much as four feet by 2100. The IPCC had projected a sea level rise of no more than 1.5 feet by that time, but satellite data over the past two years show the world's major ice sheets are melting much more rapidly than previously thought. The Antarctic and Greenland ice sheets are now losing an average of 48 cubic miles of ice a year, equivalent to twice the amount of ice that exists in the Alps.
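
For scale, an independent back-of-envelope conversion (not from the report) turns that melt rate into sea level rise:

```python
MI3_TO_KM3  = 1.609344 ** 3   # cubic miles to cubic kilometres (~4.17)
OCEAN_AREA  = 3.61e8          # km^2, global ocean surface
ICE_DENSITY = 0.917           # density of ice relative to liquid water

ice_km3   = 48 * MI3_TO_KM3               # ~200 km^3 of ice per year
water_km3 = ice_km3 * ICE_DENSITY         # ~183 km^3 of water equivalent
rise_mm   = water_km3 / OCEAN_AREA * 1e6  # kilometres of rise -> millimetres
print(f"{ice_km3:.0f} km^3/yr of ice  ->  ~{rise_mm:.2f} mm/yr of sea level rise")
```

Half a millimetre a year from the ice sheets alone is a sizeable share of the roughly 3 mm/yr total rise observed by satellite altimetry over this period.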

Konrad Steffen, who directs the Cooperative Institute for Research in Environmental Sciences at the University of Colorado at Boulder and was lead author on the report's chapter on ice sheets, said the models the IPCC used did not factor in some of the dynamics that scientists now understand about ice sheet melting. Among other things, Steffen and his collaborators have identified a process of "lubrication," in which warmer ocean water gets in underneath coastal ice sheets and accelerates melting.

"This has to be put into models," said Steffen, who organized a conference last summer in St. Petersburg, Russia, as part of an effort to develop more sophisticated ice sheet models. "What we predicted is sea level rise will be higher, but I have to be honest, we cannot model it for 2100 yet."

Still, Armstrong said the report "does take a step forward from where the IPCC was," especially in terms of ice sheet melting.

Scientists also looked at the prospect of prolonged drought over the next 100 years. They said it is impossible to determine yet whether human activity is responsible for the drought the Southwestern United States has experienced over the past decade, but every indication suggests the region will become consistently drier in the next several decades. Richard Seager, a senior research scientist at Columbia University's Lamont-Doherty Earth Observatory, said that nearly all of the 24 computer models the group surveyed project the same climatic conditions for the North American Southwest, which includes Mexico.

"If the models are correct, it will transition in the coming years and decades to a more arid climate, and that transition is already underway," Seager said, adding that such conditions would probably include prolonged droughts lasting more than a decade.

The current models cover broad swaths of landscape, and Seager said scientists need to work on developing versions that can make projections on a much smaller scale. "That's what the water managers out there really need," he said. Current models "don't give them the hard numbers they need."

Armstrong said the need for "downscaled models" is one of the challenges facing the federal government, along with better coordination among agencies on the issue of climate change. When it comes to abrupt climate shifts, he said, "We need to be prepared to deal with it in terms of policymaking, keeping in mind it's a low-probability, high-risk scenario. That said, there are really no policies in place to deal with abrupt climate change."

Richard Moss, who directed the Climate Change Science Program's coordination office between 2000 and 2006 and now serves as vice president and managing director for climate change at the World Wildlife Fund-U.S., welcomed the new report but called it "way overdue."

"There is finally a greater flow of climate science from the administration," Moss said, noting that the report was originally scheduled to come out in the summer of 2007. "It really is showing the potential for abrupt climate change is real."

The report is reassuring, however, on the prospects for some potentially drastic effects -- such as a huge release of methane, a potent heat-trapping gas, that is now locked deep in the seabed and underneath the Arctic permafrost. That is unlikely to occur in the near future, the scientists said.

"It's unlikely that we're going to see an abrupt change in methane over the next hundred years, but we should worry about it over a longer time frame," said Ed Brook, the lead author of the methane chapter and a geosciences professor at Oregon State University. "All of these places where methane is stored are vulnerable to leaking."

By the end of the century, Brook said, the amount of methane escaping from natural sources such as the Arctic tundra and waterlogged soils in warmer regions "could possibly double," but that would still be less than the current level of human-generated methane emissions. Over the course of the next thousand years, he added, methane hydrates stored deep in the seabed could be released: "Once you start melting there, you can't really take it back."

In the near term, Brook said, more precise monitoring of methane levels worldwide would give researchers a better sense of the risk of a bigger atmospheric release. "We don't know exactly how much methane is coming out all over the world," he said. "That's why monitoring is important."

While predictions remain uncertain, Steffen said cutting emissions linked to global warming represents one of the best strategies for averting catastrophic changes.

"We have to act very fast, by understanding better and by reducing our greenhouse gas emissions, because it's a large-scale experiment that can get out of hand," Steffen said. "So we don't want that to happen."

Link to article: http://www.washingtonpost.com/wp-dyn/content/article/2008/12/24/AR2008122402174.html

D. Pollard & R. M. Deconto (2008), Modeling West Antarctic Ice Sheet Growth and Retreat Through the Last 5 Million Years

American Geophysical Union, Fall Meeting 2008, abstract #C34B-02; published December 2008

Modeling West Antarctic Ice Sheet Growth and Retreat Through the Last 5 Million Years

D. Pollard (Pennsylvania State University, Earth and Environmental Systems Institute, 2217 Earth-Engineering Science Bldg., University Park, PA 16802, U.S.A.; e-mail: pollard@essc.psu.edu) and R. M. DeConto (University of Massachusetts, Department of Geosciences, 233 Morrill Science Center, Amherst, MA 01003, U.S.A.; e-mail: deconto@geo.umass.edu)

The West Antarctic Ice Sheet, grounded mostly below sea level and fringed by floating ice shelves, is considered to be vulnerable to future anthropogenic warming. However, projections of its future behavior are hampered by limited understanding of past variations and the main forcing mechanisms. Here a combined ice sheet-shelf model with imposed grounding-line fluxes following C. Schoof (J. Geophys. Res., 2007) is used to simulate Antarctic variations over the last 5 million years. We argue that oceanic melting below ice shelves is an important long-term forcing, controlled mainly by far-field influences that can be correlated with deep-sea-core δ18O records. Model West Antarctic configurations range between full glacial extents with grounding lines near the continental shelf break, intermediate states similar to modern, and brief collapses to isolated ice caps on small West Antarctic islands. Transitions between these states can be relatively rapid, taking one to several thousand years. Several aspects of our simulation agree with a sediment record recently recovered beneath the Ross Ice Shelf by ANDRILL (MIS AND-1B core), including a long-term trend from more frequently collapsed to more glaciated states, and brief but dramatic collapses at Marine Isotope Stage 31 (~1 Ma) and other super-interglacials. Higher-resolution nested simulations over the Ross Embayment resolve Siple Coast ice streams, Transantarctic outlet glaciers, and details of shelf flow. Correlations between modeled local conditions near the AND-1B core site and the overall West Antarctic state are examined, along with implications for the AND-1B lithologic record.

Link to abstract: http://adsabs.harvard.edu/abs/2008AGUFM.C34B..02P

Greenland and Siberia see rapid changes: Arctic warming spurs record melting

Published online 17 December 2008 | Nature | doi:10.1038/news.2008.1314

News

Arctic warming spurs record melting

Greenland and Siberia see rapid changes

Record melting in northern Greenland and the widespread release of methane gas from formerly frozen deposits off the Siberian coast suggest that major changes are sweeping the Arctic, researchers say.

The recent observations, reported on 16 December at the autumn meeting of the American Geophysical Union in San Francisco, California, have surprised scientists who — although used to Arctic changes — did not expect to see them so dramatically over the past year.

Link to rest of article in Nature: http://www.nature.com/news/2008/081217/full/news.2008.1314.html

NASA scientist, James Hansen, warns of runaway global warming

NASA scientist warns of runaway global warming

by Michael Le Page, features editor, NewScientist, December 22, 2008

Here's a prediction to take note of: there will be an unambiguous new global temperature record during the first term of the Obama administration.

This prediction comes from leading climate scientist James Hansen of NASA. He made it in response to a question from a member of the audience during a lecture to the American Geophysical Union on 17 December.

The prediction comes at a time when there has been much discussion of average annual global temperatures, and just how hot or cool 2008 really was.

As you will no doubt see from the comments, Hansen is a hate figure for the climate deniers who insist it's getting cooler.

It's a bold and simple prediction, but I'm not sure how helpful it is. Making short-term forecasts is much harder than making long-term ones, and there are some climate researchers who think cyclic changes in the oceans will mask the underlying warming trend for the next five years or so.

What's more, a big volcanic eruption would cause a sharp but temporary dip in temperatures.

If Hansen is right - as he has been before - the deniers will continue to find reasons to persuade themselves it's not true. An ever-popular one is that global warming is all a conspiracy to impose a global government on the world.

If he's wrong, he'll have handed the deniers what might seem like a convincing argument to people who don't know much about the evidence for climate change.

As for the rest of Hansen's lecture, much of it emphasised points he has been making for some time: we should aim to reduce carbon dioxide levels to 350 parts per million, we cannot afford to build more coal power plants (without carbon capture and storage), and a carbon tax with a 100% dividend should be introduced.

However, he also made another striking prediction. According to Hansen, human activity is causing greenhouse gas levels to rise so rapidly that his model suggests there is a risk of a runaway greenhouse effect, ultimately resulting in the loss of oceans and of all life on the planet:

"In my opinion, if we burn all the coal, there is a good chance that we will initiate the runaway greenhouse effect. If we also burn the tar sands and tar shale (a.k.a. oil shale), I think it is a dead certainty."

That's a cheerful thought. Let's end instead with a quote from Hansen's new boss:

"Today, more than ever before, science holds the key to our survival as a planet and our security and prosperity as a nation... It's about ensuring that facts and evidence are never twisted or obscured by politics or ideology. It's about listening to what our scientists have to say, even when it's inconvenient - especially when it's inconvenient."

Michael Le Page, features editor

Link to article: http://www.newscientist.com/blogs/shortsharpscience/2008/12/nasa-scientist-warns-of-runawa.html

James Hansen to Obama on 4th generation, integral fast (IFR) and liquid-fluoride thorium nuclear reactors (LFTR)

From the Brave New Climate blog:

Hansen to Obama Pt III - Fast nuclear reactors are integral

Posted by Barry Brook on 28 November 2008

Nuclear energy? Pah! Too dangerous (risk of meltdown or weapons proliferation), too expensive, too slow to come on line, insufficient uranium reserves to power more than a small fraction of the world’s energy demand, blah di blah blah blah blah. There is certainly plenty of opposition out there to nuclear energy in any way, shape or form. Nuclear is bad news, it’s a distraction, it’s a carry over from the cold war, it’s old school thinking. And so on.

Well, the above is what the majority of environmentalists and pacifists would tell you. And there is some very solid reason for scepticism about the widespread use of nuclear power, especially Generation II nuclear fission reactors (I suggest we keep the ones we’ve got, but don’t bother with any more of them). But in the brave new world of the Sustainability Emergency (climate crisis + energy crisis + water crisis + mineral crisis + biodiversity crisis, etc.), we simply haven’t got time or scope for such hard-line negativity. We need every solution we can lay our hands on — and more for good measure.

Hansen is willing to talk about nuclear energy. I am too — given chronic intermittency issues with large-scale renewables and the need for plenty of extra energy to fix huge looming problems with hanging together a sophisticated civilisation on a habitable planet, it's got to be in the mix. Indeed, in the long run, it, in the form of fusion power, could well be the only form of energy that matters to humanity (if we manage to get through the post-industrial crunch, that is). There are plenty of tantalising prospects for safe, effective, long-term baseload power from 4th+ generation nuclear fission power. But for now, there is just nowhere near enough action ($$ and willpower) on the R&D and roll out front.

Hansen explains this in part III. He also goes into more detail on this issue in his earlier Trip Report, which I also quote below…

————————————————————–

Tell Barack Obama the Truth – The Whole Truth (Part III of IV)

Dr James E. Hansen

Nuclear Power. Some discussion about nuclear power is needed. Fourth generation nuclear power has the potential to provide safe base-load electric power with negligible CO2 emissions.

There is about a million times more energy available in the nucleus, compared with the chemical energy of molecules exploited in fossil fuel burning. In today’s nuclear (fission) reactors neutrons cause a nucleus to fission, releasing energy as well as additional neutrons that sustain the reaction. The additional neutrons are ‘born’ with a great deal of energy and are called ‘fast’ neutrons. Further reactions are more likely if these neutrons are slowed by collisions with non-absorbing materials, thus becoming ‘thermal’ or slow neutrons.
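
Hansen's "million times" figure is straightforward to verify to order of magnitude (a worked example with illustrative numbers: roughly 200 MeV released per U-235 fission, and a typical hard-coal heating value):

```python
MEV_TO_J = 1.602e-13   # joules per MeV
AVOGADRO = 6.022e23    # atoms per mole

# Complete fission of 1 kg of U-235 at ~200 MeV per fission event:
fission_J_per_kg = 200 * MEV_TO_J * AVOGADRO / 0.235   # ~8e13 J/kg
coal_J_per_kg    = 2.9e7                               # typical hard coal, J/kg

print(f"fission: {fission_J_per_kg:.1e} J/kg, coal: {coal_J_per_kg:.1e} J/kg")
print(f"ratio: {fission_J_per_kg / coal_J_per_kg:.0e}")   # ~3e6, i.e. millions
```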

All nuclear plants in the United States today are Light Water Reactors (LWRs), using ordinary water (as opposed to ‘heavy water’) to slow the neutrons and cool the reactor. Uranium is the fuel in all of these power plants. One basic problem with this approach is that more than 99% of the uranium fuel ends up ‘unburned’ (not fissioned). In addition to ‘throwing away’ most of the potential energy, the long-lived nuclear wastes (plutonium, americium, curium, etc.) require geologic isolation in repositories such as Yucca Mountain.

There are two compelling alternatives to address these issues, both of which will be needed in the future. The first is to build reactors that keep the neutrons 'fast' during the fission reactions. These fast reactors can completely burn the uranium. Moreover, they can burn existing long-lived nuclear waste, producing a small volume of waste with a half-life of only several decades, thus largely solving the nuclear waste problem. The other compelling alternative is to use thorium as the fuel in thermal reactors. Thorium can be used in ways that practically eliminate buildup of long-lived nuclear waste.

The United States chose the LWR development path in the 1950s for civilian nuclear power because research and development had already been done by the Navy, and it thus presented the shortest time-to-market of reactor concepts then under consideration. Little emphasis was given to the issues of nuclear waste. The situation today is very different. If nuclear energy is to be used widely to replace coal, in the United States and/or the developing world, issues of waste, safety, and proliferation become paramount.

Nuclear power plants being built today, or in advanced stages of planning, in the United States, Europe, China and other places, are just improved LWRs. They have simplified operations and added safety features, but they are still fundamentally the same type, produce copious nuclear waste, and continue to be costly. It seems likely that they will only permit nuclear power to continue to play a role comparable to that which it plays now.

Both fast and thorium reactors were discussed at our 3 November workshop. The Integral Fast Reactor (IFR) concept was developed at the Argonne National Laboratory, and it has been built and tested at the Idaho National Laboratory. IFR keeps neutrons "fast" by using liquid sodium metal as a coolant instead of water. It also makes fuel processing easier by using a metallic solid fuel form. IFR can burn existing nuclear waste, making electrical power in the process. All fuel reprocessing is done within the reactor facility (hence the name "integral") and many enhanced safety features are included and have been tested, such as the ability to shut down safely under even severe accident scenarios.

The Liquid-Fluoride Thorium Reactor (LFTR) is a thorium reactor concept that uses a chemically stable fluoride salt for the medium in which nuclear reactions take place. This fuel form yields flexibility of operation and eliminates the need to fabricate fuel elements. This feature solves most concerns that have prevented thorium from being used in solid-fueled reactors. The fluid fuel in LFTR is also easy to process, making it straightforward to separate useful fission products, both stable and radioactive. LFTR also has the potential to destroy existing nuclear waste, albeit with less efficiency than in a fast reactor such as IFR.

Both IFR and LFTR operate at low pressure and high temperatures, unlike today's LWRs. Operation at low pressure alleviates much of the accident risk associated with LWRs. Higher temperatures enable more of the reactor heat to be converted to electricity (40% in IFR, 50% in LFTR vs 35% in LWR). Both IFR and LFTR have the potential to be air-cooled and to use waste heat for desalinating water.

Both IFR and LFTR are 100–300 times more fuel efficient than LWRs. In addition to solving the nuclear waste problem, they can operate for several centuries using only uranium and thorium that has already been mined. Thus they eliminate the criticism that mining for nuclear fuel will use fossil fuels and add to the greenhouse effect.

The Obama campaign, properly in my opinion, opposed the Yucca Mountain nuclear repository. Indeed, there is a far more effective way to use the $25 billion collected from utilities over the past 40 years to deal with waste disposal. This fund should be used to develop fast reactors that eat nuclear waste and thorium reactors to prevent the creation of new long-lived nuclear waste. By law the federal government must take responsibility for existing spent nuclear fuel, so inaction is not an option. Accelerated development of fast and thorium reactors will allow the US to fulfill its obligations to dispose of the nuclear waste, and open up a source of carbon-free energy that can last centuries, even millennia.

The common presumption that 4th generation nuclear power will not be ready until 2030 is based on the assumption of "business-as-usual". Given high priority, this technology could be ready for deployment in the 2015–2020 time frame, thus contributing to the phase-out of coal plants. Even if the United States finds that it can satisfy its electrical energy needs via efficiency and renewable energies, 4th generation nuclear power is probably essential for China and India to achieve clear skies with carbon-free power.

————————————————————–

MORE by Hansen on the same topic, with some extra details and a book recommendation for further reading…

Trip Report - Nuclear Power

On one of my trips I read a draft of “Prescription for the Planet” by Tom Blees, which I highly recommend. Let me note two of its topics that are especially relevant to global warming. Blees makes a powerful case for 4th generation nuclear power, the Integral Fast Reactor (IFR). IFR reactors (a.k.a. fast or breeder reactors) eliminate moderating materials used in thermal reactors, allowing the neutrons to move faster. More energetic splitting of nuclei releases more neutrons. Instead of using up less than 1% of the fissionable material in the ore, a fast reactor burns practically all of the uranium. Primary claimed advantages are:

(a) The fuel is recycled on-site, incorporating radioactive elements into new fuel rods. The eventual ‘ashes’ are not usable as fuel or weapons. The radioactive half-life of the ashes is short, their radioactivity becoming less than that of naturally occurring ore within a few hundred years. The volume of this waste is relatively small and can be stored easily either on-site or off-site.

(b) The IFR can burn the nuclear ‘waste’ of current thermal reactors. So we have a supply of fuel that is better than free – we have been struggling with what to do with that ‘waste’ for years. We have enough fuel for IFR reactors to last several centuries without further uranium mining. So the argument that nuclear power uses a lot of fossil fuels during uranium mining becomes moot.

(c) IFR design can be practically failsafe, relying on physical properties of reactor components to shut down in even the most adverse situations, thus avoiding coolant problems of Chernobyl and Three Mile Island, as well as the earthquake problem. The terrorist threat can be minimized by building the reactor below grade and covering it with reinforced concrete and earth.

Wait a minute! If it’s that good, why aren’t we doing it? Well, according to Blees, it’s because, in 1994, just when we were ready to build a demonstration plant, the Clinton Administration cancelled the IFR program. Blees offers a partial explanation, noting that Clinton had used the phrase “You’re pro-nuclear!” to demonize rivals during his campaign, suggesting that Clinton had a debt to the anti-nuclear people. Hmm. The matter warrants further investigation and discussion. It’s not as if we didn’t know about global warming in 1994.

Even more curious is the assertion that Argonne scientists, distraught about the cancellation, were told they could not talk about it (why do I find this easy to believe?). Here too there is no explanation in depth, although Blees notes that the Secretary of Energy, Hazel O'Leary, was previously a lobbyist for fossil fuel companies (my gosh, is everybody in Washington an ex-lobbyist – alligators will go extinct!).

I have always been agnostic on nuclear power. I like to hope that, if our next President gives high priority to a low-loss national electric grid, renewables will be able to take over most of the power generation load. Wind and solar thermal are poised to become big players. IEA's estimate that renewables will only grow from 1% to 2% (by 2030!) can be dismissed due to IEA's incestuous relation with fossil industries – nevertheless, one must have healthy skepticism about whether renewables can take over completely. Maybe an understatement – I'm not certain.

Blees argues that it made no sense to terminate research and development of 4th generation nuclear power. Was it thought that nuclear technology would be eliminated from Earth, and thus the world would become a safer place?? Not very plausible – as Blees points out, several other countries are building or making plans to build fast reactors. By opting out of the technology, the U.S. loses the ability to influence IFR standards and controls, with no realistic hope of getting the rest of the world to eschew breeder reactors. Blees suggests, probably rightly, that this was a political calculation for domestic purposes, a case of dangerous self-deception.

Bottom line: I can’t seem to agree fully with either the anti-nukes or Blees. Some of the anti-nukes are friends, concerned about climate change, and clearly good people. Yet I suspect that their ‘success’ (in blocking nuclear R&D) is actually making things more dangerous for all of us and for the planet. It seems that, instead of knee-jerk reaction against anything nuclear, we need hard-headed evaluation of how to get rid of long-lived nuclear waste and minimize dangers of proliferation and nuclear accidents. Fourth generation nuclear power seems to have the potential to solve the waste problem and minimize the others. In any case, we should not have bailed out of research on fast reactors. (BTW, Blees points out that coal-fired power plants are exposing the population to more than 100 times more radioactive material than nuclear power plants – some of it spewed out the smokestacks, but much of it in slag heaps of coal ash. See http://www.inthesetimes.com/article/3614/dirty_smoke_signals/ re the effect of this waste on Native Americans in the Southwest, as well as ‘Burning the Future,’ above, re the Appalachians.)

I don’t agree with Blees’ dismissal of the conclusion of most energy experts that there is no ‘silver bullet’; they argue that we need a mix of technologies. Blees sees a ‘depleted uranium bullet’ that could easily provide all of our needs for electrical energy for hundreds of years. His argument is fine for pointing out that existing nuclear material contains an enormous amount of energy (if we extract it all, rather than leaving >99% in a very long-lived waste heap), but I still think that we need a range of energy sources. Renewable energies and nuclear power are compatible: they both need, or benefit from, a low-loss grid, as it is more acceptable to site nuclear plants away from population centers, and nuclear energy provides base-load power, complementing intermittent renewables.

BTW, nuclear plants being proposed for construction now in the U.S. are 3rd generation (the ones in operation are mostly 2nd generation). The 3rd generation reactors are simplified (fewer valves, pumps and tanks), but they are still thermal pressurized reactors that require (multiple) emergency cooling systems. France is about to replace its aging 2nd generation reactors with the European Pressurized Reactor (EPR); a prototype is now being built in Finland. According to Blees, OECD ranks EPR as the cheapest electric energy source, cheaper than pulverized coal – that evaluation doubtless presumes use of a standard design, a la the French procedure for its 2nd generation reactors. The prototype in Finland, according to reports, is running behind schedule and over budget – that was also true in the prior generation, yet the eventual standard French reactors have been economical. Current efforts to start construction of 3rd generation nuclear plants in the U.S., so far, do not seem to have achieved a standard design or to have avoided project delays (partly due to public opposition) that drive up costs.

Blees argues that the 4th generation technology basically exists, that the design will be simplified, especially due to the absence of a need for emergency cooling systems. He foresees a standard modular construction of the reactor per se, smaller than earlier generations, which can be built at the factory, shipped to the site, and dropped in the prepared excavation. His cost estimates have this nuclear power yielding cheaper electricity than any of the competition. The system is designed to eliminate long-lived nuclear ‘waste’ and minimize proliferation dangers. There is enough fuel available without further uranium mining to handle electricity needs for several centuries, for whatever fraction of electricity needs cannot be covered by renewable energies. If these claims are anywhere close to being correct, we could phase out use of fossil fuels for electricity generation over the next few decades.

I do not have the expertise or insight to evaluate the cost and technology readiness estimates. The overwhelming impression that I get, reinforced by the 'boron' topic below, is that Blees is a great optimist. But we need some good ideas and optimism. The book contains a lot of interesting insights and tidbits, e.g., there is more energy available in the nuclear material spewed out as waste by coal plants than the amount of energy produced by the coal burning. The book will be available in about a month; see his web site www.prescriptionfortheplanet.com

Link to Brave New Climate blog post: http://bravenewclimate.com/2008/11/28/hansen-to-obama-pt-iii-fast-nuclear-reactors-are-integral/

D. E. Becker et al., Greenland ice sheet outlet glacier front changes: comparison of year 2008 with past years

American Geophysical Union, Fall Meeting 2008, abstract #C11D-0538

Greenland ice sheet outlet glacier front changes: Comparison of year 2008 with past years


D. E. Becker, J. Box, and R. Benson (Byrd Polar Research Center, 1090 Carmack Rd, Columbus, OH 43210, U.S.A.)

Abstract

NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) imagery is used to calculate inter-annual, end-of-summer glacier front area changes at 10 major Greenland ice sheet outlets over the 2000-2008 period. To put the eight recent end-of-summer net annual changes into a longer perspective, glacier front position information from the past century is also incorporated. The largest MODIS-era area changes are losses/retreats, found at the relatively large Petermann Gletscher, Zachariae Isstrom, and Jakobshavn Isbrae; the 2007-2008 net ice area losses were 63.4 sq. km, 21.5 sq. km, and 10.9 sq. km, respectively. Of the 10 largest Greenland glaciers surveyed, the total net cumulative area change from end of summer 2000 to end of summer 2008 is -536.6 sq km, an area loss equivalent to 6.1 times the area of Manhattan Island (87.5 sq km) in New York, USA. Ice front advances are also evident in 2008, at the relatively large and productive (in terms of ice discharge) glaciers Helheim (5.7 sq km), Store Gletscher (4.9 sq km), and Kangerdlugssuaq (3.4 sq km). The largest retreat in the 2000-2008 period was 54.2 sq km at Jakobshavn Isbrae between 2002 and 2003, associated with the disintegration of a floating tongue following a retreat that began in 2001; that retreat has been associated with thinning until flotation is reached, followed by irreversible collapse. The Zachariae Isstrom pro-glacial floating ice shelf loss in 2008 appears to be part of a disintegration trend averaging ~20 sq km per year, with the exception of an advance (6.2 sq km) in 2006. If the Zachariae Isstrom retreat continues, we are concerned that the largest ice sheet ice stream, which empties into Zachariae Isstrom, will accelerate as its front is freed of damming back stress, increasing the ice sheet mass budget deficit in ways that are poorly understood and could be surprisingly large. By approximating the width of the surveyed glacier frontal zones, we determine effective glacier normalized length (L') changes, which will also be presented at the meeting. The narrow Ingia Isbrae advanced the most in L', by 9.2 km in 2006-2007. Jakobshavn decreased the most in L', by 8.0 km in 2002-2003. Petermann decreased in length the most in 2000-2001 (L' = -5.3 km) and again in 2007-2008 (L' = -3.9 km). Helheim Gl. retreated in 2004-2005 by L' = -4.6 km and advanced in 2005-2006 by L' = 4.4 km. The 10-glacier average L' change from end of summer 2000 to end of summer 2008 was 0.6 km. Results from a growing list of glaciers will be presented. We attempt to interpret the observed glacier changes using glaciological theory and regional climate observations.
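
The L' metric in the abstract is simply the frontal area change divided by an approximate frontal-zone width. A minimal sketch (the 16.3 km width below is back-derived from the quoted Petermann numbers, not stated in the abstract):

```python
def normalized_length_change(area_change_sq_km, frontal_width_km):
    """Effective length change L' = frontal area change / approximate width."""
    return area_change_sq_km / frontal_width_km

# Petermann 2007-2008: a 63.4 sq km loss over an assumed ~16 km wide frontal
# zone reproduces the quoted L' of about -3.9 km.
print(normalized_length_change(-63.4, 16.3))   # ~ -3.9 km
```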

Link to abstract: http://adsabs.harvard.edu/abs/2008AGUFM.C11D0538D

Hugues Goosse et al., Consistent past half-century trends in the atmosphere, the sea ice and the ocean at high southern latitudes

Journal: Climate Dynamics
Publisher: Springer, Berlin/Heidelberg
ISSN: 0930-7575 (print); 1432-0894 (online)
DOI: 10.1007/s00382-008-0500-9
Subject Collection: Earth and Environmental Science
SpringerLink Date: Wednesday, December 3, 2008

Hugues Goosse¹, Wouter Lefebvre¹,³, Anne de Montety¹, Elisabeth Crespin¹ and Alejandro H. Orsi²


¹Institut d’Astronomie et de Géophysique G. Lemaître, Université Catholique de Louvain, Chemin du Cyclotron, 2, 1348, Louvain-la-Neuve, Belgium

²Department of Oceanography, Texas A&M University, College Station, TX, USA

³The Vlaams Instituut voor Technologisch Onderzoek (VITO), Boeretang 200, Mol, Belgium

(Received 4 June 2008; accepted 19 November 2008; published online 3 December 2008.)

Abstract

Simulations performed with the climate model LOVECLIM, aided by a simple data assimilation technique that forces a close matching of simulated and observed surface temperature variations, are able to reasonably reproduce the observed changes in the lower atmosphere, sea ice and ocean during the second half of the twentieth century. Although the simulated ice area slightly increases over the period 1980–2000, in agreement with observations, it decreases by 0.5 × 10⁶ km² between the early 1960s and the early 1980s. No direct and reliable sea ice observations are available to firmly confirm this simulated decrease, but it is consistent with the data used to constrain the model evolution as well as with additional independent data in both the atmosphere and the ocean. The simulated reduction of the ice area between the early 1960s and early 1980s is similar to that simulated over the same period in response to the increase in greenhouse gas concentrations in the atmosphere, while the increase in ice area over the last decades of the twentieth century is likely due to changes in atmospheric circulation. However, the exact contributions of external forcing and internal variability to the recent changes cannot be precisely estimated from our results. Our simulations also reproduce the observed oceanic subsurface warming north of the continental shelf of the Ross Sea and the salinity decrease on the Ross Sea continental shelf. Parts of those changes are likely related to the response of the system to the external forcing. Modifications in the wind pattern, influencing the ice production/melting rates, also play a role in the simulated surface salinity decrease.
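The abstract does not spell out the assimilation scheme, so purely as a generic illustration: one simple way to "force a close matching" between simulated and observed surface temperatures is Newtonian relaxation (nudging), in which the model state is pulled toward the observations at each time step with some relaxation timescale. The sketch below is a toy version under that assumption, not the technique actually used with LOVECLIM.

```python
import numpy as np

# Toy nudging (Newtonian relaxation) sketch: at each step the model
# temperature is relaxed toward the observed value with timescale tau.
# This is a generic illustration of data assimilation, not the
# specific scheme used with LOVECLIM.

rng = np.random.default_rng(0)
n_steps = 100
dt = 1.0    # time step (arbitrary units)
tau = 5.0   # relaxation timescale; smaller means tighter matching

# Synthetic "observations": a smooth signal plus noise.
t_obs = 0.5 * np.sin(np.linspace(0, 6, n_steps)) + 0.1 * rng.standard_normal(n_steps)

t_model = np.zeros(n_steps)
for k in range(1, n_steps):
    free_tendency = -0.1 * t_model[k - 1]          # stand-in model physics
    nudge = (t_obs[k - 1] - t_model[k - 1]) / tau  # relaxation toward obs
    t_model[k] = t_model[k - 1] + dt * (free_tendency + nudge)

rms = np.sqrt(np.mean((t_model - t_obs) ** 2))
print(f"RMS model-obs mismatch: {rms:.3f}")
```

The smaller tau is, the more tightly the model tracks the observations, at the cost of suppressing the model's own internal variability.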

Contact: Hugues Goosse (e-mail: hugues.goosse@uclouvain.be, hgs@astr.ucl.ac.be)

Link to abstract: http://www.springerlink.com/content/c6513024kw761472/

D. H. Bromwich et al., Surface and Mid-tropospheric Climate Change in Antarctica

AGU 2008 Fall Meeting

Abstract: C41A-0497

Surface and Mid-tropospheric Climate Change in Antarctica

D. H. Bromwich* (e-mail: bromwich.1@osu.edu; Byrd Polar Research Center, Ohio State University, 1090 Carmack Road, Columbus, OH 43210, U.S.A.),
A. J. Monaghan (e-mail: monaghan@ucar.edu; Research Applications Laboratory, National Center for Atmospheric Research, P.O. Box 3000, Boulder, CO 80307, U.S.A.), and S. R. Colwell (e-mail: src@bas.ac.uk; British Antarctic Survey, High Cross, Madingley Road, Cambridge, CB3 0ET, U.K.)

Near-surface air temperatures and 500-hPa temperatures over Antarctica for 1960-2007 have been reconstructed over the entire continent using manned station observations and radiosonde records, respectively, from the READER database maintained by the British Antarctic Survey. The 50-year trends found in our near-surface temperature reconstruction agree with recent work by others using a variety of spatial extrapolation techniques. It is found that the statistically significant Antarctic Peninsula near-surface warming on an annual basis has spread into West Antarctica, reaching as far east as the Pine Island Bay-Thwaites Glacier region. The warming is most marked in recent years, with 2007 being the warmest year in the 1960-2007 interval. In contrast to the western (eastern) Antarctic Peninsula warming, which is maximized in winter (summer), the warming over West Antarctica is maximized in the spring (SON), and in that season statistically significant warming stretches across all of West Antarctica and into northern Victoria Land. Weak near-surface warming is found over East Antarctica and the continent as a whole on an annual basis, although continental warming in the spring is statistically significant and driven largely by the strong and widespread changes in West Antarctica. The 1960-2007 500-hPa temperature reconstruction is compared to the changes described by Turner et al. (2005), who found strong winter warming in radiosonde records over Antarctica for 1971-2003 but noted greater uncertainty over West Antarctica, where there are few observational constraints.
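The reconstruction relies on spatial extrapolation from a sparse station network. Again purely as a generic illustration (the study's actual techniques are more sophisticated), inverse-distance weighting is about the simplest way to spread sparse station anomalies onto a grid; the station positions and anomaly values below are made up.

```python
import numpy as np

# Illustrative inverse-distance weighting (IDW): one elementary spatial
# extrapolation technique. Not the method used in the study above.

def idw(station_xy, station_vals, grid_xy, power=2.0):
    # Pairwise distances, shape (n_grid, n_stations).
    d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)  # avoid division by zero at station locations
    w = d ** (-power)
    return (w @ station_vals) / w.sum(axis=1)

# Made-up example: 5 stations on a plane (km) with temperature
# anomalies (deg C), interpolated onto a coarse grid.
stations = np.array([[0, 0], [800, 200], [300, 900],
                     [1200, 1100], [600, 500]], dtype=float)
anoms = np.array([0.8, 0.3, 0.5, -0.1, 0.4])

grid = np.array([[x, y] for x in range(0, 1400, 200)
                 for y in range(0, 1400, 200)], dtype=float)
field = idw(stations, anoms, grid)
print(f"Gridded anomaly range: {field.min():+.2f} to {field.max():+.2f} deg C")
```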

Cite:
Bromwich, D. H. et al. (2008), Surface and mid-tropospheric climate change in Antarctica, Eos Trans. AGU, 89(53), Fall Meet. Suppl., Abstract C41A-0497.

AGU 2008: Evidence that Antarctica has warmed significantly over past 50 years

AGU 2008: Evidence that Antarctica has warmed significantly over past 50 years

New research presented at the AGU today suggests that the entire Antarctic continent may have warmed significantly over the past 50 years. The study, led by Eric Steig of the University of Washington in Seattle and soon to be published in Nature, calls into question existing lines of evidence that show the region has mostly cooled over the past half-century.

Steig and colleagues combined satellite thermal infra-red data collected over 25 years with weather station data for the region. Although the satellite data span a shorter time period and are accurate only for blue-sky days, i.e., when there is no cloud cover, they provide high spatial coverage of the region, which cannot be obtained from discrete ground measurements. In contrast, the weather station data provide complete temporal resolution over the past half-century.

Using an iterative process to analyse the data, they found warming over the entire Antarctic continent for the period 1957-2006. Restricting their analysis to 1969 to 2000, a period for which other studies have found a net cooling trend, Steig’s study found slight cooling in east Antarctica, but net warming over west Antarctica.
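The post does not name the iterative method, but a standard family of techniques for merging a short, spatially complete satellite record with long, sparse station records is iterative low-rank infilling of the combined data matrix (an EM-style approach). The following is a bare-bones sketch under that assumption, not the study's actual algorithm.

```python
import numpy as np

# Bare-bones iterative low-rank infilling of a data matrix with gaps
# (rows = years, columns = locations). EM-style: fill gaps, fit a
# low-rank approximation, refill the gaps from the fit, repeat. Only
# an illustration of the general idea, not the study's algorithm.

def infill(data, rank=2, n_iter=50):
    data = data.copy()
    missing = np.isnan(data)
    # Initialize gaps with column (location) means.
    col_means = np.nanmean(data, axis=0)
    data[missing] = np.take(col_means, np.where(missing)[1])
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(data, full_matrices=False)
        s[rank:] = 0.0                   # keep only leading patterns
        approx = (u * s) @ vt
        data[missing] = approx[missing]  # update only the gaps
    return data

# Toy example: 50 "years" x 5 "locations"; the first 30 years of
# locations 3-4 are missing, mimicking a short satellite-era record.
rng = np.random.default_rng(1)
signal = np.outer(np.linspace(-1.0, 1.0, 50), rng.standard_normal(5))
obs = signal + 0.1 * rng.standard_normal((50, 5))
obs[:30, 3:] = np.nan

filled = infill(obs)
slope = np.polyfit(np.arange(50), filled[:, 3], 1)[0]
print(f"Infilled trend at location 3: {slope:+.4f} per step")
```

The low-rank truncation is what lets the dense-but-short part of the record supply the spatial covariance patterns used to fill the gaps in the long, sparse part.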

As well as uncovering evidence of warming over a wider region than previous studies have shown, the researchers found that warming occurred throughout the year and was greatest in winter and spring. In contrast, cooling over east Antarctica was restricted to autumn.

They independently confirmed these trends by using data from automatic weather stations, and excluding the satellite data.

Overall, the study suggests that warming is not limited to the Antarctic Peninsula region. Steig says their findings are backed up by recent results from David Bromwich of the Byrd Polar Research Centre at Ohio State University, also presented at this meeting, and by a climate modelling study using data assimilation from Hugues Goosse of the Université Catholique de Louvain in Belgium and colleagues, which is due to be published in the journal Climate Dynamics.

The authors speculate that the warming trend may be due to shifts in circulation coupled with sea ice changes.

Olive Heffernan

Link to Nature blog post: http://blogs.nature.com/climatefeedback/2008/12/agu_2008_evidence_that_antarct.html

Climate Progress blog: Another AGU stunner -- Evidence that Antarctica has warmed significantly over past 50 years

Another AGU stunner: Evidence that Antarctica has warmed significantly over past 50 years

Scientists know the Antarctic ice sheet is losing mass “100 years ahead of schedule” (see “AGU 2008: Two trillion tons of land ice lost since 2003” and “Antarctic ice sheet hits the fan”).

Now, as Nature’s climate blog reports, two studies presented last week at the AGU meeting document what should not be a surprise, but still is. New research suggests “the entire Antarctic continent may have warmed significantly over the past 50 years”:

The study, led by Eric Steig of the University of Washington in Seattle and soon to be published in Nature, calls into question existing lines of evidence that show the region has mostly cooled over the past half-century.

they found warming over the entire Antarctic continent for the period 1957-2006. Restricting their analysis to 1969 to 2000, a period for which other studies have found a net cooling trend, Steig’s study found slight cooling in east Antarctica, but net warming over west Antarctica.

How did they perform the analysis of the harshest, most remote climate on the planet?

Steig and colleagues combined satellite thermal infra-red data collected over 25 years with weather station data for the region. Although the satellite data span a shorter time period and are accurate only for blue-sky days, i.e., when there is no cloud cover, they provide high spatial coverage of the region, which cannot be obtained from discrete ground measurements. In contrast, the weather station data provide complete temporal resolution over the past half-century….

They independently confirmed these trends by using data from automatic weather stations, and excluding the satellite data.

It turns out theirs was not the only study presented at the AGU meeting to find that warming extends beyond the Antarctic Peninsula region. David Bromwich of the Byrd Polar Research Centre at Ohio State also presented at AGU his new study, “Surface and Mid-tropospheric Climate Change in Antarctica,” which found:

Near-surface air temperatures and 500-hPa temperatures over Antarctica for 1960-2007 have been reconstructed over the entire continent using manned station observations and radiosonde records, respectively, from the READER database maintained by the British Antarctic Survey. The 50-year trends found in our near-surface temperature reconstruction agree with recent work by others using a variety of spatial extrapolation techniques. It is found that the statistically significant Antarctic Peninsula near-surface warming on an annual basis has spread into West Antarctica, reaching as far east as the Pine Island Bay-Thwaites Glacier region.

The warming is most marked in recent years with 2007 being the warmest year in the 1960-2007 interval…. The warming over West Antarctica is maximized in the spring (SON) and in that season statistically significant warming stretches across all of West Antarctica and into northern Victoria Land. Weak near-surface warming is found over East Antarctica and the continent as a whole on an annual basis although continental warming in the spring is statistically significant and driven largely by the strong and widespread changes in West Antarctica.

How credible and comprehensive is this research? It is based on a three-year study funded by the National Science Foundation’s Office of Polar Programs (Glaciology), in which Bromwich worked with the U.S. National Center for Atmospheric Research and the British Antarctic Survey. As Bromwich explains on his website, he blended model data and observations “to reconstruct a record of Antarctic near-surface temperature back to 1960”:

Considering that there are only 15 long-term observational records of near-surface temperature over the entire continent of Antarctica (1-1/2 times the size of the U.S.), this record fills important gaps in our current knowledge of the spatial and temporal variability of Antarctic near-surface temperatures. Only two other such observational records that depict temperatures over the entire continent exist. We have collaborated with the creators of the other two datasets … to perform the most comprehensive evaluation of Antarctic near-surface temperatures yet….

The key finding is that temperatures over most of Antarctica have been warming subtly since the early 1990s, consistent with a leveling-off of trends in the Southern Hemisphere Annular Mode over the same period. This result contrasts with most recent results, which indicate that temperatures over Antarctica (other than on the Antarctic Peninsula) haven’t changed much in recent decades.

So notwithstanding the amateur meteorologist-deniers who sometimes comment on this blog and elsewhere about how cold it is outside right now, the whole damn planet is warming and melting.

“The science is beyond dispute… Delay is no longer an option. Denial is no longer an acceptable response”


2 Responses to “Another AGU stunner: Evidence that Antarctica has warmed significantly over past 50 years”

  1. mauri pelto Says:

    Bromwich is the person to trust on this. I taught a course on Antarctic Climate glacier interactions at the University of Maine in 1987 and his work was the best then for the continent and still is when it comes to winds and temperature.

  2. Johnny Rook Says:

    Meanwhile at the other pole, have you seen the trend line for sea-ice refreezing at NSIDC recently? After hovering well above last year’s refreeze rate for several months, it has now crossed below last year’s refreeze line, which puts it even farther below the 1979-2000 average.

    Last year we had lots of thin first-year ice, which is why there was so much summer ice melt even though temperatures were cooler. If this current trend continues we’re going to have lots of thin first- and second-year ice but less ice refreezing overall. And then if we get an El Niño come spring…

    2008 Arctic Sea-Ice Refreeze Trend Line Dropping Below Last Year’s

Link to this post on the Climate Progress blog: http://climateprogress.org/2008/12/22/another-agu-stunner-evidence-that-antarctica-has-warmed-significantly-over-past-50-years/

Clean Coal Technology and the Canary in the Coal Mine

Awesomely cute, if it were not so sad:

http://www.thisisreality.org/#/?p=facility

James Hansen's AGU presentation "The Venus Syndrome"

Eduardo Zorita: 13 of last 17 years hottest since 1880 statistically unlikely in stable climate

Glut of hot years a coincidence? Fat chance

by Catherine Brahic, NewScientist, December 17, 2008

Thirteen of the hottest years since records of global temperatures began in 1880 have clustered in the last 17 years. It is tempting – and it sure makes good headlines – to blame it on climate change. But does science support such a claim?

According to new statistical research, it does. The recent glut of unusually hot years is incredibly unlikely to happen in a stable climate.

Eduardo Zorita of Germany's Institute for Coastal Research and colleagues calculated the probability of this happening in a range of scenarios.

A key consideration is that the weather one year is not independent of the weather the year before. If it were, the odds of having any given temperature would be the same each year, and the likelihood of getting such a 17-year cluster would be tiny – on the order of 1 in 10 trillion.
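A back-of-envelope version of that independence calculation, under assumptions of my own (129 interchangeable years for 1880-2008, with all 13 warmest years required to land in the final 17; the paper's actual null is formulated differently): the count is hypergeometric, and this toy setup lands in the same ballpark as the quoted figure.

```python
from math import comb

# Back-of-envelope check of the "1 in 10 trillion" figure under
# independence. Assumption (mine, not the paper's exact setup): with
# n_years interchangeable years, the probability that all k of the
# k warmest fall inside the final window of w years is
#     C(w, k) / C(n_years, k).

n_years = 129  # 1880-2008 inclusive
window = 17
k = 13         # record-warm years observed in the window

p = comb(window, k) / comb(n_years, k)
print(f"P(all {k} warmest in last {window} of {n_years} years) = {p:.1e}")
```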

Natural memory

"An anomalous warm year tends to be followed by a warm year," says Zorita, because of the way oceans store heat and release it slowly. "A devil's advocate could argue that the clustering of warmest years at the end of the record could be simply due to chance, since the climate system has a natural memory."

However, even when Zorita included this natural feedback in his model but excluded global warming, the odds of observing the cluster of record-breaking years were still about 1 in 10,000.
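And a toy Monte Carlo of the "natural memory" case, again under assumptions of my own: an AR(1) null with an assumed lag-1 autocorrelation of 0.6 (Zorita and colleagues also test a long-memory null, and the answer moves with the assumed autocorrelation).

```python
import numpy as np

# Toy Monte Carlo of an AR(1) "natural memory" null: in a stationary
# 129-year series with lag-1 autocorrelation phi, how often do all 13
# of the 13 warmest years fall in the last 17? Both phi and the event
# definition are my assumptions, not the paper's exact setup.

rng = np.random.default_rng(42)
n_years, window, k = 129, 17, 13
phi = 0.6
n_trials = 100_000  # raise for a tighter estimate of a rare event

noise = rng.standard_normal((n_trials, n_years))
series = np.empty_like(noise)
series[:, 0] = noise[:, 0]
for t in range(1, n_years):
    series[:, t] = phi * series[:, t - 1] + noise[:, t]

warmest = np.argsort(series, axis=1)[:, -k:]  # indices of k warmest years
hits = np.count_nonzero(np.all(warmest >= n_years - window, axis=1))
print(f"Estimated probability: {hits} / {n_trials} = {hits / n_trials:.1e}")
```

With a rare event and this trial count the hit total will be small and the estimate noisy; the point is only that memory makes the clustering far more likely than under independence while still leaving it very improbable.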

“We cannot ascribe the anomaly to any particular physical factor, like anthropogenic greenhouse gases,” says Zorita. “But our conclusions are consistent with those of the fourth IPCC report,” which states there is a very high probability that human emissions are causing global warming.

Journal reference (not yet accessible): Geophysical Research Letters (DOI: 10.1029/2008GL036228, in press).

Link to article: http://www.newscientist.com/article/dn16292-glut-of-hot-years-a-coincidence-fat-chance.html