Archive for October 2009

Sea level rises would flood Philly…and NYC and DC and Miami

by Barbara Kessler,  Green Right Now, October 20, 2009


By now you’ve heard the dire predictions for how sea level rise would affect Miami. Basically this city, already imperiled by worsening hurricanes, is in the bulls-eye for rising oceans too.

But did you realize that a one meter sea level increase — now believed by many scientists to be a likely outcome of global warming by 2100 — would put Philadelphia underwater?
Greenland Ice Flow (Photo: NASA)

Yes, the City of Brotherly Love would be among the large family of coastal cities potentially devastated by coastline changes. And not in the too-distant future either.

According to glacier and ice shelf expert Dr. Gordon Hamilton, Philadelphia could experience troubles decades before that 2100 benchmark if storm surges pushed rising oceans inland.

In other words, there is no magic threshold when the seas, warmed by the atmosphere and swelled by melting ice sheets, will spill over their old boundaries. There is a steady creep occurring now. But flooding, hastened by storms, could happen well before the oceans reach the one-meter increase (absent any serious human action to slow the current progression).

Hamilton, a research professor at the University of Maine who studies melting ice sheets in Greenland and Antarctica, and Dr. Asa Rennermalm, a Rutgers University professor who studies Arctic and Greenland ice sheets, are kicking off a lecture tour today to spread the news about how the oceans are rising even faster than projected just a couple of years ago.

The first talk was this morning at the Wagner Free Institute in Philadelphia, followed by a demonstration at the Adventure Aquarium in Camden, N.J. Subsequent engagements will take the pair to Miami, Washington, New York City and several other cities. The tour, dubbed the “hip boot tour” to emphasize the reality of the coming floods, is sponsored by Clean Air-Cool Planet, a non-profit dedicated to fighting global warming.

None of the cities where the scientists will be speaking will be spared by rising sea levels, and neither will most mega-cities around the globe, because so many population centers sit on the coast or on rivers that lead directly to it. Cities like Paris. And Philadelphia.

Talking to Hamilton is a bit like previewing one of those apocalyptic movies where the world suffers from monster storms, vast floods, temperature changes and incredible destruction of infrastructure.

At a one-meter rise, for instance, the subway entrances in Manhattan would be at the water level, which means the subways would be inundated, permanently, said Dr. Hamilton, whose degree is in geophysics.

One doesn’t need a degree in geophysics to understand the consequences of the nation’s financial capital being underwater. Having St. Louis and Chicago on dry ground would not ameliorate the devastation to humans and world trade.

In Philadelphia, a 1 meter increase would flood the downtown district and areas along the river. Harbor trade would be shut down and on the east side, Camden, N.J., would be inundated. Across New Jersey, aquifers would likely be contaminated with sea water.

Neighborhoods at higher elevations north and west of Philadelphia would remain dry.
Parts of Florida at 33 feet above sea level and below are shown flooded (Image: NASA)

In Miami, nothing would be unaffected. A one-meter sea level rise would put most of the city underwater, and it wouldn’t be alone. “Most of Florida’s big cities would be severely affected,” Hamilton said. Models overlaid on satellite images show Miami, the Keys, St. Petersburg and Tampa under water. The Everglades would become a saltwater marsh and aquifers in the state would become brackish or completely salinated.
Hamilton says he shows people how their city’s coastline would change, but also tries to get local audiences to see the global nature of the problem. “Not only are you flooding downtown DC, but hundreds of millions of people in Southeast Asia like Bangladesh,” he said.

The key point of the tour is not just to demonstrate impending devastation, but to explain that the threat is more imminent than was predicted by the Intergovernmental Panel on Climate Change (IPCC) just two years ago.
In 2007, the IPCC warned that sea levels would rise by a little more than half a meter, and possibly more.

Even at that less drastic increase, the “impacts are virtually certain to be overwhelmingly negative,” scientists wrote.

That prediction was based on the best available science.

What didn’t make the report, Dr. Hamilton said, was that in 2005, geophysicists studying the freshwater ice sheets in Greenland and changes in Antarctica had witnessed an alarming quickening in the speed of some glaciers as they carried ice toward the ocean.

In Greenland, some of these rivers of ice “were doing these crazy things,” he said. Some were moving 45 meters in a day — about half the length of a football field. In glacial terms, they were moving very fast. You could hear the ice cracking, he said.

“Almost overnight, in the course of 9 to 10 months, they started moving about three times faster than they had been,” Dr. Hamilton said.
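A quick sanity check on the speeds quoted above. The football-field length is an assumption on my part (100 yards, not stated in the article):

```python
# Back-of-envelope check on the Greenland glacier speeds quoted above.
m_per_day = 45.0           # peak speed quoted for some outlet glaciers
football_field_m = 91.44   # assumed: 100 yards in meters

fields_per_day = m_per_day / football_field_m
km_per_year = m_per_day * 365 / 1000.0

print(f"{fields_per_day:.2f} football fields per day")  # ~0.49, about half a field
print(f"{km_per_year:.1f} km per year")                 # ~16.4 km/yr
```

So 45 meters per day is indeed roughly half a football field, and an extraordinary ~16 km per year in glacial terms.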

Scientists know the changes were prompted by global warming, and that the ice melts can grow exponentially, with water in crevasses contributing to the problem. But they still don’t understand what it all means. Some glaciers later slowed, but others sped up, Hamilton said. The net effect is likely to be a faster melt, with more water raising the ocean levels worldwide.

“Our talks right now are to emphasize that the picture has changed dramatically. If you were to take a consensus among my colleagues who work in Greenland and Antarctica, everybody is likely to say that it (sea rise) is more likely to be a meter.”

If not more.

“Politicians,” he said, “regardless of their political leanings on climate change need to be aware that they’re ethically bound to consider the upper bounds of sea level change…It’s delinquent for people to say they’re going to plan for the minimum (possible change) and then in 50 years time find that huge amounts of their infrastructure is flooded because they didn’t pay attention.”
The lecture tour dates and cities are:
  • Oct. 20 – Philadelphia
  • Oct. 21 – Portland, Maine
  • Oct. 22 – Tampa, Fla.
  • Oct. 23 – Tampa, Fla.
  • Oct. 24 – Miami, Fla.
  • Oct. 27 – Wilmington, N.C.
  • Oct. 28 – Norfolk, Va.

Sea level rise: Current global average is 3.2 mm per year

Sea Level Change Data 
from the University of Colorado at Boulder

Long-term mean sea level change is a variable of considerable interest in the studies of global climate change. The measurement of long-term changes in global mean sea level can provide an important corroboration of predictions by climate models of global warming. Long term sea level variations are primarily determined with two different methods. Over the last century, global sea level change has typically been estimated from tide gauge measurements by long-term averaging. Alternatively, satellite altimeter measurements can be combined with precisely known spacecraft orbits to provide an improved measurement of global sea level change.
Since August 1992 the satellite altimeters have been measuring sea level on a global basis with unprecedented accuracy. The TOPEX/POSEIDON (T/P) satellite mission provided observations of sea level change from 1992 until 2005. Jason-1, launched in late 2001 as the successor to T/P, continues this record by providing an estimate of global mean sea level every 10 days with an uncertainty of 3-4 mm. The latest mean sea level time series and maps of regional sea level change can be found on this site. Concurrent tide gauge calibrations are used to estimate altimeter drift. Sea level measurements for specific locations can be obtained from our Interactive Wizard. Details on how these results are computed can be found in the documentation and the bibliography. Please contact us for further information.
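To put the 3.2 mm/yr altimetry trend in context, a naive linear extrapolation can be sketched. This is an illustration only: the projections discussed in the articles above are much larger precisely because ice-sheet dynamics make the rise non-linear.

```python
# Linear extrapolation of the satellite-altimetry trend (3.2 mm/yr).
# Assumed start year: 2009, the time of these articles.
rate_mm_per_yr = 3.2
years = 2100 - 2009

rise_m = rate_mm_per_yr * years / 1000.0
print(f"Linear trend alone: {rise_m:.2f} m by 2100")  # ~0.29 m
```

The gap between this ~0.3 m and the one-meter-plus estimates in the lectures is the contribution scientists attribute to accelerating ice-sheet loss.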

Link:  http://sealevel.colorado.edu/

New Hampshire could lose entire coastline by 2100, according to Dr. Gordon Hamilton of the Hip-Boot Tour


by Robert M. Cook, October 30, 2009
Rafe Pomerance, president of Clean Air-Cool Planet in Portsmouth, discussed Thursday the important role the U.S. must play in getting the international community to agree to reduce carbon dioxide emissions and limit the damage caused by global warming worldwide. Cook/Democrat photo


 
HAMPTON BEACH — Two research scientists foretold a flooded future for New Hampshire's 18-mile coastline Thursday morning.

The changes, they say, will be caused by global warming and continued ice melt.

By the year 2100, ice melt brought on by warmer ocean temperatures will increase sea level by 3.3 feet and force the relocation of thousands of homes, businesses and resort cottages in Hampton Beach, Seabrook, Rye, and many other coastal communities, according to Dr. Mark Fahnestock, a professor at the Institute for the Study of Earth, Oceans, and Space at the University of New Hampshire in Durham, and Dr. Gordon Hamilton, a research professor at the University of Maine's Climate Change Institute.

The two men told a group of concerned residents and state and local officials they have spent several years studying the continued melting of glaciers in Greenland using satellites in space and physical markers on the ground.

Fahnestock said as ocean temperatures continue to rise, he has seen greater amounts of ice melt along the Greenland coastline into the sea.

He said one glacier has thinned by 600 feet over the last several years.

As a result, "the thinning is changing the way this glacier behaves over time," Fahnestock said.

Hamilton said research shows the glaciers in Greenland and the Arctic Circle can respond very rapidly to changing conditions, whether they are triggered by natural causes or man-made ones such as global warming, which scientists say is caused by excessive carbon dioxide emissions from fossil fuel-burning plants.

Hamilton said the sea level could rise anywhere from 18 to 59 cm by 2100.

As glaciers shed more ice into the ocean, the calving icebergs displace water and swell the sea level, Hamilton explained. He compared it to a typical drink with ice cubes.

"When ice goes into a cocktail, a drink gets bigger," he said.
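The cocktail analogy is why land ice matters so much: ice that slides off land into the ocean raises sea level immediately by displacement. A rough illustration of the scale involved, using commonly cited figures that are assumptions on my part (not from the article):

```python
# Why Greenland's land ice matters: rough sea-level-rise potential.
# Assumed figures: ice sheet volume ~2.85 million km^3, ice density
# 917 kg/m^3, ocean surface area ~361 million km^2.
ice_volume_km3 = 2.85e6
ice_density, water_density = 917.0, 1000.0
ocean_area_km2 = 3.61e8

water_equiv_km3 = ice_volume_km3 * ice_density / water_density
rise_m = water_equiv_km3 / ocean_area_km2 * 1000.0  # km -> m
print(f"Complete Greenland melt: ~{rise_m:.1f} m of sea level rise")  # ~7.2 m
```

Even a small fraction of that reservoir reaching the ocean faster than expected moves the 2100 estimates substantially.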

Hamilton said scientists have also measured how much Greenland's glaciers have accelerated over the last few years and found that in one instance a glacier's speed increased from 5 miles to 8 miles per year, and in another case from 3 miles to 9 miles per year.

The forum at the Ashworth by the Sea hotel on Hampton Beach was organized by Clean Air-Cool Planet of Portsmouth, a group dedicated to reducing carbon dioxide emissions, as part of its "Hip-Boot Tour" of East Coast communities.

Across Ocean Boulevard, the Atlantic Ocean's waves pounded the surf during high tide as the scientists and members of the environmental group discussed how the projected higher sea level could forever change the New Hampshire coastline.

Roger Stephenson, the group's executive director, said they provided a large-scale map showing the sea level could rise as much as 6.6 feet by 2100 based on the data scientists have collected. Areas on the map shaded in blue indicate all of Hampton Beach, North Beach, Seabrook Beach and large swaths of land that extend all the way to Interstate 95 could be underwater by then.
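The figures quoted in these articles are metric estimates expressed in feet: 3.3 feet is one meter and 6.6 feet is two meters. A trivial conversion check:

```python
# Metric-to-imperial check on the sea-level figures quoted in these articles.
M_TO_FT = 3.28084  # feet per meter

for meters in (1.0, 2.0):
    print(f"{meters:.0f} m = {meters * M_TO_FT:.1f} ft")
# 1 m = 3.3 ft
# 2 m = 6.6 ft
```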

Sections of Route 101, Route 1A, Route 1 and thousands of acres of salt marsh would also be affected by the higher sea level.

Steve Miller, coastal training coordinator for the Great Bay National Estuarine Center in Stratham, said more people want to understand what is happening and also understand what action they can take to mitigate the potential loss of coastline, beaches, residences and businesses.

Rafe Pomerance, president of Clean Air-Cool Planet, described the scientists' findings as "game-changing data" he hopes will create the political will necessary in Washington to get the U.S. Senate to pass legislation to create a national cap and trade policy to reduce carbon dioxide emissions.

Under a cap and trade system, which some states like New Hampshire and Maine have approved, businesses that burn fossil fuels have to pay states money for every ton of CO2 they emit, which encourages companies to incorporate more renewable energy.

Thomas Burack, commissioner of the New Hampshire Department of Environmental Services in Concord, said there is much at stake for the state and the region if more is not done to reduce carbon dioxide emissions.

Burack said the state has already seen evidence as to how global warming and climate change are playing out in New Hampshire. He cited the recent floods in 2006, 2007 and 2008, excessive precipitation, and warmer temperatures in the summer and winter.

Burack said doing nothing will lead to food shortages, a loss of ocean fisheries and an increase in mosquito-borne illnesses like West Nile virus. He said some scientists have said climate change will cause New Hampshire to become more like South Carolina.

Burack said many communities should start planning now to adapt to the higher predicted sea level. He said the state joined nine other Northeast states to create the Regional Greenhouse Gas Initiative, and the state also put together a state climate action plan to help communities deal with this issue.

Under the plan, Burack said New Hampshire wants its fossil fuel plants to reduce CO2 emissions 20% by 2025 and then 80% by the year 2050.

"We're going to unfortunately feel the impact of climate change for some time to come," Burack said.

During the question and answer period that followed the panel discussion, Pomerance acknowledged that even if the international community and the U.S. agreed at the global climate summit in Copenhagen, Denmark, in December to greatly reduce future greenhouse gases, it would not reverse the ice melt happening in and around the Arctic and Antarctic circles.

He said the best thing the world can do is to reduce future CO2 emissions to avoid what could be catastrophic climate change. Pomerance said the U.S. must play a key leadership role to get the rest of the world to reduce greenhouse gases.

"Unless we give the signal to the world to get busy, the world will not get busy and it will slow down," he said.


Cook/Democrat photo Dr. Mark Fahnestock of the Institute for the Study of Earth, Oceans and Space at the University of New Hampshire in Durham, discusses Thursday how global warming has caused a greater acceleration of glacier melt in Greenland and how that will cause the sea-level to rise by as much as 3.3 feet along the New Hampshire coastline by the year 2100.

Link:  http://www.fosters.com/apps/pbcs.dll/article?AID=/20091030/GJNEWS_01/710309872

Real Climate: An open letter to Steve Levitt by Raypierre

An open letter to Steve Levitt

— raypierre @ 29 October 2009

Dear Mr. Levitt,

The problem of global warming is so big that solving it will require creative thinking from many disciplines. Economists have much to contribute to this effort, particularly with regard to the question of how various means of putting a price on carbon emissions may alter human behavior. Some of the lines of thinking in your first book, Freakonomics, could well have had a bearing on this issue, if brought to bear on the carbon emissions problem. I have very much enjoyed and benefited from the growing collaborations between Geosciences and the Economics department here at the University of Chicago, and had hoped someday to have the pleasure of making your acquaintance. It is more in disappointment than anger that I am writing to you now.

I am addressing this to you rather than your journalist-coauthor because one has become all too accustomed to tendentious screeds from media personalities (think Glenn Beck) with a reckless disregard for the truth. However, if it has come to pass that we can’t expect the William B. Ogden Distinguished Service Professor (and Clark Medalist to boot) at a top-rated department of a respected university to think clearly and honestly with numbers, we are indeed in a sad way.

By now there have been many detailed dissections of everything that is wrong with the treatment of climate in Superfreakonomics, but what has been lost amidst all that extensive discussion is how really simple it would have been to get this stuff right. The problem wasn’t necessarily that you talked to the wrong experts or talked to too few of them. The problem was that you failed to do the most elementary thinking needed to see if what they were saying (or what you thought they were saying) in fact made any sense. If you were stupid, it wouldn’t be so bad to have messed up such elementary reasoning, but I don’t by any means think you are stupid. That makes the failure to do the thinking all the more disappointing. I will take Nathan Myhrvold’s claim about solar cells, which you quoted prominently in your book, as an example.

As quoted by you, Mr. Myhrvold claimed, in effect, that it was pointless to try to solve global warming by building solar cells, because they are black and absorb all the solar energy that hits them, but convert only some 12% to electricity while radiating the rest as heat, warming the planet. Now, maybe you were dazzled by Mr Myhrvold’s brilliance, but don’t we try to teach our students to think for themselves? Let’s go through the arithmetic step by step and see how it comes out. It’s not hard.

Let’s do the thought experiment of building a solar array to generate the entire world’s present electricity consumption, and see what the extra absorption of sunlight by the array does to climate. First we need to find the electricity consumption. Just do a Google search on “World electricity consumption” and here you are:
(Screenshot: Google search result for world electricity consumption)
Now, that’s the total electric energy consumed during the year, and you can turn that into the rate of energy consumption (measured in Watts, just like the world was one big light bulb) by dividing kilowatt hours by the number of hours in a year, and multiplying by 1000 to convert kilowatts into watts. The answer is two trillion Watts, in round numbers. How much area of solar cells do you need to generate this? On average, about 200 Watts falls on each square meter of Earth’s surface, but you might preferentially put your cells in sunnier, clearer places, so let’s call it 250 Watts per square meter. With a 15% efficiency, which is middling for present technology, the area you need is

2 trillion Watts / (0.15 × 250 Watts per square meter)

or 53,333 square kilometers. That’s a square 231 kilometers on a side, or about the size of a single grid box in a typical general circulation model. If we put it on the globe, it looks like this:
(Image: the solar array's footprint shown on a globe)
So already you should be beginning to suspect that this is a pretty trivial part of the Earth’s surface, and maybe unlikely to have much of an effect on the overall absorbed sunlight. In fact, it’s only 0.01% of the Earth’s surface. The numbers I used to do this calculation can all be found in Wikipedia, or even in a good paperbound World Almanac.
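The arithmetic above is easy to reproduce. A minimal sketch, using raypierre's stated assumptions (2 TW demand, 250 W/m² insolation, 15% efficiency, and a total Earth surface area of about 5.1 × 10¹⁴ m²):

```python
# Reproduce the solar-array area estimate from the letter.
demand_W = 2e12          # world electricity consumption as a rate
insolation = 250.0       # W/m^2 in a sunny, clear location
efficiency = 0.15        # middling present-day cell efficiency

area_m2 = demand_W / (efficiency * insolation)
area_km2 = area_m2 / 1e6
side_km = area_km2 ** 0.5
earth_area_m2 = 5.1e14   # total surface area of the Earth

print(f"Area: {area_km2:,.0f} km^2, a square {side_km:.0f} km on a side")
print(f"Fraction of Earth's surface: {area_m2 / earth_area_m2:.2%}")
# Area: 53,333 km^2, a square 231 km on a side
# Fraction of Earth's surface: 0.01%
```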

But we should go further, and look at the actual amount of extra solar energy absorbed. As many reviewers of Superfreakonomics have noted, solar cells aren’t actually black, but that’s not the main issue. For the sake of argument, let’s just assume they absorb all the sunlight that falls on them. In my business, we call that “zero albedo” (i.e. zero reflectivity). As many commentators also noted, the albedo of real solar cells is no lower than materials like roofs that they are often placed on, so that solar cells don’t necessarily increase absorbed solar energy at all. Let’s ignore that, though. After all, you might want to put your solar cells in the desert, and you might try to cool the planet by painting your roof white. The albedo of desert sand can also be found easily by doing a Google search on “Albedo Sahara Desert,” for example. Here’s what you get:
(Screenshot: Google search result for Sahara Desert albedo)
So, let’s say that sand has a 50% albedo. That means that each square meter of black solar cell absorbs an extra 125 Watts that otherwise would have been reflected by the sand (i.e., 50% of the 250 Watts per square meter of sunlight). Multiplying by the area of solar cell, we get 6.66 trillion Watts.

That 6.66 trillion Watts is the “waste heat” that is a byproduct of generating electricity by using solar cells. All means of generating electricity involve waste heat, and fossil fuels are not an exception. A typical coal-fired power plant is only around 33% efficient, so you would need to release 6 trillion Watts of heat by burning coal to make our 2 trillion Watts of electricity. That makes the waste heat of solar cells vs. coal basically a wash, and we could stop right there, but let’s continue our exercise in thinking with numbers anyway.
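The comparison in the paragraph above, worked through with the letter's own numbers:

```python
# Extra sunlight absorbed by zero-albedo cells vs. the heat released by
# coal generation, using the article's figures.
area_m2 = 5.333e10       # the ~53,333 km^2 array from the thought experiment
extra_absorbed = 125.0   # W/m^2 absorbed beyond what 50%-albedo sand reflects
solar_waste_W = area_m2 * extra_absorbed

demand_W = 2e12
coal_efficiency = 0.33
coal_heat_W = demand_W / coal_efficiency  # total heat released to make 2 TW

print(f"Solar array extra absorption: {solar_waste_W / 1e12:.2f} TW")  # ~6.67 TW
print(f"Coal heat for same electricity: {coal_heat_W / 1e12:.2f} TW")  # ~6.06 TW
```

The two numbers are within about ten percent of each other, which is the sense in which the waste-heat comparison is "basically a wash."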

Wherever it comes from, waste heat is not usually taken into account in global climate calculations for the simple reason that it is utterly trivial in comparison to the heat trapped by the carbon dioxide that is released when you burn fossil fuels to supply energy…

More here:  http://www.realclimate.org/index.php/archives/2009/10/an-open-letter-to-steve-levitt/

Global warming is putting East Coast cities at risk for severe flooding


HAMPTON — Sea levels are rising faster than expected, and coastal cities on the East Coast are at risk for severe flooding, according to Arctic scientists. The Northeast may face a "double whammy" with climate change, too.

The new scientific data was part of the discussion Thursday as local and state officials, representatives and members of the Rockingham Planning Commission united for a community roundtable on climate change and sea-level rise, at the Ashworth Hotel.

The event, hosted by the nonprofit Clean Air-Cool Planet in partnership with Great Bay National Estuarine Research Reserve, was the final stop on the Hip-Boot Tour. Featuring scientists who recently returned from polar climate studies, the eight-stop tour kicked off Oct. 20 in Philadelphia.

"In our time (climate change and sea-level rise) will affect all aspects of human society," said Steven Miller, coordinator of the Coastal Training Program at the Great Bay NERR.

"What we know about climate change today is different than what we knew yesterday. Tomorrow, what we know will be richer and deeper than what we know today," he continued.

According to Miller, the issue is "complex" and "more dialogue within coastal communities (concerning) how to address the issue of adapting locally" as well as conversation broaching the "mitigation of climate change" is essential.

Gordon Hamilton, research associate professor at the University of Maine, who has done extensive work on glaciers in Greenland, told attendees estimates in a 2007 report from the Intergovernmental Panel on Climate Change, which indicated sea-level would rise between 18 and 59 centimeters by 2100, are not realistic.

"Work done in the last couple of years ... (shows) that the amount of sea-level rise by 2100 will be in excess of a meter," he said. "That's an important take-home message."

According to Hamilton, "thermal expansion" — the expansion of fluids in warmer temperatures — and ice melting, are not the only causes of sea-level rise.

"The other way you put mass in the ocean is not in liquid, but in solid form," he said, referencing ice that breaks into the ocean, displacing water, like "ice in a cocktail glass."

Hamilton said some glaciers he has studied are moving "three times faster" than they were just a few years ago.

Additionally, Hamilton said the Northeast seaboard might get hit with climate changes, creating what he called a "double whammy." Aside from "extra mass" from ice in Greenland, the Gulf Stream could be disrupted, slowing down and "piling up" warm water that comes up to the New Hampshire region from the tropics, thus exacerbating the "thermal expansion component."

Mark Fahnstock, research associate professor at the University of New Hampshire, who has also studied Greenland's glaciers extensively, shared videos of glacial flow and told the audience that Antarctica's glaciers are of concern as well.

Relative to Antarctica, "Greenland is a small place with only 10 percent of the ice," he said. "In Antarctica, (glaciers and glacial events) are double or multiplied by four." According to Fahnstock, there has been an "outbreak" of glacial flow into the ocean in western Antarctica.

Rafe Pomerance, founder of the Climate Policy Center and president of CA-CP, spoke to attendees about the upcoming United Nations Climate Change Conference scheduled to take place in Copenhagen Dec. 7-18. He also addressed the Senate's struggle on the climate change issue and attempts to institute a "cap-and-trade system" for greenhouse gas emissions, as well as arctic nations' roles in intervention.

"The U.S. is in a tough position going into Copenhagen," Pomerance said, citing the executive branch's reluctance to sign a treaty if it does not have "the authority to implement what we agree to."

Cliff Sinnott, executive director of the Rockingham Planning Commission, expressed concern with local governments' abilities to plan for and institute policy in anticipation of sea-level rise on the Seacoast.

Sinnott said he wanted to know "how we should be planning from a scientific standpoint. It would be very helpful if there was some consolidation, putting together of information in a way that normal people could evaluate," he said, adding it would be useful to know "what it (scientific data) means in terms of a certain risk and how it translates into policy."

According to Hamilton, officials in London have already set a sea-level rise estimate to be used when considering new policy. Hamilton said the estimate is above anticipated predictions to ensure the safety of the "economy within the coastal floodplain." Hamilton called the practice a "pick a number ... no regrets policy."

Sinnott expressed concern that it takes time to vote through legislation and said "some people do the easiest thing" and delay developing a preparedness plan while waiting for "better information."

Presenters encouraged attendees to stay up-to-date on the latest findings and maintain an open dialogue about climate change and sea-level issues within their communities.

"Tomorrow we will know more. We can't say exactly where we'll be in 2050," Miller said. In terms of the "adaptation and mitigation" strategies, it's important that a dialogue remains open on "all levels — state, federal and local ... connect yourself to the issue."

Sharp Solar Cell



This will not be on the market very soon, I am sure, because of cost issues. However, it means that for critical applications it is possible to achieve 35%, which is a far cry from what can be supplied to the domestic market, which as yet operates between 10 and 15 percent.



As I have posted, there are pretty neat things being accomplished in the lab, and until we transition to a complex three-dimensional nano architecture, the best results will run somewhere between 30 and perhaps 40 percent. This is obviously pretty good. It just is not cheap.



Every lab has understood the goal posts for decades. It is a little hard to imagine that this effort has been underway for so long, yet that is typical of technology research. The legend of the eureka moment is obviously overrated.



This will be of great value where money is not an issue and where surface area is.




Sharp Develops Solar Cell With World's Highest Conversion Efficiency



Diagram of Sharp's new solar cell structure: http://www.solardaily.com/images/sharp-new-pv-solar-ceel-structure-bg.jpg







by Staff Writers



Washington DC (SPX) Oct 29, 2009



http://www.solardaily.com/reports/Sharp_Develops_Solar_Cell_With_World_Highest_Conversion_Efficiency_999.html



Sharp has achieved the world's highest solar cell conversion efficiency (for non-concentrator solar cells) of 35.8% using a triple-junction compound solar cell. Unlike silicon-based solar cells, the most common type of solar cell in use today, the compound solar cell utilizes photo-absorption layers made from compounds consisting of two or more elements such as indium and gallium.


Due to their high conversion efficiency, compound solar cells are used mainly on space satellites.


Since 2000, Sharp has been advancing research and development on a triple-junction compound solar cell that achieves high conversion efficiency by stacking three photo-absorption layers.


To boost the efficiency of triple-junction compound solar cells, it is important to improve the crystallinity (the regularity of the atomic arrangement) in each photo-absorption layer (the top, middle, and bottom layer).


It is also crucial that the solar cell be composed of materials that can maximize the effective use of solar energy.


Conventionally, Ge (germanium) is used as the bottom layer due to its ease of manufacturing. However, although Ge generates a large amount of current, the majority of that current is wasted rather than used effectively for electrical energy.


The key to solving this problem was to form the bottom layer from InGaAs (indium gallium arsenide), a material with high light utilization efficiency. However, the process to make high-quality InGaAs with high crystallinity was difficult.


Sharp has now succeeded in forming an InGaAs layer with high crystallinity by using its proprietary technology for forming layers.


As a result, the amount of wasted current has been minimized, and the conversion efficiency, which had been 31.5% in Sharp's previous cells, has been successfully increased to 35.8%.
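To see what the efficiency jump means in practice: for a fixed cell area under fixed illumination, output power scales directly with conversion efficiency, so the relative gain can be sketched as:

```python
# Relative output gain from Sharp's reported efficiency improvement.
# For a fixed area and illumination, power scales with efficiency.
old_eff, new_eff = 0.315, 0.358

gain = new_eff / old_eff - 1.0
print(f"Output increase per unit area: {gain:.1%}")  # ~13.7%
```

That roughly 14% more power per unit area is what makes such cells attractive where surface area, not cost, is the binding constraint, as on space satellites.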

ReVolt Zinc Air Battery



We have been writing up so much leading-edge energy storage technology that it is almost a relief to find someone doing magic with a basic chemical design and getting it to perform like a thoroughbred.


What is being suggested here is a very safe battery that can deliver an operating range of at least 100 to 150 miles. This is a great jump over present battery technology and takes the electric vehicle from awful to quite bearable. When a driver enters his vehicle in the morning, he is always at some risk of having to run an extra errand that demands a range reserve. Thirty miles normally cannot give you that, but 100 miles will.


Thus a battery able to supply over a hundred miles clearly makes the city car feasible.
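The range claim follows from the energy-density claim: for a pack of the same mass and cost, range scales roughly with energy density. A minimal sketch, with a baseline range that is my assumption rather than a figure from the article:

```python
# Sketch of the range claim: range scales roughly with energy density
# for a pack of fixed mass and cost. Baseline is assumed, not sourced.
baseline_range_mi = 35.0  # assumed range of a comparable lithium-ion pack

for factor in (3, 4):
    print(f"{factor}x energy density -> ~{baseline_range_mi * factor:.0f} miles")
# 3x energy density -> ~105 miles
# 4x energy density -> ~140 miles
```

That is consistent with the 100-to-150-mile range suggested above for the three-to-four-times energy-density claim.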


At worst, it is a transition technology, or an alternative to long-range systems once those become available but remain too expensive to be underutilized.


This technology may also be packable into small devices. Then it certainly will displace the lithium business.


October 27, 2009


ReVolt Zinc-Air Batteries Store Three to Four Times the Energy of Lithium-Ion Batteries at Half the Cost


MIT Technology Review reports that ReVolt of Switzerland claims to have solved the issues with rechargeable zinc-air batteries. Nonrechargeable zinc-air batteries have long been on the market, but making them rechargeable has been a challenge.


http://nextbigfuture.com/2009/10/revolt-zinc-air-batteries-store-three.html



ReVolt Technology claims breakthrough achievements in developing a metal-air battery that overcomes all of the above barriers to deliver:


POWER: ReVolt’s new technology has a theoretical potential of up to 4 times the energy density of Lithium-Ion batteries at a comparable or lower production cost.



LIFETIME: Extended battery life due to stable reaction zone, low rates of dry-out and flooding, and no pressure build-up problems.



RECHARGEABILITY: Controlled deposition with no short-circuit, high mechanical stability.



COMPACT SIZE: No need for bulky peripherals such as cooling fans or temperature control systems.
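Taken at face value, the two headline claims above (up to four times the energy density of Li-ion, at comparable or lower cost; both are ReVolt's own figures, not independent measurements) can be combined into a rough energy-per-dollar ratio:

```python
# Combine the article's claimed figures (upper bounds, per ReVolt):
energy_density_ratio = 4.0   # zinc-air vs. Li-ion energy density (claimed, "up to")
cost_ratio = 0.5             # production cost vs. Li-ion (claimed, "half the cost")

# Relative energy stored per dollar spent on cells:
energy_per_dollar_ratio = energy_density_ratio / cost_ratio
print(energy_per_dollar_ratio)  # 8.0 -- up to eight times the watt-hours per dollar
```

Even if the realized figures fall well short of these upper bounds, the ratio explains why the claim is drawing attention.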




ReVolt says it has developed methods for controlling the shape of the zinc electrode (by using certain gelling and binding agents) and for managing the humidity within the cell. It has also tested a new air electrode that has a combination of carefully dispersed catalysts for improving the reduction of oxygen from the air during discharge and for boosting the production of oxygen during charging. Prototypes have operated well for over one hundred cycles, and the company's first products are expected to be useful for a couple of hundred cycles. McDougal hopes to increase this to between 300 and 500 cycles, which will make them useful for mobile phones and electric bicycles.


For electric vehicles, ReVolt is developing a novel battery structure that resembles that of a fuel cell. Its first batteries use two flat electrodes, which are comparable in size. In the new batteries, one electrode will be a liquid--a zinc slurry. The air electrodes will be in the form of tubes. To generate electricity, the zinc slurry, which is stored in one compartment in the battery, is pumped through the tubes where it's oxidized, forming zinc oxide and releasing electrons. The zinc oxide then accumulates in another compartment in the battery. During recharging, the zinc oxide flows back through the air electrode, where it releases the oxygen, forming zinc again.


In the company's planned vehicle battery, the amount of zinc slurry can be much greater than the amount of material in the air electrode, increasing energy density. Indeed, the system would be like a fuel-cell system or a conventional engine, in that the zinc slurry would essentially act as a fuel--pumping through the air electrode like the hydrogen in a fuel cell or the gasoline in a combustion engine. McDougal says the batteries could also last longer--from 2,000 to 10,000 cycles. And, if one part fails--such as the air electrode--it could be replaced, eliminating the need to buy a whole new battery.


SunPower Panels



Don't you just love vigorous competition at work? Sharp may be breaking performance records for the exotics, but SunPower is breaking performance records for the domestics. Here they can lock in panels with an efficiency of around twenty percent. This is a solid improvement over the fifteen-percent benchmark bandied about and certainly starts making the technology attractive.



When you realize that ten percent efficiency is awfully close to zero on a cloudy day, it is really hard to get excited about all that area you are sponging up. At twenty percent, a thirty-percent loss is not so catastrophic.



In the meantime we will be seeing a lot more of Nanosolar's printed solar cells, likely operating around ten percent and cheap at $1.00 per watt.



Right now efficiency is directly tied to cost per watt. The business strategy is to design a power farm that allows easy panel replacement as new product arrives. That way, the operator can expect to increase his output through the normal replacement cycle.




SunPower Announces World-Record Solar Panel



by Staff Writers



San Jose CA (SPX) Oct 29, 2009



http://www.solardaily.com/reports/SunPower_Announces_World_Record_Solar_Panel_999.html



SunPower has announced that it has produced a world-record, full-sized solar panel with a 20.4 percent total area efficiency. The prototype was successfully developed using funds provided by the U.S. Department of Energy (DOE) under its Solar America Initiative (SAI), which was awarded to SunPower approximately two years ago.


The new 96-cell, 333-watt solar panel uses SunPower's third-generation solar cell technology, which offers a minimum cell efficiency of 23 percent. In addition, the larger-area cells are cut from a 165 mm diameter ingot and include an anti-reflective coating for maximum power generation.


With a total panel area of 1.6 square meters, including the frame, SunPower's 20.4 percent panel achieved the highest efficiency rating of any full-sized solar panel; the rating was confirmed by the National Renewable Energy Laboratory (NREL), an independent testing facility.
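The headline numbers are easy to sanity-check. A minimal sketch, assuming the 333 W rating refers to standard test conditions (1000 W/m² irradiance; that assumption, plus rounding in the rated wattage, accounts for the small gap from the confirmed 20.4 percent total-area figure):

```python
# Back-of-envelope check of the reported panel numbers.
panel_watts = 333.0       # rated output of the 96-cell prototype
panel_area_m2 = 1.6       # total panel area, including the frame
stc_irradiance = 1000.0   # W/m^2, standard test conditions (assumption)

efficiency_pct = panel_watts / (panel_area_m2 * stc_irradiance) * 100
print(round(efficiency_pct, 1))  # 20.8 -- in line with the confirmed 20.4 percent
```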


"SunPower has the engineering expertise and proven technology to accomplish this remarkable milestone in such a short period of time," said Larry Kazmerski, executive director, science and technology partnerships, located at NREL.


"My colleagues at the DOE and NREL had cautioned me that reaching a 20 percent solar panel was a stretch, but this did not dampen my optimism that it would happen. I congratulate SunPower and its team of talented engineers on realizing this accomplishment."


SunPower expects to make the 20.4 percent efficiency solar panel commercially available within the next 24 months. The company plans to begin operating a U.S. panel manufacturing facility in 2010 using automated equipment designed and commercialized with SAI funding.


SunPower recently announced the availability of the SunPower T5 Solar Roof Tile (T5), the first photovoltaic roof product to combine solar panel, frame and mounting system into a single pre-engineered unit. The T5 was also developed using research and development funds from the SAI.


"We are excited with the rapid pace in which we've been able to develop these advanced technologies," said Bill Mulligan, SunPower's vice president of technology and development.


"Without the funding from the SAI, it would have taken us much longer to deliver both the world-record 96-cell solar panel and the innovative T5 Solar Roof Tile. We appreciate the DOE's continued support of the solar energy industry."


The Solar America Initiative is focused on accelerating widespread commercialization of clean solar energy technologies by 2015, providing the U.S. additional electricity supply options while reducing dependence on fossil fuels and improving the environment.


It aims to achieve market competitiveness for solar electric power through government partnerships with industry, universities, national laboratories, states, and other public entities by funding new research and development activities.


Andrew C. Kemp et al., Geology, 2009, Timing and magnitude of recent accelerated sea-level rise (North Carolina, United States)

Geology (November 2009), Vol. 37, No. 11, pp. 1035-1038; DOI: 10.1130/G30352A.1

Timing and magnitude of recent accelerated sea-level rise (North Carolina, United States)

Andrew C. Kemp1,*, Benjamin P. Horton1,*, Stephen J. Culver2, D. Reide Corbett2, Orson van de Plassche3, W. Roland Gehrels4, Bruce C. Douglas5 and Andrew C. Parnell6  

1 Sea-Level Research Laboratory, Department of Earth and Environmental Science, University of Pennsylvania, Philadelphia, PA 19104, U.S.A.
2 Department of Geological Sciences, East Carolina University, Greenville, NC 27858, U.S.A.
3 Faculty of Earth and Life Sciences, VU University, 1081 HV Amsterdam, Netherlands
4 School of Geography, University of Plymouth, Plymouth PL4 8AA, U.K.
5 International Hurricane Center, Florida International University, Miami, FL 33199, U.S.A.
6 School of Mathematical Sciences (Statistics), University College Dublin, Dublin 4, Ireland

Abstract

We provide records of relative sea level since A.D. 1500 from two salt marshes in North Carolina to complement existing tide-gauge records and to determine when recent rates of accelerated sea-level rise commenced. Reconstructions were developed using foraminifera-based transfer functions and composite chronologies, which were validated against regional twentieth century tide-gauge records. The measured rate of relative sea-level rise in North Carolina during the twentieth century was 3.0–3.3 mm/a, consisting of a background rate of ~1 mm/a, plus an abrupt increase of 2.2 mm/a, which began between A.D. 1879 and 1915. This acceleration is broadly synchronous with other studies from the Atlantic coast. The magnitude of the acceleration at both sites is larger than at sites farther north along the U.S. and Canadian Atlantic coast and may be indicative of a latitudinal trend.

*Correspondence e-mails: kempac@sas.upenn.edu; bphorton@sas.upenn.edu.

Link to abstract:  http://geology.geoscienceworld.org/cgi/content/abstract/37/11/1035

Andrew Kemp: North Carolina sea levels rising three times faster than in previous 500 years, Penn study says

North Carolina sea levels rising three times faster than in previous 500 years, Penn study says

October 28, 2009



PHILADELPHIA –- An international team of environmental scientists led by the University of Pennsylvania has shown that sea-level rise, at least in North Carolina, is accelerating. Researchers found 20th-century sea-level rise to be three times higher than the rate of sea-level rise during the last 500 years. In addition, this jump appears to occur between 1879 and 1915, a time of industrial change that may provide a direct link to human-induced climate change.

The results appear in the current issue of the journal Geology.

The rate of relative sea-level rise, or RSLR, during the 20th century was 3.0-3.3 mm per year, higher than the usual rate of about one millimeter per year. Furthermore, the acceleration appears consistent with other studies from the Atlantic coast, though the magnitude of the acceleration in North Carolina is larger than at sites farther north along the U.S. and Canadian Atlantic coast and may be indicative of a latitudinal trend related to the melting of the Greenland ice sheet.
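The rates quoted in the abstract fit together arithmetically; a quick sketch using only the study's own figures:

```python
# Figures from the study: a ~1 mm/a background rate plus an abrupt
# 2.2 mm/a increase beginning between A.D. 1879 and 1915.
background_rate_mm_per_yr = 1.0
abrupt_increase_mm_per_yr = 2.2

modern_rate = background_rate_mm_per_yr + abrupt_increase_mm_per_yr
assert 3.0 <= modern_rate <= 3.3  # consistent with the measured 3.0-3.3 mm/a

# At that pace, cumulative rise over a century:
century_rise_mm = modern_rate * 100
print(round(century_rise_mm))  # about 320 mm, roughly a third of a meter
```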


Understanding the timing and magnitude of this possible acceleration in the rate of RSLR is critical for testing models of global climate change and for providing a context for 21st century predictions.

“Tide gauge records are largely inadequate for accurately recognizing the onset of any acceleration of relative sea-level rise occurring before the 18th century, mainly because too few records exist as a comparison,” Andrew Kemp, the paper’s lead author, said. “Accurate estimates of sea-level rise in the pre-satellite era are needed to provide an appropriate context for 21st century projections and to validate geophysical and climate models.”

The research team studied two North Carolina salt marshes that form continuous accumulations of organic sediment, a natural archive that provides scientists with an accurate way to reconstruct relative sea levels using radiometric isotopes and stratigraphic age markers. The research provided a record of relative sea-level change since the year 1500 at the Sand Point and Tump Point salt marshes in the Albemarle-Pamlico estuarine system of North Carolina. The two marshes provided an ideal setting for producing high-resolution records because thick sequences of high marsh sediment are present and the estuarine system is microtidal, which reduces the vertical uncertainty of paleosea-level estimates. The study provides for the first time replicated sea-level reconstructions from two nearby sites.

In addition, comparison with 20th century tide-gauge records validates the use of this approach and suggests that salt-marsh records with decadal and decimeter resolution can supplement tide-gauge records by extending record length and compensating for the strong spatial bias in the global distribution of longer instrumental records.

The study was funded by the National Oceanic and Atmospheric Administration Coastal Ocean Program, North Carolina Coastal Geology Cooperative Program, U.S. Geological Survey and National Science Foundation.

The study was conducted by Kemp and Benjamin P. Horton of the Sea-Level Research Laboratory at Penn, Stephen J. Culver and D. Reide Corbett of the Department of Geological Sciences at East Carolina University, Orson van de Plassche of Vrije Universiteit, W. Roland Gehrels of the University of Plymouth, Bruce C. Douglas of Florida International University and Andrew C. Parnell of University College Dublin.

Link:  http://www.upenn.edu/pennnews/article.php?id=1748

J. Brigham-Grette, PNAS 106, Contemporary Arctic change: A paleoclimate déjà vu?

Proceedings of the National Academy of Sciences, Vol. 106, No. 44, pp. 18431–18432 (November 3, 2009); DOI: 10.1073/pnas.0910346106

Contemporary Arctic change:  A paleoclimate déjà vu?

Julie Brigham-Grette

Department of Geosciences, University of Massachusetts, Amherst, MA 01003

[N.B.  go to link to see the figures and references.]

Observations of warming in the high northern latitudes provide a variety of scientific datasets to better understand the forcings and feedbacks at work in the global climate system. Instrumental data and satellites show that most of the current Arctic warming is the result of large changes in winter temperatures and that, by comparison, changes in summer temperatures have been relatively modest (1). Yet changes in seasonal temperatures are having a profound influence on glaciers, sea ice cover, snow cover, nutrient flux, and vegetation assemblages, causing shifts in both terrestrial and marine ecosystems (2, 3).

Profoundly provocative is the suggestion that rapid melt rates now observed at the margins of the Greenland Ice Sheet (GIS) still lag significantly behind recent Northern Hemisphere warming (4). How out of the ordinary are these changes? Although changes of the last few decades can be assessed in the short term against instrumental and historical data and observations, it is only paleoclimate records that provide the necessary perspective to inform these questions.

Kaufman et al. (5) documented that the past decade was the warmest for the last 2,000 years by using high resolution lake sediment, ice core, and tree ring records from multiple sites across the Arctic. The report by Axford et al. in this issue of PNAS (6) builds on this necessary paleo-perspective by comparing a lake sediment record of the last century to interglacial episodes preserved at depth in the same lake on the Clyde Forelands, Baffin Island, in the eastern Canadian Arctic.

Records of past interglacials (warm intervals) are somewhat rare in the terrestrial Arctic, and finding well-preserved, organic-rich interglacial lake sediments stacked in sequence within the limits of the Laurentide Ice Sheet is even rarer. Yet Axford and colleagues (7) have previously reported discovering three interglacials [marine isotope stage (MIS) 1, substages 5a and 5e, and MIS 7] preserved between intervening glaciogenic sands.

This new study (6) of Lake CF8 on eastern Baffin Island (Fig. 1) describes a variety of proxies (including chironomids, diatoms, chlorophyll-a, % organic carbon, etc.) that collectively provide a measure of past lake temperature, productivity, and pH. Axford et al. (6) use these data to evaluate the lake system’s response to changes in insolation driven by Earth’s orbital forcing (8) for each interglacial episode over the past 200,000 years. The overall trends in most of the proxies follow a similar pattern for interglacial MIS 1 and 5, with MIS 7 interpreted to be the tail end of the interglacial preceding glacial onset into MIS 6. Their argument concerning the biological response for each interglacial is aimed squarely at the Holocene, the best dated part of the record characterized by a marked peak in insolation until ~8,000 years ago [the so-called Holocene Thermal Maximum (9)]. They then infer insolation forcing for MIS 5e and 7 notably because the dating of these intervals in their long sediment core is not so well constrained. Nevertheless, using detrended correspondence analysis (DCA), a type of ordination analysis popular in ecosystem studies, on the measured proxies averaged to achieve a common sampling resolution, they show that for both of the earlier warm interglacial sequences the proxies all fall within a relatively well-defined window described by the first and second DCA axes. In contrast, these same proxies measured on sediments representing the past century show an ‘‘ecological trajectory’’ away from this interglacial window, suggesting that factors causing this change are unprecedented for the past 200,000 years. Lakes throughout the Arctic document remarkable 20th-century change (10) but few can make the direct comparison to earlier interglacials.

So to answer our earlier questions: yes, this has happened before but not quite like this. Having said that, some readers will agree that the comparison needs to be viewed with some caution given that the raw sampling interval of the record over the past century is better than that for earlier interglacials. Several points highlight the significance of this work in the context of what is known about the Arctic past and present. First, although Lake CF8 shows remarkable ongoing change in summer-based proxies over the latter half of the past century, it is located where maps of recent (2003–2007) National Centers for Environmental Prediction surface air temperature data show little or no change in summer; in contrast, large positive anomalies of 2–3 °C occur over the region in autumn (September–November; ref. 1). In other words, it could be that later onset of winter is driving ecological change. Second, the interglacial records from Lake CF8 join a number of long lacustrine records from around the Arctic that extend to the last interglacial and beyond (11), yet very few extend to MIS 7. Lake El'gygytgyn in central Chukotka was recently drilled in spring 2009 with the expectation that it will extend continuously to 3.6 million years (12). Published records demonstrate continuous deposition to nearly 350,000 years B.P., including MIS 9 (refs. 13 and 14 and Fig. 2). Long paleoclimate records from other Arctic lakes include those from Imuruk, Squirrel, and Ahaliorak lakes in the western Arctic (11).

What do the interglacials in this and other Arctic lake records inform us about the future? In short, if it happened before, it could happen again, and it’s happening now. A growing number of observations show that summer Arctic sea ice was much reduced during MIS 5e and may have been almost seasonal because of Milankovitch-driven summer insolation as much as 11–13% above present (11). Emerging records from the central Arctic Ocean [Arctic Coring Expedition (ACEX) and Greenland Arctic Shelf Ice and Climate Project (GreenICE); Figs. 1 and 2] also point to seasonally open water during MIS 5e (15, 16). The GIS was reduced in size and tree line advanced northward across large parts of Arctic (11). The early Holocene was another period only slightly warmer than today and forced by enhanced summer insolation approaching 10% in the high latitudes (8) that drove marked changes in tree line (9) and a significant reduction in sea ice along the Canadian Arctic (17) and northern Greenland (18). These warm periods inform us about the sensitivity of the Arctic system to warming in response to Arctic amplification (1) and provide the testing ground for climate model verification.

But modern climate change is driven largely by atmospheric CO2 concentrations in the face of decreasing insolation (5). Therefore, we need only look to Arctic records of the mid-Pliocene to capture our geologic moment of déjà vu, when CO2 is estimated to have been in the range of 350–400 ppm like it is now (19). Intermittently throughout this time period sea level may have been +5 to +40 m above present (ref. 19 and references therein), driven in part by massive reductions in Antarctic ice sheets (20). Syntheses of this Pliocene interval and later interglacials (ref. 21 and www.globalchange.gov/publications/reports/scientificassessments/saps/sap1-2) leave little doubt that renewed studies in the high latitudes are well justified to test and improve the chronological coherence of Arctic records. With a seasonally ice-free Arctic projected to be only a few decades away, perhaps Yogi Berra was right: ''it's déjà vu all over again.''

www.pnas.org/cgi/doi/10.1073/pnas.0910346106

Link:  http://www.pnas.org/content/early/2009/10/27/0910346106.full.pdf

Atlantic meridional overturning circulation (AMOC) and glacier runoff from Greenland

Glaciers and the Atlantic

by Graham Cogley, environmentalresearchweb.org, October 26, 2009

The Atlantic keeps cropping up when we try to understand why glaciers change. If you look at the right kind of map, you can see that the Atlantic, including the Arctic, is an enormously long inlet in the shore of the world ocean. (The Bering Strait, between Alaska and Siberia, is too shallow to make a difference.)

The ocean is hot near the equator and not so hot near the poles. The heat flows down the temperature gradient, which drives the ocean currents. The water has to go somewhere once it has got where it is going and has surrendered the heat to the atmosphere. So it sinks. The sinking has a natural explanation: now that the water is colder, it is also denser.

At this point, the first two of several intertwined complications alter the picture. First, as it surrenders heat, the north Atlantic also surrenders water vapour. The warmer the water, the faster it evaporates. Second, during evaporation the salt stays behind, making the ocean water denser than it was to begin with, and that makes it yet more likely to sink.
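The two density effects described above (cooling and salt concentration both make surface water more likely to sink) can be sketched with a linearized equation of state for seawater. The coefficient values below are typical textbook numbers, not anything stated in this article:

```python
# Linearized equation of state for seawater (typical textbook coefficients):
#   rho = rho0 * (1 - alpha*(T - T0) + beta*(S - S0))
RHO0 = 1027.0    # kg/m^3, reference density
ALPHA = 2.0e-4   # 1/K, thermal expansion coefficient
BETA = 7.6e-4    # 1/(g/kg), haline contraction coefficient
T0, S0 = 10.0, 35.0

def density(temp_c, salinity):
    """Seawater density under the linear approximation."""
    return RHO0 * (1 - ALPHA * (temp_c - T0) + BETA * (salinity - S0))

warm_fresh = density(15.0, 34.0)  # warmer, slightly fresher surface water
cold_salty = density(2.0, 35.5)   # north Atlantic water after cooling and evaporation
assert cold_salty > warm_fresh    # the denser water is the one that sinks
```

The real equation of state is nonlinear, but the sign of both effects is exactly what drives the sinking limb of the AMOC.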

The water that sinks gets back where it started from by flowing southwards at depth. Eventually it finds its way out of the Atlantic inlet and wells up, or is dragged up by the wind, in broad regions of the low-latitude ocean, where it is reheated by the Sun and thus begins the cycle over again. This is the essence of the meridional overturning circulation, or MOC. The Atlantic part of this circulation, predictably abbreviated AMOC, is a sensitive part of the machine because its northern end is where most of the Northern Hemisphere overturning happens. There is nothing comparable in the north Pacific.

The water vapour released from the north Atlantic adds greatly to the poleward deliveries of vapour through the atmosphere, and helps to make the northern shores of the Atlantic snowy. The glaciers around the north Atlantic today, and the much bigger ice sheets we had during the ice ages, owe a lot to the AMOC.

But now we come up against the third of the intertwined complications. After the water vapour has condensed and fallen on the bordering landmasses, it flows back sooner or later to the ocean, where it dilutes the salt that helps the AMOC through the crucial sinking part of its cycle. In principle, a large enough return flow of fresh water from rivers and glaciers could reduce the density of the surface waters sufficiently to stop them from sinking, in which case the whole AMOC would stop.

And now, enter the fourth of the intertwined complications: in one word, us. If we heat the whole system by enough to shrink the Greenland Ice Sheet significantly, flooding the north Atlantic with fresh water, we raise the prospect of just such a switching-off of the AMOC.

All the climate models suggest that, if the AMOC collapsed, the northward heat transfer would also be greatly reduced and the shores of the north Atlantic would suffer cooling. But fears of a new ice age being triggered by a collapse of the AMOC, itself triggered by a collapse of the Greenland Ice Sheet, are not realistic.

In the first place, these collapses would happen in a context of global warming, and again the climate-model evidence shows that they would not suffice for the job. Secondly, we have lots of palaeoclimatic evidence for abrupt changes in the AMOC, which are leading candidates to explain Dansgaard-Oeschger transitions during the last ice age, and the cold snap 8,200 years ago. They didn’t last all that long, and they were all reversible. Thirdly, models of ancient climates suggest persuasively that the AMOC is not implicated as a mechanism for starting ice ages.

And finally, the models agree that, without actually collapsing, the AMOC is nevertheless very likely to weaken over the next century. Even decanting the Greenland Ice Sheet into the ocean would not switch it off. But several metres of sea level rise, and a weaker AMOC in a warmer world, are enormous problems in themselves. That they are not harbingers of a colder world is not a good reason for relaxing.


Link:  http://environmentalresearchweb.org/blog/2009/10/glaciers-and-the-atlantic.html

Solar Variability




When I went and dug up information on the sun's variability a couple of years ago, I came across figures that showed minuscule changes. This article shows that that was misleading. I get the impression that the folks who care about these things are possibly still collecting relevant data. That is plausible because the important parts are those the atmosphere absorbs, which we do not see at all.


The importance of all this is that the extreme ultraviolet (EUV) varies over a range of about six percent of the total EUV output. On top of that, it is absorbed by the atmosphere, or less likely reflected. Absorption means that this will heat the part of the atmosphere that does the absorbing. That heat must be accounted for in our global heat-engine equation.


After observing the field for a couple of years, I am not too trusting these days. A six-percent, eleven-year variation in a single source of heat is clearly and convincingly prospective as a climate driver. If not, it is because the output in that range is too low. It also reacts mainly with the outer reaches of our atmosphere, but that may be too conservative an opinion. After all, how could we properly tell?


In space we have satellites to measure the total flux of any wave band. We can do the same on the ground and near the ground. In between, our coverage is much weaker, to the point of being nearly blind.


The point is that this is a prospective heat driver that has not been respected, for whatever reason. A couple of unrecognized heat transfer mechanisms and we are in business. The variation itself is not trivial. It is linked rather directly to the sunspot cycle. The increase in radiation is significant and is much different from the variation over the other bands (as far as I can tell).


A heating of the upper atmosphere would naturally slow down the radiative heat loss from the planet and generate a warming climate. The loss of this heated zone of the upper atmosphere, once accomplished, would allow built up heat in the lower atmosphere to escape. Should the cycle fail to rekindle, then this cooling cycle will continue.


Now understand that the heat loss during the latter stages is being drawn from a much smaller atmospheric volume. This implies that it will get a lot colder.


What I am saying is that putting a heat cap on the upper atmosphere is possibly sufficient to explain the warming effect of the sunspot cycle. The failure of the cycle allows a progressive chilling of the climate. Two or three failed cycles in a row and we have a little ice age to contend with.


It will not produce a real ice age, but it will certainly curtail our recent livable winters. I had recognized for some time that we needed something that leveraged the sunspot cycle to properly explain the data. The apparent output of the sun lacked variability, yet was assumed to do it all. This effect modifies heat loss.




The Sun's Sneaky Variability


10.27.2009


http://science.nasa.gov/headlines/y2009/27oct_eve.htm?list1109684


October 27, 2009: Every 11 years, the sun undergoes a furious upheaval. Dark sunspots burst forth from beneath the sun's surface. Explosions as powerful as a billion atomic bombs spark intense flares of high-energy radiation. Clouds of gas big enough to swallow planets break away from the sun and billow into space. It's a flamboyant display of stellar power.


So why can't we see any of it?


Almost none of the drama of Solar Maximum is visible to the human eye. Look at the sun in the noontime sky and—ho-hum—it's the same old bland ball of bright light.


"The problem is, human eyes are tuned to the wrong wavelength," explains Tom Woods, a solar physicist at the University of Colorado in Boulder. "If you want to get a good look at solar activity, you need to look in the EUV."


EUV is short for "extreme ultraviolet," a high-energy form of ultraviolet radiation with wavelengths between 1 and 120 nanometers. EUV photons are much more energetic and dangerous than the ordinary UV rays that cause sunburns. Fortunately for humans, Earth's atmosphere blocks solar EUV; otherwise a day at the beach could be fatal.
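To see why EUV photons are so much more dangerous than ordinary UV, compare photon energies via E = hc/λ. The 1239.84 eV·nm figure below is the standard value of hc in those units; the wavelengths are the band limits quoted above:

```python
# Photon energy across the EUV band (1-120 nm), using E = hc / lambda.
HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def photon_energy_ev(wavelength_nm):
    """Photon energy in electron-volts for a given wavelength in nanometers."""
    return HC_EV_NM / wavelength_nm

# Visible-light photons carry roughly 2-3 eV; EUV photons carry far more:
print(round(photon_energy_ev(120.0), 1))  # 10.3 eV at the long end of the band
print(round(photon_energy_ev(1.0)))       # 1240 eV at the short end
```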



When the sun is active, intense solar EUV emissions can rise and fall by factors of thousands in just a matter of minutes. These surges heat Earth's upper atmosphere, puffing it up and increasing the drag on satellites. EUV photons also break apart atoms and molecules, creating a layer of ions in the upper atmosphere that can severely disturb radio signals. To monitor these energetic photons, NASA is going to launch a sensor named "EVE," short for EUV Variability Experiment, onboard the Solar Dynamics Observatory as early as this winter.


"EVE gives us the highest time resolution (10 sec) and the highest spectral resolution (<>


Although EVE is designed to study solar activity, its first order of business is to study solar inactivity. SDO is going to launch during the deepest solar minimum in almost 100 years. Sunspots, flares and CMEs are at low ebb. That's okay with Woods. He considers solar minimum just as interesting as solar maximum.


"Solar minimum is a quiet time when we can establish a baseline for evaluating long-term trends," he explains. "All stars are variable at some level, and the sun is no exception. We want to compare the sun's brightness now to its brightness during previous minima and ask ourselves, is the sun getting brighter or dimmer?"


Lately, the answer seems to be dimmer. Measurements by a variety of spacecraft indicate a 12-year lessening of the sun's "irradiance" by about 0.02% at visible wavelengths and 6% at EUV wavelengths. These results, which compare the solar minimum of 2008-09 to the previous minimum of 1996, are still very preliminary. EVE will improve confidence in the trend by pinning down the EUV spectrum with unprecedented accuracy.


http://science.nasa.gov/headlines/y2009/images/deepsolarminimum/irradiance.jpg



Above: Space-age measurements of the total solar irradiance or "TSI". TSI is the sun's brightness summed across all the wavelengths of the electromagnetic spectrum--visible light and EUV included. TSI goes up and down with the 11 year solar cycle. Credit: C. Fröhlich.



The sun's intrinsic variability and its potential for future changes are not fully understood—hence the need for EVE. "The EUV portion of the sun's spectrum is what changes most during a solar cycle," says Woods, "and that is the part of the spectrum we will be observing."


Woods gazes out his office window at the Colorado sun. It looks the same as usual. EVE, he knows, will have a different story to tell.

Acidification and Shellfish Decline


As you might imagine, this is not going to happen any time soon, but it demonstrates a possible consequence of continuing to build up the CO2 content of the atmosphere. I am really not too sure about this, because the ocean is a sponge for CO2 and a major active part of the CO2 cycle, in the same way the ocean is for water.




The issue is simply that we are producing a huge amount of CO2, and we all agree that this is likely a bad idea. The good news is that the transition to a non-carbon economy is both possible and already starting to happen, with or without cheap sources of energy. A lot will transition fairly fast, but the stupendous growth of the global economy will keep a lot of that carbon-based power afloat.




However, global change is now so fast that we are no longer talking about a century but about decades. The USA and Europe could change out their entire rolling stock of automobiles to electric over a single decade. Heavy transport would change out in the decade after. That leaves the carbon market only the coal power industry, which is waiting to see whether hydrogen-boron fusion energy succeeds. If it does, the total conversion could take perhaps ten to fifteen years worldwide.




The whole issue of carbon dioxide will become moot when that happens.




We may even decide some day that it is a good idea to burn coal simply to produce CO2, replacing what we naturally sequester as we terraform the planet with deep, rich soil and intensive agriculture.






Ocean Acidification May Contribute To Global Shellfish Decline




http://www.terradaily.com/reports/Ocean_Acidification_May_Contribute_To_Global_Shellfish_Decline_999.html




http://www.terradaily.com/images/hard-clam-bay-scallop-eastern-oyster-acid-ocean-experiment-bg.jpg





At the end of experiments (about 20 days), larvae of the hard clam, bay scallop, and Eastern oyster that were raised in seawater with high carbon dioxide concentration (right column) were smaller and later to develop than larvae raised in seawater with carbon dioxide concentration matching current ocean levels (left column). Credit: Stephanie Talmage




by Staff Writers



Stony Brook NY (SPX) Oct 28, 2009



Relatively minor increases in ocean acidity brought about by high levels of carbon dioxide have significant detrimental effects on the growth, development, and survival of hard clams, bay scallops, and Eastern oysters, according to researchers at Stony Brook University's School of Marine and Atmospheric Sciences.


In one of the first studies looking at the effect of ocean acidification on shellfish, Stephanie Talmage, PhD candidate, and Professor Chris Gobler showed that the larval stages of these shellfish species are extremely sensitive to enhanced levels of carbon dioxide in seawater. Their work will be published in the November issue of the journal Limnology and Oceanography.


"In recent decades, we have seen our oceans threatened by overfishing, harmful algal blooms, and warming. Our findings suggest ocean acidification poses an equally serious risk to our ocean resources," said Gobler.


During the past century the oceans absorbed nearly half of atmospheric carbon dioxide derived from human activities such as burning fossil fuels. As the ocean absorbs carbon dioxide it becomes more acidic and has a lower concentration of carbonate, which shell-making organisms use to produce their calcium carbonate structures, such as the shells of shellfish.
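Because pH is logarithmic, even a small drop in pH implies a large relative increase in hydrogen-ion concentration. A minimal sketch of the arithmetic, using the widely cited pre-industrial (~8.2) and present-day (~8.1) surface-ocean pH values, which are assumptions here rather than figures from the Stony Brook study:

```python
# pH is -log10([H+]), so hydrogen-ion concentration scales as 10**(-pH).
# The pH values below (~8.2 pre-industrial, ~8.1 today) are widely cited
# surface-ocean estimates, not figures from the study described above.
pH_preindustrial = 8.2
pH_today = 8.1

h_ratio = 10 ** (pH_preindustrial - pH_today)  # [H+] today relative to pre-industrial
percent_increase = (h_ratio - 1) * 100

print(f"A 0.1 pH drop means [H+] rose by about {percent_increase:.0f}%")
```

A drop of just 0.1 pH units thus corresponds to roughly a quarter more hydrogen ions in surface seawater, which shifts the carbonate equilibrium away from the carbonate ions shell-builders depend on.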


In lab experiments, Talmage and Gobler examined the growth and survivorship of larvae from three species of commercially and ecologically valuable shellfish. They raised the larvae in containers bubbled with different levels of carbon dioxide in the range of concentrations that are projected to occur in the oceans during the 21st century and beyond.


Under carbon dioxide concentrations estimated to occur later this century, clam and scallop larvae showed a more than 50% decline in survival. These larvae were also smaller and took longer to develop into the juvenile stage. Oysters also grew more slowly at this level of carbon dioxide, but their survival was diminished only at carbon dioxide levels expected in the next century.


"The longer time spent in the larval stage is frightening on several levels," said Talmage. "Shellfish larvae are free swimming. The more time they spend in the water column, the greater their risk of being eaten by a predator. A small change in the timing of the larval development could have a large effect on the number of larvae that survive to the juvenile stage and could dramatically alter the composition of the entire population."


Although levels of carbon dioxide in marine environments will continue to rise during this century, organisms in some coastal zones are already exposed to high levels of carbon dioxide due to high levels of productivity and carbon input from sources on land.


"This could be an additional reason we see declines in local stocks of shellfish throughout history," said Talmage. "We've blamed shellfish declines on brown tide, overfishing, and local low-oxygen events. However it's likely that ocean acidification also contributes to shellfish declines."


Talmage and Gobler hope their work might help improve the success rate of shellfish restoration projects.


"On Long Island there are many aquaculturists who restock local waters by growing shellfish indoors at the youngest stages and then release them in local estuaries," said Talmage. "We might be able to advise them on ideal carbon dioxide conditions for growth while larvae are in their facilities, and offer suggestions on release times so that conditions in the local marine environment provide the young shellfish the best shot at survival.