BLOGGER'S NOTE: I have a terrible memory, and I often want to refresh it on how GISS temperatures are calculated. The Earth Observatory website is often inaccessible, so I have saved this article here, for my own benefit and the benefit of anyone who is interested. Many denialists take potshots at the methodology, accusing the scientists of intentionally manipulating the data to show warmer temperatures. Note that GISTEMP extrapolates to include the poles and other temperature trackers do not. Because the Arctic is warming faster than any other region of the planet, GISTEMP anomalies naturally run slightly warmer, yet they show no significant difference in trend when compared to the other temperature trackers; occasionally, GISTEMP values are even lower. The second link below leads to a lengthier discussion of the history of the methodology and the changes made over the years. Because I occasionally cannot open the NASA page, I will also post its contents here later.
Link to NASA's Earth Observatory:
http://earthobservatory.nasa.gov/Study/GISSTemperature/giss_temperature.html
Link to GISS temperature analysis and history: http://data.giss.nasa.gov/gistemp/
Earth is Cooling…No It’s Warming
In 1967 Hansen went to work for NASA’s Goddard Institute for Space Studies, in New York City, where he continued his research on planetary problems. Around 1970, some scientists suspected Earth was entering a period of global cooling. Decades prior, the brilliant Serbian mathematician Milutin Milankovitch had explained how our world warms and cools on roughly 100,000-year cycles due to its slowly changing position relative to the Sun. Milankovitch’s theory suggested Earth should be just beginning to head into its next ice age cycle. The surface temperature data gathered by climatologist J. Murray Mitchell seemed to agree; the record showed that Earth experienced a period of cooling (by about 0.3°C) from 1940 through 1970. Of course, Mitchell was only collecting data over a fraction of the Northern Hemisphere—from 20 to 90 degrees North latitude. Still, the result drew public attention, and a number of speculative articles about Earth’s coming ice age appeared in newspapers and magazines.
But other scientists forecast global warming. Russian climatologist Mikhail Budyko had also observed the three-decade cooling trend. Nevertheless, he published a paper in 1967 in which he predicted the cooling would soon switch to warming due to rising human emissions of carbon dioxide. Budyko’s paper and another paper published in 1975 by Veerabhadran Ramanathan caught Hansen’s attention. Ramanathan pointed out that human-made chlorofluorocarbons (or CFCs) are particularly potent greenhouse gases, with as much as 200 times the heat-retaining capacity of carbon dioxide. Because people were adding CFCs to the lower atmosphere at an increasing rate, Ramanathan expressed concern that these new gases would eventually add to Earth’s greenhouse effect and cause our world to warm. (Because CFCs also deplete Earth’s protective ozone layer, their use was largely phased out under the Montreal Protocol, signed in 1987 and in force by 1989.) The notion that humans could override nature and force the globe to warm intrigued Hansen. “It had been known for more than a century that increasing carbon dioxide could have an effect on global temperature,” Hansen said (referring to the pioneering work of John Tyndall and Svante Arrhenius in the 1800s). But global warming in the near future? That was another matter. Hansen returned his attention to the physics equations he’d played with almost 10 years earlier. Collaborating with Andy Lacis, a colleague at NASA, he built a simple climate model to simulate how changes in the atmosphere cause Earth’s average temperature to change over time. Hansen and Lacis tweaked the inputs to simulate the cumulative influence of all known human-made greenhouse gases except carbon dioxide (including CFCs, methane, nitrous oxide, and ozone) to see if their net effect could even be felt on a global scale in the climate system. To their surprise, Hansen’s team found that the warming effect of all those gases added together is comparable to the warming effect of carbon dioxide alone.
The simple model also allowed Hansen to simulate the climate impact of Mount Agung’s eruption 15 years after the event. The model indicated that loading the atmosphere with volcanic aerosols should have caused a global cooling—a prediction that agreed pretty well with observed temperature data. The model demonstrated that both human and natural activities could force climate to change. But Hansen knew that natural forcings, like volcanic eruptions or changes in the Sun’s activity, tend to go up and down over a long period of time whereas the human forcing from greenhouse gas emissions was steadily increasing. “It became clear that human-produced greenhouse gases should become a dominant forcing and even exceed other climate forcings, such as volcanoes or the Sun, at some point in the future,” Hansen observed. How soon would the human forcing begin to dominate? No one knew.
To find out, Hansen would need real-world data on a global scale. He requested data tapes from Roy Jenne, of the National Center for Atmospheric Research, who was widely recognized in the 1970s as having the best weather dataset in the world. Of course, there remained the problem that the weather stations supplying Jenne’s dataset were rather sparse compared to the vastness of Earth’s surface.
“The lack of any global temperature analysis [for Earth] did not seem right to me,” Hansen recalled. Drawing on his previous work estimating the average surface temperature of Venus, he knew that if scientists had measurements from as many places on another planet as were available in Jenne’s dataset, they would not hesitate to estimate that planet’s global temperature. He decided to try. At the outset Hansen knew that weather fluctuations would introduce short-term temperature anomalies into the weather station dataset that are not the same thing as climate change. But he reasoned that by taking averages over several years, and appropriately “weighting” the weather stations’ data, it should be possible to determine meaningful temperature changes over longer time periods. In the mid-1970s, he hired Jeremy Barberra, a New York University undergraduate student at the time, to automate the processing of Jenne’s dataset. They decided to process the data to produce average temperature changes, not absolute temperatures. “If you focus your analysis on temperature change, and not on determining absolute temperature values, then the station coverage is adequate,” Hansen explained. “What matters is the long-term mean over large scales, not single measurements from individual stations.” The success of Hansen’s and Barberra’s approach depended on the principle that temperature anomalies are coherent over much larger spatial scales than absolute temperatures. Consider a mountain that can be much cooler on one side than the other: absolute temperature patterns can vary sharply over relatively short distances. Temperature anomalies, on the other hand, are typically large-scale features driven by Rossby waves. Rossby waves are slow-moving waves in the ocean or atmosphere that propagate from west to east as a consequence of Earth’s rotation. We see such waves in the atmosphere as large-scale meanders of the mid-latitude jet stream.
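To make the anomaly idea concrete, here is a minimal Python sketch of converting one station’s monthly readings into anomalies against its own long-term monthly means. This is an illustration, not GISS’s actual code; the function name and data layout are invented, though the 1951-1980 base period is the one GISTEMP conventionally reports against.

```python
import numpy as np

def monthly_anomalies(temps, years, base_start=1951, base_end=1980):
    """Convert absolute monthly temperatures to anomalies.

    temps -- array of shape (n_years, 12): mean temperature (deg C)
             for each month of each year at one station
    years -- array of n_years calendar years matching the rows of temps

    The climatology (one mean per calendar month) comes from the base
    period only, so each anomaly answers "warmer or cooler than usual
    for this month at this station?" rather than "how hot was it?"
    """
    temps = np.asarray(temps, dtype=float)
    years = np.asarray(years)
    in_base = (years >= base_start) & (years <= base_end)
    climatology = np.nanmean(temps[in_base], axis=0)  # shape (12,)
    return temps - climatology  # broadcasts month by month
```

Because each station is compared only against itself, fixed local differences in elevation, exposure, and instrumentation cancel out, which is exactly why anomalies stay correlated over far greater distances than raw temperatures do.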
“If it is an unusually warm winter in New York, it is probably also warm in Washington, D.C., for example,” Hansen explained. “At high- and mid-latitudes Rossby waves are the dominant cause of short-term temperature variations. And since those are fairly long waves we didn’t think we needed a station at every one degree of separation.” A station at every degree would mean a station roughly every 80 kilometers (at mid-latitudes). But in a 1987 paper appearing in the Journal of Geophysical Research, Hansen and Sergei Lebedeff demonstrated that the temperature readings of weather stations within 1,000 kilometers (620 miles) of one another are highly correlated. The close correlation meant they could map global temperature changes over time despite the fact that weather stations are widely spaced and located mainly on continents and islands. Here’s basically how their approach works: for each center point in a global grid of 1-degree boxes, they let all weather station data within a 1,200-kilometer radius influence the estimated temperature change at that point. They gave greatest “weight” to the station closest to that point; for all other stations within that radius, they let the weighting fall off linearly with distance, down to a weighting of zero for stations 1,200 kilometers away or farther. “Again, our objective was not to determine the precise temperature of individual stations, but to produce a global-scale map of temperature change,” Hansen emphasized. “We were interested in tracking global climate patterns, not local weather variations.” In their 1981 analysis, published in the journal Science, Hansen’s team reported that, overall, Earth’s average temperature rose by about 0.4°C from 1880 to 1978, with roughly 0.1°C of global cooling from 1940 to 1970. That cooling was smaller than what Mitchell had found earlier because Hansen’s team was now using global data, not just data from a swath of the Northern Hemisphere. Just as Budyko had predicted, Hansen found that Earth’s cooling trend swung back in the warming direction around 1970, and the world has been warming ever since. Moreover, Hansen noted, the warming trend observed in real-world data is consistent with his (and others’) global climate model outputs in their 100-year simulations.
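The weighting scheme just described is simple enough to sketch directly. The Python below illustrates that linear taper; it is not the production GISTEMP code (which also has to merge overlapping station records), and the haversine helper and all names are my own.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two points, in kilometers."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlam = np.radians(lon2 - lon1)
    a = np.sin((p2 - p1) / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def station_weight(dist_km, radius_km=1200.0):
    """Linear taper: weight 1 at the grid point, falling to 0 at
    radius_km or beyond, as the article describes."""
    return max(0.0, 1.0 - dist_km / radius_km)

def gridpoint_anomaly(grid_lat, grid_lon, stations):
    """Weighted mean anomaly at one grid point.

    stations -- iterable of (lat, lon, anomaly) tuples.
    Returns None when no station lies within the influence radius.
    """
    num = den = 0.0
    for lat, lon, anom in stations:
        w = station_weight(great_circle_km(grid_lat, grid_lon, lat, lon))
        num += w * anom
        den += w
    return num / den if den > 0 else None
```

So a station sitting on the grid point counts with full weight, one 600 kilometers away counts half as much, and anything at 1,200 kilometers or beyond contributes nothing.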
Since 1978, global warming has become even more apparent. Over the last 30 years, Hansen’s analysis reveals that Earth warmed another 0.5°C, for a total warming of 0.9°C since 1880.
“To questions about whether this warming is natural or just a fluctuation, the answer has become clear: the world is getting warmer,” Hansen stated. “This fact agrees so well with what we calculate with our global climate model that I am confident we are looking at warming that is mainly due to increasing human-made greenhouse gases.”
The Data and the Details
Some nagging questions remained for Hansen and his colleagues. Citing issues such as stations located too close to paved surfaces, stations located in urban areas that are known to be warmer than rural regions, and stations located in developing nations where data collection methods may be unreliable, critics argued that any of these problems could throw off an individual station’s temperature readings. Don’t such concerns cast a shadow of doubt on the NOAA weather station data? Initially, perhaps, but not after the data have been carefully tested in several ways. First, Hansen’s team (and others) finds good agreement between the weather station data and “proxy” data sets that are sensitive to surface temperature changes—such as the rate at which glaciers are receding, or subsurface temperature measurements in boreholes drilled down into the ground. (Scientists can infer surface temperature change from underground temperatures based on equations that describe how heat diffuses through the ground over time.) The results in thousands of remote locations around the world agree well with the surface temperature measurements. Second, Hansen’s team “cleans” the weather station data by finding and filtering out flawed entries. Specifically, they apply a computer algorithm that checks each data point for readings far higher or lower than average for that location at that time of year. Whenever such an anomaly is flagged, the algorithm compares the data to data from nearby stations to see if they show a similar anomaly. If so, the data in question are kept; if not, or if there are no nearby stations for comparison, the data are thrown away.
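As a rough sketch of that screening logic (the thresholds, tolerances, and function names here are invented for illustration; the article does not spell out GISS’s actual quality-control criteria):

```python
def is_suspect(reading, climatology, sigma, n_sigmas=5.0):
    """Flag a reading that is far outside the norm for this location
    and time of year (climatology and sigma describe that local norm)."""
    return abs(reading - climatology) > n_sigmas * sigma

def corroborated(anomaly, neighbor_anomalies, tolerance=2.0):
    """A flagged reading survives only if nearby stations saw a similar
    departure. With no neighbors to consult, it is discarded rather
    than trusted."""
    if not neighbor_anomalies:
        return False
    return any(abs(anomaly - n) <= tolerance for n in neighbor_anomalies)
```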
His team also modifies the data from stations located in densely populated areas by removing the long-term bias of these “urban heat islands.” The team uses satellite data to determine if a given station is in an urban or near-urban location. If so, then the team uses the nearest rural stations to determine the long-term trend at the urban site. If there are no rural neighbors, then Hansen’s team throws out the urban station data.
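A bare-bones sketch of that adjustment is below. It keeps the urban station’s year-to-year wiggles but replaces its long-term linear trend with the mean trend of its rural neighbors; the real GISTEMP procedure is more involved (for one thing, it fits a broken-line rather than a single straight-line trend), every name here is illustrative, and gap-free annual series are assumed for brevity.

```python
import numpy as np

def adjust_urban_station(years, urban_anoms, rural_neighbor_anoms):
    """Swap an urban station's long-term trend for its rural
    neighbors' mean trend, keeping short-term variability.

    years                -- 1-D array of calendar years
    urban_anoms          -- annual anomalies at the urban station
    rural_neighbor_anoms -- list of annual-anomaly arrays from nearby
                            rural stations, on the same years
    Returns the adjusted series, or None when there are no rural
    neighbors (the station is then dropped, as the article says).
    """
    if not rural_neighbor_anoms:
        return None
    years = np.asarray(years, dtype=float)
    urban = np.asarray(urban_anoms, dtype=float)
    rural_mean = np.mean(np.vstack(rural_neighbor_anoms), axis=0)
    urban_slope = np.polyfit(years, urban, 1)[0]
    rural_slope = np.polyfit(years, rural_mean, 1)[0]
    # Remove the urban trend and impose the rural one.
    return urban + (rural_slope - urban_slope) * (years - years.mean())
```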
One lesson to be learned here is that weather science and climate science are quite different: weather is concerned with conditions at a given location and time, whereas climate is concerned with conditions over large regions, or the entire globe, over long periods of time. That explains why climate scientists are less interested in any single reading from an individual station than in 5-year and 10-year blocks of time for the entire planet. Hansen acknowledged there may be flaws in the weather station data. “But that doesn’t mean you give up on the science, and that you can’t draw valid conclusions about the nature of Earth’s temperature change,” he asserted.
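That emphasis on multi-year blocks is, computationally, just smoothing. A centered running mean over a global annual anomaly series, for example, damps year-to-year weather noise so the climate trend stands out (a generic sketch, not any particular GISS product):

```python
import numpy as np

def running_mean(annual_values, window=5):
    """Centered running mean of an annual series. Returns a shorter
    array: edge years without a full window are dropped, not padded."""
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(annual_values, dtype=float), kernel, mode="valid")
```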
From a Dimmer Past to a Brighter Future?
Of greater concern to Hansen than global warming skeptics is the problem of global warming itself. If greenhouse gases are to blame, then why did Earth’s average temperature cool from 1940 to 1970? And why has the rate of global warming accelerated since 1978? Hansen’s answers to these questions brought him full circle to where he began his investigation more than 40 years ago. “I think the cooling that Earth experienced through the middle of the twentieth century was due in part to natural variability,” he said. “But there’s another factor made by humans which probably contributed, and could even be the dominant cause: aerosols.”
In addition to greenhouse gas emissions, human emissions of particulate matter are another significant influence on global temperature. But whereas greenhouse gases force the climate system in the warming direction, aerosols force the system in the cooling direction because the airborne particles scatter and absorb incoming sunlight. “Both greenhouse gases and aerosols are created by burning fossil fuels,” Hansen said, “but the aerosol effect is complicated because aerosols are distributed inhomogeneously [unevenly] while greenhouse gases are almost uniformly [mixed]. So you can measure greenhouse gas abundance at one place, but aerosols require measurements at many places to understand their abundance.” After World War II, the industrial economies of Europe and the United States were revving up to a level of productivity the world had never seen before. To power this large-scale expansion of industry, Europeans and Americans burned an enormous quantity of fossil fuels (coal, oil, and natural gas). In addition to carbon dioxide, burning fossil fuel produces particulate matter—including soot and light-colored sulfate aerosols. Hansen suspects the relatively sudden, massive output of aerosols from industries and power plants contributed to the global cooling trend from 1940 to 1970.
“That’s my suggestion, though it’s still not proven,” he said. “There is a nice record of sulfates in Greenland ice cores that shows this type of particle was peaking in the atmosphere around 1970. And then the ice core record shows a rapid decline in sulfates, right about the time nations began regulating their emission.” (Sulfates cause acid rain and other health and environmental problems.) In 2007, Michael Mishchenko, of NASA GISS, published a paper in the journal Science in which he reported that tropospheric aerosols have indeed declined slightly over the last 30 years. The net effect is that more sunlight passes through the atmosphere, slightly brightening the surface. This increased exposure to sunlight could partially account for the increase in surface temperature that Mishchenko and Hansen observed over the same time span.
Over the course of the twentieth century, Hansen and other climate scientists estimate, aerosols may have offset global warming by as much as 50 percent by reducing the amount of sunlight reaching the surface. Scientists call this phenomenon “global dimming,” although the change was too gradual and too slight to be perceived by the human eye. (Aerosols’ dimming potential has been observed, of course, after dramatic events like the Agung Volcano eruption that Hansen noticed during the lunar eclipse of December 1963.) Hansen describes the global dimming effect of human-emitted aerosols as a “Faustian bargain”—a deal with the devil. “Eventually you get to a point where you don’t want aerosols in the atmosphere because they’re harmful to human health, harmful to agriculture, and harmful to natural resources,” he stated. “So in the U.S. and much of Europe, we’ve been reducing aerosol emissions.” But we haven’t seen a corresponding reduction in greenhouse gas emissions. Indeed, humans’ use of fossil fuels rose rapidly (about 5 percent per year) from the end of World War II until 1973. After the oil embargo and price shock of 1973, annual consumption continued to increase, but at a slower pace (between 1.5 and 2 percent per year). A byproduct of that rising fossil fuel consumption has been a corresponding rise in carbon dioxide emissions. Because greenhouse gases reside in the atmosphere for decades, while aerosols usually wash out over a span of days to weeks, the warming influence of greenhouse gases gradually won out. “For much of the twentieth century, both types of human emissions were on nearly equal footing, and aerosols were able to compete with greenhouse gases,” Hansen said. But that balance has tilted increasingly in favor of greenhouse gases in the last 30 years. Today, Hansen’s team estimates the human forcing from greenhouse gases to be about 3 watts per square meter (warming) and the forcing from aerosols to be about minus 1.5 watts per square meter (cooling), leaving a net human forcing of roughly 1.5 watts per square meter in the warming direction. Hansen sees these trends as very likely to lead to what he calls “dangerous human interference” with the climate system. “I think action [to reduce greenhouse gas emissions] is needed urgently, because we are on the precipice of a climate system ‘tipping point’,” Hansen concluded. “I believe the evidence shows with reasonable clarity that the level of additional global warming that would put us into dangerous territory is at most 1°C.”
If we follow a ‘business-as-usual’ course, Hansen predicts, then at the end of the twenty-first century we will find a planet that is 2-3°C warmer than today, a temperature Earth hasn’t experienced since the middle Pliocene Epoch about three million years ago, when sea level was roughly 25 meters higher than it is today.