Average surface temperature

Instrumental temperature record

See also: temperature record.

The instrumental temperature record shows the fluctuations of the temperature of the atmosphere and the oceans as measured by temperature sensors. Currently, the longest-running temperature record is the Central England temperature series, which begins in 1659. The longest-running quasi-global record begins in 1850.

Global records databases

Currently, the Hadley Centre maintains HADCRUT3, a global surface temperature dataset; NASA maintains GISTEMP, which provides a measure of the changing global surface temperature at monthly resolution since 1880; and NOAA maintains the Global Historical Climatology Network (GHCN-Monthly) database, which contains historical temperature, precipitation, and pressure data for thousands of land stations worldwide.

The global record from 1850

The time period for which reasonably reliable near-surface temperature records with quasi-global coverage exist from actual thermometer observations is generally considered to begin around 1850; earlier records exist, but their coverage and instrument standardization are poorer. The instrumental temperature record is viewed with considerable skepticism for its early years.

The temperature data for the record come from measurements at land stations and on ships. On land, temperature sensors are kept in a Stevenson screen or a maximum minimum temperature system (MMTS). The sea record consists of surface ships taking sea temperature measurements from engine inlets or buckets. The land and marine records can be compared. Land and sea measurement and instrument calibration are the responsibility of national meteorological services. Standardization of methods is organized through the World Meteorological Organization and its predecessor, the International Meteorological Organization.

Currently, most meteorological observations are taken for use in weather forecasts. Centres such as ECMWF publish instantaneous maps of their coverage, and the Hadley Centre shows coverage averaged over the year 2000. Coverage earlier in the 20th and 19th centuries would have been significantly sparser. While temperature changes vary in both size and direction from one location to another, the numbers from different locations are combined to produce an estimate of a global average change.

There are concerns about possible uncertainties in the instrumental temperature record, including the fraction of the globe covered, the effects of changing thermometer designs and observing practices, and the effects of changing land use around the observing stations. The ocean temperature record also suffers from changing practices (such as the switch from collecting water in canvas buckets to measuring the temperature at engine intakes), but it is immune to the urban heat island effect and to changes in local land use/land cover (LULC) at land surface stations.

Warming in the instrumental temperature record

Most of the observed warming occurred during two periods, 1910 to 1945 and 1976 to 2000; the cooling/plateau from 1945 to 1976 has been mostly attributed to sulphate aerosols. However, a 2008 study suggests that the temperature drop of about 0.3 °C in 1945 could be the apparent result of uncorrected instrumental biases in the sea surface temperature record. Attribution of the temperature change to natural or anthropogenic factors is an important question: see global warming and attribution of recent climate change.

Land and sea measurements independently show much the same warming since 1860. The data from these stations show an average surface temperature increase of about 0.74 °C over the last 100 years. The Intergovernmental Panel on Climate Change (IPCC) stated in its Fourth Assessment Report (AR4) that the temperature rise over the 100-year period from 1906 to 2005 was 0.74 °C [0.56 to 0.92 °C], with a 90% confidence interval.

For the last 50 years, the linear warming trend has been 0.13 °C [0.10 to 0.16 °C] per decade according to AR4.
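
Such a decadal trend is simply the slope of a least-squares line fitted to the annual global-mean anomaly series, rescaled to degrees per decade. The sketch below illustrates that arithmetic on made-up data; it is not the AR4 calculation itself, and the series, seed, and function name are assumptions for illustration.

    # Minimal sketch: least-squares decadal trend from annual anomalies (illustrative data only).
    import numpy as np

    def decadal_trend(years, anomalies):
        """Return the least-squares linear trend in degrees C per decade."""
        slope_per_year, _intercept = np.polyfit(years, anomalies, deg=1)
        return slope_per_year * 10.0  # degrees C per year -> degrees C per decade

    # Hypothetical series: 50 years warming at roughly 0.13 degrees C per decade plus noise.
    years = np.arange(1957, 2007)
    rng = np.random.default_rng(0)
    anomalies = 0.013 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)
    print(f"trend: {decadal_trend(years, anomalies):+.2f} C per decade")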

The U.S. National Academy of Sciences, both in its 2002 report to President George W. Bush, and in later publications, has strongly endorsed evidence of an average global temperature increase in the 20th century.

In relation to the instrumental temperature record of the last 100 years, the IPCC Fourth Assessment Report found that:

"Urban heat island effects are real but local, and have a negligible influence (less than 0.006 °C per decade over land and zero over the oceans) on these values."

For more information about the effects, if any, of urbanization on the temperature record, see the main article: Urban heat island effect.

Spatial variability

The global temperature changes are not uniform over the globe, nor would they be expected to be, whether the changes were naturally or anthropogenically forced. Certain places, such as the north shore of Alaska, show dramatic rises in temperature, far above the average for the globe as a whole. Parts of the Antarctic Peninsula have warmed by 2.5 °C (4.5 °F) in the past five decades, while the majority of the continent has shown slight cooling.

Calculating the global temperature

Deriving a reliable global temperature from the instrument data is not easy because the instruments are not evenly distributed across the planet, the hardware has changed over the years, and there has been extensive urbanization around some of the sites.

The calculation needs to filter out changes over time that are not climate related (e.g. urban heat islands), then interpolate across regions where instrument data have historically been sparse (e.g. in the southern hemisphere and at sea), before an average can be taken.
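
As a rough illustration of that averaging step, the sketch below bins station anomalies into 5-degree grid boxes and combines the box means with cos(latitude) area weights, leaving empty boxes out of the average. This is only a simplified stand-in for the CRU and GISS procedures; the function and variable names, the 5-degree grid, and the assumption that the inputs are already anomalies are all choices made for the example.

    # Simplified sketch of gridding station anomalies and taking an area-weighted global mean.
    # Not the actual CRU or GISS algorithm; inputs are assumed to be anomalies in degrees C.
    import numpy as np

    def global_mean_anomaly(lats, lons, anomalies, box=5.0):
        lat_edges = np.arange(-90.0, 90.0 + box, box)
        lon_edges = np.arange(-180.0, 180.0 + box, box)
        box_sum = np.zeros((lat_edges.size - 1, lon_edges.size - 1))
        box_count = np.zeros_like(box_sum)

        # Assign each station to a grid box and accumulate its anomaly.
        i = np.clip(np.digitize(lats, lat_edges) - 1, 0, box_sum.shape[0] - 1)
        j = np.clip(np.digitize(lons, lon_edges) - 1, 0, box_sum.shape[1] - 1)
        np.add.at(box_sum, (i, j), anomalies)
        np.add.at(box_count, (i, j), 1)

        # Weight each occupied box by the cosine of its central latitude (proportional to area).
        lat_centres = lat_edges[:-1] + box / 2.0
        weights = np.cos(np.radians(lat_centres))[:, None] * np.ones_like(box_sum)
        filled = box_count > 0
        box_mean = np.where(filled, box_sum / np.maximum(box_count, 1), 0.0)
        return np.sum(box_mean[filled] * weights[filled]) / np.sum(weights[filled])

Real analyses add further steps this sketch omits, such as homogeneity adjustments and interpolation into boxes with no stations.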

There are two main global temperature datasets, both developed since the late 1970s: that maintained by the Climatic Research Unit at the University of East Anglia and that maintained by the Goddard Institute for Space Studies. Both datasets produce very similar results and are updated every month with additional data.

In the late 1990s, the Goddard team used the same data to produce a global map of temperature anomalies to illustrate the difference between the current temperature and average temperatures prior to 1950 across every part of the globe. The paper included details of the calculations.

Temperature processing software

In September 2007, the GISTEMP software used to process the GISS version of the historical instrument data was made public. The released software has been developed over more than 20 years by numerous staff and is mostly in FORTRAN; large parts of it were developed in the 1980s, before large amounts of computer memory, modern programming languages, and modern techniques were available.

Two recent open-source projects have been developed by individuals to rewrite the processing software in modern, open code. One, http://www.opentemp.org/, was written by John van Vliet. More recently, Clear Climate Code, a project begun in April 2008 by staff of Ravenbrook Ltd to port the code to Python, has so far detected two minor bugs in the original software, neither of which significantly changed any results.

Uncertainties in the temperature record

A number of scientists and scientific organizations have expressed concern about possible deterioration of the land surface observing network. Climate scientist Roger A. Pielke has stated that he has identified a number of sites where poorly sited stations in sparse regions "will introduce spatially unrepresentative data into the analyses." The metadata needed to quantify the uncertainty from poorly sited stations does not currently exist. Pielke has called for a similar documentation effort for the rest of the world.

The uncertainty in annual measurements of the global average temperature (95% range) is estimated to be ~0.05 °C since 1950 and as much as ~0.15 °C in the earliest portions of the instrumental record. The error in recent years is dominated by the incomplete coverage of existing temperature records. Early records also carry substantial uncertainty driven by systematic concerns over the accuracy of sea surface temperature measurements. Station densities are highest in the northern hemisphere, providing more confidence in climate trends in that region. Station densities are far lower in other regions, such as the tropics, northern Asia, and the former Soviet Union, which results in less confidence in the robustness of climate trends there. If a region with few stations includes a poor-quality station, the impact on the global temperature estimate is greater than it would be in a grid cell with many weather stations.
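
The last point is simple arithmetic: within a grid-cell average, a spurious offset at one station is diluted in proportion to the number of stations sharing the cell. The toy calculation below makes this explicit; the +1.0 °C offset and the station counts are purely hypothetical.

    # Illustrative only: how far one station with a spurious +1.0 C offset shifts a
    # grid-cell mean, depending on how many stations share the cell.
    def cell_bias(n_stations, station_bias=1.0):
        """Shift of the cell mean if exactly one of n_stations carries the bias."""
        return station_bias / n_stations

    for n in (1, 2, 10, 50):
        print(f"{n:3d} stations in cell -> cell mean shifted by {cell_bias(n):.3f} C")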

Evaluation of the United States land surface temperature record

In 1999 a panel of the U.S. National Research Council studied the state of U.S. climate observing systems. The panel evaluated many climate measurement aspects, four of which had to do with temperature, against ten climate monitoring principles proposed by Karl et al. (1995). Land surface temperature had "known serious deficiencies" with respect to five of the principles, vertical temperature distribution and sea surface temperature with respect to nine, and subsurface ocean temperature with respect to seven.

The U.S. National Weather Service Cooperative Observer Program has established minimum standards for the instrumentation, siting, and reporting of surface temperature stations. The observing systems available are able to detect year-to-year temperature variations such as those caused by El Niño or volcanic eruptions. However, these stations can undergo undocumented changes such as relocation, changes in instrumentation and exposure (including changes in nearby thermally emitting structures), changes in land use (e.g., urbanization), and changes in observation practices. All of these changes can introduce biases into the stations' long-term records. In the past, these local biases were generally considered to be random and therefore expected to cancel out when many stations and the ocean record are combined.

A 2006 paper analyzed a subset of 366 U.S. surface stations and found that 95% displayed a warming trend after land use/land cover (LULC) changes. The authors stated that "this does not necessarily imply that the LULC changes are the causative factor." Another study has documented examples of well and poorly sited monitoring stations in the United States, including ones near buildings, roadways, and air conditioning exhausts. Brooks investigated U.S. Historical Climatology Network (USHCN) sites in Indiana and assigned 16% of the sites an 'excellent' rating, 59% a 'good' rating, 12.5% a 'fair' rating, and 12.5% a 'poor' rating. Davey and Pielke visited 10 HCN sites in eastern Colorado but did not provide percentages of well or badly sited stations. They stated that some of the sites "are not at all representative of their surrounding region" and should be replaced in the instrumental temperature record with other sites from the U.S. cooperative observer network.

Peterson has argued that existing empirical techniques for validating the local and regional consistency of temperature data are adequate to identify and remove biases from station records, and that such corrections allow information about long-term trends to be preserved. Pielke and co-authors disagree.
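
One common flavour of such consistency checks compares a candidate station with the average of its neighbours and looks for a sustained step in the difference series, which points to a non-climatic change such as a relocation or instrument change. The sketch below is only a toy version of that idea, not Peterson's actual homogenization procedure; the function name, threshold, and synthetic data are assumptions for illustration.

    # Toy neighbour-comparison check; not an operational homogenization algorithm.
    import numpy as np

    def step_in_difference(candidate, neighbours, threshold=0.5):
        """Index of the largest step in (candidate - neighbour mean), or None if that
        step is below `threshold` degrees C."""
        diff = candidate - neighbours.mean(axis=0)
        best_idx, best_step = None, 0.0
        for k in range(1, diff.size - 1):
            step = abs(diff[k:].mean() - diff[:k].mean())
            if step > best_step:
                best_idx, best_step = k, step
        return best_idx if best_step > threshold else None

    # Hypothetical example: a 0.8 C jump introduced at year index 20.
    rng = np.random.default_rng(1)
    neighbours = rng.normal(0.0, 0.1, size=(5, 40))
    candidate = rng.normal(0.0, 0.1, size=40)
    candidate[20:] += 0.8
    print(step_in_difference(candidate, neighbours))  # expected to be near 20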

References

  • IPCC Fourth Assessment Report (AR4) WGI Summary for Policymakers (SPM)
  • Global average temperature for the last 150 years and discussion of trends
  • Preliminary data from the last 2000 years
