
# Celsius

[sel-see-uhs, -shee-]
The Celsius temperature scale was previously known as the centigrade scale. The degree Celsius (symbol: °C) can refer to a specific temperature on the Celsius scale as well as serve as a unit increment to indicate a temperature (a difference between two temperatures or an uncertainty). "Celsius" is named after the Swedish astronomer Anders Celsius (1701–1744), who developed a similar temperature scale two years before his death.

From 1744 until 1954, 0 °C on the Celsius scale was defined as the freezing point of water and 100 °C was defined as the boiling point of water under a pressure of one standard atmosphere; this close equivalency is taught in schools today. However, the unit "degree Celsius" and the Celsius scale are currently, by international agreement, defined by two different points: absolute zero, and the triple point of VSMOW (specially prepared water). This definition also precisely relates the Celsius scale to the Kelvin scale, which is the SI base unit of temperature (symbol: K). Absolute zero—the temperature at which no energy remains in a substance—is defined as being precisely 0 K and −273.15 °C. The triple point of water is defined as being precisely 273.16 K and 0.01 °C.

This definition fixes the magnitude of both the degree Celsius and the kelvin as being precisely 1 part in 273.16 parts of the difference between absolute zero and the triple point of water. Thus, it sets the magnitude of one degree Celsius and one kelvin to be exactly equivalent. Additionally, it establishes the difference between the two scales' null points as being precisely 273.15 degrees Celsius (−273.15 °C = 0 K and 0.01 °C = 273.16 K).

Some key temperatures relating the Celsius scale to other temperature scales are shown in the table below.

| | kelvin | Celsius | Fahrenheit |
|---|---|---|---|
| Absolute zero (precisely, by definition) | 0 K | −273.15 °C | −459.67 °F |
| Melting point of ice (approximate) | 273.15 K | 0 °C | 32 °F |
| Water's triple point (precisely, by definition) | 273.16 K | 0.01 °C | 32.018 °F |
| Water's boiling point at 1 atm (101.325 kPa) (approximate: see Boiling point) | 373.1339 K | 99.9839 °C | 211.9710 °F |
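The fixed relationships in the table can be checked with straightforward arithmetic; below is a minimal sketch in Python (the function names are illustrative, not from any standard library):

```python
def kelvin_to_celsius(k: float) -> float:
    """Celsius and kelvin differ only by a fixed offset of 273.15."""
    return k - 273.15

def celsius_to_fahrenheit(c: float) -> float:
    """One degree Celsius spans 9/5 of a degree Fahrenheit; 0 deg C = 32 deg F."""
    return c * 9 / 5 + 32

# Reproduce the table rows:
print(celsius_to_fahrenheit(kelvin_to_celsius(0)))       # absolute zero: -459.67 deg F
print(celsius_to_fahrenheit(kelvin_to_celsius(273.16)))  # triple point: about 32.018 deg F
```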

## History

In 1742 the Swedish astronomer Anders Celsius (1701–1744) created a "reversed" version of the modern Celsius temperature scale, whereby zero represented the boiling point of water and one hundred represented the freezing point of water. In his paper Observations of two persistent degrees on a thermometer, he recounted his experiments showing that the melting point of ice is effectively unaffected by pressure. He also determined with remarkable precision how water's boiling point varied as a function of atmospheric pressure. He proposed that zero on his temperature scale (water's boiling point) would be calibrated at the mean barometric pressure at mean sea level, a pressure known as one standard atmosphere. (The BIPM's 10th CGPM later defined one standard atmosphere to equal precisely 1,013,250 dynes per cm² (101.325 kPa).)

Two years later, in 1744, coincident with the death of Anders Celsius, the famous Swedish botanist Carolus Linnaeus (1707–1778) effectively reversed Celsius's scale upon receipt of his first thermometer, which featured a scale where zero represented the melting point of ice and 100 represented water's boiling point. His custom-made "Linnaeus thermometer", for use in his greenhouses, was made by Daniel Ekström, Sweden's leading maker of scientific instruments at the time, whose workshop was located in the basement of the Stockholm observatory. As often happened in this age before modern communications, numerous physicists, scientists, and instrument makers are credited with having independently developed this same scale; among them were Pehr Elvius, the secretary of the Royal Swedish Academy of Sciences (which had an instrument workshop), with whom Linnaeus had been corresponding; Jean-Pierre Christin of Lyons; Daniel Ekström, the instrument maker; and Mårten Strömer (1707–1770), who had studied astronomy under Anders Celsius.

The first known document reporting temperatures in this modern "forward" Celsius scale is the paper Hortus Upsaliensis dated 16 December 1745 that Linnaeus wrote to a student of his, Samuel Nauclér. In it, Linnaeus recounted the temperatures inside the orangery at the Botanical Garden of Uppsala University:

"... since the caldarium (the hot part of the greenhouse) by the angle of the windows, merely from the rays of the sun, obtains such heat that the thermometer often reaches 30 degrees, although the keen gardener usually takes care not to let it rise to more than 20 to 25 degrees, and in winter not under 15 degrees ..."

For the next 204 years, the scientific and thermometry communities worldwide referred to this scale as the "centigrade scale". Temperatures on the centigrade scale were often reported simply as "degrees" or, when greater specificity was desired, "degrees centigrade". The symbol for temperature values on this scale was °C (in several formats over the years).

Because the term "centigrade" was also the Spanish and French language name for a unit of angular measurement (1/10000 of a right angle) and had a similar connotation in other languages, the term "centesimal degree" was used when very precise, unambiguous language was required by international standards bodies such as the Bureau international des poids et mesures (BIPM). The 9th CGPM (Conférence générale des poids et mesures) and the CIPM (Comité international des poids et mesures) formally adopted "degree Celsius" (symbol: °C) in 1948. For lay-people worldwide — including school textbooks — the full transition from centigrade to Celsius is far from complete, with centigrade still the commonly used term in many communities.

In modern days the word "degrees" is often omitted: for example, on the BBC weather, the forecaster may read a temperature as "30 Celsius" instead of "30 degrees Celsius". In some places where the Celsius scale is the main standard, the word "Celsius" is omitted leaving just "30 degrees".

## Formatting

The "degree Celsius" is the only SI unit whose full unit name contains an uppercase letter.

The following are permissible ways to express degree Celsius (singular / plural):

• degree Celsius / degrees Celsius
• °C

The general rule is that the numerical value always precedes the unit, and a space is always used to separate the unit from the number, e.g., "23 °C" (not "23°C" nor "23° C"). Thus the value of the quantity is the product of the number and the unit, the space being regarded as a multiplication sign (just as a space between units implies multiplication). The only exceptions to this rule are for the unit symbols for degree, minute, and second for plane angle, °, ′, and ″, respectively, for which no space is left between the numerical value and the unit symbol.
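The spacing rule can be encoded in a small helper; the following is a sketch (the function name is hypothetical, not from any standard library):

```python
def format_celsius(value: float) -> str:
    """Format a Celsius temperature per the SI rule: number, space, unit symbol."""
    return f"{value:g} °C"

print(format_celsius(23))  # "23 °C", not "23°C" or "23° C"
```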

## The special Unicode °C character

Unicode provides a compatibility character for the degree Celsius at U+2103 (decimal 8451), for compatibility with CJK encodings that provide such a character (as such, in most fonts the width is the same as for fullwidth characters). Its appearance is similar to the one synthesized by individually typing its two components (°) and (C). Shown below is the degree Celsius character followed immediately by the two-component version:

℃ °C

When viewed on computers that properly support Unicode, U+2103 renders much like the two-character sequence (size and font may vary). Its decomposition is simply an ordinary degree sign followed by "C" (a compatibility decomposition, not a canonical one), so some browsers may simply display "°C" in its place after Unicode normalization.
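This behavior can be observed with Python's standard unicodedata module; a quick sketch:

```python
import unicodedata

sign = "\u2103"       # DEGREE CELSIUS, a single character
composed = "\u00b0C"  # degree sign followed by an ordinary capital C

print(len(sign), len(composed))    # 1 2
print(unicodedata.name(sign))      # DEGREE CELSIUS
# Compatibility normalization (NFKC/NFKD) replaces U+2103 with the pair:
print(unicodedata.normalize("NFKC", sign) == composed)  # True
# Canonical normalization (NFC/NFD) leaves the single character alone:
print(unicodedata.normalize("NFC", sign) == sign)       # True
```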

## Temperatures and intervals

The degree Celsius is a special name for the kelvin for use in expressing Celsius temperatures, and it is subject to the same rules as the kelvin with regard to the use of its unit name and symbol. Thus, besides expressing specific temperatures along its scale (e.g. "Gallium melts at 29.7646 °C" and "The temperature outside is 23 degrees Celsius"), the degree Celsius is also suitable for expressing temperature intervals: differences between temperatures or their uncertainties (e.g. "The output of the heat exchanger is hotter by 40 degrees Celsius" and "Our standard uncertainty is ±3 °C"). Because of this dual usage, one must not rely upon the unit name or its symbol to denote that a quantity is a temperature interval; it must be unambiguous through context or explicit statement that the quantity is an interval. Automated systems such as Google Calculator can therefore give unexpected results when temperatures are added: the sum of 1 °C and 1 °C is reported as 275.15 °C, because both values are first converted to kelvins.

A common point of confusion is that the Celsius scale is an interval scale but not a ratio scale. While the interval between 10 °C and 20 °C equals the interval between 20 °C and 30 °C, air at 20 °C does not have twice the thermal energy of air at 10 °C. As this example shows, the degree Celsius is a useful interval measure but does not possess the characteristics of ratio measures such as weight or distance.
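The calculator behavior described above is easy to reproduce; the sketch below contrasts it with the interval interpretation that is usually intended (function names are illustrative):

```python
ZERO_CELSIUS_IN_K = 273.15

def add_temperatures_as_absolute(c1: float, c2: float) -> float:
    """Convert each Celsius temperature to kelvins, sum, and convert back."""
    return (c1 + ZERO_CELSIUS_IN_K) + (c2 + ZERO_CELSIUS_IN_K) - ZERO_CELSIUS_IN_K

def shift_by_interval(c: float, delta: float) -> float:
    """A temperature plus an interval: intervals in deg C and K are identical."""
    return c + delta

print(add_temperatures_as_absolute(1, 1))  # about 275.15, usually not what was meant
print(shift_by_interval(1, 1))             # 2
```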

## Why technical articles use a mix of Kelvin and Celsius scales

In science (especially) and in engineering, the Celsius scale and the kelvin are often used simultaneously in the same article (e.g. "…its measured value was 0.01023 °C with an uncertainty of 70 µK…"). This practice is permissible because 1) the degree Celsius is a special name for the kelvin for use in expressing Celsius temperatures, and 2) the magnitude of the degree Celsius is precisely equal to that of the kelvin. Notwithstanding the official endorsement provided by decision #3 of Resolution 3 of the 13th CGPM, which stated "a temperature interval may also be expressed in degrees Celsius," the practice of simultaneously using both "°C" and "K" remains widespread throughout the scientific world, because SI-prefixed forms of the degree Celsius (such as "µ°C" or "microdegrees Celsius") for expressing temperature intervals have never been well adopted.

This practice should be avoided for literature directed to lower-level technical fields and in non-technical articles intended for the general public where both the kelvin and its symbol, K, are not well recognized and could be confusing.

## The melting and boiling points of water

One effect of defining the Celsius scale at the triple point of Vienna Standard Mean Ocean Water (273.16 kelvins and 0.01 °C), and at absolute zero (zero kelvins and −273.15 °C), is that neither the melting nor the boiling point of water under one standard atmosphere (101.325 kPa) remain defining points for the Celsius scale. In 1948, when the 9th General Conference on Weights and Measures (CGPM) in Resolution 3 first considered using the triple point of water as a defining point, the triple point was so close to being 0.01 °C greater than water's known melting point that it was simply defined as precisely 0.01 °C. However, current measurements show that the triple and melting points of Vienna Standard Mean Ocean Water (VSMOW) are actually very slightly (<0.001 °C) greater than 0.01 °C apart. Thus, the actual melting point of ice is very slightly (less than a thousandth of a degree) below 0 °C.

Also, defining water's triple point at 273.16 K precisely defined the magnitude of each 1 °C increment in terms of the absolute thermodynamic temperature scale (referencing absolute zero). Now decoupled from the actual boiling point of water, the value "100 °C" is hotter than 0 °C — in absolute terms — by a factor of precisely 373.15/273.15 (approximately 36.61% thermodynamically hotter). When adhering strictly to the two-point definition for calibration, the boiling point of VSMOW under one standard atmosphere of pressure is actually 373.1339 K (99.9839 °C). When calibrated to ITS-90 (a calibration standard comprising many definition points and commonly used for high-precision instrumentation), the boiling point of VSMOW is slightly less, about 99.974 °C.
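The thermodynamic ratio quoted above is easy to verify; a quick check in Python:

```python
# Absolute (thermodynamic) temperatures of the two Celsius reference points:
boiling_k = 100 + 273.15   # 373.15 K
freezing_k = 0 + 273.15    # 273.15 K

ratio = boiling_k / freezing_k
print(ratio)  # about 1.3661, i.e. roughly 36.61% hotter in absolute terms
```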

This boiling-point difference of 16.1 millikelvins (thousandths of a degree Celsius) between the Celsius scale's original definition and the current one (based on absolute zero and the triple point) has little practical meaning in real life because water's boiling point is extremely sensitive to variations in barometric pressure. For example, an altitude change of only 28 cm (11 inches) causes water's boiling point to change by one millikelvin.