Celsius

From Wikipedia, the free encyclopedia

Celsius temperature conversion formulas

  To find      From         Formula
  Fahrenheit   Celsius      °F = (°C × 1.8) + 32
  Celsius      Fahrenheit   °C = (°F − 32) / 1.8
  Kelvin       Celsius      K = °C + 273.15
  Celsius      Kelvin       °C = K − 273.15
  Rankine      Celsius      °R = (°C + 273.15) × 1.8
  Celsius      Rankine      °C = (°R / 1.8) − 273.15

For temperature intervals rather than specific temperatures,

  1 °C = 1 K   and   1 °C = 1.8 °F
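The conversion formulas above can be sketched directly in Python (the function names are illustrative, not from any standard library):

```python
def celsius_to_fahrenheit(c):
    """°F = (°C × 1.8) + 32"""
    return c * 1.8 + 32

def fahrenheit_to_celsius(f):
    """°C = (°F − 32) / 1.8"""
    return (f - 32) / 1.8

def celsius_to_kelvin(c):
    """K = °C + 273.15"""
    return c + 273.15

def kelvin_to_celsius(k):
    """°C = K − 273.15"""
    return k - 273.15

def celsius_to_rankine(c):
    """°R = (°C + 273.15) × 1.8"""
    return (c + 273.15) * 1.8

def rankine_to_celsius(r):
    """°C = (°R / 1.8) − 273.15"""
    return r / 1.8 - 273.15
```

Note that for intervals (as opposed to specific temperatures) no offset applies: a difference of 1 °C equals a difference of 1 K and of 1.8 °F.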
Comparisons among various temperature scales

Celsius is, or relates to, the Celsius temperature scale (previously known as the centigrade scale). The degree Celsius (symbol: °C) can refer to a specific temperature on the Celsius scale as well as serve as a unit increment to indicate a temperature interval (a difference between two temperatures or an uncertainty). “Celsius” is named after the Swedish astronomer Anders Celsius (1701 – 1744), who developed a similar temperature scale two years before his death.

Until 1954, 0 °C on the Celsius scale was defined as the melting point of ice and 100 °C was defined as the boiling point of water under a pressure of one standard atmosphere; this close equivalency is taught in schools today. However, the unit “degree Celsius” and the Celsius scale are currently, by international agreement, defined by two different points: absolute zero, and the triple point of specially prepared water. This definition also precisely relates the Celsius scale to the Kelvin scale, which is the SI base unit of temperature (symbol: K). Absolute zero—the temperature at which nothing could be colder and no heat energy remains in a substance—is defined as being precisely 0 K and −273.15 °C. The triple point of water is defined as being precisely 273.16 K and 0.01 °C.

This definition fixes the magnitude of both the degree Celsius and the unit kelvin as being precisely 1 part in 273.16 parts the difference between absolute zero and the triple point of water. Thus, it sets the magnitude of one degree Celsius and the kelvin to be exactly equivalent. Additionally, it establishes the difference between the two scales’ null points as being precisely 273.15 degrees Celsius (−273.15 °C = 0 K and 0.01 °C = 273.16 K).

Some key temperatures relating the Celsius scale to other temperature scales are shown in the table below.

                                                    Kelvin       Celsius      Fahrenheit
  Absolute zero (precisely, by definition)          0 K          −273.15 °C   −459.67 °F
  Melting point of ice (approximate)[1]             273.15 K     0 °C         32 °F
  Water’s triple point (precisely, by definition)   273.16 K     0.01 °C      32.018 °F
  Water’s boiling point (approximate)[2]            373.1339 K   99.9839 °C   211.9710 °F

Contents

  • 1 History
  • 2 Formatting
  • 3 Temperatures and intervals
  • 4 Why technical articles use a mix of Kelvin and Celsius scales
  • 5 The melting and boiling points of water
  • 6 World-wide adoption
  • 7 The special Unicode °C character
  • 8 See also
  • 9 Notes
  • 10 External links

History

An illustration of Anders Celsius's original thermometer. Note the reversed scale, where 0 is the boiling point of water and 100 is its freezing point.

In 1742, Anders Celsius (1701 – 1744) created a "reversed" version of the modern Celsius temperature scale whereby zero represented the boiling point of water and 100 represented the melting point of ice. In his paper Observations of two persistent degrees on a thermometer, he recounted his experiments showing that ice’s melting point was effectively unaffected by pressure. He also determined with remarkable precision how water’s boiling point varied as a function of atmospheric pressure. He proposed that zero on his temperature scale (water’s boiling point) would be calibrated at the mean barometric pressure at mean sea level. This pressure is known as one standard atmosphere. In 1954, Resolution 4 of the 10th CGPM (the General Conference on Weights and Measures) established internationally that one standard atmosphere was a pressure equivalent to 1,013,250 dynes per cm² (101.325 kPa).

In 1744, coincident with the death of Anders Celsius, the famous Swedish botanist Carolus Linnaeus (1707 – 1778) effectively reversed [3] Celsius’s scale upon receipt of his first thermometer featuring a scale where zero represented the melting point of ice and 100 represented water’s boiling point. His custom-made “Linnaeus-thermometer,” for use in his greenhouses, was made by Daniel Ekström, Sweden’s leading maker of scientific instruments at the time, whose workshop was located in the basement of the Stockholm observatory. As often happened in this age before modern communications, numerous physicists, scientists, and instrument makers are credited with having independently developed this same scale;[4] among them were Pehr Elvius, the secretary of the Royal Swedish Academy of Sciences (which had an instrument workshop) and with whom Linnaeus had been corresponding; Christin of Lyons; Daniel Ekström, the instrument maker; and Mårten Strömer (1707 – 1770), who had studied astronomy under Anders Celsius.

The first known document[5] reporting temperatures in this modern “forward” Celsius scale is the paper Hortus Upsaliensis dated 16 December 1745 that Linnaeus wrote to a student of his, Samuel Nauclér. In it, Linnaeus recounted the temperatures inside the orangery at the Botanical Garden of Uppsala University:

“…since the caldarium (the hot part of the greenhouse) by the angle
of the windows, merely from the rays of the sun, obtains such heat
that the thermometer often reaches 30 degrees, although the keen
gardener usually takes care not to let it rise to more than 20 to 25
degrees, and in winter not under 15 degrees…”

For the next 204 years, the scientific and thermometry communities world-wide referred to this scale as the “centigrade scale.” Temperatures on the centigrade scale were often reported simply as “degrees” or, when greater specificity was desired, “degrees centigrade.” The symbol for temperature values on this scale was °C (in several formats over the years). Because the term “centigrade” was also the Spanish and French language name for a unit of angular measurement (one-hundredth of a right angle) and had a similar connotation in other languages, the term “centesimal degree” was used when very precise, unambiguous language was required by international standards bodies such as the Bureau international des poids et mesures (BIPM). The 9th CGPM (Conférence générale des poids et mesures) and the CIPM (Comité international des poids et mesures) formally adopted “degree Celsius” (symbol: °C) in 1948.[6] For lay-people worldwide — including school textbooks — the full transition from centigrade to Celsius required nearly two decades after this formal adoption.

In modern days the word "degrees" is often omitted: for example, on the BBC weather, the forecaster may read a temperature as "30 Celsius" instead of "30 degrees Celsius".

Formatting

The “degree Celsius” is the only SI unit whose full unit name contains an uppercase letter.

The permissible ways to express the unit name are “degree Celsius” (singular) and “degrees Celsius” (plural).

The general rule is that the numerical value always precedes the unit, and a space is always used to separate the unit from the number, e.g., “23 °C” (not “23°C” or “23° C”). Thus the value of the quantity is the product of the number and the unit, the space being regarded as a multiplication sign (just as a space between units implies multiplication). The only exceptions to this rule are for the unit symbols for degree, minute, and second for plane angle, °, ′, and ″, respectively, for which no space is left between the numerical value and the unit symbol.[7]
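The spacing rule can be illustrated with a small helper (a hypothetical function written for this article, not part of any standard library):

```python
def format_temperature(value, unit="°C"):
    # SI style: a space always separates the number from the unit symbol,
    # e.g. "23 °C", never "23°C" or "23° C".
    return f"{value:g} {unit}"

def format_plane_angle(degrees):
    # The exception: for plane-angle degrees (and minutes/seconds),
    # no space is left before the symbol, e.g. "45°".
    return f"{degrees:g}°"
```

For example, `format_temperature(23)` yields "23 °C", while `format_plane_angle(45)` yields "45°".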

Temperatures and intervals

The degree Celsius is a special name for the kelvin for use in expressing Celsius temperatures.[8] The degree Celsius is also subject to the same rules as the kelvin with regard to the use of its unit name and symbol. Thus, besides expressing specific temperatures along its scale (e.g. “Gallium melts at 29.7646 °C” and “The temperature outside is 23 degrees Celsius”), the degree Celsius is also suitable for expressing temperature intervals: differences between temperatures or their uncertainties (e.g. “The output of the heat exchanger is hotter by 40 degrees Celsius,” and “Our standard uncertainty is ±3 °C”).[9] Because of this dual usage, one must not rely upon the unit name or its symbol to denote that a quantity is a temperature interval; it must be unambiguous through context or explicit statement that the quantity is an interval.[10]
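Because the degree Celsius and the kelvin have identical magnitudes, a temperature interval comes out the same on either scale even though the specific temperatures differ; a quick sketch (the example temperatures are arbitrary):

```python
t1_c, t2_c = 23.0, 63.0  # two Celsius temperatures (arbitrary example values)

interval_c = t2_c - t1_c                        # interval in degrees Celsius
interval_k = (t2_c + 273.15) - (t1_c + 273.15)  # the same interval in kelvins

# The interval is 40 on either scale, but the specific temperature 63 °C
# corresponds to 336.15 K, not 63 K -- only context (or an explicit
# statement) can say whether a quantity is a temperature or an interval.
```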

Why technical articles use a mix of Kelvin and Celsius scales

In science (especially) and in engineering, the Celsius scale and the kelvin are often used simultaneously in the same article (e.g. “…its measured value was 0.01023 °C with an uncertainty of 70 µK…”). This practice is permissible because 1) the degree Celsius is a special name for the kelvin for use in expressing Celsius temperatures, and 2) the magnitude of the degree Celsius is precisely equal to that of the kelvin. Notwithstanding the official endorsement provided by decision #3 of Resolution 3 of the 13th CGPM, which stated “a temperature interval may also be expressed in degrees Celsius,” the practice of simultaneously using both “°C” and “K” remains widespread throughout the scientific world as the use of SI prefixed forms of the degree Celsius (such as “µ°C” or “millidegrees Celsius”) to express a temperature interval has not been well-adopted.

This practice should be avoided in literature directed at lower-level technical fields and in non-technical articles intended for the general public, where the kelvin and its symbol, K, are not well recognized and could be confusing.

The melting and boiling points of water

One effect of defining the Celsius scale at the triple point of Vienna Standard Mean Ocean Water (273.16 kelvins and 0.01 °C), and at absolute zero (zero kelvins and −273.15 °C), is that neither the melting nor the boiling point of water under one standard atmosphere (101.325 kPa) remain defining points for the Celsius scale. In 1948 when the 9th General Conference on Weights and Measures (CGPM) in Resolution 3 first considered using the triple point of water as a defining point, the triple point was so close to being 0.01 °C greater than water’s known melting point, it was simply defined as precisely 0.01 °C. However, current measurements show that the triple and melting points of Vienna Standard Mean Ocean Water (VSMOW) are actually very slightly (<0.001 °C) greater than 0.01 °C apart. Thus, the actual melting point of ice is very slightly (less than a thousandth of a degree) below 0 °C. Also, defining water’s triple point at 273.16 K precisely defined the magnitude of each 1 °C increment in terms of the absolute thermodynamic temperature scale (referencing absolute zero). Now decoupled from the actual boiling point of water, the value “100 °C” is hotter than 0 °C — in absolute terms — by a factor of precisely 373.15/273.15 (approximately 36.61% thermodynamically hotter). When adhering strictly to the two-point definition for calibration, the boiling point of VSMOW under one standard atmosphere of pressure is actually 373.1339 K (99.9839 °C). When calibrated to ITS-90 (a calibration standard comprising many definition points and commonly used for high-precision instrumentation), the boiling point of VSMOW is slightly less, about 99.974 °C.[11]
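The thermodynamic ratio quoted above follows from converting both Celsius temperatures to absolute (kelvin) temperatures before dividing; it can be checked directly:

```python
# Convert 100 °C and 0 °C to absolute temperatures, then compare.
ratio = 373.15 / 273.15          # (100 °C in kelvins) / (0 °C in kelvins)
percent_hotter = (ratio - 1) * 100

print(f"{percent_hotter:.2f}%")  # → 36.61%
```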

This boiling–point difference of 16.1 millikelvins (thousandths of a degree Celsius) between the Celsius scale’s original definition and the current one (based on absolute zero and the triple point) has little practical meaning in real life because water’s boiling point is extremely sensitive to variations in barometric pressure. For example, an altitude change of only 28 cm (11 inches) causes water’s boiling point to change by one millikelvin.

World-wide adoption

Throughout the world, except in the U.S. and perhaps a few other countries (for example, Belize [12]), the Celsius temperature scale is used for practically all purposes. The only exceptions are some specialist fields (e.g., low-temperature physics, astrophysics, light temperature in photography) where the closely related Kelvin scale dominates instead. Even in the U.S., almost the entire scientific world and most engineering fields, especially high-tech ones, use the Celsius scale. The general U.S. population (not considering foreign immigrants), however, remains more accustomed to the Fahrenheit scale, which is therefore the scale that most U.S. broadcasters use in weather forecasts. The Fahrenheit scale is also commonly used in the U.S. for body temperatures. The United Kingdom has almost exclusively used the Celsius scale since the 1970s, with the notable exception that some broadcasters and publications still quote Fahrenheit air temperatures occasionally in weather forecasts, for the benefit of generations born before about 1950, and air-temperature thermometers sold still show both scales for the same reason.

The special Unicode °C character

Unicode includes a special “°C” character at U+2103 (decimal value 8451) for compatibility with CJK encodings that provide such a character (as such, in most fonts the width is the same as for fullwidth characters). One types &#x2103; (or &#8451;) when encoding this special character in a Web page. Its appearance is similar to the one synthesized by individually typing its two components (°) and (C). To better see the difference between the two, shown below is the degree Celsius character followed immediately by the two-component version:
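The distinction between the dedicated character and the two-component sequence can be demonstrated in Python:

```python
single = "\u2103"     # the dedicated DEGREE CELSIUS character (U+2103)
composed = "\u00b0C"  # DEGREE SIGN (U+00B0) followed by a plain capital C

# Visually similar, but different code points and different lengths:
assert ord(single) == 8451   # the decimal value noted above
assert single != composed
assert len(single) == 1 and len(composed) == 2
```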

℃°C

When viewed on a computer that properly supports and maps Unicode, the two characters on the line above may render slightly differently:

Depending on the operating system, web browser, and the default font, the “C” in the Unicode character may be narrower and slightly taller than a plain uppercase C; precisely the opposite may be true on other platforms. However, there will usually be a discernible difference between the two.