A thermometer is a device that measures temperature or a temperature gradient (the degree of hotness or coldness of an object). A thermometer has two important elements: (1) a temperature sensor (e.g. the bulb of a mercury-in-glass thermometer or the pyrometric sensor in an infrared thermometer) in which some change occurs with a change in temperature; and (2) some means of converting this change into a numerical value (e.g. the visible scale that is marked on a mercury-in-glass thermometer or the digital readout on an infrared model). Thermometers are widely used in technology and industry to monitor processes, in meteorology, in medicine, and in scientific research.
See also: Scale of temperature
While an individual thermometer is able to measure degrees of hotness, the readings on two thermometers cannot be compared unless they conform to an agreed scale. Today there is an absolute thermodynamic temperature scale. Internationally agreed temperature scales are designed to approximate this closely, based on fixed points and interpolating thermometers. The most recent official temperature scale is the International Temperature Scale of 1990. It extends from 0.65 K (−272.5 °C; −458.5 °F) to approximately 1,358 K (1,085 °C; 1,985 °F).
The thermometer was not a single invention, but a development. Various authors have, however, credited the invention of the thermometer to Hero of Alexandria (10–70 AD). Hero knew of the principle that certain substances, notably air, expand and contract, and he described a demonstration in which a closed tube partially filled with air had its end in a container of water. The expansion and contraction of the air caused the position of the water/air interface to move along the tube.
Temperature scales were invented before thermometers. The Greco-Roman physician Galen may have conceived of the first temperature scale with a fixed point. This fixed, "neutral" point was the temperature of a mixture of equal amounts of ice and boiling water. On either side of it lay four degrees of hotness and four degrees of coldness, respectively.
Main article: Thermoscope
In the 16th and 17th centuries, several European scientists, notably Galileo Galilei and the Italian physiologist Santorio Santorio, developed devices with an air-filled glass bulb connected to a tube partly filled with water, in which the water level was controlled by the expansion and contraction of the air in the bulb. These devices reliably showed the hotness and coldness of the air in the bulb, and of the matter surrounding the bulb. The term thermoscope was adopted because the movement of the water level reflected changes in sensible heat (the modern concept of temperature had yet to arise).
The difference between a thermoscope and a thermometer is that the latter has a scale.
A thermometer is simply a thermoscope with a scale. ... I propose to regard it as axiomatic that a “meter” must have a scale or something equivalent. ... If this is admitted, the problem of the invention of the thermometer becomes more straightforward; that of the invention of the thermoscope remains as obscure as ever.— W. E. Knowles Middleton, A history of the thermometer and its use in meteorology
Given this, the possible inventors of the thermometer are usually considered to be Galileo, Santorio, the Dutch inventor Cornelis Drebbel, or the English physician Robert Fludd. Though Galileo is often said to be the inventor of the thermometer, no surviving document shows that he actually produced any such instrument.
The first clear diagram of a thermoscope was published in 1617 by Giuseppe Biancani (1566–1624); the first showing a scale, and thus constituting a thermometer, was by Santorio Santorio in 1625. This was a vertical tube, closed by a bulb of air at the top, with the lower end opening into a vessel of water. The water level in the tube was controlled by the expansion and contraction of the air, so it was what we would now call an air thermometer.
The word thermometer (in its French form) first appeared in 1624 in La Récréation Mathématique by Jean Leurechon, who describes one with a scale of 8 degrees. The word comes from the Greek words θερμός, thermos, meaning "hot" and μέτρον, metron, meaning "measure".
See also: Alcohol thermometer
The above instruments suffered from the disadvantage that they were also barometers, i.e. sensitive to air pressure. In 1629, Joseph Solomon Delmedigo, a student of Galileo and Santorio in Padua, published what is apparently the first description and illustration of a sealed liquid-in-glass thermometer. It is described as having a bulb at the bottom of a sealed tube partially filled with brandy. The tube had a numbered scale. Delmedigo did not claim to have invented this instrument. Nor did he name anyone else as its inventor. In about 1654, Ferdinando II de' Medici, Grand Duke of Tuscany (1610–1670) did produce such an instrument, the first modern-style thermometer, dependent on the expansion of a liquid and independent of air pressure. Many other scientists experimented with various liquids and designs of thermometer. However, each inventor and each thermometer was unique — there was no standard scale.
Early attempts at standardization added a single reference point such as the freezing point of water. The use of two references for graduating the thermometer is said to have been introduced by Joachim Dalence in 1668, although Christiaan Huygens (1629–1695) in 1665 had already suggested the use of graduations based on the melting and boiling points of water as standards and, in 1694, Carlo Renaldini (1615–1698) proposed using them as fixed points along a universal scale. In 1701, Isaac Newton (1642–1726/27) proposed a scale of 12 degrees between the melting point of ice and body temperature.
In 1714, scientist and inventor Daniel Gabriel Fahrenheit invented a reliable thermometer, using mercury instead of alcohol and water mixtures. In 1724, he proposed a temperature scale which now (slightly adjusted) bears his name. In 1742, Anders Celsius (1701–1744) proposed a scale with zero at the boiling point and 100 degrees at the freezing point of water, though the scale which now bears his name has them the other way around. In 1730, the French entomologist René Antoine Ferchault de Réaumur invented an alcohol thermometer and an accompanying temperature scale, both of which ultimately proved less reliable than Fahrenheit's mercury thermometer.
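The relations between these historical scales are linear, so conversions are straightforward. A minimal sketch (Python is used here purely for illustration):

```python
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

def c_to_re(c):
    """Convert degrees Celsius to degrees Reaumur
    (80 degrees between the ice and steam points)."""
    return c * 4 / 5

print(c_to_f(0))    # 32.0  (freezing point of water)
print(c_to_f(100))  # 212.0 (boiling point of water)
print(c_to_re(100)) # 80.0
```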
The first physician to use thermometer measurements in clinical practice was Herman Boerhaave (1668–1738). In 1866, Sir Thomas Clifford Allbutt (1836–1925) invented a clinical thermometer that produced a body temperature reading in five minutes as opposed to twenty. In 1999, Dr. Francesco Pompei of the Exergen Corporation introduced the world's first temporal artery thermometer, a non-invasive temperature sensor which scans the forehead in about two seconds and provides a medically accurate body temperature.
Traditional thermometers were all non-registering thermometers. That is, the thermometer did not hold the temperature reading after it was moved to a place with a different temperature. Determining the temperature of a pot of hot liquid required the user to leave the thermometer in the hot liquid until after reading it. If the non-registering thermometer was removed from the hot liquid, then the temperature indicated on the thermometer would immediately begin changing to reflect the temperature of its new conditions (in this case, the air temperature). Registering thermometers are designed to hold the temperature indefinitely, so that the thermometer can be removed and read at a later time or in a more convenient place. Mechanical registering thermometers hold either the highest or lowest temperature recorded until manually re-set, e.g., by shaking down a mercury-in-glass thermometer, or until an even more extreme temperature is experienced. Electronic registering thermometers may be designed to remember the highest or lowest temperature, or to remember whatever temperature was present at a specified point in time.
Thermometers increasingly use electronic means to provide a digital display or input to a computer.
Thermometers may be described as empirical or absolute. Absolute thermometers are calibrated numerically by the thermodynamic absolute temperature scale. Empirical thermometers are not in general necessarily in exact agreement with absolute thermometers as to their numerical scale readings, but to qualify as thermometers at all they must agree with absolute thermometers and with each other in the following way: given any two bodies isolated in their separate respective thermodynamic equilibrium states, all thermometers agree as to which of the two has the higher temperature, or that the two have equal temperatures. For any two empirical thermometers, this does not require that the relation between their numerical scale readings be linear, but it does require that relation to be strictly monotonic. This is a fundamental character of temperature and thermometers.
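The requirement that the relation between two empirical thermometers' scale readings be strictly monotonic can be illustrated with a minimal sketch, which checks paired simultaneous readings from two hypothetical instruments:

```python
def strictly_monotonic(pairs):
    """Check whether thermometer B's readings are a strictly increasing
    function of thermometer A's readings, given pairs of simultaneous
    readings (a, b)."""
    pairs = sorted(pairs)  # order by thermometer A's reading
    return all(a1 < a2 and b1 < b2
               for (a1, b1), (a2, b2) in zip(pairs, pairs[1:]))

# Hypothetical mercury-in-glass vs. platinum-resistance readings:
# a nonlinear but strictly monotonic relation, as required.
readings = [(0.0, 0.0), (25.0, 24.8), (50.0, 49.7), (75.0, 74.9), (100.0, 100.0)]
print(strictly_monotonic(readings))  # True
```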
As it is customarily stated in textbooks, taken alone, the so-called "zeroth law of thermodynamics" fails to deliver this information, but the statement of the zeroth law of thermodynamics by James Serrin in 1977, though rather mathematically abstract, is more informative for thermometry: "Zeroth Law – There exists a topological line which serves as a coordinate manifold of material behaviour. The points of the manifold are called 'hotness levels', and the manifold itself is called the 'universal hotness manifold'." To this information there needs to be added a sense of greater hotness; this sense can be had, independently of calorimetry, of thermodynamics, and of properties of particular materials, from Wien's displacement law of thermal radiation: the temperature of a bath of thermal radiation is proportional, by a universal constant, to the frequency of the maximum of its frequency spectrum; this frequency is always positive, but can have values that tend to zero. Another way of identifying hotter as opposed to colder conditions is supplied by Planck's principle: when a process of isochoric adiabatic work is the sole means of change of internal energy of a closed system, the final state of the system is never colder than the initial state; except for phase changes with latent heat, it is hotter than the initial state.
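In its frequency form, Wien's displacement law reads ν_max = (α k_B / h) T, with α ≈ 2.821 the root of 3(1 − e^(−x)) = x, so a measured peak frequency fixes the temperature. A short illustrative sketch using the standard CODATA constant values:

```python
K_B = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J s
ALPHA = 2.821439     # root of 3(1 - e^{-x}) = x

def temperature_from_peak(nu_max_hz):
    """Infer a blackbody temperature from the frequency (Hz) at which
    its spectral radiance peaks, via Wien's displacement law."""
    return nu_max_hz * H / (ALPHA * K_B)

# The cosmic microwave background peaks near 160.2 GHz:
print(round(temperature_from_peak(160.2e9), 2))  # 2.72 (kelvin)
```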
There are several principles on which empirical thermometers are built, as listed in the section of this article entitled "Primary and secondary thermometers". Several such principles are essentially based on the constitutive relation between the state of a suitably selected particular material and its temperature. Only some materials are suitable for this purpose, and they may be considered "thermometric materials". Radiometric thermometry, in contrast, can be only slightly dependent on the constitutive relations of materials. In a sense then, radiometric thermometry might be thought of as "universal", because it rests mainly on a universal character of thermodynamic equilibrium: the property of producing blackbody radiation.
There are various kinds of empirical thermometer based on material properties.
Many empirical thermometers rely on the constitutive relation between pressure, volume and temperature of their thermometric material. For example, mercury expands when heated.
If it is used for its relation between pressure and volume and temperature, a thermometric material must have three properties:
(1) Its heating and cooling must be rapid. That is to say, when a quantity of heat enters or leaves a body of the material, the material must expand or contract to its final volume or reach its final pressure and must reach its final temperature with practically no delay; some of the heat that enters can be considered to change the volume of the body at constant temperature, and is called the latent heat of expansion at constant temperature; and the rest of it can be considered to change the temperature of the body at constant volume, and is called the specific heat at constant volume. Some materials do not have this property, and take some time to distribute the heat between temperature and volume change.
(2) Its heating and cooling must be reversible. That is to say, the material must be able to be heated and cooled indefinitely often by the same increment and decrement of heat, and still return to its original pressure, volume and temperature every time. Some plastics do not have this property.
(3) Its heating and cooling must be monotonic. That is to say, throughout the range of temperatures for which it is intended to work,
(a) at a given fixed pressure, either
(α) the volume increases when the temperature increases, or
(β) the volume decreases when the temperature increases,
but not (α) for some temperatures and (β) for others; and
(b) at a given fixed volume, either
(α) the pressure increases when the temperature increases, or
(β) the pressure decreases when the temperature increases,
but not (α) for some temperatures and (β) for others.
At temperatures near 4 °C, water does not have property (3) and is said to behave anomalously in this respect; thus water cannot be used as a material for this kind of thermometry in temperature ranges near 4 °C.
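The anomaly can be illustrated with a deliberately simplified model (not a precise equation of state) in which water's specific volume is a parabola with its minimum at 4 °C. The same volume reading then corresponds to two different temperatures, so volume cannot serve as a thermometric property in this range:

```python
def specific_volume(t_celsius):
    """Toy quadratic model of water's specific volume near 4 C
    (arbitrary units; the coefficient is illustrative only)."""
    return 1.0 + 8e-6 * (t_celsius - 4.0) ** 2

# Two distinct temperatures give the same volume reading:
print(specific_volume(2.0) == specific_volume(6.0))  # True
```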
Gases, on the other hand, all have the properties (1), (2), and (3)(a)(α) and (3)(b)(α). Consequently, they are suitable thermometric materials, and that is why they were important in the development of thermometry.
According to Preston (1894/1904), Regnault found constant pressure air thermometers unsatisfactory, because they needed troublesome corrections. He therefore built a constant volume air thermometer. Constant volume thermometers do not provide a way to avoid the problem of anomalous behaviour like that of water at approximately 4 °C.
Planck's law very accurately quantitatively describes the power spectral density of electromagnetic radiation, inside a rigid walled cavity in a body made of material that is completely opaque and poorly reflective, when it has reached thermodynamic equilibrium, as a function of absolute thermodynamic temperature alone. A small enough hole in the wall of the cavity emits near enough blackbody radiation of which the spectral radiance can be precisely measured. The walls of the cavity, provided they are completely opaque and poorly reflective, can be of any material indifferently. This provides a well-reproducible absolute thermometer over a very wide range of temperatures, able to measure the absolute temperature of a body inside the cavity.
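Planck's law, B_ν(T) = (2hν³/c²) / (e^(hν/k_B T) − 1), can be evaluated directly; the steep increase of radiance with temperature at a fixed frequency is what a radiometric thermometer exploits. A minimal sketch:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def spectral_radiance(nu_hz, t_kelvin):
    """Planck's law: blackbody spectral radiance B_nu(T),
    in W sr^-1 m^-2 Hz^-1."""
    return (2 * H * nu_hz**3 / C**2) / math.expm1(H * nu_hz / (K_B * t_kelvin))

# At a fixed optical frequency, a hotter cavity radiates far more strongly:
print(spectral_radiance(5e14, 6000) > spectral_radiance(5e14, 3000))  # True
```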
A thermometer is called primary or secondary based on how the raw physical quantity it measures is mapped to a temperature. As summarized by Kauppinen et al., "For primary thermometers the measured property of matter is known so well that temperature can be calculated without any unknown quantities. Examples of these are thermometers based on the equation of state of a gas, on the velocity of sound in a gas, on the thermal noise voltage or current of an electrical resistor, and on the angular anisotropy of gamma ray emission of certain radioactive nuclei in a magnetic field."
In contrast, "Secondary thermometers are most widely used because of their convenience. Also, they are often much more sensitive than primary ones. For secondary thermometers knowledge of the measured property is not sufficient to allow direct calculation of temperature. They have to be calibrated against a primary thermometer at least at one temperature or at a number of fixed temperatures. Such fixed points, for example, triple points and superconducting transitions, occur reproducibly at the same temperature."
Thermometers can be calibrated either by comparing them with other calibrated thermometers or by checking them against known fixed points on the temperature scale. The best known of these fixed points are the melting and boiling points of pure water. (Note that the boiling point of water varies with pressure, so this must be controlled.)
The traditional way of putting a scale on a liquid-in-glass or liquid-in-metal thermometer was in three stages:
(1) Immerse the sensing portion in a stirred mixture of pure ice and water at atmospheric pressure and mark the point indicated when it has come to thermal equilibrium.
(2) Immerse the sensing portion in steam at standard atmospheric pressure and again mark the point indicated.
(3) Divide the distance between these marks into equal portions according to the temperature scale being used.
Other fixed points used in the past include the body temperature (of a healthy adult male), which was originally used by Fahrenheit as his upper fixed point (96 °F (35.6 °C), chosen to be a number divisible by 12), and the lowest temperature given by a mixture of salt and ice, which was originally the definition of 0 °F (−17.8 °C). (This is an example of a frigorific mixture.) As body temperature varies, the Fahrenheit scale was later changed to use an upper fixed point of boiling water at 212 °F (100 °C).
These have now been replaced by the defining points in the International Temperature Scale of 1990, though in practice the melting point of water is more commonly used than its triple point, the latter being more difficult to manage and thus restricted to critical standard measurement. Nowadays manufacturers will often use a thermostat bath or solid block where the temperature is held constant relative to a calibrated thermometer. Other thermometers to be calibrated are put into the same bath or block and allowed to come to equilibrium; then the scale is marked, or any deviation from the instrument scale is recorded. For many modern devices, calibration consists of supplying a value to be used in processing an electronic signal to convert it to a temperature.
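Calibrating against two fixed points amounts to a linear map from a raw reading to a temperature. A minimal sketch, assuming hypothetical liquid-column lengths at the ice and steam points:

```python
def calibrate(raw_ice, raw_steam):
    """Return a function mapping a raw reading (e.g. column length in mm)
    to degrees Celsius, given raw readings at the ice and steam points."""
    def to_celsius(raw):
        return (raw - raw_ice) / (raw_steam - raw_ice) * 100.0
    return to_celsius

# Hypothetical column lengths: 20 mm at the ice point, 220 mm in steam.
to_c = calibrate(20.0, 220.0)
print(to_c(120.0))  # 50.0
```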
The precision or resolution of a thermometer is simply the fraction of a degree to which a reading can be made. For high-temperature work it may only be possible to measure to the nearest 10 °C or more. Clinical thermometers and many electronic thermometers are usually readable to 0.1 °C. Special instruments can give readings to one thousandth of a degree. However, this precision does not mean the reading is true or accurate; it only means that very small changes can be observed.
A thermometer calibrated at a known fixed point is accurate (i.e. gives a true reading) at that point. Between fixed calibration points, interpolation is used, usually linear. This may give significant differences between different types of thermometer at points far away from the fixed points. For example, the expansion of mercury in a glass thermometer is slightly different from the change in resistance of a platinum resistance thermometer, so these two will disagree slightly at around 50 °C. There may be other causes of discrepancy due to imperfections in the instrument, e.g. in a liquid-in-glass thermometer whose capillary tube varies in diameter.
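The mid-range disagreement between two instruments calibrated at the same fixed points can be illustrated with a toy model; the nonlinearity coefficient below is invented for illustration and is not a real material property:

```python
def mercury_reading(true_t):
    """Toy slightly nonlinear response, constructed so it agrees exactly
    with the true temperature at the fixed points 0 C and 100 C."""
    return true_t + 0.001 * true_t * (100.0 - true_t)

def platinum_reading(true_t):
    """Assumed perfectly linear for this illustration."""
    return true_t

# Both agree at the fixed points, but disagree most near the middle:
print(mercury_reading(50.0) - platinum_reading(50.0))  # ≈ 2.5
```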
For many purposes reproducibility is important. That is, does the same thermometer give the same reading for the same temperature (or do replacement or multiple thermometers give the same reading)? Reproducible temperature measurement means that comparisons are valid in scientific experiments and industrial processes are consistent. Thus if the same type of thermometer is calibrated in the same way its readings will be valid even if it is slightly inaccurate compared to the absolute scale.
An example of a reference thermometer used to check others to industrial standards would be a platinum resistance thermometer with a digital display to 0.1 °C (its precision) which has been calibrated at 5 points against national standards (−18, 0, 40, 70, 100 °C) and which is certified to an accuracy of ±0.2 °C.
According to British Standards, correctly calibrated, used and maintained liquid-in-glass thermometers can achieve a measurement uncertainty of ±0.01 °C in the range 0 to 100 °C, and a larger uncertainty outside this range: ±0.05 °C up to 200 or down to −40 °C, ±0.2 °C up to 450 or down to −80 °C.
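These uncertainty bands can be expressed as a simple lookup; the ranges and values below are exactly the figures quoted above:

```python
def lig_uncertainty(t_celsius):
    """Measurement uncertainty (degrees C) for a correctly calibrated,
    used and maintained liquid-in-glass thermometer, using the
    British Standards figures quoted in the text."""
    if 0.0 <= t_celsius <= 100.0:
        return 0.01
    if -40.0 <= t_celsius <= 200.0:
        return 0.05
    if -80.0 <= t_celsius <= 450.0:
        return 0.2
    raise ValueError("outside the ranges covered by the standard")

print(lig_uncertainty(25.0))   # 0.01
print(lig_uncertainty(150.0))  # 0.05
print(lig_uncertainty(-60.0))  # 0.2
```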
See also: List of temperature sensors
Thermometers utilize a range of physical effects to measure temperature. Temperature sensors are used in a wide variety of scientific and engineering applications, especially measurement systems. They are primarily either electrical or mechanical, and are occasionally inseparable from the system they control (as in the case of a mercury-in-glass thermometer). Thermometers are used in roadways in cold-weather climates to help determine whether icing conditions exist. Indoors, thermistors are used in climate control systems such as air conditioners, freezers, heaters, refrigerators, and water heaters. Galileo thermometers are used to measure indoor air temperature, owing to their limited measurement range.
Liquid crystal thermometers (which use thermochromic liquid crystals) are also used in mood rings and to measure the temperature of water in fish tanks.
Fiber Bragg grating temperature sensors are used in nuclear power facilities to monitor reactor core temperatures and avoid the possibility of nuclear meltdowns.
Nanothermometry is an emergent research field concerned with measuring temperature at the sub-micrometric scale. Conventional thermometers cannot measure the temperature of an object smaller than a micrometre, so new methods and materials must be used. Nanothermometers are classified as luminescent thermometers (if they use light to measure temperature) or non-luminescent thermometers (systems whose thermometric properties are not directly related to luminescence).
Main article: cryometer
Cryometers are thermometers used specifically for low temperatures.
Main article: Medical thermometer
Various thermometric techniques have been used throughout history, ranging from the Galileo thermometer to thermal imaging. Medical thermometers such as mercury-in-glass thermometers, infrared thermometers, pill thermometers, and liquid crystal thermometers are used in health care settings to determine if individuals have a fever or are hypothermic.
Thermometers are important in food safety, where food held at temperatures between 41 and 135 °F (5 and 57 °C) can be prone to potentially harmful levels of bacterial growth after several hours, which could lead to foodborne illness. This includes monitoring refrigeration temperatures and maintaining temperatures in foods being served under heat lamps or in hot water baths. Cooking thermometers are important for determining if a food is properly cooked. In particular, meat thermometers are used to aid in cooking meat to a safe internal temperature while preventing overcooking. They commonly use either a bimetallic coil, or a thermocouple or thermistor with a digital readout. Candy thermometers are used to aid in achieving a specific water content in a sugar solution, based on its boiling temperature.
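The danger-zone rule described above is a simple range test on a measured temperature. A minimal sketch, using the limits given in the text:

```python
DANGER_LOW_F, DANGER_HIGH_F = 41.0, 135.0

def in_danger_zone(temp_f):
    """True if a food temperature (degrees F) lies in the range where
    bacterial growth is a concern, per the limits quoted in the text."""
    return DANGER_LOW_F <= temp_f <= DANGER_HIGH_F

print(in_danger_zone(70.0))   # True  (room temperature)
print(in_danger_zone(37.0))   # False (refrigerated)
print(in_danger_zone(165.0))  # False (held hot)
```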
Alcohol thermometers, infrared thermometers, mercury-in-glass thermometers, recording thermometers, thermistors, and Six's thermometers (maximum-minimum thermometers) are used in meteorology and climatology at various levels of the atmosphere and oceans. Aircraft use thermometers and hygrometers to determine whether atmospheric icing conditions exist along their flight path. These measurements are used to initialize weather forecast models.
For decades mercury thermometers were a mainstay in many testing laboratories. If used properly and calibrated correctly, certain types of mercury thermometers can be incredibly accurate. Mercury thermometers can be used in temperatures ranging from about −38 to 350 °C. The use of a mercury-thallium mixture can extend the low-temperature usability of mercury thermometers to −56 °C. (...) Nevertheless, few liquids have been found to mimic the thermometric properties of mercury in repeatability and accuracy of temperature measurement. Toxic though it may be, when it comes to LiG [Liquid-in-Glass] thermometers, mercury is still hard to beat.