
Thermometer


A thermometer is a device that measures temperature or a temperature gradient. A thermometer has two important elements: (1) a temperature sensor (e.g. the bulb of a mercury-in-glass thermometer or the digital sensor in an infrared thermometer) in which some change occurs with a change in temperature, and (2) some means of converting this change into a numerical value (e.g. the visible scale that is marked on a mercury-in-glass thermometer or the digital readout on an infrared model). Thermometers are widely used in industry to monitor processes, in meteorology, in medicine, and in scientific research.

Some of the principles of the thermometer were known to Greek philosophers of two thousand years ago. The modern thermometer gradually evolved from the thermoscope with the addition of a scale in the early 17th century and standardisation through the 17th and 18th centuries.[2][3][4]

Temperature

While an individual thermometer is able to measure degrees of hotness, the readings on two thermometers cannot be compared unless they conform to an agreed scale. Today there is an absolute thermodynamic temperature scale. Internationally agreed temperature scales are designed to approximate this closely, based on fixed points and interpolating thermometers. The most recent official temperature scale is the International Temperature Scale of 1990. It extends from 0.65 K (−272.5 °C; −458.5 °F) to approximately 1,358 K (1,085 °C; 1,985 °F).
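The kelvin, Celsius, and Fahrenheit figures quoted above are related by fixed linear conversions. A minimal sketch in Python (the function names are illustrative):

```python
def kelvin_to_celsius(t_k):
    """Convert a temperature from kelvins to degrees Celsius."""
    return t_k - 273.15

def celsius_to_fahrenheit(t_c):
    """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
    return t_c * 9.0 / 5.0 + 32.0

def kelvin_to_fahrenheit(t_k):
    return celsius_to_fahrenheit(kelvin_to_celsius(t_k))

# The lower limit of ITS-90 quoted above, 0.65 K:
print(round(kelvin_to_celsius(0.65), 1))     # -272.5
print(round(kelvin_to_fahrenheit(0.65), 1))  # -458.5
```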

Early developments

Various authors have credited the invention of the thermometer to Hero of Alexandria. The thermometer was not a single invention, however, but a development.
Hero of Alexandria (10–70 AD) knew of the principle that certain substances, notably air, expand and contract and described a demonstration in which a closed tube partially filled with air had its end in a container of water.[5] The expansion and contraction of the air caused the position of the water/air interface to move along the tube.

Such a mechanism was later used to show the hotness and coldness of the air with a tube in which the water level is controlled by the expansion and contraction of the gas. These devices were developed by several European scientists in the 16th and 17th centuries, notably Galileo Galilei.[6] As a result, devices were shown to produce this effect reliably, and the term thermoscope was adopted because it reflected the changes in sensible heat (the concept of temperature was yet to arise).[6] The difference between a thermoscope and a thermometer is that the latter has a scale.[7] Though Galileo is often said to be the inventor of the thermometer, what he produced were thermoscopes.

The first clear diagram of a thermoscope was published in 1617 by Giuseppe Biancani (1566–1624); the first showing a scale, and thus constituting a thermometer, was by Robert Fludd in 1638. This was a vertical tube, closed by a bulb of air at the top, with the lower end opening into a vessel of water. The water level in the tube is controlled by the expansion and contraction of the air, so it is what we would now call an air thermometer.[8]

The first person to put a scale on a thermoscope is variously said to be Francesco Sagredo (1571–1620) or Santorio Santorio, in about 1611 to 1613.

The word thermometer (in its French form) first appeared in 1624 in La Récréation Mathématique by J. Leurechon, who describes one with a scale of 8 degrees.[9] The word comes from the Greek words θερμός, thermos, meaning "hot" and μέτρον, metron, meaning "measure".

The above instruments suffered from the disadvantage that they were also barometers, i.e. sensitive to air pressure. In 1629, Joseph Solomon Delmedigo, a student of Galileo, published what is apparently the first description and illustration of a sealed liquid-in-glass thermometer. It is described as having a bulb at the bottom of a sealed tube partially filled with brandy. The tube has a numbered scale. Delmedigo does not claim to have invented this instrument, nor does he name anyone else as its inventor.[10] In about 1654 Ferdinando II de' Medici, Grand Duke of Tuscany (1610–1670), actually produced such an instrument, the first modern-style thermometer, dependent on the expansion of a liquid, and independent of air pressure.[9] Many other scientists experimented with various liquids and designs of thermometer.

Registering

Old thermometers were all non-registering thermometers. That is, the thermometer did not hold the temperature reading after it was moved to a place with a different temperature. Determining the temperature of a pot of hot liquid required the user to leave the thermometer in the hot liquid until it had been read. Once a non-registering thermometer was removed from the hot liquid, the temperature indicated on it would immediately begin changing to reflect its new conditions (in this case, the air temperature). Registering thermometers are designed to hold the temperature indefinitely, so that the thermometer can be removed and read at a later time or in a more convenient place. Mechanical registering thermometers hold either the highest or lowest temperature recorded until manually re-set, e.g., by shaking down a mercury-in-glass thermometer, or until an even more extreme temperature is experienced. Electronic registering thermometers may be designed to remember the highest or lowest temperature, or to remember whatever temperature was present at a specified point in time.
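The registering behaviour described above can be sketched as a toy model: a class that retains the extreme readings seen since the last reset, like a mechanical max-min thermometer (the class and method names are illustrative):

```python
class MinMaxRegisteringThermometer:
    """Toy model of a registering thermometer: retains the highest and
    lowest readings seen since the last reset."""

    def __init__(self):
        self.reset()

    def reset(self):
        # Equivalent to shaking down a mercury-in-glass thermometer.
        self.minimum = None
        self.maximum = None

    def read(self, current_temperature):
        """Record the current temperature, updating the stored extremes."""
        if self.minimum is None or current_temperature < self.minimum:
            self.minimum = current_temperature
        if self.maximum is None or current_temperature > self.maximum:
            self.maximum = current_temperature
        return current_temperature

t = MinMaxRegisteringThermometer()
for reading in [20.0, 35.5, 18.2, 25.0]:
    t.read(reading)
print(t.minimum, t.maximum)  # 18.2 35.5
```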

Thermometers increasingly use electronic means to provide a digital display or input to a computer.

Physical principles of thermometry

Various thermometers from the 19th century.

Thermometers may be described as empirical or absolute. Absolute thermometers are calibrated numerically by the thermodynamic absolute temperature scale. Empirical thermometers are not in general necessarily in exact agreement with absolute thermometers as to their numerical scale readings, but to qualify as thermometers at all they must agree with absolute thermometers and with each other in the following way: given any two bodies isolated in their separate respective thermodynamic equilibrium states, all thermometers agree as to which of the two has the higher temperature, or that the two have equal temperatures.[17] For any two empirical thermometers, this does not require that the relation between their numerical scale readings be linear, but it does require that relation to be strictly monotonic.[18] This is a fundamental character of temperature and thermometers.[19][20][21]

As it is customarily stated in textbooks, taken alone, the so-called "zeroth law of thermodynamics" fails to deliver this information, but the statement of the zeroth law of thermodynamics by James Serrin in 1977, though rather mathematically abstract, is more informative for thermometry: "Zeroth Law – There exists a topological line M which serves as a coordinate manifold of material behaviour. The points L of the manifold M are called 'hotness levels', and M is called the 'universal hotness manifold'."[22] To this information there needs to be added a sense of greater hotness; this sense can be had, independently of calorimetry, of thermodynamics, and of properties of particular materials, from Wien's displacement law of thermal radiation: the temperature of a bath of thermal radiation is proportional, by a universal constant, to the frequency of the maximum of its frequency spectrum; this frequency is always positive, but can have values that tend to zero. Another way of identifying hotter as opposed to colder conditions is supplied by Planck's principle: when a process of isochoric adiabatic work is the sole means of change of internal energy of a closed system, the final state of the system is never colder than the initial state; except for phase changes with latent heat, it is hotter than the initial state.[23][24][25]

There are several principles on which empirical thermometers are built, as listed in the section of this article entitled "Primary and secondary thermometers". Several such principles are essentially based on the constitutive relation between the state of a suitably selected particular material and its temperature. Only some materials are suitable for this purpose, and they may be considered as "thermometric materials". Radiometric thermometry, in contrast, can be only slightly dependent on the constitutive relations of materials. In a sense then, radiometric thermometry might be thought of as "universal". This is because it rests mainly on a universality character of thermodynamic equilibrium, that it has the universal property of producing blackbody radiation.

Thermometric materials

There are various kinds of empirical thermometer based on material properties.

Many empirical thermometers rely on the constitutive relation between pressure, volume and temperature of their thermometric material. For example, mercury expands when heated.

To be used in this way, for its relation between pressure, volume, and temperature, a thermometric material must have three properties:

(1) Its heating and cooling must be rapid. That is to say, when a quantity of heat enters or leaves a body of the material, the material must expand or contract to its final volume or reach its final pressure and must reach its final temperature with practically no delay; some of the heat that enters can be considered to change the volume of the body at constant temperature, and is called the latent heat of expansion at constant temperature; and the rest of it can be considered to change the temperature of the body at constant volume, and is called the specific heat at constant volume. Some materials do not have this property, and take some time to distribute the heat between temperature and volume change.[26]

(2) Its heating and cooling must be reversible. That is to say, the material must be able to be heated and cooled indefinitely often by the same increment and decrement of heat, and still return to its original pressure, volume and temperature every time. Some plastics do not have this property;[27]

(3) Its heating and cooling must be monotonic.[18][28] That is to say, throughout the range of temperatures for which it is intended to work,

(a) at a given fixed pressure,

either (α) the volume increases when the temperature increases, or else (β) the volume decreases when the temperature increases;

but not (α) for some temperatures and (β) for others; or

(b) at a given fixed volume,

either (α) the pressure increases when the temperature increases, or else (β) the pressure decreases when the temperature increases;

but not (α) for some temperatures and (β) for others.

At temperatures around about 4 °C, water does not have the property (3), and is said to behave anomalously in this respect; thus water cannot be used as a material for this kind of thermometry for temperature ranges near 4 °C.[20][29][30][31][32]

Gases, on the other hand, all have the properties (1), (2), and (3)(a)(α) and (3)(b)(α). Consequently, they are suitable thermometric materials, and that is why they were important in the development of thermometry.[33]
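Property (3), strict monotonicity, can be checked numerically. The sketch below uses a purely illustrative toy volume model with a water-like minimum near 4 °C (not real water data) and contrasts it with an ideal gas at fixed pressure, whose volume grows monotonically with temperature:

```python
def is_strictly_monotonic(values):
    """True if the sequence increases throughout or decreases throughout."""
    increasing = all(a < b for a, b in zip(values, values[1:]))
    decreasing = all(a > b for a, b in zip(values, values[1:]))
    return increasing or decreasing

def toy_volume(temp_c):
    """Illustrative water-like model: volume has a minimum at 4 degrees C."""
    return 1.0 + 1e-5 * (temp_c - 4.0) ** 2

def gas_volume(temp_c):
    """Ideal gas at fixed pressure: volume proportional to absolute temperature."""
    return (temp_c + 273.15) / 273.15

temps = list(range(0, 11))  # 0..10 degrees C, straddling the anomaly
print(is_strictly_monotonic([toy_volume(t) for t in temps]))  # False
print(is_strictly_monotonic([gas_volume(t) for t in temps]))  # True
```

The toy model fails the test over 0–10 °C for the same reason real water does: the same volume corresponds to two different temperatures on either side of the minimum, so volume alone cannot identify the temperature there.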

Constant volume thermometry

According to Preston (1894/1904), Regnault found constant pressure air thermometers unsatisfactory, because they needed troublesome corrections. He therefore built a constant volume air thermometer.[34] Constant volume thermometers do not provide a way to avoid the problem of anomalous behaviour like that of water at approximately 4 °C.[32]

Radiometric thermometry

Planck's law very accurately quantitatively describes the power spectral density of electromagnetic radiation, inside a rigid walled cavity in a body made of material that is completely opaque and poorly reflective, when it has reached thermodynamic equilibrium, as a function of absolute thermodynamic temperature alone. A small enough hole in the wall of the cavity emits near enough blackbody radiation of which the spectral radiance can be precisely measured. The walls of the cavity, provided they are completely opaque and poorly reflective, can be of any material indifferently. This provides a well-reproducible absolute thermometer over a very wide range of temperatures, able to measure the absolute temperature of a body inside the cavity.
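One practical consequence is Wien's displacement law: the peak emission wavelength of a blackbody is inversely proportional to its absolute temperature, so measuring the spectral peak yields the temperature directly. A minimal sketch:

```python
# Wien's displacement law: lambda_max * T = b.
WIEN_B = 2.897771955e-3  # Wien displacement constant, in m*K

def temperature_from_peak_wavelength(lambda_max_m):
    """Absolute temperature of a blackbody from its peak emission wavelength."""
    return WIEN_B / lambda_max_m

# The solar spectrum peaks near 502 nm, giving a temperature near 5800 K:
print(round(temperature_from_peak_wavelength(502e-9)))
```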

Primary and secondary thermometers

A thermometer is called primary or secondary based on how the raw physical quantity it measures is mapped to a temperature. As summarized by Kauppinen et al., "For primary thermometers the measured property of matter is known so well that temperature can be calculated without any unknown quantities. Examples of these are thermometers based on the equation of state of a gas, on the velocity of sound in a gas, on the thermal noise voltage or current of an electrical resistor, and on the angular anisotropy of gamma ray emission of certain radioactive nuclei in a magnetic field."[35]

In contrast, "Secondary thermometers are most widely used because of their convenience. Also, they are often much more sensitive than primary ones. For secondary thermometers knowledge of the measured property is not sufficient to allow direct calculation of temperature. They have to be calibrated against a primary thermometer at least at one temperature or at a number of fixed temperatures. Such fixed points, for example, triple points and superconducting transitions, occur reproducibly at the same temperature."[35]
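The first primary example above, the equation of state of a gas, can be sketched directly: in the ideal-gas limit, T = PV / (nR) involves only measured quantities and the known gas constant, with no calibration against another thermometer:

```python
R = 8.314462618  # molar gas constant, J/(mol*K)

def ideal_gas_temperature(pressure_pa, volume_m3, amount_mol):
    """Primary gas thermometry sketch: temperature from the ideal gas
    equation of state, T = PV / (nR), with no calibration constants."""
    return pressure_pa * volume_m3 / (amount_mol * R)

# One mole at standard atmospheric pressure occupying 22.414 L
# should come out near 273.15 K (0 degrees Celsius):
print(round(ideal_gas_temperature(101325.0, 0.022414, 1.0), 2))
```

Real gas thermometry corrects for non-ideal behaviour, but the principle is the same: every quantity on the right-hand side is independently measurable.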

Calibration

Thermometers can be calibrated either by comparing them with other calibrated thermometers or by checking them against known fixed points on the temperature scale. The best known of these fixed points are the melting and boiling points of pure water. (Note that the boiling point of water varies with pressure, so this must be controlled.)

The traditional way of putting a scale on a liquid-in-glass or liquid-in-metal thermometer was in three stages:

Immerse the sensing portion in a stirred mixture of pure ice and water at atmospheric pressure and mark the point indicated when it had come to thermal equilibrium.

Immerse the sensing portion in the steam above pure water boiling at standard atmospheric pressure and mark the point indicated when it had come to thermal equilibrium.

Divide the distance between these marks into equal portions according to the temperature scale being used.

Other fixed points used in the past are the body temperature (of a healthy adult male) which was originally used by Fahrenheit as his upper fixed point (96 °F (36 °C) to be a number divisible by 12) and the lowest temperature given by a mixture of salt and ice, which was originally the definition of 0 °F (−18 °C).[36] (This is an example of a Frigorific mixture). As body temperature varies, the Fahrenheit scale was later changed to use an upper fixed point of boiling water at 212 °F (100 °C).[37]

These have now been replaced by the defining points in the International Temperature Scale of 1990, though in practice the melting point of water is more commonly used than its triple point, the latter being more difficult to manage and thus restricted to critical standard measurement. Nowadays manufacturers will often use a thermostat bath or solid block where the temperature is held constant relative to a calibrated thermometer. Other thermometers to be calibrated are put into the same bath or block and allowed to come to equilibrium, then the scale marked, or any deviation from the instrument scale recorded.[38] For many modern devices, calibration consists of supplying a value to be used in processing an electronic signal to convert it to a temperature.
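The traditional two-fixed-point procedure amounts to a linear map from raw readings to temperatures. A minimal sketch, with hypothetical raw readings (column lengths in mm are invented for illustration):

```python
def two_point_calibration(raw_ice, raw_steam, scale_ice=0.0, scale_steam=100.0):
    """Return a function mapping a raw instrument reading to temperature
    by linear interpolation between two fixed-point readings."""
    def to_temperature(raw):
        fraction = (raw - raw_ice) / (raw_steam - raw_ice)
        return scale_ice + fraction * (scale_steam - scale_ice)
    return to_temperature

# Hypothetical liquid column: 12.0 mm in ice water, 92.0 mm in steam.
celsius = two_point_calibration(raw_ice=12.0, raw_steam=92.0)
print(celsius(52.0))  # 50.0 -- halfway between the marks
```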

Precision, accuracy, and reproducibility

The "Boyce MotoMeter" radiator cap on a 1913 Car-Nation automobile, used to measure temperature of vapor in 1910s and 1920s cars.

The precision or resolution of a thermometer is simply the fraction of a degree to which a reading can be made. For high-temperature work it may only be possible to measure to the nearest 10 °C or more. Clinical thermometers and many electronic thermometers are usually readable to 0.1 °C. Special instruments can give readings to one thousandth of a degree. However, this precision does not mean the reading is true or accurate; it only means that very small changes can be observed.

A thermometer calibrated to a known fixed point is accurate (i.e. gives a true reading) at that point. Most thermometers are originally calibrated to a constant-volume gas thermometer. In between fixed calibration points, interpolation is used, usually linear.[38] This may give significant differences between different types of thermometer at points far away from the fixed points. For example, the expansion of mercury in a glass thermometer is slightly different from the change in resistance of a platinum resistance thermometer, so these two will disagree slightly at around 50 °C.[39] There may be other causes due to imperfections in the instrument, e.g. in a liquid-in-glass thermometer if the capillary tube varies in diameter.[39]

For many purposes reproducibility is important. That is, does the same thermometer give the same reading for the same temperature (or do replacement or multiple thermometers give the same reading)? Reproducible temperature measurement means that comparisons are valid in scientific experiments and industrial processes are consistent. Thus if the same type of thermometer is calibrated in the same way its readings will be valid even if it is slightly inaccurate compared to the absolute scale.

An example of a reference thermometer used to check others to industrial standards would be a platinum resistance thermometer with a digital display to 0.1 °C (its precision) which has been calibrated at 5 points against national standards (−18, 0, 40, 70, 100 °C) and which is certified to an accuracy of ±0.2 °C.[40]
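Multi-point calibration of this kind is typically applied as a piecewise-linear correction between the certified points. A sketch, using hypothetical (instrument reading, true temperature) pairs invented for illustration:

```python
from bisect import bisect_right

def make_correction(cal_points):
    """cal_points: (instrument_reading, true_temperature) pairs sorted by
    reading. Returns a function applying piecewise-linear interpolation
    between adjacent calibration points."""
    readings = [r for r, _ in cal_points]
    truths = [t for _, t in cal_points]

    def correct(reading):
        i = bisect_right(readings, reading)
        i = min(max(i, 1), len(readings) - 1)  # clamp to outermost segment
        r0, r1 = readings[i - 1], readings[i]
        t0, t1 = truths[i - 1], truths[i]
        return t0 + (reading - r0) * (t1 - t0) / (r1 - r0)
    return correct

# Hypothetical instrument that reads slightly high at mid-range:
cal = [(-18.0, -18.0), (0.1, 0.0), (40.2, 40.0), (70.1, 70.0), (100.0, 100.0)]
correct = make_correction(cal)
print(round(correct(40.2), 1))  # 40.0
```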

According to British Standards, correctly calibrated, used and maintained liquid-in-glass thermometers can achieve a measurement uncertainty of ±0.01 °C in the range 0 to 100 °C, and a larger uncertainty outside this range: ±0.05 °C up to 200 or down to −40 °C, ±0.2 °C up to 450 or down to −80 °C.[41]

Indirect methods of temperature measurement


In theory, any physical phenomenon exhibiting a temperature dependence could be used as a thermometer, measuring temperature indirectly. Some of these properties have been exploited. For example, blackbody radiation allows one to measure the temperature in a blast furnace or kiln, or the temperature of a distant star.

Some compounds exhibit thermochromism at distinct temperature changes. Thus by tuning the phase transition temperatures for a series of substances the temperature can be quantified in discrete increments, a form of digitization. This is the basis for a liquid crystal thermometer.

Thermocouples are useful over a wide temperature range, from cryogenic temperatures to over 1000 °C, but typically have an error of ±0.5 to 1.5 °C.

Silicon bandgap temperature sensors are commonly found packaged in integrated circuits with an accompanying ADC and interface such as I2C. Typically they are specified to work within about −50 to 150 °C with accuracies in the ±0.25 to 1 °C range, though this can be improved by binning.[43][44]
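Such sensors usually expose their reading as a fixed-point register value rather than degrees. As an illustrative sketch (assuming a 12-bit two's-complement format at 0.0625 °C per LSB, a common convention for digital I2C temperature sensors, not any specific part's datasheet):

```python
def raw12_to_celsius(raw):
    """Convert a 12-bit two's-complement sensor reading to degrees Celsius,
    assuming a resolution of 0.0625 degC per LSB (an illustrative format)."""
    if raw & 0x800:      # sign bit of the 12-bit value is set
        raw -= 0x1000    # sign-extend to a negative integer
    return raw * 0.0625

print(raw12_to_celsius(0x190))  # 25.0
print(raw12_to_celsius(0xE70))  # -25.0
```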

Chemical shift is temperature dependent. This property is used to calibrate the thermostat of NMR probes, usually using methanol or ethylene glycol.[45][46] This can potentially be problematic for internal standards, which are usually assumed to have a defined chemical shift (e.g. 0 ppm for TMS) but in fact exhibit a temperature dependence.[47]

Applications

Thermometers utilize a range of physical effects to measure temperature. Temperature sensors are used in a wide variety of scientific and engineering applications, especially measurement systems. Temperature systems are primarily either electrical or mechanical, occasionally inseparable from the system which they control (as in the case of a mercury-in-glass thermometer). Thermometers are used in roadways in cold weather climates to help determine if icing conditions exist. Indoors, thermistors are used in climate control systems such as air conditioners, freezers, heaters, refrigerators, and water heaters.[50] Galileo thermometers are used to measure indoor air temperature, due to their limited measurement range.

Nanothermometry

Nanothermometry is an emerging research field concerned with measuring temperature at the sub-micrometre scale. Conventional thermometers cannot measure the temperature of an object smaller than a micrometre, so new methods and materials have to be used; nanothermometry addresses such cases. Nanothermometers are classified as luminescent thermometers (if they use light to measure temperature) and non-luminescent thermometers (systems whose thermometric properties are not directly related to luminescence).[52]

Food and food safety

Thermometers are important in food safety, where food at temperatures between 41 and 135 °F (5 and 57 °C) can be prone to potentially harmful levels of bacterial growth after several hours, which could lead to foodborne illness. This includes monitoring refrigeration temperatures and maintaining temperatures in foods being served under heat lamps or hot water baths.[50]
Cooking thermometers are important for determining whether a food is properly cooked. In particular, meat thermometers are used to aid in cooking meat to a safe internal temperature while preventing overcooking. They commonly use either a bimetallic coil, or a thermocouple or thermistor with a digital readout.
Candy thermometers are used to aid in achieving a specific water content in a sugar solution based on its boiling temperature.
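The danger-zone check described above reduces to a simple range test, sketched here (the function name is illustrative):

```python
DANGER_LOW_F, DANGER_HIGH_F = 41.0, 135.0

def in_danger_zone(temp_f):
    """True if a food temperature in degrees Fahrenheit falls within the
    bacterial-growth range cited above (41 to 135 degrees F)."""
    return DANGER_LOW_F <= temp_f <= DANGER_HIGH_F

print(in_danger_zone(70.0))   # True  (room temperature)
print(in_danger_zone(38.0))   # False (refrigerated)
print(in_danger_zone(165.0))  # False (hot holding)
```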