History of Thermometry and the Invention of Thermometers
So, what is thermometry anyway?
Thermometry is the scientific study of measuring temperature, and its roots date back to as early as 220 BC. Around that time, Philo of Byzantium began to study the effects that temperature had on air; specifically, he observed the expansion and contraction caused by temperature changes. He did this with a device improvised from the rudimentary materials available at the time: a glass sphere filled with air, connected by a tube to a container of water, with the tube’s end submerged.
Heating the sphere caused the air inside to expand, pushing it through the tube and into the container of water, where it escaped as bubbles. But the interesting part came when the sphere cooled down. As the sphere’s temperature dropped, the air inside contracted, and because the tube was submerged in the container, water was drawn up the tube. This was an important discovery in the history of the thermoscope, which would later become the thermometer.
[Image credit: Nysus, CC BY-SA 4.0, via Wikimedia Commons]
Because this device had no graduation marks for measurement, it was referred to as a thermoscope. It wasn’t until the 1600s that real progress was made toward the creation of thermometers. Early on, Galileo Galilei experimented with the principles of the thermoscope; his design consisted of a glass sphere of air connected to a vertical column of water that was submerged in a pool of water. Galileo likewise observed that the expansion or contraction of the air caused the column of water to rise or fall as the sphere’s temperature changed.
The major disadvantage of Galileo’s thermoscope was its lack of portability. The design required a very tall tube in order to observe small temperature changes. Additionally, its open design made it sensitive to changes in atmospheric pressure.
Galileo’s thermoscope also fell short because it had no scale or graduation marks for consistent measurement. So, around 1612, Venetian physician Santorio Santorio created the first thermometer by adding a scale to the thermoscope. The problem was that this scale had no real reference, as fixed-point temperatures were not yet understood.
It wasn’t until around 1654 that Ferdinando II de’ Medici, Grand Duke of Tuscany, invented the first sealed thermometer by melting closed the end of the thermoscope’s glass tube, solving the issue of atmospheric pressure. He also discovered that by replacing the water with alcohol or urine, the thermometer could be made much smaller than Galileo’s thermoscope.
Measurement and the Scale War
During the middle part of the 1600s, thermometers had become very common. However, due to their lack of a standardized scale, they were found to be unreliable; each thermometer craftsman tended to use his own standard with disregard for any other. Standardized fixed points had to be developed because scientific experimentation relies on consistency. These fixed points, or thermometric points of reference, are phenomena in nature that always take place at the same temperature, such as the boiling of water or the melting of ice.
Famed astronomer Edmond Halley, discoverer of Halley’s comet, suggested using the boiling point of alcohol as a fixed point. He had observed that the alcohol in his thermometer always rose to the same level when it boiled. However, Halley refused to accept the concept of using the freezing point of water as the lower fixed point. Instead, he suggested that “the just beginning of heat and cold should not come from such a point as freezes anything.” He held that the lower temperature fixed point should be derived from temperatures deep underground, such as the Grottoes under the observatory in Paris.
Halley’s proposals were just the tip of the iceberg in the battle of fixed points and revealed the absolute necessity for discovering true natural fixed points to reference. The problem was clear, and it was an even bigger problem than the invention of the thermometer itself. How do we determine and validate whether a fixed point is actually fixed?
During that era, there was a wide range of proposed fixed points, ranging from simple ideas like the “greatest summer heat” to Joachim Dalencé’s proposed temperature scale that used the melting point of butter as the high fixed point. It was also suggested that the temperature of the King’s Chamber of the Great Pyramid of Giza be used as the upper fixed point. Isaac Newton also weighed in with the idea of using what he referred to as blood heat, or body temperature, as a fixed point for his temperature scale.
Today, we can look at these choices and understand that body temperature or the greatest summer heat can be highly variable depending on conditions, but at the time, scientists really struggled with the concept of fixed points. Without any agreed-upon fixed points for a temperature scale, the lines on a thermometer were little more than arbitrary dashes and dots and had no reliable use to science.
It wasn’t until the mid-1700s that a consensus regarding fixed points was established. Thanks largely to the work of Swedish astronomer Anders Celsius and other scientists of the time, the boiling and freezing points of water came to be accepted as the standard fixed points. But even these fixed points met adversity, especially the boiling point.
One such quibble of the time was over the premise that water had two boiling points. George Adams the Elder, who was one of Britain's premier instrument makers, claimed that water began to boil at 204 degrees Fahrenheit and would boil vehemently at 212 degrees Fahrenheit. There is recorded evidence that even Isaac Newton believed that water boiled in what could be described as a continuum. On his scale, he observed that water would begin boiling at 33 degrees and boil vehemently at 34.5 degrees.
A committee led by Henry Cavendish was assembled to settle the boiling water debate. It fell to Cavendish and his fellow researchers to investigate the viability of boiling water as a fixed temperature point. Experiments performed by Jean-André De Luc showed that there was an interval of several degrees between the point at which the first layer of water began to boil and the point at which the total volume of the water reached a full boil, or ebullition.
Superheat
Further experimentation led to the study of superheat. Cavendish believed that steam was a more reliable fixed point than the variable temperature of boiling water; he held that during the phase change from water to steam, the boiling water ran hotter than the steam, while the steam itself stayed at a consistent temperature.
De Luc held a different belief and asked, “Why would the temperature of boiled-off steam be a more accurate fixed point than the boiling water itself?” An argument on the subject ensued between De Luc and Cavendish. In the end, De Luc conceded and accepted the use of steam as the fixed point. While questions about the phenomenon of boiling water remained unanswered, there was at least now an agreed-upon upper fixed point to which artisans could calibrate their thermometers.
Temperature Scale
Once the conundrum of fixed points was arguably solved, the focus turned to what the scale should be. While there were many different physicists experimenting with scales at the time, the two most well-known scales were created by Daniel Fahrenheit and Anders Celsius.
A glassblower by trade, Fahrenheit not only developed the first mercury-in-glass thermometer, but he also developed a scale that had up to four times as many degree marks as the thermometers of the time, which allowed for more precise measurements. On the Fahrenheit scale, the freezing point of brine was set to 0°F, while the freezing point of pure water was 32°F and the boiling point was 212°F. Fahrenheit’s scale is still in use today in the United States and some Caribbean countries.
Swedish astronomer Anders Celsius developed what was known as the centigrade scale. His original scale had the freezing point of water set to 100 degrees and the boiling point at 0 degrees. While it is not completely understood why Celsius created this seemingly backward scale, it is thought that he was trying to avoid negative numbers when dealing with colder temperatures. Eventually, the centigrade scale would be renamed the Celsius scale and would be inverted to resemble the scale we know today, with the freezing point being 0°C and the boiling point set at 100°C.
The third scale, which is not as commonly used in daily life as Fahrenheit and Celsius, is the Kelvin scale. The Kelvin scale is an absolute scale, which means it starts at 0 K, or absolute zero. It shares the same increment as the Celsius scale, meaning that a change of one kelvin equals a change of one degree Celsius. It is used primarily in the physical sciences.
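Because all three scales are linear, converting between them is simple arithmetic. Here is a minimal sketch in Python (the function names are just for illustration):

```python
def fahrenheit_to_celsius(temp_f):
    """Convert Fahrenheit to Celsius: 32 °F -> 0 °C, 212 °F -> 100 °C."""
    return (temp_f - 32) * 5 / 9

def celsius_to_kelvin(temp_c):
    """Shift by 273.15; the kelvin and the Celsius degree are the same size."""
    return temp_c + 273.15

# Water's fixed points on all three scales:
print(fahrenheit_to_celsius(32))                      # 0.0 (freezing)
print(celsius_to_kelvin(fahrenheit_to_celsius(212)))  # 373.15 (boiling)
```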
Types of Thermometers
There are a variety of different thermometers available today. The liquid-in-glass thermometer is the oldest and was for a long time the most commonly used. As the name suggests, it is a sealed glass tube with a bulb filled with a liquid, such as mercury. As the temperature changes, the liquid either expands or contracts inside the tube of the thermometer.
Gas Thermometer
Instead of using a liquid like mercury or alcohol, the gas thermometer uses an inert gas as its thermometric fluid. And instead of relying on the expansion and contraction of a liquid, a constant-volume gas thermometer works off of pressure increases and decreases; at a fixed volume, a gas’s pressure is directly proportional to its absolute temperature.
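As a rough illustration of that principle, here is a short Python sketch based on the constant-volume ideal gas relationship (Gay-Lussac’s law). The calibration values are hypothetical, and a real instrument would need corrections for non-ideal gas behavior:

```python
def gas_thermometer_temp(pressure, ref_pressure, ref_temp_k):
    """At constant volume, P/T is constant for an ideal gas,
    so T = T_ref * (P / P_ref), with temperatures in kelvin."""
    return ref_temp_k * (pressure / ref_pressure)

# Hypothetical calibration: 101.3 kPa at 273.16 K (the triple point of water).
print(gas_thermometer_temp(pressure=111.3, ref_pressure=101.3, ref_temp_k=273.16))
# ~300 K: a ~10% rise in pressure implies a ~10% rise in absolute temperature.
```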
Infrared Thermometer
An infrared thermometer uses a lens to focus infrared radiation from an object onto a detector called a thermopile, which converts the absorbed radiation into an electrical signal that corresponds to the object’s temperature.
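To give a feel for the physics, here is a simplified Python sketch that inverts the Stefan-Boltzmann law to estimate a surface temperature from its radiated power. It assumes an ideal blackbody by default and ignores real-world factors like detector optics and background radiation, so it is illustrative rather than an actual sensor algorithm:

```python
STEFAN_BOLTZMANN = 5.670374419e-8  # W / (m^2 * K^4)

def temp_from_radiance(power_per_area, emissivity=1.0):
    """Invert the Stefan-Boltzmann law: j = e * sigma * T^4,
    so T = (j / (e * sigma)) ** (1/4), in kelvin."""
    return (power_per_area / (emissivity * STEFAN_BOLTZMANN)) ** 0.25

# A surface radiating ~460 W/m^2 as a blackbody is near room temperature:
print(temp_from_radiance(460))  # ~300 K (about 27 °C)
```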
Thermometers can also be found in tools like the Fieldpiece digital pocket thermometer, which can also take wet-bulb and dry-bulb measurements.
Now you probably know more about thermometers than you’ve ever cared to know. Still, it’s pretty neat to think that there used to be a time in the not-so-distant past when we didn’t have a uniform standard for measuring temperature. In America, we take the Fahrenheit scale for granted—not just when it comes to superheat and subcooling but also when it comes to basic health and safety, like knowing which numbers to look for when we put a thermometer in our mouths or in a piece of freshly cooked meat.
It’s kind of amazing to think that thermometry and the temperature scales we use today took several centuries to reach their current point. Maybe now we’ll appreciate temperature clamps, infrared thermometers, and even the humble mercury thermometers we grew up with a little more.