Total Immersion, Partial Immersion: what do these terms mean?

Some helpful information on ASTM and other glass laboratory thermometers

All ASTM and other glass laboratory thermometers fall into two general groups: those designed and fabricated for total immersion, and those designed and fabricated for partial immersion. If you use glass thermometers, it is essential that you understand the difference and how each type of thermometer is used.

Most laboratory errors in temperature measurement result from incorrect usage (immersion) of the thermometer!

Total immersion thermometers are designed with scales which indicate actual temperature when the bulb and the entire liquid column are exposed to the temperature being measured. In practice, a short length of liquid column (usually one-half inch) is permitted to extend above the surface of the liquid being measured to allow reading of the thermometer.

Most total immersion thermometers can also be used in a condition of complete immersion, wherein the entire thermometer is exposed to the temperature being measured, as with a thermometer inside a refrigerator, freezer, incubator or other chamber.

Partial immersion thermometers are designed to indicate the actual temperature when the stem is immersed to a specified depth in the medium being measured.

How can I tell the difference?

Partial immersion thermometers

[Photo: a partial immersion thermometer, showing the immersion line]

The immersion line is a quick and easy visual indication to the user. The thermometer should be immersed to this line for correct temperature indication. The reverse of the thermometer should have the inscription “76MM IMM” (or as appropriate). Partial immersion thermometers are usually easy to identify.

Note: some partial immersion thermometers do not have an immersion line inscribed; e.g., thermometers with standard taper joints, or thermometers with very short immersion depths, such as those used for melting point applications. Pay special heed to the inscription on these thermometers, and immerse them as specified for the most accurate readings.

Total immersion thermometers

[Photo: an older ASTM 112C thermometer, designed for total immersion]

Total immersion thermometers are sometimes a little trickier to identify. Some of the better manufacturers inscribe TOTAL or TOTAL IMMERSION on the reverse of the thermometer, but regrettably this is not an industry-wide practice. The photo above shows an older ASTM 112C thermometer designed for total immersion: there is no immersion line and no “TOTAL IMMERSION” marking on the reverse.

If there is no inscription on the reverse indicating immersion, you should assume the thermometer is designed for total immersion.

What’s the difference in use?

As explained above, the partial immersion thermometer is immersed in the liquid being measured up to the line, or ring.

The total immersion thermometer must be immersed in the medium being measured to within approximately one-half inch of the top of the liquid column, the meniscus (ASTM E77).

So what happens if the total immersion thermometer is not immersed to the depth it should be?

You will get an erroneous temperature reading. The size of the error depends on the temperature you are measuring and on how much of the liquid column that should be immersed is outside the medium. An extreme example: you have a total immersion thermometer graduated from −1 to 201 °C, 24 inches long, and you are testing the liquid in a beaker on a hotplate. Only about 2 inches of the thermometer is in the liquid, and the thermometer indicates 190 °C. How much error do we have? Almost 5 °C: the liquid in the beaker is about 5 °C hotter than the thermometer indicates (see the correction calculation sketched below).

I have this expensive, calibrated, total immersion thermometer I bought recently, similar to the one in the example above. I need it for applications similar to that described. Is this useless to me?

No. It may not be the best thermometer for the application, but you can use it; however, you'll need to calculate and apply a correction for the emergent portion of the liquid column.
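
For reference, the correction in question is commonly called the emergent stem correction. The Python sketch below is only an illustration: the differential expansion coefficient of 0.00016 per °C is the value commonly used for mercury-in-glass Celsius thermometers, and the emergent-column length (about 180 scale degrees) and mean emergent-column temperature (about 25 °C) used to reproduce the hotplate example are assumptions, not measured values.

    # Minimal sketch of the emergent stem correction for a total immersion
    # thermometer used only partially immersed.
    #
    #     correction = k * n * (t_bath - t_emergent)
    #
    # k           differential expansion coefficient of the liquid in glass
    #             (about 0.00016 per deg C for mercury-in-glass Celsius scales)
    # n           length of the emergent liquid column, in scale degrees
    # t_bath      observed thermometer reading, deg C
    # t_emergent  mean temperature of the emergent column, deg C
    #             (often estimated with an auxiliary thermometer)

    K_MERCURY_C = 0.00016  # per deg C; assumed typical value for mercury-in-glass

    def stem_correction(reading_c, emergent_degrees, emergent_temp_c, k=K_MERCURY_C):
        """Return the correction in deg C to add to the observed reading."""
        return k * emergent_degrees * (reading_c - emergent_temp_c)

    # Hotplate example from above (n and ambient temperature are assumed values):
    reading = 190.0   # observed reading, deg C
    n = 180.0         # assumed scale degrees of column standing above the liquid
    ambient = 25.0    # assumed mean temperature of the emergent column, deg C
    corr = stem_correction(reading, n, ambient)
    print(f"correction  = {corr:+.2f} deg C")           # about +4.75 deg C
    print(f"true temp  ~= {reading + corr:.1f} deg C")  # about 194.8 deg C

With those assumed numbers the correction works out to roughly +4.8 °C, which is where the "almost 5 degrees" in the example above comes from.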


Some helpful thermometer suggestions:

GENERAL CONSIDERATIONS FOR MAKING AN ACCURATE READING

The error due to parallax may be eliminated by taking care that the reflection of the scale can be seen in the mercury thread, and by adjusting the line of sight so that the graduation of the scale nearest the meniscus exactly hides its own image: the line of sight will then be normal to the stem at that point. In reading thermometers, account must be taken of the fact that the lines are of appreciable width. The best practice is to consider the position of the lines as defined by their middle parts.

PERFORMING A CALIBRATION AT THE ICE POINT (0 °C or 32 °F) *

Select clear pieces of ice, preferably ice made from distilled water. Rinse the ice with distilled water and shave or crush into small pieces, avoiding direct contact with the hands or any chemically unclean objects. Fill a Dewar or other insulated vessel with the crushed ice and add sufficient distilled and preferably pre-cooled water to form a slush, but not enough to float the ice. Insert the thermometer, packing the ice gently about the stem, to a depth sufficient to cover the 0 °C (32 °F) graduation (total immersion), or to the immersion line (partial immersion). As the ice melts, drain off some of the water and add more crushed ice.

Raise the thermometer a few millimeters after at least 3 minutes have elapsed, tap the stem gently and observe the reading. Successive readings taken at least one minute apart should agree within one tenth of one graduation.
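
If the readings are being logged, the agreement criterion above is easy to check mechanically. The sketch below is a hypothetical illustration that assumes a thermometer graduated in 0.5 °C divisions; substitute the actual graduation interval of the thermometer in use.

    # Check that successive ice-point readings agree within one tenth of a graduation.
    # The 0.5 deg C graduation interval below is an assumed example value.

    def readings_agree(readings_c, graduation_c=0.5):
        """Return True if the spread of the readings is within 0.1 graduation."""
        return max(readings_c) - min(readings_c) <= 0.1 * graduation_c

    print(readings_agree([0.02, 0.04, 0.03]))  # True: spread 0.02 <= 0.05
    print(readings_agree([0.02, 0.10, 0.03]))  # False: spread 0.08 > 0.05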

APPLYING THE CORRECTION AT ICE POINT*

Record the readings and compare them with readings from previous calibrations. If the ice-point reading is found to be higher or lower than the reading from a previous calibration, readings at all other temperatures will be correspondingly increased or decreased.
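
As an illustration of that rule, the sketch below applies an ice-point shift to a subsequent reading; the numbers are hypothetical.

    # If the ice point has drifted since the previous calibration, every reading
    # is assumed to have shifted by the same amount, so subtract the drift.

    def apply_ice_point_shift(observed_c, ice_point_now_c, ice_point_previous_c):
        """Correct an observed reading for drift in the ice-point reading."""
        drift = ice_point_now_c - ice_point_previous_c  # positive if reading high
        return observed_c - drift

    # Hypothetical example: the ice point read 0.00 deg C at the last calibration
    # and reads +0.10 deg C today, so all readings are taken to be 0.10 deg C high.
    print(apply_ice_point_shift(99.80, ice_point_now_c=0.10, ice_point_previous_c=0.00))  # 99.70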

*Reproduced in part from ASTM E77