Choosing Test Points (Test Temperatures) for Liquid-in-Glass Thermometers

If your thermometer is an ASTM thermometer (it will bear an inscription such as ‘ASTM 1C’), the standard test temperatures are specified by ASTM Specification E1 (and appear in our ASTM thermometer listings). These test temperatures were chosen with both the intended application and the behavior of the particular instrument in mind, and they should be used to assure that the calibration is performed in accordance with ASTM requirements.

If your thermometer has been calibrated previously, the test points have already been established and appear on the test report. Generally, those same test points should be repeated in future calibrations, which allows the user to see the magnitude and direction of any change with each new calibration.

If your thermometer has never been calibrated (or you don’t know, or you don’t have a test report), you can either let us choose the most suitable (default) test points, or you can specify the test points yourself.

To assist you in choosing test points, we offer the following considerations, drawn from ASTM and NIST recommendations. All three suggestions should be weighed together:

1. A minimum of three temperatures should be calibrated, generally low, medium and high on the scale of the instrument.

This is the old “10% – 50% – 90%” (of scale) rule. Example: you have a thermometer with a range of -10 to 110 °C in 1° divisions. Calibrating this thermometer at 0 °C (low on the scale), 50 °C (mid-scale), and 100 °C (high on the scale) is sufficient, and allows you to use the thermometer at virtually any temperature it measures by making a straight-line interpolation between the calibrated points (see the sketch below). See ASTM E77 and NBS Monograph 150 for more information on interpolating. Unfortunately, this is not adequate for all thermometers; see suggestion #2.
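To illustrate that straight-line interpolation, here is a minimal sketch in Python (the choice of language, the function name, and the correction values are ours, for illustration only; your actual corrections come from the test report):

```python
# Straight-line interpolation of a thermometer correction between two
# calibrated points, in the spirit of the interpolation described above.
# All correction values here are hypothetical.

def interpolate_correction(t, t_low, corr_low, t_high, corr_high):
    """Linearly interpolate the correction at temperature t (°C)
    from the corrections found at two bracketing test points."""
    fraction = (t - t_low) / (t_high - t_low)
    return corr_low + fraction * (corr_high - corr_low)

# Example: suppose the report gives corrections of +0.10 °C at 0 °C and
# -0.05 °C at 50 °C; estimate the correction for a reading of 20 °C.
corr = interpolate_correction(20.0, 0.0, +0.10, 50.0, -0.05)
print(f"Correction at 20 °C: {corr:+.2f} °C")   # -> +0.04 °C
```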

2. There should be no more than 100 graduations between any two calibrated temperatures; for the ultimate precision, calibrate every 50 divisions.

For example, if your thermometer has a range of -1 to 51 °C in 0.1° divisions, suggestion #1 above (calibrating three temperatures) does not provide an adequate calibration: the 52° span contains 520 divisions, so three points would leave far more than 100 graduations between neighbors. You must calibrate every 100 divisions, i.e. at 0, 10, 20, 30, 40 & 50 °C, to have an adequate calibration (a quick check is sketched below).
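A quick way to verify the 100-division rule is to count the graduations between each pair of proposed test points. The sketch below uses the -1 to 51 °C example above; the helper function and its name are ours, for illustration only:

```python
# Check the "no more than 100 graduations between test points" rule.
# Test points and division size are from the example above.

def divisions_between(t1, t2, division):
    """Number of scale graduations between two temperatures (°C)."""
    return abs(t2 - t1) / division

test_points = [0, 10, 20, 30, 40, 50]   # °C
division = 0.1                           # °C per graduation

for lo, hi in zip(test_points, test_points[1:]):
    n = divisions_between(lo, hi, division)
    assert n <= 100, f"{lo}-{hi} °C spans {n:.0f} divisions (> 100)"
    print(f"{lo:>4} to {hi:<4} °C: {n:.0f} divisions")
```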

3. If the temperatures used by the manufacturer for scale placement are known (or can easily be determined visually), the temperatures used for calibration should correspond to them.

This assures linearity of spacing between the calibrated points, allowing the user to interpolate intermediate values. Example: if your thermometer has a scale of 25 to 60 °C in 0.1° divisions, and a careful examination of the thermometer reveals that “scale placement marks” (usually a scratch in the glass, under a major graduation line, visible with a magnifying glass) were made at 25, 30, 40, 50 & 60 °C, then those temperatures should be used for the calibration to afford the best linearity.
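When several points are calibrated in this way, corrections at intermediate readings follow by piecewise-linear interpolation between adjacent calibrated points. A minimal sketch, using the placement-mark temperatures above and hypothetical correction values of our own invention:

```python
# Piecewise-linear interpolation across several calibrated points,
# e.g. points chosen at the scale placement marks. The corrections
# listed are hypothetical, for illustration only.
from bisect import bisect_right

marks = [25, 30, 40, 50, 60]                  # °C, calibrated points
corrs = [+0.02, 0.00, -0.03, -0.05, -0.02]    # °C, reported corrections

def correction_at(t):
    """Interpolate the correction at t between the two nearest marks."""
    if not marks[0] <= t <= marks[-1]:
        raise ValueError("outside the calibrated range")
    i = min(bisect_right(marks, t), len(marks) - 1)
    lo, hi = marks[i - 1], marks[i]
    f = (t - lo) / (hi - lo)
    return corrs[i - 1] + f * (corrs[i] - corrs[i - 1])

print(f"{correction_at(35.0):+.3f} °C")   # midway between 30 and 40 °C
```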

Can I have just one temperature calibrated?

Certainly. You are the customer, and the one who best knows your needs. Often a single-point calibration is all that is needed, for example when a thermometer is dedicated to measuring a single temperature and will not be used for other work. Under ANSI/NCSL Z540-1 we are required to identify the test report of a single-point calibration as a “limited calibration” or “not a full-scale calibration”. The intent is logical and desirable: if your thermometer is calibrated only at 37 °C, for a dedicated test, you want to know that, and not use it for a critical application at 50 °C.

I want to use the thermometer only across a defined range within its scale. Do I need a full-scale calibration?

No. We can choose test temperatures that “bracket” the range within which you are going to work (one way to pick such points is sketched below). As above, the test report will specify that this calibration is NOT a full-scale calibration, and that the thermometer can be used with full confidence only within the bracketed range.
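One plausible way to pick bracketing test temperatures, shown purely as an illustration (rounding outward to major graduations is our assumption, not a requirement of any standard):

```python
# Round the endpoints of the working range outward to the nearest
# major graduation, clamped to the thermometer's scale. All ranges
# here are hypothetical.
import math

def bracket(work_lo, work_hi, scale_lo, scale_hi, major=5.0):
    """Return test temperatures just outside the working range,
    rounded outward to major graduations and clamped to the scale."""
    lo = max(scale_lo, math.floor(work_lo / major) * major)
    hi = min(scale_hi, math.ceil(work_hi / major) * major)
    return lo, hi

# Example: working range 36.0-39.5 °C on a 25-60 °C thermometer.
print(bracket(36.0, 39.5, 25, 60))   # -> (35.0, 40.0)
```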