A2.2.1 Use liquid-in-glass thermometers with an accuracy after correction of 0.02°C or better, calibrated by a laboratory meeting the requirements of ISO 9000 or ISO/IEC 17025, and carrying certificates confirming that the calibration is traceable to a national standard. As an alternative, use thermometric devices, such as platinum resistance thermometers, of equal or better accuracy, with the same certification requirements.
A2.2.2 The scale correction of liquid-in-glass thermometers can change during storage and use; regular recalibration is therefore required. In a working laboratory this is most conveniently achieved by recalibrating the ice point and adjusting all of the main scale corrections by the change observed at the ice point.
A2.2.2.1 The interval for ice-point recalibration shall be no longer than six months (see NIST GMP 11). For new thermometers, monthly checking for the first six months is recommended. A change of one or more scale divisions in the ice point means that the thermometer may have been overheated or damaged and may be out of calibration. Such thermometers shall be removed from service until they have been inspected or recalibrated, or both. A complete recalibration of the thermometer, while permitted, is not necessary to meet the accuracy ascribed to this design of thermometer (see NIST Special Publication 819). Any change in ice-point correction shall be added to the other corrections of the original Report of Calibration.
A2.2.2.2 Other thermometric devices, if used, will also require periodic recalibration. Keep records of all recalibrations.
A2.2.3 Procedure for Ice-point Recalibration of Liquid-in-glass Thermometers.
A2.2.3.1 Unless otherwise listed on the certificate of calibration, the recalibration of calibrated kinematic viscosity thermometers requires that the ice-point reading be taken within 60 min after the thermometer has been at the test temperature for not less than 3 min.
A2.2.3.2 Select clear pieces of ice, preferably made from distilled or pure water. Discard any cloudy or unsound portions. Rinse the ice with distilled water and shave or crush into small pieces, avoiding direct contact with the hands or any chemically unclean objects. Fill the Dewar vessel with the crushed ice and add sufficient water to form a slush, but not enough to float the ice. As the ice melts, drain off some of the water and add more crushed ice. Insert the thermometer, and pack the ice gently about the stem, to a depth approximately one scale division below the 0°C graduation.
A2.2.3.3 After at least 3 min have elapsed, tap the thermometer gently and repeatedly at right angles to its axis while making observations. Successive readings taken at least 1 min apart shall agree within 0.005°C.
A2.2.3.4 Record the ice-point readings and determine the thermometer correction at this temperature from the mean reading. If the correction is found to be higher or lower than that corresponding to a previous calibration, change the correction at all other temperatures by the same value.
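The arithmetic in A2.2.3.4 (and A2.2.2.1) can be illustrated with a short sketch. This is not part of the standard; the function and variable names are hypothetical, and the correction values are assumed examples, not data from any Report of Calibration.

```python
# Illustrative sketch (not part of the standard): shifting every scale
# correction by the change observed at the ice point. Names are hypothetical.

def update_corrections(corrections, old_ice_point_corr, new_ice_point_corr):
    """Apply the ice-point change to all corrections.

    corrections: dict mapping scale temperature (degC) to the correction
    (degC) from the original Report of Calibration.
    """
    shift = new_ice_point_corr - old_ice_point_corr
    return {t: c + shift for t, c in corrections.items()}

# Assumed example: the ice-point correction moved from +0.010 degC to
# +0.015 degC, so every correction in the report increases by 0.005 degC.
report = {0.0: 0.010, 40.0: 0.012, 100.0: 0.008}
updated = update_corrections(report, 0.010, 0.015)
```

The same shift is applied at every temperature because only the position of the scale, not its spacing, is assumed to have changed.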
A2.2.3.5 During the procedure, apply the following conditions:
(1) The thermometer shall be supported vertically.
(2) View the thermometer with an optical aid that gives a magnification of approximately five and also eliminates parallax.
(3) Express the ice-point reading to the nearest 0.005°C.
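The acceptance and rounding conditions of A2.2.3.3 and A2.2.3.5(3) can be sketched as follows. This is an illustrative helper, not part of the standard; the readings shown are assumed examples.

```python
# Illustrative sketch (not part of the standard): checking that successive
# ice-point readings agree within 0.005 degC and expressing the mean to the
# nearest 0.005 degC. Function names are hypothetical.

def readings_agree(readings, tolerance=0.005):
    """True if the spread of the successive readings is within tolerance."""
    # Small slack guards against binary floating-point rounding of decimals.
    return max(readings) - min(readings) <= tolerance + 1e-9

def ice_point_reading(readings, resolution=0.005):
    """Mean of the readings, rounded to the nearest scale resolution."""
    mean = sum(readings) / len(readings)
    return round(mean / resolution) * resolution

obs = [0.015, 0.015, 0.020]     # degC, taken at least 1 min apart (assumed)
ok = readings_agree(obs)        # spread is 0.005 degC, so acceptable
value = ice_point_reading(obs)  # mean 0.0167 degC, expressed as 0.015 degC
```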
A2.2.4 When in use, immerse the thermometric device to the same depth as when it was fully calibrated. For example, if a liquid-in-glass thermometer was calibrated for the total immersion condition, it shall be immersed to the top of the mercury column, with the remainder of the stem and the expansion volume at the uppermost end exposed to room temperature and pressure. In practice, this means that the top of the mercury column shall be within a length equivalent to four scale divisions of the surface of the medium whose temperature is being measured.
A2.2.4.1 If this condition cannot be met, then an extra correction may be necessary.
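The extra correction referred to in A2.2.4.1 is commonly computed with the emergent-stem formula for mercury-in-glass thermometers, correction = k · n · (t_bath − t_stem), where k is the differential expansion coefficient (about 0.00016 per °C for mercury in glass on a Celsius scale), n is the length of the emergent liquid column in scale degrees, t_bath is the test temperature, and t_stem is the mean temperature of the emergent stem. The standard itself does not prescribe this formula here, so the sketch below is offered only as a common practice, with assumed example values.

```python
# Illustrative sketch (not part of the standard): the emergent-stem
# correction commonly applied to mercury-in-glass thermometers,
#   correction = k * n * (t_bath - t_stem)
# K below is the typical differential expansion coefficient for mercury in
# glass on a Celsius scale; all numeric inputs are assumed examples.

K_MERCURY_GLASS = 0.00016  # per degC

def emergent_stem_correction(n_degrees, t_bath, t_stem, k=K_MERCURY_GLASS):
    """Correction (degC) to add to the observed reading."""
    return k * n_degrees * (t_bath - t_stem)

# Assumed example: 20 scale degrees emergent, bath at 100 degC, mean
# emergent-stem temperature 25 degC.
corr = emergent_stem_correction(20, 100.0, 25.0)  # 0.24 degC
```

The correction is positive when the emergent stem is cooler than the bath, since the exposed liquid column then reads low.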
A3.1 Regularly check timers for accuracy and maintain records of such checks.