Written by John Sander
Brief Descriptions of Viscosity Tests
ASTM D2983 Low-Temperature Viscosity of Automotive Fluid Lubricants Measured by Brookfield Viscometer
The low-temperature, low-shear-rate viscosity of gear oils, automatic transmission fluids, torque and tractor fluids, and industrial and automotive hydraulic oils is often specified as Brookfield viscosity. In this test method, the fluid lubricant is held in a cooled bath for 16 hours, and a Brookfield viscometer is then used to determine its low-shear-rate viscosity over the temperature range of -5°C to -40°C and the viscosity range of 1,000 to 1,000,000 centipoise (cP). The result is reported in cP at a given temperature.
ASTM D445 Kinematic Viscosity of Transparent and Opaque Liquids
In this method, the time is measured for a fixed volume of liquid lubricant, either transparent or opaque, to flow under gravity through a calibrated capillary viscometer at a given temperature, usually 40°C or 100°C. The kinematic viscosity is then calculated by multiplying the measured flow time by the calibration constant for that viscometer. The viscosity is reported in centistokes (cSt) at a given temperature.
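To illustrate the arithmetic, the short Python sketch below performs the calibration-constant multiplication described above. The tube constant and flow time in the example are hypothetical; in practice the constant comes from the calibration certificate of the specific viscometer tube.

```python
# A minimal sketch of the ASTM D445 calculation: kinematic viscosity is the
# viscometer's calibration constant multiplied by the measured flow time.
# The tube constant and flow time below are hypothetical values.

def kinematic_viscosity_cst(tube_constant_cst_per_s: float, flow_time_s: float) -> float:
    """Kinematic viscosity (cSt) = calibration constant x measured flow time."""
    return tube_constant_cst_per_s * flow_time_s

# Hypothetical example: a tube constant of 0.1 cSt/s and a 460-second flow
# time at 40°C give a kinematic viscosity of 46.0 cSt.
print(kinematic_viscosity_cst(0.1, 460.0))  # -> 46.0
```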
ASTM D2270 Calculating Viscosity Index from Kinematic Viscosity at 40°C and 100°C
The viscosity index (VI) is an arbitrary measure of the variation in the kinematic viscosity of a petroleum product due to changes in temperature between 40°C and 100°C. A higher viscosity index indicates that the kinematic viscosity of the lubricant decreases relatively little as the temperature increases. The VI is simply reported as a numerical value that has no units.
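The sketch below illustrates the Procedure A calculation from ASTM D2270, which applies when the VI is 100 or less. Here U is the oil's kinematic viscosity at 40°C, and L and H are the 40°C viscosities of reference oils with VI = 0 and VI = 100 that have the same 100°C viscosity as the test oil; L and H are taken from the table in the standard, and the numbers in the example are hypothetical.

```python
# A sketch of the ASTM D2270 Procedure A formula, assuming L and H have
# already been looked up from the table in the standard. Hypothetical values.

def viscosity_index(u_40c: float, l_ref: float, h_ref: float) -> float:
    """VI = 100 * (L - U) / (L - H); the result is a unitless number."""
    return 100.0 * (l_ref - u_40c) / (l_ref - h_ref)

# Hypothetical example: U = 60 cSt with table values L = 100 cSt and H = 55 cSt
print(round(viscosity_index(60.0, 100.0, 55.0)))  # -> 89
```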
ASTM D4683 Measuring Viscosity at High Shear Rate and High Temperature by Tapered Bearing Simulator
Viscosity at the shear rate and temperature of this test method is thought to be representative of the conditions encountered in the bearings of automotive engines in severe service. In this method, the viscosity of the fluid is measured using a tapered bearing simulator-viscometer. This viscometer uses a closely fitted rotor inside a matched stator to subject the fluid to a shear rate of 1×10⁶ s⁻¹ at 150°C. The rotor exhibits a reactive torque response when it encounters resistance from the oil that fills the gap between the rotor and the stator. This torque is measured and compared to calibration oils with known torque values to determine the viscosity of the test oil. The resulting viscosity is then reported in units of centipoise (cP).
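As a rough illustration of how a torque reading might be converted to a high-shear viscosity, the Python sketch below interpolates a measured torque between calibration oils of known viscosity run under the same conditions. The readings and calibration points are hypothetical and are not values from the standard.

```python
# A simplified illustration of converting a tapered bearing simulator torque
# reading to a viscosity by interpolating against calibration oils.
# All readings and calibration points below are hypothetical.

CALIBRATION = [  # (torque reading, viscosity in cP) for reference oils
    (18.0, 2.6),
    (24.0, 3.5),
    (31.0, 4.6),
]

def hths_viscosity_cp(torque: float) -> float:
    """Linearly interpolate the test oil's viscosity from its torque reading."""
    points = sorted(CALIBRATION)
    for (t1, v1), (t2, v2) in zip(points, points[1:]):
        if t1 <= torque <= t2:
            return v1 + (v2 - v1) * (torque - t1) / (t2 - t1)
    raise ValueError("torque reading outside the calibration range")

# Hypothetical reading of 27.0 falls between the 3.5 cP and 4.6 cP oils.
print(round(hths_viscosity_cp(27.0), 2))  # -> 3.97
```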
ASTM D4684 Determination of Yield Stress and Apparent Viscosity of Engine Oils at Low Temperature
When a fluid is cooled, the rate and duration of cooling may affect the oil's yield stress and viscosity. In this test method, oil is cooled slowly through a temperature range in which wax crystallization is known to occur, followed by rapid cooling to the final test temperature. Correlations have been found between failures in this test and lack of pumpability in real field applications. These field failures are thought to be the result of the oil forming a gel structure that produces excessive yield stress or viscosity of the engine oil, or both. In this test, the test fluid is placed in the cells of the Mini Rotary Viscometer (MRV), held at 80°C for a short time, then cooled at a programmed cooling rate over a period exceeding 45 hours to a final test temperature between -15°C and -35°C. A low torque is applied to a rotor shaft to measure the yield stress. A higher torque is then applied to determine the apparent viscosity of the sample oil. The low-temperature viscosity is reported in the standard unit of millipascal-second (mPa-s) but may also be reported in units of centipoise (cP), which is numerically equal to mPa-s.
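The sketch below illustrates the general rotational-viscometer relationship behind the apparent-viscosity measurement: for a calibrated cell, apparent viscosity is proportional to the applied torque divided by the resulting rotor speed. The cell constant and readings are hypothetical illustrations, not values taken from ASTM D4684.

```python
# A rough sketch of the torque/speed relationship for a calibrated rotational
# viscometer cell. The cell constant and readings below are hypothetical.

def apparent_viscosity_mpas(cell_constant: float, torque: float, speed_rpm: float) -> float:
    """Apparent viscosity (mPa-s) from applied torque and measured rotor speed."""
    return cell_constant * torque / speed_rpm

# Hypothetical example: cell constant 1,000, applied torque 150 (instrument
# units), rotor speed 5 rpm -> 30,000 mPa-s apparent viscosity.
print(apparent_viscosity_mpas(1000.0, 150.0, 5.0))  # -> 30000.0
```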
ASTM D5293 Apparent Viscosity of Engine Oils Between -5°C and -30°C Using the Cold-Cranking Simulator
The apparent viscosity of automotive oils at low temperatures is measured using the cold-cranking simulator (CCS). As the name suggests, results from this test have been correlated with low-temperature engine cranking field data. In this test method, an electric motor drives a rotor that is closely fitted inside a stator. The space between the rotor and the stator is filled with oil. The test temperature, in the range of -5°C to -30°C, is measured near the stator inner wall and maintained by a regulated flow of refrigerated coolant through the stator. The speed of the rotor is calibrated as a function of viscosity, and the test oil viscosity is determined from this calibration and the measured rotor speed. Shear stresses, shear rates and viscosities fall in the ranges of approximately 50,000 to 100,000 pascals, 10⁴ to 10⁵ s⁻¹, and 500 to 10,000 mPa-s, respectively. The resulting viscosity is reported in units of millipascal-second (mPa-s) or centipoise (cP).
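The following sketch shows the calibration idea in simplified form: rotor speed falls as viscosity rises, so a measured speed is interpolated between calibration oils of known viscosity run at the same test temperature. All speeds and viscosities shown are hypothetical.

```python
# A simplified illustration of determining CCS apparent viscosity from rotor
# speed via calibration oils. All values below are hypothetical.

CALIBRATION = [  # (rotor speed in rpm, viscosity in mPa-s) for reference oils
    (90.0, 8100.0),
    (150.0, 5200.0),
    (240.0, 3200.0),
]

def ccs_viscosity_mpas(speed_rpm: float) -> float:
    """Linearly interpolate apparent viscosity from the measured rotor speed."""
    points = sorted(CALIBRATION)  # ascending speed
    for (s1, v1), (s2, v2) in zip(points, points[1:]):
        if s1 <= speed_rpm <= s2:
            return v1 + (v2 - v1) * (speed_rpm - s1) / (s2 - s1)
    raise ValueError("rotor speed outside the calibration range")

# Hypothetical reading of 120 rpm falls between the 8,100 and 5,200 mPa-s oils.
print(round(ccs_viscosity_mpas(120.0)))  # -> 6650
```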
ASTM D217 Standard Test Method for Cone Penetration of Lubricating Greases
The cone penetration test evaluates the consistency of lubricating greases over the full range of NLGI grades from 000 to 6. Although no correlation has been developed between cone penetration results and field service, the test is widely used for specification purposes, such as in users' material specifications and suppliers' manufacturing specifications. In this test, consistency is determined by the penetration of a cone of specified dimensions, mass and finish into a standard amount of grease at 25°C. The penetration is measured in tenths of a millimeter as the depth to which the cone sinks under gravity into the surface of the grease within 5 seconds. The NLGI grade ranges are based on worked penetration, in which the grease is first subjected to 60 strokes of shear in a standardized grease worker before the cone penetration measurement is performed.
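For illustration, the short sketch below maps a worked (60-stroke) penetration reading to its NLGI grade using the commonly published worked-penetration bands. It is a lookup aid only, not part of the ASTM D217 procedure.

```python
# A small lookup sketch mapping a worked cone penetration reading, in tenths
# of a millimeter, to its NLGI consistency grade. The bands below are the
# commonly published NLGI worked-penetration ranges; readings that fall
# between bands have no NLGI grade.

NLGI_BANDS = [  # (grade, minimum penetration, maximum penetration) in 0.1 mm
    ("000", 445, 475),
    ("00", 400, 430),
    ("0", 355, 385),
    ("1", 310, 340),
    ("2", 265, 295),
    ("3", 220, 250),
    ("4", 175, 205),
    ("5", 130, 160),
    ("6", 85, 115),
]

def nlgi_grade(worked_penetration: float) -> str | None:
    """Return the NLGI grade for a worked penetration, or None if between bands."""
    for grade, low, high in NLGI_BANDS:
        if low <= worked_penetration <= high:
            return grade
    return None

# A worked penetration of 280 (0.1 mm) corresponds to the familiar NLGI 2 grade.
print(nlgi_grade(280))  # -> 2
```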