ASTM D5133 Standard Test Method for Low Temperature, Low Shear Rate, Viscosity/Temperature Dependence of Lubricating Oils Using a Temperature-Scanning Technique
1. Scope
1.1 This test method was developed to measure the apparent viscosity of engine oil at low temperatures.

1.2 A shear rate of approximately 0.2 s⁻¹ is produced at shear stresses below 100 Pa. Apparent viscosity is measured continuously as the sample is cooled at a rate of 1°C/h over the range -5 to -40°C, or to the temperature at which the viscosity exceeds 40 000 mPa·s (cP).
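
The following Python sketch is illustrative only and is not part of this test method; it restates the scan logic of 1.2. The data-acquisition function read_apparent_viscosity_mPas is hypothetical, standing in for the automated analysis described in 3.2.4.

    def run_scan(read_apparent_viscosity_mPas, start_c=-5.0, end_c=-40.0, step_c=1.0):
        # Record apparent viscosity at 1°C intervals while cooling at 1°C/h,
        # stopping at -40°C or when viscosity exceeds 40 000 mPa·s (see 1.2).
        readings = []
        temp_c = start_c
        while temp_c >= end_c:
            viscosity = read_apparent_viscosity_mPas(temp_c)  # hypothetical acquisition call
            readings.append((temp_c, viscosity))
            if viscosity > 40_000:   # viscosity limit from 1.2
                break
            temp_c -= step_c         # 1°C lower after one hour of cooling
        return readings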

1.3 The measurements resulting from this test method are viscosity, the maximum rate of viscosity increase (Gelation Index) and the temperature at which the Gelation Index occurs.

1.4 Applicability to petroleum products other than engine oils has not been determined in preparing this test method.

1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

1.6 The values stated in this test method are in SI units.

2. Referenced Documents
2.1 ASTM Standards:
D341 Test Method for Viscosity-Temperature Charts for Liquid Petroleum Products
D3829 Test Method for Predicting the Borderline Pumping Temperature of Engine Oils
D4684 Test Method for Determination of Yield Stress and Apparent Viscosity of Engine Oils at Low Temperature

3. Terminology
3.1 Definitions:
3.1.1 apparent viscosity, n - the viscosity obtained by use of this test method.

3.1.1.1 Discussion - See 3.1.6 for definition of viscosity and units.

3.1.2 Newtonian oil, n - an oil that, at a given temperature, exhibits a constant viscosity at all shear rates or shear stresses.

3.1.3 non-Newtonian oil, n - an oil that, at a given temperature, exhibits a viscosity that varies with shear stress or shear rate.

3.1.4 shear rate, n - velocity gradient perpendicular to the direction of flow.

3.1.4.1 Discussion - The SI unit for shear rate is the reciprocal second (1/s; also s⁻¹).

3.1.5 shear stress, n - force per unit area in the direction of flow.

3.1.5.1 Discussion - The SI unit for shear stress is the pascal (Pa).

3.1.6 viscosity, n - that property of a fluid which resists flow.

3.1.6.1 Discussion - Viscosity is defined as the ratio of the applied shear stress (the force causing flow) to the shear rate (the resultant velocity of flow per unit distance from a stationary surface wetted by the fluid). Mathematically expressed:
viscosity = shear stress/shear rate or, symbolically, η = τ/G (Eq 1)

in which the symbols in the second portion of Eq 1 are defined by the terms in the first portion of the equation. The SI unit for viscosity used herein is the millipascal second (mPa·s).
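
As a worked illustration (not part of the definition), a shear stress of 8 Pa at the nominal shear rate of 0.2 s⁻¹ cited in 1.2 corresponds to an apparent viscosity of

    η = τ/G = 8 Pa / 0.2 s⁻¹ = 40 Pa·s = 40 000 mPa·s

which is the viscosity limit given in 1.2.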

3.2 Definitions of Terms Specific to This Test Method:
3.2.1 air-binding oils, n - those engine oils whose borderline pumping temperatures are determined by a combination of gelation and viscous flow.

3.2.2 borderline pumping temperature, n - that temperature at which an engine oil may have such poor flow characteristics that the engine oil pump may not be capable of supplying sufficient lubricant to the engine.

3.2.3 calibration oils, n - Newtonian oils developed and used to calibrate the viscometer drive module over the viscosity range required for this test method.

3.2.3.1 Discussion - These calibration oils are specially blended to give sufficient sensitivity and range for the special viscometer head used.

3.2.4 computer-programmed automated analysis, n - the use of modern techniques for acquiring analog data, converting these data to digital values, and using this information to automatically record and analyze torque output from the viscometer drive module and to render this information into tabular data and plotted relationships.

3.2.4.1 analog-to-digital (A-D) converter, n - a device for converting continuously produced electrical signals into discrete numerical values that can be analyzed by computer.
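
As an illustration only, digitized torque readings might be reduced to apparent viscosity through a linear calibration fitted from Newtonian calibration oils of known viscosity (3.2.3). The linear form and the function names below are assumptions for the sketch, not taken from this test method; a given instrument's calibration may differ.

    def fit_calibration(torque_counts, known_viscosities_mPas):
        # Least-squares fit of viscosity = slope * counts + intercept from
        # calibration-oil readings (illustrative calibration form).
        n = len(torque_counts)
        mean_x = sum(torque_counts) / n
        mean_y = sum(known_viscosities_mPas) / n
        sxx = sum((x - mean_x) ** 2 for x in torque_counts)
        sxy = sum((x - mean_x) * (y - mean_y)
                  for x, y in zip(torque_counts, known_viscosities_mPas))
        slope = sxy / sxx
        intercept = mean_y - slope * mean_x
        return slope, intercept

    def counts_to_viscosity(counts, slope, intercept):
        # Convert a digitized torque reading to apparent viscosity, mPa·s.
        return slope * counts + intercept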

3.2.5 critical pumpability temperature, n - the temperature in the viscometer bath at which an oil reaches a chosen critical pumpability viscosity (see 3.2.6).

3.2.6 critical pumpability viscosity, n - that apparent viscosity believed to cause pumpability problems in an engine. This apparent viscosity is chosen to test an oil for its critical pumpability temperature.

3.2.7 flow-limited oils, n - those oils whose borderline pumping temperatures are determined by viscous flow.

3.2.8 gelation, n - a rheological condition of an oil characterized by a marked increase in the flow resistance over and above the normal exponential increase of viscosity with decreasing temperature, particularly at lower shear stresses and temperatures.

3.2.8.1 Discussion - Gelation has been attributed to a process of nucleation and crystallization of components of the engine oil and the formation of a structure.

3.2.9 Gelation Index, n - the maximum value of the incremental ratio
-[(log log η1 - log log η2)/(log T1 - log T2)] (Eq 2)

(in which η is the dynamic viscosity and T is the absolute temperature in kelvins) over the temperature range scanned when the incremental decrease in temperature is 1 K.
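
A minimal Python sketch of Eq 2 as written above (illustrative only): readings are assumed to be taken at 1 K decrements per 3.2.9, and the temperature at which the maximum occurs is reported in degrees Celsius per 3.2.11. The function name is hypothetical.

    import math

    def gelation_index(temps_k, viscosities):
        # temps_k: absolute temperatures (K) at 1 K decrements, highest first.
        # viscosities: dynamic viscosities at those temperatures.
        # Returns the Gelation Index (maximum of the incremental ratio in Eq 2)
        # and the temperature, in °C, at which it occurs.
        def loglog(h):
            return math.log10(math.log10(h))
        best_index, best_temp_c = float("-inf"), None
        for i in range(len(temps_k) - 1):
            t1, t2 = temps_k[i], temps_k[i + 1]          # t2 is 1 K below t1
            h1, h2 = viscosities[i], viscosities[i + 1]
            ratio = -(loglog(h1) - loglog(h2)) / (math.log10(t1) - math.log10(t2))
            if ratio > best_index:
                best_index, best_temp_c = ratio, t2 - 273.15
        return best_index, best_temp_c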

3.2.9.1 Discussion - The technique of deriving the Gelation Index was first developed and practiced⁶ by collecting information from a strip-chart recording and applying the empirical MacCoull-Walther-Wright equation (Test Method D341). For further information, see Appendix X1.

3.2.10 Gelation Index reference oils, n - non-Newtonian oils chosen to give certain levels of Gelation Index as a check on instrument performance.

3.2.11 Gelation Index temperature, n - the temperature (T2 in Eq 2), expressed in degrees Celsius, at which the Gelation Index occurs.

3.2.12 pre-treatment sample heating bath, n - a water or air bath used to heat the samples for 1.5 to 2.0 h at 90 ± 2°C before testing.

3.2.13 programmable liquid cold bath, n - a liquid bath having a temperature controller capable of being programmed to run the calibration and the analysis portions of the test method.

3.2.14 temperature controller, n - a programmable device which, when properly programmed, ramps the temperature upward or downward at a chosen rate or series of steps while simultaneously controlling temperature excursions.

3.2.14.1 calibration program, n - a program to run the required series of temperatures at which the torque values necessary to calibrate the viscometer drive module are collected and analyzed.

3.2.14.2 test program, n - a program to run the test oil analysis at 1°C/h temperature decrease.

3.2.14.3 hold program, n - a program to reach and hold the programmable liquid cold bath at -5°C.
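
As an illustration only, the hold and test programs of 3.2.14.2 and 3.2.14.3 amount to a setpoint schedule of the kind sketched below; the -40°C end temperature is taken from 1.2, while the hourly step and the function name are assumptions for the sketch.

    def test_program_setpoints(end_c=-40.0):
        # Hold at -5°C (3.2.14.3), then ramp downward at 1°C/h (3.2.14.2)
        # until end_c is reached. Returns (elapsed hours, bath setpoint in °C) pairs.
        setpoints = [(0.0, -5.0)]
        hours, temp_c = 0.0, -5.0
        while temp_c > end_c:
            hours += 1.0
            temp_c -= 1.0               # 1°C/h decrease
            setpoints.append((hours, temp_c))
        return setpoints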

3.2.15 test cell, n - the combination of the rotor and stator. Critical elements of the test cell are sketched in Fig. 1.

3.2.15.1 rotor, n - a titanium rotor sized to give a compromise of sensitivity and range in the determination of viscosity and gelation using this test method.

3.2.15.2 stator, n - a precision-bore borosilicate glass tube to which a measured amount of oil is added for the test and within which the specially made rotor turns.

3.2.15.3 stator collar, n - a clamp for the stator which also positions it on the test cell alignment device.

3.2.16 test cell alignment device, n - a special device used to support the viscometer drive module while keeping the stator and the rotor coaxial and vertical with respect to the viscometer driveshaft. Later designs permit dry gas to enter the cell to prevent moisture and frost buildup.

3.2.17 test oil, n - any oil for which apparent viscosity is to be determined using the procedure described by this test method.

3.2.18 viscometer drive module, n - the rotor drive and torque-sensing component of a rotational viscometer.

3.2.19 viscometer module support, n - a part of the test cell alignment device supporting the viscometer drive module.