ASTM D7303 Standard Test Method for Determination of Metals in Lubricating Greases by Inductively Coupled Plasma Atomic Emission Spectrometry
13. Calibration
13.1 The linear range must be established once for the particular instrument being used. This is accomplished by running intermediate standards between the blank and the working standard, and by running standards containing higher concentrations than the working standards. Analyses of test specimen solutions must be performed within the linear range of response.
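
The linearity assessment in 13.1 amounts to fitting the instrument response across the blank, intermediate, and above-working standards. The following minimal sketch (Python) illustrates one way to do this; all concentration and intensity values are illustrative, and the acceptance criterion (r >= 0.999) is a common laboratory convention, not a limit prescribed by this test method.

    # Illustrative linearity check: least-squares fit of intensity vs.
    # concentration for a blank plus standards spanning the working range.
    concentrations = [0.0, 1.0, 5.0, 10.0, 50.0, 100.0]   # mg/kg (assumed)
    intensities    = [12.0, 850.0, 4210.0, 8390.0, 41800.0, 82100.0]

    n = len(concentrations)
    mean_c = sum(concentrations) / n
    mean_i = sum(intensities) / n

    # Least-squares slope, intercept, and correlation coefficient.
    sxy = sum((c - mean_c) * (i - mean_i) for c, i in zip(concentrations, intensities))
    sxx = sum((c - mean_c) ** 2 for c in concentrations)
    syy = sum((i - mean_i) ** 2 for i in intensities)
    slope = sxy / sxx
    intercept = mean_i - slope * mean_c
    r = sxy / (sxx * syy) ** 0.5

    print(f"slope={slope:.2f}  intercept={intercept:.1f}  r={r:.5f}")
    # r >= 0.999 is an assumed laboratory criterion, not from the standard.
    print("response acceptably linear" if r >= 0.999 else "restrict working range")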

13.2 At the beginning of the analysis of each batch of specimens, perform a two-point calibration consisting of the blank and the working standard. Use the check standard to determine whether each element is in calibration. When the results obtained with the check standard are within 5 % of the expected concentrations for all elements, proceed with the test specimen analyses. Otherwise, make any necessary adjustments to the instrument and repeat the calibration. Repeat this check with the check standard after every five samples.
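
The 5 % acceptance test in 13.2 is a simple relative-error comparison per element. The sketch below illustrates it; the element list, expected values, and measured values are assumed for illustration only.

    # Illustrative check-standard verification against the 5 % criterion.
    expected = {"Zn": 100.0, "P": 90.0, "Ca": 250.0}   # mg/kg (assumed)
    measured = {"Zn": 102.3, "P": 88.9, "Ca": 262.1}   # mg/kg (assumed)

    def in_calibration(expected, measured, tolerance=0.05):
        """Return per-element pass/fail for the +/- 5 % check."""
        return {el: abs(measured[el] - exp) / exp <= tolerance
                for el, exp in expected.items()}

    status = in_calibration(expected, measured)
    if all(status.values()):
        print("All elements within 5 %: proceed with test specimens")
    else:
        failed = [el for el, ok in status.items() if not ok]
        print(f"Recalibrate; out of tolerance: {failed}")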

13.3 Calculate the calibration factors from the intensity ratios. Alternatively, use the computer programs provided by the instrument manufacturer to calibrate the instrument.
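
Where the calibration factor is computed manually from intensity ratios per 13.3, it reduces to concentration per unit of (analyte/internal-standard) intensity ratio over the two calibration points. The following sketch assumes illustrative intensities and a 50 mg/kg working standard; real instruments perform this in the vendor software.

    # Illustrative two-point calibration factor from intensity ratios.
    c_working = 50.0                        # working-standard conc., mg/kg (assumed)

    ratio_blank = 120.0 / 95000.0           # analyte / internal-standard intensity
    ratio_std   = 41500.0 / 94200.0

    # Calibration factor: concentration per unit intensity ratio.
    factor = c_working / (ratio_std - ratio_blank)

    # Applying the factor to an unknown's intensity ratio:
    ratio_unknown = 18300.0 / 95500.0
    c_unknown = (ratio_unknown - ratio_blank) * factor
    print(f"factor = {factor:.1f} mg/kg per unit ratio; unknown = {c_unknown:.2f} mg/kg")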

14. Sample Analysis
14.1 Determine the ICP detection limits for all elements of interest as follows: Prepare a dilute acid blank with an (optional) internal standard by pipetting 1000 µL of the internal standard stock solution into a 50 mL volumetric flask, and dilute to volume with dilute acid. Seal the flask and mix well. Perform ten consecutive analyses of this solution for all elements of interest under the same conditions and parameters used to analyze the two-point calibration standards. With the ICP instrument software, determine the standard deviation of these ten results for each element of interest. The detection limit of each element is three times its standard deviation. Determine detection limits daily, after calibration.
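
The arithmetic in 14.1 is simply three times the standard deviation of ten replicate blank readings per element. A minimal sketch follows; the replicate values are illustrative, and the sample standard deviation (n - 1) is used here, which is what most instrument software reports.

    # Illustrative detection-limit calculation: DL = 3 x std. dev. of ten blanks.
    from statistics import stdev

    blank_replicates = {   # mg/kg readings, assumed values
        "Zn": [0.021, 0.018, 0.025, 0.019, 0.022, 0.020, 0.017, 0.023, 0.021, 0.019],
        "P":  [0.045, 0.051, 0.048, 0.043, 0.050, 0.047, 0.049, 0.044, 0.046, 0.052],
    }

    for element, readings in blank_replicates.items():
        assert len(readings) == 10, "14.1 calls for ten consecutive analyses"
        dl = 3 * stdev(readings)   # sample standard deviation (n - 1)
        print(f"{element}: detection limit = {dl:.4f} mg/kg")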

14.2 Analyze the test specimen solutions in the same manner as the calibration standards (that is, the same integration time, background correction points (optional), plasma conditions, etc.). Between test specimens, nebulize water for a minimum of 60 s.

14.3 When the concentration of any analyte exceeds the linear range of the calibration, dilute the test specimen solution to bring it within the calibration range and reanalyze it.
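
Selecting a dilution per 14.3 can be reduced to finding the smallest factor that brings the expected reading comfortably inside the linear range. The sketch below assumes an illustrative linear-range ceiling, an over-range result, and an 80 % target, none of which are prescribed by the test method.

    # Illustrative choice of dilution factor for an over-range analyte.
    import math

    linear_max = 100.0        # top of established linear range, mg/kg (assumed)
    measured   = 640.0        # over-range result for the analyte, mg/kg (assumed)

    # Smallest integer dilution bringing the expected reading below ~80 %
    # of the ceiling (an assumed margin, not a requirement of the method).
    target = 0.8 * linear_max
    dilution_factor = math.ceil(measured / target)
    expected_after = measured / dilution_factor
    print(f"dilute {dilution_factor}x -> expect ~{expected_after:.1f} mg/kg, then reanalyze")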

14.4 Analyze the check standard after every fifth test specimen solution. If any result is not within 5 % of the expected concentration, recalibrate the instrument and reanalyze all test specimen solutions analyzed since the previous acceptable check standard analysis.
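
The bracketing rule in 14.4 implies that a failed check standard invalidates every specimen analyzed since the last passing check. The sketch below illustrates that logic against an assumed run log; the specimen names and pass/fail outcomes are illustrative only.

    # Illustrative bracketing: flag specimens to reanalyze after a failed check.
    run_log = [
        ("check", True),                 # initial calibration check passes
        ("S1", None), ("S2", None), ("S3", None), ("S4", None), ("S5", None),
        ("check", True),
        ("S6", None), ("S7", None), ("S8", None), ("S9", None), ("S10", None),
        ("check", False),                # this check fails the 5 % criterion
    ]

    to_reanalyze, pending = [], []
    for name, check_ok in run_log:
        if name == "check":
            if check_ok:
                pending = []                       # bracketed specimens accepted
            else:
                to_reanalyze.extend(pending)       # reanalyze back to last good check
                pending = []
        else:
            pending.append(name)

    print("reanalyze after recalibration:", to_reanalyze)   # S6 .. S10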

15. Quality Assurance/Quality Control (required)
15.1 Confirm the performance of the instrument and the test procedure by analyzing a quality control (QC) sample.
15.1.1 When QA/QC protocols are already established in the testing facility, these may be used to confirm the reliability of the test result.

15.1.2 When there is no QA/QC protocol established in the testing facility, Appendix X1 can be used as the QA/QC protocol.

NOTE 10 - Further guidance on laboratory QA/QC protocols can be found in Guide D6792.

15.2 Users of this test method are advised that, in contractual agreements, one or more of the contracting parties may make Appendix X1 a mandatory practice.

16. Calculation
16.1 Calculate the elemental concentrations by multiplying the concentration determined in the diluted test specimen solution by the dilution factor. Concentrations can be calculated manually or by the instrument software when such a feature is available.
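
For a manual calculation per 16.1, the dilution factor follows from how the test specimen was prepared. The sketch below assumes a gravimetric preparation (specimen mass made up to a final volume); the specimen mass, final volume, and measured value are illustrative only.

    # Illustrative final calculation: conc. in grease = measured conc. x dilution factor.
    specimen_mass_g   = 0.50          # grease weighed into preparation (assumed)
    final_volume_mL   = 50.0          # volume the solution was made up to (assumed)
    measured_mg_per_L = 12.4          # analyte found in diluted solution (assumed)

    # Dilution factor in L of solution per kg of grease.
    dilution_factor = (final_volume_mL / 1000.0) / (specimen_mass_g / 1000.0)

    # mg/kg of the element in the original grease.
    concentration_mg_per_kg = measured_mg_per_L * dilution_factor
    print(f"{concentration_mg_per_kg:.1f} mg/kg in the grease")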