
Central Texas College 2401 Clinical Chemistry

Calibration, Standards, and Quality Control

Calibration: Calibration is a method used to verify that the measurements obtained by an instrument are accurate. Several methods exist for calibrating instruments; one common method uses reagent standards.

Standards: Standards are reagents used to perform the calibration. Each standard reagent contains a known value or concentration of the analyte. Standard reagents are prepared and placed on the automated instrument to perform the calibration, and the results help verify that the instrument is in proper working condition.

In the laboratory, calibration of an instrument is performed when one or more of the following events occur:
- Receipt of a new instrument
- At set intervals established by the accreditation agency (monthly, quarterly, annually, etc.)
- After an extensive power failure
- Relocation of the instrument
- Troubleshooting of the instrument
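The idea of calibrating with standards of known concentration can be sketched in code. The following is a minimal illustration, not any instrument's actual procedure: it fits a straight line relating instrument signal to concentration from a set of hypothetical standards, then uses that line to convert a raw signal into a reported concentration. All values are invented for the example.

```python
# Sketch of calibration with standard reagents of known concentration.
# The standards and signal values below are hypothetical.

def fit_calibration(known_concs, measured_signals):
    """Least-squares straight line: signal = slope * conc + intercept."""
    n = len(known_concs)
    mean_x = sum(known_concs) / n
    mean_y = sum(measured_signals) / n
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(known_concs, measured_signals))
    sxx = sum((x - mean_x) ** 2 for x in known_concs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

def signal_to_concentration(signal, slope, intercept):
    """Convert a raw instrument signal into a concentration via the fit."""
    return (signal - intercept) / slope

# Hypothetical glucose standards at 50, 100, and 200 mg/dL
slope, intercept = fit_calibration([50, 100, 200], [0.25, 0.50, 1.00])
conc = signal_to_concentration(0.75, slope, intercept)  # → 150.0 mg/dL
```

Once the fitted line reproduces the known standard values, the instrument is considered to be reading accurately for that analyte.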

Once the instrument is calibrated and verified to be in proper working order, it must undergo quality control verification.

Quality control: In the laboratory, quality control reagents contain a known level of analyte. Quality control material may be ordered from the manufacturer; the reagents have been previously tested and assigned target values for each specific test. When a quality control sample is run on the automated instrument, the reported value should fall within the specific parameters set by the manufacturer. These parameters are referred to as the reference range or reference interval. When the quality control results fall within the appropriate reference range, the instrument is qualified for patient testing. When the quality control results fall outside the reference range, the instrument is not qualified for patient testing.

Reference range: Reference range values are established by the manufacturer. A reference range covers a 95% confidence interval (two standard deviations) for the value of the analyte tested. To develop the reference range, the analyte is tested repeatedly and a mean value is established. Using the mean and statistical calculations, the standard deviation can be generated.

1 Standard Deviation (1 SD) - The value of the quality control reagent will fall within this range about 68% of the time.
2 Standard Deviations (2 SD) - The value of the quality control reagent will fall within this range about 95% of the time.
3 Standard Deviations (3 SD) - The value of the quality control reagent will fall within this range about 99.7% of the time.

In the clinical chemistry laboratory, 2 Standard Deviations (2 SD) are used to determine acceptable quality control testing. Results that fall outside of the 2 SD range are considered invalid; corrective action must be taken and documented.
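The mean and standard deviation that define these limits come from repeated measurements of the control material. A minimal sketch, using Python's standard `statistics` module and invented control values:

```python
import statistics

# Hypothetical repeated measurements of one control material (mg/dL)
runs = [98, 101, 100, 99, 102, 100, 97, 103, 100, 100]

mean = statistics.mean(runs)   # central value of the control
sd = statistics.stdev(runs)    # sample standard deviation

# 2 SD acceptance limits used in the chemistry laboratory
lower = mean - 2 * sd
upper = mean + 2 * sd
```

For these ten runs the mean is 100 and the SD is about 1.76, giving 2 SD limits of roughly 96.5 to 103.5; any result outside that window would trigger corrective action.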
Levey-Jennings Graph: The Levey-Jennings chart plots quality control results on a linear graph to reveal changes in control value patterns over time. These patterns may be described as shifts or trends. On the graph, the mean value of the control reagent is drawn as a horizontal line in the middle of the chart. Above and below the mean line are the 1 SD limits, and above and below the 1 SD lines is the 2 SD area for plotting quality control results.

[Levey-Jennings chart: horizontal lines at the mean (x) and at ±1 SD, ±2 SD, and ±3 SD; a shift pattern is marked with (!) and a trend pattern with (*)]

Trend (*) - A pattern on the graph showing a gradual drift of results in one direction over several consecutive runs.
Shift (!) - A pattern on the graph showing an abrupt change, with several consecutive results settling on one side of the mean.

Refer to page 72 in the Clinical Chemistry textbook.
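Shift and trend patterns on a Levey-Jennings chart can be detected programmatically. The sketch below uses a run length of six consecutive points, a common rule of thumb; the run length and the control values are assumptions for illustration, not a rule from this course:

```python
def detect_shift(results, mean, run_length=6):
    """Shift: `run_length` consecutive results all on one side of the mean."""
    for i in range(len(results) - run_length + 1):
        window = results[i:i + run_length]
        if all(r > mean for r in window) or all(r < mean for r in window):
            return True
    return False

def detect_trend(results, run_length=6):
    """Trend: `run_length` consecutive results moving steadily one way."""
    for i in range(len(results) - run_length + 1):
        w = results[i:i + run_length]
        if all(a < b for a, b in zip(w, w[1:])) or \
           all(a > b for a, b in zip(w, w[1:])):
            return True
    return False

# Hypothetical control values around a mean of 100
detect_shift([103, 104, 103, 105, 104, 103], mean=100)  # True: all above mean
detect_trend([100, 101, 102, 103, 104, 105])            # True: steady rise
```

Either pattern signals a developing problem (for example, reagent deterioration or an instrument drift) even while individual points may still fall inside the 2 SD limits.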
