Reducing Waste: Optimizing Calibration Intervals

One of the most often overlooked areas in Lean Six Sigma for reducing waste is the optimization of calibration (i.e., metrological confirmation) intervals.

Why should you care about optimizing calibration intervals?

Most M&TE are on arbitrary 12-month calibration intervals… as if “one size” fits all. These are typically “manufacturer recommended” intervals, which are often extremely conservative because the manufacturer wants its calibration labs to stay busy. And most companies happily pay them to do so. Yes… over-calibrating M&TE reduces risk… but only to a degree1). Is it efficient or cost-effective? No. Does the reduction in risk justify workers being without their M&TE, or the company paying excessive amounts for this “over-calibration”? In most situations, the answer is a resounding NO!

If M&TE calibration intervals were instead optimized based upon performance, the optimal interval for some instruments might be 18 months, 24 months, or even longer, resulting in immediate, tangible cost savings. And while a few instruments may require shorter intervals (e.g., 9 months), shortening them delivers immediate intangible savings through reduced risk.

One company I visited had over 13,000 instruments in their calibration system. They'd contracted all of the calibrations to a metrology laboratory that “optimized” the calibration intervals for them. The number of instruments found “Out-of-Tolerance” each year dropped from 5% (650 instruments) to less than 0.5% (only 65 instruments)!

Methodologies for the Determination of Calibration Intervals

Methodologies for the determination of calibration intervals are defined in documents such as ILAC-G24, “Guidelines for the determination of calibration intervals of measuring instruments”, and NCSL International RP-1, “Establishment and Adjustment of Calibration Intervals”.

There are many methods and theories for calculating calibration intervals, such as those found in NCSL RP-1: Method S1 (Classical Method), Method S2 (Binomial Method), and Method S3 (Renewal Time Method). As a result, it can be difficult to choose the best method to determine the interval (Ref. "A Quantitative Comparison of Calibration Interval Adjustment Methods").

<note>Integrated Sciences Group (ISG) offers IntervalMAX, a free “Method S2” interval calculator (for MS Windows only) augmented by the “Method A3 Interval Tester” (which adjusts for “sparse” data).</note>
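
To give a sense of what such a calculation looks like, here is a rough sketch based on a simple exponential-reliability model: the fraction of calibrations found in-tolerance at the current interval fixes a failure rate, which is then solved for the interval that meets a target end-of-period reliability. This is only an illustration of the general idea, not a faithful implementation of any specific RP-1 method, and the numbers in the example are hypothetical.

<code python>
import math

def reliability_target_interval(current_interval_months: float,
                                in_tolerance: int,
                                total_calibrations: int,
                                target_reliability: float = 0.95) -> float:
    """Estimate a calibration interval from in-tolerance history.

    Assumes an exponential reliability model R(t) = exp(-lambda * t): the observed
    in-tolerance rate at the current interval fixes lambda, and the returned value
    is the interval at which reliability falls to the target. Illustrative only.
    """
    observed_reliability = in_tolerance / total_calibrations
    if not 0.0 < observed_reliability < 1.0:
        raise ValueError("need some, but not all, calibrations found in tolerance")
    return current_interval_months * math.log(target_reliability) / math.log(observed_reliability)

# Hypothetical data: 58 of 60 calibrations found in tolerance on a 12-month interval,
# with a 95% end-of-period reliability target.
print(round(reliability_target_interval(12, 58, 60, 0.95), 1))  # ~18.2 months
</code>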

Perhaps the simplest and most widely used methodology for optimizing calibration intervals is the “Automatic adjustment” or “Staircase” method (described in ILAC G24:2007, sec. 3, "Methods of reviewing calibration intervals").

Using the “Staircase” method

Each time an instrument is calibrated on a routine basis, the subsequent interval is extended IF it is found to be within a certain percentage (e.g., 80%) of the maximum permissible error that is required for measurement, or reduced if it is found to be outside this maximum permissible error.

Of course, this method assumes that the company is being provided with “as found” data for each calibration performed.
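
Once the “as found” data are in hand, expressing each as-found error as a percentage of the instrument's maximum permissible error is a simple calculation. A minimal sketch (the example values are hypothetical):

<code python>
def percent_of_mpe(as_found_error: float, max_permissible_error: float) -> float:
    """Return the as-found error as a percentage of the maximum permissible error (MPE)."""
    return abs(as_found_error) / abs(max_permissible_error) * 100.0

# Hypothetical point from a calibration certificate: nominal 100.000 mm,
# as-found reading 100.004 mm, MPE of +/-0.010 mm.
error = 100.004 - 100.000  # as-found reading minus nominal
print(round(percent_of_mpe(error, 0.010), 1))  # 40.0 -> well within the MPE
</code>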

A critical component of this methodology is choosing the percentage of the maximum permissible error used as the threshold. The higher the percentage, the greater the risk of an instrument being found Out-of-Tolerance (OOT), potentially resulting in nonconforming product escapes. The lower the percentage, the greater the cost of lowering that OOT risk and reducing the potential for nonconforming product escapes. This percentage will often vary based upon the type of instrumentation to which it is applied.

Most often, companies establish a “range” (or “window”) for the optimization. For example, IF an instrument is found exceeding 75% of its maximum permissible error, then the calibration interval is shortened. However, IF an instrument is consistently found below 50% of its maximum permissible error, then the calibration interval is lengthened. And IF the instrument is found between 50% and 75% of its maximum permissible error, then the interval is considered acceptable.
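
A rough sketch of this window-based (“Staircase”) adjustment, using the 50% / 75% thresholds described above. The adjustment factor, the interval limits, and the rule of looking at the last few calibrations are illustrative assumptions that each company would set for itself:

<code python>
def adjust_interval(current_months: int, recent_percent_of_mpe: list,
                    lengthen_below: float = 50.0, shorten_above: float = 75.0,
                    step: float = 1.5, min_months: int = 3, max_months: int = 48) -> int:
    """Window-based ("staircase") interval adjustment.

    recent_percent_of_mpe holds the worst as-found error (as a % of MPE) from the
    last few calibrations, most recent last. Shorten the interval if the latest
    result exceeds the upper threshold; lengthen it only if results are consistently
    below the lower threshold; otherwise keep the current interval.
    """
    latest = recent_percent_of_mpe[-1]
    if latest > shorten_above:
        new = current_months / step
    elif all(p < lengthen_below for p in recent_percent_of_mpe):
        new = current_months * step
    else:
        new = current_months
    return int(max(min_months, min(max_months, round(new))))

# Hypothetical gauge on a 12-month interval whose worst as-found results over the
# last three calibrations were 42%, 38%, and 40% of MPE -> lengthened to 18 months.
print(adjust_interval(12, [42.0, 38.0, 40.0]))  # 18
</code>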

Initial Calibration Intervals

Of course, you will need to establish an initial calibration interval. Unless otherwise specified by the manufacturer, reference U.S.A.F. T.O. 33K-1-100-1, "Technical Manual - Calibration Procedure for Maintenance Data Collection Codes and Calibration Measurement Summaries", Table 3.1, “General Calibration TOs”.

<note>T.O. 33K-1-100-1, Table 3.1, “General Calibration TOs”, also lists the “Calibration TO” for each instrument “item type”, which can be obtained separately.</note>

1)
A MUCH better way to reduce risk is to increase the minimum “Accuracy Ratio” between the M&TE and the tolerance of the characteristic being measured, but that's a topic for a separate article.