

Reducing Waste Through Optimizing Calibration Intervals

One of the most frequently overlooked opportunities for reducing waste in Lean Six Sigma is the optimization of calibration (i.e., metrological confirmation) intervals.

Why should you care about optimizing calibration intervals?

Most M&TE are on arbitrary 12-month calibration intervals… as if “one size” fits all. These are typically “manufacturer-recommended intervals”, which are often extremely conservative because the manufacturer wants its calibration labs to stay busy. And most companies happily pay them to do so. Yes… over-calibrating M&TE reduces risk… but only to a degree1). Is it efficient or cost-effective? No. Does the reduction in risk justify workers being without their M&TE, or the company paying excessive amounts for this “over-calibration”? In most situations, the answer is a resounding NO!

If M&TE calibration intervals were optimized based upon performance, the intervals for some instruments might be 18 months, 24 months, or even longer. This results in immediate, tangible cost savings. And while a few instruments may require shorter intervals (e.g., 9 months), immediate intangible savings are realized through the increased confidence in the reliability of the M&TE.

One company I visited had over 13,000 instruments in their calibration system. They'd contracted all of the calibrations with a metrology laboratory that “optimized” their calibration intervals for them. The number of instruments found “Out-of-Tolerance” dropped from 5% per year (650 instruments) to less than 0.5% (only 65 instruments)! This reduced their risk (of “Out-of-Tolerance” instruments being used to inspect product), reduced their administrative costs (associated with performing “Out-of-Tolerance” impact analysis), and reduced their total annual cost for calibration services (fewer calibrations were performed)!

<note>For a balanced view of this topic, see “Calibration Intervals, A Manufacturer's Perspective” by David Deaver of Fluke Corporation, a paper defending use of the 12-month interval.</note>

Methodologies for the Determination of Calibration Intervals

Methodologies for the determination of calibration intervals are defined in documents such as ILAC-G24:2007 / OIML D 10:2007, “Guidelines for the Determination of Calibration Intervals of Measuring Instruments”, and NCSL RP-1, “Establishment and Adjustment of Calibration Intervals”.

There are many methods and theories for calculating calibration intervals, such as those found in NCSL RP-1: Method S1 (Classical Method), Method S2 (Binomial Method), and Method S3 (Renewal Time Method). As a result, it can be difficult to choose the best method for determining the interval (Ref. "A Quantitative Comparison of Calibration Interval Adjustment Methods").
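To make the reliability-target idea behind these methods concrete, here is a minimal Python sketch. It is not the RP-1 Method S2 procedure itself; the function name, the simple exponential reliability model, and the example numbers are illustrative assumptions. It estimates the observed end-of-period in-tolerance rate from calibration history and rescales the interval toward a target reliability.

<code python>
import math

def adjusted_interval(current_interval_months: float,
                      num_calibrations: int,
                      num_in_tolerance: int,
                      target_reliability: float = 0.95) -> float:
    """Rescale a calibration interval so the predicted end-of-period
    in-tolerance probability meets a target, assuming an exponential
    reliability model R(t) = exp(-t / theta).

    Simplified sketch of the reliability-target idea only; this is NOT
    the NCSL RP-1 Method S2 procedure.
    """
    if num_calibrations == 0:
        return current_interval_months  # no history: leave the interval alone

    observed_reliability = num_in_tolerance / num_calibrations
    # Clamp to avoid log(0) when every (or no) calibration was in tolerance.
    observed_reliability = min(max(observed_reliability, 0.01), 0.999)

    # Under R(t) = exp(-t/theta), the interval that yields the target
    # reliability scales by ln(R_target) / ln(R_observed).
    scale = math.log(target_reliability) / math.log(observed_reliability)
    return current_interval_months * scale


# Hypothetical example: 40 calibrations at a 12-month interval,
# 39 found in tolerance, targeting 95% end-of-period reliability.
print(round(adjusted_interval(12, 40, 39), 1))  # -> 24.3 (interval roughly doubled)
</code>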

<note>Integrated Sciences Group (ISG) offers IntervalMAX, a free “Method S2” interval calculator (for MS Windows only), augmented by the “Method A3 Interval Tester” (which adjusts for “sparse” data).</note>

Perhaps the simplest and most widely used methodology for optimizing calibration intervals is the “Automatic adjustment” or “Staircase” method (described in ILAC G24:2007, sec. 3, "Methods of reviewing calibration intervals").

Using the “Staircase” Method

Each time an instrument receives its routine calibration, the subsequent interval is extended IF the as-found error is within a certain percentage (e.g., 80%) of the maximum permissible error required for the measurement, or reduced IF it is found to be outside this maximum permissible error.

Of course, this method assumes that the company is being provided with “as-found” data for each calibration performed.

A critical component of this methodology is choosing the threshold percentage of the maximum permissible error. The higher the percentage, the greater the risk of an instrument being found Out-of-Tolerance (OOT), potentially resulting in nonconforming product escapes. The lower the percentage, the greater the cost associated with lowering the risk of an OOT condition and reducing the potential for nonconforming product escapes. This percentage will often vary based upon the type of instrumentation to which it is applied.

Most often, companies establish a “range” (or “window”) for the optimization. For example, IF an instrument is found exceeding 75% of its maximum permissible error, then the calibration interval is shortened. However, IF an instrument is consistently found below 50% of its maximum permissible error, then the calibration interval is lengthened. And IF the instrument is found between 50% and 75% of its maximum permissible error, then the interval is considered acceptable.
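As a rough illustration only, the following Python sketch turns such a window into an automatic adjustment rule. The 50%/75% window matches the example above, while the 25% step size, the interval bounds, and the function name are illustrative assumptions; a real implementation would also require the “consistently below 50%” condition to hold over several successive calibrations before lengthening the interval.

<code python>
def staircase_adjust(interval_months: float,
                     as_found_error: float,
                     max_permissible_error: float,
                     lower_pct: float = 0.50,
                     upper_pct: float = 0.75,
                     step: float = 0.25,
                     min_interval: float = 3,
                     max_interval: float = 60) -> float:
    """Adjust a calibration interval using a simple 'staircase' window.

    as_found_error is the largest as-found deviation reported at calibration,
    in the same units as max_permissible_error. The 50%/75% window, the 25%
    step, and the interval bounds are example values; set them per instrument type.
    """
    usage = abs(as_found_error) / max_permissible_error

    if usage > upper_pct:
        # As-found result too close to (or beyond) the limit: shorten the interval.
        new_interval = interval_months * (1 - step)
    elif usage < lower_pct:
        # Comfortably inside the limit: lengthen the interval.
        new_interval = interval_months * (1 + step)
    else:
        # Inside the acceptable window: leave the interval as-is.
        new_interval = interval_months

    return max(min_interval, min(max_interval, new_interval))


# Hypothetical micrometer with a +/-0.004 mm maximum permissible error,
# found at +0.0015 mm (37.5% of MPE) on a 12-month interval:
print(staircase_adjust(12, 0.0015, 0.004))  # -> 15.0 (interval lengthened)
</code>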

Initial Calibration Intervals

Of course, you will need to establish an initial calibration interval. Unless otherwise specified by the manufacturer, reference U.S.A.F. T.O. 33K-1-100-1, “Technical Manual - Calibration Procedure for Maintenance Data Collection Codes and Calibration Measurement Summaries”, “Table 3.1 General Calibration TOs”.

<note>T.O. 33K-1-100-1, “Table 3.1 General Calibration TOs”, also lists the “Calibration TO” for each instrument “item type”; which can be obtained separately.</note>

For laboratory standards, refer to NIST GMP 11, “Good Measurement Practice for Assignment and Adjustment of Calibration Intervals for Laboratory Standards”.

Usage-Based Intervals

A topic that is rarely addressed is the establishment of calibration intervals based upon usage rather than time. This generally applies to dimensional gages.

<note>ASME B1.7-2006, “Definitions”, defines a “gage” as “a device for inspecting / evaluating a limit or size of a specified product dimension.”

More specifically, “Gages” are instruments WITHOUT indicators (e.g., Gage Blocks, Pin Gages, Ring Gages, Thickness Gages, Thread (Plug) Gages, Thread Pitch Gages, Threaded Ring Gages), used as a standard for comparative determinations (e.g., inspections). Gages are typically used for “Go/NoGo” or “Pass/Fail” measurements.

In contrast, “Gauges” are instruments WITH indicators (e.g., Analog OR Digital Multimeters, Dial Indicators, Force Gauges, Pressure Gauges) used to obtain measurements. This includes Digital, Dial, OR Vernier devices, such as Calipers or Micrometers.</note>

The ONLY way that a “gage” can change is through either “wear” (e.g., from “grit”) or “damage”.

Use of a Sealing Wax

A simple way to utilize “usage-based” calibration intervals for gages is to apply a sealing wax after each calibration, providing a visual indication of whether the gage has been used since it was last calibrated. Using this approach, the “time-based” calibration interval does not begin until the wax is broken/removed. At that point, whoever is managing the calibrated gages updates the calibration record to “start the clock”. Any gages that aren't being used can be either removed from service or disposed of. If you're considering adopting this option, begin by dipping all of your threaded plug gages immediately. Then, while keeping the same “time-based” calibration interval, determine which gages are actually being used. There is no point in incurring the expense of calibrating gages that aren't being used.
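If the gage records are kept electronically, the bookkeeping amounts to recording whether the seal is intact and, once it is broken, starting the clock from that date. The following Python sketch is only an illustration of that record-keeping; the record fields, the 12-month default interval, and the 30-day month approximation are assumptions, not part of any standard.

<code python>
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class SealedGage:
    """Calibration record for a wax-sealed gage on a usage-based interval (illustrative)."""
    gage_id: str
    last_calibrated: date
    interval_months: int = 12               # illustrative default interval
    seal_broken_on: Optional[date] = None   # None while the wax seal is intact

    def calibration_due(self) -> Optional[date]:
        """The clock only starts when the seal is broken; an unused,
        still-sealed gage has no due date."""
        if self.seal_broken_on is None:
            return None
        # Approximate months as 30 days for simplicity.
        return self.seal_broken_on + timedelta(days=30 * self.interval_months)


# Hypothetical example: gage sealed at calibration, put into use months later.
gage = SealedGage("TPG-0042", last_calibrated=date(2023, 1, 10))
print(gage.calibration_due())           # -> None (still sealed, not yet in use)
gage.seal_broken_on = date(2023, 9, 1)  # wax broken: the clock starts here
print(gage.calibration_due())           # -> 2024-08-26 (12 x 30 days later)
</code>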

A “good” calibration lab will dip calibrated plug gages (both plain and threaded) into a sealing wax to protect the gage. Alternatively, this type of wax (red, green or clear color) can be purchased from: https://ring-plug-thread-gages.com/ti-SW-Thread-Gage-Sealing-Wax.htm

Also, some companies use inexpensive hot glue guns for the same purpose (the colored wax really adds no value).

Another benefit of applying the wax is that it protects the gages from rust or other damage.

Extending the Life of Gages

Cleaning threaded plug gages and threaded ring gages after each “production run” (or at the end of the day) will extend the life, and the calibration interval, of these gages, as this removes the grit from that day's use before it can accumulate. Gage cleaning is most easily accomplished through the use of ultrasonic cleaners filled with an environmentally friendly solvent.

Black-oxide Treated Gages

Another simple way to utilize “usage-based” calibration intervals for gages is to purchase plain (Go/No-Go) plug gages that have a black-oxide treatment to show wear patterns, indicating when the gage needs to be calibrated or replaced. HOWEVER, be advised that using an ultrasonic cleaner with solvent COULD remove all of the black oxide, so check with the manufacturer before using this cleaning technique rather than simply wiping the gages down to remove any grit.

1)
A MUCH better way to reduce risk is to increase the minimum “Accuracy Ratio” between the M&TE and the tolerance of the characteristic being measured, but that's a topic for a separate article.