====== Reducing Waste Through Optimizing Calibration Intervals ======
  
One of the most often overlooked areas in Lean Six Sigma for reducing waste is the optimization of calibration (i.e., metrological confirmation) intervals. \\

Why should you care about optimizing calibration intervals?
  
Most M&TE are on arbitrary 12-month calibration intervals... as if “one size” fits all. These are typically “manufacturer-recommended intervals”, which are often extremely conservative because the manufacturer wants their calibration labs to stay busy. And most companies happily pay them to do so. Yes… over-calibrating M&TE reduces risk… but only to a degree((A MUCH better way to reduce risk is to increase the minimum "Accuracy Ratio" between the M&TE and the tolerance of the characteristic being measured, but that's a topic for a separate article.)). Is it efficient or cost-effective? No. Does the reduction in risk justify workers being without M&TE or the company paying excessive amounts for this “over-calibration”? In most situations, the answer is a resounding NO!
  
If M&TE calibration intervals were optimized based upon performance, optimal calibration intervals for some instruments might be 18 months, 24 months, or even longer. This results in immediate tangible cost savings. And while a few instruments may require shorter calibration intervals (e.g., 9-month intervals), immediate intangible savings are realized through the reduction of risks.

One company I visited had over 13,000 instruments in their calibration system. They'd contracted all of the calibrations with a metrology laboratory that "optimized" their calibration intervals for them. The number of instruments found "Out-of-Tolerance" dropped from 5% (650 instruments) per year to less than 0.5% (only 65 instruments)! This reduced their risk (of "Out-of-Tolerance" instruments being used to inspect product), reduced their administrative costs (associated with performing "Out-of-Tolerance" impact analysis), and reduced their total annual cost for calibration services (fewer calibrations were performed)!

<note>An excellent paper on this topic is [[https://us.flukecal.com/literature/articles-and-education/electrical-calibration/papers-articles/calibration-intervals-manuf|"Calibration Intervals, A Manufacturer's Perspective"]], written by David Deaver of Fluke Corporation.</note>
===== Methodologies for the Determination of Calibration Intervals =====
  
A critical component when using this methodology is determining the percentage of the maximum permissible error. The higher the percentage, the greater the risk of an instrument being found Out-of-Tolerance (OOT), potentially resulting in nonconforming product escapes. The lower the percentage, the greater the cost associated with lowering the risk of an OOT condition and reducing the potential for nonconforming product escapes. This percentage will often vary based upon the type of instrumentation to which it is applied. \\
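
To make the percentage concrete, here is a minimal sketch (not from this article; the function name and the example values are illustrative assumptions) of how an as-found calibration result can be expressed as a percentage of an instrument's maximum permissible error:

<code python>
# Illustrative sketch: express an instrument's as-found error as a
# percentage of its maximum permissible error (MPE). All names and
# values are assumptions for illustration, not a standard API.

def percent_of_mpe(as_found_error: float, mpe: float) -> float:
    """Return the absolute as-found error as a percentage of the MPE."""
    if mpe <= 0:
        raise ValueError("MPE must be a positive tolerance value")
    return abs(as_found_error) / mpe * 100.0

# Example: a caliper with an MPE of +/-0.02 mm found reading 0.013 mm high
print(percent_of_mpe(0.013, 0.02))  # 65.0 -> found at 65% of its MPE
</code>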
  
Most often, companies establish a "range" (or "window") for the optimization. For example, IF an instrument is found exceeding 75% of its maximum permissible error, then the calibration interval is shortened. However, IF an instrument is consistently found below 50% of its maximum permissible error, then the calibration interval is lengthened. And IF the instrument is found between 50% and 75% of its maximum permissible error, then the interval is considered acceptable.
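
As a sketch of that window rule: the 50%/75% thresholds come from the example above, but the adjustment factors (shorten by 25%, lengthen by 50%) are my own illustrative assumptions, since real programs tune these per instrument type.

<code python>
# Illustrative sketch of the "window" rule described above:
#   > 75% of MPE          -> shorten the interval
#   consistently < 50%    -> lengthen the interval
#   between 50% and 75%   -> interval considered acceptable
# Thresholds match the example in the text; adjustment factors are assumed.

def adjust_interval(interval_months: int, percent_mpe_history: list[float]) -> int:
    """Return a new calibration interval based on as-found results."""
    latest = percent_mpe_history[-1]
    if latest > 75.0:
        # Found too close to (or beyond) tolerance: shorten the interval.
        return max(1, round(interval_months * 0.75))
    if all(p < 50.0 for p in percent_mpe_history):
        # Consistently well within tolerance: lengthen the interval.
        return round(interval_months * 1.5)
    # Between 50% and 75% of MPE: current interval is acceptable.
    return interval_months

# Example: three consecutive calibrations all found below 50% of MPE
print(adjust_interval(12, [32.0, 41.0, 28.0]))  # -> 18 (months)
</code>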
===== Initial Calibration Intervals =====