
I&C Calibration Frequency Mistakes

Today’s I&C workforce is stretched like never before. I&C shops are being pushed to do more with less, and completing all of the necessary work properly can sometimes feel nearly impossible.

Interestingly, I have observed that much of the work keeping our I&C personnel so busy provides little value. Many of our PM schedules and procedures are based on technologies, equipment, and problems that are no longer applicable.

As an example: is your organization blindly performing annual calibrations of all field transmitters? If so, is there sound reasoning behind that frequency (such as a regulatory requirement), or is it simply based on maintenance plans that have been carried forward for decades without any actual analysis or justification?

Before the advent of microprocessor-based “smart” transmitters, it was common for typical analog electronic transmitters to have drift rates of up to 0.5% of span per year. Because of these high drift rates, most I&C shops had to perform annual calibrations of all instruments just to ensure everything operated within tolerable limits (which were, and still are, typically around +/- 0.5 to 1.0% of span). See the red line in the graph below.
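The arithmetic behind that annual frequency is simple: with linear worst-case drift, the time until an instrument drifts out of tolerance is roughly the tolerance band divided by the drift rate. A minimal sketch of that estimate (illustrative only; the function name and figures are assumptions for this example, not a sizing tool):

```python
def max_calibration_interval(drift_pct_per_year: float, tolerance_pct: float) -> float:
    """Years until worst-case linear drift consumes the tolerance band.

    drift_pct_per_year: expected drift, in % of span per year
    tolerance_pct: allowable error band, in % of span
    """
    return tolerance_pct / drift_pct_per_year

# Old analog transmitter: ~0.5 %/yr drift against a +/-0.5 % of span tolerance
print(max_calibration_interval(0.5, 0.5))  # 1.0 -> annual calibration required
# Same transmitter against a looser +/-1.0 % of span tolerance
print(max_calibration_interval(0.5, 1.0))  # 2.0 -> two years between calibrations
```

With a smart transmitter drifting closer to 0.05%/yr, the same calculation stretches the interval by an order of magnitude, which is the point of the article: the frequency should follow from the drift data, not from habit.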

[Graph: accumulated transmitter drift over time; the red line shows a typical analog transmitter drifting ~0.5% of span per year.]


About the author

Mike Glass

Mike Glass is an ISA Certified Automation Professional (CAP) and a Master Certified Control System Technician (CCST III). Mike has 38 years of experience in the I&C industry performing a mix of startups, field service and troubleshooting, controls integration and programming, tuning & optimization services, and general I&C consulting, as well as providing technical training and a variety of skills-related solutions to customers across North America.

Mike can be reached directly via [email protected] or by phone at (208) 715-1590.