
Question: 

Why are time-based instrument measurements so precise compared to instruments with true analog inputs, such as pressure and temperature transmitters? 

Ever wonder why certain instruments such as radar-based level transmitters don’t suffer from the dreaded gradual sensor input drift in the way that temperature, pressure, and many other measurement transmitters do?

Ever wonder why the PM checks that you perform on a radar level transmitter sensor section are quite a bit different from those you perform on other common instruments?  

Answer:

To understand the differences, we must first understand the basic differences in how they work. 

Typical Transmitter Input (example Pressure, Temperature, etc.)

In a pressure or DP transmitter, a diaphragm or other sensing element will slowly change its response over time due to irradiation, aging, metallurgical issues, corrosion, scale buildup or erosion, temperature cycles, and other factors – all of which cause a gradual change in the measured value over time. 
Temperature sensors (especially thermocouples) can also drift.

Additionally, any instrument with a true analog input will suffer drift due to slow changes in the electronic components themselves: as the transistors and other electrical components age and suffer the effects of heat, radiation, temperature cycles, and other factors, their electrical parameters slowly shift off target over time. For this reason, we periodically simulate an input and measure the process variable of the input section. For example, we periodically apply a known pressure, as measured by a calibrated reference gauge, and compare the process variable of the transmitter input section to the reference reading. We refer to this as a calibration check of the transmitter input section. 
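The calibration check described above boils down to simple arithmetic: compare the transmitter's indicated value to the reference reading and express the difference as a percent of span. A minimal sketch (the function name and example values are illustrative, not from any particular procedure):

```python
def error_pct_span(applied: float, indicated: float, lrv: float, urv: float) -> float:
    """As-found error of the input section, expressed as a percent of the
    calibrated span (URV - LRV)."""
    return (indicated - applied) / (urv - lrv) * 100.0

# Apply 50.00 psi per the reference gauge; the transmitter PV reads
# 50.12 psi on a 0-100 psi calibrated range.
print(error_pct_span(50.00, 50.12, 0.0, 100.0))  # 0.12 % of span
```

A technician would typically repeat this at several points (e.g. 0, 25, 50, 75, 100 % of span) and compare each as-found error against the acceptance tolerance.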

Time Based Instruments (such as most Radar Level Transmitters) 

Radar level transmitters, for example, are based on the time it takes electromagnetic energy to travel from the probe to the product surface and back.

Since the speed at which electromagnetic energy travels is constant and very precisely known (the speed of light, roughly 186,000 miles per second), we can accurately measure the distance to the product level or interface with a simple calculation: 

Distance = (Travel Time   x   Speed of Light) / 2
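The formula above is easy to sanity-check numerically. A minimal sketch in Python (names are illustrative):

```python
# Speed of light in a vacuum, in meters per second.
SPEED_OF_LIGHT_M_S = 299_792_458

def distance_m(travel_time_s: float) -> float:
    """Distance to the reflecting surface, given the round-trip travel time.
    Divide by 2 because the pulse travels down to the product AND back."""
    return (travel_time_s * SPEED_OF_LIGHT_M_S) / 2

# A round trip of ~66.7 nanoseconds corresponds to roughly 10 m of distance,
# which illustrates just how short the times being measured are.
print(distance_m(66.7e-9))
```

Note how tiny the travel times are: even a 10 m measurement involves timing an event lasting only tens of nanoseconds, which is exactly why the quality of the time base matters so much.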

So - Why do radar level transmitters not suffer from the same magnitude of drift in the input section as other common instruments? 

The secret to the accuracy of radar level gauges is in the way they ‘digitize’ the input section to eliminate the analog factor. Here is how they do that: 

In typical radar level transmitters, an extremely high-speed and extremely accurate clock signal is fed to a microprocessor counter, which counts from the time the pulse leaves the probe until it returns.

By counting the number of clock pulses required for the pulse to travel to the level or interface and back, the distance can be accurately determined via the formula given earlier.  

The clock frequency used in these counter circuits is extremely high, so the bit-resolution error (the interval between two successive clock pulses) is very small and essentially negligible. 
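The resolution argument can be made concrete: one clock tick corresponds to a fixed slice of distance. A rough sketch (the 1 GHz figure is an illustrative assumption, not a spec for any particular instrument; real pulse radars also use sampling and interpolation techniques to resolve far finer than one raw tick):

```python
SPEED_OF_LIGHT_M_S = 299_792_458

def distance_resolution_m(clock_hz: float) -> float:
    """Smallest distance step resolvable by counting whole clock pulses:
    one clock interval of round-trip time, converted to one-way distance."""
    tick_s = 1.0 / clock_hz                      # interval between successive pulses
    return (tick_s * SPEED_OF_LIGHT_M_S) / 2     # one tick, halved for the round trip

# At an assumed 1 GHz count rate, one raw tick is about 0.15 m.
print(distance_resolution_m(1e9))
```

The key point for drift is that this resolution error is fixed and does not grow over time, unlike the slow shift of analog components.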

So the only thing that could cause any notable drift error would be a drift in the clock frequency itself. But those clocks are incredibly accurate. 

***Note - that being said, these instruments can still suffer from problems such as incorrect configurations, threshold settings, and other issues - but they don't suffer the analog input drift that instruments with analog input electronics and analog sensors do. 

Why is the clock (time base) so accurate? 

Modern time-based instruments typically use precision crystal oscillators to establish a known ‘clock frequency’. The crystals are placed in oscillator circuits that resonate at a specific frequency determined by the physical dimensions of the crystal. Crystal oscillators are manufactured to incredibly tight dimensional tolerances (down to the molecular level), which results in extremely precise resonant frequencies, and thereby extremely precise and stable clock frequencies. A crystal oscillator is more stable and precise than typical analog electronic circuitry by orders of magnitude. 
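Crystal stability is usually quoted in parts per million (ppm), and since the measured distance scales directly with the clock period, a fractional clock error maps directly to a fractional distance error. A back-of-the-envelope sketch (the 10 ppm figure is an illustrative assumption; actual crystal specs vary):

```python
def drift_error_m(measured_distance_m: float, clock_error_ppm: float) -> float:
    """Distance error caused by a fractional clock-frequency error.
    A clock that is off by X ppm skews the measured distance by X ppm."""
    return measured_distance_m * clock_error_ppm * 1e-6

# A crystal off by an assumed 10 ppm on a 20 m measurement
# contributes only 0.0002 m (0.2 mm) of error.
print(drift_error_m(20.0, 10.0))
```

Compare that fraction of a millimeter to the drift magnitudes seen on analog sensing elements, and the maintenance implications become obvious.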

By combining an extremely stable, high-frequency clock feeding the ‘counter’ with the very precise and predictable travel speed of the electromagnetic wave, we end up with an essentially ‘digitized’ input measurement. 

This is why (typically) we don’t perform the same types of checks on the input sections of time-based transmitters such as radar level transmitters, Coriolis flowmeters, and vortex flowmeters as we would perform on other types of instrumentation.

*Note - there are some exceptions to this rule, but the general concept is the focus of this short article.  

But what about the transmitter 4-20mA OUTPUT section – does it still drift?  

Anything that has true analog circuitry, including the output section of a time-based instrument such as a radar level transmitter, can and will drift over time, even if the input side is digitized and extremely stable & accurate. 

So - you would still need to test/verify the 4-20mA output signal on those types of instruments, just like any analog point in the system. But at least you won’t also have to do a full calibration check of the sensor/input side of the transmitter so frequently. And that is a huge maintenance benefit, since simulating and testing the input section is usually operationally difficult, time consuming, and often involves various risk factors, whereas it is quick and easy to test the transmitter output at various points and compare it to the data shown in the controller or the HMI.  
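An output check like the one described is just linear scaling across the 4-20mA span. A minimal sketch of the arithmetic (function names and example values are illustrative):

```python
def expected_ma(pv: float, lrv: float, urv: float) -> float:
    """Expected loop current for a process variable on a 4-20 mA range,
    linearly scaled between the lower and upper range values."""
    return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)

def output_error_pct_span(measured_ma: float, expected: float) -> float:
    """Output-section error expressed as a percent of the 16 mA span."""
    return (measured_ma - expected) / 16.0 * 100.0

# A level of 5.0 m on a 0-10 m range should produce 12.0 mA.
ideal = expected_ma(5.0, 0.0, 10.0)
print(ideal)                                  # 12.0
# If the loop meter reads 12.05 mA, the output section is off by
# +0.3125 % of span.
print(output_error_pct_span(12.05, ideal))
```

Checking the loop current at a few points against the PV shown in the controller or HMI isolates any drift to the analog output stage, without disturbing the process.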

Take Home Points for Technicians & Engineers

1) By effectively ‘digitizing’ portions of instrumentation measurements and other brilliant engineering & design advances, instrument manufacturers have dramatically improved the performance of modern instrumentation. 

2) Instrumentation manufacturers should probably brag more, and explain the hows and whys of the technological advances they have implemented far more than they do. Understanding the new technologies, improvements, and advances in an instrument helps technicians and engineers better understand how maintenance and troubleshooting efforts should change. 

3) Because of the changes, the maintenance practices and approaches for some of the new instruments are vastly different from those of the past – but many times the I&C practitioners in the field are blindly carrying forward assumptions or even procedures or practices that no longer apply. 

4) The technological advancements in the I&C world have outpaced the training and general knowledge of most industrial I&C shops. 
Most industrial organizations could benefit from taking a close look at their I&C maintenance plans and procedures, in order to better capitalize on all of that powerful instrumentation that they have already paid millions of dollars for… 

5) Most industrial organizations could significantly reduce the workloads of their I&C shops, and could also improve the reliability and operational performance of their plant, by analyzing the realities of their equipment and then updating their maintenance procedures and plans to capitalize on the advances of equipment already installed. Note - the assumption that somebody else surely thought all of that through is probably incorrect.  

Take Home Points for Managers

1) It is important for your personnel who are involved with the I&C field to fully understand the concepts and principles that underlie the operation of each type of instrument and device they work on. 

2) Responsible managers should ensure the organization’s operations and maintenance programs are not based on assumptions or blind procedural repetition.

3) Train personnel on the core principles – so they can read and make sense of details on each and every device they work on! And push them to RTFM (Read the Functional Manuals!) to fully understand their equipment and any technological advances. 

4) Also encourage a questioning mindset. I&C technicians and engineers should routinely be pondering the how’s and the why’s of their work and their procedures and tasks.  

5) Avoid complacency and assumptions. Most instrument shops (even in critical and high-stakes industries) are making many mistakes similar to this, probably including your shop!

6) By learning and understanding the underlying principles of core devices and then reading functional manuals of equipment they work on, the I&C techs and engineers can better understand the how’s and the why’s of their maintenance and operating procedures, and they can better identify problems, mistakes, or potential hazards in the course of their work. And they will also be able to troubleshoot and repair systems and field devices quicker and more effectively – all of which helps improve the overall safety, efficiency, and reliability of the team and of the plant or facility.  

Notes:

1) This article describes how typical time-of-travel GWRs work, but there are some other technologies that differ somewhat, such as Frequency Modulated Continuous Wave (FMCW) radar types, which modulate the frequency at a predetermined cycle rate and measure the shift between outgoing and returning signals instead of measuring the time between pulses. More on this in a future blog…

2) Even equipment such as the radar level transmitter (or other ‘digitized’ instrument measurements) can still have measurement errors and inaccuracies due to various issues such as improper threshold settings, incorrect setup or configuration, or problems with the sensor probe or process materials. But these instruments are less susceptible to the sensor/input drift of most other instrument measurements, and this is why some of their maintenance procedures differ from those for instruments with true analog input sections.

3) Even on equipment that has been designed to reduce or eliminate significant drift, some periodic verifications of measurement accuracy and other maintenance will still be required or suggested. The techniques for these checks vary, but they are usually spelled out clearly in vendor reference manuals.


Mike Glass

About the author

Mike Glass

Mike Glass is an ISA Certified Automation Professional (CAP) and a Master Certified Control System Technician (CCST III). Mike has 38 years of experience in the I&C industry performing a mix of startups, field service and troubleshooting, controls integration and programming, tuning & optimization services, and general I&C consulting, as well as providing technical training and a variety of skills-related solutions to customers across North America.

Mike can be reached directly via [email protected] or by phone at (208) 715-1590.