Calibration ~ Learning Instrumentation And Control Engineering

How to Calibrate a Thermocouple Transmitter


Calibrating a thermocouple transmitter requires a thermocouple simulator with an accuracy at least four times better than that of the transmitter we desire to calibrate.

Equipment and Materials Required
The following equipment/materials are required to successfully calibrate a thermocouple transmitter:
  1. Thermocouple simulator (at least four times the accuracy of the sensor)
  2. Two digital voltmeters (five-digit readout) with an accuracy of at least ±0.01% and a resolution of 1 mV
  3. 24 VDC power supply rated for at least 35 – 40 mA of output current
  4. Thermocouple wire of the same type as the thermocouple the transmitter is designed for.

Equipment Setup
Below is the equipment setup for the calibration:


Calibration Procedure
  1. Remove the thermocouple transmitter terminal housing cover.
  2. If the transmitter is already connected, remove all the thermocouple lead connections.
  3. Determine the base and full scale temperatures. Read: How to convert thermocouple millivolt to temperature.
  4. Turn the power supply on.
  5. Consult the thermocouple simulator manual for instructions on setting the thermocouple type and engineering units.
  6. Set the simulator to the base (zero) temperature and adjust the zero pot until the output is 4 mA or 40 mV at the test terminals.
  7. Set the simulator to the full scale temperature and adjust the span pot until the output is 20 mA.
  8. Repeat steps 6 and 7 until both the 4 mA and 20 mA readings are obtained without re-adjusting the zero and span pots.
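
As a sanity check at steps 6 and 7, the expected transmitter output at any simulated temperature follows from the linear 4 – 20 mA relationship. Below is a minimal Python sketch; the 0 – 500 °C range is a made-up example, not a value from this procedure:

def expected_output_ma(temp, base_temp, full_scale_temp):
    # Expected output (mA) of a linear 4-20 mA transmitter
    span = full_scale_temp - base_temp
    return 4.0 + 16.0 * (temp - base_temp) / span

# Example: a transmitter ranged 0-500 degC (hypothetical values)
for t in (0, 125, 250, 375, 500):
    print(f"{t} degC -> {expected_output_ma(t, 0, 500):.2f} mA")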






Calibration Practice for Pressure and Flow Transmitters


Smart or intelligent transmitters are now the industry standard for almost all pressure and flow measurement. These transmitters must be calibrated regularly to guarantee that they deliver the set measurement objective. What are the recommended calibration practices for these types of transmitters?

Calibration Cycle of a Typical Process Transmitter
From the moment a smart transmitter leaves the factory floor to the plant environment, it goes through a typical calibration cycle, illustrated below for pressure and flow measurement applications.


Factory Calibration
Factory calibration is performed by the manufacturer as part of standard industrial practice and to conform to relevant standards and codes. Each transmitter has an inherent factory sensor characterization curve similar to the one below:




Common Configuration and Calibration Tasks for Process Transmitters


Process transmitters, especially pressure transmitters, are among the most common electronic devices you will find in a process plant. Exactly what does it take to get your pressure transmitter up and running?

Listed below are common configuration and calibration tasks recommended for process transmitters before they become fully operational in the field. These tasks fall into two basic groups:

(a) Bench calibration tasks: calibration tasks that can easily be done on a bench or other suitable location before actual installation in the field or plant.

(b) Field calibration tasks: calibration tasks done after the transmitter has been installed in the field or plant.

Bench Calibration Tasks
The common bench calibration tasks recommended for a pressure transmitter before installation are:
(1) Set up the output configuration (a sketch of such a configuration record appears after this list):
     (a) Set the range points
     (b) Set the output points
     (c) Set the output type
     (d) Set damping value
(2) Perform a sensor trim if required - see how to perform a sensor trim
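
The configuration items in task (1) can be thought of as a simple record. The Python sketch below is purely illustrative; the field names are hypothetical and do not correspond to any particular transmitter's parameter map:

from dataclasses import dataclass

@dataclass
class TransmitterConfig:
    # Hypothetical bench-configuration record for a pressure transmitter
    lower_range_value: float   # range point mapped to 4 mA, e.g. 0 psig
    upper_range_value: float   # range point mapped to 20 mA, e.g. 100 psig
    output_type: str           # e.g. "linear", or "square root" for flow
    damping_seconds: float     # damping value applied to the output

bench_config = TransmitterConfig(0.0, 100.0, "linear", 0.5)
print(bench_config)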

Basics of Smart Pressure Transmitter Calibration

Field Calibration Tasks
(1) Reconfigure parameters if necessary.
(2) Zero trim the transmitter to compensate for mounting effects or static pressure effects.
(3) Perform an analog output trim if required – see how to perform an analog output trim.
    See Smart Pressure Transmitter Calibration - Sensor Trim Basics




How to Calibrate and Adjust a Differential Pressure Switch


Just like a pressure switch, a differential pressure switch can be calibrated to a known set point. You can do a quick calibration of a differential pressure switch the same way you calibrate a pressure switch; however, for a more accurate calibration, the procedure is slightly modified.
Before going through the calibration steps please read: How a Differential Pressure Switch Works

Quick Calibration Procedure for a Differential Pressure Switch
Equipment required includes:
(a) A variable pressure source
(b) A digital multimeter or continuity test lamp
(c) A test gauge




How to Calibrate an RTD Transmitter


The RTD transmitter is usually factory calibrated to the temperature range shown on the device nameplate. When performance deteriorates and the transmitter needs recalibration, it is normally calibrated using a resistance decade box.

Materials Required for Calibration
To calibrate the RTD transmitter, the following equipment will be required:
1. Digital voltmeters of suitable accuracy and very high resolution (1 mV)
2. A 24 VDC power source
3. A high-precision, 5-dial resistance decade box providing 100 Ω steps


Calibration Steps:
Connect the above equipment as in the setup below:
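
Once the equipment is connected, the resistance to dial into the decade box for each test temperature can be computed from the standard platinum RTD relationship (the IEC 60751 Callendar-Van Dusen equation, for temperatures at or above 0 °C). A minimal Python sketch, assuming a Pt100 sensor and made-up test temperatures:

# IEC 60751 coefficients for a standard Pt100 RTD (0 degC and above)
R0 = 100.0    # resistance in ohms at 0 degC
A = 3.9083e-3
B = -5.775e-7

def pt100_resistance(temp_c):
    # Resistance (ohms) to set on the decade box for temp_c >= 0 degC
    return R0 * (1 + A * temp_c + B * temp_c ** 2)

# Example: decade box settings for a hypothetical 0-200 degC range
for t in (0, 50, 100, 150, 200):
    print(f"{t} degC -> {pt100_resistance(t):.2f} ohms")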




Smart Pressure Transmitter Calibration - Sensor Trim Basics


In pressure transmitter calibration, a sensor trim can be performed using either the sensor trim or zero trim function. Both trim functions alter the transmitter’s interpretation of the input signal. An analog output trim is also required to calibrate the output section of the transmitter.

Zero Trim
Zero trim is a single-point adjustment. It is useful for compensating for mounting position effects and is most effective when performed with the transmitter installed in its final mounting position. Zero trim should not be used in place of a sensor trim over the full sensor range. When performing a zero trim,




Basics of Smart Pressure Transmitter Calibration


Calibration is the process of optimizing transmitter accuracy over a specific range by adjusting the factory sensor characterization curve located in the microprocessor. Calibrating a smart transmitter is different from calibrating an analog transmitter: the one-step calibration process of an analog transmitter becomes several steps with a smart transmitter. The calibration steps involved are:
(a) Re-ranging - Re-ranging involves setting the lower and upper range points (4 and 20 mA) points at required pressures. Re-ranging does not change the factory sensor characterization curve.
(b) Analog Output Trim - This process adjusts the transmitter’s analog output to match the plant standard of the control loop.
(c) Sensor Trim - This process adjusts the position of the factory characterization curve to optimize transmitter performance over a specified pressure range or to adjust for mounting effects. Trimming involves two functions: zero trim and sensor trim.

Factory Characterization Curve of a Pressure Transmitter
The characterization of a smart transmitter allows for permanent storage of reference information. In the factory setup, known pressures are applied and the transmitter stores information about these pressures and how the pressure sensor reacts to these pressure changes. This creates a transfer function of applied pressures versus output shown below:
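
Conceptually, the stored characterization data lets the transmitter's microprocessor interpolate between the factory test points. The Python sketch below illustrates the idea only; real transmitters use more sophisticated, temperature-compensated curves, and the sample points here are invented:

# Invented factory test points: applied pressure (psi) vs. raw sensor reading
applied_pressure = [0.0, 25.0, 50.0, 75.0, 100.0]
sensor_reading   = [0.02, 0.26, 0.49, 0.74, 0.99]   # normalized raw output

def pressure_from_reading(raw):
    # Piecewise-linear interpolation of the characterization curve
    for i in range(len(sensor_reading) - 1):
        lo, hi = sensor_reading[i], sensor_reading[i + 1]
        if lo <= raw <= hi:
            frac = (raw - lo) / (hi - lo)
            return applied_pressure[i] + frac * (applied_pressure[i + 1] - applied_pressure[i])
    raise ValueError("reading outside characterized range")

print(pressure_from_reading(0.375))   # ~37.5 psi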




Basics of a Five-Point Calibration


Owing to the physical limitations of measuring devices and the system under study, every practical measurement will always have some errors. Several types of errors occur in a measurement system. These include:

Static Errors:
They are caused by limitations of the measuring device or the physical laws governing its behaviour.

Dynamic Errors:
They are caused by the instrument not responding fast enough to follow changes in the measured variable. A practical example is a room thermometer that does not show the correct temperature until several minutes after the temperature has reached a steady value.

Random Errors:
These may be due to causes that cannot be readily established; they could also be caused by random variations in the system under study.

Basic Steps in Instrument Calibration
Calibration is the process whereby, after an instrument has been in use over a definite period, its output is measured and compared against a standard reference, and the necessary adjustments are carried out to confirm that its present accuracy conforms to that specified by its manufacturer.
There are three basic steps involved in the calibration of an instrument. These include:




How to Calibrate a Pressure Gauge With a Dead Weight Tester


Basic Operating Principle

Deadweight Testers (DWT) are the primary standard for pressure measurement. There are three main components of this device: a fluid (oil) that transmits the pressure, a weight and piston used to apply the pressure, and a connection port for the gauge to be calibrated.

The dead weight tester also contains an oil reservoir and an adjusting piston or screw pump. The reservoir accumulates oil displaced by the vertical piston during calibration tests when a large range of accurately calibrated weights are used for a given gauge. The adjusting piston is used to make sure that the vertical piston is freely floating on the oil. Please see How a Dead Weight Tester Works for a detailed description of the working principle of the device.

Calibration Basics

To carry out tests or calibrate a pressure gauge with the dead weight tester (DWT), accurately calibrated weights (force) are loaded on the piston (area), which rises freely within its cylinder. These weights balance the upward force created by the pressure within the system:
PRESSURE = FORCE/AREA = W/A
So for each weight added, the pressure transmitted within the oil in the dead weight tester can be calculated with the above formula, because the area of the piston of the tester is accurately known.

Note:
If the weights are in pounds (lb) and the area is in square inches, then the calculated pressure is in pounds per square inch (PSI).

If the weights are in kilograms (kg) and the area of the piston is in square meters, then the calculated pressure [P = (W*G)/A, where G = gravity in m/s²] is in N/m² or pascals.
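
The unit handling in the note above can be captured in a short Python sketch. Standard gravity (9.80665 m/s²) and the example weights and piston areas below are assumptions for illustration only:

G = 9.80665   # standard gravity, m/s^2

def dwt_pressure_psi(weight_lb, area_in2):
    # Pressure in PSI: weights in pounds, piston area in square inches
    return weight_lb / area_in2

def dwt_pressure_pa(mass_kg, area_m2, g=G):
    # Pressure in pascals: weights in kg, piston area in square meters
    return (mass_kg * g) / area_m2

print(dwt_pressure_psi(10, 0.1))   # 10 lb on a 0.1 in^2 piston -> 100 PSI
print(dwt_pressure_pa(5, 1e-4))    # 5 kg on a 1 cm^2 piston -> ~490 kPa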

During calibration, the system is primed with liquid from the reservoir, and the system pressure is increased by means of the adjusting piston. As liquids are considered incompressible, the displaced liquid causes the piston to rise within the cylinder to balance the downward force of the weights.

Calibrating a Pressure Gauge with the Dead Weight Tester:

To calibrate a pressure gauge with a dead weight tester, set up the device on a level, stable workbench or similar surface as shown in the diagram below, then proceed with the calibration according to the following steps:
Step 1:
Connect the pressure gauge to the test port on the dead weight tester as shown in the diagram above. Ensure that the test gauge reads zero; if not, correct the zero error before proceeding with the calibration exercise.

Step 2:
Select a weight and place it on the vertical piston.

Step 3:
Turn the handle of the adjusting piston or screw pump to ensure that the weight and piston are supported freely by oil.

Step 4:
Spin the vertical piston and ensure that it is floating freely.

Step 5:
Allow a few moments for the system to stabilize before taking any readings. After the system has stabilized, record the gauge reading and the weight.

Step 6:
Repeat steps 2 through 5 with increasing weights until the full range or maximum pressure is applied to the gauge, then with decreasing weights until the gauge reads zero pressure. Calculate the error at each gauge reading and ensure that it is within the acceptable accuracy limits.

If you are doing a five-point calibration, increasing weights should be added corresponding to 0%, 25%, 50%, 75%, and 100% of the full range pressure of the gauge; for decreasing pressure, proceed in the order 100%, 75%, 50%, 25%, 0%.

For pressure gauges with less stringent accuracy specifications, calibration at the 0%, 50%, and 100% points will suffice.

After calibration, your data can be recorded in a table in this manner:

Upscale Readings:

% Input | Weights, W | DWT Pressure (W/A)* | Test Gauge Pressure | Error
0       |            |                     |                     |
25      |            |                     |                     |
50      |            |                     |                     |
75      |            |                     |                     |
100     |            |                     |                     |

*DWT Pressure = W/A if W is in lb and A is in square inches; the DWT pressure is then in PSI (pounds per square inch). However, if W is in kg and A is in square meters, then:
DWT Pressure = (W*G)/A, where G = gravity in meters per second squared (m/s²), and the DWT pressure is in N/m² or pascals.

Downscale Readings:

% Input | Weights, W | DWT Pressure (W/A)* | Test Gauge Pressure | Error
100     |            |                     |                     |
75      |            |                     |                     |
50      |            |                     |                     |
25      |            |                     |                     |
0       |            |                     |                     |

At each pressure reading, the absolute error is calculated thus:
Absolute Error = DWT Pressure – Test Gauge Pressure
The absolute error at each point should be within the acceptable accuracy limits of the gauge.

If the gauge accuracy is specified in % of span, proceed as follows to calculate the error:
Span = Maximum pressure – Minimum pressure
%Error = [(DWT Pressure – Test Gauge Pressure)/Span]*100 for each pressure gauge reading.
The error in % of span should be within the acceptable accuracy limits; otherwise the calibration will have to be repeated to correct the errors.

If the pressure gauge error is in % FSD (Full Scale Deflection), proceed as follows to calculate the error:
% Error = [(DWT Pressure - Test Gauge Pressure)/FSD]*100
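
The three error calculations above (absolute, % of span, and % of FSD) can be collected into one helper. A minimal Python sketch; the gauge range and readings in the example are invented:

def calibration_errors(dwt_pressure, gauge_pressure, span, fsd):
    # Absolute error, % of span, and % of FSD for one calibration point
    absolute = dwt_pressure - gauge_pressure
    return absolute, absolute / span * 100, absolute / fsd * 100

# Example: 0-100 psi gauge (span = FSD = 100); DWT applies 50 psi, gauge reads 49.6
abs_err, span_err, fsd_err = calibration_errors(50.0, 49.6, 100.0, 100.0)
print(f"absolute: {abs_err:.2f} psi, span: {span_err:.2f}%, FSD: {fsd_err:.2f}%")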

Correction Factors:

The dead weight tester is calibrated at the laboratory to the gravity, temperature, and air density stated on its calibration certificate.
Equations and factors are given on the certificate to adjust for any variations in these environmental conditions.
Always refer to the documentation for the dead weight tester to ensure that, for maximum accuracy, the necessary calibration correction factors are applied to any reading from the device.

Gravity Correction

Gravity varies with geographic location, and so will the dead weight tester reading. Because of the significant variation in gravity throughout the world (about 0.5%), ensure that the tester in your possession has been manufactured to the specification of your local gravity; otherwise you may have to apply the correction for the calibrated gravity.

To correct for gravity use:
True Pressure = [Gravity(CS)/Gravity(LS)]*P(Indicated)
Where:
P(Indicated) = Pressure indicated by gauge being calibrated
Gravity(CS) = Gravity at Calibration Site
Gravity(LS) = Gravity at Laboratory Site

Temperature Correction

Temperature and air density variations are less significant than gravity variations, but they should be corrected for when maximum accuracy is required.
To correct for Temperature variation use:

True Pressure = P(Indicated)*[1 + {T(DWTCT) – T(OT)}*{ΔP/100}]
Where:
P(Indicated) = Pressure indicated by gauge being calibrated
T(DWTCT) = Dead weight tester calibrated temperature in the laboratory
T(OT) = Operating temperature at calibration site
ΔP = Percentage pressure change per unit temperature change
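
Both corrections can be applied in sequence. A minimal Python sketch with placeholder values for the site gravity, certificate figures, and temperature sensitivity; always take the actual numbers from your tester's documentation:

def gravity_corrected(p_indicated, g_site, g_lab):
    # True pressure after correcting for local gravity
    return (g_site / g_lab) * p_indicated

def temperature_corrected(p_indicated, t_cal, t_op, dp_pct_per_deg):
    # True pressure after correcting for temperature, per the formula above
    return p_indicated * (1 + (t_cal - t_op) * dp_pct_per_deg / 100)

# Placeholder figures: 9.78 m/s^2 site gravity vs. 9.81 m/s^2 laboratory gravity,
# 20 degC certificate temperature, 28 degC on site, 0.002 %/degC sensitivity
p = gravity_corrected(100.0, 9.78, 9.81)
p = temperature_corrected(p, 20.0, 28.0, 0.002)
print(f"corrected pressure: {p:.3f}")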




How to Calibrate a Current to Pressure Transducer


Basic Operation
The I/P transducer receives a 4-20 mA DC input signal from a control device and transmits a proportional, field-configurable pneumatic output pressure to a final control element, usually a control valve. The I/P converter is typically used in electronic control loops where the final control element is a pneumatically operated control valve assembly. In most applications the I/P transducer is mounted on the control valve or very close to it in a mounting bracket.
See How a Current to Pressure Transducer Works
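
The proportional relationship can be illustrated with the common 4-20 mA input and 3-15 psi output signal ranges. A minimal Python sketch, assuming a linear transducer spanned at those standard values (the output range of an actual unit is field-configurable):

def ip_output_psi(input_ma, out_lo=3.0, out_hi=15.0):
    # Pneumatic output (psi) of a linear I/P transducer for a 4-20 mA input
    if not 4.0 <= input_ma <= 20.0:
        raise ValueError("input outside the 4-20 mA range")
    return out_lo + (input_ma - 4.0) / 16.0 * (out_hi - out_lo)

print(ip_output_psi(12.0))   # a 12 mA (50%) signal -> 9 psi on a 3-15 psi span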




How to Calibrate Smart Transmitters



In our last discussion, Introduction to Smart Transmitters, we saw that a smart transmitter is remarkably different from a conventional analog transmitter. Consequently, the calibration methods for the two devices are also very different. Remember that calibration refers to the adjustment of an instrument so its output accurately corresponds to its input throughout a specified range. A true calibration therefore requires a reference standard, usually in the form of one or more pieces of calibration equipment, to provide an input and measure the resulting output. If you got here looking for information on analog pressure transmitter calibration, you may consult: How to Calibrate Your DP Transmitter


The procedure for calibrating a smart digital transmitter is known as digital trimming. A digital trim is a calibration exercise that allows the user to correct the transmitter’s digital signal to match the plant standard or to compensate for installation effects. A digital trim on a smart transmitter can be done in two ways:




How to Calibrate a Pressure Gauge



In the plant, pressure gauge calibration is often taken for granted, probably because gauges seem to be everywhere, so one simply assumes that they are accurate even when they are out of calibration. A pressure gauge can be calibrated with a standard pneumatic calibrator, a dead weight tester, or any other suitable calibrator.
There is no single standard way to calibrate a pressure gauge; the way a gauge is calibrated depends on the way the gauge is used. Here an outline is given of how a pressure gauge could be calibrated with any type of calibrator.




Basic Principles of Instrument Calibration



Every instrument has at least one input and one output. For a pressure sensor, the input would be some fluid pressure and the output would (most likely) be an electronic signal. For a variable-speed motor drive, the input would be an electronic signal and the output would be electric power to the motor. To calibrate an instrument means to check and adjust (if necessary) its response so the output accurately corresponds to its input throughout a specified range.

Calibration is one exercise that is often taken for granted within an industrial plant. Even the most important industrial equipment will become useless if




How to Calibrate DP Pressure Transmitters: 8 Effective Tips that Work



Calibration of a DP pressure transmitter is a process by which the output of the transmitter is adjusted to properly represent a known pressure input. Calibration is one of the most frequently performed maintenance operations on pressure transmitters. If well performed, it improves the transmitter’s performance; otherwise, performance could deteriorate, with grave consequences. In the calibration process, a pressure input is used to provide zero and span adjustments to the transmitter. Consult my previous post, How to Calibrate Your DP Transmitter, for a detailed guide on how to calibrate a DP pressure transmitter.

Because a plant could go berserk if one or two critical pressure transmitters are wrongly calibrated, it is important that the calibration process and procedure be done properly. The following tips are general guides that you should keep at the back of your mind when calibrating a DP pressure transmitter:




How to Calibrate a Pressure Switch with a Fluke Pressure Calibrator



Because the pressure switch is a ubiquitous device, found in every plant, the need to calibrate it comes up all the time. We discussed how to calibrate and adjust a pressure switch in a previous post, but I do so again here, this time using a Fluke pressure calibrator, a very handy and popular device among instrument technicians and experts.




How to Use a Fluke Pressure Calibrator for Calibration


With the evolution of modern process plants, the analog pressure transmitter and the pressure gauge have become very useful instruments, the former for transmitting a pressure signal and the latter for local indication. Oftentimes these devices need to be calibrated. What do you actually need to do this?




How to Calibrate and Adjust a Pressure Switch


Before we get down to the nitty-gritty of how to calibrate and adjust a pressure switch, let us understand some basic concepts of pressure switch calibration:

Setpoint:
This is the pressure at which the pressure switch is required to operate. A pressure switch may be set to operate on either a rising pressure (high level alarm) or




How to Calibrate Your DP Transmitter


Calibrating an instrument involves checking that its output corresponds to given inputs at several points throughout its calibration range. For the analog DP transmitter, the output must be calibrated to obtain a zero percent (4 mA) to 100 percent (20 mA) output proportional to the transmitter’s zero percent to 100 percent range of input pressures.
In other words, calibration is required to make the transmitter’s percent output equal to its percent input. This is accomplished by adjusting screws located on the analog transmitter’s outer casing and clearly marked ZERO and SPAN. Some manufacturers of DP transmitters may refer to these as the ZERO and RANGE adjustment screws.

If you got here looking for information on smart transmitter calibration, please see: How to Calibrate Smart Transmitters
Whatever the model or manufacturer of your DP transmitter, it can be calibrated easily by following the manufacturer’s specific instructions; for every calibration you need to do, consult those instructions for the specific DP transmitter.

However, there are general guidelines you need to follow before you calibrate any transmitter:




Common Terms Used in DP Transmitter Calibration


Lower Range Limit (LRL)
This is the lowest value of the measured variable that a transmitter can be configured to measure. This is different from Lower Range Value (LRV)

Lower Range Value (LRV)
Lowest value of the measured variable that the analog output of a transmitter is currently configured to measure.

Transmitter Re-ranging
A configuration function that changes a transmitter’s 4 mA and 20 mA settings.

Upper Range Limit (URL)
This is the highest value of the measured variable that a transmitter can be configured to measure. This is different from Upper Range Value (URV)

Upper Range Value (URV)
Highest value of the measured variable that the analog output of a transmitter is currently configured to measure

Span
Span is defined as the algebraic difference between the upper range value (URV) and the lower range value (LRV) of the DP transmitter.

Span = URV – LRV
For example, if the DP transmitter is being used to measure pressures in the range 0 – 300 psig, then URV = 300 psig and LRV = 0 psig.
Therefore Span = URV – LRV = 300 – 0 = 300 psig.
To have a better understanding of LRV and URV as used in instrumentation systems, please go through control signals
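
The relationship between the measured variable, the range values, and the 4-20 mA output is the standard linear scaling. A minimal Python sketch using the 0 – 300 psig example above:

def transmitter_ma(pv, lrv, urv):
    # 4-20 mA output of a linear transmitter for process variable pv
    return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)

print(transmitter_ma(150.0, 0.0, 300.0))   # 150 psig -> 12.0 mA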

Calibration Range
The calibration range of a DP transmitter is defined as “the region between the limits within which a quantity is measured, received or transmitted, expressed by stating the lower and upper range values.” The limits are defined by



