INTRODUCTION
What is Calibration?
Have you ever wondered how an instrument can provide accurate results when samples of unknown value are tested during normal use of the product?
This is where calibration comes into the picture. Calibration is the process of setting up an instrument so that it produces results for a sample that fall within a reasonable range. Instrumentation design revolves around removing or eliminating the variables that lead to faulty readings.
Although the specific approach varies per product, the calibration process typically involves testing samples of one or more known values, called "calibrators", with the instrument. The results are used to establish a relationship between the instrument's measurement technique and the known values. In essence, the procedure "teaches" the instrument to produce more accurate results than it otherwise would.
Calibrations are performed using only a few calibrators to establish the correlation at specific points within the instrument's operating range. While using many calibrators would define the calibration relationship, or "curve", more completely, the time and work involved in creating and testing many calibrators may exceed the performance gained. In practice, a trade-off must be made between the desired degree of product performance and the effort required to achieve the calibration. The instrument will perform optimally when the points stated in the manufacturer's performance criteria are used for calibration, because the defined technique effectively removes, or "zeroes out", the inherent instrument error at these points.
A well-executed calibration can increase product performance, as seen in the graph below. In an ideal world, a product would deliver test results that perfectly match the sample value, with no error anywhere within the calibrated range; this line is labelled "Ideal Results". However, if a product is not calibrated, it may yield test results that differ from the sample value, resulting in a potentially substantial error.
The situation can be considerably improved by calibrating the product. During calibration, the product is "trained" on what result it should deliver, using the known values of Calibrators 1 and 2. The method corrects the inaccuracies at these two points, bringing the "Before Calibration" curve closer to the Ideal Results line of the "After Calibration" curve. At the calibration points the error has been reduced to zero, and the residual error at any other point within the operating range is within the manufacturer's specified linearity or accuracy specification.
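To make the idea concrete, here is a minimal sketch (in Python, with purely illustrative numbers) of the two-point, zero-and-span correction described above: a straight line is fitted through the readings obtained for Calibrators 1 and 2, so the corrected error is zero at exactly those points.

```python
# Minimal sketch: a two-point (zero/span) calibration correction.
# The instrument's raw response is corrected with a straight line fitted
# through two calibrator readings, so the error is zero at those points.
# All numbers below are illustrative, not from any real instrument.

def build_correction(cal1_true, cal1_reading, cal2_true, cal2_reading):
    """Return a function that maps a raw reading to a corrected value."""
    gain = (cal2_true - cal1_true) / (cal2_reading - cal1_reading)
    offset = cal1_true - gain * cal1_reading
    return lambda reading: gain * reading + offset

# Calibrator 1 is 0.0 units, Calibrator 2 is 100.0 units (hypothetical values).
correct = build_correction(0.0, 1.8, 100.0, 103.5)

for raw in (1.8, 52.0, 103.5):
    print(f"raw={raw:7.2f}  corrected={correct(raw):7.2f}")
```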
PRINCIPLES OF CALIBRATION
Why is calibration necessary?
Calibration is required for a new instrument: it is good practice to ensure that the instrument provides an accurate indication or output signal when it is installed. The question then arises: why is there a recurring need to calibrate an instrument even when it is functional and continues to provide the expected indication? The simple answer is that instrument error can occur due to various factors: drift, the environment, the electrical supply, the addition of components to the output loop, process changes, and so on. Since calibration is performed by comparing or applying a known signal to the instrument under test, these errors can be detected by performing a calibration.
An error is an algebraic difference between the indication and the actual value of the measured variable.
Typical errors that occur include:
● Span error
● Zero error
● Combined zero and span error
● Linearization error
Characteristics of calibration
Calibration Tolerance: Every calibration should be performed to a specified tolerance. The terms tolerance and accuracy are often used interchangeably, but they are not the same.
Accuracy: The ratio of the error to the full-scale output or the ratio of the error to the output, expressed in per cent span or per cent reading.
Tolerance: Permissible deviation from a specified value; may be expressed in measurement units, per cent of span, or per cent of reading.
It is evident from the definitions that there are subtle differences between the terms. It is recommended that tolerance, specified in measurement units, be used for the calibration requirements performed at the facility. Specifying an actual value eliminates the mistakes caused by calculating percentages of span or reading.
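As a hedged illustration of why this matters, the short Python sketch below checks the same measured error against a tolerance stated in measurement units, in per cent of span, and in per cent of reading. The 0 to 200 kPa instrument range, the readings, and the tolerance values are hypothetical.

```python
# Minimal sketch: comparing a calibration error against a tolerance stated
# three different ways. Instrument range and readings are hypothetical.

span_low, span_high = 0.0, 200.0         # kPa, instrument range
span = span_high - span_low

applied = 100.0                          # kPa, value applied by the standard
indicated = 101.2                        # kPa, instrument indication
error = indicated - applied              # algebraic difference (the "error")

tolerance_units = 1.5                    # kPa, tolerance in measurement units
tolerance_pct_span = 0.75                # per cent of span
tolerance_pct_reading = 1.0              # per cent of reading

print(f"error = {error:+.2f} kPa")
print("pass (measurement units):", abs(error) <= tolerance_units)
print("pass (% of span):        ", abs(error) <= tolerance_pct_span / 100 * span)
print("pass (% of reading):     ", abs(error) <= tolerance_pct_reading / 100 * applied)
```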
Traceability: All calibrations performed should be traceable to a nationally or internationally accepted standard. In the United States, for instance, the National Institute of Standards and Technology (NIST) maintains the nationally recognized standards. Traceability is accomplished by ensuring that the test standards we use are routinely calibrated against "higher-level" reference standards.
Classification of instruments
Critical: An instrument that, if not conforming to the specification, could potentially compromise product or process quality.
Non-critical: An instrument whose function is not critical to product or process quality, but which is of operational significance. Example: an instrument that is not classified as critical, but whose reading is recorded in operating logs.
Reference Only: An instrument whose function is not critical to product quality, not significant to equipment operation, and not used for making quality decisions. Routine calibration may be less frequent, and proper operation is verified if an error is suspected.
These classifications help in assigning calibration frequencies; for instance, a critical instrument could be calibrated every 6 months, a non-critical instrument every 12 months, and so on.
TEMPERATURE INSTRUMENTS
All meters and sensors indicate with some error; calibration will show by how much.
Temperature is the most common and most frequently measured variable in industry. Temperature greatly influences many physical properties of matter, and its influence on, for example, quality, energy consumption and environmental emissions is significant. Because temperature is a state of equilibrium, it differs from other quantities: a temperature measurement involves several time constants, and it is crucial to wait until thermal equilibrium is reached before measuring. Metrology provides mathematical formulas for calculating uncertainty, and the relevant polynomials are specified in the ITS-90 tables (International Temperature Scale of 1990). For each measurement, a model that includes all influencing factors must be created. Every temperature measurement is different, which makes temperature calibration slow and expensive. While standards determine the accuracy to which manufacturers must comply, they do not determine how long that accuracy is maintained.
Temperature sensors
The most widely used devices in industry for measuring temperature are temperature sensors. They either convert temperature into resistance (resistance temperature detectors, RTDs) or into a low voltage (thermocouples, T/Cs). RTDs are used because their resistance changes with temperature. The Pt100 is a common RTD type made of platinum; its resistance at 0 ˚C (32 ˚F) is 100 Ω. A thermocouple consists of two different metal wires joined together. If the junctions (hot junction and cold junction) are at different temperatures, a small temperature-dependent voltage can be detected. This means that the thermocouple does not measure temperature directly, but a temperature difference. The most common T/C type is the K-type (NiCr/NiAl). Despite their lower sensitivity (low Seebeck coefficient), the noble-metal thermocouples of type S, R or B (PtRh/Pt and PtRh/PtRh) are used, especially at high temperatures, for better accuracy and stability.
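For RTDs, the resistance-to-temperature relationship above 0 ˚C is commonly described by the Callendar–Van Dusen equation with the standard IEC 60751 coefficients. The following Python sketch converts between Pt100 resistance and temperature over that range; it is a simplified illustration, not a complete implementation (the sub-zero branch with the extra C coefficient is omitted).

```python
# Minimal sketch: converting a Pt100 resistance to temperature with the
# Callendar-Van Dusen equation for t >= 0 degC (IEC 60751 coefficients).
# Below 0 degC an extra C coefficient is needed; that branch is omitted here.
import math

R0 = 100.0          # ohm at 0 degC for a Pt100
A = 3.9083e-3
B = -5.775e-7

def pt100_resistance(t_c):
    """R(t) = R0 * (1 + A*t + B*t^2), valid for 0..850 degC."""
    return R0 * (1 + A * t_c + B * t_c ** 2)

def pt100_temperature(r_ohm):
    """Invert the quadratic to get temperature from resistance."""
    return (-A + math.sqrt(A * A - 4 * B * (1 - r_ohm / R0))) / (2 * B)

print(pt100_resistance(100.0))    # ~138.51 ohm
print(pt100_temperature(138.51))  # ~100.0 degC
```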
Temperature transmitters
The low-level signal from a temperature sensor cannot be transmitted reliably over the long distances found in a plant. Temperature transmitters were therefore developed to convert the sensor signal into a format that is easier to transmit. Most commonly, the transmitter converts the sensor signal into a standard current signal ranging between 4 and 20 mA. Nowadays, transmitters with a digital output signal, such as fieldbus transmitters, are also being adopted. Because the transmitter converts the sensor signal, it also affects the total accuracy, and therefore the transmitter must be calibrated regularly. A temperature transmitter can be calibrated using a temperature calibrator.
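The sketch below illustrates the usual linear 4–20 mA scaling of a temperature transmitter and how an error in the measured output can be expressed in per cent of span. The 0 to 150 ˚C range and the readings are hypothetical.

```python
# Minimal sketch: the ideal 4-20 mA output of a temperature transmitter and
# the error of a measured output. The range and readings are hypothetical.

lrv, urv = 0.0, 150.0            # degC, lower/upper range values of the transmitter

def ideal_output_ma(temp_c):
    """Linear 4-20 mA scaling over the configured range."""
    return 4.0 + 16.0 * (temp_c - lrv) / (urv - lrv)

applied_temp = 75.0              # degC, set on the temperature calibrator
measured_ma = 12.07              # mA, read with a reference milliammeter

error_ma = measured_ma - ideal_output_ma(applied_temp)
error_pct_span = error_ma / 16.0 * 100
print(f"ideal = {ideal_output_ma(applied_temp):.3f} mA, error = {error_ma:+.3f} mA "
      f"({error_pct_span:+.2f} % of span)")
```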
Calibrating temperature instruments
To calibrate a temperature sensor, it must be placed in a known temperature. Sensors are calibrated either in temperature dry blocks (in industrial settings) or in liquid baths (in the laboratory). The sensor to be calibrated is compared against a reference sensor. The uncertainty of the calibration is not the same as the accuracy of the device: many factors influence the total uncertainty, and the way the calibration is performed is not the least of them. All heat sources exhibit measurement errors due to their mechanical design and thermodynamic properties. These effects can be quantified to determine the heat source's contribution to the measurement uncertainty. The major sources of measurement uncertainty are axial homogeneity, radial homogeneity, loading effect, stability, and immersion depth.
Uncertainty sources in temperature calibrators
Axial homogeneity: Axial homogeneity is the temperature distribution in the measurement zone along the boring (axial temperature distribution).
Radial homogeneity: Radial homogeneity can be explained as the difference in temperature occurring between the borings.
Loading effect: When several sensors are placed in the borings of the heat source, they will affect accuracy. This phenomenon is called the loading effect.
Stability: Stability means the variation of the temperature in the measurement zone over time once the system has reached equilibrium. A thirty-minute observation period is commonly used.
Immersion depth: To achieve a stable calibration, the immersion depth of the probe must be sufficient for the sensor being calibrated. Stem conduction, the heat flux along the length of the probe stem, can otherwise introduce a significant error.
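Assuming the contributions listed above have been quantified as standard uncertainties and are uncorrelated, they are typically combined by root-sum-square and then multiplied by a coverage factor. The sketch below shows the arithmetic with placeholder values, not data from any real heat source.

```python
# Minimal sketch: combining dry-block uncertainty contributions into a single
# standard uncertainty by root-sum-square, then expanding with k = 2.
# The values are placeholders; real budgets come from characterisation data.
import math

contributions_c = {
    "axial homogeneity": 0.020,
    "radial homogeneity": 0.010,
    "loading effect": 0.015,
    "stability (30 min)": 0.005,
    "immersion depth / stem conduction": 0.025,
    "reference sensor": 0.030,
}

u_combined = math.sqrt(sum(u ** 2 for u in contributions_c.values()))
U_expanded = 2 * u_combined      # coverage factor k = 2 (~95 % confidence)
print(f"combined u = {u_combined:.3f} degC, expanded U (k=2) = {U_expanded:.3f} degC")
```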
PRESSURE INSTRUMENTS
Pressure calibration is the comparison of the output of a device used to measure pressure with that of another pressure measurement device, or pressure measurement standard. This usually involves plumbing the device under test (DUT) to the standard device and generating a common pressure in the measurement circuit. The outputs of the devices are compared at one or more pressures, typically from the lowest to the highest readings of the DUT's full-scale range, or the range for which it is normally used. This comparison process is performed in a chain from the highest level of fundamental pressure realization down to everyday pressure measurement devices, to ensure pressure measurements are accurate and comply with accepted or mandated standards.
Calibrating Pressure Gauges
Pressure gauges are often used as local indicators of process pressure. An analogue pressure gauge, because of its links, levers, and elastic pressure-sensing element, requires periodic calibration checks. Pressure changes applied to the gauge cause the elastic element to expand and contract. The movement of the element is translated into movement of the pointer through the links, levers, and gears, and the measurement is read directly from the position of the pointer on the gauge scale.
Figure - Bourdon pressure gauge
Calibrating a pressure gauge involves adjusting these components until the gauge accurately represents the applied input. The power supply and output standard mentioned above are not used when calibrating a pressure gauge.
CALIBRATING PRESSURE TRANSMITTERS
Calibrating a pressure transmitter is like calibrating a pressure gauge, except we must measure the output signal using an appropriate measuring standard. For example, we would use a milliammeter to measure a 4-to-20 mA transmitter output signal. If necessary, we may also have to provide the transmitter output power source, such as 24 volts direct current (VDC). Most modern calibration standards provide the ability to supply the transmitter power, so a separate power source is not required.
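A typical as-found check exercises the transmitter at several points across its range and compares each milliammeter reading with the ideal 4–20 mA output. The sketch below assumes a hypothetical 0 to 400 kPa transmitter, made-up readings, and an assumed tolerance of 0.25 % of span.

```python
# Minimal sketch: a five-point as-found check of a 4-20 mA pressure
# transmitter against a tolerance given in % of span. The range, readings,
# and tolerance are hypothetical.

lrv, urv = 0.0, 400.0                    # kPa, calibrated range
tolerance_pct_span = 0.25

test_points = [0, 25, 50, 75, 100]               # % of span applied by the standard
measured_ma = [4.01, 8.03, 12.02, 15.97, 19.94]  # readings from the milliammeter

for pct, ma in zip(test_points, measured_ma):
    ideal = 4.0 + 16.0 * pct / 100.0
    err_pct_span = (ma - ideal) / 16.0 * 100.0
    status = "PASS" if abs(err_pct_span) <= tolerance_pct_span else "FAIL"
    print(f"{pct:3d}% span: ideal {ideal:6.2f} mA, found {ma:6.2f} mA, "
          f"error {err_pct_span:+.3f}% span  {status}")
```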
FLOW INSTRUMENTS
There are numerous sensor technologies for flow instrumentation. The sensor type used in a particular application depends on many factors, including the process fluid measured, pressure, temperature, allowable pressure drop, density, conductivity, viscosity, pipe size and orientation, the flow rate and/or flow total required, accuracy requirements, control system interface, accessibility, maintenance requirements, etc.
Some of the major sensor types are:
1) Differential Pressure Flowmeter
Restriction-type flow instruments are based on the principle that the flow rate is proportional to the square root of the differential pressure across the restriction (see the sketch after this list).
2) Magnetic Flowmeter
The magnetic flowmeter is based on Faraday's Law of electromagnetic induction.
3) Vortex-Shedding Flowmeter
These flowmeters are based on the principle that shedding causes a local increase in pressure and decrease in velocity on one side of the object and vice versa on the other side of the object. After shedding from one side, the process is reversed. The frequency of shedding reversal is proportional to the velocity of the fluid passing the obstruction.
4) Turbine Flowmeter
The turbine flowmeter is a mechanical flowmeter that measures flow with a spinning turbine or rotor, i.e., an arrangement of moving parts.
5) Coriolis Mass Flowmeter
A mass flowmeter measures flow rate as mass per unit time rather than volume per unit time. This measurement compensates for temperature and pressure changes.
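As a brief illustration of the square-root relationship mentioned for differential pressure flowmeters above, the sketch below computes flow from differential pressure with a hypothetical meter factor. Note how half of the dP range already corresponds to roughly 71 % of full-scale flow, which is why dP errors matter most at low flow.

```python
# Minimal sketch of the square-root relationship behind restriction-type
# (differential pressure) flowmeters: Q = k * sqrt(dP). The meter factor k
# and the pressures below are hypothetical.
import math

k = 2.5   # flow units per sqrt(pressure unit), set by the primary element and fluid

for dp in (0.0, 25.0, 50.0, 100.0):      # differential pressure across the restriction
    q = k * math.sqrt(dp)
    print(f"dP = {dp:6.1f}  ->  Q = {q:6.2f}")

# Implication for calibration: at 50 % of the dP range the flow is ~71 % of
# full scale, so dP measurement errors are magnified at the low-flow end.
```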
Calibration of Flowmeters
All meters with moving parts require periodic testing because wear over time will reduce the flowmeter's performance. Calibration can be performed either in the lab or in situ (in its original place) using a prover, also called a master meter, or by weighing the flow output. There are several methodologies for flowmeter calibration. Any one of them, and others, may be acceptable depending on the process system configuration, compatibility, availability of test standards, and accuracy requirements. It is sometimes difficult or impossible to remove a flowmeter from service for calibration. Therefore, field-mounted and inline provers have been developed. Depending on the application and system configuration, other methods can also be developed to check the accuracy of flowmeters.
Weighing the flowmeter output collected over a specified time is a common alternative (a gravimetric sketch follows the table below). For most flowmeter instruments, the calibration of the signal-processing portion can be checked by simulating the signal from the flowmeter; these methods do not check the sensor itself. No one generic method works for all flowmeters, and tests must be performed according to the specific manufacturer's instructions. A few of these methods are summarised below.
General Methodology | When to Use |
Calibrate only the electronics (or signal processing) using a test instrument to simulate the sensor. | If it is impossible to perform an in-situ check of the flowmeter sensor using a prover or other methods discussed below and the sensor cannot be removed from the system. |
Check the calibration of the flowmeter, sensor and signal processing together, using a prover or some other standard. | If the flowmeter sensor can be checked in situ but test standards are not available for simulating test signal input. |
Calibrate the electronics first, and then check the calibration of the flowmeter, including the sensor. | If the required test instruments are available to simulate the flowmeter input signal and the flowmeter sensor can be checked in situ. |
Remove the flowmeter and send it to a flow calibration lab (internal, manufacturer, or a 3rd party flow calibration lab). | If the system is not compatible with in-situ flowmeter calibration. It may be necessary to install a calibrated spare to keep the process downtime to a minimum. |
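As noted before the table, weighing the flowmeter output collected over a timed run (the gravimetric method) is a common way to check a meter. The sketch below shows the basic arithmetic with hypothetical values.

```python
# Minimal sketch of a gravimetric check: the flowmeter total is compared with
# the mass collected on a reference weigh scale over a timed run.
# All values are hypothetical.

collected_mass_kg = 250.4        # from the reference weigh scale
collection_time_s = 120.0
meter_total_kg = 251.6           # totalized by the flowmeter under test

true_mass_flow = collected_mass_kg / collection_time_s       # kg/s
meter_error_pct = (meter_total_kg - collected_mass_kg) / collected_mass_kg * 100

print(f"reference mass flow = {true_mass_flow:.3f} kg/s")
print(f"meter error = {meter_error_pct:+.2f} % of reading")
```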
CALIBRATION STATUS LABELS
One of the most common pieces of calibration documentation is the calibration status label, which provides a visual indication of the calibration status of an instrument. Many different label styles are in use throughout the industry.
The main information that must be displayed to show the calibration status is:
● Instrument identification (such as tag number, instrument ID number, or serial number)
● Date of calibration
● Next calibration due date
● Technician who performed the calibration (initials, employee ID, etc.)
Example of a calibration status label
HOW IS CALIBRATION PERFORMED?
Who performs Calibration? - THE CONTROL SYSTEM TECHNICIAN
A control system technician (CST) is a skilled craftsperson who knows pneumatic, mechanical, and electrical instrumentation. He or she understands process control loops and process control systems, including those that are computer-based. Typically, he or she has received training in such specialized subjects as control theory, analogue and/or digital electronics, microprocessors and/or computers, and the operation and maintenance of particular lines of field instrumentation.
A CST performs calibration, documentation, loop checks, troubleshooting, and repair or replacement of instrumentation. These tasks relate to systems that measure and control level, temperature, pressure, flow, force, power, position, motion, physical properties, chemical composition and other process variables.
WHEN DO INSTRUMENTS NEED TO BE CALIBRATED?
There is no one-size-fits-all calibration schedule. Depending on how frequently you use your equipment and the accuracy required, you may need to calibrate as frequently as every month or as infrequently as every year or longer. Generally, the more critical the measurements being performed, the more frequently you will calibrate. If you accidentally drop or otherwise damage an instrument, you will likely want to calibrate it as soon as possible.
COST OF CALIBRATION
A successful calibration process requires hardware and software, special equipment, and manpower, so the cost varies with how intensively these resources are used. The cost of calibration depends on what is calibrated and who is calibrating it.
In simple cases where a one-off instrument is involved, the cost can be lower than one hundred dollars, but in complex cases can cost thousands of dollars.
Calibration cost also depends on whether the calibration is carried out on the premises of a calibration laboratory, on the factory floor, or outsourced to a third party.
Certification to ISO 10012-1, ISO 9001, MIL-STD-45662A, and MIL-HDBK-52B requires calibration of measuring equipment. In many situations, such as weighing-system calibration, it is a statutory requirement.
One of the major factors for cost is the frequency of calibration of an instrument. Most calibration systems issue a validity period during which the instrument can be used without concern for major errors and uncertainties. Some organizations use finely worked-out methods for determining calibration intervals, while others use conservative calibration intervals barely able to meet the legal demands. The perception exists that calibration costs can be reduced if the interval can be stretched legitimately. The use of uncalibrated instruments in an organization can be costly as it may affect the product quality and quality of downstream operations.
Standards such as MIL-STD-45662A suggest suitable calibration intervals. As a rule of thumb, 85 to 95% of all instruments returned for calibration should meet the calibration limits. Calibration intervals are determined from probability charts of instrument age and failure data. Usually, an instrument must be recalibrated if its failure rate increases or its performance deteriorates compared with other standard instruments.
A planned calibration may cost, say, $200, as opposed to an unexpected failure costing thousands of dollars. In this respect, several mathematical techniques, such as Weibull statistics and renewal equations, can be employed to analyse the costs. Different software tools (e.g., visual SMITH, Calibration Manager, etc.) are available for cost analysis and for determining calibration intervals.
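As a rough, hedged illustration of the interval-versus-cost trade-off, the sketch below uses a simple Weibull model of out-of-tolerance risk. The shape and scale parameters and the costs are invented for the example; in practice they would come from real instrument failure data.

```python
# Minimal sketch: comparing calibration intervals with a simple Weibull model
# of out-of-tolerance risk. Parameters and costs are illustrative assumptions,
# not data from any real instrument population.
import math

beta, eta = 1.8, 36.0            # Weibull shape and scale (months)
cal_cost = 200.0                 # planned calibration, dollars
failure_cost = 5000.0            # cost of an undetected out-of-tolerance failure

def expected_annual_cost(interval_months):
    p_fail = 1 - math.exp(-((interval_months / eta) ** beta))   # risk per interval
    intervals_per_year = 12.0 / interval_months
    return intervals_per_year * (cal_cost + p_fail * failure_cost)

for interval in (3, 6, 12, 24):
    print(f"{interval:2d} months: expected cost ~ ${expected_annual_cost(interval):8.0f}/yr")
```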
CALIBRATION ISSUES
Some of the common factors that would normally influence the accuracy of a calibrator measurement are hysteresis, repeatability, linearity, temperature, and gravity. A change in any of these can cause a deviation in the accuracy of the equipment used for calibration.
The following three problems occur most often during calibration:
● Zero calibration error
● Electrode slope too low (a slope check is sketched after these lists)
● Slow response, for example, longer than 3 minutes
There are a variety of causes for the problems named above. The most frequent are:
● The buffer solutions used are either contaminated or out of date. It could also be that one of the buffer solutions is no longer at the value labelled on the bottle – for this reason, never store buffer solutions in unmarked or dirty containers.
● The reference electrolyte and/or the diaphragm are contaminated.
● An old or defective electrode is used.
● An electrode is used that has not been hydrated long enough (after dry storage or after cleaning with a strong acid solution).
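The slope problem above is most familiar from pH electrode calibration, which the buffer-solution and electrode causes suggest is the context here. Assuming a two-buffer pH calibration at 25 ˚C, the sketch below computes the electrode slope and compares it with the theoretical (Nernstian) value of about 59.16 mV per pH unit; the millivolt readings are hypothetical.

```python
# Minimal sketch of an electrode slope check after a two-buffer pH
# calibration. The millivolt readings are hypothetical; the theoretical
# (Nernstian) slope at 25 degC is about 59.16 mV per pH unit.

THEORETICAL_SLOPE_MV = 59.16

buffer1_ph, buffer1_mv = 7.00, 2.0       # readings in the two buffer solutions
buffer2_ph, buffer2_mv = 4.01, 171.0

slope = (buffer2_mv - buffer1_mv) / (buffer1_ph - buffer2_ph)
slope_pct = slope / THEORETICAL_SLOPE_MV * 100

print(f"slope = {slope:.1f} mV/pH ({slope_pct:.1f} % of theoretical)")
# Many procedures reject the calibration if the slope falls below ~90-95 %
# of the theoretical value ("electrode slope too low").
print("slope acceptable:", 90.0 <= slope_pct <= 105.0)
```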
CONCLUSION
From complex manufacturing processes to day-to-day measuring instruments, calibration is fundamentally important. It helps ensure accurate measurements, and accurate measurements are foundational to the quality, safety and innovation of most products and services we use and rely on every day.
Calibration improves the assurance of precise measurements required in research, development, and innovation, as well as the production of millions of products and services worldwide. Pause and look around your room right now; most of what you see was produced within tight measurement specifications assured by calibration.
All mechanical parts wear and all electronic components drift over time, so a measuring instrument will not measure accurately to its specifications forever. It must be calibrated routinely to ensure that it operates properly and measures within its product specifications, and so that its results can be duplicated by others around the world, because the calibration system is traceable to a common global reference.
“In short, if measurement results matter, calibration matters.”