1. Accuracy

Accuracy refers to the degree of closeness of measurements of a quantity to that quantity’s true value. It is often expressed as a percentage of the full-scale value or in the units of the measurement.

The equation to calculate accuracy is:

Accuracy = [(True Value − Measured Value) / True Value] × 100%

In instrumentation engineering, accuracy is specified in two ways: as a percentage of the full-scale (FS) value or as a percentage of the measured value (MV).

A full-scale specification is referenced to the full range of the device, for example 0–100 m³/hr. An accuracy of ±1% of FS then means a fixed error band of ±1 m³/hr at every point of the range, so a reading of 30 m³/hr corresponds to an actual flow anywhere between 29 m³/hr and 31 m³/hr.

A measured-value specification is referenced to the reading itself. For the same reading of 30 m³/hr, ±1% of MV gives ±0.3 m³/hr, i.e. an actual flow between 29.7 m³/hr and 30.3 m³/hr.
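To make the two specifications concrete, here is a minimal Python sketch using the meter from the example above (the function names are illustrative, not from any library):

# Compare a percent-of-full-scale spec with a percent-of-measured-value spec.
# Values (0-100 m3/hr range, 30 m3/hr reading, +/-1%) follow the example above.

def error_band_fs(full_scale, accuracy_pct):
    # FS spec: a fixed absolute error, the same at every point of the range
    return full_scale * accuracy_pct / 100.0

def error_band_mv(reading, accuracy_pct):
    # MV spec: an absolute error that scales with the reading itself
    return reading * accuracy_pct / 100.0

full_scale = 100.0  # m3/hr
reading = 30.0      # m3/hr

fs = error_band_fs(full_scale, 1.0)  # +/-1.0 m3/hr
mv = error_band_mv(reading, 1.0)     # +/-0.3 m3/hr

print(f"FS spec: {reading - fs} to {reading + fs} m3/hr")  # 29.0 to 31.0
print(f"MV spec: {reading - mv} to {reading + mv} m3/hr")  # 29.7 to 30.3

Note that at low readings the FS error band dominates, which is why percent-of-FS instruments lose relative accuracy near the bottom of their range.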


2. Precision

Precision relates to the consistency of repeated measurements. A measurement system is considered precise if it produces similar results under consistent conditions, irrespective of whether those results are accurate. Precision does not imply accuracy.
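To illustrate that precision and accuracy are independent, the following sketch (with made-up readings) quantifies precision as the standard deviation of repeated measurements:

import statistics

# Invented readings: tightly clustered (precise) but offset from the true value
true_value = 50.0
readings = [47.9, 48.1, 48.0, 48.2, 47.8]

spread = statistics.stdev(readings)            # small spread -> high precision
bias = statistics.mean(readings) - true_value  # large offset -> poor accuracy

print(f"precision (std dev): {spread:.3f}")   # ~0.158
print(f"bias from true value: {bias:+.3f}")   # -2.000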


3. Repeatability

Repeatability is the ability of an instrument to return to the same measurement when used repeatedly under the same conditions, without changing the setup or operators. It is a measure of the instrument’s consistency over short periods and under stable conditions.


4. Reproducibility

Reproducibility is the degree to which measurements are consistent when repeated under different conditions, such as with different instruments, operators, and locations. It emphasizes the instrument’s reliability across varied conditions.
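As a simplified illustration (real gauge R&R studies use ANOVA; the readings below are invented), this sketch contrasts the spread under one fixed condition with the spread when the operator changes:

import statistics

# Same part measured under two conditions
operator_a = [10.01, 10.02, 10.00, 10.01]  # one operator, same setup, short period
operator_b = [10.11, 10.12, 10.10, 10.13]  # a second operator, different day

repeatability = statistics.stdev(operator_a)                 # within one condition
reproducibility = statistics.stdev(operator_a + operator_b)  # across conditions

print(f"repeatability:   {repeatability:.4f}")    # ~0.0082 (small)
print(f"reproducibility: {reproducibility:.4f}")  # ~0.0570 (larger)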


5. Error

Error in a measurement is the difference between the measured value and the true value. It can be broken down into:

  • Systematic Error: Constant and predictable errors that consistently cause measurements to be incorrect by a fixed amount.
  • Random Error: Errors that occur without a predictable pattern, varying in magnitude and direction.

Error = Measured Value − True Value
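With invented readings, the two components can be separated as in the sketch below: the mean of the errors estimates the systematic part, and the scatter about that mean estimates the random part:

import statistics

true_value = 100.0
readings = [101.8, 102.1, 101.9, 102.3, 101.9]  # invented data

errors = [r - true_value for r in readings]  # Error = Measured Value - True Value
systematic = statistics.mean(errors)         # consistent offset (bias)
random_part = statistics.stdev(errors)       # unpredictable scatter

print(f"systematic error: {systematic:+.2f}")  # +2.00
print(f"random error:     {random_part:.2f}")  # 0.20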


6. Range

Range is the span between the minimum and maximum limits within which an instrument can accurately measure. For example, a thermometer might have a range of -10°C to 50°C.


7. Rangeability

Rangeability, or turndown ratio, is the ratio of the maximum to the minimum value that an instrument can measure while remaining within its specified accuracy. It is a critical factor for instruments that need to operate effectively at various levels within their range.

Rangeability = Maximum Flow Rate / Minimum Flow Rate
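For example, with assumed limits of 5 and 50 m³/hr, the sketch below computes a rangeability of 10:1 and flags demands that fall below the accurate minimum:

def rangeability(max_flow, min_flow):
    # Ratio of the largest to the smallest accurately measurable flow
    return max_flow / min_flow

max_flow, min_flow = 50.0, 5.0  # m3/hr, assumed accuracy limits of the meter

print(f"rangeability: {rangeability(max_flow, min_flow):.0f}:1")  # 10:1

demand = 3.0  # m3/hr
if demand < min_flow:
    print("demand is below the minimum accurate flow: reading is unreliable")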


8. Calibration

Calibration involves setting or correcting an instrument by comparing its output to a standard or known measurement. It is essential for ensuring that an instrument provides accurate readings over time. Calibration typically involves a series of adjustments to align the instrument with the standard.
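As a hedged sketch of the simplest case, a two-point (zero and span) linear correction, with assumed raw readings against a reference standard:

def make_correction(raw_low, raw_high, ref_low, ref_high):
    # Build a linear map from raw instrument readings onto the standard
    gain = (ref_high - ref_low) / (raw_high - raw_low)
    offset = ref_low - gain * raw_low
    return lambda raw: gain * raw + offset

# Assumed data: the instrument read 0.8 at the 0.0 standard and 99.2 at the 100.0 standard
correct = make_correction(raw_low=0.8, raw_high=99.2, ref_low=0.0, ref_high=100.0)

print(correct(0.8))   # 0.0   (zero point now matches the standard)
print(correct(50.0))  # ~50.0 (mid-scale corrected)

Real calibration procedures typically check several points across the range and may apply nonlinear corrections; the two-point form above is only the simplest case.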


9. Hysteresis

Hysteresis refers to the phenomenon where the output of an instrument depends not only on its current input but also on its history of past inputs. This is commonly observed in materials and sensors where the response to increasing input differs from the response to decreasing input.

Hysteresis Curve: A graph typically showing the input-output relationship during both increasing and decreasing inputs.
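One simple way to quantify hysteresis, using invented sweep data, is to compare the two branches of the curve point by point and take the largest gap:

# Invented sensor data: outputs recorded at the same inputs on the way up and down
inputs = [0, 25, 50, 75, 100]
output_up = [0.0, 24.0, 49.0, 74.5, 100.0]    # input increasing
output_down = [0.0, 26.5, 51.5, 76.0, 100.0]  # input decreasing

gaps = [abs(u - d) for u, d in zip(output_up, output_down)]
worst = max(gaps)

print(f"max hysteresis: {worst} at input {inputs[gaps.index(worst)]}")
# -> max hysteresis: 2.5 at input 25

Hysteresis specifications are often quoted as this worst-case gap expressed as a percentage of full scale.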


10. Turndown

Turndown ratio, also referred to as Rangeability, is the ratio of the maximum flow rate to the minimum flow rate an instrument can measure with acceptable accuracy. It indicates the operational flexibility of devices like flow meters.

11. Sensitivity

Sensitivity describes the responsiveness of an instrument to changes in the measured quantity: it indicates how much the output of an instrument changes when the input quantity changes. In other words, sensitivity is the ratio of the change in the output signal to the change in the input signal.

Sensitivity = Change in Output / Change in Input
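For instance (values assumed for a Type K thermocouple, whose sensitivity is roughly 41 µV/°C near room temperature):

# Sensitivity = change in output / change in input
delta_input = 10.0    # degC change in measured temperature (assumed)
delta_output = 410.0  # uV change in thermocouple output (assumed)

sensitivity = delta_output / delta_input
print(f"sensitivity: {sensitivity} uV/degC")  # 41.0

A higher sensitivity means a larger output swing for the same input change, which generally makes small changes easier to resolve.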