CALIBRATION

Calibration
can be defined as the process of correcting an instrument's readings by taking
measurements from that instrument under specified reference conditions and
comparing them with known standard values. Put simply, calibration is the
comparison of the readings of a given instrument against those of a standard
one. Ideally, the two should agree with zero tolerance. When the instrument
cannot be calibrated exactly, the measurement is repeated over a number of
iterations, the error of each is recorded, and the average error is taken as
the correction.


WHY IS CALIBRATION NEEDED?

· Before measuring any data, the instrument has to be calibrated.

· After measuring the data, the instrument has to be calibrated again so as to confirm whether the data collected are reliable or not.

· The instrument has to be calibrated after any incident such as being dropped or struck by another instrument.

· The instrument has to be calibrated if the data obtained are questionable, or if the user suspects any mistake in its values.

· Some experiments require calibration certificates for the instruments before they are used.

· For some instruments, the manufacturer specifies that the instrument has to be recalibrated after a given number of uses.

TYPES OF ERRORS

· Span Shift Calibration Error – It affects the slope of the
instrument's response function, so the size of the error varies from one point to another.

· Zero Shift Calibration Error – It shifts the graph vertically,
offsetting the zero adjustment, and it affects all points
in the same way: the offset is constant across the entire range of values.

The linear
equation used to calibrate span and zero errors is

y = mx + b

Where,

y = Output

x = Input

m = Slope (span)

b = Zero offset

·
Linearity Calibration Error – It affects the response
function of the instrument in such a way that the response is no longer a
straight line. The shape of the linearity error is unique to each
instrument. If this error cannot be calibrated out, the average of the maximum
and minimum errors is taken so that the residual error range is kept to a
minimum.

·
Hysteresis Calibration Error – This error occurs when the
instrument responds differently to an increasing input than
to a decreasing input. These errors are mostly caused
by mechanical friction between moving parts of the instrument such as gears,
levers, pulleys, etc.
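The zero-shift and span-shift corrections above can be sketched with the linear equation y = mx + b. The slope and offset values used here are hypothetical, chosen only for illustration:

```python
# Sketch: correcting zero-shift and span-shift errors via y = m*x + b.
# m and b below are hypothetical calibration coefficients.

def miscalibrated_reading(x, m=1.05, b=2.0):
    """Instrument with a 5% span shift (m != 1) and a zero shift of +2 (b != 0)."""
    return m * x + b

def corrected_reading(y, m=1.05, b=2.0):
    """Invert y = m*x + b to recover the true input x."""
    return (y - b) / m

true_value = 100.0
raw = miscalibrated_reading(true_value)   # 107.0
fixed = corrected_reading(raw)            # back to 100.0
print(raw, fixed)
```

Once m and b are known from comparison against a standard, every subsequent reading can be corrected with the same inverse formula.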

IMPORTANCE OF CALIBRATION

Suppose
you perform an experiment without calibrating your instruments and submit the
values to your supervisor. The supervisor then repeats the experiment, finds
that the values were wrong, and this makes you lose your credibility.

CHARACTERISTICS OF CALIBRATION

Accuracy – The closeness of a result to the true value, expressed as a percentage.
Tolerance – The maximum percentage of error that is accepted.
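These two characteristics can be checked together: a reading passes calibration when its percentage error against the standard is within the stated tolerance. A minimal sketch, with hypothetical values:

```python
# Sketch: accuracy as percentage error, checked against a tolerance.
# The measured/standard/tolerance values are hypothetical.

def percent_error(measured, standard):
    """Percentage deviation of a measured value from the standard value."""
    return abs(measured - standard) / standard * 100.0

def within_tolerance(measured, standard, tolerance_pct):
    """True if the percentage error does not exceed the accepted tolerance."""
    return percent_error(measured, standard) <= tolerance_pct

print(percent_error(101.0, 100.0))          # 1.0
print(within_tolerance(101.0, 100.0, 2.0))  # True
print(within_tolerance(105.0, 100.0, 2.0))  # False
```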

FACTORS AFFECTING THE CALIBRATION

Temperature
Pressure
Friction
Humidity

Example
:- In the medical field, before performing a culture test the instrument has
to be calibrated by removing the pressure (creating a vacuum) and bringing it
to a low temperature. If the calibration is not done properly, accurate values
cannot be obtained, which results in the failure of the test.

METHODS OF CALIBRATION

Interpolative
– There are four different methods of this type:

CIM – Conventional Interpolative Method

IIM – Indirect Interpolative Method

IISM – Interpolative Internal Standard Method

IDM – Interpolative Dilution Method

Extrapolative
– There are four different methods of this type:

CEM – Conventional Extrapolative Method

IEM – Indirect Extrapolative Method

EISM – Extrapolative Internal Standard Method

EDM – Extrapolative Dilution Method

Indicative
– The calibration is derived from the titration of the chemical reaction
between the standard solution and the analyte.

STEPS OF CALIBRATION

Select the reference standards for the type of device to be calibrated.
Take the readings from the instrument that has to be calibrated.
Derive the relation between the standard values and the readings taken, and plot a calibration curve from it.
Correct subsequent measurements by applying the inverse of the plotted curve.
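The steps above can be sketched in code, assuming a straight-line calibration curve. The reference standards and instrument readings below are hypothetical data for illustration:

```python
# Sketch of the four calibration steps: compare instrument readings
# against reference standards, fit a calibration curve, then invert
# the curve to correct new measurements. Data are hypothetical.

def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = m*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - m * mean_x
    return m, b

# Steps 1-2: reference standards and the instrument's readings of them.
standards = [0.0, 25.0, 50.0, 75.0, 100.0]
readings  = [1.8, 28.1, 54.3, 80.6, 106.9]

# Step 3: derive the calibration curve (reading as a function of standard).
m, b = linear_fit(standards, readings)

# Step 4: correct a new raw reading with the inverse of the curve.
def correct(raw):
    return (raw - b) / m

print(round(correct(54.3), 2))  # close to 50.0
```

A straight-line fit covers the span and zero errors described earlier; an instrument with a significant linearity error would need a higher-order curve fitted in the same way.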