Instruments and Measurement Systems - 1
True Value:
It is the exact value or the perfectly correct value in any measuring scheme.
It is defined as the average of an infinite number of measured values, taken as the average
deviation due to the various contributing factors approaches zero.
The true value is one which we cannot reach by experiment. In actual practice, the true value is
usually taken from a laboratory standard or obtained with all possible error-cancelling
provisions.
Accuracy:
We cannot say that any measurement will be exactly correct.
The term accuracy is used to express how close the measured value is to the true value.
When we say the readings obtained are very accurate, it means the readings are true for all
practical purposes.
Accuracy is defined as the degree of closeness with which an instrument reading approaches the
true value of the quantity being measured. Accuracy can be expressed in three ways:
1. Point accuracy
2. Accuracy as a percentage of scale range
3. Accuracy as a percentage of true value.
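As a quick worked sketch (the instrument range and readings below are invented, not from the
text), the same reading gives different accuracy figures depending on whether it is expressed as
a percentage of the scale range or as a percentage of the true value:

    # Illustrative sketch: a hypothetical 0-100 degC thermometer reads
    # 49.5 degC when the true value is 50.0 degC.
    full_scale = 100.0   # degC, assumed instrument range
    true_value = 50.0    # degC
    measured   = 49.5    # degC

    error = measured - true_value
    accuracy_point      = abs(error)                        # point accuracy, in degC
    accuracy_full_scale = abs(error) / full_scale * 100.0   # % of scale range
    accuracy_true_value = abs(error) / true_value * 100.0   # % of true value

    print(f"error = {error:+.1f} degC")
    print(f"as % of full scale: {accuracy_full_scale:.2f} %")   # 0.50 %
    print(f"as % of true value: {accuracy_true_value:.2f} %")   # 1.00 %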
Precision / Reproducibility
Precision refers to the ability of an instrument to give consistent readings. When we say an
instrument is a precise instrument, we mean that it gives uniformly equal readings, repeatedly,
for a given quantity under measurement.
It is defined as the degree of closeness with which a given quantity may be repeatedly
measured. A high value of reproducibility means a low value of drift. No drift means that, with a
given input, the measured values do not vary with time.
Drift:
It is an undesired gradual departure of the instrument output over a period of time that is
unrelated to changes in the input, operating conditions or load;
or
it is the gradual shift in the indication or record of the instrument over an extended period of
time, during which the true value of the variable does not change. Drift is an undesirable quality
in instruments, so instruments are properly guarded against drift.
Drift is of three types:
1. Zero drift
2. Span drift
3. Zonal drift
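The first two types can be pictured with a small hedged model (the function name and numbers
below are illustrative only): zero drift shifts every reading by a constant, while span drift
changes the slope in proportion to the input. Zonal drift, which affects only a portion of the
scale, is not modelled here.

    # Hypothetical sketch of zero drift and span drift. A drift-free
    # instrument obeys output = input; zero drift adds a constant offset
    # over time, span drift changes the slope.
    def indicated(true_input, zero_drift=0.0, span_drift=0.0):
        """Reading of an instrument with the two classic drift components."""
        return (1.0 + span_drift) * true_input + zero_drift

    for x in (0.0, 50.0, 100.0):
        print(x, indicated(x, zero_drift=0.5, span_drift=0.02))
    # zero drift shifts every reading by +0.5; span drift adds 2 % of input.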
Error:
What we measure using an electrical instrument is termed the measured (actual) value, while the
perfectly correct reading is the true value. Error is therefore defined as the difference between
the measured value and the true value:
Error = Measured Value - True Value.
Uncertainty:
Uncertainty denotes the range of the error, i.e. the region in which the error is estimated to lie.
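A minimal worked sketch of these two definitions, with invented values: the error is a single
signed number, while the uncertainty states a band around the reading.

    # Worked sketch of the error definition (values are invented):
    measured, true_value = 101.3, 100.0
    error = measured - true_value            # Error = Measured - True = +1.3
    # Uncertainty states a band around the reading rather than a single error:
    uncertainty = 0.5                        # assumed, e.g. from a calibration report
    print(f"error = {error:+.1f}, reading = {measured} +/- {uncertainty}")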
Sensitivity:
It means the ability to sense readily and accurately a slight change in the input quantity.
When an instrument reacts to even a slight change in the input quantity, we can say that the
instrument is very sensitive.
It is defined as the ratio of the magnitude of the output signal (response) to the magnitude of
the input signal.
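For instance, a sketch of the sensitivity ratio using thermocouple-like numbers (the figures are
assumed for illustration):

    # Sensitivity as output change / input change. E.g. a thermocouple
    # giving 0.41 mV at 10 degC and 4.10 mV at 100 degC (invented readings):
    d_output = 4.10 - 0.41     # mV
    d_input  = 100.0 - 10.0    # degC
    sensitivity = d_output / d_input
    print(f"sensitivity = {sensitivity:.4f} mV/degC")   # ~0.041 mV/degC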
Scale readability:
Indicates the closeness with which the scale can be read.
Repeatability:
It is defined as the variation of scale readings; it is a measure of the closeness with which a
given input can be measured over and over again.
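The following sketch (readings invented) shows how repeatability and precision differ from
accuracy: the spread of repeated readings measures precision, while the offset of their mean
from the true value measures accuracy.

    # Contrast precision (spread of repeated readings) with accuracy
    # (closeness to the true value); readings are invented.
    import statistics

    true_value = 50.0
    readings = [49.1, 49.2, 49.1, 49.0, 49.2]   # repeated measurements

    mean = statistics.mean(readings)
    spread = statistics.stdev(readings)

    print(f"mean = {mean:.2f}  (offset from true value: {mean - true_value:+.2f})")
    print(f"std dev = {spread:.2f}  (small spread = high precision/repeatability)")
    # Here the instrument is precise (readings cluster tightly) but not
    # accurate (the cluster sits about 0.9 below the true value).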
Random errors
These are the errors that remain even after the systematic errors have been more or less
eliminated. They are due to a multitude of small factors which together give rise to variation in
the readings of the instrument. They are also referred to as residual errors because they remain
after all known sources of error have been taken care of.
Random errors are accidental, small and independent.
They vary in an unpredictable manner; neither the magnitude nor the direction of these errors
can be predicted.
The most common sources of these errors are:
- friction in instrument movement
- backlash in the instrument
- parallax errors between pointer and scale
- finite dimensions of the pointer and scale divisions
- hysteresis in elastic members
- mechanical vibrations
NOTE: General sources of error include noise, response time, design limitations, energy
exchanged by interaction, transmission, deterioration of the measuring system, ambient
influences on the measuring system, and errors of observation and interpretation.
1.1 Introduction
Dairy processing unit operations mainly involve heating, cooling, separating, drying or
freezing of the products. These unit operations are carried out under varying conditions
of temperatures, pressures, flows and physical compositions. The measurement and
control of these variable factors at the various stages of processing call for accurate
and efficient instruments, in addition to dependence upon human skills. With the
advent of large-scale milk handling plants, automatic operation and control through
efficient instrumentation and automation have become even more necessary. Utilities
such as steam, water, electricity, air, fuel etc. have to be measured and controlled at
appropriate points in the plant. Automatic control instruments are employed to measure
and control the temperature, pressure, flow and level of these utilities. The overall aim
of the instrumentation/ automation is to improve the product quality and enhance the
plant efficiency for better economic returns.
1.2 Variable
1.3 Measurement
When we decide to study a variable we need to devise some way to measure it. Some
variables are easy to measure and others are very difficult. The values of variables are
made meaningful by quantifying them into specific units. For example, instead of saying
that a particular fluid is hot, we can measure its temperature and state that the fluid
has a temperature of 80°C.
The definition, agreement, and practical use of units of measurement have played a
crucial role in human endeavor from early ages up to this day. Different systems of
units used to be very common. Now there is a global standard, the International
System of Units (SI), the modern form of the metric system.
There are two types of SI units, base units and derived units. Base units are the simple
measurements for time, length, mass, temperature, and amount of substance, electric
current and light intensity. Derived units are constructed from the base units; for
example, the watt, i.e. the unit for power, is defined from the base units as kg·m²·s⁻³.
Other physical properties may be measured in compound units, such as material
density, measured in kg/m3.
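As a worked illustration, the watt can be traced back to the base units step by step:

    \mathrm{W} = \frac{\mathrm{J}}{\mathrm{s}}
               = \frac{\mathrm{N}\cdot\mathrm{m}}{\mathrm{s}}
               = \frac{(\mathrm{kg}\cdot\mathrm{m}/\mathrm{s}^{2})\cdot\mathrm{m}}{\mathrm{s}}
               = \mathrm{kg}\cdot\mathrm{m}^{2}\cdot\mathrm{s}^{-3}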
DEFINITIONS OF STANDARD UNITS
Science is based on objective observation of the changes in variables. The greater our
precision of measurement the greater can be our confidence in our observations. Also,
measurements are always less than perfect, i.e., there are errors in them. The more we
know about the sources of errors in our measurements the less likely we will be to draw
erroneous conclusions. With the progress in science and technology, new phenomena
and relationships are constantly being discovered and these advancements require
newer developments in measurement systems. Any invention is not of any practical
utility unless it is backed by actual measurements. The measurements thus confirm the
validity of a given hypothesis and also add to its understanding. This is a continuous
chain that leads to new discoveries with new and more sophisticated measurement
techniques. While elementary measurements require only ordinary methods of
measurement, the advanced measurements are associated with sophisticated methods
of measurement. The advancement of Science and Technology is therefore dependent
upon a parallel progress in measurement techniques. It can safely be said that the
progress in Science and Technology of any country can be assessed by the way in
which data is acquired by measurements and is processed.
In R&D applications, the design of equipment and processes requires basic
engineering design data on the properties of the input raw materials and processed
products. The operation and maintenance of equipment at optimal processing
variables, to achieve the best quality product and energy-efficient equipment utilization,
require the monitoring and control of several process variables. Both these functions
require measurements. The economical design, operation and maintenance require a
feedback of information. This information is supplied by appropriate measurement
systems.
The measurement systems and the instruments may be classified based upon the
functions they perform. There are four main functions performed by them: indicating,
signal processing, recording and control.
i. Indicating Function: This function includes supplying information
concerning the variable quantity under measurement. Several types of
methods could be employed in the instruments and systems for this purpose.
Most of the time, this information is obtained as the deflection of a pointer of
a measuring instrument.
ii. Recording Function: In many cases the instrument makes a written record,
usually on paper, of the value of the quantity under measurement against
time or against some other variable. This is a recording function performed
by the instrument. For example, a temperature indicator / recorder in the
HTST pasteurizer gives the instantaneous temperatures on a strip chart
recorder.
iii. Signal Processing: This function is performed to process and modify the
measured signal to facilitate recording / control.
iv. Controlling Function: This is one of the most important functions,
especially in the food processing industries where the processing operations
are required to be precisely controlled. In this case, the information is used
by the instrument or the systems to control the original measured variable or
quantity.
Thus, based on the above functions, there are three main groups of instruments. The
largest group has the indicating function. Next in line is the group of instruments which
have both indicating and recording functions. The last group falls into a special
category and performs all the three functions, i.e., indicating, recording and controlling.
In this lesson, only those instruments will be discussed whose functions are mainly
indicating and recording, especially those instruments which are used for engineering
analysis purposes.
The following are the basic requirements of a good quality measurement system /
instrument:
a) Ruggedness
b) Linearity
c) No hysteresis
d) Repeatability
e) High output signal quality
f) High reliability and stability
g) Good dynamic response
2. CALIBRATION
2.1 DEFINITIONS
Typically, the accuracy of the standard should be ten times the accuracy of the
measuring device being tested; however, an accuracy ratio of 3:1 is acceptable to most
standards organizations. Calibration of a measuring instrument has two objectives:
to determine the error of the device under test (DUT), and to verify that its accuracy is
within the specified limits.
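A minimal sketch of this accuracy-ratio check (the tolerances are invented; the 3:1 threshold
follows the text):

    # Test accuracy ratio (TAR): the standard should ideally be 10x
    # (acceptably at least 3x) more accurate than the DUT.
    dut_tolerance     = 0.5    # e.g. +/- 0.5 degC for the DUT (assumed)
    standard_accuracy = 0.05   # e.g. +/- 0.05 degC for the reference (assumed)

    tar = dut_tolerance / standard_accuracy
    print(f"TAR = {tar:.1f}:1 ->",
          "OK (>= 3:1)" if tar >= 3 else "standard not accurate enough")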
There are two methodologies for obtaining the comparison between the test instrument
and the standard instrument. These methodologies are:
i. Direct comparisons
ii. Indirect comparisons
Direct comparisons:
In a direct comparison, a source or generator applies a known input to the meter under
test. Comparing the meter's indication with the known generator value gives the
meter's error. In such a case the meter is the test instrument, while the generator is the
standard instrument. The deviation of the meter from the standard value is compared
with the allowable performance limit. With the help of direct comparison, a generator or
source can also be calibrated.
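A single calibration point under direct comparison might be checked as in this sketch (all
values and the limit are assumed for illustration):

    # Direct-comparison calibration point: a source applies a known input
    # and the meter's indication is compared against it and against an
    # allowable performance limit.
    applied   = 10.000   # known value from the standard source/generator
    indicated = 10.023   # reading of the meter under test
    limit     = 0.050    # allowable deviation from the spec sheet (assumed)

    deviation = indicated - applied
    print(f"deviation = {deviation:+.3f}",
          "PASS" if abs(deviation) <= limit else "FAIL")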
Indirect comparisons:
In an indirect comparison, the test instrument is compared with a standard
instrument of the same type, i.e., if the test instrument is a meter, the standard
instrument is also a meter; if the test instrument is a generator, the standard
instrument is also a generator; and so on. If the test instrument is a meter, then the
same input is applied to the test meter as well as to the standard meter.
The accuracy of all measuring devices degrades over time. This is typically caused by
normal wear and tear. However, changes in accuracy can also be caused by electric or
mechanical shock or a hazardous manufacturing environment. Depending on the type
of instrument and the environment in which it is being used, it may degrade very
quickly or over a long period of time. The bottom line is that calibration improves the
accuracy of the measuring device. Accurate measuring devices improve product quality.
The hidden costs and risks associated with uncalibrated measuring instruments could
be much higher than the cost of calibration. Therefore, it is recommended that
measuring instruments be calibrated regularly by a reputable company to ensure that
errors associated with the measurements are within the acceptable range. People who
perform calibration in laboratories include:
i. Metrologists
ii. Lab managers
iii. Calibration engineers
iv. Calibration technicians
Calibrators
[Figure: Example of a calibrator]
Calibration Disciplines
There are many calibration disciplines, each having different types of calibrators and
calibration references. Common calibration disciplines include but are not limited to:
i. Electrical instrumentation
ii. Radio frequency (RF)
iii. Temperature
iv. Humidity
v. Pressure
vi. Flow
2.5 ELECTRICAL INSTRUMENTS CALIBRATION PROCEDURE
There are several ways to calibrate an instrument depending on the type of instrument
and the chosen calibration scheme. There are two general calibration schemes:
From this basic set of calibration schemes, the calibration options expand with each
measurement discipline.
Calibration Steps
A calibration process starts with the basic step of comparing a known with an unknown
to determine the error or value of the unknown quantity. However, in practice, a
calibration process may consist of "as found" verification, adjustment, and "as left"
verification. Many measurement devices are adjusted physically (turning an adjustment
screw on a pressure gauge), electrically (turning a potentiometer in a voltmeter), or
through internal firmware settings in a digital instrument.
For example, for some devices, the data attained in calibration is maintained on the
device as correction factors, where the user may choose to compensate for the known
correction for the device. An example of this is RF attenuators, where their attenuation
values are measured across a frequency range. The data is kept with the instrument in
the form of correction factors, which the end-user applies to improve the quality of their
measurements. It is generally assumed that the device in question will not drift
significantly, so the corrections will remain within the measurement uncertainty
provided during the calibration for the calibration interval. It is a common mistake for
people to assume that all calibration data can be used as correction factors, because
the short and long term variation of the device may be greater than the measurement
uncertainty during the calibration interval.
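In the spirit of the attenuator example, here is a hedged sketch of how an end-user might apply
stored correction factors (the frequencies, values, and nearest-point strategy are illustrative
assumptions, not a procedure described in the text):

    # The calibration report stores a measured correction per frequency;
    # the end-user corrects raw readings with the nearest tabulated factor.
    corrections = {          # frequency (MHz) -> measured correction (dB)
        100: -0.12,
        500: -0.05,
        1000: +0.08,
    }

    def corrected(reading_db, freq_mhz):
        """Apply the correction factor for the closest calibrated frequency."""
        nearest = min(corrections, key=lambda f: abs(f - freq_mhz))
        return reading_db + corrections[nearest]

    print(corrected(20.00, 480))   # uses the 500 MHz factor -> 19.95 dB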
The “as left” verification step is required any time an instrument is adjusted to ensure
the adjustment works correctly. Artifact instruments are measured “as-is” since they
can’t be adjusted, so “as found” and “as left” steps don’t apply.
All instruments are calibrated at the time of manufacture against measurement
standards. A standard of measurement is a physical representation of a unit of
measurement; a standard is a known, accurate measure of a physical quantity.
Standards are classified as:
i. International standards
ii. Primary standards
iii. Secondary standards
iv. Working standards
International standards
These are defined by international agreement and are maintained at the International
Bureau of Weights and Measures (BIPM). They represent the units of measurement to the
closest possible accuracy attainable and are not available to ordinary users for purposes
of calibration.
Primary standards
These are highly accurate absolute standards, which can be used as ultimate reference
standards. These primary standards are maintained at national standard laboratories in
different countries. These standards representing fundamental units as well as some
electrical and mechanical derived units are calibrated independently by absolute
measurements at each of the national laboratories. These are not available for use,
outside the national laboratories. The main function of the primary standards is the
calibration and verification of secondary standards.
Secondary standards
As mentioned above, the primary standards are not available for use outside the
national laboratories. The various industries need some reference standards. So, to
protect highly accurate primary standards the secondary standards are maintained,
which are designed and constructed from the absolute standards. These are used by
the measurement and calibration laboratories in industries and are maintained by the
particular industry to which they belong. Each industry has its own standards.
Working standards
These are the basic tools of a measurement laboratory and are used to check and
calibrate the instruments used in the laboratory for accuracy and performance.
3. SPECIFICATIONS
3.1 Definition of the term Specifications
This is a written or printed description of the work, describing the qualities of the materials
and the workmanship required.
Methods of writing specifications
1. Method System
The specifier describes in detail the materials, workmanship, installation,
and erection procedures to be used by the installer in the conduct of his
work operations in order to achieve the results expected.
The method system can best be described as a descriptive specification.
The specifications code sets forth specific materials and methods that are
permitted under the law.
2. Result system
When the specifier instead elects to specify results, he places on the
installer the responsibility for securing the desired results by whatever
methods the installer chooses to use.
The result system is best described as a performance specification.
Under the performance code, materials and methods are left to the
installer and engineer, provided that performance criteria for fire
protection, structural adequacy, and sanitation are met.
As a matter of fact, both the descriptive specification and the performance
specification can be used together in the same project specification, each
in its proper place, in order to achieve the prime objective.
Advantages
i. Close control of product selection.
ii. Preparation of more detailed and complete drawings based on precise
information obtained from manufacturers' data.
iii. Decreases the size of the specification and reduces production time.
iv. Simplification of bidding by narrowing competition and removing product
pricing as a major variable.
Disadvantages
i. Elimination or narrowing of competition.
ii. Requiring products with which the contractor has perhaps little or bad
experience.
iii. Favouring of certain products and manufacturers over others.
INSTRUMENT SPECIFICATIONS
There are four common specifications typically prepared for instrumentation and control:
1. Instrument specification sheets
2. Control system specifications
3. Control panel (or control cabinet) specifications
4. Installation specifications.
Among the four, preparation of the instrument specification sheets is the most difficult
and time consuming. Although some operations still prepare these specifications
manually, many now use computer-based systems that can select the best instrument
to fit the process conditions, and then generate a specification sheet.
Such software packages also contain master specifications for control systems, control
panels, installation activities, and other specifications. These tools save time and help
produce consistent quality design.
Instrument specification sheets
The purpose of the instrument specification sheet is to list pertinent details for use by
engineers and vendors. The information is also used by installation and maintenance
personnel.
This specification sheet describes the instrument and provides a record of its function.
The information should be uniform in content, presentation, and terminology. And, of
course, the selection must consider all plant and process requirements and comply with
any code requirements in effect at the site.
The most common specification sheets used in instrumentation and control are for:
i. Flow measurement
ii. Level measurement
iii. Pressure measurement
iv. Temperature measurement
v. Analysers (including pH and conductivity)
vi. Control valves and regulators
vii. Pressure relief devices.
Typically, preparation of the instrument specification sheet involves several steps. If
software is used, some of the procedure can be automated. First, the process data are
completed, generally by a process or a mechanical engineer. Then, the best instrument
for the job is chosen.
The specification sheet is completed to cover such points as type of enclosure, type of
signal required, material in contact with the process, connection size, and the like.
Vendors are selected, prices solicited, and finally an order is placed.
Control system specifications
The control system document outlines the parameters for the computer-based control
system. It typically contains the requirements for code compliance, overview of the
system, and detailed requirements.
The information generally begins with a master specification in a word processor
document that can be tailored to the needs of each application. This document remains
in use and is typically needed long after the system is up and running.
The content of a typical control system specification covers
i. Field conditions (including temperature, humidity, and environmental)
ii. Hardware requirements (such as cabinets, communications devices, inputs
and outputs, controllers, and operator consoles)
iii. Software requirements (including system configuration capabilities, graphics,
alarms, trends, and reports)
iv. Service and support.
Control panel/cabinet specifications
The control panel document provides the guidelines for the design, construction,
assembly, testing, and shipping of control panels and cabinets. As with the control
system specification, the control panel specification generally originates with a master
word processor specification to allow the requirements of each application to be easily
customized.
A typical control panel specification is divided into sections covering design,
construction, testing, and shipping. The document also should address certain details,
such as nameplates, electrical and pneumatic requirements, and purging requirements,
if necessary.
All electrically operated instruments, or electrical components incorporated in a panel or
cabinet, must comply with the requirements of the current edition of the electrical code
in effect at the site. All such equipment should be approved (by UL or CSA) and bear
the approval label. ISA’s “Standards and Recommended Practices” also provide a
valuable source of information and guidelines for instrumentation.
Panel drawings may be generated with CAD tools, but the need for control panel
specifications and drawings has diminished with the proliferation of computer-based
control systems and the use of off-the-shelf cabinets. CAD drawings are still used,
however, to show wiring and component locations in the cabinets.
Installation specifications
The installation specification provides the requirements for installing instruments,
control systems, and their accessories. The contractor uses this document to estimate
the cost of the installation. Once again, the information in the specification should be
developed from a master specification document prepared in word-processor format to
allow for convenient customization.
The installation specification marks the transition point between engineering and
maintenance, which typically installs the equipment. The installation specification has
many parts, each covering a section of the installation. Typically, the first section is
an overview of the scope of the work.
It is followed by a description of how the instruments are to be mounted and installed,
including the connections between the process and the instruments. The specification
should also cover wiring and tubing requirements. Finally, checkout procedures should
be defined to ensure that the control system as a whole is ready for operation.
All installation work should be based on the installation specification and reference
documentation provided by the engineering phase. This reference documentation,
which forms part of the contract, clearly identifies the scope of work, thereby
minimizing misunderstandings, completion delays, and additional costs.
Drawings
The most commonly prepared drawings for instrumentation and controls are logic
diagrams, instrumentation index, loop diagrams, and interlock diagrams (or electrical
schematics). Although many companies still design drawings manually before
implementing them on a CAD system, some have moved to computer-based systems
that produce a large portion of the design automatically. Such software packages save
time and help produce a consistent design.
Logic diagrams
Logic diagrams are needed to define discrete (on/off) controls. These controls cover all
time-based and state-based logic used in process control, including PLC sequences and
hard-wired trip systems.
If the logic is simple, a written description in the control system definition or a
description on the P&ID is generally adequate. However, whenever intricate logic is
used, logic diagrams, typically drawn to conform with ANSI/ISA Standard S5.2, are
required.
Instrument index
An instrument index lists all items of instrumentation for a specific project or for a
particular plant. Its purpose is to act as a cross-reference document for each item of
instrumentation and for all documents and drawings related to the particular item. An
instrument index is typically generated and maintained on a PC using a database
program. A computer-based approach facilitates updating and retrieving data.
The instrument index is normally presented in tabular form, is generated at the start of
a project, and stays active throughout the life of the facility. The following items are
typically shown on an instrument index:
i. Tag number
ii. Description
iii. P&ID number
iv. Line/equipment number
v. Instrument specification sheet number
vi. Manufacturer’s drawing numbers
vii. Loop drawing number
viii. Interlock diagram number
ix. Location diagram number
x. Miscellaneous notes.
Some users add other information they consider important, such as the equipment
supplier and model number, installation details, purchase order number, and the like.
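As an illustrative sketch only (the tags, documents, and field names are invented), an
instrument index kept as database-like records supports exactly this kind of cross-referencing:

    # Minimal instrument-index records in a small database-like structure.
    index = [
        {"tag": "PT-101", "description": "Pasteurizer inlet pressure",
         "pid": "P&ID-12", "spec_sheet": "IS-045", "loop_drawing": "LD-101"},
        {"tag": "TT-205", "description": "HTST holding-tube temperature",
         "pid": "P&ID-14", "spec_sheet": "IS-112", "loop_drawing": "LD-205"},
    ]

    # Cross-referencing: find every document tied to one tag number.
    for rec in index:
        if rec["tag"] == "TT-205":
            print(rec["pid"], rec["spec_sheet"], rec["loop_drawing"])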
Loop diagrams
A loop diagram should be prepared for each instrument loop in the project that contains
more than one instrument. The only instruments not requiring loop diagrams are
interlock systems (these instruments are shown on the interlock diagrams) and local
devices such as relief valves (an instrument index entry should suffice for these
devices).
Loop diagrams are generated to show the detailed arrangement for instrumentation
components in all loops. All pneumatic and electronic devices with the same loop
number are generally shown on the same loop diagram. The content and format of loop
diagrams should conform to ANSI/ISA Standard S5.4.
Interlock diagrams
Interlock diagrams (electrical schematics) show the detailed wiring arrangement for
discrete (on/off) control. However, with the introduction and extensive use of
programmable electronic systems to perform logic functions, the use of interlock
diagrams has diminished over the years.
4. Instrumentation and Control Design
The purpose of the Instrumentation and Control (I&C) Design document is to cover
the project-specific technical requirements which are to be followed throughout the FEED
(Front End Engineering Design) or Detailed Engineering phase during the preparation of
engineering deliverables. The Design Basis is considered the mother document for all
engineering activities and deliverables to be carried out in a particular project.
Typical reference documents for the Design Basis include:
i. Client's Specification
ii. Electrical Hazardous Area Layout
iii. Basic Engineering Design Data
Structure of Design Basis of Instrumentation & Control
The following items, among others, should be covered in the Instrumentation & Control
Design Basis:
2. Units of Measurement
3. Control System
9. Spares
Instrumentation documentation
Instrumentation documentation consists of drawings, diagrams and schedules. The
documentation is used by various people for different purposes. Of all the disciplines in
a project, instrumentation is the most interlinked and therefore the most difficult to
control. The best way to understand the purpose and function of each document is to
look at the complete project flow from design through to commissioning.
Instrument list
This is a list of all the instruments on the plant, in the ‘List’ format. All the instruments
of the same type (tag) are listed together; for example, all the pressure transmitters
‘PT’ are grouped together.
Used by: Maintenance staff during the operation of the plant and by commissioning
staff at start up.
Elements of a measurement system
A measurement system is made up of three basic functional elements, described below.
1. Sensor
This is the element of the system which is effectively in contact with the process
for which a variable is being measured, and which gives an output that depends
in some way on the value of that variable. For example, a thermocouple produces
an e.m.f. related to the temperature being measured.
2. Signal processor
This element takes the output from the sensor and converts it into a form which
is suitable for display or onward transmission in some control system. In the case
of the thermocouple this may be an amplifier to make the e.m.f. big enough to
register on a meter (Figure 1.8B). There may often be more than one item,
perhaps an element which puts the output from the sensor into a suitable
condition for further processing and then an element which processes the signal
so that it can be displayed. The term signal conditioner is used for an element
which converts the output of a sensor into a suitable form for further processing.
Thus in the case of the resistance thermometer there might be a signal
conditioner, such as a Wheatstone bridge, which transforms the resistance
change into a voltage change, then an amplifier to make the voltage big enough
for display (Figure 1.8B) or for use in a system used to control the temperature.
3. Data presentation
This presents the measured value in a form which enables an observer to
recognize it. This may be via a display, e.g. a pointer moving across the scale of
a meter or perhaps information on a visual display unit (VDU). Alternatively, or
additionally, the signal may be recorded, e.g. in a computer memory, or
transmitted to some other system such as a control system.
The figure below shows how these basic functional elements form a measurement
system.
Example
With a resistance thermometer, element A takes the temperature signal and transforms
it into a resistance signal, element B transforms the resistance signal into a current
signal, and element C transforms the current signal into the movement of a pointer
across a scale.
Which of these elements is (a) the sensor, (b) the signal processor, and (c) the data
presentation?
The sensor is element A, the signal processor is element B, and the data presentation
element is C.
The system can be represented by the figure below.
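To tie the three functional elements together, here is a hedged sketch of the
resistance-thermometer chain (a linear Pt100 model and a 4-20 mA span over 0-100 °C are
assumptions made for illustration, not values from the text):

    # Three functional elements for the resistance-thermometer example.
    R0, ALPHA = 100.0, 0.00385   # Pt100: resistance at 0 degC, temp. coefficient

    def sensor(temp_c):
        """Element A: temperature -> resistance, R = R0 * (1 + alpha*T)."""
        return R0 * (1.0 + ALPHA * temp_c)

    def signal_processor(resistance_ohm):
        """Element B: resistance -> current signal (4-20 mA over 0-100 degC)."""
        temp_c = (resistance_ohm / R0 - 1.0) / ALPHA
        return 4.0 + 16.0 * temp_c / 100.0   # mA

    def data_presentation(current_ma):
        """Element C: current -> readable display for the observer."""
        temp_c = (current_ma - 4.0) / 16.0 * 100.0
        print(f"display: {temp_c:.1f} degC")

    data_presentation(signal_processor(sensor(80.0)))   # display: 80.0 degC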