Metrology & Measurement
Overview of Metrology and Linear Measurement
Definitions of Metrology
 "Metro" (from the Greek metron) means measurement.
 "Logy" (from the Greek logos) means science.
 Metrology is therefore the science of measurement.
“Metrology is concerned with the establishment,
reproduction, conservation and transfer of units of
measurement and their standards”
Objectives of Metrology
 To establish units of measurement.
 To achieve standardization.
 To define methods of measurement.
 To find and correct errors in measurement.
 To study measuring instruments and devices.
 To determine the accuracy and precision of instruments.
 To study industrial applications of metrology.
 To study the design, manufacture and testing of all kinds of gauges.
Types of Metrology
Scientific Metrology:
It is also called Fundamental Metrology.
It deals with the organization & development of measurement standards & with their maintenance.
Types of Metrology
Industrial Metrology:
It deals with ensuring the adequate functioning of measuring instruments used in industry, as well as in production & testing processes.
It is necessary for maintaining quality in industrial activities.
Types of Metrology
Legal Metrology:
Its function is to regulate, advise,
supervise & control the manufacturing &
calibration of measuring instruments.
Inspection
“It is the act of checking all materials, products or components at various stages during manufacturing”
Need for Inspection
i. To ensure components & parts conform to the given dimensions.
ii. To achieve interchangeability in manufacture.
iii. To produce parts of an acceptable quality level.
iv. To judge the possibility of reworking defective parts and re-engineering the process.
v. To purchase good-quality raw materials, tools & equipment.
Methods of Measurements
1. Direct Method:
 This is the simplest method of measurement in which the
value of the quantity to be measured is obtained directly
without any calculations, e.g. measurements by scale,
calipers & micrometers.
 It may involve contact or non-contact inspection.
2. Indirect Method:
 The value of the quantity to be measured is obtained by measuring other quantities that are related to the required value.
 E.g. density calculation by measuring mass & volume.
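As a minimal sketch of the indirect method (with invented values that happen to match steel roughly), density is never read from an instrument; it is computed from two direct measurements:

```python
# Indirect measurement: density is derived from two direct
# measurements (mass and volume) rather than read off an instrument.
# All values are illustrative, not from a real sample.

mass_kg = 0.785        # measured directly with a balance
volume_m3 = 1.0e-4     # measured directly, e.g. by water displacement

density = mass_kg / volume_m3              # derived (indirect) quantity
print(f"Density = {density:.0f} kg/m^3")   # -> 7850 kg/m^3, typical of steel
```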
Characteristics of Instruments
Least Count
“Least count is the smallest
dimension that can be measured
using the measuring instrument”
Range and Span
“Range is the minimum and maximum values
that an instrument can measure.”
“Span is the difference between the largest
and smallest reading of an instrument.”
For example, a thermometer with a scale from 0°C to 100°C has a range of 0°C to 100°C and a span of 100°C.
Accuracy
“Accuracy is the degree of closeness between
measured value and true value or standard
value.”
Precision
“It is the degree of repeatability in the
measuring process.”
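The two ideas can be separated numerically: accuracy compares the mean of repeated readings with the true value, while precision looks only at their spread. A minimal sketch, with invented readings and an assumed true value:

```python
import statistics

true_value = 25.00                                # assumed reference value (mm)
readings = [25.02, 25.01, 25.03, 25.02, 25.02]    # invented repeated readings (mm)

mean = statistics.mean(readings)
accuracy_error = mean - true_value        # closeness of the mean to the true value
precision = statistics.stdev(readings)    # spread of the repeated readings

print(f"Error from true value : {accuracy_error:+.3f} mm (accuracy)")
print(f"Standard deviation    : {precision:.3f} mm (precision)")
# Here the readings are precise (tiny spread) but slightly inaccurate
# (mean offset of +0.02 mm from the true value).
```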
Reliability
“Reliability in measurement is the consistency of a method in producing the same results.”
A measurement is considered reliable if the same
method produces the same results when used under
the same conditions.
If a thermometer displays the same temperature for a
liquid sample multiple times under identical
conditions, the results are reliable.
Calibration
“Calibration is a process that compares the output of a
measuring device to a known standard to ensure that
the device is providing accurate measurements”
If the instrument is used continuously, it should be
calibrated periodically.
Calibration is important because it helps to ensure that
the device is operating within its specified accuracy
range.
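As a hedged sketch of the idea, a simple two-point calibration compares the device output against two known standards and solves for a gain and offset; all numbers below are invented:

```python
# Two-point calibration sketch: compare instrument readings against
# two known standards, then correct future readings with a linear map.
# All values are illustrative.

std_low, std_high = 0.0, 100.0     # known standard values
read_low, read_high = 1.2, 98.5    # what the instrument actually indicated

gain = (std_high - std_low) / (read_high - read_low)
offset = std_low - gain * read_low

def corrected(reading: float) -> float:
    """Map a raw instrument reading onto the standard scale."""
    return gain * reading + offset

print(f"{corrected(50.0):.2f}")    # a raw mid-scale reading, corrected
```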
Hysteresis
“It is the phenomenon in which an instrument gives different outputs for the same input during loading and unloading.”
It is commonly observed in electrical equipment.
Dead Zone
“The range of values of the measured variable to which the instrument does not respond.”
Drift
“Drift in measurement is a gradual change in the output or indication of a measuring instrument over time, for the same input.”
Drift can be caused by a number of factors, including wear and tear, temperature fluctuations, environmental variables, and aging components.
To deal with drift, the instrument can be recalibrated periodically.
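One simple way to monitor drift, sketched below with invented periodic checks of a fixed reference: fit a straight line to reading versus time and treat a non-negligible slope as drift.

```python
# Periodic checks of the same fixed reference block (invented data):
# days since calibration vs. indicated size in mm (true size 10.000 mm).
days    = [0, 30, 60, 90, 120]
reading = [10.000, 10.002, 10.005, 10.007, 10.010]

# Ordinary least-squares slope, computed by hand.
n = len(days)
mean_t = sum(days) / n
mean_r = sum(reading) / n
slope = (sum((t - mean_t) * (r - mean_r) for t, r in zip(days, reading))
         / sum((t - mean_t) ** 2 for t in days))

print(f"Drift rate ~ {slope * 1000:.4f} um/day")   # ~0.0833 um/day here
```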
Sensitivity
“The smallest change in an input signal that causes a measuring device to respond.”
It is often expressed as the ratio of the change in output to the change in input.
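Numerically, static sensitivity is commonly taken as that ratio of output change to input change; the sketch below uses invented numbers for a dial-gauge-style instrument:

```python
# Static sensitivity = change in output / change in input.
# Invented example: a 0.02 mm change at the plunger moves the
# pointer 2.0 mm along the dial scale.
delta_input_mm = 0.02
delta_output_mm = 2.0

sensitivity = delta_output_mm / delta_input_mm
print(f"Sensitivity = {sensitivity:.0f} mm of pointer travel per mm of input")
```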
Threshold
“The minimum value of input required for the instrument to respond is called the threshold.”
Repeatability
“Repeatability is a measure of how likely it is to get the
same result when repeating an experiment with the
same setup”
Reproducibility
“Reproducibility in measurement is the consistency of results obtained when a measurement is repeated under changed conditions, e.g. by different operators or with different instruments.”
Linearity
“The ability of an instrument to give an output linearly proportional to the input.”
Amplification
“Amplification is the process of increasing the
magnitude of a signal or measurement, making it more
detectable or easier to measure.”
Magnification
“Magnification is the enlargement of an object's image
or signal in comparison to its actual size or original
magnitude. ”
Speed of response
“The speed of response is the time taken by a
measuring instrument or system to react to a change in
the quantity being measured. ”
Fidelity
“Fidelity refers to the degree to which a measurement system accurately reproduces the changes in the input signal or parameter without distortion.”
Overshoot
“An overshoot is a short-term spike or transient in a waveform which rises above the top or drops below the base level of the waveform.”
Standards
Standard is defined as “something that is set-up and
established by authority as a rule for the measure of
quantity, weight, extent, value or quality.”
Types of Standards
1. Line Standard
2. End Standard
3. Wavelength Standard
Line Standards
When length is measured as the distance
between centers of two engraved lines
End Standards
When length is measured as the distance between two flat parallel faces
Wavelength Standards
When length is expressed in terms of the wavelength of light.
Linear Measuring Instruments
Vernier caliper
 Vernier principle: when two scales (main and auxiliary) with divisions slightly different in size are used, the difference between them can be utilized to enhance the accuracy of measurement.
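Assuming a common metric caliper (1 mm main-scale divisions, 50 vernier divisions), the principle gives a least count and a reading as sketched below; all values are illustrative:

```python
# Vernier reading sketch (illustrative values for a common metric caliper).
# Least count = one main-scale division / number of vernier divisions.
msd_mm = 1.0          # value of one main-scale division
n_vernier = 50        # number of divisions on the vernier scale
least_count = msd_mm / n_vernier    # 0.02 mm

main_scale_reading = 24.0    # last main-scale mark before the vernier zero (mm)
coinciding_division = 17     # vernier line that aligns with a main-scale line

reading = main_scale_reading + coinciding_division * least_count
print(f"Least count = {least_count} mm, reading = {reading:.2f} mm")  # 24.34 mm
```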
Construction :
Vernier height gauge
Similar to a vernier caliper
except that the fixed jaw in
this case is replaced by a
fixed base which rests on a
surface plate or table when
taking measurements.
Vernier depth gauge
Micrometers
• A useful device for magnifying small movements into readable measurements.
• An accurate screw and nut are used for measurement.
• Micrometers work on the principle of the screw and nut. The screw is attached to a concentric cylinder or thimble, the circumference of which is divided into a number of equal parts. When the screw is turned through the nut by one revolution, its axial movement is equal to the pitch of the screw thread.
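A sketch of the screw-and-nut arithmetic, assuming a typical metric micrometer (0.5 mm pitch, thimble divided into 50 parts); the reading values are illustrative:

```python
# Micrometer reading sketch (typical metric micrometer, illustrative values).
pitch_mm = 0.5            # axial movement per full revolution of the screw
thimble_divisions = 50    # equal parts marked around the thimble
least_count = pitch_mm / thimble_divisions    # 0.01 mm

sleeve_reading_mm = 7.5   # main (sleeve) scale reading
thimble_mark = 28         # thimble division aligned with the reference line

reading = sleeve_reading_mm + thimble_mark * least_count
print(f"Least count = {least_count} mm, reading = {reading:.2f} mm")  # 7.78 mm
```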
Errors in Measurement
Error in measurement = Measured value - True value
1. Gross Errors
2. Systematic Errors
3. Random Errors
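The defining formula above lends itself to a tiny numeric sketch (invented values); the relative error is often quoted alongside the absolute one:

```python
# Error in measurement = measured value - true value (illustrative numbers).
measured_mm = 50.08
true_mm = 50.00

absolute_error = measured_mm - true_mm
relative_error = absolute_error / true_mm

print(f"Absolute error: {absolute_error:+.2f} mm")
print(f"Relative error: {relative_error:+.4%}")    # +0.1600%
```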
Gross Errors
This class of errors mainly covers human mistakes in
reading measuring instruments and recording and
calculating measurement results.
Systematic Errors
These types of errors are divided into three categories:
(i) Instrumental errors
(ii) Environmental errors
(iii) Observational errors
Instrumental Errors
These errors arise due to three main reasons :
(a) Due to inherent shortcomings in the instrument
(b) Due to misuse of the instruments
(c) Due to loading effects of instruments
Environmental Errors
These errors in measuring instruments are due
to conditions external to the measuring device
including conditions in the area surrounding the
instrument.
These may be effects of temperature, pressure, humidity, dust, vibrations, or of external magnetic or electrostatic fields.
Observational errors
These errors arise from the observer, e.g. parallax error when the scale is read from an incorrect angle.
Random Errors
It has been consistently found that experimental
results show variation from one reading to another,
even after all systematic errors have been accounted
for.
These types of errors in measuring instruments are due to a multitude of small factors which change or fluctuate from one measurement to another and are due purely to chance.
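Because random errors fluctuate by chance, they are normally treated statistically: take repeated readings, report the mean, and use the standard deviation as a measure of scatter. A minimal sketch with invented readings:

```python
import statistics

# Repeated readings of the same quantity (invented values, mm);
# systematic errors are assumed to have been corrected already.
readings = [10.03, 9.98, 10.01, 10.00, 10.02, 9.99]

best_estimate = statistics.mean(readings)
scatter = statistics.stdev(readings)    # sample standard deviation

print(f"Best estimate    : {best_estimate:.3f} mm")
print(f"Scatter (1 sigma): {scatter:.3f} mm")
```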
Selection of instrument
Accuracy and Precision: Choose an instrument that meets
the required level of accuracy and precision for your task.
Range: Ensure the instrument can measure within the range
of values you expect.
Sensitivity: Select an instrument that is sensitive enough to
detect small changes in the quantity being measured.
Speed of Response: Pick an instrument that reacts quickly
enough for your application, especially in dynamic systems.
Selection of instrument
Environment: Consider factors like temperature, humidity, and
vibration in the operating environment, as some instruments may
not perform well under certain conditions.
Ease of Use: The instrument should be simple to operate and
read, minimizing chances of errors.
Cost: Balance the cost of the instrument with the level of accuracy
and functionality you need.
Durability and Maintenance: Choose instruments that are
reliable, require minimal maintenance, and have a long service life.
Precautions while using an instrument
 Ensure proper calibration and regular recalibration.
 Use the instrument in a stable environment (avoid
vibrations, extreme temperatures, and humidity).
 Handle the instrument carefully to prevent damage
or misalignment.
 Avoid parallax error by aligning eyes with the scale
or using digital displays.
 Allow sufficient warm-up time for electronic
instruments.
Precautions while using an instrument
 Check and adjust the zero setting before measurement.
 Follow consistent measurement procedures and take
averages for accuracy.
 Securely place the instrument on a stable surface or
mount it.
 Keep the instrument and objects clean and free of debris.
 Minimize human errors by following the user manual
and procedures.
 Perform regular maintenance and store the instrument
properly.
