5. Historical Developments
▶ The early yard was defined as the distance from the tip of the nose to the tip of the thumb when the arm was outstretched.
▶ An inch of 1325 was defined as the length of three barley corns laid end to end.
6. Material Length Standards
▶ The Imperial Standard Yard is a bronze (82% copper, 13% tin, 5% zinc) bar of 1 inch × 1 inch square section and 38 inches long. A round recess, one inch from each of the two ends, is cut down to the central plane of the bar.
▶ A gold plug of 1/10 inch diameter is inserted into each of these recesses so that the engraved lines lie in the neutral plane.
▶ The yard is then defined as the distance between the two central transverse lines on the plugs at 62 °F.
7. International Yard
▶ The increasing demand for accurate, internationally interchangeable engineering devices, and the problems arising from the varying ratios between the imperial standard yard, the American standard yard and the prototype metre, were resolved in 1960:
1 International yard = 0.9144 meter
1 inch = 25.4 mm
8. Material Length Standards
▶ The American standard yard was defined at a temperature of 68 °F and was very slightly longer than the imperial standard yard.
▶ The bronze alloy was not stable and shrank continuously, at the rate of about one millionth of an inch, over the preceding 50 years.
▶ 1922 – the relationship between the yard and the metre was fixed by specifying it as a legal size: 1 yard = 0.91439841 metre.
10. International Prototype Metre
▶ Measurements of the meridian arc between Dunkirk and Barcelona were completed in 1798, and the following year a platinum length bar corresponding to one metre was deposited.
▶ This served as metric length standard until 1875 when it was replaced by
platinum-iridium, international prototype metre.
▶ The length of the metre is defined as the straight-line distance, at 0 °C, between the centre portions of two lines engraved on a bar of platinum-iridium alloy (10% iridium), 102 cm in total length and having a web-shaped cross-section.
11. Light Wave (Optical) Length standard
▶ In order to define a standard length in this way it was necessary to find a
suitable light source from which a given radiation could be readily
selected, the wavelength of the selected radiation then being measured
and used as the basic unit of length.
▶ The Eleventh General Conference on Weights and Measures, held in Paris in 1960, defined the metre as equal to 1650763.73 wavelengths of the orange-red radiation of the krypton-86 isotope.
12. Light Wave (Optical) Length standard
▶ It made the primary unit accessible to any laboratory and reduced its uncertainty by a factor of 50.
Let x = the number of wavelengths of Kr-86 radiation in one metre. Then:
1 mm = x/1000 wavelengths
1 inch = (x/1000) × 25.4 wavelengths
1 yard = (x/1000) × 25.4 × 36 wavelengths
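As a quick numerical check, the conversions above can be evaluated directly. A minimal sketch (plain Python; x = 1650763.73 is the value from the 1960 definition):

```python
# Wavelength-based length units from the 1960 Kr-86 definition.
# x is the number of Kr-86 wavelengths in one metre.
x = 1650763.73

mm_in_wavelengths = x / 1000                   # 1 mm
inch_in_wavelengths = (x / 1000) * 25.4        # 1 inch = 25.4 mm
yard_in_wavelengths = (x / 1000) * 25.4 * 36   # 1 yard = 36 inches

print(f"1 mm   = {mm_in_wavelengths:.5f} wavelengths")
print(f"1 inch = {inch_in_wavelengths:.2f} wavelengths")
print(f"1 yard = {yard_in_wavelengths:.1f} wavelengths")
```

The yard value (about 1509458.3 wavelengths) agrees with the figure quoted later in the deck.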
13. Advantages of Light Standard
▶ It does not change in length.
▶ If destroyed, it can easily be replaced.
▶ Identical copies can be kept in all standards rooms and physical laboratories.
▶ It can be used for making comparative measurements of a much higher
accuracy than was possible with the older material standards.
▶ It will give a unit of length which could be produced consistently at any
time in all countries.
14. Metre as of today
▶ Since the 1970s it had been foreseen that a definition based on the speed of light would soon be technically feasible and practically advantageous.
▶ On 20 October 1983 the 17th General Conference on Weights and Measures agreed to a fundamental change in the definition of the metre.
▶ The metre is now defined as the length of the path travelled by light in vacuum in 1/299792458 of a second.
Iodine Stabilized Helium-neon Laser
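The 1983 definition is pure arithmetic: the speed of light is fixed exactly, and the metre follows from it. A minimal illustrative sketch in plain Python:

```python
# The metre from the 1983 definition: the distance light travels in
# vacuum in 1/299792458 of a second. c is exact by definition.
c = 299_792_458        # speed of light in vacuum, m/s
t = 1 / 299_792_458    # defining time interval, s

metre = c * t          # distance travelled: one metre
print(f"1 metre corresponds to {t * 1e9:.4f} ns of light travel")
```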
16. Definition
• The word metrology is derived from two Greek words:
metron = measurement
logos = science
Thus metrology is the science of measurement.
Metrology is the field of knowledge concerned with
measurement and includes both theoretical and practical
problems with reference to measurement.
Metrology is the science of weights and measures
Metrology is the process of making precise measurements
of the relative positions and orientations of different
optical and mechanical components
Metrology is the science concerned with the
establishment, reproduction, conversion and transfer of
units of measurement and their standards
20. Elements of Metrology
• Standard
• The most basic element of measurement is standard without which no
measurement is possible.
• Standard is a physical representation of unit of measurement.
• Different standards have been developed for various units
including fundamental units as well as derived units.
• Workpiece
• Workpiece is the object to be measured/measured part
• Variations in geometry and surface finish of the measured part
directly affect measuring system’s repeatability
• Compressible materials such as plastics or nylon pose a different type of problem: any gauging pressure will distort the material. This can be avoided by fixing the gauging pressure, as suggested by industry, so that everyone obtains uniform results.
S W I P E
21. • Instruments
• Instrument is a device with the help of which the measurement
can be done
• The instrument must be selected based on the tolerance of the
parts to be measured, the type of environment and the skill
level of operators
• The type of instrument the customer prefers for the measurement should also be kept in mind.
• Person
• There must be some person or mechanism to carryout the
measurement
• Modern gauges are increasingly easy to use, but failure to adequately train the operating personnel will lead to poor performance.
22. • Environment
• The measurement should be performed under a standard environment.
• Temperature, dirt, humidity and vibration are the four environmental factors that influence measurement.
• The reading of a vernier caliper may change when the measurement is repeated 'n' times for the same dimension; the environment (temperature, humidity, conditioning, etc.) is indirectly responsible for this variation.
23. Elements of a Measuring System
S W I P E
1.Standards :- Physical quantity or property relative
to which quantitative comparison is made.
2.Workpiece:- Features of the workpiece to be
measured.
3.Instrument:- Instruments needed for measurement.
4.Person:- The skill of the human operator.
5.Environment:- The conditions in
which measurement is done.
24. Subdivisions of Standards
▶ The international yard and international prototype metre, described previously, are considered to be perfect or master standards and obviously cannot be used for general purposes. For practical purposes there is a hierarchy of working standards.
▶ For example, in most firms inspection-grade slip gauges would be used to check and measure work produced using workshop-grade slip gauges, micrometers, verniers, rules, etc.
▶ The standards are subdivided into the following four grades.
1. Primary standards
2. Secondary standards
3. Tertiary standards
4. Working standards
25. Primary Standards
▶ For precise definition of the unit it is essential
that there shall be one, and only one material
standard. This is called primary standard and is
preserved under the most careful conditions,
used only at rate intervals and then solely for
comparison with secondary standards.
26. Secondary Standards
▶ They are made as nearly as possible identical to the primary standard, with which they are compared at intervals, and records of their deviation from it are kept. These standards are distributed to a number of places for safe custody and are used in their turn for occasional comparison with tertiary standards.
▶ Materials for secondary standards:
1. Invar, an alloy of nickel and steel
2. Fused silica
3. Elinvar, an alloy of nickel and chromium.
27. Tertiary Standards
▶ These are the first standards to be used for reference purposes in laboratories and workshops. The primary and secondary types exist only as ultimate controls, for reference at rare intervals. Tertiary standards should also be maintained as a reference for comparison at intervals with working standards.
28. Working Standards
▶ “Working” or secondary line standards are necessary for use in scientific laboratories and similar institutions. They are derived from the fundamental standards. Line standard bars are usually made with an H-shaped cross-section.
• Line Standards
• End Standards
29. Classification of Standards & Traceability
▶ Application of precise measurement has increased to such an extent that it
is not practicable for a single national laboratory to perform directly all the
calibrations and standardizations required by a large country.
▶ Therefore, a traceability chain needs to be followed in stages.
International Standard → National Standards of the Country → Standardization Laboratories
30. Classification of Standards & Traceability
▶ In order to maintain accuracy and interchangeability in the items
manufactured by various industries in the country, it is essential that the
standards of units and measurements followed by them must be traceable
to a single source i.e., National Standards of the Country.
▶ Clearly, there is degradation of accuracy in passing from the defining
standards to the standard in use.
▶ All the industrial standardization and industrial product certifications are
governed by the Bureau of Indian Standards, the national standards
organization of India
▶ while standards for other areas (like agricultural products) are developed
and managed by other governmental agencies.
31. Certification Marks in India
▶ Agmark for all agricultural products
▶ BIS hallmark certifies the purity of gold jewellery.
▶ FPO mark. A mandatory mark for all processed fruit products in India.
▶ Geographical Indications marks, defined under the WTO Agreement on
Trade-Related Aspects of Intellectual Property Rights (TRIPS)
▶ India Organic certification mark for organically farmed food products
National Standards for Organic Products.
▶ ISI mark. For industrial products.
▶ Non Polluting Vehicle mark on motor vehicles certifying conformity to the
Bharat Stage emission standards.
▶ Toxicity label is mandatory on the containers of pesticides sold in India.
32. Need of Inspection
• To ensure the material, parts and components conform
to the established standards
• To meet the interchangeability of manufacture
• To provide the means of finding the problem area for
meeting the established standards
• To produce the parts having acceptable quality levels
with reduced scrap and wastage
• To purchase good quality of raw materials, tools and
equipment that govern the quality of finished products
• To take necessary efforts to measure and reduce the
rejection percentage
• To judge the possibility of rework of defective parts
33. Effects of elements of metrology on Precision and
Accuracy
• Factors affecting the standard of measurement
• Coefficient of thermal expansion
• Elastic properties of a material
• Stability with time
• Calibration interval
• Geometric compatibility
• Factors affecting the workpiece to be measured
• Coefficient of thermal expansion of material
• Elastic properties of a material
• Cleanliness, surface finish, surface defects such as scratches,
waviness etc.,
• Adequate datum on the workpiece
• Thermal equalization
34. Errors in Measurement
• An error may be defined as the difference between the
measured value and the actual value
• True value may be defined as the average value of an infinite
number of measured values
• Measured value can be defined as the estimated value of true
value that can be found by taking several values during an
experiment.
• Error in measurement=Measured value-True value
Errors in measurement can be expressed either as absolute or relative errors.
35. • Absolute Error: Absolute error is the algebraic difference
between the measured value and the true value of the
quantity measured. It is further subdivided into
• True absolute error
• The algebraic difference between the measured average value and
the conventional true value of the quantity measured is called true
absolute error
• Apparent Absolute error
• While taking the series of measurement, the algebraic difference
between one of the measured values of the series of measurement
and the arithmetic mean of all measured values in the same series
is called apparent absolute error.
• Relative Error: Relative error is the ratio of the absolute error to the value of comparison (usually the true value) used for the calculation of the absolute error.
• Absolute error = True value − Measured value = 300 − 280 = 20 units
• Relative error = Absolute error / True value = 20/300 ≈ 0.067 = 6.7%
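The worked example above can be written as a short sketch (plain Python; the 300/280 figures are the slide's own illustrative values):

```python
# Absolute and relative error for the slide's example:
# true value 300 units, measured value 280 units.
true_value = 300
measured_value = 280

absolute_error = true_value - measured_value   # 20 units
relative_error = absolute_error / true_value   # ratio to the true value

print(f"Absolute error: {absolute_error} units")
print(f"Relative error: {relative_error:.3f} ({relative_error:.1%})")
```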
37. • Gross Errors
• Gross errors are caused by mistakes in using instruments,
calculating measurements and recording data results.
• Eg: The operator reads the pressure gauge as 1.10 N/m2 instead of 1.01 N/m2
• This may be the reason for gross errors in the reported
data and such error may end up in calculation of the final
results, thus producing deviated results.
• Blunders
• Blunders are caused by faulty recording or due to a wrong
value while recording a measurement, or misreading a
scale or forgetting a digit while reading a scale.
38. • Measurement Error
• The measurement error is the difference between a measured value and the true value.
• The best example of measurement error: if an electronic scale is loaded with a 1 kg standard weight and the reading is 1002 grams, then
• The measurement error = (1002 grams − 1000 grams) = 2 grams
• Measurement Errors are classified into two types: systematic
error and random errors
Systematic error
• The errors that occur due to fault in the measuring device are
known as systematic errors.
• These errors can be removed by correcting the measuring device.
• These errors may be classified into different categories.
• Instrumental Errors
• Environmental Errors
• Observational Errors
39. Instrumental Errors
• Instrumental errors occur due to faulty construction of the measuring instruments.
• These errors may occur due to hysteresis or friction.
• In order to reduce these errors in measurement, different
correction factors must be applied and in the extreme
condition instrument must be recalibrated carefully.
Environmental Errors
• The environmental errors occur due to some external
conditions of the instrument.
• External conditions mainly include pressure, temperature,
humidity or due to magnetic fields.
• To reduce the environmental errors
• Try to maintain the humidity and temperature constant in the
laboratory by making some arrangements.
• Ensure that there shall not be any external electrostatic or
magnetic field around the instrument.
40. Observational Errors
• As the name suggests, these errors occur due to wrong observation or reading of the instruments, particularly in the case of energy meter readings.
• The wrong observations may be due to PARALLAX.
• In order to reduce the PARALLAX error highly accurate meters are
needed: meters provided with mirror scales.
Theoretical Errors
• Theoretical errors are caused by simplification of the model system. For example, if a theory states that the temperature of the surroundings will not change the readings when it actually does, this factor becomes a source of error in measurement.
Random Errors
• These are errors due to unknown causes; they occur even when all systematic errors have been accounted for.
• These are caused by any factors that randomly affect the
measurement of the variable across the sample.
41. Methods of Measurement
• Direct method
• In this method, the quantity to be measured is directly
compared with the primary or secondary standard.
• This method is widely employed in the production field.
• In this method, a very slight difference exists between the
actual and the measured values because of the limitation
of the human being performing the measurement.
• Indirect method
• In this method, the value of quantity is obtained by
measuring other quantities that are functionally related to
the required value.
• Measurement of the quantity is carried out directly and
then the value is determined by using a mathematical
relationship.
• Eg: angle measurement using sine bar
42. • Fundamental or absolute method
• In this method, the measurement is based on the
measurements of base quantities used to define the
quantity.
• The quantity under consideration is directly measured and is
then linked with the definition of that quantity.
• Comparative method
• The quantity to be measured is compared with the known
value of the same quantity or any other quantity practically
related to it.
• The quantity is compared with the master gauge and only
the deviations from the master gauge are recorded after
comparison.
• Eg. Dial indicators
43. • Transposition method
• This method makes the measurement by direct comparison: the quantity to be measured (V) is initially balanced by a known value (X) of the same quantity. Then V and X are transposed, and V is balanced again by another known value (Y). The measured quantity is then the geometric mean of the two known values:
• V = √(X × Y)
• Eg. Determination of mass by balancing methods
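As a sketch of the transposition result V = √(XY) (plain Python; X and Y are made-up balancing values for illustration):

```python
import math

# Transposition method: the unknown V is balanced by X, then the
# positions are swapped and V is balanced by Y. The unknown is the
# geometric mean of the two known values.
X = 4.0   # first balancing value (illustrative)
Y = 9.0   # second balancing value (illustrative)

V = math.sqrt(X * Y)
print(V)  # geometric mean of 4 and 9
```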
• Coincidence methods
• In this method, a very minute difference between the
quantity to be measured and the reference is determined by
careful observation of certain lines and signals
• Eg: Vernier caliper
44. • Deflection method
• This method involves the indication of the value of the
quantity to be measured by deflection of a pointer on a
calibrated scale.
• Eg. Pressure measurement
• Null measurement method
• In this method, the difference between the value of the
quantity to be measured and the known value of the same
quantity with which comparison is to be made is brought to
be zero.
• Substitution method
• This method involves the replacement of the value of the
quantity to be measured with a known value of the same
quantity, so selected that the effects produced in the
indicating device by these two values are the same.
45. • Contact method
• In this method, the surface to be measured is touched by the
sensor or measuring tip of the instrument.
• Eg. Micrometer, Vernier calliper and dial indicator
• Contactless method
• As the name indicates, there is no direct contact with the
surface to be measured
• Eg. Tool makers microscope, profile projector
• Composite method
• The actual contour of a component to be checked is
compared with its maximum and minimum tolerance limits.
• Cumulative errors of the interconnected elements of the
component which are controlled through a combined
tolerance can be checked by this method.
• This method is very reliable to ensure interchangeability and
is effected through the use of composite GO gauges.
46. General characteristics of a Measuring Instrument in Engineering Metrology
47. General characteristics in Metrology
Sensitivity: It is the ratio of the magnitude of output signal to the
magnitude of input signal. It denotes the smallest change in the
measured variable to which the instrument responds.
Sensitivity=(Infinitesimal change of output signal)/(Infinitesimal
change of input signal)
If the input-output relation is linear, the sensitivity will be constant
for all values of input.
If the instrument is having non-linear static characteristics, the
sensitivity of the instrument depends on the value of the input
quantity.
48. Hysteresis: All the energy put into a stressed component when loaded is not recovered upon unloading. Hence, the output of a measurement system will partly depend on its previous input signals; this is called hysteresis.
Range: It is the minimum and maximum values of a quantity which an instrument is designed to measure; the region between which the instrument is to operate is called the range.
Range = Lower Calibration Value to Higher Calibration Value = Lc to Hc
Span: It is the algebraic difference between the higher calibration value and the lower calibration value.
Span = Hc − Lc
Ex: If the range of an instrument is 100 °C to 150 °C, its span is 150 °C − 100 °C = 50 °C
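The range/span example can be sketched in a few lines (plain Python, using the slide's 100-150 °C instrument):

```python
# Range and span from the calibration limits (slide's example).
lower_cal = 100.0    # Lc, °C
higher_cal = 150.0   # Hc, °C

span = higher_cal - lower_cal   # algebraic difference Hc - Lc
print(f"Range: {lower_cal} °C to {higher_cal} °C")
print(f"Span : {span} °C")
```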
49. Response Time: It is the time which elapses after a sudden change in
the measured quantity, until the instrument gives an indication
differing from the true value by an amount less than a given
permissible error.
Speed of response of a measuring instrument is defined as the
quickness with which an instrument responds to a change in input
signal.
Repeatability: It is the ability of the measuring instrument to give the
same value every time the measurement of a given quantity is
repeated.
It is the closeness between successive measurements of the
same quantity with the same instrument by the same operator over a
short span of time, with same value of input under same operating
conditions.
51. Stability: The ability of a measuring instrument to retain its calibration over a long period of time is called stability. It determines an instrument's consistency over time.
Backlash: Maximum distance through which one part of an
instrument may be moved without disturbing the other part.
Accuracy: The degree of closeness of a measurement compared to
the expected value is known as accuracy.
Precision: A measure of the consistency or repeatability of measurement, i.e. successive readings do not differ. The ability of an instrument to reproduce its readings again and again in the same manner for a constant input signal.
Magnification: Human limitations in reading instruments place a limit on the sensitivity achievable. Magnifying the signal from a measuring instrument can make it more readable.
52. Resolution: The minimum change in the input signal required to cause an appreciable change, or increment, in the output; equivalently, the minimum value that can be detected when the input is gradually increased from a non-zero value.
Error: The deviation of the measured value from the true value is called error.
Drift: The variation of change in output for a given input over a
period of time is known as drift.
Threshold: Minimum value of input below which no output can be
appeared is known as threshold.
Reliability: Reliability may be defined as the probability that a system will perform satisfactorily for at least a given period of time when used under stated conditions. The reliability function is this probability expressed as a function of the time period.
54. Calibration is a comparison of two instruments against each other, one being the
standard (the calibrator). This process is essential to document the error of the
instrument being calibrated and to increase its accuracy.
Calibration
55. Accuracy:
Accuracy may be defined as the ability of an instrument to respond to a true value of the measured variable under the reference conditions. It refers to how closely the measured value agrees with the true value. The difference between the measured value and the true value is known as the error of measurement.
Precision:
Precision may be defined as the degree of exactness for which an instrument is designed or intended to perform. It refers to the repeatability or consistency of measurement when the measurements are carried out under identical conditions.
57. Accuracy is how close a measured value is to the actual (true) value.
It is closeness with the true value of the quantity being measured.
Precision is how close the measured values are to each other.
It is a measure of the reproducibility of the measurement.
Accuracy v/s Precision
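The distinction can be shown numerically. In the sketch below (plain Python; both reading sets are invented for illustration), accuracy appears as a small bias of the mean from the true value, and precision as a small spread between readings:

```python
import statistics

# Two illustrative sets of readings for a true value of 10.00 mm.
true_value = 10.00
accurate_but_imprecise = [9.6, 10.4, 9.7, 10.3, 10.0]         # mean on target, scattered
precise_but_inaccurate = [10.31, 10.30, 10.32, 10.29, 10.30]  # tight, but offset

for name, readings in [("accurate but imprecise", accurate_but_imprecise),
                       ("precise but inaccurate", precise_but_inaccurate)]:
    bias = statistics.mean(readings) - true_value  # accuracy: closeness to truth
    spread = statistics.stdev(readings)            # precision: reproducibility
    print(f"{name}: bias = {bias:+.3f} mm, spread = {spread:.3f} mm")
```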
58. Accuracy v/s Precision

# | Accuracy | Precision
1 | It is closeness with the true value of the quantity being measured. | It is a measure of the reproducibility of the measurement.
2 | Accuracy of measurement means conformity to truth. | The term precise means clearly or sharply defined.
3 | Accuracy can be improved. | Precision cannot be improved.
4 | Accuracy depends upon simple techniques of analysis. | Precision depends upon many factors and requires many sophisticated techniques of analysis.
5 | Accuracy is a necessary but not sufficient condition for precision. | Precision is a necessary but not a sufficient condition for accuracy.
59. Standards
Types of standards
• Line standard
• Standard yard
• Standard metre
• End standard
• End bar
• Slip gauges
• Wavelength standard
60. 1. Line Standard: When length is
measured as the distance
between centers of two
engraved lines
2. End Standard: When length is measured as the distance between two flat parallel faces
3. Wavelength Standard: When length is expressed in terms of the wavelength of light.
Types of Measurement Standards
61. Line standard
The measurement of distance may be made between two parallel lines or two surfaces.
When a length is measured as the distance between the centres of two engraved lines, it is called a line standard.
Standard yard
62. • Yard is made of a one inch square cross section
bronze bar and is 38 inches long
• The bar has a round recess (gold plug) of 0.5 inches
diameter and 0.5 inches deep. The gold plug is 1
inch away from both the ends
• The highly polished top surfaces of these plugs
contain three transversely and two longitudinally
engraved lines lying on the neutral axis
• The yard is the distance between the two central transverse lines on the plugs when the temperature of the bar is constant at 62 °F
• To protect the gold plug from accidental damage, it
is kept at the neutral axis as the neutral axis
remains unaffected even if the bar bends
63. Standard metre
• The metre is the distance between the centre portions
of two lines engraved on the polished surface of bar
made up of platinum (90%) and iridium (10%) having a
unique cross section
• The web section gives maximum rigidity and economy
in the use of costly material
• The upper surface of the web is inoxidizable and is given a good finish, as required for quality measurement
• The bar is kept at 0 °C and under normal atmospheric pressure.
64. Characteristics of Line Standards
1. Scale can be accurately engraved but it is difficult to take full advantage
of this accuracy.
2. Quick and easy to use over a wide range since only one is required.
3. The scale markings are not subject to wear, although significant wear on the leading end leads to “undersizing”.
4. A scale does not possess a built-in datum which would allow easy alignment of the scale with the axis of measurement; this again leads to undersizing.
5. Scales are subject to the parallax effect, a source of both positive and negative reading errors.
6. Scales are not convenient for close-tolerance length measurement except in conjunction with microscopes.
66. End standard
The need for end standards arose because line standards and their copies were difficult to use at various places in workshops.
These are usually in the form of end bars and slip gauges
• End bar
• End bars are made of steel with a cylindrical section of 22.2 mm diameter; the end faces are hardened and lapped. They are available in various lengths.
• Flat, parallel-faced end bars are widely used as the most practical end standards for measurement.
• These are used for measurement of larger sizes
• Slip gauges
• Slip gauges are rectangular blocks of hardened and stabilized high
grade cast steel
• The length of a slip gauge is strictly the dimension which it measures
• The blocks after being manufactured are hardened to resist wear and
are allowed to stabilize to release internal stresses
• A combination of slip gauges enables the measurements to be made in
the range of 0.0025mm to 100mm but in combinations with end bars,
the measurement range upto 1200mm is possible.
67. Characteristics of End Standards
1. End standards are highly accurate and are well-suited to measurements of
close tolerance.
2. They are time consuming in use and provide only one dimension at a time.
3. Dimensional tolerance as small as 0.0005 mm can be obtained.
4. They are subjected to wear on their measuring faces.
5. Groups of blocks are wrung together to provide a given size, faulty
wringing leads to damage.
6. End standards have a built in datum because their measuring faces are
flat and parallel.
7. They are not subjected to the parallax effect as their use depends on
“feel”
69. Line Standards v/s End Standards

# | Characteristic | Line Standards | End Standards
1 | Principle | Length is measured between two lines. | Length is expressed between two parallel flat surfaces.
2 | Measurement | Simple and quick. | Time for measurement is more.
3 | Error | Possibility of parallax error. | Error due to improper wringing or change in temperature.
4 | Accuracy | Up to ± 0.2 mm unless provided with a magnifying glass. | ± 0.001 mm.
5 | Manufacture and cost | Comparatively simple to manufacture and cheaper. | Complex method of manufacturing and higher cost.
6 | Effect of use | End of the scale is subject to wear. | Due to continuous use, measuring faces get worn out; special end pieces may be needed.
70. Wavelength standards
• Line and end standards are physical standards and
are made up of materials that can change their size
with temperature and other environmental
conditions
• In search for such suitable unit of length, wave
length source is established
• Laser is used as primary level wavelength standard
• According to this standard, a metre is defined as equal to 1650763.73 wavelengths of the orange-red radiation of the krypton-86 isotope
1 metre = 1650763.73 wavelengths
1 yard = 0.9144 m = 0.9144 × 1650763.73 = 1509458.3 wavelengths
71. Metal Standards v/s Wavelength Standards

# | Characteristic | Metal Standards | Wavelength Standards
1 | Effect of environment | Influenced by variation of environmental conditions like temperature, pressure, humidity, etc. | Not affected by environmental conditions.
2 | Replacement after damage | If damaged, exact copies cannot be made. | Can be easily reproduced.
3 | Wear and tear | Affected by wear and tear. | Not subject to wear and tear.
4 | Availability of replicas | Exact replicas are not available elsewhere. | Identical copies can be kept in all standard laboratories.
5 | Security requirement | Required to be preserved or stored under strict surveillance to prevent damage or destruction. | No such security requirements.
72. Concept of Interchangeability
Interchangeability can be defined as a system of producing mating parts such that any part selected at random will assemble with any mating part. Formerly, a single operator was confined to assembling a complete unit from a number of parts, which took a long period of time and was not economical. To reduce cost and time, mass-production systems were developed.