3. Quality Assurance
• The sum of the organized arrangements made with the objective of ensuring that products will be of the quality required for their intended use
4. Good Manufacturing Practices
• That part of Quality Assurance aimed at ensuring that products are consistently manufactured to a quality appropriate to their intended use
5. Quality Control
• That part of GMP concerned with sampling, specifications & testing, and with documentation & release procedures, which ensure that the necessary & relevant tests are performed & the product is released for use only after ascertaining its quality
6. QA and QC
• QA: all those planned or systematic actions necessary to provide adequate confidence that a product will satisfy the requirements of quality
• QC: the operational laboratory techniques and activities used to fulfill the requirements of quality
7. QA and QC
• QA is company based
• QC is lab based
9. Introduction and Background
The concepts of Statistical Process Control (SPC)
were initially developed by Dr. Walter Shewhart of
Bell Laboratories in the 1920s, and were expanded
by Dr. W. Edwards Deming, who introduced SPC to
Japanese industry after WWII.
After early successful adoption by Japanese firms,
Statistical Process Control has now been
incorporated by organizations around the world as a
primary tool to improve product quality by reducing
process variation.
10. Statistical Process Control
• Monitoring quality by application of
statistical methods in all stages of
production
• Such methods are
– Based on theory of probability
and
– Relate qualitative and quantitative characteristics of a product to established standards
11. SQC is used for
• Estimating parameters
• Tests of significance
• Determining relationship between
factors
• Making decisions on the basis of
experimental evidence
12. Selection of statistical method
• Selection of appropriate method of
statistical analysis depends on
– Types of data or measurements
– Sampling techniques
– Design of experiments
– Types of sample distribution
13. SQC has been used to serve
• As a base for improved evaluation of
materials through more representative
sampling technique
• As a means of achieving sharper control in manufacturing processes
• To provide a logical approach to variations
• To evaluate the magnitude of chance variation in the quality of a product
• To detect assignable variations in product quality
14. Procedure
• The procedure consists of
– Proper sampling
– Determining the variations in the samples
– Drawing conclusions about the entire batch from the observed data
– The data pattern, once obtained, may be utilized to predict the limits within which future data can be expected to fall as a matter of chance, and to determine when significant variations in the process have taken place (a minimal calculation is sketched below)
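As a rough illustration of this procedure, the following sketch (Python with NumPy; the assay figures are invented for illustration) summarizes a set of sample measurements from a batch and predicts the limits within which future values would be expected to fall as a matter of chance:

    import numpy as np

    # Hypothetical assay results (% of label claim) from samples drawn across a batch
    samples = np.array([99.2, 100.4, 98.7, 101.1, 99.8, 100.6, 99.5, 100.0])

    mean = samples.mean()
    sd = samples.std(ddof=1)   # sample standard deviation

    # Limits within which future values are expected to fall by chance alone
    lower, upper = mean - 3 * sd, mean + 3 * sd
    print(f"mean = {mean:.2f}, sd = {sd:.2f}")
    print(f"expected chance range: {lower:.2f} to {upper:.2f}")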
15. Data analysis
• Data can be analyzed using suitable
method of analysis e.g.
– t-test
– Analysis of variance (ANOVA)
– Inference is based on the P value (significance level 0.05); see the sketch below
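A minimal sketch of such an analysis in Python (using SciPy; the three groups of measurements are invented for illustration) compares two batches with a t-test and all three with one-way analysis of variance, judging significance against P = 0.05:

    import numpy as np
    from scipy import stats

    batch_a = np.array([99.1, 100.2, 99.6, 100.8, 99.9])
    batch_b = np.array([101.0, 101.4, 100.7, 101.9, 101.2])
    batch_c = np.array([99.4, 99.8, 100.1, 99.0, 99.7])

    # Two-sample t-test: do batches A and B differ in mean?
    t_stat, p_t = stats.ttest_ind(batch_a, batch_b)

    # One-way analysis of variance across all three batches
    f_stat, p_f = stats.f_oneway(batch_a, batch_b, batch_c)

    alpha = 0.05
    print(f"t-test: p = {p_t:.4f} ->", "significant" if p_t < alpha else "not significant")
    print(f"ANOVA:  p = {p_f:.4f} ->", "significant" if p_f < alpha else "not significant")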
16. Chance variations
• These variations are inevitable because any programme of production and inspection has its own chance causes of variation, which cannot be controlled or eliminated and often cannot be identified
17. Assignable variations
• These variations can usually be
detected and corrected by statistical
techniques
• Such variations are usually caused by a particular machine, a specific batch, or a container (see the simulation sketched below)
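The distinction can be illustrated with a small simulation (Python with NumPy; the target, sigma, and shift are invented for illustration): a process subject only to chance causes stays within limits set at three standard deviations, while an assignable cause such as a drifted machine setting pushes values outside them.

    import numpy as np

    rng = np.random.default_rng(0)
    target, sigma = 100.0, 0.5
    ucl, lcl = target + 3 * sigma, target - 3 * sigma

    # Chance (common-cause) variation only
    stable = rng.normal(target, sigma, size=20)

    # Assignable cause: the machine drifts 2 units high for one batch
    shifted = rng.normal(target + 2.0, sigma, size=20)

    print("stable batch, points outside limits: ", int(np.sum((stable > ucl) | (stable < lcl))))
    print("shifted batch, points outside limits:", int(np.sum((shifted > ucl) | (shifted < lcl))))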
18. Process Variability
• In order to work with any distribution, it is
important to have a measure of the data
dispersion or spread. This can be expressed
by the range (highest minus lowest), but is
better captured by the standard deviation
(sigma).
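Both measures are easy to compute for a small sample; the sketch below (Python with NumPy; the values are invented for illustration) contrasts the range with the standard deviation:

    import numpy as np

    data = np.array([68.2, 69.1, 70.0, 68.7, 69.5, 70.3, 68.9])

    data_range = data.max() - data.min()   # highest minus lowest
    sigma = data.std(ddof=1)               # sample standard deviation

    print(f"range = {data_range:.2f}, sigma = {sigma:.2f}")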
19. Why Is Dispersion So Important?
• Often we focus on average values, but
understanding dispersion is critical to the
management of industrial processes.
Consider two examples:
• If a person puts one foot in a bucket of water at 33 °F and one foot in a bucket of water at 127 °F, on average he'll feel fine (80 °F), but he won't actually be very comfortable
20. • If a person is asked to walk through
a river and told that the average
water depth is 3 feet he might want
more information. If he is then told
that the range is from zero to 15
feet, he might want to re-evaluate
the trip.
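Both examples make the same point: the mean alone hides the spread. A tiny numeric illustration (the second pair of temperatures is a hypothetical comparison) shows two data sets with the same mean but very different dispersion:

    import numpy as np

    buckets = np.array([33.0, 127.0])      # one foot in each bucket
    comfortable = np.array([79.0, 81.0])   # hypothetical comfortable case

    for name, temps in [("buckets", buckets), ("comfortable", comfortable)]:
        print(f"{name}: mean = {temps.mean():.0f} F, "
              f"range = {temps.max() - temps.min():.0f} F")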
21. Control Limits
• Statistical tables have been developed for various
types of distributions that quantify the area under
the curve for a given number of standard deviations
from the mean, which can be used as probability
tables to calculate the odds that a given value is
part of the same group of data used to construct the
histogram
• Shewhart found that control limits placed at three
standard deviations from the mean in either
direction provide an economical trade-off between
the risk of reacting to a false signal and the risk of
not reacting to a true signal, regardless of the shape of the underlying process distribution (see the sketch below)
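A minimal control-limit calculation along these lines (Python with NumPy; the baseline data are invented for illustration) places the limits three standard deviations either side of the mean and flags any new value that falls outside them:

    import numpy as np

    # Measurements collected while the process was believed to be in control
    baseline = np.array([69.2, 68.5, 70.1, 69.8, 68.9,
                         69.4, 70.3, 68.7, 69.6, 69.0])

    center = baseline.mean()
    sigma = baseline.std(ddof=1)

    ucl = center + 3 * sigma   # upper control limit
    lcl = center - 3 * sigma   # lower control limit
    print(f"center = {center:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")

    # Any new measurement outside [LCL, UCL] is treated as a signal
    new_value = 72.4
    print("signal" if not (lcl <= new_value <= ucl) else "no signal")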
22. • If the process has a normal distribution,
99.7% of the population is captured by the
curve at three standard deviations from the
mean.
• Stated another way, there is only a 0.3%
chance of finding a value beyond 3 standard
deviations. Therefore, a measurement value
beyond 3 standard deviations indicates that
the process has either shifted or become
unstable (more variability).
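These percentages follow directly from the normal distribution, as a short check confirms (Python with SciPy):

    from scipy.stats import norm

    within = norm.cdf(3) - norm.cdf(-3)   # probability within +/- 3 sigma
    beyond = 1 - within                   # probability beyond +/- 3 sigma

    print(f"within 3 sigma: {within:.2%}")   # ~99.73%
    print(f"beyond 3 sigma: {beyond:.2%}")   # ~0.27%, i.e. about 0.3%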
23. • The illustration below shows a normal curve
for a distribution with a mean of 69, a mean
less 3 standard deviations value of 63.4, and
a mean plus 3 standard deviations value of
74.6. Values, or measurements, less than
63.4 or greater than 74.6 are extremely
unlikely. These laws of probability are the
foundation of the control chart.
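Working that example backwards, the implied standard deviation is (74.6 - 63.4) / 6, roughly 1.87, and the quoted limits follow directly:

    mean = 69.0
    sigma = (74.6 - 63.4) / 6      # implied standard deviation, about 1.87

    lower = mean - 3 * sigma       # 63.4
    upper = mean + 3 * sigma       # 74.6
    print(f"sigma = {sigma:.2f}, limits = {lower:.1f} to {upper:.1f}")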