Presentation on the examination of microbiological data for assessment and trending.
Includes: normalizing data, graphs, and assessment of alert and action levels.
3. Introduction
- Examples from environmental monitoring and water testing
- Broad and illustrative overview
- Written paper with more detail

4. Distribution of microbiological data
- Why study distribution?
  - Impact on sampling
  - Impact on trending
  - Impact upon the calculation of warning and action levels
5. Distribution
- Most statistical methods are based on the normal distribution, and yet…
- Most microbiological data does NOT follow a normal distribution

8. Distribution
- Microbial counts tend to be skewed (following a positive or negative exponential distribution)
- For example, a Water-for-Injection system…
11. Distribution
- Well:
  a) Use complex calculations and Poisson distribution tables, or
  b) Attempt to transform the data
- We'll go for the second option

12. Distribution
- A general rule is:
  - For low-count data, e.g. Grade A monitoring and WFI systems, take the square root
  - For higher-count data, e.g. Grade C and D environmental monitoring or a purified water system, convert the data into logarithms
17. Distribution
- Logarithms work in a similar way for higher counts
- Remember to add "+1" to zero counts (and therefore +1 to all counts) before taking the logarithm
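The transformation rules on slides 12 and 17 can be sketched in a few lines of Python; the colony counts below are invented for illustration:

```python
import math

def transform_low_counts(counts):
    """Low-count data (e.g. Grade A monitoring, WFI): take the square root."""
    return [math.sqrt(c) for c in counts]

def transform_high_counts(counts):
    """Higher-count data (e.g. Grade C/D, purified water): take log10,
    adding +1 to every count so that zero counts can be transformed."""
    return [math.log10(c + 1) for c in counts]

# Hypothetical weekly colony counts
wfi_counts = [0, 1, 0, 2, 1]        # low counts: square root
grade_d_counts = [120, 85, 0, 310]  # higher counts: log10(count + 1)

print(transform_low_counts(wfi_counts))
print(transform_high_counts(grade_d_counts))
```

The transformed values, not the raw counts, are then used for the trending and limit-setting methods that follow.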
18. Trend Analysis
- There is no right or wrong approach
- There are competing systems
- This presentation focuses on two approaches, both described as "control charts":
  - The cumulative sum chart
  - The Shewhart chart

19. Trend Analysis
- Control charts form part of the quality system
- They can be used to show:
  - Excessive variations in the data
  - How variations change with time
  - Variations that are "normally" expected
  - Variations that are unexpected, i.e. something unusual has happened

20. Trend Analysis
- Control charts need:
  - A target value, e.g. last year's average
  - Monitoring limits:
    - Upper limit
    - Lower limit
    - Control line / mean
- So the data can be monitored over time and in relation to these limits
21. Trend Analysis
- Of these:
  - The warning limit is calculated to represent a 2.5% chance of being exceeded
  - The action level is calculated to represent a 0.1% chance of being exceeded
  - So, if set properly, most data should remain below these limits
  - These assumptions are based on the NORMAL DISTRIBUTION
  - Various formulae, or validated software, can be used to set these limits
22. Trend Analysis
- Cumulative sum chart (cusum)
  - Suitable for large quantities of low-count data; it is very sensitive to small shifts
  - Shows shifts in the process mean
- Shewhart chart
  - Suitable for higher-count data; it shows large changes more quickly

23. Trend Analysis
- Cusums
  - Harder to interpret
  - Display the cumulative sum of a rolling average of three values, plotted in comparison with the target value
  - The direction and steepness of the slope are important
  - Significant changes are called "steps"
  - V-masks can be used to predict the future direction
24. Trend Analysis
- For example, a Grade B cleanroom
- Contact (RODAC) plates are examined
- A target of 0.2 cfu has been used, based on data from the previous year
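A minimal sketch of the cusum calculation described on slide 23, applied to a Grade B example like this one. The counts are invented; only the target of 0.2 cfu comes from the slide:

```python
def cusum(counts, target):
    """Cusum as described on slide 23: smooth the raw counts with a
    3-point rolling average, then accumulate the deviation of each
    smoothed value from the target."""
    smoothed = [sum(counts[i:i + 3]) / 3 for i in range(len(counts) - 2)]
    cusum_values, running = [], 0.0
    for value in smoothed:
        running += value - target
        cusum_values.append(running)
    return cusum_values

# Hypothetical Grade B contact-plate counts (cfu), target 0.2 cfu
counts = [0, 0, 1, 0, 0, 0, 2, 0, 1, 0]
print(cusum(counts, target=0.2))
```

A persistently rising slope suggests the process mean sits above the target; a flat line near zero suggests the process is on target.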
26. Trend Analysis
- Shewhart charts
  - Powerful for distinguishing between special causes and common causes
  - Common causes are inherent to the process and are long-term
  - Special causes are where something has changed, and may be long- or short-term
27. Trend Analysis
- Examples of special causes:
  a) A certain process
  b) A certain outlet
  c) A certain method of sanitisation, etc.
  d) Sampling technique
  e) Equipment malfunction, e.g. pumps, UV lamps
  f) Cross-contamination in the laboratory
  g) Engineering work
  h) Sanitisation frequencies
28. Trend Analysis
- For example, a Grade C cleanroom
  - Active air samples are examined
  - A target of 1.5, based on historical data
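A sketch of a Shewhart-style check for a Grade C example like this. The historical counts are invented, and the mean ± 3 standard deviations rule used here is the classic Shewhart convention, assumed rather than taken from the presentation:

```python
import statistics

def shewhart_limits(history, sigmas=3.0):
    """Centre line and control limits from historical data, using the
    classic Shewhart convention of mean +/- 3 standard deviations.
    The lower limit is floored at zero, since counts cannot be negative."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return mean, max(mean - sigmas * sd, 0.0), mean + sigmas * sd

def special_cause_points(data, lower, upper):
    """Indices of points outside the control limits: candidates for a
    special-cause investigation (slide 27 lists likely causes)."""
    return [i for i, x in enumerate(data) if x < lower or x > upper]

# Hypothetical Grade C active-air counts, averaging the target of 1.5
history = [1, 2, 1, 0, 2, 3, 1, 2, 1, 2]
centre, lcl, ucl = shewhart_limits(history)
print(special_cause_points([1, 2, 9, 1], lcl, ucl))  # flags the excursion
```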
30. Trend Analysis
- The previous charts were prepared using a statistical software package
- However, MS Excel can also be used
- The next example is of a WFI system
- Notice the data has been converted by taking the square root of each value
31. Trend Analysis
[Chart: trend of a WFI system over 62 weeks, with trend line; x-axis: number of weeks (1 to 61); y-axis: square root of mean count per week]
32. Trend Analysis
- Alternatives:
  - Individual Value / Moving Range charts
  - Exponentially Weighted Moving Average (EWMA) charts
  - These are useful where counts are NOT expected, e.g. Grade A environments
  - They look at the frequency of intervals between counts
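Of the alternatives listed above, the EWMA calculation is easy to sketch. The weighting `lam` and the counts below are illustrative choices of mine, not values from the presentation:

```python
def ewma(counts, lam=0.2, start=0.0):
    """Exponentially Weighted Moving Average: each plotted point is
    z_t = lam * x_t + (1 - lam) * z_(t-1), so every new count is
    blended with the weighted history. A small lam makes the chart
    sensitive to small, sustained shifts - useful where counts are
    rarely expected, such as Grade A monitoring."""
    z, out = start, []
    for x in counts:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

# Hypothetical Grade A counts: mostly zero, then a run of ones
print(ewma([0, 0, 0, 0, 1, 1, 1]))
```

The run of non-zero counts pushes the EWMA steadily upwards even though each individual count is unremarkable, which is exactly the kind of subtle change these charts are meant to catch.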
33. Trend Analysis
- Summary:
  - Cumulative sum chart: more sensitive to small process shifts; however, large, abrupt shifts are not detected as fast as with a Shewhart chart
  - Shewhart chart: systematic shifts are easily detected; however, the probability of detecting small shifts quickly is rather small
34. Limits
- Alert and action levels
- Based on PDA Technical Report No. 13 (2001):
  - Alert level: a level which, when exceeded, indicates that the process may have drifted from its normal operating condition. This does not necessarily warrant corrective action but should be noted by the user.
  - Action level: a level which, when exceeded, indicates that the process has drifted from its normal operating range. This requires a documented investigation and corrective action.
35. Limits
- Why use them?
  - To assess any risk (which can be defined as low, medium or high)
  - To propose any corrective action
  - To propose any preventative action

36. Limits
- "Level" is preferable to "limit"
- Limits apply to specifications, e.g. the sterility test
- Levels are used for environmental monitoring
37. Limits
- Regulators set "guidance" values, e.g. EU GMP, USP <1116>, FDA (2004)
- These apply to new facilities
- The user is expected to set their own levels based on historical data:
  - Not to exceed the published values
  - Many references state this
  - This reflects the views of the MHRA and FDA
38. Limits
- Things to consider:
  - The length of time that the facility has been in use
  - How often the user intends to re-assess or re-calculate the limits (yearly? two-yearly? and so on)
  - Custom and practice in the user's organisation (e.g. is there a preferred statistical technique?)
  - Levels should be calculated from an historical analysis of data
  - Using a statistical technique
39. Limits
- Historical data
  - Aim for a minimum of 100 results
  - Ideally one year, to account for seasonal variations

40. Limits
- Statistical methods:
  - Percentile cut-off
  - Normal distribution
  - Exponential distribution
  - Non-parametric tolerance limits
  - Weibull distribution
- Recommended by PDA Technical Report No. 13
41. Limits
- Assumptions:
  a) The previous period was "normal", and future excursions above the limits are deviations from the norm
  b) Outliers have been accounted for
42. Limits
- Percentile cut-off
  - Good for low-count data
  - May need to use frequency tables
  - May need to round up or down to the nearest whole zero or five
  - Warning level = 90th or 95th percentile
  - Action level = 95th or 99th percentile
43. Limits
- Percentile cut-off
  - Data is collected, sorted and ranked
  - The 90th percentile is the value below which 90% of the previous year's results fall; any future result that exceeds it is therefore higher than 90% of the historical data
  - Refer to PharMIG News No. 3 (2000) for excellent examples
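A minimal rank-based sketch of the percentile cut-off. The year of historical counts below is invented; real programmes should follow the worked examples cited above:

```python
import math

def percentile_cutoff(history, pct):
    """Rank-based percentile cut-off: sort the historical counts and
    take the value below which pct% of the results fall. Rounding to
    a 'round' number (nearest zero or five), as suggested on slide 42,
    is a separate, deliberate step."""
    ranked = sorted(history)
    rank = max(math.ceil(pct / 100 * len(ranked)), 1)
    return ranked[rank - 1]

# Hypothetical year of contact-plate counts (100 results)
history = [0] * 60 + [1] * 25 + [2] * 10 + [5] * 4 + [12]
print(percentile_cutoff(history, 95))  # candidate warning level
print(percentile_cutoff(history, 99))  # candidate action level
```

Note how the single high count of 12 does not inflate the levels; that robustness to outliers is part of the appeal of the percentile method for low-count data.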
44. Limits
- Normal distribution
  - Can only be used on data that is normally distributed!
  - Data could be transformed, but inaccuracies can creep in
  - Most data will be one-tailed, therefore adjust the 2nd and 3rd standard deviations to their one-tailed equivalents:
    - Warning level = mean + 1.645 × standard deviation
    - Action level = mean + 2.326 × standard deviation
45. Limits
- Negative exponential distribution
  - Suitable for higher-count data
  - Warning level: 3.0 × mean
  - Action level: 4.6 × mean
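The 3.0 and 4.6 multipliers are consistent with the 95th and 99th percentiles of an exponential distribution, since for that distribution P(X > k × mean) = exp(−k). That reading of where the factors come from is my assumption, sketched below:

```python
import math

# For an exponentially distributed count, the tail probability is
# P(X > k * mean) = exp(-k), so the percentile at level p sits at
# k = -ln(1 - p) times the mean:
warning_multiplier = -math.log(0.05)  # 95th percentile, approx. 3.0
action_multiplier = -math.log(0.01)   # 99th percentile, approx. 4.6

def exponential_levels(mean_count):
    """Warning/action levels for higher-count, exponential-looking data."""
    return warning_multiplier * mean_count, action_multiplier * mean_count

# Hypothetical purified-water mean count
warning, action = exponential_levels(25.0)
print(warning, action)
```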
46. Limits
- For all methods, do a "sore thumb" check by comparing the calculated levels with a histogram of the data
- Does it feel right?
47. Conclusion
- We have looked at:
  - Distribution of microbiological data
  - Trending
    - Cusum charts
    - Shewhart charts
  - Setting warning and action levels
    - Percentile cut-off
    - Normal distribution approach
    - Negative exponential approach
48. Conclusion
- Key points:
  - Most micro-organisms and microbial counts do not follow a normal distribution
  - Data can be transformed
  - Inspectors expect some trending and user-defined monitoring levels
  - Don't forget to be professional microbiologists: it isn't all numbers!
49. Just a thought…
- This has been a broad overview
- If there is merit in a more "hands-on" training course, please indicate this on your post-conference questionnaires