Data integrity is the degree to which data are complete, consistent, accurate, trustworthy and reliable, and the degree to which these characteristics of the data are maintained throughout the data life cycle.
Data integrity is critical throughout the CGMP data life cycle, including the creation, modification, processing, maintenance, archival, retrieval, transmission, and disposition of data after the record’s retention period ends. Sound data integrity practices also support effective data management.
Data integrity in pharmaceutical laboratories is a must; the attached presentation should help QC members understand and develop an analytical culture built on integrity.
Data integrity, Pharmaceutical industry, Good Manufacturing Practice, GMP, Guidelines, Data management, DI and GMP Compliance, paper and electronic data, Archive and back up
ENSURING DATA INTEGRITY THROUGH "ALCOA": BASIC DATA INTEGRITY PRINCIPLES APPL..., by Abhijeet Waghare
Data integrity refers to the completeness, consistency and accuracy of data. Complete, consistent and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate. The acronym ALCOA, in use since the 1990s, is used by regulated industries as a framework for ensuring data integrity, and is key to Good Documentation Practice (GDP).
What is 21 CFR Part 11?:
21 CFR Part 11:
Allows industry to use electronic records and signatures as alternatives to paper records and handwritten signatures
21 CFR Part 11 applies:
To all FDA regulated environments
When using computers in the creation, modification, archiving, retrieval or transmission of data or records
To records required by predicate rules – GLP, GCP, GMP – that impact patient safety
To new and old systems
Purpose of Part 11
Ensure data is not corrupted or lost
Data is secure
Approvals cannot be repudiated
Changes to data can be traced
Attempts to falsify records are made difficult and can be detected
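As a small illustration of how "data is not corrupted" and "changes can be traced" are enforced in practice, here is a minimal Python sketch; it is not part of any Part 11 tooling, and the record text and function names are invented for the example. A cryptographic digest is stored when the record is created and recomputed on retrieval, so any change to the content is detectable:

```python
import hashlib

def record_digest(record_bytes: bytes) -> str:
    """Return a SHA-256 digest that can be stored alongside a record."""
    return hashlib.sha256(record_bytes).hexdigest()

def is_intact(record_bytes: bytes, expected_digest: str) -> bool:
    """Recompute the digest on retrieval to detect corruption or tampering."""
    return record_digest(record_bytes) == expected_digest

# Store the digest when the record is created...
original = b"Batch 042: assay result 99.2%"
stored_digest = record_digest(original)

# ...and verify it when the record is read back.
assert is_intact(original, stored_digest)
assert not is_intact(b"Batch 042: assay result 99.9%", stored_digest)
```

A real system would combine this with access controls and an audit trail; the digest alone detects alteration but does not identify who made it.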
Types of Systems
Two types of systems that come under 21 CFR Part 11 – closed and open systems
Closed and Open Systems:
What is a Closed system?
A system to which access is controlled by the persons responsible for the electronic records stored on it
What is an Open system?
A system to which access is not controlled by those responsible for the electronic records stored on it
21 CFR Part 11 Requirements:
21 CFR Part 11 lists the following controls for closed systems:
Validation
Device checks
Operational system checks
Accurate and complete copies
Accurate and ready retrieval
Limited access to systems and data
Authority checks
Electronic audit trail
Training/qualification of personnel
Accountability of signatures
Control over system documentation
Digital Signatures :
Use of digital signatures for open systems
Electronic Signatures
Requirements for signed electronic records
Linking records to signatures
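To illustrate what "linking records to signatures" can mean technically, here is a hedged Python sketch using an HMAC. The signing key, user name, and record text are all hypothetical; a real Part 11 system would manage keys and identities through validated, access-controlled infrastructure rather than literals in code:

```python
import hmac
import hashlib

# Hypothetical per-user signing key (in practice: a secure credential store).
SIGNING_KEY = b"analyst-jdoe-secret"

def sign_record(record: bytes, signer: str, key: bytes) -> str:
    """Bind the signer's identity to the record content via an HMAC."""
    message = signer.encode() + b"|" + record
    return hmac.new(key, message, hashlib.sha256).hexdigest()

record = b"QC release: batch 042 PASS"
signature = sign_record(record, "jdoe", SIGNING_KEY)

# Any change to the record content invalidates the linked signature...
assert sign_record(b"QC release: batch 042 FAIL", "jdoe", SIGNING_KEY) != signature
# ...and so does a different claimed signer.
assert sign_record(record, "mallory", SIGNING_KEY) != signature
```

Because the signer's identity is part of the signed message, the signature cannot be excised and reattached to a different record or a different user, which is the intent behind the linking requirement.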
Good documentation practice (commonly abbreviated GDP, though GDocP is recommended to distinguish it from "good distribution practice", also abbreviated GDP) is a term in the pharmaceutical industry describing the standards by which documents are created and maintained. Some GDocP standards are codified by various competent authorities; others are not, but are considered cGMP (with emphasis on the "c", or "current"). Some competent authorities release or adopt guidelines, and these may include non-codified GDocP expectations. Authorities inspect against these guidelines and cGMP expectations in addition to the legal requirements, and make comments or observations if departures are seen. In recent years, the application of GDocP has also been expanding to the cosmetic industry and to excipient and ingredient manufacturers.
Data Integrity Issues in Pharmaceutical Companies, by Piyush Tripathi
Data integrity refers to maintaining and assuring the accuracy and consistency of data over its entire life-cycle, and is a critical aspect to the design, implementation and usage of any system which stores, processes, or retrieves data.
Trends changed from non-compliance to RR --> gap to RR --> Data Integrity --> DIB --> Smart Audit & Smart Data.
(RR = Regulatory Requirements; DIB = Data Integrity Breach)
Take data integrity seriously, whether you are a small or a large organization. Your data is the heart of your business, and regulatory bodies are highly conscious of such issues. For beginners on this path, this short note can help a lot.
Data Integrity in a GxP-regulated Environment - Pauwels Consulting Academy, by Pauwels Consulting
On Tuesday, December 6, 2016, our colleague Angelo Rossi, Senior Regulatory Compliance Consultant, gave an interesting presentation about “Data Integrity in a GxP-regulated Environment” at the Brussels Office of Pauwels Consulting in Diegem.
In his presentation, Angelo covered definitions and concepts of data integrity, the change in regulatory focus, lessons learned from recent FDA warning letters, and important highlights of regulations and guidelines. Angelo also presented a practical example of data integrity for a computerized system.
Source Data expectations for the life sciences industry. Data integrity refers to the completeness, consistency, and accuracy of data. Complete, consistent, and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate.
This presentation contains information about the documentation system of pharmaceuticals. It was prepared for training on documentation at Drug International Limited (Herbal Division), based on WHO and ICH guidelines.
21 CFR regulations and their applicability in the industry, the FDA perspective on the same, and FDA checkpoints on 21 CFR regulations during inspections.
1)Data integrity refers to the accuracy and consistency (validity) of data over its lifecycle. Compromised data, after all, is of little use to enterprises, not to mention the dangers presented by sensitive data loss. For this reason, maintaining data integrity is a core focus of many enterprise security solutions.
2) The term data integrity refers to the accuracy and consistency of data. When creating databases, attention needs to be given to data integrity and how to maintain it. A good database will enforce data integrity whenever possible. For example, a user could accidentally try to enter a phone number into a date field.
3) The Techopedia definition of data integrity focuses on three key attributes: completeness, accuracy and consistency.
4) 8 Ways to Ensure Data Integrity
Perform Risk-Based Validation.
Select Appropriate System and Service Providers.
Audit your Audit Trails.
Change Control.
Qualify IT & Validate Systems.
Plan for Business Continuity.
Be Accurate.
Archive Regularly.
5) Maintaining data integrity requires an understanding of the two types of data integrity: physical integrity and logical integrity. Both are collections of processes and methods that enforce data integrity in both hierarchical and relational databases.
6) Data Integrity (DI) in the pharmaceutical manufacturing industry is the state where data are Attributable, Legible, Contemporaneous, Original, Accurate, Complete, Consistent, Enduring, and Available (ALCOA+)
7) Data integrity helps in building trust between regulatory agencies and the industry as a whole. It eliminates the need for inspecting each and every process involved in the production and supply of drugs and other pharmaceutical products.
8) The 21 CFR rules are a set of regulations that govern the management and use of electronic records in pharmaceuticals and medical devices.
9) Data integrity is defined as "the extent to which all data are complete, consistent and accurate throughout the data lifecycle" and is fundamental in a pharmaceutical quality system, which ensures that medicines are of the required quality.
10) For example, a user could accidentally try to enter a phone number into a date field. If the system enforces data integrity, it will prevent the user from making these mistakes. Maintaining data integrity means making sure the data remains intact and unchanged throughout its entire life cycle.
According to the FDA, data should meet certain fundamental elements of quality. Whether they're recorded on paper or electronically, source data should follow ALCOA: an acronym used in clinical research standing for attributable, legible, contemporaneous, original and accurate.
CCK Discussion Forum held at ICCBS, University of Karachi, attended by over a hundred registered, experienced pharmaceutical professionals from a dozen pharmaceutical manufacturing facilities.
2. OUTLINES
What is Data Integrity?
• Defining data and metadata
• The concept and basics of data integrity
• ALCOA principles
• Mistakes versus falsification or fraud
How to implement Data Integrity?
• Practices; top failures and ways to avoid them
4. DEFINING DATA AND METADATA
Data
Information derived or obtained from raw data; facts, figures and statistics collected together for reference or analysis. (MHRA, 2018)
Raw data is defined as the original record (data), the first capture of information, whether recorded on paper or electronically.
5. METADATA
• Metadata are data used to describe other data.
• They can describe information such as file type, format, author, user rights, etc., and are usually attached to files but invisible to the user.
• For example, author, date created, date modified, and file size are examples of very basic document metadata.
• Having the ability to filter through that metadata makes it much easier to locate a specific document.
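As a small illustration of the basic file metadata described above, this Python sketch writes a temporary file and reads back its size and modification time; the file contents are arbitrary and exist only for the example:

```python
import os
import datetime
import tempfile

# Create a small file on disk...
with tempfile.NamedTemporaryFile(delete=False, suffix=".txt") as f:
    f.write(b"assay results")
    path = f.name

# ...then read back its metadata: size and last-modified timestamp.
info = os.stat(path)
print("size (bytes):", info.st_size)
print("modified:", datetime.datetime.fromtimestamp(info.st_mtime).isoformat())

os.remove(path)  # clean up the temporary file
```

It is exactly this kind of attribute (size, timestamps, ownership) that file systems and document management tools expose for filtering and searching.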
6. AUDIT TRAIL
A secure, computer-generated, time-stamped electronic record that allows for reconstruction of events relating to the creation, modification, or deletion of an electronic record.
Example: the audit trail for an HPLC run could include the username, date/time of the run, integration parameters used, details of a …
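The audit-trail elements above (secure, computer-generated, time-stamped, who/what/when) can be sketched as a minimal append-only log entry; the field names and HPLC details here are hypothetical, not any vendor's actual format:

```python
import json
import datetime

def audit_entry(user: str, action: str, details: dict) -> str:
    """Build one audit-trail line recording who did what, and when."""
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "action": action,
        "details": details,
    }
    return json.dumps(entry)

# In a real system the trail would be an append-only, access-controlled store;
# a plain list stands in for it here.
trail = []
trail.append(audit_entry(
    "jdoe",
    "modify_integration_parameters",
    {"instrument": "HPLC-01", "parameter": "peak_width", "old": 0.04, "new": 0.02},
))
print(trail[-1])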
7. INTEGRITY
Being honest, even when nobody is watching.
Quality: doing it right when nobody is looking.
Integrity: the quality of being honest and having strong moral principles; internal consistency or lack of corruption.
8. DATA INTEGRITY
Data integrity is the degree to which data are complete, consistent, accurate, trustworthy and reliable, and the degree to which these characteristics of the data are maintained throughout the data life cycle.
DATA LIFE CYCLE: from initial data generation and recording through processing (including transformation or migration), use, retention, archiving, retrieval and destruction.
9. REGULATORS' VIEW
A data integrity breach breaks the trust between industry and regulatory agencies.
Between inspections, at batch release, etc., the premise is: we trust you to do the right thing when the regulatory agency is not watching.
If inspectors find compliance gaps, regaining trust can be a costly and time-consuming task.
Karen Takahashi, Senior Policy Adviser to the US FDA
10. DATA INTEGRITY - PURPOSE
• Assures the quality, safety and efficacy of drugs
• A documented record is available to represent the quality of the product after it is sold
• Reliability of the data is important
• Questioning data integrity = loss of trust
• Submitting false data to the FDA is a criminal violation
• The FDA has a "zero tolerance" policy for data integrity
11. WHAT IS A DATA INTEGRITY BREACH?
1. Falsification / fabrication
2. Dishonest / malicious acts
3. Hiding information
4. Bad practice: shortcuts, etc.
12. KNOW THE DIFFERENCE BETWEEN POOR/BAD PRACTICES AND FALSIFICATION
Poor/bad practices:
• Human errors, data entered by mistake
• Ignorance (not aware of regulatory requirements, or poor training)
• Errors during transmission from one computer to another
• Changes due to software bugs or malware of which the user is unaware
• Use of non-validated software applications/spreadsheets
• Discarding source documents after accurate transcription
• Hardware malfunctions
Falsification:
• Willful falsification of data, or fraudulent data (with the intent to deceive)
• Selection of good or passing results (exclusion of poor or failing results)
• Unauthorized changes to post-acquisition data
• Overwriting, changing the name/data
13. FDA FINDINGS RELATED TO DATA INTEGRITY
• Backdated/postdated/missing/mismatched signatures
• Data manipulation / data falsification
• Copying existing data as new data
• Not saving, or deleting, electronic data after printing chromatograms
• Disposing of the original hard copies
• Not reporting failures and deviations
• Releasing failing product
• Hiding/obscuring/withholding critical information, etc.
• Mismatch between reported data and actual data
14. DATA INTEGRITY – REGULATORY
REQUIREMENT
FDA September 1991: Application Integrity Policy – Fraud, Untrue
Statements of Material Facts, Bribery, and Illegal Gratuities
FDA Guidance for Industry April 2016: Data Integrity and Compliance
With CGMP
MHRA Guidance March 2018: GXP Data Integrity Guidance and
Definitions
WHO Guidance September 2015: Good Data and Record Management
Practices
PIC/S Guidance November 2018: Good Practices for Data Management and
Integrity in Regulated GMP/GDP Environments
EMA Questions & Answers August 2016
15. MHRA - DATA INTEGRITY DEFINITIONS
AND GUIDANCE
Handwritten entries should be made in
a clear, legible, indelible way.
Records should be made or completed
at the time each action is taken and in
such a way that all significant activities
concerning
the manufacture of medicinal products
are traceable.
Any alteration made to the entry on a
document should be signed and dated;
the alteration should permit the
reading of the original information.
Where appropriate, the reason for the
alteration should be recorded.
16. DATA INTEGRITY AS PER USFDA
Data integrity is critical to regulatory compliance, and is the
fundamental reason for 21 CFR Part 11.
Many regulatory bodies, such as the FDA, Health Canada and the EMA,
recommend the use of:
ALCOA
to ensure good documentation practices in pharmaceuticals
18. ATTRIBUTABLE
Attributable means it is recorded who performed an action and when. This
can be recorded manually by initialing and dating a paper record, or by the
audit trail in an electronic system.
It is important to ensure a signature log is maintained to identify the
signatures, initials and/or aliases of people completing paper records.
For example:
During a validation, test results should be initialed and dated by the
person executing the test.
Adjustment of a setpoint on a process or monitoring system should
be made by an authorized user and the details of the change logged
in an audit trail.
A correction on a lab record should be initialed and dated to show
who made the adjustment and when.
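The audit-trail idea above can be sketched as a minimal append-only log. This is an illustrative sketch only: the `record_change` function and field names are hypothetical, and real CDS/LIMS systems implement the audit trail internally with secured storage.

```python
from datetime import datetime, timezone

audit_trail = []  # append-only log; a real system keeps this in secured storage

def record_change(user_id, action, old_value, new_value):
    """Append a who/what/when entry so every change stays attributable."""
    audit_trail.append({
        "user": user_id,                    # who performed the action
        "action": action,                   # what was done
        "old": old_value,                   # value before the change
        "new": new_value,                   # value after the change
        "timestamp": datetime.now(timezone.utc).isoformat(),  # when it happened
    })

record_change("analyst_01", "setpoint adjusted", 25.0, 27.5)
print(audit_trail[0]["user"])  # -> analyst_01
```

The key design point is that entries are only ever appended, never edited or deleted, so the original information always remains readable alongside any alteration.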
19. ATTRIBUTABLE MISTAKES
•Common or shared user IDs and passwords
•Audit trail disabled: the person who performed or changed an
activity cannot be identified.
•The administrator account is simply “Admin”, so the actual user cannot
be identified.
•An analyst does not log out of the HPLC PC; the subsequent analysis is
performed by a second analyst under the same login.
•Design of forms/records: the BPR has no space for recording
observations, additional information, or signatures.
•Two people perform an activity but only one person signs.
20. LEGIBLE
Readability
All data recorded must be legible (readable) and
permanent.
Ensuring records are readable and permanent assists with
their accessibility throughout the data lifecycle. This
includes the storage of human-readable metadata that
may be recorded to support an electronic record.
For example:
GDP will always promote the use of indelible ink when
completing records.
When making corrections to a record, ensure a single line
is used to strike out the old record. This ensures the
record is still legible.
Controlling your paper records/forms and formatting
them such that there is ample room for the information to
be recorded.
21. CONTEMPORANEOUS
Contemporaneous means to record the result,
measurement or data at the time the work is
performed. Date and time stamps should flow in
order of execution for the data to be credible.
Data should never be back dated.
For example:
If executing a validation protocol, tests should be
performed, and their results recorded as they happen
on the approved protocol.
Data that is logged, or testing that is performed, electronically
should be date- and time-stamped automatically by the system.
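The requirement that date and time stamps flow in order of execution can be sketched as a simple check. The `timestamps_in_order` helper is hypothetical, and the example assumes ISO-8601 timestamp strings, which sort correctly as plain text.

```python
def timestamps_in_order(stamps):
    """Return True if timestamps are non-decreasing, i.e. flow in execution order."""
    return all(a <= b for a, b in zip(stamps, stamps[1:]))

# ISO-8601 strings compare correctly as plain text
run = ["2024-05-01T09:00:00", "2024-05-01T09:15:00", "2024-05-01T09:05:00"]
print(timestamps_in_order(run))  # -> False: the last entry is earlier, a backdating red flag
```

An out-of-order sequence does not prove falsification, but it is exactly the kind of anomaly a data integrity review should detect and investigate.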
22. CONTEMPORANEOUS MISTAKES
Second person /witness ( eg weight) enter the data by observer at the actual
time; but second person only signing at end of the shift.
Electronic version of the excel output saved on personal drive and printed in
a later time.
Time clock is not available/ accessible where the activity is performed. Eg.
maintenance activity at near by /away
Unavailability of form, raw data sheet and log books right place.
Recording data in white paper /scrap papers / post it and entered the data
in actual record later
Non compliance with Good documentation practices (back date /forward
date).
23. ORIGINAL
Original data, sometimes referred to as source data or primary data, is the
medium in which the data point is recorded for the first time. This could be a
database, an approved protocol or form, or a dedicated notebook. It is
important to understand where your original data will be generated so that its
content and meaning are preserved.
For example:
Ensure validation test results are recorded on the approved protocol.
Recording results in a notebook for transcription later can introduce errors.
If your original data is handwritten and needs to be stored electronically,
ensure a “true copy” is generated, the copy is verified for completeness and
then migrated into the electronic system.
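One common way to verify that a migrated true copy is bit-identical to the original is a checksum comparison. This is an illustrative sketch, not a prescribed method; the `checksum` helper and the sample record content are hypothetical.

```python
import hashlib

def checksum(data: bytes) -> str:
    """SHA-256 checksum; matching digests show two copies are bit-identical."""
    return hashlib.sha256(data).hexdigest()

# Digest of the original is stored with the record at migration time;
# the archived copy can then be re-verified at any later point.
original = b"Assay result: 99.2% - analyst_01 - 2024-05-01"
true_copy = b"Assay result: 99.2% - analyst_01 - 2024-05-01"
print(checksum(original) == checksum(true_copy))  # -> True: copy verified complete
```

Note that a checksum only confirms the file content is unchanged; the completeness of the scan itself (all pages, legible) still needs a documented human verification.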
24. ACCURATE
For data and records to be accurate, they should be free from errors,
complete, truthful and reflective of the observation. Editing should
not be performed without documenting and annotating the
amendments.
For example:
Use a witness check for critical record collection to confirm accuracy
of data.
Consider how to capture data electronically and verify its accuracy.
Build accuracy checks into the design of the electronic system.
Place controls/verification on manual data entry, for example,
temperature results can only be entered within a predefined range of
0-100°C.
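The predefined-range control on manual data entry described above can be sketched as follows. The `validate_temperature` helper is hypothetical; the 0-100 degC limits come from the slide's example.

```python
def validate_temperature(value, low=0.0, high=100.0):
    """Reject manual entries outside the predefined range before they reach the record."""
    if not (low <= value <= high):
        raise ValueError(
            f"Temperature {value} degC is outside the allowed range {low}-{high} degC"
        )
    return value

print(validate_temperature(37.5))  # -> 37.5 (accepted)
# validate_temperature(150.0) would raise ValueError, forcing a documented correction
```

Building the check into the system at the point of entry is what makes it an accuracy control: an implausible value is stopped before it becomes part of the record.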
25. ACCURATE
X An operator records a passing value for an IPC result, even though they
never performed the test, because they know this attribute never fails.
X The actual result is failing, so the data is discarded and the system is
adjusted to get passing results and avoid an OOS.
X Flow meter readings are recorded with the “typical” value, rather
than the actual (start and end) values.
X Data is recorded on paper, but during transcription the
numbers are accidentally reversed.
X Data from a passing run is renamed and used for a different
sample to ensure a result within specification.
26. + COMPLETE
X Selected data (deviations/OOS) is deleted and only the desired
data retained.
X Worksheets/notebooks are not reconciled or controlled.
X Data is printed out without the instrument ID, analyst name, method
name, or date or time of analysis.
X Three technicians work on a complex calibration, but only one
person’s name is on the record.
X The data printout is retained as raw data, while the original
meaningful metadata is discarded.
27. + CONSISTENT
X Batch record steps are filled in inconsistently, based on the
operator’s time.
X Recorded information may be ambiguous about the process or data,
due to inadequate design of the worksheet/format,
e.g. parallel vs. sequential activities.
X The system flashes the results, and the results disappear before the
operator can record the data, e.g. the rpm of a reactor/centrifuge.
X The system allows data to be previewed prior to naming or saving the
record.
28. + ENDURING
X Thermal paper is used for equipment printouts, but durable copies are
not made.
X New software is installed for the system, but existing data cannot
be retrieved because it was created with the old version.
X Poor print quality of reports/BPRs.
X Data is recorded in a temporary manner and forgotten, e.g. QC
chemists write on butter paper, post-it notes, etc.
X Data from the system is not stored / no backup is taken.
29. + AVAILABLE
OOS results are hideout in separate folder and frequently deleted.
Files are not backed up, and data is deleted from the system
periodically .
Records are not archived until its complete retention period.
Validated spreadsheet is not backed-up.
30. TIPS FOR DATA INTEGRITY -
IMPLEMENTATION
Establish a “Data Integrity policy”.
Describe DI and the consequences of a DI breach / falsification of data.
Train staff on the DI policy and procedures.
Establish GDP so that even the most innocent recording issues cannot
be perceived as fraudulent.
Design systems to prevent DI breaches.
Keep the BPRs / logbooks at the workplace to assess and record.
Control templates / formats / blank papers.
Set proper user access and enable the audit trail.
Connect recorders / printouts, and provide access to a clock for
recording time.
31. CONCLUSION
In the pharmaceutical industry, data integrity plays an important role in
maintaining the quality of the final product, because poor practices can
allow substandard product to reach patients; it is therefore necessary
to have a system in place that ensures data integrity, data traceability,
and reliability. From a quality perspective, data integrity is a critical
component of a Quality System. Quality data provides the basis for the
company’s confidence that it is using correct data and operating in
accordance with regulatory requirements.
Data integrity is critically important to regulators for various reasons,
including patient safety and process and product quality. The integrity
and trustworthiness of the data provide a baseline for the regulators’
opinion of the company.
It is also the manufacturer’s responsibility to prevent and detect
poor data integrity practices, which occur due to a lack of quality
system effectiveness. A Quality Risk Management (QRM) approach can
prevent, detect and control potential risks where data is generated and
used to make manufacturing and quality decisions, ensuring it is
trustworthy and reliable.