
Waters: Reviewing Audit Trail Information in Empower Chromatography Data Software

This presentation provides an overview of how to review audit trail information within Waters Empower Chromatography Data Software. (From Inform 2016, our annual software users meeting)



Slide 1: Reviewing Information Stored in Empower Audit Trails
Heather Longden / Fiona O'Leary (©2016 Waters Corporation)
Slide 2: NIST: Review of Audit Trails
- Audit trails need to be available and convertible to a generally intelligible form and regularly reviewed (Annex 11, §9); Part 11 "agency review".
- From a NIST publication*:
  - "Audit trails are a technical mechanism that help managers maintain individual accountability. Users are less likely to attempt to circumvent security policy if they know that their actions will be recorded in an audit log."
  - "Determine how much review of audit trail records is necessary."
- Increasing appearance of Warning Letter observations.

* Introduction to Computer Security: The NIST Handbook
Slide 3: Data Integrity Guidances
- US FDA: Level 2 Guidance on fda.gov, 2012, updated in 2015
- US FDA: Draft Data Integrity Guidance, April 2016
- MHRA: Data Integrity Definitions and Guidance, March 2015
- WHO: Good Data and Record Management Guidance, draft, September 2015
Slide 4: FDA Draft Guidance, Question 7: How often should audit trails be reviewed?
- Reviewed with each record and before final approval of the record.
- But this does not apply to all audit trails. Audit trails subject to this review include, but are not limited to:
  - the change history of finished product test results,
  - changes to sample run sequences,
  - changes to sample identification,
  - changes to critical process parameters (note: "process", not "processing", parameters).
- For other audit trails: routine scheduled audit trail review based on the complexity of the system and its intended use.
- Question 8: By whom? Personnel responsible for record review.
Slide 5: MHRA Guidance, March 2015
- The effort and resource assigned to data governance should be commensurate with the risk to product quality, and should also be balanced with other quality assurance resource demands.
- As such, manufacturers and analytical laboratories are not expected to implement a forensic approach to data checking on a routine basis, but instead design and operate a system which provides an acceptable state of control based on the data integrity risk, and which is fully documented with supporting rationale.
Slide 6: MHRA Guidance, March 2015
- Where computerised systems are used to capture, process, report or store raw data electronically, system design should always provide for the retention of full audit trails to show all changes to the data while retaining previous and original data.
  - It should be possible to associate all changes to data with the persons making those changes, and changes should be time stamped and a reason given.
  - Users should not have the ability to amend or switch off the audit trail.
- If no audit-trailed system exists, a paper-based audit trail to demonstrate changes to data will be permitted until a fully audit-trailed system (an integrated system, or independent audit software using a validated interface) becomes available.
  - These hybrid systems are currently permitted, where they achieve equivalence to the integrated audit trail described in Annex 11 of the GMP Guide.
  - If such equivalence cannot be demonstrated, it is expected that facilities should upgrade to an audit-trailed system by the end of 2017.
Slide 7: MHRA Guidance, March 2015
- The relevance of data retained in audit trails should be considered by the company to permit robust data review/verification. The items included in the audit trail should be those of relevance to permit reconstruction of the process or activity.
- It is not necessary for audit trail review to include every system activity (e.g. user log on/off, keystrokes, etc.), and review may be achieved by review of designed and validated system reports.
- Audit trail review should be part of the routine data review/approval process, usually performed by the operational area which has generated the data (e.g. the laboratory). There should be evidence available to confirm that review of the relevant audit trails has taken place.
Slide 8: MHRA Guidance, March 2015
- When designing a system for review of audit trails, this may be limited to those with GMP relevance (e.g. relating to data creation, processing, modification and deletion, etc.). Audit trails may be reviewed as a list of relevant data, or by a validated "exception reporting" process.
- QA should also review a sample of relevant audit trails, raw data and metadata as part of self-inspection to ensure ongoing compliance with the data governance policy/procedures.
Slide 9: WHO Draft Guidance, September 2015
- An audit trail is a process that captures details such as additions, deletions or alterations of information in a record, either paper or electronic, without obscuring or over-writing the original record.
- An audit trail facilitates the reconstruction of the history of such events relating to the record, regardless of its media, including the "who, what, when and why" of the action.
  - For example, in a paper record, an audit trail of a change would be documented via a single-line cross-out that leaves the original entry legible and records the initials of the person making the change, the date of the change and the reason for the change, as required to substantiate and justify the change.
  - In electronic records, secure, computer-generated, time-stamped audit trails at both the system and record level should allow for reconstruction of the course of events relating to the creation, modification and deletion of electronic data. Computer-generated audit trails shall retain the original entry and document the user ID, the time/date stamp of the action, and the reason for the action, as required to substantiate and justify the action (a minimal sketch of such an entry follows this slide).
- Computer-generated audit trails may include discrete event logs, history files, database queries or reports, or other mechanisms that display events related to the computerized system, specific electronic records or specific data contained within the record.
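The "who, what, when and why" structure described above maps naturally onto a single record type. Below is a minimal sketch of what one computer-generated audit trail entry could look like; the AuditEntry class and all field names are hypothetical illustrations, not Empower's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)  # frozen: an entry is never modified once written
class AuditEntry:
    """One computer-generated audit trail entry (hypothetical layout)."""
    record_id: str            # which record (sample, result, method) was touched
    action: str               # "create", "modify" or "delete"
    field: Optional[str]      # which field changed (None for create/delete)
    old_value: Optional[str]  # the original entry, retained, never overwritten
    new_value: Optional[str]  # the value after the change
    user_id: str              # who performed the action
    timestamp: datetime       # when: secure, computer-generated time stamp
    reason: str               # why, as entered by the user

entry = AuditEntry(
    record_id="SAMPLE-0042",
    action="modify",
    field="sample_weight_mg",
    old_value="10.1",
    new_value="10.7",
    user_id="hlongden",
    timestamp=datetime(2016, 5, 4, 14, 32, tzinfo=timezone.utc),
    reason="Transcription error corrected against balance printout",
)
print(entry.old_value, "->", entry.new_value)  # original value still available
```

The frozen dataclass mirrors the expectation quoted above: once written, an entry is never amended, so the original record is never obscured.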
Slide 10: WHO Draft Guidance, September 2015
- Regular review of audit trails may reveal incorrect processing of data, help prevent incorrect results from being reported, and identify the need for additional training of personnel.
- All GxP records held by the GxP organization are subject to inspection by health authorities. This includes original electronic data and metadata, such as audit trails maintained in computerized systems.
- In addition, key personnel, including managers, supervisors and quality unit personnel, should be trained in measures to prevent and detect data issues.
  - This may require specific training in evaluating the configuration settings and reviewing electronic data and metadata, such as audit trails, for individual computerized systems used in the generation, processing and reporting of data.
- Supervisors responsible for reviewing electronic data should learn which audit trails in the system track significant data changes and how these might be most efficiently accessed as part of their review.
Slide 11: WHO Draft Guidance, September 2015
- It should be possible to associate all changes to data with the persons making those changes, and those changes should be time stamped and a reason for the change recorded. This traceability of user actions should be documented via computer-generated audit trails or in other metadata fields or system features that meet these requirements.
  - Users should not have the ability to amend or switch off the audit trails or alternate means of providing traceability of user actions.
  - Where a computerized system lacks computer-generated audit trails, persons may use alternate means such as procedurally controlled use of logbooks, change control, record version control or other combinations of paper and electronic records to meet GxP regulatory expectations for traceability, documenting the what, who, when and why of an action.
  - Procedural controls should include written procedures, training programmes, review of records, and audits and self-inspections of the governing process(es).
Slide 12: WHO Draft Guidance, September 2015
- Data review procedures should describe review of original electronic data and relevant metadata.
- Written procedures for review should require that persons evaluate changes made to original information in electronic records (such as changes documented in audit trails or history fields, or found in other meaningful metadata) to ensure these changes are appropriately documented and justified with substantiating evidence, and investigated when required.
Slide 13: WHO Draft Guidance, September 2015
- When determining a risk-based approach to reviewing audit trails in GxP computerized systems, it is important to note that some software developers may design mechanisms for tracking user actions on the most critical GxP data using metadata features that are not named audit trails, while using the naming convention "audit trail" for tracking other computer system and file maintenance activities.
  - For example, changes to scientific data may sometimes be most readily viewed by running various database queries, by viewing metadata fields labelled "history files", or by review of designed and validated system reports.
  - The files designated by the software developer as audit trails may therefore, on their own, be of limited value for an effective review.
- The risk-based review of electronic data and metadata, such as audit trails, requires an understanding of the system and the scientific process governing the data life cycle, so that the meaningful metadata is subject to review regardless of the naming conventions used by the software developer.
Slide 14: WHO Draft Guidance, September 2015
- Systems typically include many metadata fields and audit trails. It is expected that during validation of the system the organization will establish, based upon a documented and justified risk assessment, the frequency, roles and responsibilities, and approach to review of the various types of meaningful metadata, such as audit trails.
  - For example, under some circumstances an organization may justify periodic review of audit trails that track system maintenance activities,
  - whereas audit trails that track changes to critical GxP data with direct impact on patient safety or product quality would be expected to be reviewed each and every time the associated data set is reviewed and approved, and prior to decision-making.
- Systems may be designed to facilitate audit trail review by varied means; for example, the system design may permit audit trails to be reviewed as a list of relevant data or by a validated exception reporting process (a rough sketch follows this slide).
- Written procedures on data review should define the frequency, roles and responsibilities, and approach to review of meaningful metadata, such as audit trails.
  - These procedures should also describe how aberrant data are handled if found during the review.
- Persons who conduct such reviews should have adequate and appropriate training in the review process as well as in the software systems containing the data subject to review.
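As a rough illustration of the validated "exception reporting" idea above, a review tool might filter the full trail down to just the entries a reviewer must act on. This is a sketch only, assuming a hypothetical list-of-dicts trail layout and an invented set of critical fields; a real implementation would be driven by the documented risk assessment and would itself be validated.

```python
from typing import Dict, Iterable, List

# Fields judged critical in a (hypothetical) documented risk assessment
CRITICAL_FIELDS = {"sample_weight_mg", "sample_id", "injection_volume_ul"}

def exception_report(entries: Iterable[Dict]) -> List[Dict]:
    """Keep only the entries a reviewer must see: all deletions,
    plus modifications to critical fields (illustrative only)."""
    return [
        e for e in entries
        if e["action"] == "delete"
        or (e["action"] == "modify" and e.get("field") in CRITICAL_FIELDS)
    ]

trail = [
    {"action": "modify", "field": "sample_weight_mg", "user": "hlongden"},
    {"action": "modify", "field": "comment", "user": "foleary"},
    {"action": "delete", "field": None, "user": "admin"},
]
print(exception_report(trail))  # keeps the weight change and the deletion
```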
Slide 15: WHO Draft Guidance, September 2015
- Quality assurance should also review a sample of relevant audit trails, raw data and metadata as part of self-inspection to ensure ongoing compliance with the data governance policy/procedures.
  - Any significant variation from expected outcomes should be fully recorded and investigated.
- In the hybrid approach, which is not the preferred approach, paper printouts of original electronic records from computerized systems may be useful as summary reports if the requirements for original electronic records are also met.
  - To rely upon these printed summaries of results for future decision-making, a second person would review the original electronic data and any relevant metadata, such as audit trails, to verify that the printed summary is representative of all results.
  - This verification would then be documented, and the printout could be used for subsequent decision-making.
- The GxP organization may choose a fully electronic approach to allow more efficient, streamlined record review and record retention.
  - This would require that authenticated and secure electronic signatures be implemented for signing records where required.
  - It would also require preservation of the original electronic records, or a verified true copy, as well as the necessary software and hardware or other suitable reader equipment to view the records during the record retention period.
Slide 16: WHO Draft Guidance, September 2015
- Data review should be documented. For electronic records, this is typically signified by electronically signing the electronic data set that has been reviewed and approved.
- Written procedures for data review should clarify the meaning of the review and approval signatures, to ensure persons understand their responsibility as reviewers and approvers to assure the integrity, accuracy, consistency and compliance with established standards of the electronic data and metadata subject to review and approval.
Slide 17: Audit Trails in Empower
Slide 18: Empower Audit Trails
- Sample Audit Trail
  - Tracks changes to entered data about each sample.
- Result Audit Trail
  - Links results to the instruments, sample sets, methods, calibration curves and standards used in calibration.
  - Also traces any manual manipulation of data.
- Method Audit Trail
  - Keeps all versions of a method for recreation of results.
  - Records each change: before and after values, who, when and why.
  - Different versions can be compared to identify the differences (see the sketch after this slide).
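Empower compares method versions directly; purely to illustrate what a field-by-field comparison of two method versions yields, here is a generic sketch over hypothetical method dictionaries (the field names are invented, not Empower parameters).

```python
def diff_method_versions(old: dict, new: dict) -> dict:
    """Return {field: (old_value, new_value)} for every field that differs,
    including fields added or removed between versions (illustrative only)."""
    changed = {}
    for field in old.keys() | new.keys():
        if old.get(field) != new.get(field):
            changed[field] = (old.get(field), new.get(field))
    return changed

# Two hypothetical versions of the same instrument method
v3 = {"column_temp_c": 30, "flow_ml_min": 1.0, "wavelength_nm": 254}
v4 = {"column_temp_c": 35, "flow_ml_min": 1.0, "wavelength_nm": 254}
print(diff_method_versions(v3, v4))  # {'column_temp_c': (30, 35)}
```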
Slide 19: Empower Audit Trails
- Project Audit Trail
  - Gives an overview of all changes in a project.
  - Includes details of method/data deletion.
- System Audit Trail
  - Shows changes to system objects and system policies.
  - Details archive activity.
  - Notes all changes to security (users, user types, etc.).
  - Documents all successful and unsuccessful logins:
    - you have a history of who was logged into the application at any time;
    - you have information about break-in attempts;
    - each entry records the client at which the login or login attempt occurred (a summary sketch follows this slide).
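To show why the login history is useful during periodic review, here is a small sketch that tallies failed logins per user and client from a hypothetical extract of a system audit trail; the event layout and names are invented for illustration.

```python
from collections import Counter

# Hypothetical extract of a system audit trail: (user, client, success)
login_events = [
    ("hlongden", "LAB-PC-01", True),
    ("unknown",  "LAB-PC-07", False),
    ("unknown",  "LAB-PC-07", False),
    ("foleary",  "LAB-PC-02", False),
    ("foleary",  "LAB-PC-02", True),
]

# Tally failed logins per (user, client) to surface possible break-in attempts
failures = Counter((u, c) for u, c, ok in login_events if not ok)
for (user, client), n in failures.most_common():
    print(f"{n} failed login(s) for '{user}' at {client}")
```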
Slide 20: Review of Audit Trails
- Review audit trails as part of the data review process:
  - Find anomalies before batch release.
  - Focus on user behaviour that affects results.
  - Peer review / manager review / QA review?
- Periodic review of overall/system-level audit trails:
  - Look for system-level activity without correct documentation, change control, testing or approval (e.g. changing system policies or user access, or deletion of data).
- Inspectors WILL look at the audit trails in electronic data systems.
- Biggest issue: audit trails are often a log of all activity (to comply) rather than designed for easy review.
Slide 21: Review of Audit Trails

Review audit trails electronically:
- Use the tools (if any) built into the CDS.
- Review as PART of the data/integration/method review.
- Write a clear SOP defining which audit trails to review and when (only flagged or suspicious results?).
- Signing results includes a declaration of electronic review.

Print audit trails:
- Include data-relevant audit trails in regular reports.
- Periodically print out system-level audit trails to "review".
- Sign reports as "evidence" of review.
Slide 22: Adding audit trails to reports
Slide 23: Empower Review Tool
[Diagram: a unique result linked to everything needed to reconstruct it: e-Cord information, original instrument method, LC/GC system used, product code/stage, reagent, LIMS ID, who collected/processed/reviewed/approved it, when/what/why, the unchanged raw data file, standards used for calibration, sample sets, calibration curves, and the original processing method]
- Now includes access to Sample Set History and Audit Trails.
Slide 24: Result Audit Viewer Tool (new in Feature Release 2)
One-stop solution:
- Project Audit Trails
- Method History and Differences
- Sample History
- Sample Set History
- Acquisition Log
- Injection Log
Slide 25: How to Document Data Review, Including Audit Trails
- Review chromatograms, methods and relevant audit trails in the Empower application.
- Document that process by SIGNATURE:
  - Sign a report to document that you have followed the review SOP.
  - The SOP should document what to review and how it should be done for your role.
- Similar to other laboratory tasks where there is no proof of the activity (such as making mobile phases or sample preparation) other than a user attesting to their completion of the task:
  - "I sign this data to attest that I performed/reviewed/approved this data according to SOP 12345."
Slide 26: Annex 11 Periodic Evaluation
- Periodic reviews are used throughout the operational life of systems to verify that they remain compliant with regulatory requirements, fit for intended use, and meet company policies and procedures. (GAMP 5 definition)
- Annex 11, clause 11: "Computerised systems should be periodically evaluated to confirm that they remain in a valid state and are compliant with GMP. Such evaluations should include, where appropriate, the current range of functionality, deviation records, incidents, problems, upgrade history, performance, reliability, security and validation status reports."
Slide 27: Periodic Review
- It is like an internal audit of the compliance of the system:
  - Find concerns BEFORE the audit.
  - Find ways to improve the efficiency of systems and processes.
  - Provides documented evidence of actively searching for data integrity issues.
  - E.g. review the System Audit Trail for correct use of Admin functionality.
- Review major and minor changes to determine if any retesting or additional testing of new functionality is required:
  - Has its use significantly expanded or changed?
  - Is the system still in control and in a validated state?
- How often? Frequency may depend on maturity and criticality (every 3 to 18 months).
- A formal report must be written about the review; it is a regulatory requirement.
Slide 28: Reviewing Audit Trails
Task: as a team, discuss and write on the flip chart suggestions/mechanisms for:
1. Reviewing SOPs for data review
2. How often the reviews should be performed
