In-Process Audit Strategy
Master Audit PlanMaster Audit Plan
Business Risk Factors:
...
In-Process Audit Strategy
Benchmarks, Benchmarks, Benchmarks!
• Service Agreement / T...
In-Process Audit Strategy
Outputs and Deliverables  Follow the Process
• Approval of...
In-Process Audit Strategy
Outputs and Deliverables  Follow the Process
• Clinical Da...
In-Process Audit Strategy
Risk Focus
• Indicator Patterns and Anomalies
• Determinati...
Audit Fusion – Vendor Audit to Process Audit – Case Study
Background
• Biometrics
• V...
Process Audit Vendor Audit 1 (FSP) Vendor Audit 2
Management: Oversight
(Functional &...
“Fused Audit” Approach
• Audit Planning More Strategic
• Fewer Audits  Cost Saving
•...
Framework
Integrated view of sourcing picture
to identify economies
of scale opportun...
2nd exl Quality Oversight Conf Szpindor In Process Vendor Audit

Conducting audits of GCP service providers during the conduct of clinical studies once there are outputs, data, and deliverables to review.

  • Good afternoon. Thank you for the warm introduction. I’m happy to be attending this conference again this year and find it to be one of the few forums with a true focus on the so very important oversight aspects. Having had the distinct pleasure of hosting a recent MHRA inspection, I can tell you that the emphasis is not only on vendor oversight itself, but on the ability of your systems and documentation around oversight to demonstrate controls and measures.
  • Today, I’d like to share with you some experiences on how you might effectively use “in-process” audits as a mainstay of your QA program. To do that, we’ll cover the traditional qualification audit (to serve as a sort of baseline for the discussion), followed by some general assumptions and limitations associated with it. Then we’ll discuss a focus shift and the related organizational framework beyond QA needed to truly make an in-process audit approach effective. Next, the strategic considerations around these audits, and an example of a coordinated in-process audit conducted recently that spanned several providers and part of the internal organization – which for all intents and purposes was a fusion of a few types of audits, giving us the ability to see the complete end-to-end view. And finally, how all of this fits together to promote and support quality well beyond the audit.
  • Of course the lawyers have to have their say, and they didn’t appreciate it when I told them that this disclaimer sounded like the opening of a Law and Order episode. I can assure you that in-process audits are essentially the only type of vendor or partner audit we conduct outside of e-service providers; I am, however, not able to discuss specific outcomes in the slides. But we will touch upon examples throughout. OK?
  • So, in working through a traditional qualification audit, the process may look something like this. One of your study teams gets the OK to kick off their study. They embark on the RFP process and are able to narrow their choice for a suitable partner down to two parties, and QA is requested to audit the firm – hopefully, of course, with a little bit of notice and time. Typically, an audit plan is developed from the Statement of Work and the Transfer of Obligations. Depending on the exact scope, the audit may require 2 to 3 days onsite. The scope may include: the quality system and all associated components, a review of inspection history (hopefully with no severe agency actions), a review of select SOPs relevant to the TORO, interviews of personnel from the various functions (usually a higher-ranking person and not necessarily someone that will be assigned to your project), review of the training program and personnel qualifications, and perhaps some time to assess some of the infrastructure – hopefully. Through all this, the outcomes will include a discussion with your study team pertaining to the observations and risks, an audit report detailing the same, eventually a response from the vendor and several CAPAs, and stakeholders wanting to know the bottom line – can we place the study and initiate sites? Does this model look somewhat familiar?
  • Now, I am making some assumptions here that this is perhaps a transactional vendor that we are qualifying – that is, we don’t have an ongoing relationship and have no previous experience together, and there are no pre-conceived intentions or plans going in to work together again in the future. And, like it or not, therapeutic experience being relatively equal in this case, the selection drivers may have been the cost and the speed at which they could deliver FPFV or promise the final CSR. In these scenarios (and most of us have seen some variation of this), the focus of the audit tends to be narrowed. The audit team is there with a focus on one and only one study or program. Due to timing demands, infrastructure-related items may not get fully reviewed or tend to get deprioritized. Though I can see the overall requirements and curriculum for training, I can’t really tell how effective it is. I may be able to identify some gaps, but that’s about it. In reviewing SOPs and then asking personnel to describe the process, I should be able to get a pretty good read on how well folks know their procedures. I may be able to achieve a comfort level knowing that there aren’t any process gaps, but again, it won’t give me too many indications of how effective the procedures might be.
  • The bottom line is that my evaluation may give some predictors of future vendor performance, but you really don’t know and can’t predict the future. So what we have is a review of the framework and the generalities of how the processes are intended to work. Because we haven’t yet placed our study, I don’t have data or deliverables of any type to sample or gauge. How do I really know that the processes I just reviewed are robust enough and can consistently reproduce the same outcomes? Without anything to sample, I am a bit limited and can’t truly determine the effectiveness of the processes. A presentation by the vendor covering an overview of their processes would be almost as effective in some cases, because at the end of the day, do I really know whether the vendor can deliver or not? If I’m lucky, they may even point out things they have recently implemented or fixed.
  • Because of the pace, which just seems to be getting faster and even more demanding, we are challenged to work more collaboratively and to develop broader solutions.
  • If we are going to change our audit strategy, let’s first explore some broader considerations around the framework. It would be helpful if there were some semblance of a strategy in our firm for how work is sourced, an objective process for selection, a shift from transactional to strategic relationships, as well as seizing upon economies of scale. If my vendor is already doing 65% of all my data management, does it make sense to explore a full 100% functional service provider arrangement and have them help us with all of it? Maybe. From an evaluation standpoint, can I leverage the SMEs within the functions and on the teams to assess “fit for use” through the conduct of a functional assessment (which we’ll get into shortly) to determine that the vendor has the expertise, the capacity, and that their procedures meet or exceed our own standards? How is the vendor being effectively managed? There is a management saying that notes “what gets measured gets managed, and what gets managed gets done.” But with all of the talk these days about KPIs and KQIs, what is the right set of measurements for the firm we’re working with? Once this is defined, how do we ensure that the dashboard of measurements is useful to decision makers? How can we then use these indicators to focus our audits? What are they telling us? And finally, from a governance perspective, how are the messages that the indicators and audits are feeding us being tabled for discussion between the parties? I discussed the value of quality agreements last year at this conference – how can we use such a tool to further establish and clarify expectations? And then, how do we effectively transition this feedback into continuous improvement through a mechanism such as After-Action Reviews?
  • So let’s examine the functional assessment in some detail. Don’t call me Francis! Anyone a fan of that old classic movie Stripes? Well, don’t call me an audit, or confuse me with an audit, because I’m not an audit! An FA is function-based and function-focused, and is typically performed by several members of a given function or study team. They have the expertise to kick the tires and make that fit-for-use determination. The team conducting it should be aligned with the services that will be performed by the vendor. These folks have the technical expertise to assess the details in the processes; to know, for example, that the approach to developing programming code is consistent with our expectations in this area. Over time, you may even establish some away teams to perform these assessments that are not only SMEs, but have evolved the skills to really hone in on what matters most in evaluating a good partner.
  • So, what exactly is the FA attempting to accomplish? While onsite, we want to know about the firm’s capabilities – not only for the upcoming project, but what can they do for us in the future? How are they set up? Is their structure compatible with our organization? What are the competencies? Do they need to sub-contract in any areas? Is the sphere of delivery isolated to a specific region, or are they a true global player? Are the personnel qualified? How deep is the bench? Is turnover an issue? What are the provisions for transition? Do the procedures meet or exceed those of the sponsor? How will the hand-offs be managed? We talked a bit about management and governance on a previous slide, but specifically, how is the communication going to work? Issue escalation? And workload management? Will there be transparency to know that my study manager is also working on another sponsor’s study? How do we ensure service delivery alignment? What systems/tools are needed? These focus areas are quite different from what a qualification audit will typically yield. The FA is reported using a menu-driven approach where certain components must be included for all vendors (e.g., how they are organized, SOPs, infrastructure); then standard sections for each function (e.g., monitoring, PV, DM) are chosen depending on the services being assessed. Also included are any open items or items agreed to for follow-up.
  • As one of our wise founders noted, “Diligence is the mother of good luck”! The relationship will work best once there is an understanding of each of the parties, a transparent framework in which to communicate and operate, and a genuine trust and desire to work together as partners.
  • So how can an in-process audit strategy now be applied? It starts with a master plan that allows you to manage some key components of your program. For each provider, you are going to be assigning some weightings. First, a frequency assignment. You may conduct an in-process audit each year if the vendor is a strategic partner or is determined to be high risk (based both on what it is they are doing for you as well as the compliance risks of their own operation that you identify). Perhaps an audit every two years for those vendors deemed to be at a standard service level – a CRO we use on an ongoing basis, but not a strategic partner. Perhaps an audit every 3 years for those vendors where the work they are performing is deemed lower risk, lower volume, or some combination of both. We have an example for one of our secondary central lab partners – standard services, low volume, low risk; they help us with overflow capacity. GCP risk factors: What is the inspection and audit history over the last 2 years? Are there high risks? Open commitments? What are our global exposures to the various regulatory agencies in the way of differing requirements, proposed regulations, and inspection probability? The types of studies being conducted for us – are they key? For example, those traditionally classified as pivotal, but also including BE studies or any study that is part of a post-approval commitment. Quality and performance indicators – have we aggregated those metrics? What are they telling us? They may factor into our risk model; if we listen closely enough, they usually have a story to tell. If we were fortunate enough to have a quality agreement in place, what has been the compliance with the agreement? If we have instances where the agreement was not followed or complied with, we’d consider these risk areas to be factored into the model.
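The frequency assignment described in this note can be sketched as a small scoring function. This is purely an illustrative sketch: the risk-score scale and thresholds are hypothetical assumptions, not values from the presentation.

```python
# Hypothetical sketch of a risk-weighted audit frequency assignment.
# The scoring scale and thresholds below are illustrative assumptions.

def audit_frequency_years(is_strategic: bool, risk_score: int) -> int:
    """Map relationship type and an aggregate risk score (e.g. summed
    from inspection history, global exposure, study criticality, and
    indicator trends) to an audit cycle in years."""
    if is_strategic or risk_score >= 8:
        return 1  # annual audit: strategic partners and high-risk vendors
    if risk_score >= 4:
        return 2  # standard service level
    return 3      # low volume and/or low risk

# e.g. the secondary central lab: standard services, low volume, low risk
print(audit_frequency_years(is_strategic=False, risk_score=2))
```

The point of encoding it at all is that the assignment becomes auditable itself: the master plan can show why each vendor landed on a given cycle.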
  • As we operate in a multifaceted environment, we need to align our audit strategy to these factors as well. There are also business risk factors that can have major impacts, so they can’t be ignored. Again, the type of relationship plays a key role and may help us to deal with the “knowns and unknowns.” How do the services factor in? An FSP for an entire function, or a non-strategic niche provider conducting a type of DNA testing that no one else can provide to us – the risks may add up to be similar. Should the volume and overall spend factor in? You may want to factor in the spend; we have a fiduciary responsibility around how money is spent. We also mentioned global exposure under GCP risk, but from a business standpoint, can the vendor deliver consistently around the globe? This is a potential risk in both areas. How about safety? We’ve had a partner audit in Beirut on the books for 3 years now – I’m not going. M&A can’t be predicted. However, is it inherent in the vendor’s business model? Are they acquiring companies in order to grow? If a merger or acquisition was recently announced, how does the integration affect our project continuity? With contract adherence, I mentioned MHRA inspections – contract terms tended to be the key benchmark in some of those inspections. Do we have trouble with the parties adhering to the terms and carrying out their obligations? That has to be factored in.
  • For the in-process audits to be effective, they must utilize all of the available benchmarks. Quite simply: what are the expectations that have been defined, AND is there evidence to demonstrate compliance with these expectations? This is a fundamental pillar of the strategy. Another is: can you keep your promise? There will be a series of commitments stemming from previous inspections (responses and CAPAs), audits (CAPAs that note a commitment to modify a process, conduct some training, etc.), and governance (open items from FAs, issue trackers, relationship committee outcomes). It goes back to the trust and transparency of the relationship. You said you were going to do something – can you demonstrate it was fulfilled? This also applies to the sponsor side of the relationship and can’t be focused only on the provider.
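The “can you keep your promise?” check above amounts to scanning commitments for ones that are past due with no documented evidence. A minimal sketch, assuming a hypothetical record layout (the field names and dates are illustrative, not from the presentation):

```python
from datetime import date

# Hypothetical sketch: commitments from inspections, audits, and
# governance, flagged when past due with no evidence of fulfillment.

def open_commitments(commitments, as_of):
    """Return commitments due on or before `as_of` that still lack
    evidence -- candidates for follow-up at the next in-process audit."""
    return [c for c in commitments
            if c["due"] <= as_of and not c["evidence"]]

commitments = [
    {"source": "inspection response", "due": date(2015, 3, 1),
     "evidence": "SOP-114 v4 effective 2015-02-20"},
    {"source": "audit CAPA",          "due": date(2015, 5, 1),
     "evidence": None},
]
print(len(open_commitments(commitments, date(2015, 6, 1))))
```

Because the check is symmetric over whoever made the commitment, the same list can hold sponsor-side promises as well as provider-side ones.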
  • On this slide and the one following are 4 deliverable areas of focus for in-depth process review, and some examples of items that could possibly arise. - Approval of personnel. We want to ensure that, based on the agreement, our study leads are receiving qualified CVs; that, if required, phone interviews are conducted and there is evidence that the sessions are happening; that there is documented rationale as to why someone is selected for the team (or not); and that there is approval by our study lead as required by process. - There is an approved specification for the experience and skills required for each study. We want to ensure that the spec was applied or, if not, that the controls worked to note exceptions and the circumstances surrounding that particular case. There are also specific requirements around training – GCP, SOPs, protocol, and therapeutic training. This is generally low-hanging fruit in terms of being able to definitively determine whether these tasks were carried out as agreed. Shortcomings in this area, if any, usually have to do with key personnel from the Sponsor not being available to deliver, which causes a delay. - Monitoring reports. From a logistics standpoint, are the reports being written, reviewed, approved, and distributed per the approved monitoring plan? Evaluating content across a sample of sites for visit frequency, and whether the PI is making themselves available, may provide some insights into our overall level of oversight. By reviewing an aggregate of protocol violations, one may ask if the protocol might need to be amended OR if there is additional training required by site personnel, monitors, or both. Last year, when looking at the SDV approach across a sample of 50 sites, we noticed that monitors at about 10 sites were experiencing delays in getting complete files to monitor. It was not understood what was causing the delays, as the pattern was irregular – most were issues at study start, but many resurfaced again later.
As we dove deeper in an effort to determine root cause, it turned out that these were sites using electronic source, and the monitors were not able to verify until each subject’s file was printed, bound, and certified. This was also being done every 3 months for select components of the record. However, since the monitors eventually received what they needed in paper, the components of the trip report for use of e-Source were marked N/A, since the monitors weren’t permitted to verify from the system – only paper was used. At that time we were at a point in the study where co-monitoring had not yet begun and site audits were just being scheduled; we hadn’t been informed that approximately 20% of our sites were using e-Source, and we probably wouldn’t have known for several more weeks. This little item, along with several other unrelated but similar experiences, allowed us to re-examine our position on working with sites using e-Source and to clearly define our expectations to our partners.
  • As we all can probably agree, many aspects of clinical research tend to be a bit grayer than the other GxPs. However, within the DM, PRG, and Stats areas, things are typically better defined and work more like a production line. Many of the activities must take place in a pre-defined sequence in order to progress through the next checkpoint, so to speak. All of the deliverables are noted in specifications and plans, and each requires approval by both the vendor and Sponsor personnel. These areas, for example, clearly note the frequency and timeframes for data transfers. If we see a pattern relative to the benchmark that shows the CRO received 8 transfers from the central lab and has provided 2 so far to the Sponsor, when we know that the “Data Transfer Agreement” indicates that the CRO should have received 5 transfers from the lab and subsequently should have provided 5 transfers of blinded data to the Sponsor, we probably have some questions to explore. It could mean nothing more than a particular study falling a bit behind on the timing of certain deliverables. Or it could suggest that we have problems. What if it turns out that the lab had some challenges with its transfer to the CRO and the item was getting bounced back? They had to make some adjustments to the firewall on both ends. This delay also forced the CRO and lab to move some personnel around (some to solve the problem at hand, and others because they were promised for a new project starting on time), and this study was having delays. What if the data now coming into the Sponsor is unblinded, or doesn’t contain certain parameters that we need, but which were not included because the newer personnel weren’t all that familiar with the specifications – in fact, some of them never saw them? Could this happen? You bet! The point is that a simple review of the number of transfers against the benchmarks would have surfaced an anomaly that revealed a much bigger issue.
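The transfer-count reconciliation in this note can be sketched as a comparison of observed counts against the Data Transfer Agreement benchmark. The leg names and counts below are illustrative, mirroring the lab-to-CRO-to-Sponsor example above, and are not taken from any real agreement.

```python
# Hypothetical sketch of the DTA benchmark check: compare observed
# transfer counts on each leg against the agreed schedule.

def transfer_anomalies(agreed: dict, observed: dict) -> list:
    """Return (leg, agreed, observed) tuples wherever counts diverge
    from the DTA benchmark -- each a question to explore at the audit,
    not automatically a finding."""
    return [(leg, agreed[leg], observed.get(leg, 0))
            for leg in agreed
            if observed.get(leg, 0) != agreed[leg]]

agreed   = {"lab_to_cro": 5, "cro_to_sponsor": 5}
observed = {"lab_to_cro": 8, "cro_to_sponsor": 2}
print(transfer_anomalies(agreed, observed))
```

As the note stresses, a divergence only tells you where to dig; the root cause may be as benign as a timing slip or as serious as personnel churn and unreviewed specifications.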
  • In addition to using all of the benchmarks available and following the process to gauge compliance and verify commitments, the other area to consider in structuring these audits is the risk factors. We discussed some of these when determining the timing and frequency of the audits. But what about aggregating your metrics over time for a particular vendor or group of vendors? What kind of story would the analytics tell us? Similar to what we do with audit trending, this data has a story to tell. If patterns and anomalies emerge, they can be integrated into the in-process audit in an attempt to determine root cause. There are a number of reasons why SOPs need to be superseded or replaced – M&A activity and overhauls of SOP/QMS systems, to name a few. Consider using these events to look for SOP cohesiveness. Is there clear documentation that definitively articulates which procedures are in effect at any given point in time? Some of the infrastructure items can present real risks. However, in most cases there is too much to cover in one audit. Depending on the services being performed, attempt to prioritize and cover a portion at your first in-process audit, followed by additional components and follow-up of commitments at subsequent audits.
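One simple way to let aggregated metrics "tell their story," as described above, is a basic outlier screen over a vendor's indicator series. This is only a sketch under stated assumptions: the metric, the data, and the two-standard-deviation threshold are all hypothetical illustrations, not part of the presented program.

```python
# Hypothetical sketch: flag anomalous points in a vendor indicator
# series as candidates for root-cause follow-up at the next audit.

from statistics import mean, stdev

def flag_anomalies(series, threshold=2.0):
    """Return indices of points more than `threshold` standard
    deviations from the series mean."""
    mu, sd = mean(series), stdev(series)
    return [i for i, x in enumerate(series)
            if sd > 0 and abs(x - mu) > threshold * sd]

# e.g. monthly count of late data transfers for one vendor
late_transfers = [1, 0, 2, 1, 1, 0, 2, 1, 9, 1]
print(flag_anomalies(late_transfers))
```

A flagged month is an entry point for the audit, not a conclusion; the spike still needs the kind of root-cause digging the e-Source example illustrated.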
  • Let’s talk audit fusion. Sometimes serendipity is a beautiful thing! I’d like to stand up here and tell you about this insightful strategy that we had devised, but the reality is it simply became obvious, as we were planning the yearly MAP, that we needed to sync these audits together. It was clear that it was going to be a Biometrics (PRG/Stats) focused year. From a scheduling standpoint, we thought it made some sense to use the same auditors to look at both vendors. Then the light bulb went on: not only use the same audit team for both vendors and the internal process audit of the function, but integrate all three audits into one more strategic effort. As you can see, we have an FSP, a legacy provider performing a fairly high volume of work, and the internal component. The fusing of these audits gave us the end-to-end view of the entire process! A high-level plan was developed with 3 modules serving to detail the approach and focus of each partner audit and the function process audit. More so than usual, we placed a heavy emphasis on oversight. How and when were expectations set? Whose processes were used, and over what time periods? Were the hand-offs within the same provider, or between provider and Sponsor, smoothly executed, or did anything drop? We had the opportunity to assess how our people reviewed and accepted TLFs, performed QC, and raised issues to be addressed. But now we also had the ability to examine how the vendor dealt with any issues coming from Sponsor QC, requests for re-work (if any), and requests for ad-hoc analysis.
  • Now, I recognize that this is a busy slide. However, it is illustrating a slice of this audit in 4 areas that applied across each component – management, SAS Application, PRG, and Statistical Analysis. You should also know, from a sequencing perspective, that we started with the FSP first. This was a newer relationship and we were confident that expectations were well defined. The internal process audit was 2nd, followed by the legacy vendor. This approach allowed us to focus more on the expectation-setting and planning component with the FSP and also touch upon those items internally. Along the same lines, deliverables and any issues arising in the process audit could be assessed internally and then followed up once we got to vendor 2. What kind of things did this approach yield? Revisions to the SAP template were not implemented consistently across both vendor platforms – in one instance due to a team at V2 using the old versions, in the other due to the Sponsor introducing the former template to the FSP. Next, prior to all parties having access to the same hosted suite of SAS, we were able to see variances and in some cases shortcomings across the IQs for PC SAS. Expectations for the qualifications were detailed and agreed to, but not adhered to with the same level of consistency across all 3 areas. Next, a Sponsor SOP on macro development was to be used across all 3 areas; it had been revised, and we were able to see examples of its use in all 3 areas. However, because it was being revised as the FSP came online, the old version was not provided to them in order to minimize confusion. There were macros that had to be developed before the new version was posted to their portal. They had used their own procedure to fill the gap, due to the timelines given, which was not totally consistent with the standardized approach being sought. Though there were no critical observations or “show-stoppers”, you can see how this integrated approach may help reveal gaps and inconsistencies throughout the overall process.
  • The fused audit forces you to be a bit more strategic and to look more broadly across the entire spectrum. In some cases it could mean fewer audits – for example, we used to audit each study database; now these are part of in-process vendor audits. Make no mistake: these audits are more intense, require better planning and coordination, and may take more time to conduct. Depending on risk profile, the processes of certain services can be reviewed more frequently than others while working in some level of infrastructure review each time. The key is that it allows for a clearer view of the total picture: verify those things that are working well, and identify those that are not in compliance so that enhancements to the process or expectations can be implemented.
  • The in-process audits need to function in sync as part of a comprehensive quality strategy. The inner circle highlights the strategic areas that should be considered, while the outer-circle items pertain more to the tools and foundations at your disposal. Clearly, all of these items should be moving in the same direction with one another. Questions?

    1. Our purpose
       We enable people with life-altering conditions to lead better lives
       2nd exl Quality Oversight of Clinical Vendors Conference
       A Case Study – In Process Audit Strategy
       Stan Szpindor, M.S., Director, R&D Quality Assurance
    2. Agenda
       To be as brave as the people we help
       • Traditional Qualification Audit
       • Assumptions and Limitations of a Qualification Audit
       • Paradigm Shift and Related Framework
       • In-Process Audit Strategy
       • Audit Fusion – Vendor Audit to Process Audit – A Case Study
       • Supporting Quality Beyond the Audit
    3. Disclaimer
       As a component of this presentation is a “Case Study”, the names, facts, accounts, and outcomes have been altered to protect the confidentiality of the firms and personnel involved.
    4. Traditional Vendor Qualification Audit
       • Audit Plan Drafted from SOW
       • Two-day Onsite Audit
       • Scope Includes:
         • Quality Management System
         • Inspection History
         • Review of Select SOPs
         • Interview of Personnel
         • Training Program
         • Personnel Qualifications
         • Infrastructure (physical, IT)
       [Process diagram labels: Transactional Vendor Selected; QA Qualification Audit; Debrief with Sponsor Team; Comprehensive Audit Report; CAPAs; “Green Light” to Proceed (or Not)]
       Note: data covers timeframe of 1/1/2003 through 12/31/2010
    5. Assumptions and Limitations of a Qualification Audit
       • Transactional Vendor
       • Little to No Previous Experience with Vendor
       • No Pre-conceived Intentions of Future Work
       • Cost and Timing Selection Drivers
       • Narrow Audit Focus
       • Based on Service Area Needs of One Study or Program
       • Infrastructure-Related Items Barely Touched Upon at Best
       • Training Curriculum and Requirements (not effectiveness)
       • Reading of SOPs and Discussion of Processes
    6. Assumptions and Limitations of a Qualification Audit
       • Inability to Gauge Compliance and Efficiency
       • Review of SOPs Can’t Predict Real-Life Execution
       • Discussion of Quality Management System with No Real Ability to Know It Is Effective
       • No Data or Deliverables to Sample or Assess
       • “Dog & Pony Show” Please!
       At the End of the Day, Can I Truly Know with Confidence the Vendor WILL Deliver as Promised?
    7. We continue to move forward at a faster pace. It’s a timing thing!
    8. Paradigm Shift and Related Framework
       Sourcing Strategy Alignment
       • Seek an integrated view of the sourcing strategy across the study components and R&D functions
       • Ensure framework for objective vendor selection and periodic reviews
       • Consolidate base of providers from many Transactional to few Strategic
       • Identify economies of scale and functional service provider (FSP) opportunities
       Vendor Management
       • Select metrics for each vendor from comprehensive menu based on applicability of services being performed
       • Monitor metrics / KPIs / KQIs for vendors (in conjunction with functions and teams)
       • Report executive-level dashboard
       • Conduct in-process audits which gauge deliverables, outputs, and commitments
       Vendor Evaluation
       • Shift to Functional Assessments from Pre-Qualification Audits
       • Functions and Teams are accountable to determine “fit for use” based on evaluation of vendor SOPs, personnel, and capacity against project specifications
       • Provide standard approach and adaptable outcome capture tools for consistency across R&D functions and teams
       Vendor Governance
       • Establish Relationship Management Committee with cross-functional representation including QA
       • Utilize Quality Agreements to ensure alignment between outsourcing strategy and quality expectations
       • Regularly provide / discuss audit and inspection trends from studies
       • Hold After-Action-Reviews to foster continuous improvement
9. Anatomy of the Functional Assessment
• "Don't Call Me Francis!"
• The Assessment Team
  • Consider the availability and Subject Matter Expertise (SME) of your Team/Functions
  • Consider the availability of members of both your Team AND members of other Teams that are using the same Vendor
  • Representatives should be aligned with the requested outsourced services and should be functional-area experts or SMEs
  • Establish a pool of "Core Away Team" personnel who are Subject Matter Experts in Key Areas AND trained/experienced in conducting Functional Assessments
10. The Functional Assessment
• Onsite Activities
  • Confirm the Firm's Capabilities (project and beyond)
  • Organization – how it is structured, recent changes
  • Competencies – partners/sub-contracting, delivery sphere
  • Personnel – qualifications, turnover, transition provisions
  • Procedures – do the Vendor's SOPs meet or exceed Sponsor standards for your project?
  • Management – communication channels, senior management involvement, issue escalation, workload management
  • Functional-Level Service Delivery – compatibility, alignment, systems/tools, QC robustness
• Reporting
  • Customizable report containing "must" items as well as those relevant to the Vendor and service type
  • Notes a list of follow-up items
11. Benjamin Franklin: "Diligence is the mother of good luck"
12. In-Process Audit Strategy

Master Audit Plan

Frequency Assignment:
• Annual Audit = Strategic Partners, High-Risk Partners
• 2-Yr. Audit = Standard Level
• 3-Yr. Audit = Low-Volume Partners, Low-Risk Partners

GCP Risk Factors:
• Inspection History
• Audit History
• Global Exposure
• Key vs. Non-key Study
• Key Quality & Compliance Indicator Aggregates
• Quality Agreement Adherence
13. In-Process Audit Strategy

Master Audit Plan

Business Risk Factors:
• Relationship Type – Strategic vs. Transactional
• Service Type – Full Service, Niche Service, FSP
• Volume of Work and Overall Spend
• Global Exposure
• Merger & Acquisition Activity
• Key Performance Indicator Aggregates
• Contract Adherence
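The frequency-assignment rule in the Master Audit Plan (annual for strategic/high-risk partners, two years for the standard level, three years for low-volume/low-risk partners) can be sketched as a small scoring function. This is a hypothetical illustration: the factor names, scores, and thresholds below are assumptions, not the presenter's actual risk model.

```python
# Hypothetical sketch: map GCP and business risk factors to an audit cycle.
# Field names, weights, and cut-offs are illustrative assumptions only.

def audit_frequency_years(vendor: dict) -> int:
    """Return the audit cycle in years (1, 2, or 3) for a vendor profile."""
    # Strategic or high-risk partners are always on the annual cycle.
    if vendor.get("relationship") == "strategic" or vendor.get("high_risk"):
        return 1
    # Aggregate the remaining GCP and business risk factors into a score.
    score = 0
    score += vendor.get("inspection_findings", 0)      # GCP: inspection history
    score += vendor.get("audit_findings", 0)           # GCP: audit history
    score += 1 if vendor.get("global_exposure") else 0
    score += 1 if vendor.get("key_study") else 0       # key vs. non-key study
    score += 1 if vendor.get("recent_merger") else 0   # business: M&A activity
    # Low-volume partners with no risk flags fall to the 3-year cycle.
    if score == 0 and vendor.get("volume") == "low":
        return 3
    return 2 if score < 3 else 1

print(audit_frequency_years({"relationship": "strategic"}))  # 1
print(audit_frequency_years({"volume": "low"}))              # 3
```

In practice the plan would be refreshed as indicator aggregates and inspection histories change, so a vendor can move between cycles from year to year.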
14. In-Process Audit Strategy

Benchmarks, Benchmarks, Benchmarks!
• Service Agreement / TORO / Statement of Work Compliance
• Quality Agreement Compliance
• SOP / Work Instruction Compliance
• Functional Plan Compliance

Commitment Fulfillment
• Audits
  • Responses & CAPAs – Process & Structural Improvements noted in Sponsor Audits and select CRO Audits
  • Vendor Scorecard
• Inspections
  • Responses & CAPAs to Agencies
• Governance
  • Outcomes of Relationship Committees, Study-Level Issue Trackers

"Trust, but Verify"
15. In-Process Audit Strategy

Outputs and Deliverables – Follow the Process
• Approval of Study Team Personnel
  • Evidence of Qualification Review, Selection, and Approval
  • Evidence of ongoing GCP, SOP, Protocol, and Therapeutic Training
• Monitoring Reports
  • Logistics: timing, format, review, approval, distribution, storage
  • Content: visit frequency, SDV approach, PI interaction, issue identification/escalation/resolution
16. In-Process Audit Strategy

Outputs and Deliverables – Follow the Process
• Clinical Database Lock Package
  • Set-up, Edit Specification Development, EDC Screen Design, Testing/Validation, Data Transfers, Data Cleaning, Safety Reconciliation, QC, Freeze/Lock
• Programming and Statistics
  • Blind Break, SAS Datasets, Program Development, Macro Library and Controls, Mock Displays, Programming Outputs, Data Reviews, TLF Outputs, QC
17. In-Process Audit Strategy

Risk Focus
• Indicator Patterns and Anomalies
  • Determination of Systemic Items, Outliers, Root Cause
• SOP/Work Instruction Cohesiveness
  • M&A Impact at either Sponsor or Vendor
  • Clear Definition of Which Processes Are in Effect at Any Point in Time
  • SOP / QMS Improvement Initiatives
• Infrastructure (Baseline and Incremental Views)
  • Network Topology, Validation Program, Back-up/Recovery
  • Facility Security, Business Continuity
18. Audit Fusion – Vendor Audit to Process Audit – Case Study

Background
• Biometrics
  • Vendor 1: FSP, Audited Annually
  • Vendor 2: Biometrics Legacy Vendor, Audited Biennially
  • Process Audit: Biometrics Function on a Biennial Cycle
• A "Fused Audit"
  • Identification of a true end-to-end view combining the Vendor and Process Audits
  • Comprehensive Audit Strategy Plan with modules for the Process Audit and the two Vendor Audits
  • Focus on Expectation Setting, Oversight, Hand-offs, Process, Actual Deliverables, Internal Review/Acceptance, Issue Resolution, and Feedback – Sponsor Oversight
19. Fused Audit Scope

Process Audit
• Management: Oversight (Functional & Communication Plans), Expectation Setting (Quality Agreement), Personnel
• SAS Application: Access Control, Deployment Types, Validation Package, Add-on Tools, SAS Macro Library
• Programming: Procedures, Data Set Specifications, Data Conversions, Data Transfers, Blinded Data Review
• Statistical Analysis: Procedures, Randomization Code Mgt., Analysis Population Definition, Data Pooling, TFL Creation & QC, Interim & Ad-hoc Analysis, TFL Version Control

Vendor Audit 1 (FSP)
• Management: Project & Relationship Management, Expectation Setting (Quality Agreement Adherence), Personnel
• SAS Application: Access Control, Deployment Types, Validation Package, Add-on Tools, SAS Macro Library (Sponsor & Study Specific)
• Programming: Procedures, Data Set Specifications, Data Conversions, Derived Variable Checks, Macro Development/Testing/Revision, Output QC & Validation, Data Transfers, Blinded Data-Set Production
• Statistical Analysis: Procedures, Randomization Code Mgt., Data Pooling, TFL Creation & QC, Interim & Ad-hoc Analysis, TFL Version Control

Vendor Audit 2
• Management: Project Management (Functional Plan & SOW Adherence), Expectation Setting (Quality Agreement), Personnel
• SAS Application: Access Control, Deployment Types, Validation Package, Add-on Tools, SAS Macro Library (Study Specific), Plans for Upgrades, Migration, and Project Impact
• Programming: Procedures, Data Set Specifications, Data Conversions, Derived Variable Checks, Macro Development/Testing/Revision, Output QC & Validation, Data Transfers, Blinded Data-Set Production
• Statistical Analysis: Procedures, Data Pooling, TFL Creation & QC, Interim & Ad-hoc Analysis, TFL Version Control
20. "Fused Audit" Approach

Audit Yields/Benefits
• Audit planning is more strategic
• Fewer audits – cost saving
• Audit intensity and duration may increase
• Promotes consistency across like audits
• Rotating scope per vendor:
  • Services A & B annually
  • Service C every 3 years
  • Infrastructure (physical, IT) biennially
• Ability to clearly see the Total Picture

Audit cycle elements (from the slide diagram):
• Coordinated Multiple In-Process Vendor Audits and Internal Process Audit
• Debrief and Audit Report
• CAPAs
• Verification and Effectiveness of Previous Commitments
• Enhancement of Process, Expectations, and Relationship
21. Supporting Quality Beyond the Audit

• Framework: Integrated view of the sourcing picture to identify economies-of-scale opportunities
• Expectations: Collaborate to set joint expectations through the Quality Agreement, TORO, and Plans
• Metrics: Standard menu of meaningful metrics that can be aggregated
• Audits: In-process, full view, CAPA & issue resolution
• Relationship Management: Regular, collaborative, forward-looking, strategic
• Functional Assessment: Operational teams accountable for determining "fit for use"
• Commitment Fulfilment

These elements span the four quadrants of the framework – Sourcing Strategy Alignment, Vendor Evaluation, Vendor Management, and Vendor Governance – with Quality at the center.
