Clinical Development KPIs II, 08 Dec 2011


My presentation from the 7th Annual Clinical Performance Metrics & Benchmarking Summit.


Speaker notes:
  • If you can’t justify an ROI on what is being measured, you have to ask the team whether it is relevant to measure and report. Labor hours cost money.
  • Notes: some things can be measured, but should they be? You can’t measure everything. Heisenberg – the act of measuring something changes what is being measured. It is sometimes best to measure certain activities in a blinded fashion to get the most unbiased reporting. What are the definition and scope of the problem or issue, or what is the question? Where is the problem, and how big or serious is it? How should the program or technology be delivered to address the problem?
  • These are not just project management techniques; they are methods for measuring performance and KPIs and for assessing the merit of your program/project parameters.
  • The first three deal more with social research and social programs; however, a thorough understanding of the theories is essential to designing an evaluation program. Project management is essentially a commercialization of program evaluation theory, and it should be used for more than just charting from Point A to Point B.
  • I will add vendor and CRO selection to this as well.
  • Mention the scope of work and task order agreement at this point; that is a different lecture, but it is critical.
  • Briefly cover the contents of the TOA.
  • Discuss anecdotes from BMS regarding poor milestone accomplishment driven by an SOW defined as a pay-for-play, FTE-based model.
  • A great IRB can speed time to market; a poor IRB selection can ruin your timelines. Don’t be afraid to pull a project from an IRB that fails to meet standards or promised delivery times.

    1. Clinical Development KPIs: Measuring for Success. Manley Finch, PhD, MPH; Executive Director, HIV Nutrition Network; Sr. Medical Science Liaison, GTC BioTherapeutics. “If we knew what it was we were doing, it would not be called research, would it?” – Albert Einstein
    2. Program/Project Evaluation. “Evaluation is to help projects become even better than they planned to be.… First and foremost, evaluation should support the project.…” – W.K. Kellogg Foundation Evaluation Approach, 1997
    3. Overview. Evaluation measurement starts with the program development plan. The importance of clinical trial program evaluations: critical evaluation creates ROI. Identifying and defining the correct KPIs to track and measure. KPI measurement translates to clinical trial agility and efficiency; driving successful study completion depends on trial performance monitoring and corrective action plans. Monitoring outsourced activities is critical for excellence in execution.
    4. Program/Project Evaluation. What is program/project evaluation? Evaluation is the systematic acquisition and assessment of information to provide useful feedback about a dynamic process, the interlocking steps of the process, and the intended and unintended outcome(s). Prospective: before and during process operations, in real time. Retrospective: after the fact, historical; data for the future.
    5. Program/Project Evaluation. Goal of evaluation: the preemptive goal of evaluation should be to influence decision-making, in real time or in future planning, through the unbiased assimilation and extrapolation of empirically driven information from strategically designed informatics portals, in order to effect a more positive outcome. Evaluate to create an ROI!
    6. Program/Project Evaluation. Remember: program/project evaluation has a cost associated with its planning, inception, monitoring, analyses, and reporting. Measure only what is real, important, and will create value!
    7. Program/Project Evaluation. Why measure? Quantitative data provides measurable metrics to gauge ongoing and future success and to drive ongoing and future improvements in systems or programs.
    8. Program/Project Evaluation. Provides the rationale for current and future decision making. Evaluation programs are essential in any industry. Reporting ROI to senior management: senior management buy-in = funding.
    9. Program/Project Evaluation. Program evaluation: Formative – evaluation of a program/project during the development stage to ensure an iterative improvement process; assess the merit, worthiness, and applicability. Hx data, interviews, questionnaires, focus groups, surveys. Proactive planning for successful real-time assessments. Summative – evaluation of an ongoing or completed program/project to evaluate the successes and challenges in order to improve ongoing and future projects. Data-driven metrics; quantitative, analyses-driven. ROI reporting to stakeholders.
    10. Program/Project Evaluation. Formative evaluation: Prospective; prior to or in parallel with program/project design and planning. Define the parameters (KPIs) to be monitored, assessed, and evaluated. Define the feasibility of evaluation (evaluability); don’t attempt to measure everything. Quantitative versus qualitative. Define the informatics reporting process and infrastructure. Define risks and risk mitigation strategies. Define implementation and training strategies. Define process evaluation strategies. Define responsible parties at all levels and assign accountability.
    11. Program/Project Evaluation. Summative evaluation: Retrospective; after data has been collected, from historical data, or from several different programs/projects. Outcome evaluation: did you meet your goals? Impact evaluation: what was the effect of real-time changes? Cost-effectiveness/benefit evaluation: ROI? Secondary evaluation: examine the data to answer additional questions. Meta-analyses: across several programs or projects, historical.
    12. Program/Project Evaluation. There are numerous models: management-oriented system models. PERT: Program Evaluation and Review Technique. CPM: Critical Path Method. Gantt: CPM charting model. These are only examples and must be tailored to fit your needs; a combination of all is best.
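(A minimal illustration of the Critical Path Method named on the slide above. The tasks, durations, and dependencies are invented for the sketch; they are not from the deck.)

```python
# Hypothetical study start-up tasks: (duration in days, predecessor tasks).
from functools import lru_cache

TASKS = {
    "protocol_final":  (30, []),
    "site_selection":  (45, []),
    "irb_approval":    (40, ["protocol_final"]),
    "contract_done":   (35, ["site_selection"]),
    "site_activation": (10, ["irb_approval", "contract_done"]),
}

@lru_cache(maxsize=None)
def earliest_finish(task: str) -> int:
    """Longest-duration path from project start through `task` (the CPM core)."""
    duration, preds = TASKS[task]
    return duration + max((earliest_finish(p) for p in preds), default=0)

finish = max(TASKS, key=earliest_finish)
print(f"Project length: {earliest_finish(finish)} days (ends with {finish})")
# Here the longest chain is site_selection -> contract_done -> site_activation
# (45 + 35 + 10 = 90 days): the critical path, where any slip delays the study.
```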
    13. Program/Project Evaluation. Sources for program evaluation methodologies: W.K. Kellogg Foundation; World Health Organization; Web Center for Social Research Methods; Project Management Institute (PMI); Drug Information Association; eXL, Barnett, and others; PERT, CPM, and Gantt methods.
    14. Program/Project Evaluation. Evaluation program: design from the start of the program, in parallel with early program or protocol plan development discussions. Determine relevant measures of program, protocol, and site performance early. Assign a team to craft, implement, and monitor early in the process. Don’t reinvent the wheel – rely on standards already developed unless the protocol demands it.
    15. Program/Project Evaluation. Program evaluation steps: assign program team evaluation responsibilities and set expectations early. SMART goals. Goals support clinical program timelines.
    16. Program/Project Evaluation. Define challenges and determine an action plan. Determine evaluation metrics and how to assess them. Design assessment tools specific to the metrics. Determine the frequency of assessments. Develop reporting format specifics. Meet often to assess the program and steer appropriately.
    17. Key Performance Indicators (KPIs). Why do we measure and why do we care? Costs of trials are increasing alarmingly: $400-800 million per drug in 2006; over $900 million to $1.2 billion in 2011. $26K per Phase III patient in 2006; $47.5K per Phase III patient in 2011. Trials are delayed more frequently, with study start-up, site activation, recruitment, and retention blamed most. Failure to be first in class or first to market drives market share loss and substantially diminished revenues.
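(The implied growth behind the per-patient figures above, computed from the slide’s own numbers; the compound-growth framing is my addition, not the deck’s.)

```python
# Phase III cost per patient from the slide: $26K (2006) -> $47.5K (2011).
cost_2006, cost_2011, years = 26_000, 47_500, 5
cagr = (cost_2011 / cost_2006) ** (1 / years) - 1
print(f"Implied compound growth: {cagr:.1%} per year")  # -> about 12.8% per year
```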
    18. Evaluation ROI Key Points. $$ TIME IS MONEY $$ Every day the trial is operating costs $100-200K USD in operational expense alone.
    19. Evaluation ROI Key Points. Marketing considerations and opportunities: a blockbuster drug can generate $2-5 million USD per day in sales revenue ($750 million to $1.5 billion per year). Market share decreases dramatically based on approval tier: first in class, first to market, second to market, etc. Windows for marketing a drug are dynamic. First to market wins market share; Viagra® versus Cialis® as an example.
    20. Evaluation ROI Key Points. $$ TIME IS MONEY $$ Delays in time to market: $2-5 million per day in marketing revenue; $700 million to $1.5 billion per year in revenue. A low approval tier decreases market share from 75-80% to 25-35% or less.
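(A back-of-the-envelope check on what these per-day figures imply, using the deck’s numbers as inputs; the six-month delay horizon is an assumption drawn from the delay statistics on slide 34.)

```python
# Deck figures: ~$100-200K/day operational cost, ~$2-5M/day in forgone sales.
OPS_PER_DAY = (100_000, 200_000)        # operational cost range, USD/day
SALES_PER_DAY = (2_000_000, 5_000_000)  # forgone blockbuster revenue, USD/day

delay_days = 180  # assumed six-month delay

low = delay_days * (OPS_PER_DAY[0] + SALES_PER_DAY[0])
high = delay_days * (OPS_PER_DAY[1] + SALES_PER_DAY[1])
print(f"Six-month delay costs roughly ${low/1e6:.0f}M to ${high/1e6:.0f}M")
# -> Six-month delay costs roughly $378M to $936M
```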
    21. Program/Project Evaluation. Evaluation and assessment create real ROI. Plan early and plan in parallel. Assess early and assess in parallel. Real-time data = real-time effective changes.
    22. KPIs. It is critical to define the appropriate, measurable, meaningful, and value-creating KPIs early: Study start-up process. Vendor selection. Medical writing: protocol, consent, CRF, assessments, IVRS. Regulatory approval. Study site selection and activation. Recruitment and retention. Data collection and management. Data cleaning and data locking. Drug approval process.
    23. KPIs and SPIs. Key Performance Indicators (KPIs): similar across all trials; tracking for most is standard in the industry; determine as a corporate entity prior to program planning; share with vendors and sites. Study Performance Indicators (SPIs): may be similar within a disease indication; vary across differing disease indications and trial phases; determine at the beginning of the project or trial; share with vendors and sites.
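(One way the KPI/SPI split above might be kept as a single shareable registry; a minimal Python sketch with illustrative names and definitions, not the deck’s actual list.)

```python
# Corporate-level KPIs: defined once, standard across all trials.
KPIS = {
    "time_to_first_site_selected": "days from protocol approval to FSS",
    "time_to_first_patient_in":    "days from first site activation to FPI",
    "time_to_database_lock":       "days from LPO to data lock",
}

# Study-level SPIs: defined per (indication, phase) at project start.
SPIS = {
    ("oncology", "III"): {"screen_fail_rate": "screen failures / patients screened"},
    ("hiv", "II"):       {"retention_rate": "completers / patients enrolled"},
}

# Shared with vendors and sites as one document of record, e.g.:
for name, definition in KPIS.items():
    print(f"KPI {name}: {definition}")
```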
    24. KPIs. Standard program and trial KPIs – how your program/project is scored: MAP/development program plan completion. DMF/IND submitted to FDA and/or first protocol approved. Initial IRB approval: Phase I, II, IIIa-IIIb. First Site Selected (FSS), First Site Approved (FSA). First Patient In (FPI); FPE, FPR. First Patient Completed (FPC), LPI, LPO. Data cleaned, locked, analyzed; data report completed. Site close-outs, final study report, etc. NDA submission, NDA approval, drug on market.
    25. KPIs and SPIs. Trial completion key performance influencers: Study start-up – site assessment and selection, site training, site activation. Study conduct – patient screening and enrollment rates, patient retention, data monitoring and cleaning. Study closure – final data cleaning, data lock and analyses, study site close-outs, SAR, FSR, and metrics reporting.
    26. KPIs. Vendor assessment and selection time: Steering committees / lead PIs. CROs. Central IRBs. Central lab. Central reader/scorer; rater reliability. Recruitment/trial awareness/PR (should be the first!). SMOs/PI networks.
    27. KPIs. Demand metrics from the vendors! CarFax = CROFax. Vendors are service providers and therefore live and die on metrics. Demand formative strategy (case histories) and summative data from each vendor to ensure the best fit. Not all are created equal, and a “one-stop” mentality can be fatal to your program or project. What is their history of out-of-scope work like? How often have they enrolled on time? How often have they completed on time? How often have they met or exceeded expectations? FDA and sponsor audits: CAPAs, 483s, warning letters, CIAs, etc.?
    28. KPIs. CROs and vendors: scope of work (SOW) and task order agreements (TOAs); timelines, milestones. From these documents, ALL KPIs are measurable from the outset. This can’t be stressed enough. A solid SOW and TOA = a good chance for success!
    29. SOW & TOA KPIs. Measurable outcomes start here; “X” does mark the spot. Roles and responsibilities: who is doing what, when, and where. Measurable timelines and deliverables; quantitative. Project milestones are milestones, not guidelines. Define KPIs within the documents; set payments based on milestones, deliverables, and KPIs rather than time burnt/FTEs. Early communication and clarity among the parties ensures a better chance for success.
    30. KPIs. Central IRB KPIs: Initial protocol and consent approval time. CRF and assessment approval time. Patient recruitment material approval time. Individual site materials approval time. Revision approval times. Meet with their team in person and demand the metrics for assessment. Web access portals, multiple boards with multiple meetings, great FDA standing.
    31. KPIs. Study start-up and activation: how fast can you come online? Define all critical paths and floats, and assess resources. The site selection process ensures success or failure not only in start times but also in recruitment and retention. Site selection KPIs – site assessments and onboarding: Feasibility questionnaires and historical data must be a prerequisite for site KPI assessment. PSVs: rapid assessment and onboarding; verify ALL data at the PSV. Time to contract negotiation. Time to IRB approval. Time to FPS, FPE, FPR.
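(A minimal sketch of computing the start-up cycle-time KPIs listed above from per-site milestone dates. The milestone names and dates are illustrative, not from the deck.)

```python
from datetime import date

# Hypothetical milestone dates for one site.
site = {
    "psv":               date(2011, 1, 10),
    "contract_signed":   date(2011, 2, 28),
    "irb_approved":      date(2011, 3, 21),
    "first_pt_screened": date(2011, 4, 4),
}

def cycle_days(milestones: dict, start: str, end: str) -> int:
    """Elapsed calendar days between two recorded milestones."""
    return (milestones[end] - milestones[start]).days

print("PSV -> contract:", cycle_days(site, "psv", "contract_signed"), "days")
print("Contract -> IRB approval:", cycle_days(site, "contract_signed", "irb_approved"), "days")
print("IRB approval -> FPS:", cycle_days(site, "irb_approved", "first_pt_screened"), "days")
```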
    32. Recruitment KPIs. Patient recruitment and enrollment: is this the Holy Grail?
    33. Recruitment KPIs. Will it end like this, or this?
    34. Recruitment KPIs. 80% of trials fail to enroll on time. 60-70% are delayed more than 3-6 months. 40-50% are delayed more than 6 months. 30% are delayed up to a year or more. At $100K per day (conservative) for Phase III, a one-year delay equals a cost of over $36.5 million, not counting lost revenue from delayed time to market. The average cost of a recruitment plan is ~3-5% of trial cost.
    35. Recruitment KPIs. Key metrics – rate and acceleration (quantitative): Time to site selection and activation. Time to FSA, LSA. Time to FPS, FPE, FPR. LTFU, DO, retention rates. Time to FPC, LPI, LPO. Qualitative (can measure the effect on the above at inflection points): Source-of-subject tracking. PR and trial awareness plan execution/implementation. Retention efforts.
    36. Recruitment KPIs. “Selecting optimal study sites is the single most important study start-up activity related to rapid trial enrollment.” And no matter how many great sites you select, they cannot overcome a poorly crafted protocol.
    37. MEASURING INVESTIGATOR PERFORMANCE. Platitudes to ponder: Slow site activation = slow or no patient recruitment. Historical enrollment predicts future recruitment. Acceptance = participation. Frustration = abandonment.
    38. MEASURING INVESTIGATOR PERFORMANCE. Platitudes to ponder: Proactive = performance. Rescue programs (band-aids) cost more and do less. Failing to plan is planning to fail.
    39. MEASURING INVESTIGATOR PERFORMANCE. What are the key historical or current site performance metrics (KPIs) to monitor? Hx enrollment performance; EMRs or paper? Number of patients in the DB, or access to patients. Breadth and depth of the referral network. Willingness to attempt recruitment program activities. Requests recruitment enhancement funding proactively. Has an on-site trial relations or marketing manager. Hx contract/budget negotiation time. IRB approval time. Central or local IRB.
    40. MEASURING INVESTIGATOR PERFORMANCE. Quantitative analyses of performance: create a corporate sponsor/CRO PI database. Depth of the PI patient DB, e.g., number of patients. Contract/budget negotiation time. IRB approval time. Site activation time. Time to first patient screened and first patient randomized. Number of patients screened/enrolled. Number of patients ET/LTF/completed. Enrollment time vs. allotted enrollment period. Query, DCF, and DEE rates. Various site-related trial costs. Overall cost per patient enrolled.
    41. Points to Ponder. Are you providing feedback to the sites in real time? Are you assisting sites to measure their own performance? Are you creating centers of excellence using KPI metrics? Are you getting feedback from your sites on your performance as a sponsor or CRO? One dollar invested proactively in the sites = ten dollars in return performance!
    42. MEASURING INVESTIGATOR PERFORMANCE. Create a DB across all internal IR/D components. Comparing questionnaires to the Hx DB allows for quantitative analyses. Select only the cream of the crop by stratifying the results. This allows the CTM/CPM to stratify the PI list and concentrate on the most rapidly activating sites to ensure FPI milestone capture. Eliminate using the same non-performing sites over and over across the company.
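(A hedged sketch of the scoring-and-stratification idea from slides 40 and 42: rank sites on historical metrics from the PI database, then split them into activation tiers. The field names, weights, and data are assumptions for illustration, not the deck’s model.)

```python
def site_score(site: dict) -> float:
    # Higher historical enrollment is good; long activation and contract
    # cycle times are penalized. The weights are arbitrary illustrations.
    return (site["enrolled_per_month"] * 10.0
            - site["activation_days"] / 30.0
            - site["contract_days"] / 30.0)

sites = [
    {"name": "Site A", "enrolled_per_month": 2.5, "activation_days": 60,  "contract_days": 45},
    {"name": "Site B", "enrolled_per_month": 1.0, "activation_days": 120, "contract_days": 90},
    {"name": "Site C", "enrolled_per_month": 3.0, "activation_days": 90,  "contract_days": 30},
]

ranked = sorted(sites, key=site_score, reverse=True)
cut = len(ranked) // 2 + 1
print("Activate first:", [s["name"] for s in ranked[:cut]])   # -> Site C, Site A
print("Hold in reserve:", [s["name"] for s in ranked[cut:]])  # -> Site B
```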
    43. MEASURING INVESTIGATOR PERFORMANCE. Points to ponder: Not all sites are created equal, nor are all investigators. KOLs historically are poor-enrolling centers – an unfortunate but real fact. Not all sites accurately report Hx performance; they overestimate, and the 20% rule applies. The level of involvement of the PI is often a valid predictor of enrollment when all other influencers are equal. Geographical location is important – the incidence and prevalence of disease are impacted by population. These tools apply within a disease indication; sites may enroll slower or faster in another indication.
    44. MEASURING PROGRAM PERFORMANCE. Recruitment program: Include the evaluation program at the outset. Design the plan in conjunction with: the steering committee (KOLs); site input (PI, CRC, research director, marketing); internal marketing and medical affairs. Synergy across internal and external sources. Implement early and evaluate early. Assess often and redesign as needed. Craft and maintain an evaluation ROI report – senior execs will want a report on the cost-to-benefit ratio, and your position may depend on the effect.
    45. MEASURING PROGRAM PERFORMANCE. Recruitment program evaluation: Steering committee, KOLs, and PIs. Assess the level of product knowledge versus the literature, publications, presentations, posters. Assess the level of buy-in to the protocol in general. Know your message points and TPP. Design presentations and assessments: assess the level of knowledge, present data, reassess. Understanding = acceptance = performance.
    46. MEASURING PROGRAM PERFORMANCE. Design the recruitment program and test it: Provide the program to the SC, KOLs, PIs, and CRCs. Use a site-specific paradigm – one size does not fit all. Get site-specific feedback and fine-tune. What media works best in their area? Develop a site-specific referral network list with contact information. Review with marketing, medical affairs, legal. Design evaluation KPIs for the program. Design SMART metrics and a tracking system. Design the ROI report. Circle back once more prior to implementation. Then off to the races!
    47. MEASURING PROGRAM PERFORMANCE. OK, this presentation is too short for a full recruitment program design seminar and KPIs, so assume the program is designed. What should we measure? $ Follow the spend $
    48. MEASURING PROGRAM PERFORMANCE. Case study: New global trial, 150 US trial subjects required, very tight study completion timelines. Poor perception of the ability to capture the goal. Limited senior management buy-in for a recruitment program and the associated costs. A variety of band-aids in similar trials, with no success. Perception that enrollment is based entirely on the site DB.
    49. Case Study. Action plan: Began an internal BU expertise and resource search. Implemented the evaluation program at the start. Implemented a potential-PI internal evaluation: interviewed CTMs/CPMs, CRAs, MSLs, etc. Reviewed available internal data: enrollment, site activation times. Created a PI/site DB from scratch.
    50. Case Study. Action plan: Implemented a site selection program using quantitative metrics. Engaged all internal and external stakeholders for recruitment program design. Designed and implemented the recruitment program early. Stratified site selection based on quantitative data. Began site activation based on cohort strata.
    51. MEASURING PROGRAM PERFORMANCE. Case study facts – the program ROI evaluation demonstrated: Source of subjects: the DB represented less than ~60% of patients. Required CRC time-resource funding for EMR/chart review. Emails and letters were very productive. PI/CRC-to-patient discussions were the driver. Community organizations: ~20% of patients. Media and PR relations: ~20% of patients.
    52. Case Study. Case study facts – results: 93% of the enrollment goal met in the allotted time. FPI goal met; LPI goal met. Site activation proceeded by strata. Time to grants/contracts and IRB reduced ~10%. Time to overall activation reduced 15%. The site stratification paradigm predicted enrollment. PI/CRC acceptance predicted enrollment. The screening rate was elastic to program components = enrollment. Increased site and investigator relations confirmed via a summative evaluation questionnaire at study end.
    53. Case Study. [Chart: cumulative patients screened, y-axis 0-160, x-axis 11/29/2007 through 11/29/2008; series label: Screening.]
    54. Case Study. Screening rates: 3 rates with 2 primary inflection points. December ’07 - April 21, ’08: 0.23 pts/day. April 22 - July 23, ’08: 0.40 pts/day. July 24 - October 17, ’08: 0.73 pts/day. Additional inflection point, September 23 - October 17, ’08: 0.94 pts/day.
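(A quick check of what the segment rates above imply in patient counts. The December 1, 2007 start date is an assumption; the slide gives only the month.)

```python
from datetime import date

# (start, end, pts/day) for each screening-rate segment from the slide.
segments = [
    (date(2007, 12, 1), date(2008, 4, 21),  0.23),
    (date(2008, 4, 22), date(2008, 7, 23),  0.40),
    (date(2008, 7, 24), date(2008, 10, 17), 0.73),
]

total = 0.0
for start, end, rate in segments:
    days = (end - start).days + 1  # inclusive of both endpoints
    total += days * rate
    print(f"{start} to {end}: {days} d x {rate} pts/d = about {days * rate:.0f} pts")
print(f"Implied total screened by mid-October: about {total:.0f}")
```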
    55. Conclusion. Defining KPIs and SPIs early is critical. Defining the evaluation plan in parallel with program/project planning is critical to success. Program evaluation creates ROI. Measuring KPIs and SPIs creates documentable ROI and increases future funding.
    56. References. Bain & Company. “Has the Pharmaceutical Model Gone Bust?” www.bain.com, December 8, 2003. Association for Project Management. Body of Knowledge, 5th edition, 2006. Collier, R. “Rapidly rising clinical trial costs worry researchers.” CMAJ 180(3), January 3, 2009. Cutting Edge Information. “Clinical Operations: Accelerating Trials, Allocating Resources and Measuring Performance.” www.ClinicalTrialBenchmarking.com, accessed 2006. Cutting Edge Information. www.ClinicalTrialBenchmarking.com, accessed 2011. DiMasi, JA, Hansen, RW, Grabowski, HG. “The price of innovation: New estimates of drug development costs.” J Health Econ 22 (2003): 151-185. Johnston, SC, Hauser, SL. “Clinical trials: Rising costs limit innovation.” Ann Neurol 62(6): A6-A7. Milosevic, Dragan Z. Project Management ToolBox: Tools and Techniques for the Practicing Project Manager. Wiley, 2003. Pharmaceutical Research and Manufacturers of America. Pharmaceutical Industry Profile 2009. Washington, DC: PhRMA, April 2009. Trochim, William M.K. Research Methods Knowledge Base: Introduction to Evaluation. Web Center for Social Research Methods, 2006. Accessed 28NOV2011, www.socialresearchmethods.net. W.K. Kellogg Foundation. W.K. Kellogg Foundation Program Evaluation Handbook, 1997. Accessed 28NOV2011, www.wkkf.org/knowledge-center/resources.
