‘Lean’ Thinking for IRB
Process Improvement
Elizabeth (Liz) Tioupine, CIP
Sr. System and Process Specialist, HRPP
University of California, San Francisco
Disclosure: Elizabeth Tioupine
I have no relevant personal/professional/financial relationships with respect to this educational activity
UCSF – About Us
High Volume Academic Medical Center
• 4 Committees - 8 meetings per month
• 6500 Active Studies
• ~ 1600 Full Committee
• ~ 4900 Minimal Risk
• ~ 1300 Exempt
• 4600 Continuing Reviews/Yr
• 1000 Submissions/Mo (new and post-approval submissions)
UCSF IRB's Process Improvement History

2005-2006: 'QIP' Project
• Divided Coordinators into Assessment and Review Teams
• Challenges included knowledge transfer, excessive handoffs, re-work

2007-2012: Cradle to Grave
• Single analyst manages the submission start to finish
• Fix 'everything' during the screening phase so studies could be approved at the meeting

2013-present: Lean IRB
• Adaptive change model
• Focus on the entire operation, not just initial submissions
• Continuous process improvement
Lean Objectives for UCSF IRB

Reduce Waste
• Eliminate non-value-added work, unnecessary handoffs, waiting, extra inventory

Built-in Quality
• Get it right the first time

Consistent Standards and Process
• Reduce variation between staff and between review board committees
3 Major Lean IRB Initiatives
"Creating a Researcher-Friendly Chart Review Application," Gelfand, A., et al. (2012)
Continuing Review Project (2015)
Full Committee Time to Approval Process Improvement Project (2013-present)
New Chart Review Research Application Project – AIMS
Create an efficient and simple chart-review application form that a novice or junior researcher could easily complete in < 15 minutes.
Achieve a ≥ 50% reduction in the number of reviews (rounds) required to receive IRB approval compared to the current baseline for Category 5 expedited review studies.
Achieve a ≥ 50% reduction in time to approval (measured in days) compared to the current baseline for Category 5 expedited review.
Form Redesign

Types of Changes
• Simplified and shortened form
• Reworded questions using researcher-friendly terminology
• Replaced open text fields with multiple choice check boxes wherever possible
Results
[Results chart: time to approval and number of submission rounds for chart-review research, before vs. after the form redesign]
Conclusions
Identifying problem questions and fixing them had the greatest impact on reducing iterations
Converting text boxes into multiple choice check boxes virtually eliminated requests for revisions to those questions
Campus LOVED the new short and streamlined form
Big win for the IRB!!!
Continuing Review Lean Project (Form Improvement Phase) – AIMS
Reduce the number of submissions requiring correction
Reduce the number of submission rounds needed for approval
Continuing Review Submission Data
(January 2014 to September 2015, N = 7,952)
• 31% of Continuing Review submissions with no changes and no reportable events are returned at least once for corrections
• ~10% of Continuing Review submissions required three or more rounds to be approved
Project Details

Types of Errors
• Accrual number errors – 36%
• Wrong status – 22%
• Incomplete form – 11%
• Uploading revised documents as new ones – 11%
• Failure to incorporate changes in the Study Application – 9%
• Submitting last year's info – 5%

Form Improvements
• Redesigned the status and accrual sections
• Implemented Show/Hide logic so users see only relevant questions, with all displayed questions required
• Required attachment of revised documents and forms when needed
• Provided pop-up instructions at the point of need
• Abbreviated the form for minimal risk research and data analysis phase studies
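Per the speaker notes, these percentages came from exporting correction requests ('stipulations') from the eIRB system and classifying them. A minimal sketch of that kind of tally in Python, assuming a hypothetical keyword map and inline sample rows standing in for a real eIRB export:

```python
# Minimal sketch of tallying correction-request ("stipulation") categories.
# The keyword map and sample rows are hypothetical; a real analysis would
# read the eIRB system's actual export and coding scheme.
from collections import Counter

CATEGORY_KEYWORDS = {
    "accrual number error": ["accrual", "enrolled"],
    "wrong status": ["status"],
    "incomplete form": ["incomplete", "left blank"],
    "revised document uploaded as new": ["uploaded as new", "duplicate document"],
    "changes not incorporated": ["not incorporated", "update the application"],
    "last year's info": ["copied form", "last year"],
}

def categorize(stipulation: str) -> str:
    """First-pass keyword classification of a free-text stipulation."""
    text = stipulation.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return category
    return "other"

stipulations = [  # stand-in for rows exported from the eIRB system
    "Accrual numbers do not match the enrollment log",
    "Study status is incorrect; study is still enrolling",
    "Consent form was uploaded as new instead of as a revision",
]

counts = Counter(categorize(s) for s in stipulations)
total = sum(counts.values())
for category, n in counts.most_common():
    print(f"{category}: {n} ({n / total:.0%})")
```

In practice the categories were worked out by hand; a keyword pass like this is only a first cut that still needs manual review.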
Full Committee T2A Project – AIMS
Reduce the overall time to approval from 87 days to 43 days for full committee applications (a 50% reduction)
Reduce the amount of work (review rounds, number of stipulations) for the IRB analysts and research community
Improve the quality of submissions that the IRB receives by identifying common errors and fixing the application form
New Submission Data (Feb 2010 - July 2013)
85% of new applications returned at least once for pre-review corrections
Most studies undergo several submission rounds prior to approval
~50% of the total time to approval (T2A) is time spent waiting for the study team to respond to requests for corrections or changes
Time to Approval by Meeting Outcome
(2010-2013 Data)
Median Time to Approval (T2A) (n = 1010)
• 45 days for studies approved at the meeting
• 68 days if Revisions Requested (+ 3.5 weeks)
• 118 days if Returned by Committee (+ 10 weeks)
Lean Interventions
"Fast Track" review (February 2014)
21-day deadline for pre-review corrections (May 2014)
Submission Standards (May 2014)
Pre-Review Correction Consistency Project (2014-
present)
Focus on the Regulatory Criteria for Approval (2014-
present)
Improved Study Application form (Fall 2015)
"Fast Track" Project

Project
• PIs of well-prepared submissions were given the option to go straight to review if they could be available by phone during the meeting

Methods
• Phone call option offered to 22 PIs with new studies on a CHR agenda; 50% accepted

Results
• In total, only 9% (2/22) of applications resulted in an improved outcome based on phone calls to investigators during committee meetings

Outcome
• CHR analysts felt that the additional work of arranging calls before and during committee meetings was significant – the process was not adopted
21-Day Response Window for Corrections
Pre-intervention: 45 days to respond to corrections (each time) and 45 days to respond to changes requested by the IRB (each time)
If you're trying to achieve a 43-day T2A, you can't allow the study team 45 days to respond to corrections
Pre-review: First Correction Round
[Charts: Median Time to Changes Requested by Screener and Median Time for Investigator Response to Changes Requested by Screener, in calendar days by month, Jan 2013 - Oct 2015]
This alone is not the answer!
21-Day Response Window for Corrections
Pre-intervention: 45 days to respond to corrections (each time) and 45 days to respond to changes requested by the IRB (each time)
If you're trying to achieve a 43-day T2A, you can't allow the study team 45 days to respond to corrections
Results: Impact on overall T2A – Minimal
Still in effect
Submission Quality
Poor quality of some submissions causes delays for all submissions
• Submissions are often incomplete
• Inadequate responses to stipulations require unnecessary re-work
• Delays correspondence for other studies
Submission Standards

Incomplete
• Submissions sent back if missing:
  • Scientific or feasibility approval, as required
  • Study protocol for greater than minimal risk research
  • Investigator's Brochure (drug/device study)
  • Human Subjects Section of grant for federally funded studies

Submission Standards Not Met
• Submissions sent back if:
  • Application form is incomplete, incorrect, or unreadable
  • Major inconsistencies within or between documents
  • Important attachments needed for the review are missing
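The "Incomplete" checklist above is mechanical enough to automate as a pre-screening triage in an eIRB system. A minimal sketch, assuming hypothetical field and attachment names; real platforms would express these rules in their own form-builder or workflow configuration:

```python
# Sketch of the "Incomplete" triage check. Field and attachment names are
# hypothetical stand-ins for whatever your eIRB data model actually uses.
def incomplete_reasons(submission: dict) -> list[str]:
    """Return every missing-item reason; an empty list means the submission passes."""
    reasons = []
    if submission.get("requires_scientific_review") and not submission.get("scientific_approval"):
        reasons.append("Missing scientific or feasibility approval")
    if submission.get("risk_level") == "greater than minimal" and "protocol" not in submission["attachments"]:
        reasons.append("Missing study protocol")
    if submission.get("involves_drug_or_device") and "investigator_brochure" not in submission["attachments"]:
        reasons.append("Missing Investigator's Brochure")
    if submission.get("federally_funded") and "human_subjects_section" not in submission["attachments"]:
        reasons.append("Missing Human Subjects Section of grant")
    return reasons

example = {
    "requires_scientific_review": True,
    "scientific_approval": None,
    "risk_level": "greater than minimal",
    "involves_drug_or_device": False,
    "federally_funded": True,
    "attachments": ["protocol"],
}
print(incomplete_reasons(example))
# ['Missing scientific or feasibility approval', 'Missing Human Subjects Section of grant']
```

Returning all the reasons as a list lets the office send the whole checklist back in one pass instead of one deficiency at a time.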
Immediate Impact
Data collected through November 2014
Pre-Review Stipulation Analysis Consistency Project

Categorization and Standards for Pre-Review Stipulations (IRB Coordinators)
• Send prior to the meeting
• Send post-meeting
• Not critical to the conduct of the study – 'Let it go'
Pre-Intervention Flow for Studies With Any Corrections

Submission Received → Send All Corrections → Review → Send IRB-Requested Changes → Approve

This was the flow for 85% of the full board studies
Post-Intervention Flow

Submission Received → Send Only Critical Corrections → Review → Send IRB Changes and Post-Meeting Corrections → Approve
Difference in Time to First Review: Corrected vs. Non-Corrected

[Chart: Median Time to First Review, in calendar days by month, Jan 2013 - Oct 2015, comparing submissions with no corrections vs. corrections needed, with a linear trend line for each]
Volume Sent for Pre-Review Correction Rounds

[Chart: % of submissions sent for pre-review correction rounds, by month, Jan 2013 - Oct 2015]
Median Time to Approval (T2A)
[Chart: Median Time to Approval in days, by month, Jan 2013 - Oct 2015]
Time to Approval by Meeting Outcome (2010-2013 Data)
Median Time to Approval (T2A) (n = 1010)
• 45 days for Approved at the meeting
• 68 days for Revisions Requested (+ 3.5 weeks)
• 118 days for Returned by Committee (+ 10 weeks)
Current State – October 2015
Median T2A is 55 days
65% of studies skip the correction round
Working with committees to change the culture of review (Criteria for Approval, 45 CFR 46.111 / 21 CFR 56.111)
Working with faculty on changes to the Study Application to improve the quality of submissions received
Full Committee Volume: Received / Approved / Withdrawn

Year   Received  Approved  Withdrawn
2012   399       341       60
2013   337       257       52
2014   446       345       69
2015*  406       321       39

* 2015 data not yet complete at the time of presentation

15% of full board studies are withdrawn before they open. This is a HUGE drain on the IRB's resources!
Next Project: Improved Study Application
Rewrite all questions in "researcher-friendly" language
Consolidate and reorganize form sections for more intuitive flow
Use pre-review stipulation data to provide more instructions for commonly mis-answered questions
Show/Hide questions – display only the questions that the user has to answer
Provide instructions at the point of need
Leverage Your eIRB System

Automation
• Calculate dates
• Notifications to other units
• Reviewers' checklist

Built-in Quality
• Prevent errors – validations
• Data value triggers for instructions
• Programmed logic – content-specific controls on data fields

Mine Your Data
• Internal QA/QI
• Process improvement activities
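A minimal sketch of what the three ideas above can look like in code, with hypothetical field names and a deliberately simplified expiration rule; actual eIRB platforms implement these as their own validation and branching configuration:

```python
# Sketch of eIRB "built-in quality" and "automation" rules.
# Field names and the 364-day window are hypothetical stand-ins.
from datetime import date, timedelta

def validate_accrual(total_approved: int, enrolled_to_date: int) -> list[str]:
    """Built-in quality: catch the most common continuing review error,
    bad accrual numbers, before the form can be submitted."""
    errors = []
    if enrolled_to_date < 0:
        errors.append("Enrolled-to-date cannot be negative.")
    if enrolled_to_date > total_approved:
        errors.append("Enrolled-to-date exceeds the approved accrual ceiling.")
    return errors

def visible_questions(answers: dict) -> list[str]:
    """Programmed logic: Show/Hide so users see only questions they must answer."""
    questions = ["study_status", "accrual_numbers"]
    if answers.get("study_status") == "data analysis only":
        return questions  # abbreviated form for data-analysis-phase studies
    return questions + ["reportable_events", "consent_form_changes"]

def next_review_due(approval_date: date) -> date:
    """Automation: compute a continuing review due date. The fixed window
    here is a hypothetical stand-in for your IRB's actual policy."""
    return approval_date + timedelta(days=364)

print(validate_accrual(total_approved=100, enrolled_to_date=120))
print(visible_questions({"study_status": "data analysis only"}))
print(next_review_due(date(2015, 11, 23)))
```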
Metrics – What We Do Collect
Volume of submissions
Volume approved
'Time to' time points (avg, median, max, min):
• Changes requested by IRB (pre- & post-review)
• Time for investigator to respond (pre- & post-review)
• Time to first review (‘clean,’ ‘corrected,’ ‘all’)
• Time to approval
Review volume statistics (Chairs, VCs, Staff)
Statistics by committee (volume, T2A)
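A minimal sketch of computing these 'time to' statistics with pandas, using an inline sample as a hypothetical stand-in for a per-submission export with milestone timestamps:

```python
# Sketch of the 'time to' metrics. Column names and the inline sample are
# hypothetical stand-ins for a real per-submission eIRB export.
import pandas as pd

df = pd.DataFrame(
    {
        "committee": ["A", "A", "B"],
        "needed_corrections": [True, False, True],
        "submitted": pd.to_datetime(["2015-01-05", "2015-02-10", "2015-03-02"]),
        "first_review": pd.to_datetime(["2015-02-04", "2015-02-24", "2015-04-01"]),
        "approved": pd.to_datetime(["2015-03-01", "2015-03-10", "2015-05-15"]),
    }
)

df["t2a_days"] = (df["approved"] - df["submitted"]).dt.days
df["time_to_first_review"] = (df["first_review"] - df["submitted"]).dt.days

# avg / median / max / min, the four summary points the office tracks
print(df["t2a_days"].agg(["mean", "median", "max", "min"]))

# statistics by committee, and 'clean' vs. 'corrected'
print(df.groupby("committee")["t2a_days"].median())
print(df.groupby("needed_corrections")["t2a_days"].median())
```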
Metrics – What We Want to Collect
Business Analytics – Dashboard Style Views
• ‘Submissions Out of T2C’
• Processing statistics by Committee, by person
• Reviewer statistics
Modified T2A Report
• 'Clean' vs. 'corrected' studies
• Total time with IRB vs. study team
All statistics and metrics by Funding Type, by Department, by PI
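For the 'total time with IRB vs. study team' report, one workable approach is to walk each submission's correspondence history and attribute every interval to whichever side held the submission. A sketch with hypothetical event data:

```python
# Sketch: attribute each interval of the review clock to the IRB or the
# study team, based on who held the submission. Event labels and dates
# are hypothetical; a real report would read the eIRB audit trail.
from datetime import datetime

# (timestamp, who holds the submission after this event)
events = [
    (datetime(2015, 1, 5), "irb"),          # submitted -> IRB screening
    (datetime(2015, 1, 20), "study_team"),  # corrections requested
    (datetime(2015, 2, 3), "irb"),          # corrections returned
    (datetime(2015, 3, 1), None),           # approved -> clock stops
]

totals = {"irb": 0, "study_team": 0}
for (start, holder), (end, _) in zip(events, events[1:]):
    totals[holder] += (end - start).days

print(totals)  # {'irb': 41, 'study_team': 14}
```

Summing the two buckets separately shows how much of the T2A clock each side actually owns – the split an earlier slide put at roughly 50/50 for full committee studies.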
Final Thoughts About Lean
Change is hard and sometimes scary for people
Collect as much data as you can – you can't change what you don't measure
Not everything you try will work, but the experience will always be instructive
Think of it as a work in progress
Build in feedback loops and keep adjusting based on experience
Editor's Notes

1. Give an intro of myself and who I am – years of experience; Lean Six Sigma Green Belt in 2013
2. Failed to involve IRB Chairs and Members. Screening was systemwide – all areas involved in protocol approval and initiation, not just the IRB
3. Being held to different standards
4. Worked with faculty to translate the IRB application into "researcher-friendly" terminology. Made changes to form data types and the organization of sections and questions. Pilot tested the application in a support environment. Testers completed a survey about ease of use and the time required to complete the form. Data were collected about time to submission (T2S), time to approval (T2A), and number of submission rounds. DEFINE TIME TO APPROVAL
5. (10 sections and 32 questions) rather than IRB regulatory terminology
6. T2A and number of rounds for chart review research improved dramatically by making IRB form improvements. Not a "15-minute application," but an easy-to-use form that significantly reduced the time to submission. Identifying the questions most often answered inadequately and working with researchers to reword them so they could be better understood had the greatest impact on reducing iterations. Identifying confusing free-text questions and converting them into multiple choice check boxes virtually eliminated requests for revisions to those questions
7. Worked with faculty to translate the IRB application into "researcher-friendly" terminology. Made changes to form data types and the organization of sections and questions. Pilot tested the application in a support environment. Testers completed a survey about ease of use and the time required to complete the form. Data were collected about time to submission (T2S), time to approval (T2A), and number of submission rounds. DEFINE TIME TO APPROVAL
8. We pulled the data to look at two things – correction rounds and types of errors. Extracted correction requests ('stipulations') from the eIRB system for detailed analysis of the types and incidence of errors. Analyzed submission processing and review data for Continuing Reviews from January 2014 to September 2015 (n = 7,952). 31% of Continuing Review forms for research with no changes and no reportable events are returned for corrections (vs. 32% of renewals with minor changes and 34% of renewals with major modifications). Made changes to the Continuing Review Form to address the most common types of errors. A lot of unnecessary work. What this said to us is that we have a problem with our form!
9. Most common errors: subject accrual number errors – 36%; selection of the wrong study status – 22%; incomplete form – 11%; uploading revisions of attachments as new documents – 11%; failure to update the application and consent form with proposed changes – 9%; submitting last year's status information from a copied form – 5%. Major improvements also include: Show/Hide; pop-up 'advisements' to guide users; a shorter form with fewer sections for minimal risk research, studies not involving subject contact, and studies in data analysis with no problems to report; a smarter form that includes only the relevant attachment sections but requires the user to attach a document or revised application form. Still collecting and analyzing data
10. Our Time to Approval is defined as how long it takes from the moment they click Submit to the moment we click Approve, and it includes time waiting for corrections or changes
11. Poorly-prepared applications take excessive time and resources to screen and review
12. For studies approved at the meeting, ~35 days were eaten up by the pre-review correction process
13. We've done some analysis, and the time to changes requested by screener is influenced by several factors, including volume received, work in progress, staffing (mainly vacation and holiday schedules), and the quality of the submissions being processed – a few really messy submissions set everything back. Adding an average of about 3.5 weeks onto the front end. Some studies have multiple correction rounds. This alone is not the answer
14. Incomplete – quick triage, turnaround in a couple of days. Submission Standards Not Met – happens during the more thorough pre-review screening
15. 20% of the submissions received were either incomplete or unreviewable. The effort was abandoned because there wasn't a resource to send people to for help – staff ended up helping them anyway: non-value-added work, re-work, etc.
16. IRB Coordinators' analysis of pre-review stipulations – categorization and standards: Send Prior to the Meeting (information/content needed for review); Send After the Meeting (corrections needed but not critical for review); Let It Go (changes to the application that would not make a substantial difference in the conduct of the study)
17. An exhaustive list of everything that needed correction, plus other changes to make it 'cleaner' – we still had the objective to 'screen to approval.' All Corrections could be up to 48 items (the worst study) but averaged about 20. Some corrections took several rounds. We were allowing 45 days for the study team to respond to corrections and 45 days to respond to IRB-requested changes
18. One of the main differences here is that instead of receiving an average of 18 items to correct, they may only receive 3 or 4. The rest of the correction items are batched with the committee's requested changes. Faster for the study team to turn it around – pre-review feedback from faculty and coordinators is that if they receive a short list of changes, they can do it in one sitting. For longer lists of changes, they need …
19. Note that this is a graph of time to first meeting date, not approval times, but the Time to Approval trend lines are similar as well
20. Going back to this slide from earlier – I didn't include our data on the % of studies approved at the first meeting, but it's very, very low. Over 95% of studies are returned with Revisions Requested. We have been working with the Chairs and Vice Chairs to shift the members towards focusing on the criteria for approval – especially that risks are minimized and that the risk-to-benefit ratio is acceptable – but it's very hard to change the culture with long-standing members.
21. We are still keeping new submissions with the IRB Coordinator that first screened them. I am not convinced this is most efficient. With a review cycle of 2 meetings per month, the Coordinator has to screen submissions and prepare the next agenda when they ideally would be working on correspondence from the meeting.
22. Time frames are different – studies may not be approved or withdrawn in the year they were submitted. 2015 has an asterisk because the data are not yet complete. We receive about 40 new full committee submissions a month. These data were collected on 11/3, so I expect our final annual volume to be somewhere between 470 and 490. 2014 saw a 32% increase over 2013 volume (which seems to have been unusually low) and a 12% increase over 2012. We may be looking at a 10% increase in volume this year over last year. One thing I wanted to point out – look at the increase in volume since 2013, when we started the Lean T2A project. I wonder how much of a decrease in T2A we could have achieved if the volume had remained fairly constant, or at least not risen so sharply. I expect the number of 2015 studies that are ultimately withdrawn to increase through February or even March and to include about 30 more studies, some of which are under review right now. Just out of curiosity – how many of you charge for IRB review after it's undergone review? How many after it's approved? How many of you collect IRB review fees for withdrawn submissions?
23. This part of the project is really the main linchpin of the effort to improve the quality of submissions we receive. Taking the lessons learned from the chart-review application, feedback from faculty about the impact that flow and organization have on the ease of completing the application, and additional instructions for questions commonly answered inadequately (from the stipulation analysis project) – these, combined with implementation of some new functionality, should make for better applications. Provide instructions at the point of need – we did have Help text, but most users didn't click on it. No more 'If Yes.' Instructions at the point of need; site-specific requirements; other approvals needed; flags for likely errors; special instructions for consent forms
24. Some of these are canned reports, but others are very labor-intensive to compile!!
25. Time to Approval for 'clean' vs. 'corrected.' Submissions out of T2C – we'd like to be able to email a weekly report to the managers and coordinators whenever an unassigned study has been sitting either on the study side or our side for x number of days – prevent the irate calls to our Director if we can! (Scroll back to 34 – this wasn't our fault, but it looks like it was)