Evidence-Based Practice: If Doctors Can Do It, Managers Can Do It?

Eric Barends
AUPHA 2013 Annual Meeting
Monterey, CA


  1. Evidence-Based Practice: If doctors can do it, managers can do it? (AUPHA Annual Meeting, June 20, 2013, Monterey)
  2. Evidence? The outcomes of scientific research, organizational facts & data, benchmarking, best practices, collective experience, personal experience, intuition
  3. All managers base their decisions on 'evidence'
  4. However ...
  5. Many managers pay little or no attention to the quality of the evidence they base their decisions on
  6. 'Trust me, 20 years of management experience'
  7. Teach managers how to critically evaluate the validity and generalizability of the evidence, and help them find the 'best available' evidence
  8. Evidence-based decision
  9. Proof of concept: What is the added value of evidence-based practice for managers within the field of hospital care?
  10. Evidence-based pilot
      • Teaching hospital: 6 managers
      • University hospital: 4 managers
  11. Evidence-based pilot
      • Phase 1: Training managers in the principles of EBP
      • Phase 2: Examination of the current decision-making processes that managers are using
      • Phase 3: Evaluation of 4 completed projects from an EB perspective (retrospective)
      • Phase 4: Making EB recommendations for 4 new projects (prospective)
      • Phase 5: Evaluation
  12. Some preliminary results
  13. Decision-making process
  14. Decision-making process
      • Focus on procedures instead of evidence
      • Internal politics and power struggles
      • No critical appraisal of the evidence at hand
      • Relying on anecdotal evidence (workshops, best practices, popular management books, consultants)
      • One option (sometimes two)
      • Bias: outcome, halo, confirmation, etc.
  15. Post-mortem analysis
  16. Evidence-based perspective
      NOT: Did we make the right decision?
      BUT: Is there evidence from scientific research to support (or call into question) the approach taken?
  17. Post-mortem: leadership training
  18. Leadership training: decision-making process
      • No problem definition
      • No organizational evidence consulted
      • Selection of training companies based on experience, recommendation, or reputation
      • No explicit selection criteria / procedure
      • The 'best' presentation won: one size fits all
  19. Leadership training: scientific evidence
      • 15 meta-analyses, 5 relevant
      • 37 ('systematic') reviews, 2 relevant
      • Lots of relevant primary studies
  20. Leadership training: scientific evidence
      • Long history (30 yrs): moderate effect sizes (see the effect-size note after the slide list)
      • Senior & middle managers tend to benefit more than managers at the supervisory level
      • Effect on 'poor' leaders is limited
      • Leadership trainings that focus on interpersonal / social skills show higher effect sizes than those based on a specific leadership 'model'
  21. Reactions: Who knew? Denial, anger, bargaining, acceptance
  22. Prospective / EB recommendations
  23. Questions / projects
      • 360-degree feedback
      • Financial incentives
      • Lean Six Sigma
      • Hand hygiene
      • Goal setting
      • Value-Based Health Care
      • Downsizing
  24. Evidence-based perspective
      NOT: What works?
      BUT: What are, given the target group, the problem, and the context involved, the main factors determining the success or failure of the project that need to be taken into account?
  25. Prospective: Multi-Source Feedback
  26. Multi-Source Feedback: background
      • IFMS: based on multi-source feedback
      • Regulating bodies and insurance companies (KPIs – prices / revenue)
      • Based on CanMEDS, no standard method
      • New market: consulting firms
  27. Process
      • Scoping session: inventory of the aspects relevant to the question
      • Session with a leading academic
      • Search in relevant databases
      • Critical appraisal
      • Summary / research synthesis
      • Recommendations / guidelines
  28. Multi-Source Feedback: scientific evidence
      • 223 primary studies on MSF, 42 relevant
      • 6 meta-analyses or systematic reviews on MSF, 3 relevant
      • 18 meta-analyses or systematic reviews on 'feedback' or 'performance appraisal', 5 relevant
  29. Multi-Source Feedback: main factors
      • Content of the feedback (negative vs. positive)
      • Way the feedback is delivered
      • Interpretation of the feedback
      • Personality of the ratee
      • Feedback orientation of the ratee
      • Type and number of raters
      • Selection of raters
      • Rater reliability (patients, nurses, colleagues)
      • Type of response scale
      • Development vs. performance appraisal
      • Organizational culture
      • Perceived procedural justice
  30. Reactions: Who knew? Wow! Great! Good stuff! Relevant!
  31. Lessons learned I
      • New approach
      • Recalibrates the power dynamics (accountability!)
      • The profit is in the process
      • Different (better?) decisions were made
      • Doctors love it!
  32. Lessons learned II
      • Hard for individual managers
      • It starts with the senior management team
      • It's all about accountability
      • Support system: EBP > Planning & Control
  33. One day, maybe … Chief Evidence Officer
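
A note on the 'moderate effect sizes' in slide 20, added here for context rather than taken from the deck: meta-analyses of management and leadership training typically report a standardized mean difference such as Cohen's d, where values around 0.2, 0.5, and 0.8 are conventionally read as small, moderate, and large effects:

$$d = \frac{\bar{x}_{\text{trained}} - \bar{x}_{\text{control}}}{s_{\text{pooled}}}, \qquad s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}$$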
