Competency-based assessment: The good, the bad, and the puzzling


  1. Competency-based assessment: The good, the bad, and the puzzling. Kevin W. Eva, Medical Education Assessment Advisory Committee
  2. The Medical Education Assessment Advisory Committee • Georges Bordage • Craig Campbell • Robert Galbraith • Shiphra Ginsburg • Eric Holmboe • Glenn Regehr
  3. Our charge
  4. 1. Defining the “complete” health care professional
  5. Humanistic expectations vs. knowledge demands: “We demand ever more of the latter even as we bemoan its dominance, and yearn for the former while remaining wary of its inefficiencies and lack of uniformity.” (Anderson, 2011)
  6. [Chart: PubMed search on “Professionalism, Medical Education.” Number of articles by year of publication, 1957–2011, rising steeply after the publication of CanMEDS 2000.]
  7. 2. Creating clear and explicit milestones
  8. Effects of blueprint publication (McLaughlin et al., 2005), % prior vs. % after:
     • Exam performance: 72.5 vs. 76.9 (ns)
     • Agreement that exam tested material taught: 63.5 vs. 81.3 (p < 0.01)
     • Agreement that evaluation methods reflected subject matter: 60.7 vs. 81.5 (p < 0.01)
     • Agreement that exam was fair: 58.0 vs. 76.9 (p < 0.05)
  9. General Theme • Broad, competency-based assessment frameworks can promote performance improvement rather than simply measuring performance • They create professional culture through conversation, understanding, and steering
  10. 1. Sending the wrong message
  11. Implicit Messages: • Expertise is something that can be achieved • The goal is to “become independent” Risks: • Assessments as hurdles • Hesitation to disclose difficulties • Reduced compulsion to offer support (see MEAAC Report)
  12. Goal Theory (see Teunissen and Bok, 2013):
      • Performance Orientation: desire to perform well; satisfaction derived from grades; greater anxiety; task avoidance
      • Mastery Orientation: desire to become proficient; deeper engagement; greater perseverance; stronger motivation
  13. 2. Ignoring the continuum
  14. Violation of 3 fundamental laws: Variability. [Bell-curve figure: “Big Losers,” “Everyone Else,” “Big Winners”; simple probability; society’s problem.]
  15. Violation of 3 fundamental laws: Variability; Context specificity, “the one truth in medical education” (Norcini, circa 2006)
  16. Violation of 3 fundamental laws: Variability; Context specificity; Decay (Custers and ten Cate, 2011)
  17. General Theme • Be wary of the unintended consequences of adopting a competence-based assessment framework (see MEAAC Report)
  18. To assess competency effectively requires assessment strategies that are broadly focused, longitudinal, integrated, continuous, and authentic
  19. Three (Overlapping) Themes • Overcoming unintended consequences • Turning quality assurance into quality improvement • Ensuring authenticity
  20. 1. Overcoming Unintended Consequences. Gist: • Reduce emphasis on exams as point-in-time hurdles that prove one’s competence • Promote the notion that trainees are equally accountable for their demonstration of learning
  21. 1. Overcoming Unintended Consequences. Strategy: • Build quality improvement activities into assessment practices • Use data from the licensing process to facilitate the formulation of learning plans, and further develop the system to enforce follow-through
  22. 1. Overcoming Unintended Consequences. Examples: • OSCE/CDM components that require candidates to follow up on an error made, ask for help, or use clinical decision supports • Feed back intra-candidate relative strengths and weaknesses and require generation of a learning plan • Tailor subsequent assessments to identified weaknesses
  23. 2. Turning Quality Assurance into Quality Improvement. Gist: • Reduce the tension between high-stakes licensing assessment and genuine investment in improvement
  24. 2. Turning Quality Assurance into Quality Improvement. Strategy: • Further integrate assessment practices across the continuum of learning, with deliberate attention paid to (and reward of) quality improvement
  25. 2. Turning Quality Assurance into Quality Improvement. Examples: • Create a formative test-tailoring platform for use by schools and individuals • Support a national “Diagnostic OSCE” late in undergraduate training that can feed data to subsequent stages of training, practice, and assessment • Testing moments that require demonstration of response to data (e.g., an OSCE station in which candidates bring personal data)
  26. 3. Ensuring Authenticity. Gist: • Assessment that models the realities of actual practice increases credibility and engagement, and ensures that even efforts at gamesmanship are pedagogically valuable
  27. 3. Ensuring Authenticity. Strategy: • Portfolio-supported workplace-based assessment • Increasing use of real-world supports and real-world uncertainties in current practices
  28. 3. Ensuring Authenticity. Examples: • Sequential OSCE stations; SPs trained to offer contradictory information mid-station • Post-encounter probes that require reflection on why the approach was appropriate and why alternative actions were ruled out • Internet-enabled OSCEs
  29. Completing the puzzle • Broaden the base of assessment • Build a coherent and integrated system • Emphasize the primacy of learning • Harness the power of feedback • Share accountability with the individual
  30. Completing the puzzle
  31. Thanks. kevin.eva@ubc.ca
