CBME and Assessment

Speaker notes
  • So what is the outcome, and what is the framework?
  • Areas in red are to emphasize that the learner can and must have an active role in the process
Transcript

    • 1. CBME and Assessment
    • 2. Competency-Based Medical Education is an outcomes-based approach to the design, implementation, assessment, and evaluation of a medical education program, using an organizing framework of competencies. (The International CBME Collaborators, 2009)
    • 3. Traditional versus CBME: Start with System Needs. (Frenk J. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010)
    • 4. The Transition to Competency (Carraccio et al. 2002)
      – Structure/process (fixed length, variable outcome): knowledge acquisition; single subjective measure; norm-referenced evaluation; evaluation setting removed; emphasis on summative assessment
      – Competency-based (variable length, defined outcome): knowledge application; multiple objective measures; criterion-referenced evaluation; evaluation setting: direct observation; emphasis on formative assessment
    • 5. Miller’s Assessment Pyramid, from base to apex, with example assessment methods for each level:
      – KNOWS: MCQ exam
      – KNOWS HOW: extended matching / CRQ
      – SHOWS: standardized patients
      – DOES: faculty observation, audits, surveys (impact on the patient)
    • 6. Training and Safe Patient Care: Trainee performance* × Appropriate level of supervision** must equal safe, effective, patient-centered care. (*A function of the trainee’s level of competence in context; **a function of attending competence in context. An illustrative sketch of this relationship appears after the transcript.)
    • 7. Educational Program Variables: structure/process versus competency-based (Carraccio et al. 2002)
      – Driving force (curriculum): content (knowledge acquisition) vs. outcome (knowledge application)
      – Driving force (process): teacher vs. learner
      – Path of learning: hierarchical (teacher→student) vs. non-hierarchical (teacher↔student)
      – Responsibility for content: teacher vs. student and teacher
      – Goal of the educational encounter: knowledge acquisition vs. knowledge application
      – Typical assessment tool: single subjective measure vs. multiple objective measures
      – Assessment tool: proxy vs. authentic (mimics real tasks of the profession)
      – Setting for evaluation: removed (gestalt) vs. direct observation
      – Evaluation: norm-referenced vs. criterion-referenced
      – Timing of assessment: emphasis on summative vs. emphasis on formative
      – Program completion: fixed time vs. variable time
    • 8. Assessment “Building Blocks”
      – Choice of the right outcomes tied to an effective curriculum (step 1!)
      – The right combination of assessment methods and tools: Mini-CEX, DOPS, chart-stimulated recall (CSR), medical record audit
      – Effective application of the methods and tools
      – Effective processes to produce good judgments
    • 9. Measurement Tools: Criteria. Cees van der Vleuten’s utility index: Utility = V × R × A × E × C, weighed in context*, where V = validity, R = reliability, A = acceptability, E = educational impact, and C = cost-effectiveness. (*Context = the sum (Σ) of the clinical microsystems in which assessment takes place. A worked sketch of this index appears after the transcript.)
    • 10. Criteria for “Good” Assessment (Ottawa Conference Working Group, 2010)
      – Validity or coherence
      – Reproducibility or consistency
      – Equivalence
      – Feasibility
      – Educational effect
      – Catalytic effect: the “new” addition, relating to feedback that “drives future learning forward”
      – Acceptability
    • 11. Measurement Model: the Donabedian model (adapted)
      – Structure: the way a training program is set up and the conditions under which it is administered (organization, people, equipment, and technology)
      – Process: the activities that result from the training program
      – Outcomes: the changes (desired or undesired) in individuals or institutions that can be attributed to the training program
    • 12. Assessment During Training: Components (diagram)
      – Trainee: reviews the portfolio, reflects on its contents, and contributes to it
      – Advisor: reviews the portfolio periodically and systematically, develops an early-warning system, and encourages reflection and self-assessment
      – Structured portfolio: ITE (formative only), monthly evaluations, Mini-CEX, medical record audit/QI project, clinical question log, multisource feedback, and trainee contributions (personal portfolio, e.g., a research project)
      – Program leaders and the Clinical Competency Committee: periodic review (professional growth opportunities for all) and early-warning systems
      – The program’s summative assessment process feeds licensure and certification in Qatar
    • 13. Model for Programmatic Assessment (with permission from C.P.M. van der Vleuten): a timeline of training activities, assessment activities, and supporting activities feeding a committee. The legend distinguishes learning tasks, learning artifacts, single assessment data points, single certification data points for mastery tasks, learner reflection and planning, social interaction around reflection (supervision), and learning tasks that also serve as assessment tasks.
    • 14. Assessment Subsystem: a group of people who work together on a regular basis to perform evaluation and provide feedback to a population of trainees over a defined period of time. The subsystem has a structure for carrying out evaluation processes that produce an outcome, and it must ultimately produce a valid entrustment judgment.
    • 15. Assessment Subsystem. This group shares:
      – Educational goals and outcomes
      – Linked assessment and evaluation processes
      – Information about trainee performance
      – A desire to produce a trainee who is truly competent (at a minimum) to enter practice or fellowship at the end of training
    • 16. Assessment Subsystem. The subsystem must:
      – Involve the trainees in the evaluation structure and processes
      – Provide both formative and summative evaluation to the trainees
      – Be embedded within, not outside, the overall educational system (assessment is not an “add-on”)
      – Provide a summative judgment for the profession and the public
      – In short: effective evaluation = professionalism
    • 17. Subsystem Components
      – Effective leadership
      – Clear communication of goals to both trainees and faculty
      – Multi-faceted evaluation of competencies
      – Data and transparency: involvement of trainees; self-directed assessment and reflection by trainees; trainees must have access to their “file”
    • 18. Subsystem Components
      – “Competency” committees, which need the wisdom and perspectives of the group
      – Continuous quality improvement: the evaluation program must provide data for the CQI cycle of the program and institution, supported by faculty development
      – A supportive institutional culture
    • 19. Multi-faceted Evaluation: a structured portfolio maps tools to competencies, mixing trainee-directed activities with direct observation (a sketch of this mapping appears after the transcript):
      – Medical record audit and QI project: practice-based learning and improvement; systems-based practice
      – Multisource feedback (MSF), directed per protocol, twice a year: interpersonal skills and communication
      – Mini-CEX, 10 per year: patient care
      – EBM/clinical question log: practice-based learning and improvement
      – Faculty evaluations: professionalism
      – ITE, once a year: medical knowledge
    • 20. Assessment During Training: Components (the slide 12 diagram repeated, with the summative pathway now pointing to the USMLE and the American Boards of Medical Specialties)
    • 21. Performance Data: a training program cannot reach its full potential without robust and ongoing performance data
      – Aggregation of individual trainee performance
      – Performance measurement of the quality and safety of the clinical care provided by the training institution and the program
    • 22. Competency Committees
    • 23. Assessment During Training: Components (repeat of slide 20)
    • 24. Model for Programmatic Assessment (repeat of slide 13)
    • 25. Committees and Information: evaluation (“competency”) committees can be invaluable
      – They develop group goals
      – They provide “real-time” faculty development
      – They are key for dealing with difficult trainees
      – They are a key “receptor site” for frameworks and milestones, synthesizing and integrating multiple assessments
    • 26. “Wisdom of the Crowd”
      – Hemmer (2001): group conversations were more likely to uncover deficiencies in professionalism among students
      – Schwind, Acad. Med. (2004): 18% of resident deficiencies requiring active remediation became apparent only through group discussion; discussion averaged 5 minutes per resident (range 1 to 30 minutes)
    • 27. “Wisdom of the Crowd”
      – Williams, Teach. Learn. Med. (2005): no evidence that individuals dominate group discussions, and no evidence of ganging up or piling on
      – Thomas (2011): group assessment improved inter-rater reliability and reduced range restriction in multiple domains in an internal medicine residency
    • 28. Narratives and Judgments
      – Pangaro (1999): matching students to a “synthetic” descriptive framework (RIME) was reliable and valid across multiple clerkships
      – Regehr (2007): matching students to a standardized set of holistic, realistic vignettes improved discrimination of student performance
      – Regehr (2012): faculty-created narrative “profiles” (16 in all) produced consistent rankings of excellent, competent, and problematic performance
    • 29. The “System” (diagram): program-level assessments (direct observations, audit and performance data, multisource feedback, simulation, in-training exam) flow to a committee for aggregation, judgment, and synthesis, which supports NAS Milestone reporting to the ACGME/RRC for accreditation and ABIM FasTrack reporting for certification; judgments made by individual faculty, program directors, and others involve no aggregation. ABIM Milestones and EPAs serve as the guiding framework and blueprint.
    • 30. Questions
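
Editor’s sketch for slide 6. The slide’s relationship can be read operationally: for a given task, the supervision level is chosen so that trainee competence plus supervision always covers what safe care demands. A minimal illustrative sketch in Python; the numeric scales and the function name are hypothetical, not from the presentation:

    def required_supervision(trainee_competence: int, task_demand: int) -> int:
        # Return the minimum supervision level (0 = none, 3 = direct) needed
        # so that competence plus supervision covers the task demand.
        # Scales are illustrative: competence and demand both run 0-10, and
        # each supervision level is assumed to offset a 3-point gap.
        gap = max(0, task_demand - trainee_competence)
        return min(3, -(-gap // 3))  # ceiling division, capped at direct supervision

    # Example: a mid-level trainee (5/10) on a demanding task (9/10)
    # needs level 2 supervision before care is considered safe.
    print(required_supervision(5, 9))  # -> 2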
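
Editor’s sketch for slide 9. Because the utility index is multiplicative, a near-zero score on any one criterion collapses the overall utility no matter how strong the others are. A minimal sketch of that arithmetic, assuming illustrative 0-to-1 ratings (the ratings and the function name are hypothetical):

    from math import prod

    def utility(validity, reliability, acceptability,
                educational_impact, cost_effectiveness):
        # van der Vleuten-style utility: the product of the five criteria,
        # each rated here on an illustrative 0-to-1 scale.
        return prod([validity, reliability, acceptability,
                     educational_impact, cost_effectiveness])

    # A tool that scores well on four criteria but is barely acceptable
    # to its users still ends up with very low overall utility.
    print(utility(0.9, 0.9, 0.1, 0.8, 0.9))  # -> ~0.058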
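
Editor’s sketch for slide 19. The slide’s design can be seen as a portfolio schedule: each tool is tied to the competency domains it samples and a yearly frequency. A small illustrative data structure in Python; frequencies for the Mini-CEX, MSF, and ITE come from the slide, while the rest are placeholder assumptions:

    # Tool -> (sampled competency domains, assessments per year).
    PORTFOLIO_SCHEDULE = {
        "Mini-CEX": (["patient care"], 10),
        "Multisource feedback": (["interpersonal skills and communication"], 2),
        "In-training exam (ITE)": (["medical knowledge"], 1),
        "Medical record audit / QI project": (
            ["practice-based learning and improvement", "systems-based practice"], 1),
        "Faculty evaluations": (["professionalism"], 12),
        "EBM / clinical question log": (["practice-based learning and improvement"], 12),
    }

    # Sanity check: list every domain the portfolio samples, so gaps in
    # competency coverage are easy to spot.
    covered = {d for domains, _ in PORTFOLIO_SCHEDULE.values() for d in domains}
    print(sorted(covered))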
