CBME and Assessment

Competency-Based Medical Education is an outcomes-based approach to the design, implementation, assessment and evaluation of a medical education program using an organizing framework of competencies.

                 The International CBME Collaborators, 2009
Traditional versus CBME: Start with System Needs

Frenk J. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010.
The Transition to Competency

Structure/Process (fixed length, variable outcome):
• Knowledge acquisition
• Single subjective measure
• Norm-referenced evaluation
• Evaluation setting removed
• Emphasis on summative

Competency-Based Education (variable length, defined outcome):
• Knowledge application
• Multiple objective measures
• Criterion-referenced evaluation
• Evaluation setting: direct observation
• Emphasis on formative

Carraccio et al., 2002
Miller's Assessment Pyramid (levels, with example assessment methods)

• DOES (impact on patient) – faculty observation, audits, surveys
• SHOWS HOW – standardized patients
• KNOWS HOW – extended matching / CRQ
• KNOWS – MCQ exam
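The pyramid is, in effect, a mapping from performance level to the assessment methods that can sample it. A purely illustrative sketch of that mapping as data follows; the dictionary and function names are assumptions, not from the source:

```python
# Illustrative only: the slide's pairing of Miller's levels with example
# assessment methods, written as a lookup table for blueprinting.
MILLERS_PYRAMID = {
    "Does": ["faculty observation", "audits", "surveys"],   # impact on patient
    "Shows how": ["standardized patients"],
    "Knows how": ["extended matching / CRQ"],
    "Knows": ["MCQ exam"],
}

def methods_for(level: str) -> list[str]:
    """Return the example assessment methods listed for a pyramid level."""
    return MILLERS_PYRAMID.get(level, [])

print(methods_for("Shows how"))   # ['standardized patients']
```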
Training and Safe Patient Care

   Trainee performance*  ×  Appropriate level of supervision**
   must equal safe, effective, patient-centered care

 * a function of level of competence in context
** a function of attending competence in context
Educational Program

Variable                  | Structure/Process               | Competency-based
Driving force: curriculum | Content-knowledge acquisition   | Outcome-knowledge application
Driving force: process    | Teacher                         | Learner
Path of learning          | Hierarchical (teacher→student)  | Non-hierarchical (teacher↔student)
Responsibility: content   | Teacher                         | Student and teacher
Goal of educ. encounter   | Knowledge acquisition           | Knowledge application
Typical assessment tool   | Single subjective measure       | Multiple objective measures
Assessment tool           | Proxy                           | Authentic (mimics real tasks of the profession)
Setting for evaluation    | Removed (gestalt)               | Direct observation
Evaluation                | Norm-referenced                 | Criterion-referenced
Timing of assessment      | Emphasis on summative           | Emphasis on formative
Program completion        | Fixed time                      | Variable time

Carraccio et al., 2002
Assessment "Building Blocks"

• Choice of right outcomes tied to an effective curriculum – step 1!
• Right combination of assessment methods and tools
  – Mini-CEX, DOPS, chart-stimulated recall (CSR), medical record audit
• Effective application of the methods and tools
• Effective processes to produce good judgments
Measurement Tools: Criteria

Cees van der Vleuten's utility index:

   Utility = (V × R × A × E × C) / Context*

   where:
      V = validity
      R = reliability
      A = acceptability
      E = educational impact
      C = cost effectiveness

   *Context = ∑ microsystems
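Because the factors multiply, a weak score on any one of them drags overall utility toward zero no matter how strong the rest are. A minimal sketch of that behaviour, with invented 0-to-1 scores for two imaginary tools; neither the numbers nor the function name come from the source:

```python
# Hypothetical illustration of the multiplicative trade-off in the utility index.
# The 0-1 scores below are invented for two imaginary tools.
def utility(validity, reliability, acceptability, educational_impact,
            cost_effectiveness, context=1.0):
    """Utility = (V * R * A * E * C) / Context, following the slide's formula."""
    return (validity * reliability * acceptability *
            educational_impact * cost_effectiveness) / context

# A tool that is strong everywhere but poorly accepted loses most of its
# utility, because the factors multiply rather than add.
print(round(utility(0.8, 0.8, 0.9, 0.7, 0.6), 3))   # 0.242
print(round(utility(0.8, 0.8, 0.1, 0.7, 0.6), 3))   # 0.027
```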
Criteria for "Good" Assessment¹

• Validity or coherence
• Reproducibility or consistency
• Equivalence
• Feasibility
• Educational effect
• Catalytic effect
  – the "new" addition: relates to feedback that "drives future learning forward"
• Acceptability

¹ Ottawa Conference Working Group, 2010
Measurement Model

Donabedian Model (adapted)
• Structure: the way a training program is set up and the conditions under which the program is administered
  – organization, people, equipment and technology
• Process: the activities that result from the training program
• Outcomes: the changes (desired or undesired) in individuals or institutions that can be attributed to the training program
Assessment During Training: Components

Clinical Competency Committee
• Periodic review – professional growth opportunities for all
• Early warning systems

Structured Portfolio (shared by trainee, advisor and program leaders)
• ITE (formative only)
• Monthly evaluations
• Mini-CEX
• Medical record audit / QI project
• Clinical question log
• Multisource feedback
• Trainee contributions (personal portfolio), e.g. research project

Trainee
• Review portfolio
• Reflect on contents
• Contribute to portfolio

Program Leaders
• Review portfolio periodically and systematically
• Develop early warning system
• Encourage reflection and self-assessment

→ Program Summative Assessment Process
→ Licensing and Certification
  • Licensure and certification in Qatar
Model For Programmatic Assessment
(With permission from CPM van der Vleuten)

[Diagram: training activities, assessment activities and supporting activities laid out along a timeline, with committee review. Legend:]
• learning task
• learning artifact
• single assessment data-point
• single certification data point for mastery tasks
• learner reflection and planning
• social interaction around reflection (supervision)
• learning task that is also an assessment task
Assessment Subsystem

• An assessment subsystem is a group of people who work together on a regular basis to perform evaluation and provide feedback to a population of trainees over a defined period of time.
• This system has a structure to carry out evaluation processes that produce an outcome.
• The assessment subsystem must ultimately produce a valid entrustment judgment.
Assessment Subsystem

• This group shares:
  – Educational goals and outcomes
  – Linked assessment and evaluation processes
  – Information about trainee performance
  – A desire to produce a trainee truly competent (at a minimum) to enter practice or fellowship at the end of training
Assessment Subsystem

• The subsystem must:
  – Involve the trainees in the evaluation structure and processes
  – Provide both formative and summative evaluation to the trainees
  – Be embedded within, not outside, the overall educational system (assessment is not an "add-on")
  – Provide a summative judgment for the profession and the public
    • Effective evaluation = professionalism
Subsystem Components

• Effective leadership
• Clear communication of goals
  – to both trainees and faculty
• Evaluation of competencies is multi-faceted
• Data and transparency
  – Involvement of trainees
  – Self-directed assessment and reflection by trainees
  – Trainees must have access to their "file"
Subsystem Components

• "Competency" committees
  – need the wisdom and perspectives of the group
• Continuous quality improvement
  – The evaluation program must provide data as part of the CQI cycle of the program and institution
  – Faculty development
• Supportive institutional culture
Multi-faceted Evaluation

A structured portfolio sits at the center of the wheel; the six competencies around it are sampled by multiple tools.

Competencies: patient care, medical knowledge, interpersonal skills and communication, practice-based learning and improvement, systems-based practice, professionalism

Assessment tools in the portfolio:
• Mini-CEX: 10/year
• Faculty evaluations
• In-training examination (ITE): 1/year
• Multisource feedback (MSF): directed per protocol, twice/year
• Medical record audit and QI project
• EBM / clinical question log

■ Trainee-directed   ■ Direct observation
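A blueprint like this is easiest to track when kept as data. The sketch below records only the tools and the frequencies the slide states; the structure and field names are assumptions, and None marks a frequency the slide does not give:

```python
# Hypothetical sketch: the portfolio's tools and their stated sampling
# frequencies as a simple lookup. None = frequency not given on the slide.
PORTFOLIO_TOOLS = {
    "Mini-CEX": "10/year",
    "Faculty evaluations": None,
    "In-training examination (ITE)": "1/year",
    "Multisource feedback (directed per protocol)": "twice/year",
    "Medical record audit and QI project": None,
    "EBM / clinical question log": None,
}

# Print only the tools with an explicit target frequency.
for tool, frequency in PORTFOLIO_TOOLS.items():
    if frequency is not None:
        print(f"{tool}: {frequency}")
```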
Assessment During Training: Components
(components slide repeated; here the pathway ends in US licensure and certification)

→ Program Summative Assessment Process
→ Licensing and Certification
  • USMLE
  • American Boards of Medical Specialties
Performance Data

• A training program cannot reach its full potential without robust and ongoing performance data:
  – Aggregation of individual trainee performance
  – Performance measurement of the quality and safety of the clinical care provided by the training institution and the program
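A minimal sketch of the first kind of aggregation (individual ratings rolled up into per-trainee and program-level means), assuming a mini-CEX-style numeric scale; the trainee names and scores are invented, not from the source:

```python
# Hypothetical sketch: aggregate individual ratings (a 1-9 scale is assumed)
# into per-trainee means and a program-level mean. All data are invented.
from statistics import mean

ratings = {
    "trainee_A": [6, 7, 7, 8],
    "trainee_B": [4, 5, 5, 6],
}

per_trainee = {name: mean(scores) for name, scores in ratings.items()}
program_mean = mean(score for scores in ratings.values() for score in scores)

print(per_trainee)    # trainee_A averages 7, trainee_B averages 5
print(program_mean)   # program-wide mean: 6
```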
Competency Committees
Assessment During Training: Components
(components slide repeated: the Clinical Competency Committee sits above the structured portfolio, trainee, advisor and program leaders, and feeds the program summative assessment process and licensing and certification)
Model For Programmatic Assessment
(With permission from CPM van der Vleuten; diagram repeated from above)
Committees and Information

• Evaluation ("competency") committees can be invaluable
  – Develop group goals
  – "Real-time" faculty development
  – Key for dealing with difficult trainees
• Key "receptor site" for frameworks/milestones
  – Synthesis and integration of multiple assessments
"Wisdom of the Crowd"

• Hemmer (2001) – group conversations more likely to uncover deficiencies in professionalism among students
• Schwind, Acad. Med. (2004) – 18% of resident deficiencies requiring active remediation only became apparent through group discussion
  – average discussion 5 minutes per resident (range 1–30 minutes)
"Wisdom of the Crowd"

• Williams, Teach. Learn. Med. (2005)
  – No evidence that individuals in groups dominate discussions
  – No evidence of ganging up or piling on
• Thomas (2011) – group assessment improved inter-rater reliability and reduced range restriction in multiple domains in an internal medicine residency
Narratives and Judgments

• Pangaro (1999) – matching students to a "synthetic" descriptive framework (RIME) was reliable and valid across multiple clerkships
• Regehr (2007) – matching students to a standardized set of holistic, realistic vignettes improved discrimination of student performance
• Regehr (2012) – faculty-created narrative "profiles" (16 in all) produced consistent rankings of excellent, competent and problematic performance
The "System"

Assessments within the program – direct observations, audit and performance data, multi-source feedback, simulation, in-training exam – flow from residents, faculty, program directors and others into judgment and synthesis by the committee. From there:
• With program aggregation → NAS Milestones → accreditation of the institution and program (ACGME/RRC)
• Without aggregation → ABIM Fastrak → certification of the individual resident (ABIM)

Milestones and EPAs serve as the guiding framework and blueprint.
Questions

Editor's Notes

  • #3 So what is the outcome, and what is the framework?
  • #20 Areas in red are to emphasize that the learner can and must have an active role in the process