
Peeking behind the test: insights and innovations from the Medical Council of Canada


2015 CCME
MCC Business Session
Peeking behind the test: insights and innovations from the Medical Council of Canada. We will showcase new technological innovations such as automated item generation, automated scoring, and the MCC's new item bank, MOC5.



  1. | Peeking behind the test – insights and innovations. Medical Council of Canada. Dr. Ian Bowmer, Dr. André De Champlain, Stephen Abraham, Jessica Hertzog-Grenier
  2. | Overview
     Introduction › Assessment Review Task Force (Ian) › Blueprint Project (André) › Flexible exam delivery (André)
     Innovations › Automated Item Generation (André) › Automated Marking (Stephen) › New item bank, MOC5 (Stephen)
     Other developments › Orientation › Preparatory materials
     Q&A
  3. | Introduction – Assessment Review Task Force
     • Recalibrating for the 21st Century: Report of the Assessment Review Task Force of the Medical Council of Canada (October 2011)
     • Outlined 6 recommendations for the MCC to focus on in its reassessment & realignment of exams
  4. | Introduction – Assessment Review Task Force: The 6 ARTF recommendations
     • Recommendation 1: LMCC becomes the ultimate credential (a legislation issue)
     • Recommendation 2: Validate & update the blueprint for MCC examinations
     • Recommendation 3: More frequent scheduling of the exams and associated automation
     • Recommendation 4: IMG assessment enhancement and national standardization (NAC & Practice Ready Assessment)
     • Recommendation 5: Physician performance enhancement assessments
     • Recommendation 6: Implementation oversight, committee priorities and budgets
  5. | Introduction – Blueprint Project: Recommendation #2. That the content of MCC examinations be expanded by:
     • Defining the knowledge and behaviours in all the CanMEDS roles that demonstrate competency of the physician about to enter independent practice
     • Reviewing the adequacy of content and skill coverage on the blueprints for all MCC examinations
     • Revising the examination blueprints and reporting systems with the aim of demonstrating that the appropriate assessment of all core competencies is covered and fulfills the purpose of each examination
     • Determining whether any general core competencies considered essential cannot be tested using the current MCC examinations, and exploring the development of new tools to assess those competencies where current examinations cannot
  6. | Introduction – Blueprint Project: Assessment of innovative item types to better target blueprint gaps (audio/video, technology-based, etc.)
     • Can MCC exams "evolve" towards meeting blueprint domains more fully with the addition of innovative item types?
     • Impacts both the MCCQE Parts I and II
     • Advisory group meeting (March 2015): overview of technology-based items ("hot spots", "drag & drop") and items targeting higher-order cognitive skills ("pharma ads", "abstract searches")
     • SJT-type vignettes might prove most useful to pursue for assessing aspects of Professionalism & Communication
  7. | Introduction – Blueprint Project: Piloting of new OSCE stations to better assess non-cognitive skills (CM, PF) and complex patient presentations (multiple morbidities)
     • Piloted 4 new OSCE stations in the Fall 2014 exam
     • Stations discriminated well and were generally more difficult than traditional counterparts
     • Provides some preliminary evidence to support the evolving nature of the MCCQE Part II exam
  8. | Introduction – Flexible exam delivery: Recommendation #3. That the timing for taking the MCCQE Parts I and II, and the frequency with which they are offered, be revisited by exploring:
     • Options allowing more flexibility in scheduling all of the MCC examinations
     • Options permitting the components of the MCCQE Part I (knowledge and clinical decision-making) to be offered at the appropriate times in the learning/assessment continuum
     • The development of an integrated national assessment strategy for physicians in training, in collaboration with the CFPC and RCPSC
  9. | Introduction – Flexible exam delivery: How can we offer our exams more frequently and with a greater degree of flexibility?
     • By adopting evidence-based simplified scoring (a brief sketch of the Rasch model follows this slide)
       ◦ CEC (December 2012) endorsed the R&D recommendation to drop checklist/component weighting for the MCCQE Part II, starting with the spring 2013 exam
       ◦ Rasch IRT scoring model applied to the MCCQE Part I exam as of spring 2015
       ◦ Version 1.0 of the Automated Scoring System for MCC exams
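The deck doesn't spell out the Rasch scoring itself, so as brief background: under the Rasch model, the probability that a candidate of ability theta answers an item of difficulty b correctly is exp(theta - b) / (1 + exp(theta - b)). Below is a minimal Python sketch of ability estimation under this model; the item difficulties and responses are hypothetical, not MCC calibration data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def rasch_prob(theta, b):
    """P(correct answer) for ability theta on an item of difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def estimate_ability(responses, difficulties):
    """Maximum-likelihood estimate of theta from 0/1 responses and
    pre-calibrated item difficulties."""
    r = np.asarray(responses, dtype=float)
    b = np.asarray(difficulties, dtype=float)

    def neg_log_lik(theta):
        p = rasch_prob(theta, b)
        return -np.sum(r * np.log(p) + (1 - r) * np.log(1 - p))

    return minimize_scalar(neg_log_lik, bounds=(-5, 5), method="bounded").x

# Hypothetical example: 5 items, 4 answered correctly.
print(round(estimate_ability([1, 1, 0, 1, 1], [-1.0, -0.5, 0.0, 0.5, 1.0]), 2))
```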
  10. | Introduction – Flexible exam delivery: How can we offer our exams more frequently and with a greater degree of flexibility?
     • By automating manual processes (a sketch of the matching idea follows this slide)
       ◦ Automated marking of CDM questions using natural language processing (NLP)
         › Preliminary findings are very encouraging
         › NLP marking of CDM questions is in concordance with human markings over 96% of the time, on average
         › Would allow the automation of what is currently a very laborious process
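The slides don't describe the NLP pipeline, but the core mechanism of auto-marking write-in answers can be sketched as normalizing each response and matching it, exactly or fuzzily, against a keyed list of acceptable answers, with anything unmatched routed to a human marker. A minimal sketch follows; the answer key, scores, and threshold are hypothetical, and real CDM marking would need far richer language handling.

```python
import difflib
import re

# Hypothetical, pre-normalized answer key for one CDM question.
ANSWER_KEY = {"chest xray": 1.0, "electrocardiogram": 1.0, "ecg": 1.0}

def normalize(text):
    """Lowercase and collapse to letters/digits/spaces."""
    return re.sub(r"[^a-z0-9 ]+", "", text.lower()).strip()

def auto_mark(answer, key=ANSWER_KEY, threshold=0.85):
    """Return the keyed score for an exact or near match;
    None means 'route this answer to a human marker'."""
    norm = normalize(answer)
    if norm in key:
        return key[norm]
    close = difflib.get_close_matches(norm, key.keys(), n=1, cutoff=threshold)
    return key[close[0]] if close else None

print(auto_mark("Chest X-Ray"))   # 1.0, exact after normalization
print(auto_mark("chest x ray"))   # 1.0, near match
print(auto_mark("abdominal CT"))  # None -> human review
```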
  11. | Introduction – Flexible exam delivery: How can we offer our exams more frequently and with a greater degree of flexibility?
     • By increasing item pool size using AIG
       ◦ Supplementing/re-envisioning current MCC test development (TD) processes to create large pools of items in targeted areas
       ◦ AIG might not only allow us to better meet ARTF Recommendation #3 but also enable us to address other needs (self-assessments, progress tests, etc.)
  12. | Innovations – Automated item generation: What is AIG?
     • Automated item generation (AIG) is the process of using item models to generate test items with the aid of computer technology
     • AIG uses a 3-stage process for generating items, in which the cognitive mechanism required to solve the items is identified and manipulated to create new items (the "cognitive map")
  13. | Innovations – Automated item generation: The cognitive model includes three key outcomes:
     1. Identification of the problem (e.g., post-operative fever)
     2. Specification of the sources of information required to diagnose the problem (e.g., type of surgery, physical examination, etc.)
     3. Description of the elements within each information source (e.g., guarding and rebound, fever, calf tenderness, etc.) needed to create different instances of the problem
  14. | Innovations – Automated item generation: Item model
     • Item models are created using the cognitive model content; an item model is like a template, a rendering, or a mold of the assessment task (i.e., a target where we want to place the content for the item). A generation sketch follows this slide.
     • Example stem: "A 54-year-old woman has a <Type of Surgery>. On post-operative day <Timing of Fever>, the patient has a temperature of 38.5 °C. Physical examination reveals <Physical Examination>. Which one of the following is the best next step?"
     • Type of Surgery: gastrectomy, right hemicolectomy, left hemicolectomy, appendectomy, laparoscopic cholecystectomy
     • Timing of Fever: 1 to 6 days
     • Physical Examination: red and tender wound, guarding and rebound, abdominal tenderness, calf tenderness
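To illustrate the generation step, the item model above can be filled mechanically by substituting each variable slot with every element from its list. This is a minimal sketch: the cognitive-map constraints that filter out clinically incoherent combinations, and the keying of options and correct answers, are omitted.

```python
from itertools import product

# Stem and variable lists taken from the item model on the slide.
STEM = ("A 54-year-old woman has a {surgery}. On post-operative day {day}, "
        "the patient has a temperature of 38.5 C. Physical examination "
        "reveals {finding}. Which one of the following is the best next step?")

SURGERIES = ["gastrectomy", "right hemicolectomy", "left hemicolectomy",
             "appendectomy", "laparoscopic cholecystectomy"]
DAYS = range(1, 7)  # timing of fever: post-operative days 1 to 6
FINDINGS = ["a red and tender wound", "guarding and rebound",
            "abdominal tenderness", "calf tenderness"]

stems = [STEM.format(surgery=s, day=d, finding=f)
         for s, d, f in product(SURGERIES, DAYS, FINDINGS)]

print(len(stems))  # 5 x 6 x 4 = 120 candidate stems
print(stems[0])
```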
  15. | Innovations – Automated item generation: Lessons learned
     • The MCC has been working on AIG with University of Alberta researchers for the past 5+ years
     • Thousands of items have been generated across 50+ cognitive maps
     • Predictive identification accuracy ranged from 32% to 52% across 4 experts, with an average accuracy rate of 42%
     • Experts cannot systematically differentiate AIG items from traditionally written items
     • Piloted AIG items cover shallow areas of our pool very well
       ◦ On average, AIG items are more difficult and more discriminating (based on classical and IRT statistics)
       ◦ This is directly attributable to the AIG process
  16. | Innovations – Automated item generation: Next steps
     • Undertake 2 additional AIG content development workshops in May and September 2015 (~20 new cognitive maps)
     • Pretest ~70 AIG items in the spring 2015 MCCQE Part I exam cycle, selected from the 2014 cognitive maps and generated items
     • Create several apps that will further automate the AIG process and allow us to fully transition AIG to the MCC
     • Finalize a cost-benefit analysis of AIG vs. traditionally written items
  17. | Innovations – Streamlining CDM marking: The challenge
     • The MCCQE Part I has a Clinical Decision Making (CDM) component that includes write-in answers marked by physician markers
     • Spring administration: 4,200 candidates each answer 86 scored questions
     • That is 4,200 × 86 = 361,200 answers to score, and with 2 rounds to ensure quality, over 700,000 answers are marked
     • 50 physicians are required for marking
     • This makes more frequent and flexible test administrations difficult to sustain
  18. | Innovations – Streamlining CDM marking: Our solution (a sketch of the aggregation step follows this slide)
     • Re-think our approach
     • Combine all identical answers
     • Display aggregated answers along with their number of occurrences
     • Allow for one-click bulk marking
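The aggregation idea is simple to sketch: collapse responses that are identical (here after a light, hypothetical normalization), show each distinct answer once with its frequency, and let one marking decision propagate to every candidate who gave that answer.

```python
from collections import Counter

# Hypothetical raw write-in responses to one CDM question.
responses = ["Chest X-ray", "chest x-ray", "ECG", "Chest X-ray ",
             "ecg", "CT abdomen"]

def aggregate(responses):
    """Collapse identical answers (after trimming and lowercasing)
    and count occurrences, most frequent first."""
    return Counter(r.strip().lower() for r in responses).most_common()

for answer, count in aggregate(responses):
    print(f"{count:>4}  {answer}")
# Each distinct answer is scored once; the mark is then applied
# to every candidate who gave it.
```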
  19.–22. | Innovations – [image-only slides; no text to recover]
  23. | Innovations – Streamlining CDM marking: Results
     • Marked the examination in roughly half the time
     • Improved quality
     • Provides insight into question development (well-formed questions prompt fewer distinct responses)
     • Happier, less fatigued markers
     • Reduced marking costs
  24. | Innovations – Integrated item bank: The challenge
     • Items from the MCC's 5 exams are managed in 5 separate systems
     • Varied item types: MCQ, CDM, OSCE
     • Varied business processes for item development and management, constrained by those systems
     • We wanted a single, consistent system and set of processes
     • No product on the market meets our needs (bilingual, committee-driven item development)
  25. | Innovations – Integrated item bank: Our solution (an illustrative data-model sketch follows this slide)
     • Writing our own: "MOC5"
     • Development began late last year
     • The OSCE item bank is to be ready for the Fall
     • MCQs and CDMs to follow
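MOC5's internal design isn't shown in the deck; purely to illustrate the requirements listed above (one schema spanning MCQ, CDM and OSCE items, bilingual content, committee-driven development), a hypothetical item record might look like this:

```python
from dataclasses import dataclass, field
from enum import Enum

class ItemType(Enum):
    MCQ = "mcq"
    CDM = "cdm"
    OSCE = "osce"

@dataclass
class Item:
    item_id: str
    item_type: ItemType
    exam: str                                    # which of the 5 MCC exams owns it
    stem: dict = field(default_factory=dict)     # bilingual text, e.g. {"en": ..., "fr": ...}
    status: str = "draft"                        # e.g. draft -> committee review -> approved
    history: list = field(default_factory=list)  # committee decisions over time

item = Item("QE1-0001", ItemType.MCQ, exam="MCCQE Part I",
            stem={"en": "A 54-year-old woman has a gastrectomy. ...",
                  "fr": "Une femme de 54 ans subit une gastrectomie. ..."})
```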
  26.–31. | Innovations – [image-only slides; no text to recover]
  32. | Innovations – Integrated item bank
     • Development well underway
     • 3 phases: Fall 2015 – OSCE items; Spring 2016 – MCQ items; Fall 2016 – CDM items
  33. | Other developments – Orientation
     • Practice Ready Assessment – online orientation
     • Two themes: culture & communication
     • Tool repurposing
     • Working group review
     • Re-filming, translation and launch of the web platform
  34. | Other developments – Orientation case topics
  35. | Other developments – Orientation
     • Coming to a website near you! (September 2015)
     • Pilots with PRA programs
     • Integration with programs' current orientation activities
     • Broader adoption and use
  36. | Other developments – Preparatory materials
     • In support of the international delivery of the MCCQE Part I:
       ◦ Enhanced self-assessments
       ◦ Prep guide
       ◦ Orientation modules
     • Expected launch: 2018–2019
  37. | Q & A
  38. | THANK YOU! Dr. Ian Bowmer (ibowmer@mcc.ca), Dr. André De Champlain (adechamplain@mcc.ca), Stephen Abraham (sabraham@mcc.ca), Jessica Hertzog-Grenier (jhertzog@mcc.ca)
