Computer based assessment of clinical reasoning (Heidelberg 2012)

Transcript

  • 1. Computer Based Assessment of Clinical Reasoning
    A cooperative project between 3 Dutch medical schools
    Mathijs Doets, MSc, Erasmus University Medical Center (Rotterdam, The Netherlands)
  • 2. Presentation outline
    • Project overview and goals
    • Clinical reasoning in medical education
    • Project activities and results
    • Pilots
    • Future directions
  • 3. Project participants
    AMC Amsterdam: 2,300 medical students; 250 doctors graduate yearly
    UMC Utrecht: 2,100 medical students; 275 doctors graduate yearly
    Erasmus MC Rotterdam: 2,400 medical students; 280 doctors graduate yearly
    Organisation for ICT in higher education
  • 4. Project overview
    • March 2011 – March 2013
    • Different phases:
      – Literature study
      – Determine formats for assessment
      – Training of teachers
      – Development of questions
      – Technical infrastructure
      – Pilots
      – Implementation in curriculum
  • 5. What is clinical reasoning?
    Clinical reasoning is a core competence of a doctor and a key element of medical education.
    Definition: the process through which the physician defines a patient's most probable diagnosis and determines further management.
    To answer a clinical reasoning question, students must have the underlying knowledge and be able to apply this knowledge to solve a patient's problem.
  • 6. Assessment of clinical reasoning
    Assessment of clinical reasoning is very often paper based or oral.
    Disadvantages:
    • Very labour intensive
    • Oral exams are often subjective if not very well structured
  • 7. Aim of the project
    Reduce the workload for teachers (and staff) in high-quality assessment of clinical reasoning, by developing a more objective and more efficient form of assessment of clinical reasoning.
    Solution
    • Joint development of questions:
      – collaborating per discipline
      – analysis and validation of test items
    • Making the assessments computer based:
      – creating a shared question database
      – developing good digital formats
  • 8. Literature study
    Research question: which question types are preferred from an educational and practical point of view, and are suitable for CBA of clinical reasoning in large groups of undergraduate medical students?
    Nine suitable question types were identified for assessment of clinical reasoning:
    • Extended Matching Question (EMQ)
    • Comprehensive Integrative Puzzle (CIP)
    • Script Concordance Test (SCT)
    • Modified Essay Question (MEQ)
    • Short Answer Question (SAQ)
    • Long Menu Question (LMQ)
    • Multiple Choice Question (MCQ)
    • (Multiple) True/False Question (MTFQ)
    • Virtual Patients (VP)
  • 9. Summary of findings
    • Each question type has a different focus and assesses different aspects of clinical reasoning
    • Regardless of the chosen question type, patient vignettes should be used as a standard stimulus format to assess clinical reasoning
    • MEQ and SAQ cannot be scored automatically and therefore do not reduce workload
    • LMQ is not suitable for answers longer than one word
    • The validity of MTFQ and VP is questionable
    • Differentiated scoring yields higher discrimination values and increases reliability (SCT and CIP)
    • The need for a panel to define the scoring instructions for SCT may be a challenge
    • Teachers tend to use MCQs to test knowledge only
  • 10. Conclusions of the literature study
    • Combine CIP and EMQ to assess clinical reasoning, because this combination:
      – covers most aspects of clinical reasoning
      – produces valid and reliable test results
      – is suitable for use in CBA (automatic scoring)
    • Regardless of the chosen question type, patient vignettes should be used as a standard stimulus format, e.g.: "You are a physician and are seeing a 52 year old man. He has had increasing dyspnea and cough productive of purulent sputum for 2 days (...) Which of the following is the most likely diagnosis?"
    • Article to be published in the Dutch Journal for Medical Education
  • 11. Extended Matching Question (EMQ)
    • Starts with a theme or clinical problem
    • A list of 8-15 options
    • A question
    • 2 or 3 patient vignettes
    • Scoring: points for each correct answer; multiple answers may be correct
    EMQ example: Case and Swanson, 1996
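    As an illustration of the EMQ format above, here is a minimal sketch of how such an item might be represented and scored automatically in a CBA system. All names and the scoring rule are illustrative assumptions, not the project's actual data model.

        # Minimal sketch of an EMQ item with automatic scoring (illustrative only).
        from dataclasses import dataclass

        @dataclass
        class EMQVignette:
            text: str                  # patient vignette shown to the student
            correct_options: set[str]  # multiple answers may be correct

        @dataclass
        class EMQItem:
            theme: str                 # clinical theme or problem
            options: list[str]         # shared list of 8-15 options
            question: str              # the lead-in question
            vignettes: list[EMQVignette]

        def score_emq(item: EMQItem, answers: list[set[str]]) -> int:
            """One point for each correct option chosen, per vignette."""
            return sum(
                len(given & vignette.correct_options)
                for vignette, given in zip(item.vignettes, answers)
            )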
  • 12. Comprehensive Integrative Puzzle (CIP)
    • Combines data from medical history, physical examination, diagnostic tests and treatment into a logical, coherent patient case, given a diagnosis (1st column)
    • Matrix from 4x4 up to 7x7 rows/columns
    • Alternatives are presented for each column
    • Answer options may be used once, more than once, or not at all (reduces guessing)
    • Scoring: one point per cell, or points for a complete row: to be determined in the pilot
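    The open scoring question on this slide (one point per cell versus points for a complete row) can be made concrete with a small sketch; the parameterisation below is an assumption for illustration, not the rule the project adopted.

        # Sketch of the two CIP scoring variants to be compared in the pilot.
        def score_cip(answer_grid, key_grid, per_cell=1, row_bonus=0):
            """Score a CIP matrix.

            answer_grid / key_grid: list of rows, each a list of chosen/correct
            option labels per column (the given diagnosis column is excluded).
            per_cell: points per correctly filled cell.
            row_bonus: extra points when an entire row is correct.
            """
            total = 0
            for answers, key in zip(answer_grid, key_grid):
                correct = sum(a == k for a, k in zip(answers, key))
                total += correct * per_cell
                if correct == len(key):
                    total += row_bonus
            return total

        # Per-cell scoring:  score_cip(grid, key)
        # Row-only scoring:  score_cip(grid, key, per_cell=0, row_bonus=1)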
  • 13. Training of teachers
    • Online instruction on writing good patient vignettes and constructing CIP and EMQ questions
    • Workshop for teachers
  • 14. Technical infrastructure: store and share questions
    • Search for a system which:
      – stores CIP and EMQ
      – allows searching by discipline and topic
      – supports reviewing questions between medical schools
    • Cooperation with IMS from April 2012
    • Implementation of CIP in summer 2012
    • Pilot with users, September–December 2012
    • Export to assessment systems: QMP, Testvision, Blackboard
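    To give an idea of the metadata such a system needs in order to meet the search and review requirements above, here is a minimal sketch; the field names are hypothetical, as the actual IMS data model is not described in the slides.

        # Hypothetical question-bank record supporting search by discipline/topic
        # and cross-school review; illustrative only.
        from dataclasses import dataclass, field

        @dataclass
        class QuestionRecord:
            question_id: str
            question_type: str             # "CIP" or "EMQ"
            discipline: str                # e.g. internal medicine
            topic: str                     # e.g. dyspnea
            author_school: str             # AMC, UMC Utrecht or Erasmus MC
            review_comments: list[str] = field(default_factory=list)

        def search(bank, discipline=None, topic=None):
            """Filter a question bank by discipline and/or topic."""
            return [q for q in bank
                    if (discipline is None or q.discipline == discipline)
                    and (topic is None or q.topic == topic)]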
  • 15. Pilot assessment
    55 students (in the clinical phase) from all 3 medical schools took a voluntary assessment in a prototype assessment system, consisting of 4 CIP and 11 EMQ on different topics.
    • Students were enthusiastic
    • The assessment was moderately reliable (alpha 0.625)
    • Items differed in difficulty and discriminatory power
    • The complete analysis will be available in the coming months
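    The reliability figure quoted above is presumably Cronbach's alpha; for reference, the standard computation from an item-score matrix looks like this (generic formula code, not the project's analysis pipeline).

        # Cronbach's alpha for a students-by-items score matrix (standard formula).
        import numpy as np

        def cronbach_alpha(scores: np.ndarray) -> float:
            """scores: 2-D array, rows = students, columns = items."""
            k = scores.shape[1]                         # number of items
            item_vars = scores.var(axis=0, ddof=1)      # per-item variance
            total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
            return k / (k - 1) * (1 - item_vars.sum() / total_var)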
  • 16. Future project activities
    • Development of more questions
    • Entering and classifying questions in IMS
    • Reviewing questions through IMS
    • Implementation of assessments in the curriculum
      – Clinical part (internships)
      – Different choices for each medical school (by discipline or integrated assessments)
    • Lasting cooperation beyond the project
  • 17. Questions?
  • 18. References
    • Ber, R. (2003). The CIP (Comprehensive Integrative Puzzle) assessment method. Medical Teacher, 25, 171-176.
    • Beullens, J., Struyf, E., & Van Damme, B. (2005). Do extended matching multiple-choice questions measure clinical reasoning? Medical Education, 39, 410-417.
    • Beullens, J., Van Damme, B., Jaspaert, H., & Janssen, P.J. (2002). Are extended-matching multiple-choice items appropriate for a final test in medical education? Medical Teacher, 24(4), 390-395.
    • Bhakta, B., Tennant, A., Horton, M., Lawton, G., & Andrich, D. (2005). Using item response theory to explore the psychometric properties of extended matching questions examination in undergraduate medical education. BMC Medical Education, 5, 1-9.
    • Case, S.M., Swanson, D.B., & Ripkey, D.R. (1994). Multiple-choice question strategies: comparison of items in five-option and extended-matching formats for assessment of diagnostic skills. Academic Medicine, 69(10), 1-3.
    • Coderre, S.P., Harasym, P., Mandin, H., & Fick, G. (2004). The impact of two multiple-choice question formats on the problem-solving strategies used by novices and experts. BMC Medical Education, 4(23), 1-9.
    • Groothoff, J.W., Frenkel, J., Tytgat, G.A., Vreede, W.B., Bosman, D.K., & ten Cate, O.Th.J. (2008). Growth of analytical thinking skills over time as measured with the MATCH test. Medical Education, 42(10), 1037-1043.
    • Samuels, A. (2006). Extended Matching Questions and the Royal Australian and New Zealand College of Psychiatrists written examination: an overview. Australasian Psychiatry, 14(1), 63-66.
    • Schuwirth, L.W.T., & van der Vleuten, C.P.M. (2003). ABC of learning and teaching in medicine: written assessment. BMJ, 326, 643-645.
    • Schuwirth, L.W.T., & van der Vleuten, C.P.M. (2004). Different written assessment methods: what can be said about their strengths and weaknesses? Medical Education, 38, 974-979.