Simulation - CORD Home Page

1. Simulation for Emergency Medicine
   CORD Academic Assembly, March 4th, 2006
2. Steve McLaughlin, MD: EM Program Director; Medical Director, 'BATCAVE' Simulation Center, University of New Mexico
   Mary Jo Wagner, MD: EM Program Director, Synergy Medical Education Alliance, Michigan State University
3. Objectives
   • Describe the current state of education and research in simulation.
   • List the various simulators, mannequins and models available for emergency medicine training.
   • Discuss the strengths and weaknesses of each simulation modality.
   • List some of the best-practice examples for using simulation in EM residencies.
4. Outline
   • Introduction
   • Spectrum of Simulation Equipment
   • Best Practice Examples
   • Hands-on Practice
5. Introduction
   • "Simulation is the act of mimicking a real object, event or process."
   • "Simulation is a person, device or set of conditions which present evaluation problems authentically. The student responds to the problems as they would under natural circumstances."
6. Introduction
   • Characteristics
     - Cues and consequences are like reality
     - Situations can be complex
     - Fidelity (exactness of duplication) is not perfect
     - Feedback is given to users' questions, decisions, and actions
7. Introduction
   • History
     - 1928: Edwin Link develops the first flight simulator, the "Link Trainer"
     - 1960: Laerdal introduces the first "Resusci Anne"
     - 1968: "Harvey" cardiology simulator
     - 1970: First power plant simulators
     - 1973: First computer-aided modeling of physiology
     - 1975: Standardized patients and OSCEs introduced
     - 1988: First full-body, computerized mannequin at Stanford
     - 1989: Anesthesia Crisis Resource Management (ACRM); anesthesia focuses on the patient safety and education movement at this time
     - 1990: The term "virtual reality" is introduced; screen-based simulators introduced
8. Link Trainer
9. Harvey
10. Introduction
    • History (continued)
      - 1991-1993: A total of 30 articles on high-fidelity simulation
      - 1993: First national/international simulation meetings (MMVR)
      - 1994: Boston Center for Medical Simulation
      - 1997: MIST VR task trainer
      - 1998: AAMC MSOP
      - Late 1990s: Introduction of simulation into specialties such as EM
      - Late 1990s: US IOM "To Err Is Human" report
      - 2000-2001: Current generation of full-body mannequins introduced by METI and Laerdal
      - 2000-2003: A total of 385 articles on high-fidelity simulation
      - 2005: Society for Medical Simulation
      - 2006: Simulation in Healthcare journal
11. Introduction
    • Why is this a valuable tool? Or is it?
      - Learners can learn without risk to patients.
      - Learning can be focused without regard to patient care needs, safety, etc.
      - Opportunity to repeat a lesson or skill to mastery.
      - Specific learning opportunities are guaranteed.
      - Learning can be done at convenient times.
      - Performance can be observed and recorded.
12. Introduction
    • Why is simulation important in medical education?
      - Problems with clinical teaching
      - New technologies for diagnosis/treatment
      - Assessing professional competence
      - Medical errors and patient safety
      - Deliberate practice
13. Introduction
    • CORD Consensus Conference
      - Simulation is a useful tool for assessing competence, especially patient care, interpersonal skills, and systems-based practice.
      - There is a lack of evidence to support the use of simulation for high-stakes assessment.
      - Definitions of competence and tools to evaluate performance must be developed and tested.
      - Scenarios and evaluation tools should be standardized.
14. Introduction
    • ACGME 'Toolbox of Assessment Methods'
      - Simulation is rated "the best" or "second best" tool for assessing:
        - Medical procedures
        - Ability to develop and carry out patient management plans
        - Investigative/analytical thinking
        - Knowledge/application of basic sciences
        - Ethically sound practice
15. Introduction
    • LCME Requirements
      - Allow simulated patients to count toward student exposure to particular cases.
    • RRC Requirements
      - Allow simulated procedures to count toward program/individual totals.
      - Very helpful for rare procedures.
16. Introduction
    • Simulation is one tool (new, expensive and exciting) in our educational repertoire, similar to lecture, case discussion, skill lab, MCQ, SP, etc.
17. Outline
    • Introduction
    • Spectrum of Simulation Equipment
    • Best Practice Examples
    • Hands-on Practice
18. Available Simulation Equipment
    • Standardized Patients
    • Improvised Technology
    • Screen-Based Simulation
    • Task Trainers
    • Low/Mid/High Fidelity Mannequins
    • Virtual Reality
19. Evaluating Simulators
    • Usability
    • Validity
      - Face, Content, Construct, Concurrent, Predictive
    • Transfer
    • Efficiency
    • Cost
    • Evidence
20. Standardized Patients
    • Individuals trained to portray a specific illness or behavior in a realistic and consistent manner for the purposes of teaching or assessment.
    • Used in the classroom setting, or unannounced in the clinical setting.
    • Especially useful to teach and assess the communication and professionalism competencies in a standardized manner.
21. Standardized Patients
    • Initially started in the 1980s
    • Now: Association of Standardized Patient Educators
      - http://www.aspeducators.org/sp_info.htm
    • Required clinical skills testing for all students
      - USMLE Step 2 CS exam
    [Image: University of South Florida standardized patient]
22. Standardized Patients
    • Strengths
      - Can consistently reproduce a clinical scenario for standardized testing of learners
      - Ability to assess rare conditions not otherwise reliably seen
      - Patients trained to provide objective & accurate feedback
      - Can use in real settings (arrive at office/ED as 'real' patient for realistic environment)
23. Standardized Patients
    • Weaknesses
      - Little research on effectiveness
        - Most studies are from preclinical medical school education
        - Few studies done with residents or practitioners, and nearly all have small numbers (15-50)
      - Cost to pay & time to teach standardized patients
      - Quality of experience heavily dependent upon training of the patient and scenarios developed
24. Standardized Patients: Audience Comments
25. Improvised Technology
    • Models made of easily available items
      - Closely mimic human tissue
      - Allow for a near replica of actual procedural steps
    • Generally used for instruction of procedures
    • Commonly used examples
      - Slab of ribs to teach insertion of chest tubes
      - Pigs' feet or head for suturing practice
    • Other examples in the literature
      - Jello for a vascular model
      - Lasagna for split-skin graft harvesting
26. Animal Models
27. Improvised Technology: Educational Theory
    • Cognitive process for learning a procedure
      - Understanding of indications, contraindications & complications
      - Knowledge of equipment used for the procedure
      - Step-by-step knowledge of the technical procedure
      - Identifying anatomical landmarks and 'tissue clues'
        - e.g., the "pop" felt when entering the dura or peritoneal cavity
28. Improvised Technology: Educational Theory
    • Improvised technology is useful to teach:
      - Knowledge of equipment
      - Step-by-step knowledge of the procedure
      - Some 'tissue clues'
    • It is less useful for:
      - Anatomical landmarks
    • "Greatest predictor of procedural competency … was the ability to sequentially order procedural steps"
      - Chapman DM, et al. Open thoracotomy procedural competency: validity study of teaching and assessment modalities. Ann Emerg Med 1996;28:641-7.
29. Improvised Technology
    • Strengths
      - Cheap!
      - Easily made available at all sites
      - Easy to duplicate for repetitive use or numerous users
      - Minimal instructor education needed
      - Ability to create models otherwise not available
        - e.g., resuscitative thoracotomy
30. Improvised Technology
    • Weaknesses
      - Almost no research on effectiveness
      - Less 'real-life' experience, so the stress factor is removed
      - Often does not duplicate the most difficult aspect of a procedure (e.g., the obese patient)
      - Static devices, so useful only for specific procedures, not actively changing clinical scenarios
31. Examples: Vascular model
    A = sock (skin); B = film canister (support); C = foam curler (connective tissue); D = straw (vessel)
32. Examples: DPL model
    A = fine fabric (peritoneum); B = foam (connective tissue); C = shower curtain (skin); D = PVC pipe (intestines); E = umbilicus marking
33. Examples: Lumbar puncture model
    A = box (spinous process); B = film canister (lateral masses); b = lid of film canister; C = foam curler (connective tissue); D = dural "pop" from packing bubbles; not seen: pillow (muscular layer)
34. Examples: Thoracotomy model
    A = shower curtain (skin); B = foam (connective tissue); C = laundry basket (rib cage); D = clips; E = packing air bag (lungs); F = ice cube tray (spine); G = plastic bag (pericardium) with tape (phrenic nerve); H = covered football (heart) with hole; I = tubing (esophagus) with NG tube in place; J = tubing (aorta)
35. Examples: Thoracotomy model
36. Improvised Technology: Audience Comments
37. Screen-Based Simulation
    • Desktop computer
    • Strengths: low cost, distance learning, variety of cases, improving realism, self-guided
    • Weaknesses: procedural skills, teamwork skills
38. Screen-Based Simulation
    • Laerdal Microsim
    • www.anesoft.com
      - ACLS
      - Critical Care
      - Anesthesia
      - Sedation
      - Neonatal
39. Screen-Based Simulation: Audience Comments
40. Task Trainers
    • Devices designed to simulate a specific task or procedure.
    • Examples:
      - Laparoscopy simulator
      - Bronchoscopy simulator
      - "TraumaMan"
      - Artificial knee
41. Task Trainers
42. Task Trainers
    • Strengths
      - High fidelity, good research on efficacy, may have self-guided teaching, metrics available
    • Weaknesses
      - Poor haptics on most machines, expensive, focus on a single task, not integrated into complete patient care
43. Task Trainers: Audience Comments
44. Low Fidelity Mannequins
    • Features:
      - Static airways
      - +/- rhythm generation
      - No/minimal programmed responses
    • Strengths: low cost, reliable, easy to use, portable
    • Weaknesses: limited features, less interactive, instructor required
45. Low Fidelity Mannequins: Examples
46. Mid Fidelity Mannequins
    • Relatively new class of mannequins, often used for ACLS training.
    • Features:
      - Active airways: ETT, LMA, Combitube
      - Breathing, pulses, rhythms
      - Basic procedures: pacing, defibrillation
      - Some automated responses and programmed scenarios
47. Mid Fidelity Mannequins
    • Strengths:
      - Active airways, somewhat interactive, moderate cost, moderate portability
    • Weaknesses:
      - Requires a semiskilled instructor; limited advanced procedures (lines, chest tubes)
48. High Fidelity Mannequins
    • A mannequin with electrical and pneumatic functions driven by a computer.
    • Adult, child and newborn models
    • Features:
      - Dynamic airways, reactive pupils
      - Heart sounds, lung sounds, chest movement
      - Pulses, rhythms, vital signs
      - Abdominal sounds, voice
      - CO2 exhalation, cardiac output, invasive pressures
      - Bleeding, salivation, lacrimation
49. High Fidelity Mannequins
    • Procedures
      - O2, BVM, oral/nasal airway, ETT, LMA, cric
      - Pericardiocentesis, PIV
      - Defibrillation, pacing, CPR
      - Needle or open thoracentesis
      - TOF, internal gas analysis
      - Foley placement
      - Reacts to medications
50. Features
51. Laerdal vs. METI
    • Laerdal
      - Instructor-programmed physiology changes
      - Windows
      - Terrific airway
      - Reliability
      - Ease of use
      - Cost: $35-45K
    • METI
      - Physiology modeled to respond to interventions
      - Macintosh
      - Drug recognition
      - Gas analyzer
      - Two cost levels: ECS $45K; HPS >$150K
55. High Fidelity Mannequins
    • Strengths
      - Many dynamic responses, preprogrammed scenarios, widest variety of procedures, most immersive
    • Weaknesses
      - Cost, procedures are not very realistic, reliability, lack of portability, significant instructor training required
56. Mannequins: Audience Comments
57. Virtual Reality
    • Advanced form of human-computer interaction
      - Allows humans to work in the computer's world
      - Environment understandable to us
    • Four necessary components
      - Software
      - Hardware
      - Input devices
      - Output devices
58. Input and Output Devices
59. Virtual Reality
    • Types of VR applicable to medicine
      - Immersive VR
      - Desktop VR
      - Pseudo-VR
      - Augmented reality
60. Immersive VR
61. Desktop VR
62. Pseudo-VR
63. Augmented Reality
64. Virtual Reality: Audience Comments
65. Outline
    • Introduction
    • Spectrum of Simulation Equipment
    • Best Practice Examples
    • Hands-on Practice
66. Research
    • Rapidly expanding body of literature since 2000.
    • First issue of 'Simulation in Healthcare' published January 2006.
    • Many articles are at the 'look at what we did' level, with data that say 'everyone thought it was nifty.'
    • Focus on best practices in teaching/learning and assessment using simulation.
67. Best Teaching Practices
    • Screen-based teaching with feedback is better than self-study.
      Schwid, H. A., G. A. Rooke, et al. (2001). "Screen-based anesthesia simulation with debriefing improves performance in a mannequin-based anesthesia simulator." Teaching & Learning in Medicine 13(2): 92-6.
      - We measured the effectiveness of screen-based simulator training with debriefing on the response to simulated anesthetic critical incidents.
      - The intervention group handled 10 anesthetic emergencies using the screen-based anesthesia simulator program and received written feedback on their management, whereas the traditional (control) group was asked to study a handout covering the same 10 emergencies.
      - All residents were then evaluated on their management of 4 standardized scenarios in a mannequin-based simulator using a quantitative scoring system.
      - Residents who managed anesthetic problems using a screen-based anesthesia simulator handled the emergencies in a mannequin-based anesthesia simulator better than residents who were asked to study a handout covering the same problems.
68. Best Teaching Practices
    • Comparing simulation to other teaching modalities demonstrates some slight advantages.
      Lee, S. K., M. Pardo, et al. "Trauma assessment training with a patient simulator: a prospective, randomized study." Journal of Trauma-Injury Infection & Critical Care 55(4): 651-7, 2003 Oct.
      - Interns (n = 60) attended a basic trauma course, and were then randomized to trauma assessment practice sessions with either the patient simulator (n = 30) or a moulage patient (n = 30). After practice sessions, interns were randomized a second time to an individual trauma assessment test on either the simulator or the moulage patient.
      - Within randomized groups, mean trauma assessment test scores for all simulator-trained interns were higher when compared with all moulage-trained interns.
      - Use of a patient simulator to introduce trauma assessment training is feasible and compares favorably to training in a moulage setting.
69. Best Teaching Practices
    • Simulation can be an effective replacement for live practice for some skills.
      Hall, R. E., J. R. Plant, et al. (2005). "Human Patient Simulation Is Effective for Teaching Paramedic Students Endotracheal Intubation." Acad Emerg Med 12(9): 850-855.
      - Paramedic students (n = 36) with no prior ETI training received identical didactic and mannequin teaching. After randomization, students were trained for ten hours on a patient simulator (SIM) or with 15 intubations on human subjects in the OR. All students then underwent a formalized test of 15 intubations in the OR.
      - When tested in the OR, paramedic students trained in ETI on a simulator were as effective as students trained on human subjects.
70. Best Teaching Practices
    • Learner-centered teaching with simulation.
      Gordon, J. A. and J. Pawlowski (2002). "Education on-demand: the development of a simulator-based medical education service." Academic Medicine 77(7): 751-2.
      - Using the simulator, we wanted to create a medical education service, like any other clinical teaching service, but designed exclusively to help students fill in the gaps in their own education, on demand. We hoped to mitigate the inherent variability of standard clinical teaching, and to augment areas of deficiency.
      - Upon arriving at the skills lab for their appointments, students would proceed to interview, evaluate, and treat the mannequin-simulator as if it were a real patient, using the instructor for assistance as needed. All students participated in an educational debriefing after each session.
      - Customized, realistic clinical correlates are now readily available for students and teachers, allowing reliable access to "the good teaching case."
71. Best Teaching Practices
    • Cheap may be as good as expensive.
      Keyser, E. J., A. M. Derossis, et al. (2000). "A simplified simulator for the training and evaluation of laparoscopic skills." Surg Endosc 14(2): 149-53.
      - The purpose of this study was to compare a simplified mirrored-box simulator to the video-laparoscopic cart system.
      - 22 surgical residents performed seven structured tasks in both simulators in random order. Scores reflected precision and speed.
      - There were no significant differences in mean raw scores between the simulators for six of the seven tasks.
      - A mirrored-box simulator was shown to provide a reasonable reflection of relative performance of laparoscopic skills.
72. Best Teaching Practices
    • Team behavior can be affected by focused simulation experiences.
      Shapiro, M. J., J. C. Morey, et al. (2004). "Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum?" Quality & Safety in Health Care 13(6): 417-21.
      - ED staff who had recently received didactic training in the Emergency Team Coordination Course (ETCC) also received an 8-hour intensive experience in an ED simulator in which three scenarios of graduated difficulty were encountered. A comparison group, also ETCC trained, was assigned to work together in the ED for one 8-hour shift.
      - Experimental and comparison teams were observed in the ED before and after the intervention.
      - The experimental team showed a trend towards improvement in the quality of team behavior (p = 0.07); the comparison group showed no change in team behavior during the two observation periods (p = 0.55).
73. Best Teaching Practices
    • Innovative use of two new technologies helps to engage learners in a large-group setting.
      Vozenilek, J., E. Wang, et al. (2005). "Simulation-based Morbidity and Mortality Conference: New Technologies Augmenting Traditional Case-based Presentations." Acad Emerg Med: j.aem.2005.08.015.
      - Two separate technologies were enlisted: a METI high-fidelity patient simulator to re-create the case in a more lifelike fashion, and an audience response system to collect clinical impressions throughout the case presentation and survey data at the end of the presentation.
      - The re-creation of the patient encounter, with all relevant physical findings displayed in high fidelity and with relevant laboratory data, nursing notes, and imaging as they occurred in the actual case, provides a more engaging format for the resident-learner.
74. Best Teaching Practices
    • Orientation
    • Introduction to the session
      - Expectations
      - What is real / what is not
    • Self-assessment
    • Debriefing
    • Evaluation
75. How To Best Use Simulation
    • Provide feedback
    • Give opportunities for repetitive practice
    • Integrate simulation into the overall curriculum
    • Provide increasing levels of difficulty
    • Provide clinical variation in scenarios
    • Control the environment
    • Provide individual and team learning
    • Define outcomes and benchmarks
76. Best Teaching Practices: Audience Comments
77. Best Assessment Practices
    • Simulation has some data to support its use as an assessment modality.
      Schwid, H. A., G. A. Rooke, et al. (2002). "Evaluation of anesthesia residents using mannequin-based simulation: a multiinstitutional study." Anesthesiology 97(6): 1434-44.
      - 99 anesthesia residents consented to be videotaped during their management of four simulated scenarios on MedSim or METI mannequin-based anesthesia simulators.
      - Construct-related validity of mannequin-based simulator assessment was supported by an overall improvement in simulator scores from CB and CA-1 to CA-2 and CA-3 levels of training.
      - Criterion-related validity was supported by moderate correlation of simulator scores with departmental faculty evaluations (0.37-0.41, P < 0.01), ABA written in-training scores (0.44-0.49, P < 0.01), and departmental mock oral board scores (0.44-0.47, P < 0.01).
      - Reliability of the simulator assessment was demonstrated by very good internal consistency (alpha = 0.71-0.76) and excellent interrater reliability (correlation = 0.94-0.96; P < 0.01; kappa = 0.81-0.90).
78. Best Assessment Practices
    • Task trainers appear to be a valid method for assessing procedural competence.
      Adrales, G. L., A. E. Park, et al. (2003). "A valid method of laparoscopic simulation training and competence assessment." Journal of Surgical Research 114(2): 156-62.
      - Subjects (N = 27) of varying levels of surgical experience performed three laparoscopic simulations, representing appendectomy (LA), cholecystectomy (LC), and inguinal herniorrhaphy (LH).
      - Years of experience correlated directly with the skills ratings (all P < 0.001) and with the competence ratings across the three procedures (P < 0.01). Experience correlated inversely with the time for each procedure (P < 0.01) and the total technical errors across the three models (P < 0.05).
79. Best Assessment Practices
    • Multiple simulated encounters are needed to accurately assess resident abilities.
      Boulet, J. R., D. Murray, et al. (2003). "Reliability and validity of a simulation-based acute care skills assessment for medical students and residents." Anesthesiology 99(6): 1270-80.
      - The authors developed and tested 10 simulated acute care situations that clinical faculty at a major medical school expect graduating physicians to be able to recognize and treat at the conclusion of training. Forty medical students and residents participated in the evaluation of the exercises.
      - The reliability of the simulation scores was moderate and was most strongly influenced by the choice and number of simulated encounters.
      - Multiple simulated encounters, covering a broad domain, are needed to effectively and accurately estimate student/resident abilities in acute care settings.
80. Best Assessment Practices
    • Checklist scoring of videotaped performance can have a high degree of inter-rater reliability.
      Devitt, J. H., M. M. Kurrek, et al. (1997). "Testing the raters: inter-rater reliability of standardized anaesthesia simulator performance." Can J Anaesth 44(9): 924-8.
      - We sought to determine if observers witnessing the same event in an anaesthesia simulator would agree on their rating of anaesthetist performance.
      - Two one-hour clinical scenarios were developed, each containing five anaesthetic problems.
      - Videotape recordings were generated through role-playing, with each of the two scenarios recorded three times, resulting in a total of 30 events to be evaluated. Two clinical anaesthetists reviewed and scored each of the 30 problems independently.
      - The raters were in complete agreement on 29 of the 30 items. Inter-rater reliability was excellent (0.96, P < 0.001).
81. Best Assessment Practices
    • Validation that simulator performance correlates with real practice.
      Fried, G. M., A. M. Derossis, et al. (1999). "Comparison of laparoscopic performance in vivo with performance measured in a laparoscopic simulator." Surg Endosc 13(11): 1077-81; discussion 1082.
      - Twelve PGY3 residents were given a baseline evaluation in the simulator and in the animal model. They were then randomized to either five practice sessions in the simulator (group A) or no practice (group B). Each group was retested in the simulator and in the animal (final test).
      - Performance in an in vitro laparoscopic simulator correlated significantly with performance in an in vivo animal model. Practice in the simulator resulted in improved performance in vivo.
82. Best Assessment Practices
    • ABEM-type assessment tools measure performance equally well in oral or simulation environments.
      Gordon, J. A., D. N. Tancredi, et al. (2003). "Assessment of a clinical performance evaluation tool for use in a simulator-based testing environment: a pilot study." Academic Medicine 78(10 Suppl).
      - Twenty-three subjects were evaluated during five standardized encounters using a patient simulator.
      - Performance in each 15-minute session was compared with performance on an identical number of oral objective structured clinical examination (OSCE) sessions used as controls.
      - In this pilot, a standardized oral OSCE scoring system performed equally well in a simulator-based testing environment.
83. Best Assessment Practices
    • There are many aspects of human knowledge/skills/attitudes to assess, and the correct tool must be used for each one.
      Kahn, M. J., W. W. Merrill, et al. (2001). "Residency program director evaluations do not correlate with performance on a required 4th-year objective structured clinical examination." Teaching & Learning in Medicine 13(1): 9-12.
      - We surveyed program directors about the performance of 50 graduates from our medical school chosen to represent the highest (OSCE-HI) and lowest (OSCE-LO) 25 performers on our required 4th-year OSCE.
      - OSCE scores did not correlate with Likert scores for any survey parameter studied (r < .23, p > .13 for all comparisons). Similarly, program director evaluations did not correlate with class rank or USMLE scores (r < .26, p > .09 for all comparisons).
      - We concluded that program director evaluations of resident performance do not appear to correlate with objective tests of either clinical skills or knowledge taken during medical school.
84. Best Assessment Practices
    • "Softer" competencies like professionalism can be assessed with the aid of simulation technology.
      Gisondi, M. A., R. Smith-Coggins, et al. (2004). "Assessment of Resident Professionalism Using High-fidelity Simulation of Ethical Dilemmas." Acad Emerg Med 11(9): 931-937.
      - Each resident subject participated in a simulated critical patient encounter during an Emergency Medicine Crisis Resource Management course. An ethical dilemma was introduced before the end of each simulated encounter. Resident responses to that dilemma were scored against a checklist of critical actions.
      - Senior residents (second and third year) performed more checklist items than did first-year residents (p < 0.028 for each senior class).
      - Residents performed a critical action with 100% uniformity across training years in only one ethical scenario ("Practicing Procedures on the Recently Dead"). Residents performed the fewest critical actions and overall checklist items for the "Patient Confidentiality" case.
      - Although limited by small sample size, the application of this performance-assessment tool showed the ability to discriminate between experienced and inexperienced EM residents with respect to a variety of aspects of professional competency.
85. Best Assessment Practices
    • The scoring/evaluation system chosen to assess simulated performance is critical.
      Regehr, G., R. Freeman, et al. (1999). "Assessing the generalizability of OSCE measures across content domains." Academic Medicine 74(12): 1320-2.
      - Students' scores from three OSCEs at one institution were correlated to determine the generalizability of the scoring systems across course domains.
      - Analysis revealed that while checklist scores showed quite low correlations across examinations from different domains (ranging from 0.14 to 0.25), global process scores showed quite reasonable correlations (ranging from 0.30 to 0.44).
      - These data would seem to confirm the intuitions about each of these measures: the checklist scores are highly content-specific, while the global scores are evaluating a more broadly based set of skills.
86. Best Assessment Practices
    • Learners are smart and will figure out the game.
      Jones, J. S., S. J. Hunt, et al. (1997). "Assessing bedside cardiologic examination skills using 'Harvey,' a cardiology patient simulator." Acad Emerg Med 4(10): 980-5.
      - Objective: to assess the cardiovascular physical examination skills of emergency medicine (EM) housestaff and attending physicians.
      - Prospective cohort assessment of EM housestaff and faculty performance on 3 valvular abnormality simulations conducted on the cardiology patient simulator, "Harvey."
      - Forty-six EM housestaff (PGY1-3) and attending physicians were tested over a 2-month study period. Physician responses did not differ significantly among the different levels of postgraduate training.
      - Housestaff and faculty had difficulty establishing a correct diagnosis for simulations of 3 common valvular heart diseases. However, accurate recognition of a few critical signs was associated with a correct diagnosis in each simulation.
87. Best Assessment Practices
    • Determine what you want to assess.
    • Design a simulation that provokes this performance.
    • Observe/record the performance.
    • Analyze the performance using some type of rubric: checklist, GAS, etc.
    • Debriefing, feedback and teaching.
88. Best Assessment Practices: Audience Comments
89. Outline
    • Introduction
    • Spectrum of Simulation Equipment
    • Best Practice Examples
    • Hands-on Practice
90. Summary
    • Simulation is one tool (new, expensive and exciting) in our educational repertoire, similar to lecture, case discussion, skill lab, MCQ, SP, etc.
91. Summary
    • Provide feedback
    • Give opportunities for repetitive practice
    • Integrate simulation into the overall curriculum
    • Provide increasing levels of difficulty
    • Provide clinical variation in scenarios
    • Control the environment
    • Provide individual and team learning
    • Define outcomes and benchmarks
    • Determine what you want to assess
    • Design a simulation that provokes this performance
    • Observe/record the performance
    • Analyze the performance using some type of rubric: checklist, GAS, etc.
    • Debriefing, feedback and teaching
92. Outline
    • Introduction
    • Spectrum of Simulation Equipment
    • Best Practice Examples
    • Hands-on Practice
93. References
    • Issenberg SB, McGaghie WC, Petrusa ER, Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Medical Teacher. 2005;27:10-28.
    • Loyd GE, Lake CL, Greenberg RB. Practical Health Care Simulations. Philadelphia, PA: Elsevier-Mosby; 2004.
    • Bond WF, Spillane L, for the CORD Core Competencies Simulation Group. The use of simulation for emergency medicine resident assessment. Acad Emerg Med. 2002;9:1295-1299.
    • ACGME resources (accessed February 2, 2006):
      - www.acgme.org/Outcome/assess/Toolbox.pdf
      - www.acgme.org/Outcome/assess/ToolTable.pdf
94. Additional References
    1. Glassman PA, Luck J, O'Gara EM, Peabody JW. Using standardized patients to measure quality: evidence from the literature and a prospective study. Joint Commission Journal on Quality Improvement. 2000;26:644-653.
    2. Owen H, Plummer JL. Improving learning of a clinical skill: the first year's experience of teaching endotracheal intubation in a clinical simulation facility. Medical Education. 2002;36:635-642.
    3. Pittini R, Oepkes D, Macrury K, Reznick R, Beyene J, Windrim R. Teaching invasive perinatal procedures: assessment of a high fidelity simulator-based curriculum. Ultrasound in Obstetrics & Gynecology. 2002;19:478-483.
    4. Kurrek MM, Devitt JH, Cohen M. Cardiac arrest in the OR: how are our ACLS skills? Can J Anaesth. 1998;45:130-2.
    5. Schwid HA, Rooke GA, Ross BK, Sivarajan M. Use of a computerized advanced cardiac life support simulator improves retention of advanced cardiac life support guidelines better than a textbook review. Critical Care Medicine. 1999;27:821-824.
    6. Rosenblatt MA, Abrams KJ, New York State Society of Anesthesiologists Committee on Continuing Medical Education and Remediation. The use of a human patient simulator in the evaluation of and development of a remedial prescription for an anesthesiologist with lapsed medical skills. Anesthesia & Analgesia. 2002;94:149-153.
    7. Gisondi MA, Smith-Coggins R, Harter PM, Soltysik RC, Yarnold PR. Assessment of resident professionalism using high-fidelity simulation of ethical dilemmas. Acad Emerg Med. 2004;11:931-937.
    8. Schwid HA, Rooke GA, Michalowski P, Ross BK. Screen-based anesthesia simulation with debriefing improves performance in a mannequin-based anesthesia simulator. Teaching & Learning in Medicine. 2001;13:92-96.
    9. Gaba DM, Fish KJ, Howard SK. Crisis Management in Anesthesiology. New York: Churchill Livingstone; 1994.
    10. Reznek M, Smith-Coggins R, Howard S, Kiran K, Harter P, Sowb Y, Gaba D, et al. Emergency Medicine Crisis Resource Management (EMCRM): pilot study of a simulation-based crisis management course for emergency medicine. Acad Emerg Med. 2003;10:386-389.
    11. Small SD, Wuerz RC, Simon R, Shapiro N, Conn A, Setnik G. Demonstration of high-fidelity simulation team training for emergency medicine. Academic Emergency Medicine. 1999;6:312-323.
95. Additional References
    12. Shapiro MJ, Morey JC, Small SD, Langford V, Kaylor CJ, Jagminas L, Suner S, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Quality & Safety in Health Care. 2004;13:417-421.
    13. Berkenstadt H, Ziv A, Barsuk D, Levine I, Cohen A, Vardi A. The use of advanced simulation in the training of anesthesiologists to treat chemical warfare casualties. Anesthesia & Analgesia. 2003;96:1739-1742.
    14. Kyle RR, Via DK, Lowy RJ, Madsen JM, Marty AM, Mongan PD. A multidisciplinary approach to teach responses to weapons of mass destruction and terrorism using combined simulation modalities. Journal of Clinical Anesthesia. 2004;16:152-158.
    15. Kobayashi L, Shapiro MJ, Suner S, Williams KA. Disaster medicine: the potential role of high fidelity medical simulation for mass casualty incident training. Medicine & Health, Rhode Island. 2003;86:196-200.
    16. Kassirer JP, Kopelman RI. Learning Clinical Reasoning. First ed. Baltimore, MD: Williams and Wilkins; 1991.
    17. Bond WF DL, Kostenbader M, Worrilow CC. "Using Human Patient Simulation to Instruct Emergency Medicine Residents in Cognitive Forcing Strategies." Paper presented at: Innovation in Emergency Medical Education Exhibit, SAEM Annual Meeting; 2003; Boston.
    18. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Academic Emergency Medicine. 2002;9:1184-1204.
    19. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine. 2003;78:775-780.
    20. Boulet JR, Murray D, Kras J, Woodhouse J, McAllister J, Ziv A. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology. 2003;99:1270-1280.
    21. Bond WF, Spillane L. The use of simulation for emergency medicine resident assessment. Academic Emergency Medicine. 2002;9:1295-1299.
    22. Byrne AJ, Greaves JD. Assessment instruments used during anaesthetic simulation: review of published studies. British Journal of Anaesthesia. 2001;86:445-450.
    23. Gordon JA, Tancredi DN, Binder WD, Wilkerson WM, Shaffer DW. Assessment of a clinical performance evaluation tool for use in a simulator-based testing environment: a pilot study. Academic Medicine. 2003;78.
96. Additional References
    24. LaMantia J, Rennie W, Risucci DA, Cydulka R, Spillane L, Graff L, Becher J, et al. Interobserver variability among faculty in evaluations of residents' clinical skills. Academic Emergency Medicine. 1999;6:38-44.
    25. Morgan PJ, Cleave-Hogg D, McIlroy J, Devitt JH. Simulation technology: a comparison of experiential and visual learning for undergraduate medical students. Anesthesiology. 2002;96:10-16.
    26. Schwid HA, Rooke GA, Carline J, Steadman RH, Murray WB, Olympio M, Tarver S, et al. Evaluation of anesthesia residents using mannequin-based simulation: a multiinstitutional study. Anesthesiology. 2002;97:1434-1444.
    27. Beaulieu MD, R. M., Hudon E, Saucier D, Remondin M, Favreau R. Using standardized patients to measure professional performance of physicians. International Journal for Quality in Health Care. 2003;15(3):251-9.
    28. Chapman DM, Rhee KJ, Marx JA, Honigman B, Panacek EA, Martinez D, Brofeldt BT, Cavanaugh SH. Open thoracotomy procedural competency: validity study of teaching and assessment modalities. Annals of Emergency Medicine. 1996;28(6):641-7.
    29. Cubison TCS, Clare T. Lasagne: a simple model to assess the practical skills of split-skin graft harvesting and meshing. British Journal of Plastic Surgery. 2002;55(8):703-4.
    30. Davidson R, D. M., Rathe R, Pauly R, Watson RT. Using standardized patients as teachers: a concurrent controlled trial. Academic Medicine. 2001;76(8):840-3.
    31. Hance J, Aggarwal R, et al. Objective assessment of technical skills in cardiac surgery. European Journal of Cardio-Thoracic Surgery. 2005;28(1):157-62.
    32. Maran NJ, Glavin RJ. Low- to high-fidelity simulation: a continuum of medical education? Medical Education. 2003;37 Suppl 1:22-8.
    33. Nikendei C, Zeuch A, et al. Role-playing for more realistic technical skills training. Medical Teacher. 2005;27(2):122-6.
    34. Clauser BE, Clyman SG, et al. Are fully compensatory models appropriate for setting standards on performance assessments of clinical skills? Academic Medicine. 1996;71(1 Suppl):S90-2.
    35. Williams RG. Have standardized patient examinations stood the test of time and experience? Teaching & Learning in Medicine. 2004;16(2):215-22.
