METHODS FOR DEVELOPING ASSESSMENT INSTRUMENTS TO GENERATE USEFUL DATA IN THE PRESENCE OF VAGUE COURSE OBJECTIVES
Patrick B. Barlow, Tiffany L. Smith, Eric Heidel, PhD, William Metheny, PhD
On the Agenda
• SETTING THE SCENE: Who? What? Where? When?
• WHY? Assessment in graduate medical education
• FIVE PRACTICAL TIPS: How we addressed the problem
• USE OF ASSESSMENT RESULTS: How were these methods useful?
• DISCUSSION: Questions?
Office of Medical Education, Research, and Development (OMERAD)
• Job description: consultation and education
• What was happening with GME at our institution?
• New office structure
• PhD students in ESM brought in
• Office given the reins of the clinical research skills curricula
Assessment in Graduate Medical Education
WHY DOES THIS NEED TO BE ADDRESSED?
Evidence-Based Medicine in GME 1,2
What is EBM?
• Clinical epidemiology
• Biostatistics
• Critical appraisal
What We Know About Resident Knowledge of Clinical Research Skills
• The error rate in reporting and interpreting statistics in medicine is estimated between 30% and 90% 3
• Consistent findings:
• Lack of knowledge 4,5
• Lack of confidence
An example…
FIVE PRACTICAL TIPS
How we addressed the problem
TIP ONE: Know Your Situation
• Learning environment factors
• Statistics and research methods as a topic
• No formal "courses"; nothing is "required"
• No previous learning objectives, syllabus, or assessment structure
• Work environment factors
• Hospital obligations
• Attending physician buy-in and priorities
TIP ONE: Know Your Situation (continued)
• Population-specific factors
• Variable background experience
• Low average competence and confidence
• Realities of being a physician
• Availability of resources
• Limited time
• Limited money
TIP TWO: Clarify Your Purpose
• Ask two questions:
• How will the assessment audience benefit from the results?
• How will the students benefit from the assessment results?
• In our case:
• Audience: OMERAD, GSM faculty and administration
• Students: residents, fellows, physicians, and staff
TIP THREE: Use What You Have
• Gather the necessary background data:
• Existing content
• Faculty interviews
• Direct observation
• Literature
• Clinical/work experience
• Three benefits:
• What instructors think the students are learning
• What is being taught
• Where the gaps are in the curriculum
TIP FOUR: Fit the Instrument to Your Purpose, Not the Other Way Around
• Again, consider situational factors
• Resources for types of assessment instruments
• What worked for us: a background knowledge probe 6
TIP FIVE: Get Consistent and Critical Feedback
The "Feedback Loop" of Assessment (Develop/Modify → Test → Feedback → Practice): assessment must be viewed as a never-ending, iterative process
• An instrument is developed or modified
• The instrument is tested
• Testing generates feedback
• Feedback leads to modifications
• These modifications are tested…
USE OF RESULTS
An Integrated Assessment Model for a Dynamic Learning Environment
Improvements to the Course: Learning Objective Development
• Multiple sources of data: assessment, experiential learning, evaluation
• Data identified: salient topics, missing content, student needs, the need for a responsive curriculum
• BEFORE: a "list of topics" → AFTER: concrete learning objectives
Improvements to the Assessment: Test Blueprint Process Used to Improve the Assessment Instrument
Example from Module 2, Comparing Research Designs ("After this module, participants should be able to identify major epidemiologic research designs and apply each design to their own area of research"):
1. Start with course learning objectives
2. Identify test "topics" from the learning objectives
3. Expand each topic into as many "concepts" as possible
4. Collapse the list of concepts to remove redundancy
5. Create/modify items
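The expand-and-collapse steps of the blueprint process can be sketched in code. This is a minimal illustration, not the authors' actual tooling: the objective and concept strings are hypothetical examples modeled on the Module 2 slide, and the deduplication rule (case-insensitive matching) is an assumption.

```python
def build_blueprint(objectives):
    """Expand objectives -> topics -> concepts, then collapse duplicates.

    Mirrors blueprint steps 2-4: identify topics from objectives, expand
    each topic into concepts, and remove redundant concepts before items
    are written. All input data here is illustrative.
    """
    concepts = []
    seen = set()
    for obj in objectives:
        for topic in obj["topics"]:
            for concept in topic["concepts"]:
                key = concept.lower().strip()  # assumed redundancy rule
                if key not in seen:            # collapse duplicates
                    seen.add(key)
                    concepts.append(concept)
    return concepts


# Hypothetical objectives adapted from the Module 2 example
objectives = [
    {"objective": "Identify major epidemiologic research designs",
     "topics": [{"topic": "research designs",
                 "concepts": ["Cohort study", "Case-control study",
                              "Randomized trial"]}]},
    {"objective": "Apply each design to their own area of research",
     "topics": [{"topic": "application",
                 "concepts": ["case-control study", "Study feasibility"]}]},
]

print(build_blueprint(objectives))
# The duplicated "case-control study" concept is collapsed
```

Each surviving concept would then seed one or more test items (step 5), and the instrument re-enters the feedback loop from Tip Five.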
Next Steps
• Continue instrument and curriculum revisions
• Standardized assessment for residents, fellows, and physicians on clinical research skills and statistics
References
1. Green, M. L. (2000). Evidence-based medicine training in graduate medical education: past, present and future. Journal of Evaluation in Clinical Practice, 6(2), 121–138. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/10970006
2. Stewart, M. G. (2001). ACGME Core Competencies. Accreditation Council for Graduate Medical Education. Retrieved from http://www.acgme.org/acWebsite/RRC_280/280_coreComp.asp
3. Novack, L., Jotkowitz, A., Knyazer, B., & Novack, V. (2006). Evidence-based medicine: assessment of knowledge of basic epidemiological and research methods among medical doctors. Postgraduate Medical Journal, 82(974), 817–822. Retrieved from http://pmj.bmj.com/content/82/974/817.abstract
4. West, C. P., & Ficalora, R. D. (2007). Clinician attitudes toward biostatistics. Mayo Clinic Proceedings, 82(8), 939–943. Retrieved from http://www.mayoclinicproceedings.com/content/82/8/939.abstract
5. Windish, D. M., Huot, S. J., & Green, M. L. (2007). Medicine residents' understanding of the biostatistics and results in the medical literature. JAMA: The Journal of the American Medical Association, 298(9), 1010–1022. Retrieved from http://jama.ama-assn.org/content/298/9/1010.abstract
6. Angelo, T., & Cross, K. P. (1993). Classroom Assessment Techniques. San Francisco: Jossey-Bass.
7. Fink, L. D. (2003). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. San Francisco, CA: John Wiley & Sons, Inc.