Methods for Developing Assessment Instruments to Generate Useful Data in the Presence of Vague Course Objectives


Transcript

  • 1. METHODS FOR DEVELOPING ASSESSMENT INSTRUMENTS TO GENERATE USEFUL DATA IN THE PRESENCE OF VAGUE COURSE OBJECTIVES
    Patrick B. Barlow, Tiffany L. Smith, Eric Heidel, PhD, William Metheny, PhD
  • 2. On the Agenda
    • SETTING THE SCENE: Who? What? Where? When?
    • WHY? Assessment in graduate medical education
    • FIVE PRACTICAL TIPS: How we addressed the problem
    • USE OF ASSESSMENT RESULTS: How were these methods useful?
    • DISCUSSION: Questions?
  • 3. SETTING THE SCENE: Who? What? Where? When?
  • 4. Office of Medical Education, Research, and Development (OMERAD)
    • Job description
      • Consultation and education
      • What was happening with GME at our institution?
    • New office structure
      • PhD students in ESM brought in
      • Office given the reins of the clinical research skills curricula
  • 5. WHY DOES THIS NEED TO BE ADDRESSED? Assessment in graduate medical education
  • 6. Evidence-Based Medicine in GME (1,2)
    • What is EBM? Clinical epidemiology, biostatistics, and critical appraisal
  • 7. What We Know About Resident Knowledge of Clinical Research Skills
    • The error rate in reporting and interpreting statistics in medicine is estimated at 30–90% (3)
    • Consistent findings: lack of knowledge and lack of confidence (4,5)
    • An example…
  • 8. FIVE PRACTICAL TIPS: How we addressed the problem
  • 9. TIP ONE: Know Your Situation
    • Learning environment factors
      • Statistics and research methods as a topic
      • No formal "courses"; nothing is "required"
      • No previous learning objectives, syllabus, or assessment structure
    • Work environment factors
      • Hospital obligations
      • Attending physician buy-in and priorities
  • 10. TIP ONE: Know Your Situation (continued)
    • Population-specific factors
      • Variable background experience
      • Low average competence and confidence
      • Realities of being a physician
    • Availability of resources
      • Limited time
      • Limited money
  • 11. TIP TWO: Clarify Your Purpose
    • Ask two questions:
      • How will the assessment audience benefit from the results?
      • How will the students benefit from the assessment results?
    • In our case
      • Audience: OMERAD, GSM faculty/administration
      • Students: residents, fellows, physicians, and staff
  • 12. TIP THREE: Use What You Have
    • Gather the necessary background data
      • Existing content
      • Faculty interviews
      • Direct observation
      • Literature
      • Clinical/work experience
    • Three benefits
      • What instructors think the students are learning
      • What is being taught
      • Where the gaps are in the curriculum
  • 13. TIP FOUR: Fit the Instrument to Your Purpose, Not the Other Way Around
    • Again, consider situational factors
    • Resources for types of assessment instruments
    • What worked for us: the background knowledge probe (6)
  • 14. TIP FIVE: Get Consistent and Critical Feedback
    • Assessment must be viewed as a never-ending, iterative process: the "feedback loop" of assessment (cycle diagram: Develop/Modify, Test, Feedback, Practice)
      • An instrument is developed or modified
      • The instrument is tested
      • Testing generates feedback
      • Feedback leads to modifications…
  • 15. TIP FIVE: Get Consistent and Critical Feedback (continued)
    • …and these modifications are tested in turn, repeating the cycle
  • 16. USE OF RESULTS: An Integrated Assessment Model for a Dynamic Learning Environment
  • 17. Improvements to the Course: Learning Objective Development
    • Multiple sources of data: assessment, experiential, and evaluation data
    • Data identified: salient topics, missing content, student needs, and the need for a responsive curriculum
    • BEFORE: a "list of topics"; AFTER: concrete learning objectives
  • 18. Improvements to the Assessment: Test blueprint process used to improve the assessment instrument
    1. Start with course learning objectives ("after this module participants should be able to…")
    2. Identify test "topics" from the learning objectives
    3. Expand each topic to as many "concepts" as possible
    4. Collapse the list of concepts to remove redundancy
    5. Create/modify items
    • Example, Module 2 (Comparing Research Designs): identify major epidemiologic research designs; apply each design to their own area of research
  • 19. Next Steps
    • Continue instrument and curriculum revisions
    • Standardized assessment of clinical research skills and statistics for residents, fellows, and physicians
  • 20. DISCUSSION: Questions? Comments?
  • 21. References
    1. Green, M. L. (2000). Evidence-based medicine training in graduate medical education: past, present and future. Journal of Evaluation in Clinical Practice, 6(2), 121–138. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/10970006
    2. Stewart, M. G. (2001). ACGME Core Competencies. Accreditation Council for Graduate Medical Education. Retrieved from http://www.acgme.org/acWebsite/RRC_280/280_coreComp.asp
    3. Novack, L., Jotkowitz, A., Knyazer, B., & Novack, V. (2006). Evidence-based medicine: assessment of knowledge of basic epidemiological and research methods among medical doctors. Postgraduate Medical Journal, 82(974), 817–822. Retrieved from http://pmj.bmj.com/content/82/974/817.abstract
    4. West, C. P., & Ficalora, R. D. (2007). Clinician attitudes toward biostatistics. Mayo Clinic Proceedings, 82(8), 939–943. Retrieved from http://www.mayoclinicproceedings.com/content/82/8/939.abstract
    5. Windish, D. M., Huot, S. J., & Green, M. L. (2007). Medicine residents' understanding of the biostatistics and results in the medical literature. JAMA: The Journal of the American Medical Association, 298(9), 1010–1022. Retrieved from http://jama.ama-assn.org/content/298/9/1010.abstract
    6. Angelo, T., & Cross, K. P. (1993). Classroom Assessment Techniques. San Francisco: Jossey-Bass.
    7. Fink, L. D. (2003). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. San Francisco, CA: John Wiley & Sons, Inc.