Learning Objectives Presentation

Slide notes
  • What do the students have to do to correctly answer the test questions? That is the objective.
  • Coming up with plausible distractors requires a certain amount of skill.
  • ...the added disadvantage of providing clues that help students with only partial knowledge detect the correct combination of alternatives.
  • If you must use a negative in the stem, use only short (preferably single-word) options and emphasize the negative.

    1. Linking Learning to Assessments (BJDSOP Faculty Retreat, 5/16/2012)
    2. Linking Learning to Assessment
       1. Writing learning objectives
       2. Relating objectives to assessment
       3. Writing assessment questions
       4. How to analyze knowledge tests for discrimination
    3. Learning objectives
       1. Define an objective.
       2. How/when to author.
       3. How to align objectives with assessments.
    4. Characteristics of objectives
       • Specific and focused
       • Targets performance
       • Realistic to achieve
       • Can be measured and validated
       • Time-bound with a deadline
    5. Learning objectives...
       • ...describe the intended result of instruction. LWBAT ("the learner will be able to...")
    6. Outcomes
       • recall, identify, choose
       • solve
       • calculate
       • apply therapeutic concepts to clinical scenarios
    7. Iterative process (cycle diagram): Objectives (LWBAT), Lecture/lab, Assessments (WTAT?)
    8. “Begin with the end in mind.” (Stephen Covey, The Seven Habits of Highly Effective People)
    9. We need to know what we want the end result to be before we plan.
    10. Write your test questions first, then write your objectives.
    11. Questions already written?
        • Rewrite objectives to align with what is required of the students on the assessments.
    12. Rationale
        1. Alignment
        2. Focuses attention on what is most important*
        3. Promotes continuous improvement
    13. Consider these...
    14. Example 1
        • Question: Which statement is correct...
        • Objective: Student will be able to identify...
    15. Example 2
        • Refer to the case above. What is the most appropriate therapy at this time?
        • Objective: Student will be able to select/...
    16. Work smart
        • What are the KSAs being assessed by your test questions?
        • Restate those as your objectives in LWBAT terms.
    17. Objectives RECAP
        1. What is a learning objective?
        2. When do we write them?
        3. Why?
        4. How do we align objectives with assessments?
    18. What is a good question?
        1. Good form
    19. Bad format: DO NOT COPY
    20. Good format: please copy
    21. Factual recall format
    22. Better
    23. Basic Science examples
    24. More Basic Science examples
    25. Vignettes (NBME)
        • “...we believe vignette items are generally more appropriate...”
          – test application of knowledge to patient situations
          – pose appropriate clinical challenges
    26. Non-vignette
    27. Short Vignette
    28. Long Vignette
    29. What is a good question?
        1. Good form
        2. Best practices
    30. www.nbme.org
    31. NBME Guidelines
    32. Best Practices
        • No true/false items
        • Multiple choice with 4-5 good options
        • 1 correct answer preferred
    33. Testwiseness (BAD)
        • Grammatical cues
        • Absolute terms
        • Long correct answer
        • Word repeats
    34. Irrelevant Difficulty (BAD)
        • Options are long, complicated
        • Numeric data not stated consistently
        • Vague – “rarely”
        • Avoid all/none of the above
        • Hinged responses
    35. K-type questions
        • Avoid
        • Re
    36. Good questions
        • Big stems
        • ~Same-length (short) distractors
        • Avoid absolutes / vague terms
        • Avoid negatively phrased items
    37. What is a good question?
        1. Good form
        2. Best practices
        3. Performs well
    38. Performance
    39. A good question?
        1. % correct
        2. Item discrimination
        3. Distractor performance
    40. Item Analysis
    41. A good question?
        1. % correct
        2. Item discrimination
        3. Distractor performance
        (An illustrative item-analysis sketch follows the slide list.)
    42. Test matrix

                              2010    2011    % Change
        CK                    13      11      (15.3%)
        APP                   3       5       66.6%
        Correct response CK   86      90.5    4.9%
        Correct APP           62      65.8    6.1%
    43. Bloom's Taxonomy (levels, from less complex to more complex)
        1. Knowledge
        2. Comprehension
        3. Application
        4. Analysis
        5. Synthesis
        6. Evaluation
    44. Verbs by Bloom's level
        1. Knowledge: match, recognize, select, compute, define, label, name, describe
        2. Comprehension: restate, elaborate, identify, explain, paraphrase, summarize
        3. Application: apply knowledge, solve problems
        4. Analysis: outline, draw a diagram, illustrate, discriminate, subdivide
        5. Synthesis: compare, contrast, organize, generate, design, formulate
        6. Evaluation: support, interpret, criticize, judge, critique, appraise
    45. Linking Learning to Assessment
        1. Writing learning objectives
        2. Relating objectives to assessment
        3. Writing assessment questions
        4. How to analyze knowledge tests for discrimination
    46. References
        1. Case SM, Swanson DB, Becker DF. Verbosity, window dressing, and red herrings: do they make a better test item? Academic Medicine. 1996;71:528-530.
        2. NBME. Constructing Written Test Questions for the Basic and Clinical Sciences.
        3. Haladyna TM, Downing SM. A taxonomy of multiple-choice item-writing rules. Applied Measurement in Education. 1989;2(1):37-50.
        4. Frisbie DA. The evolution of the multiple true-false item format. Paper presented at the Annual Meeting of the National Council on Measurement in Education, Boston; April 1990.
    47. According to Sands (2002), the “basic precept of course-planning [is]: What do [you] want students to be able to do at the end of the semester?” In other words, course goals and objectives should guide the design of your course rather than technology (Aycock, Garnham, & Kaleta, 2002). Sands’ first principle for developing a blended course is to “work backward from the final course goal…to avoid a counterproductive focus on technology.”
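Slides 39-42 name the three item-analysis statistics (% correct, item discrimination, distractor performance) but do not show how they are computed. The sketch below is not from the deck: it is a minimal illustration that assumes a simple upper/lower-group discrimination index (proportion correct in the top ~27% of scorers minus the bottom ~27%); the function name and the sample data are hypothetical.

```python
# Minimal item-analysis sketch (illustrative only, not from the original slides).
# responses[s][i] is student s's chosen option for item i; key[i] is the keyed answer.
from collections import Counter

def item_analysis(responses, key, group_frac=0.27):
    n_students, n_items = len(responses), len(key)

    # Total score per student, used to form the upper and lower scoring groups.
    scores = [sum(ans[i] == key[i] for i in range(n_items)) for ans in responses]
    order = sorted(range(n_students), key=lambda s: scores[s])
    g = max(1, round(group_frac * n_students))
    lower, upper = order[:g], order[-g:]

    results = []
    for i in range(n_items):
        # 1. Difficulty: percent of all students answering correctly ("% correct").
        pct = 100 * sum(responses[s][i] == key[i] for s in range(n_students)) / n_students
        # 2. Discrimination: upper-group minus lower-group proportion correct.
        p_upper = sum(responses[s][i] == key[i] for s in upper) / len(upper)
        p_lower = sum(responses[s][i] == key[i] for s in lower) / len(lower)
        # 3. Distractor performance: how often each option was chosen.
        counts = Counter(responses[s][i] for s in range(n_students))
        results.append({"item": i + 1,
                        "pct_correct": round(pct, 1),
                        "discrimination": round(p_upper - p_lower, 2),
                        "option_counts": dict(counts)})
    return results

if __name__ == "__main__":
    key = ["B", "D", "A"]
    responses = [            # hypothetical answer sheets, one row per student
        ["B", "D", "A"], ["B", "D", "C"], ["A", "D", "A"],
        ["B", "C", "A"], ["C", "D", "B"], ["B", "A", "A"],
    ]
    for row in item_analysis(responses, key):
        print(row)
```

Read alongside the slides' criteria: a discrimination value near zero or negative flags an item that high and low scorers answer equally well, and an option that almost no one chooses is doing little work as a distractor.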
