Test item analysis

Transcript

  • 1. Test Item Analysis. Abdulmohsen Alzalabani, MBChB, ABCM, MSc(Epi), MHPE. Epidemiology/Community Medicine, College of Medicine, Taibah University.
  • 2. Good Exam Practices: (1) Blueprinting, (2) Good Question Writing, (3) Standard Setting.
  • 3. Good Exam Practices: post hoc item analysis is NOT a replacement for good exam practices.
  • 4. Post hoc Item Analysis can make the current exam better (by excluding bad items) and make later exams better (by revising or removing questions).
  • 5. In the College of Medicine, TU: WHAT REPORTS WILL YOU GET?
  • 6. Item Analysis: so, how do we interpret these reports?
  • 7. Item Analysis Difficulty Discrimination Distractor Analysis Exam reliability
  • 8. Item Analysis: Difficulty (aka facility; difficulty index, p-value).
  • 9. Item Difficulty
  • 10. Item Difficulty levels: High (Difficult) <= 30%; Medium (Moderate) > 30% and < 80%; Low (Easy) >= 80%.
  • 11. Item Difficulty: Exercise. Number of students who answered each item = 50.
    Item No. | No. Correct Answers | Difficulty Index | Difficulty Level
    1        | 15                  | 30               | High
    2        | 25                  | 50               | Medium
    3        | 35                  | 70               | Medium
    4        | 45                  | 90               | Low
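The exercise above computes each item's difficulty index as the percentage of students answering correctly, then classifies it with the slide-10 thresholds. A minimal Python sketch (function names are mine; thresholds are taken from the slides):

```python
def difficulty_index(num_correct, num_students):
    """Percentage of students who answered the item correctly."""
    return 100 * num_correct / num_students

def difficulty_level(p):
    """Classify per the slide thresholds: <= 30% High, >= 80% Low."""
    if p <= 30:
        return "High"    # difficult item
    if p >= 80:
        return "Low"     # easy item
    return "Medium"      # moderate item

# Exercise data: 50 students; items had 15, 25, 35, 45 correct answers.
for correct in (15, 25, 35, 45):
    p = difficulty_index(correct, 50)
    print(correct, p, difficulty_level(p))
```

Note that item 1 (15/50 = 30%) sits exactly on the boundary; the slides place 30% in the High (difficult) band.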
  • 12. Item Difficulty. Low values: very hard question; question not in the curriculum; unanswerable question; possible wrong answer key. High values: too easy; core knowledge?
  • 13. Item Analysis Difficulty Discrimination Distractor Analysis Exam reliability
  • 14. Discrimination Index vs.
  • 15. Item Discrimination Index (figure comparing the upper group, 80%, with the lower group, 20%).
  • 16. Item Discrimination: Examples. Number of students per group = 100.
    Item No. | Correct in Upper 1/4 | Correct in Lower 1/4 | Discrimination Index
    1        | 90                   | 20                   | 0.7
    2        | 80                   | 70                   | 0.1
    3        | 100                  | 0                    | 1
    4        | 100                  | 100                  | 0
    5        | 50                   | 50                   | 0
    6        | 20                   | 60                   | -0.4
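The examples above follow the standard upper/lower-group formula: the difference in correct answers between the top and bottom quartile groups, divided by the group size. A minimal sketch (function name is mine):

```python
def discrimination_index(upper_correct, lower_correct, group_size):
    """D = (correct in upper group - correct in lower group) / group size."""
    return (upper_correct - lower_correct) / group_size

# Examples from the slide (100 students per group):
print(discrimination_index(90, 20, 100))    # item 1: 0.7
print(discrimination_index(20, 60, 100))    # item 6: -0.4 (review the key!)
```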
  • 17. Discrimination Index. Negative values: mis-keyed, highly ambiguous, or confusing question. 0 to 0.2: not discriminating well. 0.4 and above: good discrimination.
  • 18. Item Analysis
  • 19. Item Analysis Difficulty Discrimination Distractor Analysis Exam reliability
  • 20. Distractor Analysis
  • 21. Distractor Analysis
  • 22. Distractor Analysis
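The distractor tables on slides 20–26 did not survive the transcript. The usual procedure is to tally, for each answer option, how many upper-group and lower-group students chose it: a functioning distractor attracts more lower-group than upper-group students, while the key shows the reverse pattern. A minimal illustrative sketch (the data and function name are mine, not from the slides):

```python
from collections import Counter

def distractor_counts(responses_by_group):
    """Tally, per scoring group, how many students chose each option.

    responses_by_group: dict mapping group name (e.g. "upper", "lower")
    to the list of options chosen by students in that group.
    """
    return {group: Counter(choices)
            for group, choices in responses_by_group.items()}

# Illustrative item with key "B": the key attracts the upper group,
# while distractor "A" pulls mostly lower-group students.
responses = {
    "upper": ["B", "B", "B", "A", "B"],
    "lower": ["A", "C", "B", "D", "A"],
}
table = distractor_counts(responses)
print(table["upper"]["B"], table["lower"]["B"])  # 4 1
```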
  • 23. Interpret…
  • 24. Interpret…
  • 25. Interpret…
  • 26. Interpret…
  • 27. Item Analysis Difficulty Discrimination Distractor Analysis Exam reliability
  • 28. Reliability: consistency of assessment results. Would Rater 1 and Rater 2 both give 85%? Would the Final Exam taken on 10/04/2012 give the same score on 10/05/2012?
  • 29. Observed Score = True Score + Systematic Error + Random Error. Systematic error (affects validity): apparent clues in several test items; short time limits; inadequate exam practices.
  • 30. Random error: difficult to control or predict, but its effect can be estimated. Sources: temporary fluctuation in memory; variation in motivation and concentration; carelessness in marking answers; luck in guessing.
  • 31. The effect of random error can be estimated via the reliability coefficient and the Standard Error of Measurement (SEM).
  • 32. Reliability estimates. Requiring 2 exams: Test-Retest; Equivalent forms (alternate forms, parallel forms). Inter-rater: for OSCE and supply-type (constructed-response) items, e.g. essay questions. Requiring 1 exam: Internal Consistency (split-half, KR-20, KR-21).
  • 33. Kuder-Richardson 20: a measure of internal consistency reliability. Ranges from 0.00 to 1.00; should be 0.60 or better (0.85 is the gold standard). Use Cronbach's alpha for continuous scales.
  • 34. Internal Consistency Reliability • Kuder-Richardson 20 • Kuder-Richardson 21
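The KR-20 coefficient mentioned above can be computed directly from a students-by-items matrix of dichotomous (0/1) scores: KR-20 = (k/(k-1)) * (1 - sum(p*q) / total-score variance). A minimal sketch of the standard formula (function name is mine; using the population variance is an assumption, as the slides do not specify):

```python
def kr20(item_matrix):
    """Kuder-Richardson 20 from a students x items matrix of 0/1 scores."""
    n = len(item_matrix)               # number of students
    k = len(item_matrix[0])            # number of items
    totals = [sum(row) for row in item_matrix]
    mean = sum(totals) / n
    var_total = sum((t - mean) ** 2 for t in totals) / n  # population variance
    sum_pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in item_matrix) / n        # proportion correct
        sum_pq += p * (1 - p)
    return (k / (k - 1)) * (1 - sum_pq / var_total)

# Perfectly consistent responses (each student all-right or all-wrong):
perfect = [[1, 1, 1], [0, 0, 0], [1, 1, 1], [0, 0, 0]]
print(round(kr20(perfect), 3))  # 1.0
```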
  • 35. SEM (Standard Error of Measurement): the amount of error to allow for when interpreting an individual score. Allows calculation of confidence intervals around scores: score +/- 1 SEM = 68% confidence interval; score +/- 2 SEM = 95% confidence interval.
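The SEM is conventionally derived from the test's score standard deviation and its reliability as SD * sqrt(1 - reliability), and the slide's 68%/95% bands are score +/- 1 or 2 SEM. A minimal sketch (function names and example numbers are mine):

```python
import math

def sem(sd, reliability):
    """Standard Error of Measurement: SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1 - reliability)

def confidence_band(score, sd, reliability, n_sem=1):
    """Score +/- n_sem * SEM; 1 SEM ~ 68% CI, 2 SEM ~ 95% CI."""
    error = n_sem * sem(sd, reliability)
    return (score - error, score + error)

# Illustrative exam: SD = 10, reliability = 0.84, so SEM is about 4.
print(sem(10, 0.84))                            # about 4.0
print(confidence_band(75, 10, 0.84, n_sem=2))   # about (67.0, 83.0)
```

So a student scoring 75 on this hypothetical exam has a true score between 67 and 83 with roughly 95% confidence.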
  • 36. Recommended Reading…