WHY NOT MAKE YOUR TESTS BETTER? By: Snežana Filipović
Why do we test?
“The assumption is that the best teacher is the teacher who devises classroom methods and techniques that derive from a comprehensive knowledge of the total process of language learning, of what is happening within the learner and within the teacher and the interaction between the two. All of this knowledge, however, remains somehow abstract in the mind of the teacher unless it can be empirically tested in the real world. Your theory of second language acquisition can be put into practice every day in the classroom, but you will never know how valid your theory is unless you systematically measure the success of your learners – the success of your theory-in-practice.” Douglas Brown (1987: 218)
What makes a good test?
Tests are good only when they are used for a particular purpose with the students for whom they are intended
TEST DEVELOPMENT
STAGE 1 The design stage or  THINK
STAGE 2 The operationalisation stage or THINK & WRITE
STAGE 3 The administration stage or TEST & THINK & CHANGE
Some reasons why I like multiple choice tests
- They are objective
- They are quick to do and check
- They test exactly what you want them to test
- They can be easily analysed
ONE REASON WHY I DO NOT LIKE THEM They are notoriously difficult to write
Anatomy of a Multiple Choice Item
1. How did Tina go to the airport?
a) by bus
b) by car
c) on foot
d) by taxi
The question is the stem; the options a) to d) are the alternatives: one is the answer, the other three are distractors
GUIDELINES FOR MAKING MULTIPLE CHOICE TESTS
Each item should assess a single written objective Before writing an item, think about what it is that you want to test
Include one and only one correct or clearly the best answer in each item
The stem should not be burdened with irrelevant material, but should contain as much of the item as possible
Layout of the answers should be clear and consistent. The alternatives should be listed vertically
Avoid answering one item in the test by giving the answer  somewhere else in the test
Keep the items mutually exclusive
The problem should be stated clearly in the stem. Students should not have to infer what the problem is
Avoid changing pages in the middle of an item
The alternatives should be kept homogenous in content. They should not consist of a potpourri of statements related to the stem but unrelated to each other
You should not provide clues as to which alternative is correct. Keep the grammar of each alternative consistent with the ...
Distractors should be as plausible as possible, but they should sound plausible only to a student who has not mastered the material
Put the correct answer in each of the alternative positions approximately an equal number of times, in a random order
Avoid the use of specific determiners.  (never, always, only)
Keep the alternatives similar in length
Try to make the first few items relatively easy. Make sure you have items of different levels of difficulty
Do not try to write the entire test in a day. It takes time, creativity and thinking to write good multiple choice items. Come back to the test a few days later, with a fresh eye
Analyse the effectiveness of each item. Item analysis is an excellent tool for this
ITEM ANALYSIS
PURPOSE OF ITEM ANALYSIS
- Evaluates the quality of each item
- The quality of items determines the quality of the test
- Suggests ways of improving the test
- Suggests ways of improving teaching
CLASSICAL ITEM ANALYSIS Item Facility Analysis Item Discrimination Analysis Distractor Efficiency Analysis
ITEM FACILITY INDEX
IF = (number of students answering correctly) / (number of students taking the test)
General Rules for Item Facility
- IF less than 0.20: difficult items
- IF 0.20 to 0.80: moderately difficult items
- IF more than 0.80: easy items
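To make the arithmetic concrete, here is a minimal Python sketch (not part of the original slides; the class size and answers are invented for illustration) that computes the item facility index for one item:

def item_facility(responses, correct_option):
    # IF = number of students answering correctly / number of students taking the test
    correct = sum(1 for r in responses if r == correct_option)
    return correct / len(responses)

# Hypothetical class of 20 students: 14 chose the correct option 'b'
answers = ['b'] * 14 + ['a'] * 3 + ['c'] * 2 + ['d']
print(item_facility(answers, 'b'))  # 0.7 -> moderately difficult item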
ITEM DISCRIMINATION ANALYSIS
“We test because we want to find out if the students know the material, but all we learn for certain is how they did on the exam we gave them. The item discrimination index tests the test in the hope of keeping the correlation between the knowledge and exam performance as close as it can be in an admittedly imperfect system.” (Zurawski 1998: 2)
ITEM DISCRIMINATION ANALYSIS
Compares the performance of the upper group of students (high test scorers) and the lower group (low test scorers) on each item in the test
ITEM DISCRIMINATION INDEX
ID = IF(upper) - IF(lower)
GENERAL RULES FOR ID INDEX
- 0.40 and up: excellent items
- 0.30 to 0.39: good items, but possibly subject to improvement
- 0.20 to 0.29: marginal items, usually needing improvement
- 0.19 and below: poor items, to be rejected or improved by revision
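The discrimination index can be sketched the same way. The Python snippet below (again not from the slides; the scores are made up) splits the class into upper and lower groups by total score and subtracts the two facility values:

def item_discrimination(results, fraction=0.27):
    # results: list of (total_score, 1 if the item was answered correctly else 0)
    # ID = IF(upper) - IF(lower), using the top and bottom `fraction` of students
    ranked = sorted(results, key=lambda r: r[0], reverse=True)
    n = max(1, round(len(ranked) * fraction))
    upper, lower = ranked[:n], ranked[-n:]
    if_upper = sum(correct for _, correct in upper) / n
    if_lower = sum(correct for _, correct in lower) / n
    return if_upper - if_lower

# Hypothetical class of 10 students: (total score, got this item right?)
data = [(95, 1), (90, 1), (85, 1), (80, 0), (70, 1),
        (65, 0), (60, 1), (55, 0), (50, 1), (40, 0)]
print(round(item_discrimination(data), 2))  # 0.67 -> excellent item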
DISTRACTOR EFFICIENCY ANALYSIS How many students chose each option?
A perfect item would have two characteristics:
- Everyone who knows the material tested in the item would get it right
- Students who do not know it would have their answers equally distributed among the options
DISTRACTOR EFFICIENCY ANALYSIS Distractor Efficiency Table
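A distractor efficiency table can be tallied with a few lines of code. The sketch below (not from the slides; the groups, options and counts are invented) simply counts how many students in the upper and lower groups chose each option:

from collections import Counter

def distractor_table(responses, options=('a', 'b', 'c', 'd')):
    # responses: dict mapping group name ('upper'/'lower') to the list of options chosen
    counts = {group: Counter(chosen) for group, chosen in responses.items()}
    lines = ["option  " + "  ".join(f"{g:>5}" for g in counts)]
    for opt in options:
        lines.append(f"{opt:>6}  " + "  ".join(f"{counts[g][opt]:>5}" for g in counts))
    return "\n".join(lines)

# Hypothetical item where 'b' is the key: 'd' attracts no one in either
# group, so it is doing no work as a distractor and should be rewritten
answers = {
    'upper': ['b', 'b', 'b', 'b', 'a'],
    'lower': ['a', 'b', 'c', 'a', 'c'],
}
print(distractor_table(answers))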
IT IS TIME FOR A COFFEE BREAK! (Thank you all for coming!)