Analyzing and using test item data

  • Purposes and Elements of Item Analysis
    • To select the best available items for the final form of the test.
    • To identify structural or content defects in the items.
    • To detect learning difficulties of the class as a whole.
    • To identify the areas of weakness of students in need of remediation.
  • Three Elements in an Item Analysis
    • Examination of the difficulty level of the items,
    • Determination of the discriminating power of each item, and
    • Examination of the effectiveness of distractors in multiple-choice or matching items.
    • The difficulty level of an item is known as the index of difficulty.
    • The index of difficulty is the percentage of students answering each item in the test correctly.
    • The index of discrimination compares the percentage of high-scoring students responding correctly to an item with the percentage of low-scoring students responding correctly.
    • This numeric index indicates how effectively an item differentiates between the students who did well and those who did poorly on the test.
  • Preparing Data for Item Analysis
    • Arrange test scores from highest to lowest.
    • Get one-third of the papers from the highest scores and the other third from the lowest scores.
    • Record separately the number of times each alternative was chosen by the students in both groups.
    • Add the number of correct answers to each item made by the combined upper and lower groups.
    • Compute the index of difficulty for each item, following the formula:
    • IDF = (NRC / TS) × 100
    • where IDF = index of difficulty
    • NRC = number of students responding correctly to an item
    • TS = total number of students in the upper and lower groups
    • Compute the index of discrimination, based on the formula:
    • IDN = (CU – CL) / NSG
    • where IDN = index of discrimination
    • CU = number of correct responses of the upper group
    • CL = number of correct responses of the lower group
    • NSG = number of students per group
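The two formulas above can be sketched in Python (a minimal illustration; the function and variable names are my own, not from the source):

```python
def difficulty_index(correct_upper, correct_lower, group_size):
    """IDF = (NRC / TS) * 100: percentage of students in the combined
    upper and lower groups who answered the item correctly."""
    nrc = correct_upper + correct_lower  # students responding correctly
    ts = 2 * group_size                  # total students in both groups
    return 100 * nrc / ts

def discrimination_index(correct_upper, correct_lower, group_size):
    """IDN = (CU - CL) / NSG."""
    return (correct_upper - correct_lower) / group_size

# Item 1 of the worked example below: 14 upper- and 7 lower-group
# students answered correctly, with 20 students per group.
print(difficulty_index(14, 7, 20))       # 52.5
print(discrimination_index(14, 7, 20))   # 0.35
```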
  • Using Information about Index of Difficulty
    • The difficulty index of a test item tells a teacher about the comprehension of or performance on material or task contained in an item.
  • Worked example (20 students in each of the upper [H] and lower [L] groups):

    Item  Group    A   B   C   D   Total Correct  Difficulty Index  H – L  Discrimination Index
    1     H        3  14   2   1
          L       10   7   3   0        21              52.5           7          0.35
    2     H        0   0  18   2
          L        0   3   9   8        27              67.5           9          0.45
    3     H        3   8   4   4
          L       10   2   4   4        10              25.0           6          0.30
    4     H        3   3   4  10
          L        2   4  10   4        14              35.0           6          0.30
    5     H       15   2   2   1
          L        1  10   4   5        16              40.0          14          0.70
    • For an item to be considered a good item, its difficulty index should be about 50%. An item with a 50% difficulty index is neither easy nor difficult.
    • A difficulty index of 67.5% means that 67.5% of the students answered the item correctly; the item therefore leans toward the easy side.
    • Information on the index of difficulty of an item can help a teacher decide whether a test item should be revised, retained, or modified.
  • Interpretation of the Difficulty Index

    Range         Difficulty Level
    20 & below    Very Difficult
    21 – 40       Difficult
    41 – 60       Average
    61 – 80       Easy
    81 & above    Very Easy
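The difficulty ranges in the table above can be expressed as a small lookup (a hypothetical helper, not from the source):

```python
def interpret_difficulty(idf):
    """Map a difficulty index (percentage answering correctly) to the
    verbal levels of the interpretation table."""
    if idf <= 20:
        return "Very Difficult"
    if idf <= 40:
        return "Difficult"
    if idf <= 60:
        return "Average"
    if idf <= 80:
        return "Easy"
    return "Very Easy"

print(interpret_difficulty(52.5))  # Average
print(interpret_difficulty(67.5))  # Easy
```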
  • Using Information about Index of Discrimination
    • The index of discrimination tells a teacher the degree to which a test item differentiates the high achievers from the low achievers in his class. A test item may have positive or negative discriminating power.
    • An item has positive discriminating power when more students from the upper group than from the lower group get the right answer.
    • When more students from the lower group than from the upper group get the correct answer, the item has negative discriminating power.
    • An item has zero discriminating power when equal numbers of students from the upper and lower groups get the right answer.
    • In the given example, item 5 has the highest discriminating power; it best differentiates the high achievers from the low achievers.
  • Interpretation of the Index of Discrimination

    Range         Verbal Description
    .40 & above   Very Good Item
    .30 – .39     Good Item
    .20 – .29     Fair Item
    .09 – .19     Poor Item
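The discrimination ranges can be mapped the same way (a hypothetical helper; the table gives no label below .09, so that case is flagged separately):

```python
def interpret_discrimination(idn):
    """Map an index of discrimination to the verbal descriptions of the
    interpretation table; the table assigns no label below .09."""
    if idn >= 0.40:
        return "Very Good Item"
    if idn >= 0.30:
        return "Good Item"
    if idn >= 0.20:
        return "Fair Item"
    if idn >= 0.09:
        return "Poor Item"
    return "Not classified (below .09)"

print(interpret_discrimination(0.70))  # Very Good Item
print(interpret_discrimination(0.35))  # Good Item
```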
    • When should a test item be rejected? Retained? Modified or revised?
    • A test item can be retained when its level of difficulty is average and discriminating power is positive.
    • It has to be rejected when it is either easy/very easy or difficult/very difficult and its discriminating power is negative or zero.
    • An item can be modified when its difficulty level is average and its discrimination index is negative.
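The retain/reject/modify rules above can be combined into one decision helper (a sketch; the 41–60 "average" band follows the difficulty interpretation table, and combinations the rules do not cover are flagged for further review):

```python
def item_decision(difficulty, discrimination):
    """Apply the three rules: retain, reject, or modify an item based on
    its difficulty index (percent) and discrimination index."""
    average = 41 <= difficulty <= 60   # "average" difficulty band
    if average and discrimination > 0:
        return "retain"
    if not average and discrimination <= 0:
        return "reject"
    if average and discrimination < 0:
        return "modify"
    return "review further"  # combinations the rules leave open

print(item_decision(52.5, 0.35))  # retain
print(item_decision(85.0, 0.0))   # reject
```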
  • Examining Distractor Effectiveness
    • An ideal item is one that all students in the upper group answer correctly and all students in the lower group answer incorrectly, with the lower group's wrong responses evenly distributed among the distractors.
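A quick way to examine distractors in this spirit (a hypothetical sketch, not from the source: it tallies the lower group's wrong answers per distractor and flags "dead" distractors that nobody chose):

```python
def distractor_report(lower_group_counts, correct_option):
    """Split the lower group's responses into distractor counts,
    the total number of wrong answers, and unchosen distractors."""
    distractors = {opt: n for opt, n in lower_group_counts.items()
                   if opt != correct_option}
    total_wrong = sum(distractors.values())
    dead = [opt for opt, n in distractors.items() if n == 0]
    return distractors, total_wrong, dead

# Item 1 of the worked example: lower group chose A=10, B=7 (correct), C=3, D=0
counts, wrong, dead = distractor_report({"A": 10, "B": 7, "C": 3, "D": 0}, "B")
print(counts)  # {'A': 10, 'C': 3, 'D': 0}
print(dead)    # ['D'] -- distractor D attracted no one and may need revision
```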
  • Developing an Item Data File
    • Encourage teachers to undertake an item analysis as often as practical
    • Allowing for accumulated data to be used to make item analysis more reliable
    • Providing for a wider choice of item format and objectives
    • Facilitating the revision of items
    • Facilitating the physical construction and reproduction of the test
    • Accumulating a large pool of items so as to allow some items to be shared with the students for study purposes.
  • Limitations of Item Analysis
    • It cannot be used for essay items.
    • Teachers must be cautious about the damage that may be done to the table of specifications when items not meeting the criteria are deleted from the test; such items should be rewritten or replaced rather than simply dropped.