3. What is Test Item Analysis?
A process of determining how well a
particular test item measures what it
is supposed to measure.
This process is a valuable tool in
improving the quality of test items.
4. GENERAL PURPOSES OF
ITEM ANALYSIS:
select the most valuable items for the final
form of a test.
identify structural or content defects in the
items.
detect learning difficulties of the class as a
whole by identifying general content areas or
skills.
identify, for individual students, areas of
weakness in need of remediation.
5. SEVERAL BENEFITS OF
ITEM ANALYSIS
gives useful information for class
discussion of the test.
gives data for helping the students
improve their learning methods.
gives insights which lead to the
construction of better items for future
work.
6. LIMITATIONS OF ITEM
ANALYSIS
applies best to tests that measure
essentially the same level of mental function.
item formats, reading level, and other factors
can affect item difficulty.
item analysis techniques are less applicable to
essay tests.
cannot be used as a diagnostic assessment scheme
for student performance (chance can play an
important role in taking the test and, ultimately, in
student performance).
8. TABLE 1.
STUDENTS RESPONSES
40 student-respondents randomly selected
for this test item analysis activity:
BASED ON THE RESULT OF THE TEST:
Total score = 10
HS (highest score) = 6
LS (lowest score) = 0
10. TABLE 2-Scores
This sheet is for viewing only. The template
checks the answers automatically: 0 indicates a
wrong answer and 1 a correct one.
11. COPY SORTING
This template is equipped with an auto-sorting
facility. It sorts the learners' scores from
highest to lowest.
12. RESPONSES HL-LL
This is where you copy and paste the
learners' responses, sorted into the Upper 27%
and Lower 27% groups, from the COPY
SORTING sheet.
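The sorting and 27%-grouping step described above can be sketched in Python. The learner IDs and scores here are hypothetical, and the 27% cutoff follows the Upper 27% / Lower 27% rule named on this slide:

```python
# Sketch of the auto-sorting and 27% grouping step.
# `scores` maps each (hypothetical) learner to a total score.
scores = {
    "S01": 6, "S02": 5, "S03": 5, "S04": 4, "S05": 4,
    "S06": 3, "S07": 3, "S08": 2, "S09": 1, "S10": 0,
}

# Sort learners from highest to lowest total score.
ranked = sorted(scores, key=scores.get, reverse=True)

# Take the upper 27% and lower 27% of the ranked list.
k = max(1, round(0.27 * len(ranked)))
upper = ranked[:k]   # high-scoring group
lower = ranked[-k:]  # low-scoring group
print("Upper 27%:", upper)
print("Lower 27%:", lower)
```

With 10 learners, 27% rounds to 3 per group; with the 40 respondents in Table 1 it would be 11 per group.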
14. Table 3.
Level of difficulty and
Discrimination Index
Difficulty Level (Dl)
the percentage of students responding
correctly to each item in the test.
The higher this value, the easier the
item (however, an extremely high difficulty
level could indicate a structural defect in
the item).
15. Values of the difficulty level may be
interpreted as follows:
91 – 100%   very easy
76 – 90%    easy
25 – 75%    average
10 – 24%    difficult
0 – 9%      very difficult
Suggested items to retain: items
with index of difficulty values of
20% – 80%.
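The difficulty-level computation and the interpretation table above can be sketched as follows; the example counts (28 correct out of 40 students) are hypothetical, and the category boundaries follow the ranges on this slide:

```python
def difficulty_level(correct, total):
    """Percentage of students answering the item correctly."""
    return 100.0 * correct / total

def interpret_difficulty(dl):
    """Map a difficulty-level percentage to the slide's labels."""
    if dl >= 91: return "very easy"
    if dl >= 76: return "easy"
    if dl >= 25: return "average"
    if dl >= 10: return "difficult"
    return "very difficult"

# Hypothetical item: 28 of 40 students answered correctly.
dl = difficulty_level(28, 40)
print(dl, interpret_difficulty(dl))
```

By the retention rule above, an item at 70% difficulty (within 20% – 80%) would be kept.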
16. Table 3.
Level of difficulty and
Discrimination Index
Discrimination Index
the degree to which an item differentiates the high
achievers from the low achievers.
Can be interpreted as the correlation of the item
with the total test score.
Describes how well the item discriminates between
the high achievers and the low achievers.
17. Index of discrimination values may be interpreted as
follows:
0.60 – 1.00 excellent
0.40 – 0.59 very good
0.30 – 0.39 good
0.20 – 0.29 satisfactory/marginal
0.01 – 0.19 reject
Suggested items to retain: items with index of
discrimination values greater than or equal to 0.40.
Maximum discrimination occurs at index of discrimination
equal to 0.50.
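One common way to compute the discrimination index is the upper-lower group difference formula, D = (correct in upper group − correct in lower group) / group size, applied to the Upper 27% and Lower 27% groups. The slides do not state this formula explicitly, so treat it as an assumption; the example counts are hypothetical:

```python
def discrimination_index(upper_correct, lower_correct, group_size):
    """Upper-lower difference formula (an assumption, not stated on
    the slide): D = (upper correct - lower correct) / group size."""
    return (upper_correct - lower_correct) / group_size

def interpret_discrimination(d):
    """Map D to the labels on this slide."""
    if d >= 0.60: return "excellent"
    if d >= 0.40: return "very good"
    if d >= 0.30: return "good"
    if d >= 0.20: return "satisfactory/marginal"
    return "reject"

# Hypothetical item: 9 of 11 upper-group and 3 of 11 lower-group
# students answered correctly (11 students per 27% group of 40).
d = discrimination_index(9, 3, 11)
print(round(d, 2), interpret_discrimination(d))
```

Here D ≈ 0.55, so by the retention rule above (D ≥ 0.40) this item would be kept.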
19. Table 4. Distractor Effectiveness
Distractor Effectiveness
the attractiveness of the alternative responses.
Distractor analysis is done to determine the
attractiveness of the options, i.e., whether the
item is genuinely difficult or difficult only
because there is a problem in test
construction.
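Distractor analysis as described above amounts to tallying, for each option, how many upper-group and lower-group students chose it. The item key and the response lists below are hypothetical:

```python
from collections import Counter

# Hypothetical responses of the Upper 27% and Lower 27% groups
# (11 students each) to one multiple-choice item whose key is "B".
upper_responses = ["B", "B", "B", "A", "B", "B", "C", "B", "B", "B", "B"]
lower_responses = ["A", "C", "B", "D", "A", "C", "B", "D", "A", "C", "B"]
key = "B"

upper_counts = Counter(upper_responses)
lower_counts = Counter(lower_responses)
for option in "ABCD":
    label = "key" if option == key else "distractor"
    print(f"{option} ({label}): upper={upper_counts[option]}, "
          f"lower={lower_counts[option]}")
```

An effective distractor attracts more low achievers than high achievers; a distractor chosen by almost no one, or one preferred by the upper group over the key, points to a construction problem rather than genuine difficulty.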
24. Kuder-Richardson Formula 20
rxx = [n / (n – 1)] × [(SD² – Σpiqi) / SD²]
Where:
n = the number of items;
SD² = the variance of the total test scores, defined as
SD² = Σ(X – X̄)² / (N – 1);
piqi = the product of the proportions passing and failing item i:
the proportion of individuals passing item i is denoted by pi, and the
proportion failing by qi, where qi = 1 – pi.
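The KR-20 formula above can be computed directly from a 0/1 score matrix like the one in TABLE 2. This is a minimal sketch; the five-student, five-item matrix is hypothetical:

```python
def kr20(item_matrix):
    """Kuder-Richardson Formula 20 from a 0/1 score matrix
    (rows = students, columns = items)."""
    n_students = len(item_matrix)
    n_items = len(item_matrix[0])
    totals = [sum(row) for row in item_matrix]
    mean = sum(totals) / n_students
    # Sample variance of total scores: SD^2 = sum((X - mean)^2) / (N - 1)
    sd2 = sum((x - mean) ** 2 for x in totals) / (n_students - 1)
    # Sum of p*q over items, where p = proportion passing item i
    sum_pq = 0.0
    for i in range(n_items):
        p = sum(row[i] for row in item_matrix) / n_students
        sum_pq += p * (1 - p)
    return (n_items / (n_items - 1)) * (sd2 - sum_pq) / sd2

# Hypothetical score matrix: 5 students x 5 items, 1 = correct.
matrix = [
    [1, 1, 1, 0, 1],
    [1, 1, 0, 1, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 0, 0, 0, 0],
]
print(round(kr20(matrix), 3))
```

Each term mirrors the formula: `sd2` is SD², `sum_pq` is Σpiqi, and the return value is rxx.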