2. INTRODUCTION
Item analysis consists of several procedures that determine how effectively a test item functions within a total test or examination. It is used to appraise the effectiveness of each test item and to build a file of high-quality items for future use.
3. Introduction…
Item analysis describes the statistical analysis that allows the effectiveness of individual test items to be measured. An understanding of the factors that govern effectiveness can enable teachers to create more effective test questions.
4. DEFINITION
• Item analysis is a process that examines students' responses to individual test items/questions in order to assess the quality of those items and of the test as a whole.
5. BENEFITS OF ITEM ANALYSIS:
1. Provides a basis for efficient classroom discussion of the test results.
2. Provides data for remedial work.
3. Provides a basis for the general improvement of classroom instruction.
4. Provides a basis for increased skill in item construction.
6. Simplified item analysis procedure
• Conduct the test/examination and score it.
• Arrange all answer sheets in order of total score.
• Select the 27% of papers with the highest total scores and the 27% with the lowest.
• Put aside the remaining papers; they will not be used.
• Compute the difficulty index of each item.
• Compute the discrimination index of each item.
• Evaluate the effectiveness of each distractor.
7. Procedure involved in an item analysis
It specifically involves three numerical indicators:
a. Item difficulty index (p)
The item difficulty index portrays the "easiness" of an item: the higher the percentage of students answering correctly, the easier the item. The item difficulty index is symbolized by p.
8. Item difficulty: p = R / T (equivalently, p = (H + L) / N)
• R (or H + L) = number of students who answered the item correctly (upper group plus lower group).
• T (or N) = number of students included in the analysis.
9. • Thus, if 40 students were included in the analysis and 22 of them answered the item correctly, the item difficulty would be:
• Item difficulty p = 22 / 40 = 0.55
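The computation on this slide can be sketched in Python; the function and variable names are illustrative, not from the source:

```python
# Sketch of the item difficulty index p = R / T described above.

def difficulty_index(num_correct: int, num_analyzed: int) -> float:
    """p = R / T: proportion of analyzed students who answered the item correctly."""
    if num_analyzed <= 0:
        raise ValueError("number of students analyzed must be positive")
    return num_correct / num_analyzed

# Worked example from the slide: 22 of 40 students answered correctly.
p = difficulty_index(22, 40)
print(p)  # 0.55
```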
11. Item discrimination index (D)
• The item discrimination index refers to the degree to which an item discriminates between high-achieving and low-achieving students in terms of their total test scores.
12. D = (Ru − Rl) / (½ T)
• Ru = number of students in the upper group who got the item right.
• Rl = number of students in the lower group who got the item right.
• ½T = one half of the total number of students included in the analysis.
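The formula above can be sketched as follows; the example numbers are hypothetical, not from the source:

```python
# Sketch of the discrimination index D = (Ru - Rl) / (T/2), assuming
# equal-sized upper and lower groups whose combined size is T.

def discrimination_index(upper_correct: int, lower_correct: int,
                         total_analyzed: int) -> float:
    """D = (Ru - Rl) / (T/2): difference in success rates between groups."""
    half = total_analyzed / 2
    return (upper_correct - lower_correct) / half

# Hypothetical example: 18 of the upper group and 8 of the lower group
# answered correctly, with 40 students in the analysis.
print(discrimination_index(18, 8, 40))  # 0.5
```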
13. Ebel (1972) gives the indices of item discrimination shown below. For the average classroom test, these indices are widely accepted.
D value: Item evaluation
• 0.40 and above: very good item
• 0.30–0.39: reasonably good, but subject to improvement
• 0.20–0.29: marginal item, usually needing and being subject to improvement
• Below 0.19: poor item, to be rejected or improved by revision
15. DISTRACTOR POWER
• The third kind of statistic is distractor power. It provides information about the effectiveness of the distractors.
16. EXAMPLE OF ITEM ANALYSIS DATA FOR 5 MCQ ITEMS
Correct answers: Q1 = A, Q2 = C, Q3 = A, Q4 = B, Q5 = D

Student      Q1  Q2  Q3  Q4  Q5
1. RAVI       A   C   A   B   D
2. VIVEK      A   B   B   B   D
3. SHRUTI     A   C   C   D   D
4. RANJU      A   B   D   D   D
5. RAJU       A   C   A   D   D
6. NEELAMA    C   B   B   A
7. SABINA     B   C   B   B   A
8. JUHI       C   C   C   D   A
9. PREET      D   C   D   D   B
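Tallying how often each option was chosen, per question, is the first step of the distractor analysis that follows. A sketch using the response data above (NEELAMA's row is incomplete in the source table, so it is omitted here):

```python
# Count how many students chose each option for each question, using the
# response table from the slide above.
from collections import Counter

correct = ["A", "C", "A", "B", "D"]  # correct answers for Q1..Q5

responses = {
    "RAVI":   ["A", "C", "A", "B", "D"],
    "VIVEK":  ["A", "B", "B", "B", "D"],
    "SHRUTI": ["A", "C", "C", "D", "D"],
    "RANJU":  ["A", "B", "D", "D", "D"],
    "RAJU":   ["A", "C", "A", "D", "D"],
    "SABINA": ["B", "C", "B", "B", "A"],
    "JUHI":   ["C", "C", "C", "D", "A"],
    "PREET":  ["D", "C", "D", "D", "B"],
}

for q in range(5):
    tally = Counter(student[q] for student in responses.values())
    print(f"Q{q + 1} (correct: {correct[q]}):", dict(tally))
```

For Q4, the tally shows distractor D chosen more often than the correct answer B, which is the problem case discussed on the next slide.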
18. DISTRACTOR ANALYSIS:
1. If students choose one of the distractors more often than the correct answer (as in Q4), it indicates a potential problem with the question. Students may choose a distractor because:
• they have only partial knowledge
• the item may have been poorly constructed
• the distractor may be outside of the content area
19. 2. When students do not know the correct answer, their answer is purely a guess. In such a situation, their answers would be distributed among the distractors as well as the correct answer, as in Q3.
20. 3. If students do not choose one or more distractors (see Q2, Q4, and Q5), those distractors may be implausible. According to Harper and Harper (1990), an option chosen by fewer than 5% of students is ineffective and should be revised (Kehoe, 1995).
21. CONCLUSION
• Item analysis is a process which examines student responses to individual test items (questions) in order to assess the quality of those items and of the test as a whole. Item analysis is especially valuable in improving items that will be used again in later tests, but it can also be used to eliminate ambiguous or misleading items in a single test administration. In addition, item analysis is valuable for increasing instructors' skills in test construction and for identifying specific areas of course content that need greater emphasis or clarity.