This document discusses item analysis, a procedure used to evaluate test questions and assess whether they effectively measure the intended construct. It defines key terms such as item difficulty, facility value, and discrimination index, and outlines the purposes and steps of performing an item analysis. The purposes include selecting the best questions, identifying weaknesses, and improving the quality and effectiveness of assessments. The steps involve scoring the test, dividing students into high- and low-scoring groups, calculating difficulty and discrimination indices for each item, and using the results to revise the test.
3. DEFINITIONS
A type of analysis used to assess
whether items on a scale are
tapping the same construct and
are sufficiently discriminating.
(Polit & Beck)
4. The procedure used to judge the quality
of an item (Neeraja)
Item analysis is the process of
looking at the item responses of a
test
5. PURPOSES
Select the best questions
Identify structural and content defects
in questions
Detect learning difficulties
Identify weaknesses
Interpret students' performance
6. Understand the behavior of items
Become more competent teachers
Control the quality of a test
Evaluate the students
7. Point out validity problems of a test by
revising and eliminating ineffective
items
Find out the performance of the group
Reveal the facility value and
discrimination of each item
8. ITEM DIFFICULTY:
It is measured by calculating the
percentage of test-takers who
answer the item correctly
9. ITEM DIFFICULTY INDEX
P = (R/N) x 100
R - number of pupils who answered the
item correctly
N - total number of pupils who attempted
the item
10. DIFFICULTY LEVEL =
(Average score on the test / Maximum possible score) x 100
DIFFICULTY INDEX = ((H + L)/N) x 100
H - number of correct answers in the high group
L - number of correct answers in the low group
N - total number of students in both groups
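As a minimal sketch, the difficulty-index formula above can be written as a small function; the group sizes and counts below are hypothetical.

```python
def difficulty_index(h, l, n):
    """Difficulty index = ((H + L)/N) x 100, where H and L are the numbers
    of correct answers in the high and low groups, and N is the total
    number of students in both groups."""
    return (h + l) / n * 100

# Example: 8 correct in the high group, 4 in the low group, 20 students total.
print(difficulty_index(8, 4, 20))  # 60.0
```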
11. FACILITY VALUE =
(Number of students answering the question correctly / Number of students who took the test) x 100
12. ITEM DISCRIMINATION: The way an
item differentiates students who know
the content from those who do not
13. DISCRIMINATION INDEX
The degree to which a given item
discriminates among students who
differ sharply in the functions
measured by the test as a whole
14. DISCRIMINATION INDEX
Formula:
DI = 2 x (H - L)/N
H - number of correct answers in the high group
L - number of correct answers in the low group
N - total number of students in both
groups
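The discrimination-index formula can be sketched the same way, again with hypothetical counts; a positive value means the high group outperformed the low group on the item.

```python
def discrimination_index(h, l, n):
    """DI = 2 x (H - L)/N, where H and L are the numbers of correct
    answers in the high and low groups, and N is the total number of
    students in both groups."""
    return 2 * (h - l) / n

# Example: 8 correct in the high group, 4 in the low group, 20 students total.
print(discrimination_index(8, 4, 20))  # 0.4
```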
15. DISTRACTOR EVALUATION: In
addition to evaluating the correct
answer to an item, the distractors
(wrong answers) are also evaluated
individually
16. INTER-ITEM CORRELATION
This matrix displays the
correlation of each item with
every other item.
This matrix provides important
information about a test’s
internal consistency, and what
could be done to improve it.
17. ITEM-TOTAL CORRELATIONS
Point-biserial or item-total
correlations assess the usefulness of
an item as a measure of individual
differences in knowledge, ability, or
personality characteristics.
Here each test item (incorrect = 0;
correct =1) is correlated with the
person’s total test score.
18. ITEM REVISION
Developing a valid and reliable
test is an ongoing process.
It helps the faculty to recall items
and student responses to them.
Item revision should be conducted
after item analysis.
19. STEPS OF ITEM ANALYSIS
After the test is scored, arrange the
test scores in rank order from
highest to lowest.
Divide the scores into a high-scoring
half and a low-scoring half.
For each item, tally the number of
students in each group who chose
each alternative. Record these
counts on a copy of the test next to
each response option.
20. Calculate the difficulty index for each
item.
Calculate the discrimination index for
each item.
Check each item for implausible
distractors, ambiguity, and mis-
keying.
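The steps above can be sketched end-to-end, using a small hypothetical set of scored responses (1 = correct, 0 = incorrect) for six students on three items.

```python
# Hypothetical scored responses for six students on three items.
students = {
    "s1": [1, 1, 1],
    "s2": [1, 1, 0],
    "s3": [1, 0, 1],
    "s4": [0, 1, 0],
    "s5": [1, 0, 0],
    "s6": [0, 0, 0],
}

# Step 1: rank students by total score from highest to lowest.
ranked = sorted(students, key=lambda s: sum(students[s]), reverse=True)

# Step 2: divide into a high-scoring half and a low-scoring half.
half = len(ranked) // 2
high, low = ranked[:half], ranked[half:]

# Steps 3-4: for each item, tally correct answers in each group, then
# compute the difficulty index ((H + L)/N) x 100 and the discrimination
# index 2 x (H - L)/N defined earlier.
n = len(ranked)
results = {}
for item in range(3):
    h = sum(students[s][item] for s in high)
    l = sum(students[s][item] for s in low)
    results[item] = (round((h + l) / n * 100, 1), round(2 * (h - l) / n, 2))
    print(f"item {item}: difficulty {results[item][0]}, discrimination {results[item][1]}")
```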
21. USING ITEM ANALYSIS RESULTS
Judge the worth or quality of a test
Aid subsequent test revision
Increase skills in test construction
Plan future activities
Provide a basis for discussing test results
Inform promotion of students to the next higher
grade
Improve teaching methods and techniques