Item analysis2: Presentation Transcript

    • Item Analysis
      When analyzing test items, we have several questions about the performance of each item. Some of these questions include:
      • Are the items congruent with the test objectives?
      • Are the items valid? Do they measure what they are supposed to measure?
      • Are the items reliable? Do they measure consistently?
      • How long does it take an examinee to complete each item?
      • Which items are most difficult to answer correctly?
      • Which items are easy?
      • Are there any poorly performing items that need to be discarded?
    • Types of Item Analysis for CTT
      Three major types:
      1. Assess the quality of the distracters.
      2. Assess the difficulty of the items.
      3. Assess how well an item differentiates between high and low performers.
    • Purposes and Elements of Item Analysis
      • To select the best available items for the final form of the test.
      • To identify structural or content defects in the items.
      • To detect learning difficulties of the class as a whole.
      • To identify the areas of weakness of students in need of remediation.
    • Three Elements of Item Analysis
      1. Examination of the difficulty level of the items.
      2. Determination of the discriminating power of each item.
      3. Examination of the effectiveness of distracters in multiple-choice or matching items.
    • The difficulty level of an item is known as the index of difficulty.
      • The index of difficulty is the percentage of students answering each item in the test correctly.
      • The index of discrimination refers to the percentage of high-scoring individuals responding correctly versus the percentage of low-scoring individuals responding correctly to an item. This numeric index indicates how effectively an item differentiates between the students who did well and those who did poorly on the test.
    • Preparing Data for Item Analysis
      1. Arrange the test scores from highest to lowest.
      2. Get one third of the papers from the highest scores and another third from the lowest scores.
      3. Record separately the number of times each alternative was chosen by the students in both groups.
    • Preparing Data for Item Analysis (continued)
      4. Add the number of correct answers to each item made by the combined upper and lower groups.
      5. Compute the index of difficulty for each item with the following formula:
         IDF = (NRC / TS) x 100
         where IDF = index of difficulty, NRC = number of students responding correctly to the item, and TS = total number of students in the upper and lower groups.
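
      A minimal Python sketch of the difficulty-index formula above; the function name, argument names, and example numbers are illustrative, not from the slides:

          def index_of_difficulty(num_correct, total_students):
              """IDF = (NRC / TS) * 100: percentage of the combined upper and
              lower groups who answered the item correctly."""
              return 100.0 * num_correct / total_students

          # Example: 27 of 40 students (upper + lower groups combined) answered correctly.
          print(index_of_difficulty(27, 40))  # 67.5
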
    • Preparing Data for Item Analysis (continued)
      6. Compute the index of discrimination with the formula:
         IDN = (CU - CL) / NSG
         where IDN = index of discrimination, CU = number of correct responses in the upper group, CL = number of correct responses in the lower group, and NSG = number of students per group.
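
      Likewise, a small sketch of the discrimination formula; the example counts are made up for illustration:

          def index_of_discrimination(correct_upper, correct_lower, group_size):
              """IDN = (CU - CL) / NSG: how much better the upper group did than
              the lower group on this item, on a -1 to +1 scale."""
              return (correct_upper - correct_lower) / group_size

          # Example: 18 of 20 upper-group students and 8 of 20 lower-group students
          # answered the item correctly.
          print(index_of_discrimination(18, 8, 20))  # 0.5
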
    • Using Information About the Index of Difficulty
      The difficulty index of a test item tells a teacher about the comprehension of, or performance on, the material or task contained in the item.
    • For an item to be considered a good item, its difficulty index should be 50%. An item with a 50% difficulty index is neither easy nor difficult.
      • If an item has a difficulty index of 67.5%, this means that it is 67.5% easy and 32.5% difficult.
      • Information on the index of difficulty of an item can help a teacher decide whether a test item should be revised, retained or modified.
    • Interpretation of the Difficulty Index

        Range        Difficulty Level
        20 & below   Very difficult
        21-40        Difficult
        41-60        Average
        61-80        Easy
        81 & above   Very easy
    • Using Information About the Index of Discrimination
      • The index of discrimination tells a teacher the degree to which a test item differentiates the high achievers from the low achievers in the class. A test item may have positive or negative discriminating power.
      • An item has positive discriminating power when more students from the upper group got the right answer than students from the lower group.
      • When more students from the lower group got the correct answer on an item than students from the upper group, the item has negative discriminating power.
    • There are instances when an item has zero discriminating power: when equal numbers of students from the upper and lower groups got the right answer to a test item.
      In the given example, item 5 has the highest discriminating power. This means that it can differentiate between high and low achievers.
    • Interpretation of the Discrimination Index

        Range         Verbal Description
        .40 & above   Very good item
        .30 - .39     Good item
        .20 - .29     Fair item
        .09 - .19     Poor item
    • When Should a Test Item Be Rejected? Retained? Modified or Revised?
      • A test item can be retained when its level of difficulty is average and its discriminating power is positive.
      • It has to be rejected when it is either easy / very easy or difficult / very difficult and its discriminating power is negative or zero.
      • An item can be modified when its difficulty level is average and its discrimination index is negative.
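
      A minimal Python sketch of these retain / reject / modify rules, reusing the difficulty bands from the interpretation table above; the function names are illustrative, and the "review" fallback covers combinations the slide does not address explicitly:

          def difficulty_level(idf):
              """Classify the difficulty index (in %) using the slide's bands."""
              if idf <= 20: return "very difficult"
              if idf <= 40: return "difficult"
              if idf <= 60: return "average"
              if idf <= 80: return "easy"
              return "very easy"

          def item_decision(idf, idn):
              """Apply the retain / reject / modify rules from the slide."""
              level = difficulty_level(idf)
              if level == "average" and idn > 0:
                  return "retain"
              if level != "average" and idn <= 0:
                  return "reject"
              if level == "average" and idn < 0:
                  return "modify"
              return "review"  # combinations the slide does not cover

          print(item_decision(55, 0.35))   # retain
          print(item_decision(85, 0.0))    # reject
          print(item_decision(50, -0.10))  # modify
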
    • Examining Distracter Effectiveness
      An ideal item is one that all students in the upper group answer correctly and all students in the lower group answer wrongly, with the responses of the lower group evenly distributed among the incorrect alternatives.
    • Developing an Item Data File
      • Encourages teachers to undertake an item analysis as often as practical.
      • Allows accumulated data to be used to make item analysis more reliable.
      • Provides for a wider choice of item formats and objectives.
      • Facilitates the revision of items.
      • Accumulates a large enough pool of items to allow some items to be shared with students for study purposes.
    • Limitations of Item Analysis
      • It cannot be used for essay items.
      • Teachers must be cautious about the damage that may be done to the table of specifications when items not meeting the criteria are deleted from the test. These items should be rewritten or replaced.
    • What Is Item Discrimination?
      • Generally, students who did well on the exam should select the correct answer to any given item on the exam.
      • The discrimination index distinguishes, for each item, between the performance of students who did well on the exam and students who did poorly.
    • How Does It Work?
      • For each item, subtract the number of students in the lower group who answered correctly from the number of students in the upper group who answered correctly.
      • Divide the result by the number of students in one group.
      • The discrimination index is listed in decimal format and ranges between -1 and +1.
    • What Is a “Good” Value?
    • Item Discrimination: Examples

        Number of correct answers in group
        Item no.   Upper 1/4   Lower 1/4   Discrimination Index
        1          90          20           0.7
        2          80          70           0.1
        3          100         0            1.0
        4          100         100          0.0
        5          50          50           0.0
        6          20          60          -0.4
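
      A quick check of the table above, recomputing each index with the discrimination formula; the group size of 100 is an inference from items 3 and 4, where a full group answered correctly:

          # (upper correct, lower correct) per item, taken from the table above
          items = [(90, 20), (80, 70), (100, 0), (100, 100), (50, 50), (20, 60)]
          group_size = 100  # inferred: items 3 and 4 show a full group of 100

          for number, (upper, lower) in enumerate(items, start=1):
              index = (upper - lower) / group_size
              print(f"Item {number}: discrimination index = {index:+.1f}")
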
    • Quick Reference
      Use the following table as a guideline to determine whether an item (or its corresponding instruction) should be considered for revision. D is the item discrimination index; the columns give the item difficulty level.

                                  Item Difficulty
        Item Discrimination (D)   High     Medium   Low
        D <= 0%                   review   review   review
        0% < D < 30%              ok       review   ok
        D >= 30%                  ok       ok       ok
    • Distracter Analysis
      • The first question of item analysis: how many people choose each response?
      • If there is only one best response, then all other response options are distracters.
      • Example from an in-class assignment (N = 35): "Which method has the best internal consistency?"
          a) Projective test    1
          b) Peer ratings       1
          c) Forced choice     21
          d) Differences n.s.  12
    • Distracter Analysis (cont’d)
      • A perfect test item would have two characteristics:
        1. Everyone who knows the item gets it right.
        2. People who do not know the item have their responses equally distributed across the wrong answers.
      • It is not desirable to have one of the distracters chosen more often than the correct answer.
      • Such a result indicates a potential problem with the question: the distracter may be too similar to the correct answer, and/or there may be something in either the stem or the alternatives that is misleading.
    • Distracter Analysis (cont’d)
      • Calculate the number of people expected to choose each of the distracters. If responses are random, the same number is expected for each wrong response (Figure 10-1).

        Expected number choosing each distracter = (number answering incorrectly) / (number of distracters) = 14 / 3 = 4.7
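
      A small sketch of that expected-count calculation, using the in-class example above (N = 35, option c keyed as correct per the response counts shown earlier):

          # Response counts from the in-class example (N = 35); "c" is the keyed answer.
          counts = {"a": 1, "b": 1, "c": 21, "d": 12}
          correct_option = "c"

          num_incorrect = sum(n for option, n in counts.items() if option != correct_option)
          num_distracters = len(counts) - 1
          expected_per_distracter = num_incorrect / num_distracters

          print(num_incorrect, num_distracters, round(expected_per_distracter, 1))  # 14 3 4.7
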
    • Distracter Analysis (cont’d)
      • When the number of persons choosing a distracter significantly exceeds the number expected, there are two possibilities:
        1. The choice may reflect partial knowledge.
        2. The item is a poorly worded trick question.
      • An unpopular distracter may lower item and test difficulty because it is easily eliminated.
      • An extremely popular distracter is likely to lower the reliability and validity of the test.
    • Distracter Analysis: Definition
      • Compare the performance of the highest- and lowest-scoring 25% of the students on the distracter options (i.e., the incorrect answers presented on the exam).
      • Fewer of the top performers should choose each of the distracters as their answer compared to the bottom performers.
    • Distracter Analysis: Examples

        Item 1                        A    B    C    D    E    Omit
        % of students in upper 1/4    20   5    0    0    0    0
        % of students in middle       15   10   10   10   5    0
        % of students in lower 1/4    5    5    5    10   0    0

        Item 2                        A    B    C    D    E    Omit
        % of students in upper 1/4    0    5    5    15   0    0
        % of students in middle       0    10   15   5    20   0
        % of students in lower 1/4    0    5    10   0    10   0
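
      A brief sketch of the comparison described on the definition slide, flagging any option chosen at least as often by top-quartile as by bottom-quartile students; the data are Item 1's rows from the table above, and the keyed answer is assumed (not stated on the slide) to be A, the option favored by the upper quarter:

          # Item 1 from the table above: % of students choosing each option.
          upper = {"A": 20, "B": 5, "C": 0, "D": 0, "E": 0, "Omit": 0}
          lower = {"A": 5,  "B": 5, "C": 5, "D": 10, "E": 0, "Omit": 0}
          keyed = "A"  # assumed correct answer (favored by the upper quarter)

          for option in upper:
              if option in (keyed, "Omit"):
                  continue
              chosen_at_all = upper[option] + lower[option] > 0
              # A distracter should attract fewer top performers than bottom performers.
              if chosen_at_all and upper[option] >= lower[option]:
                  print(f"Option {option} may need review: upper {upper[option]}% vs lower {lower[option]}%")
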
    • Distracter Analysis: Discussion
      • What is the purpose of a good distracter?
      • Which distracters should you consider throwing out?
    • Item Analysis Report
    • Exercise: Interpret an Item Analysis
      • Review the sample report.
      • Identify any exam items that may require revision.
      • For each identified item, list your observations and a hypothesis about the nature of the problem.
    • Knowledge or Successful Guessing?
      • Multiple-choice exam strategies:
        - Improve the odds by eliminating one or more infeasible or unlikely answer options.
      • Descriptive exam strategies:
        - Brain dumping
        - Part marks
        - Consideration for perfect answers to questions that were not asked
    • The possibility of a “random pass” depends on the number of answer options per question and the number of questions!
    • Percent Passing (scoring 50% or better) by Chance

        Number of
        Questions   2 choices   3 choices   4 choices   5 choices
        1           50          33          25          20
        2           75          56          44          36
        4           69          41          26          18
        6           66          32          17          10
        10          62          21          8           3
        20          59          9.2         1.4         .3
        50          56          1           .01         .0004
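
      As a sanity check on those figures, a short Python sketch of the underlying binomial calculation; it treats a "pass" as answering at least half of the questions correctly by pure guessing, which is the interpretation that reproduces the table's values:

          from math import comb, ceil

          def percent_pass_by_chance(num_questions, num_choices):
              """Probability (in %) of getting at least half the questions right
              by guessing uniformly at random among the answer options."""
              p = 1.0 / num_choices
              need = ceil(num_questions / 2)  # minimum correct answers for a 50% score
              prob = sum(
                  comb(num_questions, k) * p**k * (1 - p)**(num_questions - k)
                  for k in range(need, num_questions + 1)
              )
              return 100.0 * prob

          # Reproduce a few rows of the table above.
          for n in (1, 2, 4, 10, 20):
              print(n, [round(percent_pass_by_chance(n, c), 1) for c in (2, 3, 4, 5)])
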