
Achieve your 2017 Assessment Resolutions: How to Improve Your Item Writing Skills and Create Better Exams


Presented by Ainslie Nibert, Associate Dean/Associate Professor, College of Nursing, Texas Woman's University - Houston Center

Now that the spring semester has started, it's time to act on your new year's resolution to improve your item writing skills and build sound assessments. This webinar will help you create critical thinking test items for this semester's exams. You'll obtain valuable student response data from these new questions that can guide future editing and help you get the greatest benefit from your authoring efforts. By performing a systematic item analysis after each exam, you can pinpoint students' knowledge gaps, which will help you focus your item writing on those course objectives that are globally misunderstood or ignored. In addition to reviewing item writing techniques, we'll also cover the advantages of using electronic test blueprints to establish test validity and tie your assessments to your overall program objectives. You may not lose 10 pounds by February 14th, but you can develop 10 new (or newly edited) test items instead of polishing off that box of Valentine's chocolates – what a great way to make a productive start to 2017, and to your new semester!



  1. 1. Achieve Your 2017 Resolution to Improve Your Test Item Writing Skills and Create Better Exams Ainslie T. Nibert, PhD, RN, FAAN Consultant Email – anibert@comcast.net
  2. 2. Objectives Create/Edit test items that assess for application and analysis. Critique exams for alignment with the NCLEX-RN® Test Plan. Evaluate the relevance and effectiveness of current testing policies.
  3. 3. Resources for Developing Critical Thinking Test Items and Alternate Format Items: National Council Website www.ncsbn.org ◦ NCLEX Test Plans ◦ 2016 RN ◦ Candidate FAQ ◦ Alternate Item Formats FAQ ◦ Exam Development FAQ Source: https://www.ncsbn.org/9010.htm
  4. 4. Relationship between Testing & the Curriculum Focus today: INTERNAL Curriculum Evaluation (Teacher-made Tests)
  5. 5. Internal Evaluation: Evaluation of course objectives (faculty designed)  Writing Critical Thinking Test Items  Item Analysis Software & Blueprinting  Test Item Banking & Exam Delivery
  6. 6. Five Guidelines to Developing Effective Critical Thinking Exams  Assemble the “basics.”  Write critical thinking test items.  Pay attention to housekeeping duties.  Develop a test blueprint.  Scientifically analyze all exams.
  7. 7. Critical Thinking Test Items  Contain Rationale  Written at the Application Level or Above  Require Multilogical Thinking to Answer  Ask for High Level of Discrimination Source: Morrison, Nibert, & Flick (2006)
  8. 8. Housekeeping Tips
  9. 9. Rules  Get rid of names  Get rid of 'multiple' multiples  Use a non-sexist writing style  Develop a parsimonious writing style  Eliminate "of the following…"  Delete scenarios  Write items independent of each other
  10. 10. … and More Rules  Use a question format when possible  Make distracters plausible and homogeneous  Place words that are repeated in the responses into the stem
  11. 11. … and More Rules  Eliminate “all of the above” and “none of the above”  Rewrite any “all except” questions  Ensure that alternatives do not overlap  Present choices in a logical order  Vary correct answer
  12. 12. … and the MOST IMPORTANT Rule Develop written testing policy  Writing style  Format
  13. 13. Content Validity: Does the test measure what it claims to measure?
  14. 14. Use a Blueprint to Assess a Test’s Validity  Test Blueprint  Reflects Course Objectives  Rational/Logical Tool  Testing Software Program  Storage of item analysis data (Last & Cumulative)  Storage of test item categories
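To make the blueprint idea concrete, here is a minimal sketch of how a test blueprint might be represented in software: each course objective maps to the number of items planned at each cognitive level, and a small check compares what was actually written against the plan. The objective names, cognitive levels, and function names are illustrative assumptions, not part of the webinar or of any particular testing software package.

```python
# Illustrative blueprint: objectives -> planned item counts per cognitive level.
# All names and numbers are made up for the example.
blueprint = {
    "Objective 1: Safe medication administration": {"application": 6, "analysis": 4},
    "Objective 2: Prioritization of care":         {"application": 5, "analysis": 5},
    "Objective 3: Client teaching":                {"application": 4, "analysis": 2},
}

def planned_total(bp):
    """Total number of items the blueprint calls for."""
    return sum(sum(levels.values()) for levels in bp.values())

def coverage_report(bp, written):
    """Compare items actually written (same nested structure) against the plan."""
    for objective, levels in bp.items():
        for level, planned in levels.items():
            actual = written.get(objective, {}).get(level, 0)
            if actual < planned:
                print(f"Short {planned - actual} {level}-level item(s) for {objective}")

print(planned_total(blueprint))  # 26 items planned
```

Keeping the blueprint as plain data like this makes it straightforward to feed the same plan into whatever electronic blueprint feature a program's testing software already provides.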
  15. 15. Consistency of Scores
  16. 16. Reliability Tools  Kuder-Richardson Formula 20 (KR20)—EXAM Range from –1 to + 1  Point Biserial Correlation Coefficient (PBCC)—TEST ITEMS Range from – 1 to + 1
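As a rough illustration of the two reliability tools named above, the sketch below computes KR-20 for a whole exam and the point biserial correlation for a single item, assuming a students-by-items matrix of 0/1 scores and the standard textbook formulas. The numpy dependency and the tiny data set are assumptions added for the example, not data from the presentation.

```python
import numpy as np

def kr20(scores):
    """Kuder-Richardson 20 for a students x items matrix of 0/1 scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                    # number of items
    p = scores.mean(axis=0)                # proportion answering each item correctly
    q = 1.0 - p
    total_var = scores.sum(axis=1).var()   # variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

def point_biserial(item_scores, total_scores):
    """Point biserial correlation between one 0/1 item and the total score."""
    return np.corrcoef(item_scores, total_scores)[0, 1]

# Tiny illustrative data set (5 students x 4 items), not real exam data.
responses = np.array([[1, 1, 1, 0],
                      [1, 1, 0, 0],
                      [1, 0, 1, 1],
                      [0, 1, 0, 0],
                      [1, 1, 1, 1]])
totals = responses.sum(axis=1)
print(round(kr20(responses), 2))
print(round(point_biserial(responses[:, 2], totals), 2))
```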
  17. 17.  Item difficulty 30% - 90%  Item Discrimination Ratio 25% and Above  PBCC 0.20 and Above  KR20 0.70 and Above Standards of Acceptance
  18. 18. one “absolute” rule about item difficulty TEST ITEMS ANSWERED CORRECTLY BY 30% or LESS of the examinees should always be considered too difficult, and the instructor must take action.
  19. 19. …but what about high difficulty levels? Test items with high difficulty levels (>90%) often yield poor discrimination values. Is there a situation where faculty can legitimately expect that 100% of the class will answer a test item correctly, and be pleased when this happens? RULE OF THUMB ABOUT MASTERY ITEMS: Due to their negative impact on test discrimination and reliability, they should comprise no more than 10% of the test.
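A hedged sketch of how the two difficulty rules from the preceding slides might be automated: flag items answered correctly by 30% or less of examinees as too difficult, flag items above 90% as mastery items, and check that mastery items stay within roughly 10% of the test. The function name, threshold names, and sample values are mine, not the presenter's.

```python
def difficulty_flags(p_values, mastery_share_limit=0.10):
    """Apply the difficulty rules to per-item difficulty values
    (proportion of examinees answering correctly)."""
    too_hard = [i for i, p in enumerate(p_values) if p <= 0.30]
    mastery = [i for i, p in enumerate(p_values) if p > 0.90]
    return {
        "too_difficult (<=30% correct, action required)": too_hard,
        "mastery_items (>90% correct)": mastery,
        "mastery_share_ok": len(mastery) / len(p_values) <= mastery_share_limit,
    }

# Illustrative difficulty values for a 10-item quiz: item 0 is too hard,
# and two mastery items exceed the ~10% guideline.
print(difficulty_flags([0.25, 0.55, 0.80, 0.95, 0.62, 0.71, 0.88, 0.93, 0.60, 0.77]))
```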
  20. 20.  Item difficulty 30% - 90%  Item Discrimination Ratio 25% and Above  PBCC 0.20 and Above  KR20 0.70 and Above Standards of Acceptance
  21. 21. Thinking more about item discrimination statistics on teacher-made tests… IDR can be calculated quickly, but doesn't effectively consider the variance of the entire group. Use it to quickly identify items that have zero/negative discrimination values, since these need to be edited before being used again. PBCC is a more powerful measure of discrimination. It correlates the correct answer to a single test item with the student's total test score, and it considers the variance of the entire student group, not just the lower and upper 27% groups. For a small 'n,' consider referencing the cumulative value.
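The slide's comparison can be shown with a short sketch: the discrimination ratio uses only the top and bottom 27% of examinees ranked by total score, while the point biserial uses the whole group. The data, names, and exact grouping logic here are illustrative assumptions.

```python
import numpy as np

def discrimination_ratio(item, totals, tail=0.27):
    """Upper-minus-lower discrimination index: difference in proportion correct
    between the top and bottom 'tail' fraction of examinees by total score."""
    order = np.argsort(totals)
    n = max(1, int(round(tail * len(totals))))
    low, high = order[:n], order[-n:]
    return item[high].mean() - item[low].mean()

# Illustrative data only: one 0/1 item and total scores for 10 students.
item   = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 0])
totals = np.array([42, 25, 39, 44, 22, 37, 28, 46, 40, 30])

print(round(discrimination_ratio(item, totals), 2))     # quick screen for zero/negative values
print(round(np.corrcoef(item, totals)[0, 1], 2))        # PBCC: whole-group statistic
```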
  22. 22. … what decisions need to be made about Test items? When a test item has poor difficulty and/or discrimination values, action is needed. All of these actions require that the exam be rescored. Credit can be given for more than one choice. Test item can be nullified. Test item can be deleted. REMEMBER: Each of these actions has a consequence, so faculty need to carefully consider these when choosing an action. Faculty judgment is crucial when determining actions affecting test scores.
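A sketch of what the three rescoring actions could look like in code, assuming a simple list-of-answers data layout; every name here is hypothetical. It also shows why each action has a consequence: crediting extra options or nullifying an item raises scores, while deleting an item shrinks the total possible.

```python
def rescore(responses, key, credit=None, nullify=(), delete=()):
    """Rescore an exam after item-analysis decisions (all names illustrative).
    responses: one chosen option per item per student, e.g. ["A", "C", ...]
    key:       the original correct option per item
    credit:    {item_index: {"A", "C"}} -> give credit for more than one choice
    nullify:   item indexes counted correct for every student
    delete:    item indexes dropped from the exam total entirely
    """
    credit = credit or {}
    scores = []
    for answers in responses:
        score = 0
        for i, given in enumerate(answers):
            if i in delete:
                continue                       # removed: total possible shrinks
            if i in nullify:
                score += 1                     # everyone earns the point
            elif given in credit.get(i, {key[i]}):
                score += 1                     # accepted answer(s) for this item
        scores.append(score)
    return scores

# Illustrative: 3 students, 4 items; item 1 accepts B or D, item 3 is deleted.
print(rescore([["A", "B", "C", "D"], ["A", "D", "C", "A"], ["B", "B", "A", "D"]],
              key=["A", "B", "C", "D"],
              credit={1: {"B", "D"}},
              delete={3}))
```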
  23. 23. Standards of Acceptance: Nursing  Nursing-PBCC 0.15 and Above  Nursing-KR20 0.60 - 0.65 and Above
  24. 24. 3-Step Method for Item Analysis 1. Review Difficulty Level 2. Review Discrimination Data  Item Discrimination Ratio (IDR)  Point Biserial Correlation Coefficient (PBCC) 3. Review Effectiveness of Alternatives  Response Frequencies  Non-distracters
  25. 25. …and a word about using Response Frequencies A review of the response frequency data can focus your editing. For items where 100% of students answer correctly and no other options were chosen, make sure that this is indeed intentional (a MASTERY ITEM) and not just reflective of an item that is too easy (>90% difficulty). Target rewriting the "zero" distracters – those options that are ignored by students. Replacing "zeros" with plausible options will immediately improve item DISCRIMINATION.
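The sketch below shows one way to tabulate response frequencies and surface the "zero" distracters the slide recommends rewriting. The data layout and the A-D option set are assumptions for illustration, not the format of any particular testing software.

```python
from collections import Counter

def response_frequencies(responses, options=("A", "B", "C", "D")):
    """Per-item counts of each option, plus distracters no student selected.
    responses: one chosen option per item per student (illustrative structure)."""
    n_items = len(responses[0])
    report = []
    for i in range(n_items):
        counts = Counter(student[i] for student in responses)
        zeros = [opt for opt in options if counts[opt] == 0]
        report.append({"item": i, "counts": dict(counts), "zero_distracters": zeros})
    return report

# Tiny illustrative data: options C and D on item 0 were never chosen -> rewrite them.
students = [["A", "B"], ["A", "D"], ["B", "B"], ["A", "B"]]
for row in response_frequencies(students):
    print(row)
```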
  26. 26. Critical Thinking Questions Which intervention is most important? Which intervention, plan, or assessment data is/are most critical to developing a plan of care? Which intervention should be done first? What action should the nurse take first? Which intervention, plan, or nursing action has the highest priority? What response is best?
  28. 28. Fair/Common Universal Language The client is running late for an appointment. The client understands Buddhist practices are peaceful. The client is on five different medications. The client ate a submarine sandwich. The alcoholic client with delirium tremens is agitated. After the client sneezed, the nurse said "bless you." The nurse is giving a report on the client. The nursing unit is working shorthanded. Source: Bristol, T. (2016). NCLEX® Updates (webinar series). Available: http://nursetim.com/webinars/nclex
  29. 29. Latest NCLEX® Test Item Format Considerations Units of Measure •International System of Units (SI) •Metric •Imperial Measurement Generic vs. Trade Names for Medications •Generic names only in most cases •References to general classifications of medications
  30. 30. Item Writing Tools for Success … Knowledge Test Blueprint Testing Software
  31. 31. References Morrison, S., Nibert, A., & Flick, J. (2006). Critical thinking and test item writing (2nd ed.). Houston, TX: Health Education Systems, Inc. National Council of State Boards of Nursing. (2016). 2016 NCLEX-RN test plan. Chicago, IL: National Council of State Boards of Nursing. https://www.ncsbn.org/RN_Test_Plan_2016_Final.pdf Nibert, A. (2010). Benchmarking for student progression throughout a nursing program: Implications for students, faculty, and administrators. In L. Caputi (Ed.), Teaching nursing: The art and science (2nd ed., Vol. 3, pp. 45-64). Chicago, IL: College of DuPage Press.
