Caveon Webinar Series: Considerations for Online Assessment Program Design


This month's Caveon Webinar Series session focuses on the Online Education market, but the message shared by two industry veterans will be helpful for all test programs.

In this Webinar, we are joined by special guests Dr. Larry Rudner of GMAC and Dr. Mika Hoffman of Excelsior College. These two esteemed testing veterans will describe basics of good and secure test design and provide considerations for designing an assessment program.

Here's what you'll learn:

- Strategies for developing and improving online testing
- Why good test writing is important to overall learning
- Considerations for implementing low- and high-stakes online assessments
- Online assessment strategies geared specifically to the online education market

The speakers will present real-world examples, with practical considerations for implementing various levels of student assessment. An online assessment checklist will be provided to help you identify priorities for your online assessment initiatives.

Featuring presentations by:

Larry Rudner, Ph.D. – is Vice President of Research and Development at the Graduate Management Admission Council (GMAC). He has 30 years of experience in the areas of test validation, adaptive testing, professional standards, QTI specifications, test security, data forensics, and contract monitoring.

Mika Hoffman, Ph.D. – is the Executive Director of the Center for Educational Measurement at Excelsior College. She has over 20 years of professional experience in test design, quality control, integration of psychometric analyses, and assessment development and production processes for higher education and government.

Please contact richelle.gruber@caveon.com if you have any questions or problems viewing.

Speaker notes:
  • What stakes are attached to a non-credit placement exam? (Low, Mid, High) What stakes are attached to an exam leading to a non-credit certificate for professional training? (Low, Mid, High)
  • Why should we worry about test quality?
  • We talked about validity earlier: basically the relationship between performance on the GMAT® and performance in your program. This chart shows the relationship between GMAT® scores and first-year grades; each point shows the values for one person. In general, higher GMAT® scores lead to higher expected grades, but this is not true in every case. The validity coefficient measures how closely the points follow the trend line fit through them. Several things affect this validity. For instance, low scorers may not be accepted into the program at all, so with only the admitted students it is harder to detect the trend (a restriction-of-range effect); likewise, if everyone's grades fall within a narrow band, the trend is again hard to make out. You may also have a few people who buck the trend: people who succeed whom you might not have expected to, or people who do poorly even when they have everything going for them. When averaged in with the others, these people can skew the results a bit.

1. Considerations for Online Assessment Program Design. Co-hosted by: ExcelSoft. Hashtag #CaveonWbnr
2. Considerations for Online Assessment Program Design. Presented by: Mika Hoffman, Executive Director – Center for Educational Measurement, Excelsior College, & Lawrence M. Rudner, Vice President and Chief Psychometrician, Research and Development, Graduate Management Admission Council (GMAC)
3. Contents: Introduction; Understanding Online Assessment Programs; Understanding Test Development & Psychometricians; Q&A
4. Considerations for Online Assessment Program Design – Understanding Online Assessment Programs (Mika Hoffman)
5. Overview: Types of academic assessment; The stakes involved; Validity; Test planning; Proctoring and identity verification; Example: how Excelsior does it
6. Types of academic assessment: Diagnostic assessment (placement in sections/courses; identification of strengths and weaknesses); Formative assessment (provides feedback to students; may shape lesson plans; identification of strengths and weaknesses); Summative assessment (assesses the outcome of learning)
7. The stakes involved: Low stakes (quizzes with little impact on grade; self-assessments; assessments in non-credit courses); Mid stakes (tests with substantial impact on grade; challenge exams to bypass requirements); High stakes (summative assessments determining all or most of a grade; credit by examination; entrance exams, e.g., SAT, GRE, GMAT)
8. Validity: In academic testing, we can say that a test is valid if it gives us reasonable assurance that a person claiming to know the relevant academic material actually does know it. Need to establish: What is the knowledge? Is it relevant to the academic subject? Who has the knowledge?
9. Test Planning: Deals with what knowledge is being tested and whether it's relevant to the academic subject and the purpose of the assessment. For tests that are for "all the marbles," the test plan is the equivalent of a syllabus and learning objectives. Even for quizzes, it's good to know what the quiz is expected to accomplish.
10. Test Security: Not just about proctoring; you need to know that the test has been secure throughout development. Security is related to validity: if students get a good score because they saw the material and memorized it ahead of time, what are you testing?
11. Proctoring and Identity Verification: Need to verify that the people taking the test are who they say they are; need to verify that the people taking the test are using their knowledge of the subject, not other aids (references, friends, the Internet); need to ensure that the test content is secure.
12. Example: Excelsior College's exams are high-stakes, "all the marbles" exams, designed to stand alone as the equivalent of a 3-credit course. Test plans are written by a committee with testing experts and instructors of the subject taken from around the country. Practice exams are delivered online with username/password verification. Proctoring and identity verification are done in person at Pearson VUE testing centers.
13. Considerations for Online Assessment Program Design – Understanding Test Development & Psychometricians (Lawrence M. Rudner)
14. Overview: Building a quality test; Marks of quality; Sources of error
15. Question for our attendees: Why should we worry about test quality?
16. Quality: Test takers are entitled to assurance that no examinee enjoys an unfair advantage. Testing organizations have an obligation to provide, or use their best efforts to provide, only valid scores. Organizations have the right to protect their own reputation by assuring the reliability of the information they provide.
17. Test Development Process (flowchart): Identify desired content → Establish test specifications → Develop new items → Review new items → Pilot new items → Administer → Conduct item analysis → Assemble new pool/forms
18. Marks of Quality – Test Reliability
19. Marks of Quality – Test Reliability
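Test reliability is commonly reported as an internal-consistency coefficient. As a minimal sketch, assuming Cronbach's alpha (which reduces to KR-20 for 0/1-scored items) and hypothetical response data, since the slides do not name the coefficient they report:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an examinees-by-items score matrix.
    With 0/1 item scores this is the same as KR-20."""
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return n_items / (n_items - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 6 examinees by 4 dichotomously scored items.
responses = np.array([
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")  # ~0.66 for this toy data
```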
20. More Validity. Validity: relationship between test score and outcome measure. (Scatter plot of Success / True Mastery against Test Score; the score axis runs 200–500–800.)
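The validity coefficient the slide describes is simply the correlation between test scores and the outcome measure (for example, first-year grades). A minimal sketch with hypothetical score/GPA pairs, not actual GMAT® data:

```python
import numpy as np

# Hypothetical (test score, first-year GPA) pairs -- illustrative only.
scores = np.array([720, 680, 650, 600, 560, 520, 480, 430])
gpa    = np.array([3.9, 3.6, 3.7, 3.2, 3.4, 2.9, 3.0, 2.6])

# Validity coefficient: Pearson correlation between score and outcome.
# Restriction of range (e.g., only admitting high scorers) lowers the observed r.
validity = np.corrcoef(scores, gpa)[0, 1]
print(f"validity coefficient r = {validity:.2f}")
```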
21. Item Analysis – Response Options
22. Item Analysis – Response Options
23. Item Analysis – Response Options
24. Item Analysis – Response Options: Item 27, Rit = 0.43; options 1 (3), 2* (69), 3 (7), 4 (20), with * marking the key; the chart plots the percentage choosing each option across four score groups.
25. Item Analysis – Response Options: Item 22, Rit = -0.26; options 1 (5), 2 (5), 3 (59), 4* (30).
26. Item Analysis – Response Options: Item 39, Rit = 0.15; options 1* (97), 2 (0), 3 (1), 4 (2).
27. Item Analysis – Response Options: Item 34, Rit = 0.57; options 1* (73), 2 (7), 3 (11), 4 (9).
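The item plots on slides 24–27 report Rit, the item–rest correlation, together with the share of examinees choosing each option by score group. As a minimal sketch of how those quantities can be computed from raw responses, using hypothetical data and an assumed split into four equal-sized score groups (the slides do not show the computation):

```python
import numpy as np

def item_rest_correlation(item_scores, total_scores):
    """Rit: correlation between an item's 0/1 score and the rest score
    (the total score with that item's contribution removed)."""
    item_scores = np.asarray(item_scores)
    rest_scores = np.asarray(total_scores) - item_scores
    return np.corrcoef(item_scores, rest_scores)[0, 1]

def option_percentages_by_group(chosen, total_scores, n_groups=4):
    """Percent choosing each option within each score group, where groups
    are formed by splitting examinees into n_groups by total score."""
    chosen = np.asarray(chosen)
    order = np.argsort(total_scores)              # low scorers to high scorers
    table = {}
    for g, idx in enumerate(np.array_split(order, n_groups), start=1):
        table[g] = {int(opt): round(100 * np.mean(chosen[idx] == opt), 1)
                    for opt in np.unique(chosen)}
    return table

# Hypothetical data: option (1-4) chosen by each of 10 examinees, the key,
# and each examinee's total test score.
chosen = np.array([2, 2, 3, 2, 1, 3, 2, 4, 2, 3])
key = 2
totals = np.array([38, 35, 20, 31, 18, 22, 29, 16, 33, 25])

item = (chosen == key).astype(int)                # 1 if correct, else 0
print("Rit =", round(item_rest_correlation(item, totals), 2))
print(option_percentages_by_group(chosen, totals))
```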
28. Item Analysis – Item discrimination: Question 19, r = .70. Total score vs. right/wrong on the item: 80 (right), 70 (right), 65 (wrong), 60 (right), 50 (wrong), 30 (wrong), 20 (wrong).
29. Item Analysis – Item discrimination: Question 7
30. Item Analysis – Item discrimination: Question 19
31. Item Analysis – Item discrimination: Question 7
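The discrimination reported for Question 19 above (r = .70) is a point-biserial correlation: the Pearson correlation between the dichotomous item score and the total test score. A short check using the seven score/right-wrong pairs from slide 28:

```python
import numpy as np

# Score / right-wrong pairs from the Question 19 example (slide 28).
total_scores = np.array([80, 70, 65, 60, 50, 30, 20])
item_correct = np.array([ 1,  1,  0,  1,  0,  0,  0])  # 1 = right, 0 = wrong

# Point-biserial discrimination = Pearson r of item score with total score.
r_pb = np.corrcoef(item_correct, total_scores)[0, 1]
print(f"point-biserial r = {r_pb:.2f}")  # ~0.71, close to the r = .70 on the slide
```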
32. Constructing a quality test: 1. Good representation of content; 2. Good, proven test questions; 3. Enough test questions; 4. Equivalent alternate forms
33. Our Gifts To You! Online Item Writing Training – URL: training.caveon.net, code word: online14; Internet-based training – 1 hour of training, code good for 2 weeks! 30-minute needs analysis – http://testing-assessments.excelind; ExcelSoft will provide you with a 30-minute consultation, at no cost, for assessing your test development and delivery needs, whether you are looking to implement or make changes to your test delivery platform.
34. Helpful Resources: LinkedIn Group – "Caveon Test Security" – Join Us! Follow us on twitter @Caveon. www.caveon/resources/webinars – slides & recordings. Cheating Articles at www.caveon.com/citn. Caveon Security Insights blog – www.caveon.com/blog. CSI Newsletter – Contact us to get on the mailing list!
35. Thank You! Special Thanks to: Lawrence M. Rudner, Ph.D., Vice President and Chief Psychometrician, Research and Development, Graduate Management Admission Council, lrudner@gmac.com; Mika Hoffman, Ph.D., Executive Director – Center for Educational Measurement, Excelsior College, mhoffman@excelsior.edu. Please contact richelle.gruber@caveon.com for feedback or a copy of the slides.
36. Upcoming Events: Please visit our sessions and our Caveon booth #209 at ATP's Innovations In Testing – February 3-6, 2013. Steve Addicott presenting at SeaSkyLand Conference in ShenZhen – February 2013. John presenting at TILSA meeting February 7th in Atlanta; other presenters include John Olson and Greg Cizek. Release of TILSA Test Security Guidebook – visit our booth to discuss it with John Fremer at ATP! Handbook of Test Security – to be published March 2013. CCSSO Best Practices meeting in June 2013.
37. Caveon ATP Sessions: Tell it to the Judge! Winning with Data Forensics Evidence in Court – Steve Addicott, 2/4/13, 10 am. Data Forensics: Opening the Black Box – John Fremer & Dennis Maynes, 2/4/13, 2:45 pm. A Synopsis of the Handbook of Test Security – David Foster & John Fremer, 2/4/13, 5 pm. From Foundations to Futures: Online Proctoring and Authentication (Kryterion session) – David Foster, 2/5/13, 11 am. Make, Buy, or Borrow: Acquiring SMEs – Nat Foster, 2/5/13, 1:15 pm.
38. ATP Sessions of our presenters: Free Tools to Nail the Brain Dumps – Monday 2/4/2013, 4:00 PM – 5:00 PM, Lawrence Rudner. The Game's Afoot: Sleuths Match Wits – Tuesday 2/5/2013, 11:00 AM – Noon, Lawrence Rudner & Dennis Maynes. Online Education – How Can We Make Sure Students are Really Learning? – Tuesday 2/5/2013, 11:00 AM – Noon, Jamie Mulkey & Mika Hoffman.
39. We hope to "See You" at our next sessions! Caveon's Lessons Learned from ATP – to be held Feb 20, 2013. The next webinar in the Online Education series: Designing Assessments for the Online Education Environment – to be held March 20, 2013.
