Caveon Webinar Series: Considerations for Online Assessment Program Design
This month's Caveon Webinar Series session focuses on the Online Education market, but the message shared by two industry veterans will be helpful for all test programs.

In this Webinar, we are joined by special guests Dr. Larry Rudner of GMAC and Dr. Mika Hoffman of Excelsior College. These two esteemed testing veterans describe the basics of good, secure test design and offer considerations for designing an assessment program.

Here's what you'll learn:

- Strategies for developing and improving online testing
- Why good test writing is important to overall learning
- Considerations for implementing low- and high-stakes online assessments
- Online assessment strategies geared specifically to the online education market

The speakers present real-world examples with practical considerations for implementing various levels of student assessment. An online assessment checklist will be provided to help you identify priorities for implementing online assessment initiatives.

Featuring presentations by:

Larry Rudner, Ph.D. – Vice President of Research and Development at the Graduate Management Admission Council (GMAC). His 30 years of experience span test validation, adaptive testing, professional standards, QTI specifications, test security, data forensics, and contract monitoring.

Mika Hoffman, Ph.D. – Executive Director of the Center for Educational Measurement at Excelsior College. She has over 20 years of professional experience in test design, quality control, integration of psychometric analyses, and assessment development and production processes for higher education and government.

Please contact richelle.gruber@caveon.com if you have any questions or problems viewing.

Presenter Notes

  • What stakes are attached to a non-credit placement exam? (Low, Mid, High) What stakes are attached to an exam leading to a non-credit certificate for professional training? (Low, Mid, High)
  • Why should we worry about test quality?
  • We talked about validity earlier: basically, the relationship between performance on the GMAT® and performance in your program. This chart shows the relationship between GMAT® scores and first-year grades; each point shows values for one person. In general, higher GMAT® scores lead to higher expected grades, but this is not true in every case. Validity reflects how closely the best-fitting line tracks these points, and several things can weaken it. For instance, if lower scorers are not accepted into the program, it is harder to find the trend from the remaining points; likewise, if everyone's grades fall in a narrow range, the trend is again hard to make out. A few people may buck the trend, succeeding unexpectedly or doing poorly despite having everything going for them, and when averaged in with the others they can skew the results.

Caveon Webinar Series: Considerations for Online Assessment Program Design Presentation Transcript

  • 1. Considerations for Online Assessment Program Design. Co-hosted by: ExcelSoft. Hashtag: #CaveonWbnr
  • 2. Considerations for Online Assessment Program Design. Presented by: Mika Hoffman, Executive Director – Center for Educational Measurement, Excelsior College & Lawrence M. Rudner, Vice President and Chief Psychometrician, Research and Development, Graduate Management Admission Council (GMAC)
  • 3. Contents: Introduction; Understanding Online Assessment Programs; Understanding Test Development & Psychometricians; Q&A
  • 4. Considerations for Online Assessment Program Design: Understanding Online Assessment Programs. Mika Hoffman
  • 5. Overview Types of academic assessment The stakes involved Validity Test planning Proctoring and identity verification Example—how Excelsior does it
  • 6. Types of academic assessment Diagnostic assessment  Placement in sections/courses  Identification of strengths and weaknesses Formative assessment  Provides feedback to students  May shape lesson plans  Identification of strengths and weaknesses Summative assessment  Assesses outcome of learning
  • 7. The stakes involved Low stakes  Quizzes with little impact on grade  Self-assessments  Assessments in non-credit courses Mid stakes  Tests with substantial impact on grade  Challenge exams to bypass requirements High stakes  Summative assessments determining all or most of grade  Credit by examination  Entrance exams (e.g., SAT, GRE, GMAT)
  • 8. Validity In academic testing, we can say that a test is valid if it gives us reasonable assurance that a person claiming to know the relevant academic material actually does know it. Need to establish  What is the knowledge?  Is it relevant to the academic subject?  Who has the knowledge?
  • 9. Test Planning Deals with what knowledge is being tested and whether it’s relevant to the academic subject and the purpose of the assessment  For tests that are for “all the marbles,” the test plan is the equivalent of a syllabus and learning objectives  Even for quizzes, it’s good to know what the quiz is expected to accomplish
  • 10. Test Security Not just about proctoring  Need to know that the test has been secure throughout development Security is related to validity  If students get a good score because they saw the material and memorized it ahead of time, what are you testing?
  • 11. Proctoring and Identity Verification Need to verify that the people taking the test are who they say they are Need to verify that the people taking the test are using their knowledge of the subject, not other aids (references, friends, the Internet) Need to ensure that the test content is secure
  • 12. Example Excelsior College’s exams are high stakes, “all the marbles” exams: designed to stand alone as the equivalent of a 3-credit course Test Plans are written by a committee with testing experts and instructors of the subject taken from around the country Practice exams delivered online with username/password verification Proctoring and identity verification done in person at Pearson VUE testing centers
  • 13. Considerations for Online Assessment Program Design: Understanding Test Development & Psychometricians. Lawrence M. Rudner
  • 14. Overview Building a quality test Marks of quality Sources of error
  • 15. Question for our attendees Why should we worry about test quality?
  • 16. Quality: Test takers are entitled to assurance that no examinee enjoys an unfair advantage. Testing organizations have an obligation to provide, or use their best efforts to provide, only valid scores. Organizations have the right to protect their own reputation by assuring the reliability of the information they provide.
  • 17. Test Development Process: Identify desired content → Establish test specifications → Develop new items → Review new items → Pilot new items → Administer → Conduct item analysis → Assemble new pool/forms
  • 18. Marks of Quality – Test Reliability
  • 19. Marks of Quality – Test Reliability
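Slides 18 and 19 present reliability charts whose content did not survive transcription. For background: one widely used internal-consistency estimate of test reliability is Cronbach's alpha. The sketch below is our own illustration (the `cronbach_alpha` helper and the response matrix are invented, not material from the webinar):

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha for a persons-by-items score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    using population variances throughout for consistency.
    """
    k = len(responses[0])                       # number of items
    items = list(zip(*responses))               # one column per item
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in responses])
    return k / (k - 1) * (1 - item_var / total_var)

# Illustrative data: four examinees answering three right/wrong items.
scores = [
    [1, 1, 1],   # strongest examinee
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 0],   # weakest examinee
]
print(round(cronbach_alpha(scores), 2))  # → 0.75
```

Values near 1 mean the items rank examinees consistently; guidance of roughly 0.9 or higher is often cited for high-stakes tests like those discussed here.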
  • 20. More Validity. Validity: the relationship between test score and an outcome measure. [Chart: Success/True Mastery plotted against Test Score, 200–800]
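A presenter note earlier on this page expands on this slide: when only high scorers are admitted (or grades cluster in a narrow band), the observed score-outcome correlation shrinks. This restriction-of-range effect can be demonstrated with fabricated numbers; nothing below is GMAC data, and `pearson_r` is our own helper:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Fabricated applicant pool: the outcome tracks the test score,
# plus a fixed +/-3 disturbance so the relationship is not perfect.
scores = list(range(1, 21))
outcomes = [s + (3 if s % 2 == 0 else -3) for s in scores]
r_full = pearson_r(scores, outcomes)

# "Admit" only the top half of scorers, as a selective program would.
admitted_scores = [s for s in scores if s > 10]
admitted_outcomes = [o for s, o in zip(scores, outcomes) if s > 10]
r_restricted = pearson_r(admitted_scores, admitted_outcomes)

print(f"full pool r = {r_full:.2f}")            # → full pool r = 0.90
print(f"admitted only r = {r_restricted:.2f}")  # → admitted only r = 0.75
```

The correlation drops not because the test got worse, but because the admitted group varies less on the predictor, which is exactly the caveat the presenter raises about validating entrance exams against grades.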
  • 21. Item Analysis – Response Options
  • 22. Item Analysis – Response Options
  • 23. Item Analysis – Response Options
  • 24. Item Analysis – Response Options: Item 27, Rit = 0.43. [Chart: percentage choosing each option (asterisk marks the key) across four score groups; overall percentages: 1 (3), 2* (69), 3 (7), 4 (20)]
  • 25. Item Analysis – Response Options: Item 22, Rit = -0.26. [Chart: 1 (5), 2 (5), 3 (59), 4* (30)]
  • 26. Item Analysis – Response Options: Item 39, Rit = 0.15. [Chart: 1* (97), 2 (0), 3 (1), 4 (2)]
  • 27. Item Analysis – Response Options: Item 34, Rit = 0.57. [Chart: 1* (73), 2 (7), 3 (11), 4 (9)]
  • 28. Item Analysis – Item discrimination: Question 19, r = .70. Score / Right-or-Wrong: 80/1, 70/1, 65/0, 60/1, 50/0, 30/0, 20/0
  • 29. Item Analysis – Item discrimination: Question 7
  • 30. Item Analysis – Item discrimination: Question 19
  • 31. Item Analysis – Item discrimination: Question 7
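Slide 28 tabulates each examinee's total score against whether they answered Question 19 right (1) or wrong (0), reporting a discrimination of r = .70. That figure is the point-biserial correlation (a Pearson correlation between the 0/1 item result and the total scores), which the sketch below reproduces. The helper name is ours; the data are read straight off the slide:

```python
from math import sqrt

def point_biserial(right_wrong, totals):
    """Item discrimination: Pearson correlation between a 0/1 item
    result and the examinees' total test scores."""
    n = len(totals)
    mi = sum(right_wrong) / n
    mt = sum(totals) / n
    cov = sum((i - mi) * (t - mt) for i, t in zip(right_wrong, totals))
    si = sqrt(sum((i - mi) ** 2 for i in right_wrong))
    st = sqrt(sum((t - mt) ** 2 for t in totals))
    return cov / (si * st)

# Data from slide 28: total score and right(1)/wrong(0) on Question 19.
totals      = [80, 70, 65, 60, 50, 30, 20]
right_wrong = [ 1,  1,  0,  1,  0,  0,  0]
print(round(point_biserial(right_wrong, totals), 2))  # → 0.71 (the slide rounds to .70)
```

A strongly positive value means high scorers tend to get the item right; a negative value (like Item 22's Rit = -0.26 on slide 25) is a red flag, often a miskeyed answer.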
  • 32. Constructing a quality test 1. Good representation of content 2. Good, proven test questions 3. Enough test questions 4. Equivalent alternate forms
  • 33. Our Gifts To You! Online Item Writing Training – URL: training.caveon.net, code word: online14 • Internet-based training – 1 hour of training, code good for 2 weeks! • 30-minute needs analysis – http://testing-assessments.excelind • ExcelSoft will provide you with a 30-minute consultation, at no cost, for assessing your test development and delivery needs • Looking to implement or make changes to your test delivery platform
  • 34. Helpful Resources: LinkedIn Group – “Caveon Test Security” – Join Us! Follow us on Twitter @Caveon. www.caveon.com/resources/webinars – slides & recordings. Cheating articles at www.caveon.com/citn. Caveon Security Insights blog – www.caveon.com/blog. CSI Newsletter – contact us to get on the mailing list!
  • 35. Thank You! Special Thanks to: Lawrence M. Rudner, Ph.D., Vice President and Chief Psychometrician, Research and Development, Graduate Management Admission Council – lrudner@gmac.com; and Mika Hoffman, Ph.D., Executive Director – Center for Educational Measurement, Excelsior College – mhoffman@excelsior.edu. Please contact richelle.gruber@caveon.com for feedback or a copy of the slides.
  • 36. Upcoming Events Please visit our sessions and our Caveon booth #209 at ATP’s Innovations In Testing – February 3-6, 2013 Steve Addicott presenting at SeaSkyLand Conference in ShenZhen – February 2013 John presenting at TILSA meeting February 7th in Atlanta.  Other presenters include John Olson, Greg Cizek. Release of TILSA Test Security Guidebook – Visit our booth to discuss it with John Fremer at ATP! Handbook of Test Security – To be published March 2013 CCSSO Best Practices meeting in June 2013
  • 37. Caveon ATP Sessions Tell it to the Judge! Winning with Data Forensics Evidence in Court  Steve Addicott - 2/4/13 – 10 am Data Forensics: Opening the Black Box  John Fremer & Dennis Maynes – 2/4/13 – 2:45 pm A Synopsis of the Handbook of Test Security  David Foster & John Fremer – 2/4/13 – 5 pm From Foundations to Futures: Online Proctoring and Authentication (Kryterion session)  David Foster – 2/5/13 – 11 am Make, Buy, or Borrow: Acquiring SMEs  Nat Foster – 2/5/13 – 1:15 pm
  • 38. ATP Sessions of our presenters Free Tools to Nail the Brain Dumps  Monday 2/4/2013 4:00 PM – 5:00 PM  Lawrence Rudner The Game’s Afoot: Sleuths Match Wits  Tuesday 2/5/2013 11:00 AM – Noon  Lawrence Rudner & Dennis Maynes Online Education – How Can We Make Sure Students are Really Learning?  Tuesday 2/5/2013 11:00 AM – Noon  Jamie Mulkey  Mika Hoffman
  • 39. We hope to “See You” at our next sessions! Caveon’s Lessons Learned from ATP  To be held: Feb 20, 2013 The next webinar in the Online Education series:  Designing Assessments for the Online Education Environment  To be held: March 20, 2013