Caveon Webinar Series: Lessons Learned from EATP and CSDPTF November 2013

Presented by Dr. John Fremer, Dennis Maynes and Steve Addicott, Caveon Test Security

Two important industry conferences have been held in the last couple of months, the European Association of Test Publishers (E-ATP) Conference and the Conference on Statistical Detection of Potential Test Fraud (CSDPTF). Caveon was at both of these events and wants to share some important information with you.

Join Dr. John Fremer, President of Caveon Consulting Services; Steve Addicott, Vice President, Sales and Marketing; and Dennis Maynes, Chief Scientist, Caveon Data Forensics, all of whom attended both conferences and presented sessions. They will explore key takeaways and lessons learned on security, and keep you up to date on the latest industry security trends.


    1. November 20, 2013. Caveon Webinar Series: Lessons Learned from The European Association of Test Publishers Conference and The Conference on Statistical Detection of Potential Test Fraud. John Fremer, President, Caveon Consulting Services; Steve Addicott, Vice President, Client Services; Dennis Maynes, Chief Scientist, Caveon Data Forensics
    2. Agenda for Today
       • John Fremer, Observations from E-ATP
         - Handbook of Test Security
         - CCSSO TILSA Test Security Guidebook
         - The Emergence of Social Media
       • Steve Addicott, Observations from E-ATP and CSDPTF
         - ATP/CCSSO OBP
         - ITC Guidelines
         - Electronic Badging
         - ATP Security Committee
         - CSDPTF Observations
       • Dennis Maynes, Observations from CSDPTF
         - Progress, Research and Direction
         - Techniques that work, and don't work
         - Challenges in this field
    3. Lessons Learned from The European Association of Test Publishers Conference (John Fremer)
    4. European Association of Test Publishers (E-ATP) Annual Conference
       • 5th Annual E-ATP conference: Growing Talent In Europe: Gaining Advantage Through Assessment
       • Brings together European and other international test publishers and related organizations
       • Held in beautiful St. Julian's, Malta
    5. HANDBOOK OF TEST SECURITY: FIVE TAKEAWAYS
       1. What security vulnerabilities exist for all genres of testing?
       2. The critical importance of security planning
       3. Practical and proven strategies for preventing and detecting cheating
       4. How security breaches have been dealt with in specific programs
       5. What lessons have we learned from past instances of testing misbehavior?
    6. HANDBOOK OF TEST SECURITY: FIVE PREDICTIONS
       In many high-stakes testing programs:
       1. Cheating detection statistical analyses will be performed routinely
       2. Computer-based testing will increasingly become the norm
       3. Technology developments will be critically important to test security
       4. The internationalization of the program will increase
       5. "Test Security Expert" will become a recognized and valued position
    7. ADVICE FOR THE READER
       • Review the Table of Contents
       • "Skim read" the chapters
       • Earmark chapters and sections for staff to read or for you to return to
       • Read sections as you deal with the issues addressed
       To get the Handbook, go to http://www.psypress.com/books/details/9780415816540/ and use discount code HYJ82 for a 20% discount.
    8. TILSA GUIDEBOOK
       • Guidebook: Preventing, Detecting, and Following up on Testing Irregularities
       • From the Council of Chief State School Officers (CCSSO)
       • Designed for State Assessment Directors
    9. PREVENTION (Sample)
       Communicate Zero Tolerance:
       1. Clear
       2. Consistent
       3. Frequently emphasized messages
       Develop a Security Plan:
       • Comprehensive
       • Up to date
    10. DETECTION (Sample)
        Use Multiple Data Forensics Methods:
        1. Unusual Gains
        2. Similarity (see the sketch below)
        3. Erasures (Answer Changing)
        4. Person Fit Analyses
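As a rough illustration of the second method on the slide above: similarity analysis looks for pairs of examinees whose answers agree more than independent work can explain. The sketch below is a minimal, hypothetical version that simply counts identical wrong answers between every pair; the key, responses, and names are invented, and operational analyses (Caveon's included) use formal probability models rather than raw match counts.

```python
from itertools import combinations

# Hypothetical answer key and multiple-choice responses (options A-D).
KEY = "BDACADBCDA"
responses = {
    "examinee_1": "BDACADBCDA",  # matches the key exactly
    "examinee_2": "BDCCBDACDA",
    "examinee_3": "BDCCBDACDA",  # identical to examinee_2, errors included
}

def identical_wrong_answers(r1: str, r2: str, key: str) -> int:
    """Count items where both examinees chose the same wrong option."""
    return sum(a == b != k for a, b, k in zip(r1, r2, key))

# Rank every pair, most alike-in-error first; in practice an analyst
# would compare these counts against a null distribution, not eyeball them.
ranked = sorted(
    ((identical_wrong_answers(responses[x], responses[y], KEY), x, y)
     for x, y in combinations(responses, 2)),
    reverse=True,
)
for count, x, y in ranked:
    print(f"{x} vs {y}: {count} identical wrong answers")
```

Matching wrong answers carry most of the evidential weight here, since matching correct answers are expected from any two well-prepared examinees.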
    11. TEST SECURITY INVESTIGATIONS (Sample #1)
        Areas Needing Attention:
        • Level of evidence warranting an investigation
        • Roles for state, district, and other staff
        • Timelines
        • Model investigations kit
    12. TEST SECURITY INVESTIGATIONS (cont.) (Sample #2)
        Model Investigations Kit:
        • Clear definition of responsibilities
        • Procedures for evaluating evidence
        • Planning and conducting interviews
        • Developing a report with recommendations
    13. TILSA GUIDEBOOK
        To get a copy of the Guidebook, download the PDF for free or purchase your own copy for only $3.59: http://www.ccsso.org/Resources/Publications
    14. SOCIAL MEDIA
        The importance of social media came up in several presentations:
        • Sessions suggest using a "Glocal" approach
        • Social media continue to gain users in testing internationally
    15. Ignite Session: Are You One Post Away From a Social Media Crisis?
        Speaker: Matthew Poyiadgi, Managing Director, Pearson VUE
        • 3 P's for dealing with a social media crisis:
          1. Police
          2. Proactive Stance
          3. Proportional Response
        • Customers expect a one-day response from you; only 20% get it.
    16. Ignite Session: Are You One Post Away From a Social Media Crisis? (cont.)
        Summary:
        • Choose your communication channels wisely
        • Write your guidelines and make them practical
        • Choose and train those representing your agency
        • Be proactive: put your guidelines into action
        • Be proportionate with resources and responses
        • Track what you do and watch for patterns
        • Be personable on social media! You're dealing with people!
    17. Lessons Learned from The European ATP Conference & The Conference on Statistical Detection of Potential Test Fraud (Steve Addicott)
    18. Important Test Security Resources
        • ATP/CCSSO Operational Best Practices for Statewide Large Scale Assessment Programs
        • International Test Commission Guidelines for the Security of Examinations, Tests and Other Assessments
    19. ATP/CCSSO Operational Best Practices
        Published in Spring 2010 to:
        • Enhance the quality, accuracy, and timeliness of student test data derived from large-scale assessments
        • Strengthen public confidence in the accuracy and quality of testing data and their uses
    20. ATP/CCSSO Operational Best Practices 2013: Revised OBP for CBT and Online Testing
        • Just published September 2013!
        • May not be precisely applicable to "…(test programs) used on an international basis"
        • Provides a solid framework from which others might seek to define a set of practices tailored to their testing programs
        • ATP and the CCSSO encourage others to use this document
    21. ATP/CCSSO Operational Best Practices
        To get a copy of the OBP, purchase it for $29.99 at http://www.ccsso.org/resources/publications
    22. International Test Commission (ITC) Security Guidelines
        • International Test Commission Guidelines for the Security of Examinations, Tests and Other Assessments
    23. ITC Guidelines: Purpose
        With security problems increasing, the guidelines are intended to:
        • share key elements of security best practices to promote better security, and
        • defend the value of scores produced through the assessment process
    24. ITC Guidelines: Scope
        • All high-stakes assessments, tests and exams
        • All stakeholders in the assessment process (publishers, users, students, test developers, etc.)
        • The entire assessment process, from development to administration to results processing and scoring
        • Across all areas of testing (e.g., education, employment, certification/licensure, government, clinical psychology, etc.)
    25. ITC Guidelines: Value
        The ITC Guidelines provide answers to these questions:
        • Threats. What are the dangers?
        • Test Fraud. How can we deal effectively with increasing security problems?
        • Terminology. How can we best communicate about test security?
    26. Electronic Badging
        • What is electronic badging?
        • Big industry players involved
        • Compelling value
        • Badging + accreditation + assessment = ???
        • Stay tuned
    27. ATP Security Committee
        • Picking up from last February's conference
        • Annual Survey results released: https://www.createspace.com/4418924
        • Hands-on demonstrations of new tech innovations used by test thieves and pirates
        • Harder and harder for the "good guys" to keep up
    28. About the Conference
        The Conference on Statistical Detection of Potential Test Fraud (CSDPTF)
        • 2nd Annual Conference this year
        • Presents various methods of cheating detection and offers an important forum to examine a variety of methods
        • Huge thanks to the University of Wisconsin-Madison's Dr. James Wollack and team!
    29. Conference on Statistical Detection of Potential Test Fraud
        • Last year's event was groundbreaking
        • This year, growing momentum:
          - Premier academic & industry organizations supporting/sponsoring
          - Attendance steadily increasing
          - Publishing of selected papers
          - Plans for future years already underway!
          - Rumors of a scholarly journal (stay tuned!)
    30. Conference on Statistical Detection of Potential Test Fraud
        Compelling dynamics behind the conference:
        • Growth in testing
        • Stakes growing ever-higher
        • Focus on trustworthy test results
        • How could we NOT focus on statistical detection?
    31. Conference on Statistical Detection of Potential Test Fraud
        Challenge: current research concentrates on the statistics themselves. What about the policies, procedures, and legal defensibility that must wrap their use?
    32. Lessons Learned from The Conference on Statistical Detection of Potential Test Fraud (Dennis Maynes)
    33. Summary of CSDPTF
        • There were three general sessions on Friday, each with four or five presentations and a discussant.
        • About ten poster presentations were on display Friday evening.
        • There were three concurrent sessions on Saturday: five presentations in the first, four in the second, and a symposium/workshop in the third.
        • The conference was attended by over 120 researchers!
    34. Topics Covered at CSDPTF
        • Similarity and Person Misfit
        • Pre-knowledge and "Braindumps"
        • Unusual Gains
        • Robustness and Sensitivity Issues
        • Test Tampering
        • Computer-Based Testing
        • Policy
        • Test Fraud Detection Challenge
        • Security Planning
    35. CSDPTF: 2012 versus 2013
        The conference this year was more varied in topics and research than last year. Attendees came from all over the world. The conference has expanded in its reach, focus, and attractiveness.
    36. Progress
        For this field to progress, the following are needed:
        • A set of data sets with known characteristics of security breaches
        • Commonly accepted ways to compare the efficacy of methods (see the sketch after this slide)
        • Consensus on the utility and implementation of statistical methods
        • Consensus on how to make inferences concerning potential test fraud
        • Guidance to practitioners on how to use the statistics
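Picking up the second bullet above: one minimal way to compare the efficacy of two detection methods on a benchmark with known breaches is to fix a false-positive budget and measure what fraction of breaches each method catches. Everything in the sketch below (the scores, the labels, the 5% budget, the method names) is invented for illustration; the conference did not settle on any particular metric.

```python
def detection_rate_at_fpr(scores, labels, max_fpr=0.05):
    """Fraction of known breaches flagged when the threshold is set so
    that at most `max_fpr` of the clean records would be flagged."""
    clean = sorted((s for s, hit in zip(scores, labels) if not hit),
                   reverse=True)
    allowed = int(len(clean) * max_fpr)  # clean flags we will tolerate
    threshold = clean[allowed] if allowed < len(clean) else float("-inf")
    breaches = [s for s, hit in zip(scores, labels) if hit]
    return sum(s > threshold for s in breaches) / len(breaches)

# Invented benchmark: True = record involved a known security breach.
labels   = [True, True, False, False, False, False, True, False]
method_a = [4.1,  3.0,  0.2,   1.1,   0.4,   0.9,   2.7,  0.3]
method_b = [1.9,  0.8,  0.3,   1.5,   0.2,   1.0,   3.3,  0.6]

for name, scores in (("method_a", method_a), ("method_b", method_b)):
    print(f"{name}: catches {detection_rate_at_fpr(scores, labels):.0%} "
          "of known breaches at a 5% false-positive budget")
```

A shared benchmark of this shape is exactly what the first bullet calls for; without one, published power comparisons are hard to reproduce.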
    37. Research
        The current state of research is one of individual and organization-based efforts, with little cross-fertilization between them.
    38. Direction
        Greg Cizek presented the keynote speech, discussing where this field has been and where it is going.
    39. Techniques that work well and/or are gaining greater acceptance
        • Test Tampering, aka "Erasure Analysis" (a toy illustration follows below)
        • Similarity, aka "Answer Copying"
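To make the erasure-analysis bullet concrete: the core idea is to compare each group's mean wrong-to-right (WTR) erasure count against a reference distribution and flag extreme group means. The sketch below is a toy version under invented numbers; real analyses, Caveon's included, use formal statistical models and cleaner baselines. In particular, pooling all rooms into the reference, as done here, lets an anomalous room inflate the spread.

```python
from math import sqrt
from statistics import mean, stdev

# Wrong-to-right erasures per student, by classroom (invented numbers).
classrooms = {
    "room_101": [0, 1, 0, 2, 1, 0],
    "room_102": [9, 7, 11, 8, 10, 9],  # unusually heavy erasing
    "room_103": [1, 0, 1, 1, 2, 0],
}

# Reference distribution: all students pooled together.
pooled = [c for counts in classrooms.values() for c in counts]
mu, sigma = mean(pooled), stdev(pooled)

for room, counts in classrooms.items():
    # Standardize the room's mean WTR count against the pooled baseline.
    z = (mean(counts) - mu) / (sigma / sqrt(len(counts)))
    flag = "  <- FLAG" if z > 2.0 else ""
    print(f"{room}: mean WTR = {mean(counts):.2f}, z = {z:.2f}{flag}")
```

On these toy numbers only room_102 exceeds the z > 2 cutoff; the cutoff itself is an assumption for illustration, not a standard.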
    40. Techniques that do not work well
        • Detection of item compromise in all its forms, such as braindumps, rogue review courses, and disclosure of live exam content by teachers and/or instructors
        • Analysis of gain scores for detecting potential test compromise
        • Analysis of response times to detect potential test compromise
    41. Challenges Practitioners in This Field Face
        1. Using statistics to invalidate test scores on the grounds that the scores themselves are not valid. The problem is that the public and measurement professionals have focused on the behavior, not the scores.
        2. Conducting investigations when the evidence that misbehavior occurred is circumstantial. There are no missing persons, broken doors, or misappropriated funds.
    42. Challenges Practitioners in This Field Face (cont.)
        3. Demonstrating the science behind the techniques that are being used:
           • Reproducible results
           • Sound reasoning
           • Quantitative measures
           • Defensible assumptions
           • Measurement of error
    43. Questions?
    44. Caveon Online
        • Caveon Security Insights Blog: http://www.caveon.com/blog/
        • Twitter: follow @Caveon
        • LinkedIn: "Caveon Test Security" Group (please contribute!)
        • Dr. John's Test Security Tip of the Day
        • Caveon Security Minute
        • Facebook: "Like" us!
    45. Thank you for your attendance!
        Dr. John Fremer, President, Consulting Services: John.fremer@caveon.com, @TestSecurityGuy
        Steve Addicott, Vice President, Client Services: Steve.addicott@caveon.com, @SdAddicott
        Dennis Maynes, Chief Scientist: Dennis.Maynes@caveon.com, @DennisMaynes
        At Caveon, we fundamentally believe in quality testing and trustworthy test results.
