Caveon Webinar Series: What You Should Know about High Stakes Cheating in Your Schools



This slide deck was presented during an informational webinar by industry experts Michael Stetter, former Delaware State Assessment Director and test security consultant, and Dr. John Fremer, President, Caveon Consulting Services. Fremer and Stetter talked about how to evaluate the fairness and accuracy of your state assessment results, and discussed strategies to keep your district from becoming the next Atlanta Public Schools scandal.

Published in: Education, Technology, Business

  • Be sure to insert polls and poll placeholder slides. (See John’s Email) Also insert Q&A slides between parts 1, 2, and 3, and at the end.
  • Recent headlines: paid test-taker proxies on the SAT; student caught hacking the College Bowl quiz questions.
  • Remember John’s guidance: This webinar must provide participants with suggested steps they can take themselves.
  • However, it could be much, much worse. A couple of months ago, results of a Michigan educator survey were published. Of the 3,000 or so teachers who responded: 34% felt pressure to change grades; roughly 1 in 3 felt pressure to cheat on standardized tests; 21% (1 in 5) said they know of an educator who changed scores; and, most alarming, 8% admitted to changing students’ grades due to outside pressure. Holy smokes: if that’s the case, we just jumped from 1,700 to 14,000 classrooms we’re concerned about!
  • It’s so bad that, two weeks ago, Secretary of Education Arne Duncan referred to the “morally bankrupt culture” in Atlanta.
  • This is a great slide to acknowledge the unique demands and challenges for assessment in school districts. Be prepared to mention a few additional points that reveal your long experience in schools, as well as the notion that teachers who engage in inappropriate assistance to students often do so in the misguided belief that their actions represent student advocacy rather than self-imposed pressure to use test results to validate their teaching effectiveness.
    Potential conflicts of interest: teacher accountability and bonuses tied to score gains; school accountability or exit from “Under Improvement” status.
    Non-standardized environment: making do in potentially compromised test situations with word walls and other cues, students in close proximity, lax proctor monitoring, and cell phones; clustering distractible or poor students with good models in small groups; testing in short chunks to avoid fatigue or maximize effort.
    Examinee familiarity: teacher identification with, and advocacy for, students in whom they have invested considerable effort; a student halo effect in the eyes of the teacher who is also their proctor and scorer.
    Transporting and storing test materials: large schools and long hallways, and concessions by the building test coordinator to teachers on storage, access, and collection; shortcuts by teacher proctors and building test coordinators when crises or fatigue arise, or typical last-minute dilemmas such as teacher absence or a time crunch toward the end of the testing window; lost secure materials, unauthorized prep guides, a teacher copying items from the screen.
  • I’ll touch on some specifics for each key role across the segmented time frame of state assessment programs, but I invite you to review the entire Test Security Matrix located on the Caveon website at the link shown in the next slide. The district leader sets the tone for the overall administration of all phases of the district’s assessment program, including state as well as local assessments. Positive, highly visible, and well-organized interactions by the school leader with test proctors, building assessment coordinators, student groups, the teacher association, and the community serve the district well by signaling the importance, and the overall monitoring, of a sound program to measure and report on student growth. All local stakeholders will be reassured by these activities, while potential mischief makers will be put on alert that their efforts to subvert the process will quickly be brought to light. Prompt investigation of suspected test irregularities will further reinforce this positive stance. In the absence of a strong, consistent leader stance on the importance of adhering to the established assessment procedures before, during, and after the testing window, teachers, administrators, and students often come to believe they are on their own; each may have, as we have commented, a different perspective on the importance of both the testing and their own rule compliance. In instances when school leaders emphasize score gains at all costs, threatening discipline, demotion, or dismissal for poor results, the potential rises for score erasure activity, deliberate coaching before and during test sessions, or actions such as illegally removing low-scoring students from attendance rolls.
  • This is a screen shot of the District Test Security Roles Matrix with the weblink to locate it. Please consider downloading and sharing this document as a helpful resource in your district’s planning for comprehensive assessment program security.
  • The well-designed district test security program acknowledges, and plans for, potential test security threats that exist within each assessment window, as well as in the intervals before and after testing. Key examples of these threats include the following:
    An individual school team’s decision to alter guidelines regarding test session requirements, responsibility for log-ins with student groups, session monitoring, invited visitors to test sessions, and composition of specific worrisome student groups.
    Teacher decisions to incorporate aspects or whole portions of observed test items into pre-testing prep sessions, complete with photocopied study sheets.
    Inappropriate use of test-day accommodations for IEP and ELL students, i.e., accommodations the student does not receive in regular classroom instruction.
    Teacher-to-teacher sharing of secure item content gleaned from test development or standardization activities.
    Test administration by staff who have not completed proctor training or signed non-disclosure forms.
  • General comments here about the bulleted items.
  • Data Forensics comprises multiple statistical analytic measures, one of the most prominent being the Similarity Index. To gain some sense of how rare a significant finding on the Similarity Index is, we can compare it to some naturally occurring events. NEXT SLIDE
  • As you can see, then, a flagged Similarity Index is indeed an extremely rare occurrence, one that should alert program officials to probe further into potential test irregularities that could compromise the integrity of district results on high stakes assessments.
  • Caveon is proud to announce its newest test security product: the Caveon Security Screen.
  • The Caveon Security Screen consists of the following components. (See above list.) The Similarity Index is calculated for all district schools in the assessment content areas (Reading and Mathematics) in the assessed grades.
  • The Similarity Index is sensitive to unusual patterns in schools, either within classroom groups or at the whole-school level. Here the Caveon analysis compares the number of identical right and wrong responses for clusters of students: to what extent is the number of identical responses significantly larger than what could reasonably be expected of students working independently? Any values over 7% warrant closer inspection. In this example, a number of schools have been flagged for follow-up by the Caveon data forensics analysis. Conversely, extremely low Similarity scores often point to clusters of high-scoring, highly able students. The strong performances of highly able groups of students should be confirmed for both Midtown Middle School and McDorren High School in the slide. Consultation between the Caveon specialist and district leaders will help with the formulation of both interpretation and recommended action steps regarding possible test irregularities.
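  • The pairwise comparison described above can be sketched in code. This is an illustrative simplification, not Caveon's proprietary Similarity Index: it merely counts identical right-and-wrong answer choices for every pair of students and flags pairs whose match rate is implausibly high for independent work. The student IDs, answer strings, and 90% threshold below are all hypothetical.

```python
# Illustrative sketch only: Caveon's actual Similarity Index is proprietary.
# This simplified version counts identical answer choices (right and wrong)
# for every pair of students and flags pairs whose match rate far exceeds
# what independent work would plausibly produce.
from itertools import combinations

def identical_responses(a, b):
    """Count positions where two answer strings match exactly."""
    return sum(1 for x, y in zip(a, b) if x == y)

def flag_similar_pairs(answers, threshold=0.90):
    """Flag student pairs whose identical-response rate exceeds `threshold`.

    `answers` maps student ID -> answer string (one letter per test item).
    `threshold` is a hypothetical cutoff, not Caveon's actual criterion.
    """
    flagged = []
    for (s1, a1), (s2, a2) in combinations(sorted(answers.items()), 2):
        rate = identical_responses(a1, a2) / len(a1)
        if rate >= threshold:
            flagged.append((s1, s2, rate))
    return flagged

# Hypothetical ten-item answer sheets for three students.
responses = {
    "stu01": "ABCDABCDAB",
    "stu02": "ABCDABCDAB",   # identical to stu01, so this pair is flagged
    "stu03": "BADCBADCBA",
}
print(flag_similar_pairs(responses))
```

In practice a real index also weights matches by item difficulty and answer-choice popularity, since two strong students will legitimately share many correct answers; the sketch above ignores that refinement.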
  • Review the Caveon Security Screen components, in Data Forensics and in Consultation. Caveon’s experience with these tools at the statewide level has helped departmental chiefs and assessment leaders identify and take action, where necessary, to eliminate threats to assessment program integrity. We are confident that use of the Caveon Security Screen at the district level will prove to be equally effective.
  • Dr. Fremer or I will be happy to talk with you regarding any questions you may have, and certainly as you and your staff review your assessment program guidelines.

    1. Upcoming Caveon Events
       • Caveon Webinar Series: Next session, May 17, “Using Decision Theory to Score Accurate Pass/Fail Decisions,” presented by Jamie Mulkey and Larry Rudner
       • Caveon at NCSA, June 19-21, National Harbor, MD. Visit our booth and spend some time talking with us one-on-one!
    2. Latest Publications
       • Handbook of Test Security: now available for purchase! We’ll share a discount code before the end of the session.
       • TILSA Guidebook for State Assessment Directors on Data Forensics: coming soon!
    3. Caveon Online
       • Caveon Security Insights Blog
       • Twitter: follow @Caveon
       • LinkedIn: Caveon Company Page and “Caveon Test Security” Group. Please contribute!
       • Facebook: will you be our “friend?” “Like” us!
    4. “What You Should Know about High Stakes Cheating in Your Schools”
       Caveon Webinar Series, May 2, 2013
       Mike Stetter, EdD, Senior Security Consultant, Former State Assessment Director
       John Fremer, PhD, President, Caveon Consulting Services, @TestSecurityGuy
    6. A. THE TEST SECURITY SITUATION
       Who cheats? How many of them are there? How often do they cheat?
    7. JOSEPHSON INSTITUTE SURVEY RESEARCH (40,000 college students surveyed in 2010)
       • 95% admitted to cheating in high school
       • 51% believed they must cheat to get ahead
       • 90% were pleased with their own morality
    8. SURVEY OF ADULTS OVER 50 AND YOUTH UNDER 17 (7,000 survey respondents)
                                          Under 17   Over 50
       Believe cheating is necessary         50%       10%
       OK to deceive your boss               31%        8%
       OK to keep incorrect change           49%       15%
       OK to lie to your spouse              45%       22%
    9. MICHIGAN EDUCATOR SURVEY (Detroit Free Press, July 26, 2011)
       • 29% felt pressure to cheat on standardized tests
       • 34% felt pressure to change students’ scores for the better
       • 21% knew educators who changed scores on students’ tests
       • 8% admitted to changing students’ scores due to pressure
    10. School District Cheating Scandals in the News
    11. DC Public Schools: more than 100 schools displayed suspicious answer sheet erasures. So far, 2 teachers have been forced to resign. (USA Today, June 22, 2012)
    12. Houston Independent School District: teachers at 2 elementary schools caught cheating; changed student answer sheets on state exams. (The Houston Chronicle, February 25, 2013)
    13. Atlanta Public Schools: 179 educators in 58 schools implicated in dishonest test practices. 41 have resigned or retired; 35 indicted by grand jury; former superintendent indicted on conspiracy. (TIME NewsFeed, April 13, 2013)
    14. Analyzed test results from 69,000 public schools. Suspicious test scores, similar to those in Atlanta, were found in 196 school districts. Dishonest test practices on state-level assessments appear to be a national problem. (April 13, 2013)
    15. Duncan: no link between cheating and NCLB; places the blame on school leaders. Regardless of who the perpetrators are, school district administrators will feel the heat! (August 25, 2011)
    16. B. CHALLENGES FOR SCHOOL DISTRICT LEADERS
       • Proctors’ potential conflict of interest
       • Non-standardized environment
       • Examinee familiarity
       • Transporting and storing test materials
    17. SCHOOL DISTRICT TEST SECURITY MATRIX
       • District Superintendent
       • District Assessment Coordinator
       • School Test Coordinator
    18. DISTRICT TEST SECURITY ROLES MATRIX
       The full Matrix is posted on the Caveon website:
    19. SCHOOL DISTRICT TEST SECURITY MATRIX
       Essential Proactive Steps | During Test Window | Before & After Test Window
       Superintendent | District Assessment Director | Building Test Coordinator
    20. SCHOOL DISTRICT TEST SECURITY MATRIX: District Superintendent
       • Assign clear responsibility & roles for test security
       • Maintain communications with all local stakeholders on assessment activities and results
       • Advocate for test program best practices
    21. SCHOOL DISTRICT TEST SECURITY MATRIX: District Assessment Director
       • Supervise test training for all proctors
       • Protect secure materials and test session integrity through detailed, accountable procedures
       • Develop/update district Test Security Plan
    22. SCHOOL DISTRICT TEST SECURITY MATRIX: Building Test Coordinator
       • Confirm every test proctor completes training and understands do’s & don’ts during testing
       • Observe test sessions to confirm procedural compliance
       • Directly supervise all handling of secure test materials
       • Report suspected test irregularities
    24. #1 – COMMUNICATION
       • Raise test security awareness
       • Include security references in communications
       • Issue periodic security-focused communications
    25. #2 – EXPECTATIONS
       • Develop a system-wide Student Honor Code
       • Emphasize the importance of honest test taking
       • Issue clear expectations to test staff for professional behavior in test administration
    26. #3 – PROCEDURES
       • Review all procedures for security-related content
       • Emphasize procedures for handling test materials
       • Review/update security procedures annually
    27. #4 – TRAINING
       • Review your test security training materials
       • Track who completes the training
       • Provide brush-up training annually
    28. #5 – OBSERVATIONS
       • Perform announced pre-test-session site visits
       • Perform unannounced test session observations
       • Follow up on all observation outcomes
    29. #6 – FEEDBACK
       • Seek feedback from test administrators/proctors
       • Review procedures for reporting test irregularities
       • Provide an anonymous tip line
    30. #7 – STUDY YOUR TEST DATA CAREFULLY
       • Very unusual gains
       • Other indices: similarity, erasures, individual patterns, timing (in computer testing)
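    One simple way to operationalize the “very unusual gains” check above, sketched as an assumption rather than as Caveon’s actual method: compute each classroom’s year-over-year mean score gain and flag classrooms whose gain is an extreme outlier relative to the district distribution. The classroom names, gain values, and z-score cutoff are all illustrative.

```python
# Hedged sketch of a "very unusual gains" screen. A classroom is flagged
# when its mean score gain sits far above the district-wide distribution
# of gains. The 2.0 standard-deviation cutoff is an illustrative choice.
from statistics import mean, stdev

def flag_unusual_gains(gains_by_class, z_cutoff=2.0):
    """Return {classroom: z-score} for gains more than z_cutoff SDs above the mean."""
    values = list(gains_by_class.values())
    mu, sigma = mean(values), stdev(values)
    return {c: round((g - mu) / sigma, 2)
            for c, g in gains_by_class.items()
            if sigma and (g - mu) / sigma > z_cutoff}

# Hypothetical mean scale-score gains for seven classrooms.
gains = {"Rm101": 4.1, "Rm102": 3.8, "Rm103": 5.0, "Rm104": 4.4,
         "Rm105": 3.9, "Rm106": 4.2, "Rm107": 28.5}  # Rm107: a suspicious jump
print(flag_unusual_gains(gains))
```

    A flag from a screen like this is a prompt for follow-up (erasure review, session observation records), never proof of cheating on its own; real forensic services combine several such indices before drawing conclusions.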
    31. #8 – PREPARING FOR POSSIBLE SECURITY INCIDENTS
       • Develop a “Security Incident Response Plan”
       • Specify procedures, roles, and responsibilities
       • List potential outcomes for security infractions
    32. #9 – LEARNING FROM OTHERS
       • CCSSO TILSA SCASS “Guidebook on Test Security”
       • Wollack & Fremer, “Handbook of Test Security”
       • Caveon webinars and Security Insights newsletter
       Learn about developments in prevention and detection
    33. #10 – IF YOU DEVELOP YOUR OWN TESTS
       • Reduce exposure rate of test questions
       • Randomize question locations
       • Rotate answer options
    35. DATA FORENSICS
       Data Forensics is the analysis of student test data using sophisticated proprietary computer programs to statistically identify instances likely representing test fraud, collusion, and other security violations.
    36. DATA FORENSICS: Similarity Index
       A statistical measure of similarity within test results or test response patterns that highlights instances with a very high likelihood of a test security violation.
    37. HOW CREDIBLE IS A FLAGGED SIMILARITY INDEX?
       • Chance of being hit by lightning: 1 in 1 million
       • Chance of winning the lottery: 1 in 10 million
       • Chance of a DNA false positive: 1 in 30 million
       • Chance of students flagged for Similarity doing their own work: 1 in 1 trillion
    38. THE CAVEON SECURITY SCREEN
       Caveon’s data forensics diagnostic screening service for school districts – a cost-effective means to monitor for
       • Answer copying
       • Test coaching
       • Proxy test taking
       • Collusion and other security violations
    39. THE CAVEON SECURITY SCREEN
       • Custom forensic analyses of student test records
       • Data output by subject, by school, and by student
       • Data interpretation with a Caveon test security specialist
       • Advice on follow-up with highlighted security anomalies
       • Consultation on test security planning and evaluation
       • Security-focused analysis of policies and procedures
    41. THE CAVEON SECURITY SCREEN SUMMARY
       Data Forensics
       • Similarity Index analyses
       • Perfect Test and Identical Test analyses
       • School-level data reported by grade and by subject
       • Flagged student output with cluster details for pairs and groups
       Consultation
       • Consultation with a Caveon Test Security Specialist
       • Advice on test security planning and evaluation
       • Interpretation of forensic data output reports
       • Guidance on test irregularity follow-up
    42. For information on the Caveon Test Security Screen:
       Michael W. Stetter, D.Ed., Senior Security Consultant, Caveon Test Security
       (302) 415-0375, mike.stetter@caveon.com
    43. HANDBOOK OF TEST SECURITY
       Editors: James Wollack & John Fremer. Published March 2013.
       Preventing, Detecting, and Investigating Cheating. Testing in many domains: certification/licensure, clinical, educational, industrial/organizational.
       Don’t forget to order your copy at www.routledge.com. Save 20% with discount code HYJ82 (case sensitive).
    44. Thank You!
       • Follow Caveon on Twitter: @Caveon
       • Check out our blog
       • LinkedIn Group: “Caveon Test Security”
       Mike Stetter, Senior Security Consultant, Caveon Test Security
       John Fremer, President, Caveon Consulting Services, @TestSecurityGuy