Caveon Webinar Series: The Good and Bad of Online Proctoring

These slides were shared during an informational Webinar presented by the Caveon Webinar Series with guest panelists David Foster, CEO, Caveon Test Security, and Harry Layman, Executive Director of Digital Assessment Planning, The College Board.

Description: Online proctoring is rapidly garnering the attention of testing programs that want better security, a longer reach, convenience, and lower test administration costs. Today there are several vendors offering services in this area, and they differ widely in the type and quality of services they provide. What features of online proctoring boost test security, and which actually hurt it? These slides focus on the major security threats during test administration and how those threats can be managed by different features of online proctoring. The slides should help an organization make sense of the variety of offerings available and select online proctoring features that improve the integrity of their tests.

Please join our LinkedIn group "Caveon Test Security" to take part in this and other discussions.

Please contact richelle.gruber@caveon.com with any questions.

Speaker Notes

  • Introduce the paper; describe it in general terms. Next slide will provide specifics on structure and content. On this slide we say what the paper is FOR; on the next slide we can say a little bit more about "what it is".
  • This slide is just intended to set out the contents of the paper; it (and subsequent slides) will not replicate or include the contents – either the vendor list or the comparison tables. Harry, I changed the wording of the 4th bullet point only slightly to add in the concept of security threats early on. We need to focus this presentation on how online proctors and associated technology can affect test security during test administration.
  • Might squeeze in: "We prefer the term online proctoring to remote proctoring, for these reasons."
  • Other value propositions include: good view of testing environment; lockdown for a number of good reasons; proctor communication and control. Some more notes here might help.
  • These are general features that help the online proctoring work or be effective. I would argue that they are not specific online proctoring features. Just general best practices. They are all related to the quality of security, but aren't intended to detect and deal with threats. Not sure I believe what I just wrote. These are more enabling features.
  • I wanted a way to show the entire set of threats, and then fade some away until only those relevant to what online proctoring can be effective against remain. About half of test security threats are not relevant to our discussion, but are relevant to a program's security management plan. Something graphical or animated would work here. DFF: Some threats during test administration are not amenable to any kind of proctor detection or management. Use of pre-knowledge is a good example. Human proctors of any sort are not able to detect its use directly. But some data forensics models can do so in real time (theoretically) and inform the proctor. Whether the proctor can or should do anything about it is another matter. We should probably restrict this webinar to threats occurring during test administration, AND to threats the proctor can detect and manage.
  • I wanted a way to show the entire set of threats, and then fade some away until only those relevant to what online proctoring can be effective against remain. About half of test security threats are not relevant to our discussion, but are relevant to a program's security management plan. Something graphical or animated would work here. Test content protection is important every day, all day, all year; but online proctoring worries about content only during the test administration. Cheating or inappropriate assistance is a multi-faceted threat that includes everything from electronic cheating and 'traditional' cheating (writing on your hand, pre-staging materials, etc.) to impersonation / authentication threats.
  • Relevance means that an online proctor (or onsite proctor) can detect the threat or attempted breach AND can do something about it.
  • Describe collusion as someone helping the test taker during the exam. Proctors are individuals capable of collusion. Watching for a person leaving the testing room to perhaps communicate with someone in another room. Can't deal with collusion that occurs through a hidden ear mic (unless the examinee vocalizes things?). Can tell if the person is talking with someone in the room or on a mic. DFF: Squeeze this in. It will help later on. "In some situations, proctors collude with test takers. Being unable to see the test taker screen helps." ========== a) The camera itself is important; where it is positioned – field of view – is important. Worst case are simple cameras built into monitors that offer a narrow view of the examinee; as will be discussed later, side views are better, and required by some services and big users. b) Sounds are important cues, and individual mics included with camera units are one area where OLP clearly improves on a human proctor – particularly one in a larger area monitoring many test takers. Setup and validation procedures prior to exam start will ensure the sound system meets requirements to monitor even the quietest talking – as in talking to a concealed microphone. c) If there is suspicious activity, a remote proctor's ability to pause a session and ask a test taker to explain – a noise, sound of voices, reflections, etc. (especially when they have been logged electronically) – together with the authority to suspend a test session, allows incidents to be dealt with quickly and forthrightly. d) Can notify the test taker if a security breach (or potential breach) was recorded and noted in the logs; the test taker may withdraw, or exercise greater caution if the behavior was truly incidental. e) Not allowing proctors to see test content is simply a zero-cost, zero-effort incremental security measure. While collusion is unlikely, there may be tests where an enterprising proctor might be tempted to monetize access to potentially valuable content.
  • Separate from cheating by getting help from another person (collusion) is the possibility of getting help from access to unauthorized devices, services and materials – books, notes, post-its, web sites, etc. Cheating aids can include calculators, cell phones, books, papers, notes, post-its, computer, Internet, etc. Camera angle will help the proctor detect if a person leaves the room to use a cheating aid. Some of these devices of course may be allowed by the program. I would note here that for both collusion with others and using any sort of cheating aid, one factor impacting these risks arises from the examinee's access to and control of the testing facility / room prior to and after the exam session. So often the threats you are guarding against here are the use of materials or devices pre-staged by the test taker. A good "startup routine" where the proctor has the test taker pan the room and surrounds with the camera, does a full voice and video quality check, etc., will help deter (and detect) cheating attempts of this kind.
  • Some improvement (over human proctoring with no logging / video of the actual test taker) arises from having video and audio of the test taker at the start of the test – to compare with something (potentially something submitted earlier by the test sponsor), and to have that available with test results. Includes: fingerprint readers, facial recognition software, voice recognition software, keystroke pattern recognition, photo matching. Continuous refers to occasional, perhaps random, requests for authentication. Automated procedures are amenable to continuous authentication. For longer exams where students have breaks and may leave the area and return, re-authentication after breaks, and random re-authentication prompts at multiple points during an exam, might be warranted. Weak methods include the use of ID, particularly as verified through a webcam. A "picture of a picture" is not as robust as a good, high-resolution face picture to compare with an externally provided (by test sponsor or results recipient) face picture. IDs can be faked and can't be verified as genuine through a webcam. I've read about more cloud-based "software as a service" deployment of machine learning and AI tools and algorithms, and of "homeland security" contractors trying out facial recognition software and algorithms delivered to desktops via AWS – so this technology might be incorporated into more secure delivery systems sooner than expected. (You still need something to compare a "face" taking a test with.) (DFF: Facial recognition software can be used early and later.)
  • This is not a big issue but I would leave the slide in for completeness. Note that this applies only if more than one test taker is allowed in the same room – not the typical use case. Online proctoring is mostly used in circumstances where examinees are alone in their room. In other cases, tests may start at different times and questions may be randomized. DFF: You sent me a picture last week of an online proctor viewing an entire room of Chinese school children. This is probably a good example of what this means. Do you want to add that slide?
  • Good news and bad news! This is a big problem, because proctors can really only detect this behavior from careless test takers who use an obvious cell phone or camera. Many photography devices are meant to be hidden (button, pen, watch, glasses, etc.) and are realistically not detectable by the proctor, whether through a webcam or even in the room. Methods to foil this type of theft are not proctor-relevant and are the subject for a different webinar.
  • The test taker can write down the questions/answers on a pad of paper or tablet, or orally capture them on a digital recording device. These are fairly easily detected by an online proctor. Need to make sure that the online proctor cannot view test content.
  • This is a summary page of the best practices listed on the several slides before.
  • Weaknesses: Can't see all areas in a room, even if the room is panned (under the desk, in a closet, in drawers, in a hall). After panning, cheating devices and helpers can return.
  • Lockdown is complicated to create and maintain and requires a download by the student/test taker (but which might be "automatic" / zero-touch, if it works); it will result in tech support calls, etc. But its advantages are clear. Plus, it will make it unnecessary for the proctor to view the test taker screen, which is a huge security problem. This may be a separate download from other software required to administer a test, or it may be integrated. Some approaches to this are problematic – they can have browser, operating system and hardware-based issues and dependencies. But for many sorts of exams, prohibiting the user from accessing other programs on the testing workstation during testing is important and best accomplished with this sort of solution. Specific behaviors desired or disallowed may vary between test programs – depending on whether or not, for example, their test delivery software provides "right mouse-button", context-specific action menus, or allows "cut-and-paste" during a test section that includes a performance task or long-form constructed-response type of question.
  • It's well-known that in most high-stakes testing scenarios the proctors or test administrators are the ones doing the cheating. Often proctors have a stake in how the test turns out. Security measures must consider the possibility of collusion and test theft involving proctors.
  • Having the ability to interact with the examinee directly during a test session, and the examinee's own interaction with the remote proctor during a "test preparation and startup" exercise (which validates the integrity of the test delivery system and the functioning of the remote proctoring solution), can serve as a significant deterrent to undesirable test taker behavior. Being able to question an examinee, say if there is an off-screen interaction or suspicious movement, allows the test taker to address the issue in real time, provides greater evidence for the test-taking event log, and can help ensure that fair and practical decisions are made with regard to such incidents. An added benefit is to provide a channel for technical support issues, if they are sufficiently minor and specific to the test delivery system operation, which could result in salvaging test sessions that might otherwise have to be re-scheduled. The form of the interactions can include free-form dialog with users, as you might find on more and more shopping sites, or canned scripts and pre-formatted messages, which can help ensure a level of quality control, professionalism and standardized behavior by a test delivery organization (particularly if the canned messages are part of an over-arching test proctoring protocol which specifies messages and content for use in dealing with the most frequent proctor-involvement issues). Specific scripts and messages could be pre-codified for dealing with such things as "off-camera conversations with third parties", movement of the test taker out of camera view, temporary occlusion of the camera view, and the like.
  • With such control a proctor can elicit test taker cooperation. Pausing a test and covering or removing the current test question on the screen puts the proctor in control of the cheating or theft situation. Ultimately the proctor can simply suspend or cancel the session if compliance isn't forthcoming. I interviewed a manager of an online proctoring service using these features. She told me that in literally tens of thousands of such interactions NOT ONE test taker has refused to acknowledge the warning and comply with proctor instructions. When a test is suspended, it can be restarted at the same place it was suspended.
  • Note that the strong methods involve biologically driven factors (fingerprint, facial recognition, voice recognition) and deeply rooted physiological patterns (keyboard analytics) that are both robust ways to authenticate examinees and quite difficult to "fake", provided a secure mechanism has been used to capture the subject's identity in the first place (or for validation of test results against a particular user profile even after the fact). (DFF: this involves a complex topic of authentication versus identification; it's okay and maybe even desirable to simply collect this "biometric" or pattern when registering in a program, and make sure that those same steps are repeated (authentication) before the test begins and maybe during it. I would avoid the concept of identification for this webinar.) As noted in the authentication discussion above, don't use a "picture of a picture" – for anything. And do re-validate in circumstances where examinees can leave a monitored testing area and return for additional work. As noted above in the "authentication risks" of the "using proxy test takers" section: weak methods include the use of ID, particularly through a webcam. A "picture of a picture" is not as robust as a good, high-resolution face picture to compare with an externally provided (by test sponsor or results recipient) face picture. (DFF: Fake IDs are easy to obtain and impossible for an untrained proctor to verify through a camera or in person.) I've read about more cloud-based "software as a service" deployment of machine learning and AI tools and algorithms, and of "homeland security" contractors trying out facial recognition software and algorithms delivered to desktops via AWS – so this technology might be incorporated into more secure delivery systems sooner than expected. (You still need something to compare a "face" taking a test with.)
  • It is critical for decision-making and taking action to have recorded the test security incident, to have stored it for a specified period of time, and to be able to use it easily to inform discussion and decisions. Proctor logs of timing, communications and decisions are an important companion set of data. If a question is asked: No, except perhaps for comparative research purposes, I can see no reason to store video records for test sessions where no incident occurred.
  • Technology advances continue to improve features / functions
  • Very nice slide. As I said earlier, I'm not all that sure about the point made in the second bullet point. Second bullet: feel free to delete or change. My thought was, for example, some programs may not care about content theft – they may release all their tests every year (Psych 101 at Purdue, Calculus 101 at a community college, etc.) as they are not that different from "problem sets" assigned during the course. Others might allow "open notes" so they don't care about monitoring whether the kid has a cheat sheet – they are much more concerned about texting a fellow student or running IM on the test computer. Maybe for "speeded tests" there is even less concern about consulting notes or contacting others, as there is just not enough time to really get help. Say long reading passages, a few short MCQs, with 10 passages and 40 questions in 40 minutes (this is the 'reading comprehension' section of the SSAT); much more concern here about securing the material before the test, and the ability to look at suspicious timing data to see if someone already had access to the passages, etc. Maybe I am just wrong!
  • Why should we worry about test quality?

Transcript

  • 1. Upcoming Caveon Events
    • NCME Conference, April 26–30, San Francisco: Test Security I: Policy Issues; Technical/Statistical/Methodological Issues
      – Presenters include: Dr. John Fremer, Caveon
    • USDLA Conference, April 28 – May 1, St. Louis: Will the real learner please stand up?
      – Presenters include Caveon's Jamie Mulkey, Ed.D., and Patrick Martin
    • Caveon Webinar Series: Next session, May 2: What You Need to Know about High Stakes Cheating in Your Schools
      – Presenters include former State Assessment Director Dr. Mike Stetter and Dr. John Fremer, President, Caveon Consulting Services
  • 2. Latest Publications
    • Handbook of Test Security – now available for purchase! We'll share a discount code before the end of the session.
    • TILSA Guidebook for State Assessment Directors on Data Forensics – soon to be released
  • 3. Caveon Online
    • Caveon Security Insights Blog
      – http://www.caveon.com/blog/
    • twitter
      – Follow @Caveon
    • LinkedIn
      – Caveon Company Page
      – "Caveon Test Security" Group – please contribute!
    • Facebook
      – Will you be our "friend?"
      – "Like" us!
    www.caveon.com
  • 4. Caveon Webinar Series: "The Good and Bad of Online Proctoring"
    April 17, 2013
    Dr. David Foster, CEO, Caveon Test Security
    Harry Layman, Executive Director, Digital Assessment Planning, The College Board
  • 5. Agenda for today
    • Recap of structure and content of recent paper "Online Proctoring Systems Compared"
    • Introduction and definitions for online proctoring discussion
    • Test security threats addressed by online proctoring
    • Online proctoring features and threat impacts
    • Seven critical online proctoring best practices
    • Odds and ends
      – Research support
      – Future features
      – Trends
    • Lessons for test program users
  • 6. Online Proctoring Systems Compared: Purpose of Paper
    • Published by Dave Foster and Harry Layman
      – http://bit.ly/proctoring
    • Background of paper
    • Recent paper provides a framework for describing different offerings and several tables of feature comparisons
    • Lots of detail, not entirely complete – will remain a work in progress
    • A good starting point as you begin to think about features / benefits and your specific risk profile and threats of concern
  • 7. Online Proctoring Systems Compared: Structure and Content
    • Overview of Online Proctoring – what it is, what it is not, and primary value proposition
    • An enumeration of 8 vendor products / services included in the scope of the review
      – Mix of capabilities and approaches; offerings not homogeneous/equivalent, and so comparisons need to be more holistic
      – We have not yet investigated all options equally, so the "comparisons" are informative but not complete; look for updates over time
    • Seven groups of features or characteristics of Online Proctoring offerings organized into a series of comparison tables
    • This presentation will "hit the highlights," setting online proctoring in the general context of test security threats, and provide a high-level discussion of the attributes and features of Online Proctoring systems that can mitigate those threats.
  • 8. Online Proctoring Is / Is Not:
    • Online Proctoring refers specifically to using an internet-based approach to remotely monitor individual test administrations, replacing traditional approaches that have relied on "eyes in the room"
      – Not CCTV, roving personnel, or using volunteers in user-selected locations (libraries, etc.)
    • Online Proctoring is NOT a "find-your-own-proctor" model as used for decades in distance education exams
  • 9. Online Proctoring Provides:
    • Key value proposition in a nutshell:
      – Independent, bias-free proctors
      – Professional proctors (training, supervision, logging, accountability, etc.)
      – Technology-assisted audio, visual and activity monitoring support (e.g., log files, video, lock-down, keystroke monitoring, test session control, real-time intervention, etc.)
  • 10. General Proctoring Requirements
    • Online Proctor During Entire Exam
    • Continuous Internet
    • Encryption for Data Transfer
    • Schedule Availability
    • Proctor Management
      – Supervised
      – Training
      – Career Path
      – Certification
    • Program Customization
  • 11. Security Threats / Scope
    • Online Proctoring is primarily concerned with
      – Threats during test administration; and hence
      – Attempts to either capture test content, or
      – Get inappropriate assistance in any way.
  • 12. Security Threats / Scope
    • Online Proctoring is only concerned with a subset of the overall set of program vulnerabilities, threats and risks. What are they?
    • The subset is those threats that are detectable by the online proctor (or onsite proctor) and over which he or she has some control
  • 13. Test Security Threats: A Finite Universe
    • Only Six Categories of Cheating Threats
      – Using pre-knowledge of test content
      – Colluding with others
      – Using cheating aids
      – Using a proxy test taker
      – Hacking into scoring system
      – Copying answers from other test takers
    • Only Six Categories of Stealing Threats
      – Hacking into a system and stealing test files
      – Capturing content by digital photography devices
      – Capturing content by electronic recording of the screen
      – Memorizing content
      – Transcribing content verbally (on paper or recording device)
      – Getting content from test program insider
  • 14. Test Security Threats: A Finite Universe (build slide; repeats the full list of cheating and stealing threats from slide 13)
  • 15. Down to Six! Test Security Threats Relevant for Online Proctors
    • Cheating Threats
      – C1. Colluding with others
      – C2. Using cheating aids
      – C3. Using a proxy test taker
      – C4. Copying answers from other test takers
    • Stealing Threats
      – S1. Capturing content by digital photography devices
      – S2. Transcribing questions verbally (on paper or recording device)
  • 16. C1. Colluding with Others
    • How can a proctor effectively detect and deal with collusion?
      – Camera that can see a large portion of the room
      – Camera that can pick up sounds and talking
      – Ability to pause test session, and perhaps suspend it
      – Ability to contact test taker, record/store session and time-stamp incident
      – Proctors should be prohibited from viewing test taker screen
  • 17. C2. Using Cheating Aids
    • How can a proctor effectively detect and deal with a test taker using a cheating aid?
      – Camera that can see a large portion of the room, particularly desk, computer, keyboard, test taker head/arms/hands
      – Strong lockdown program to prevent access to online aids
      – Ability to contact test taker, pause test session, and perhaps suspend it
      – Ability to record/store session and time-stamp incident
  • 18. C3. Using a Proxy Test Taker
    • How can a proctor effectively detect and deal with a proxy test taker?
      – Strong initial authentication
      – Continuous authentication methods
      – Ability to contact test taker, pause test session, and perhaps suspend it
      – Ability to record/store session and time-stamp incident
  • 19. C4. Copying Answers from Another Test Taker
    • How can a proctor effectively detect and deal with a test taker copying from another?
      – Camera that can see a large portion of the room, including other test takers allowed in the room
      – Ability to contact test taker, pause test session, and perhaps suspend it
      – Ability to record/store session and time-stamp incident
  • 20. S1. Capture Test Content with Digital Photography
    • How can a proctor effectively detect and deal with the digital photography of the test questions?
      – Camera that can see a large portion of the room, along with head/hands/torso of test taker
      – Prevent proctor from viewing test screens
      – Ability to contact test taker, pause test session, and perhaps suspend it
      – Ability to record/store session and time-stamp incident
  • 21. S2. Transcribing Questions Verbally
    • How can a proctor effectively detect and deal with the transcription of test questions?
      – Camera that can see a large portion of the room, along with head/hands/torso of test taker
      – Prevent proctors from viewing the test taker screen
      – Ability to contact test taker, pause test session, and perhaps suspend it
      – Ability to record/store session and time-stamp incident
  • 22. Seven Critical Online Proctoring Best Practices
    1. Use wide-view camera with microphone
    2. Require lockdown program
    3. Prevent proctor view of test screens
    4. Allow and require proctor to communicate with test taker
    5. Allow proctor to control the test session
    6. Use strong authentication methods
    7. Record, store and time-stamp test session, along with proctor logs
  • 23. First: Use a Capable Webcam
    • Is external to the computer; connected by 2-4 feet of USB cable and set back
    • Has large field of view (approx. 80° up to 360°)
    • Allows proctor to see testing workstation and head, torso, arms, hands of test taker
    • Camera angle and placement does not allow viewing the workstation screen
    • Has integrated microphone
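As an illustration of the capture step this slide implies, the browser TypeScript sketch below enumerates attached cameras and requests one combined video-and-audio stream. It is an editor-added sketch, not material from the webinar; the resolution values and the idea of logging camera labels so an external, set-back webcam can be confirmed are assumptions for illustration only.

```typescript
// Minimal sketch of the camera/microphone setup implied by this slide.
// Resolution values and the "confirm an external camera" step are
// illustrative assumptions, not requirements from the paper or webinar.
async function startProctoringCapture(): Promise<MediaStream> {
  // List attached cameras so the proctoring client (or the proctor) can
  // confirm an external, set-back webcam is selected rather than a narrow
  // built-in one. Labels may be empty until permission has been granted.
  const devices = await navigator.mediaDevices.enumerateDevices();
  const cameras = devices.filter((d) => d.kind === "videoinput");
  console.log("Available cameras:", cameras.map((c) => c.label));

  // Request video plus audio in one stream, so the proctor hears the room
  // (the integrated microphone) as well as sees it.
  return navigator.mediaDevices.getUserMedia({
    video: { width: { ideal: 1280 }, height: { ideal: 720 } },
    audio: { echoCancellation: false, noiseSuppression: false },
  });
}
```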
  • 24. Second: Use a Capable Lockdown Program (Operating System/Computer/Browser)
    • Prevent right-click
    • Prevent min/max windows
    • Prevent printing
    • Prevent Copy/Paste
    • Prevent function keys
    • Prevent running of applications
    • Prevent important key combos
    • Prevent launch of applications
    • Hide Taskbar and Desktop
    • Prevent communication tools
    • Hide menus and icons
    • Prevent browser control/navigation
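To make a few of the lockdown behaviors listed above concrete, here is a minimal browser-level sketch (right-click, copy/paste, printing, selected key combinations). It is an illustrative assumption, not a vendor implementation: a real lockdown program is a native application that also hides the taskbar, blocks other applications, and so on, none of which is shown here.

```typescript
// A minimal, browser-only sketch of a few lockdown behaviors from this
// slide. The specific blocked keys are illustrative assumptions.
const blockedKeys = new Set(["F1", "F11", "F12", "PrintScreen"]);

function enableBrowserLockdown(): void {
  // Prevent right-click context menus.
  document.addEventListener("contextmenu", (e) => e.preventDefault());

  // Prevent copy, cut, and paste inside the test window.
  for (const type of ["copy", "cut", "paste"] as const) {
    document.addEventListener(type, (e) => e.preventDefault());
  }

  // Prevent printing (Ctrl/Cmd+P) and selected function keys.
  document.addEventListener("keydown", (e) => {
    const printCombo = (e.ctrlKey || e.metaKey) && e.key.toLowerCase() === "p";
    if (printCombo || blockedKeys.has(e.key)) {
      e.preventDefault();
    }
  });
}
```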
  • 25. Third: Prevent Proctor View of Test Workstation Screen
    • What are the reasons a proctor should not view the content of a test taker's workstation screen?
      – Security Reason #1: Viewing the screen may encourage collusion between proctor and test taker
      – Security Reason #2: Seeing the items of a high-stakes exam on a proctor's screen may encourage the theft of the test content
      – Privacy Reason: It's not appropriate for a proctor to view how a test taker answers test questions
  • 26. Fourth: Allow Proctor to Communicate with Test Taker
    • Chat
    • Canned Messages
    • Technical Support call if necessary
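A minimal sketch of the canned-message idea on this slide follows. The message ids, wording, and the WebSocket transport are hypothetical choices for illustration, not details from any particular vendor; the point is that standardized wording keeps proctor interventions consistent, as the speaker notes suggest.

```typescript
// Sketch of chat / canned messages sent from proctor to test taker.
// Message ids, texts, and the WebSocket payload shape are assumptions.
type CannedMessage = { id: string; text: string };

const cannedMessages: CannedMessage[] = [
  { id: "off-camera-voice", text: "We detected voices off camera. Please confirm you are alone in the room." },
  { id: "left-view", text: "You have moved out of camera view. Please return to your seat." },
  { id: "camera-blocked", text: "Your camera appears to be blocked. Please restore the view to continue." },
];

function sendProctorMessage(socket: WebSocket, messageId: string): void {
  const msg = cannedMessages.find((m) => m.id === messageId);
  if (!msg) throw new Error(`Unknown canned message: ${messageId}`);
  // Pre-approved wording plus a timestamp gives the proctor log a clean,
  // standardized record of each intervention.
  socket.send(JSON.stringify({ type: "proctor-message", ...msg, sentAt: new Date().toISOString() }));
}
```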
  • 27. Fifth: Allow Proctor to Control Test Session
    • Control allows proctor to deal effectively with detected threats
      – Pause the test
        • cover or remove the current question
        • require acknowledgement and compliance from the test taker
        • provide a warning
      – Suspend the test (can restart at later time)
      – Cancel the test
    • Proctor rules for these actions are subject to customization by the testing program
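The small state sketch below gathers the session controls this slide describes (pause/resume, suspend with later restart, cancel) in one place. The state names and transition rules are illustrative assumptions; as the slide says, the actual rules are customized by each testing program.

```typescript
// Illustrative sketch of the proctor session controls on this slide.
type SessionState = "running" | "paused" | "suspended" | "cancelled";

class ProctoredSession {
  state: SessionState = "running";

  pause(): void {
    // Pausing covers/removes the current question while the proctor
    // warns the test taker and waits for acknowledgement.
    if (this.state === "running") this.state = "paused";
  }

  resume(): void {
    if (this.state === "paused") this.state = "running";
  }

  suspend(): void {
    // A suspended test can be restarted later at the same place.
    if (this.state === "running" || this.state === "paused") this.state = "suspended";
  }

  cancel(): void {
    if (this.state !== "cancelled") this.state = "cancelled";
  }
}
```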
  • 28. Sixth: Use Strong Authentication Methods
    • Strong
      – Fingerprint readers
      – Facial recognition
      – Keystroke pattern recognition
      – Voice recognition
    • Avoid weak authentication methods
      – Viewing of government-issued IDs through webcam
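As a rough illustration of the raw data behind the "keystroke pattern recognition" item above, the sketch below records per-key dwell times (key down to key up) and flight times (previous key up to next key down) in the browser. How those features are matched against an enrolled profile is outside the slide's scope and is not shown; the feature names are assumptions.

```typescript
// Sketch of keystroke-timing capture for keystroke pattern recognition.
interface KeystrokeSample { key: string; dwellMs: number; flightMs: number }

const samples: KeystrokeSample[] = [];
const downTimes = new Map<string, number>();
let lastKeyUp: number | null = null;

document.addEventListener("keydown", (e) => {
  // Record the first key-down time for each key (ignore auto-repeat).
  if (!downTimes.has(e.key)) downTimes.set(e.key, performance.now());
});

document.addEventListener("keyup", (e) => {
  const down = downTimes.get(e.key);
  if (down === undefined) return;
  const now = performance.now();
  samples.push({
    key: e.key,
    dwellMs: now - down,                              // how long the key was held
    flightMs: lastKeyUp === null ? 0 : down - lastKeyUp, // gap since previous key release
  });
  downTimes.delete(e.key);
  lastKeyUp = now;
});
```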
  • 29. Seventh: Record and Store Sessions
    • Record entire test session and proctor logs/communications
    • Time stamp incidents to make them easy to locate
    • Provide easy-to-use review application for both video of test session and proctor logs
    • Store long-term or until no longer needed
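The record shape below is a hypothetical illustration of the time-stamped incident log this slide calls for, pairing a proctor note with an offset into the stored session recording so reviewers can jump straight to flagged moments. The field names are assumptions, not a known vendor schema.

```typescript
// Illustrative incident record linking the proctor log to the recording.
interface IncidentRecord {
  sessionId: string;
  examineeId: string;
  occurredAt: string;          // ISO-8601 timestamp of the incident
  videoOffsetSeconds: number;  // offset into the stored session video
  category: "collusion" | "cheating-aid" | "proxy" | "copying" | "capture" | "transcription" | "other";
  proctorNote: string;
}

function logIncident(log: IncidentRecord[], entry: IncidentRecord): void {
  // Append-only logging preserves the evidence trail the slide calls for
  // when decisions (including legal action) are made later.
  log.push(entry);
}
```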
  • 30. Best Practices Effects on Security Incidents
    • If these best practices are followed what benefits can be expected?
      – Better detection of threats or attempted breaches
      – Professional, un-biased, and competent response by proctors
      – Deterrence: Fewer incidents going forward
      – Stronger evidence basis for decision making and taking any action, including legal action
  • 31. Research Support
    • Reliable data – on site or online – is hard to come by
    • Limited data available for online proctoring; what is there is encouraging
    • More reporting by participating organizations will increase the data pool
    • Need studies also from academic researchers
  • 32. It's Only Going to Get Better: Future Proctoring Features
    • Improvements (e.g., better cameras; better proctor training, etc.)
    • New features: automated proctoring technologies
      – Inappropriate keystroke detection
      – Audio level detection and display
      – Detection of suspicious response patterns with Real-Time Data Forensics
      – Better facial, voice recognition software on the horizon
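For the "audio level detection and display" item above, one simple approach is sketched below using the standard Web Audio AnalyserNode to estimate room loudness from the proctoring audio stream. This is an editor-added sketch under stated assumptions; the polling interval and loudness threshold are illustrative and would need tuning per microphone and room.

```typescript
// Sketch of automated audio-level detection for a proctoring stream.
function monitorAudioLevel(stream: MediaStream, onLoud: (rms: number) => void): void {
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  source.connect(analyser);

  const buffer = new Float32Array(analyser.fftSize);
  const threshold = 0.05; // illustrative; tune per microphone and room

  setInterval(() => {
    analyser.getFloatTimeDomainData(buffer);
    // Root-mean-square amplitude as a simple loudness estimate.
    const rms = Math.sqrt(buffer.reduce((sum, x) => sum + x * x, 0) / buffer.length);
    if (rms > threshold) onLoud(rms); // e.g., flag for the proctor's display
  }, 250);
}
```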
  • 33. More Trends
    • More providers; may see increasing stratification between cost and quality / professionalism (as with other security services)
    • New technologies may allow better detection, or lower costs – or both
      – Augmenting human monitoring of online video with real-time "artificial intelligence" could allow for greater proctor-to-examinee ratios with even better security
  • 34. Lessons for Program Users
    • Do it right or be an easy target
    • Strong security is always needed for tests where cheating and theft occur, but not everyone has the same set of threats or concerns
    • Different audiences may have different levels of technical sophistication, risk, time sensitivity, etc.
    • Online proctoring, when done properly, can be as capable as any other well-run proctoring method at reducing test security risks
  • 35. HANDBOOK OF TEST SECURITY
    • Editors – James Wollack & John Fremer
    • Published March 2013
    • Preventing, Detecting, and Investigating Cheating
    • Testing in Many Domains
      – Certification/Licensure
      – Clinical
      – Educational
      – Industrial/Organizational
    • Don't forget to order your copy at www.routledge.com
      – http://bit.ly/HandbookTS (case sensitive)
      – Save 20% – enter discount code: HYJ82
  • 36. Questions? Please join our "Caveon Test Security" LinkedIn group and submit questions into the discussion box, or contact richelle.gruber@caveon.com
  • 37. THANK YOU!
    – Follow Caveon on twitter @caveon
    – Check out our blog… www.caveon.com/blog
    – LinkedIn Group – "Caveon Test Security"
    Dr. David Foster, CEO, Caveon Test Security
    Harry Layman, Executive Director, Digital Assessment Planning, The College Board
    Again, find our Online Proctoring Systems Compared paper at http://bit.ly/proctoring