(2013) The Role of Test Administrator and Error


Michael Brockly's M.S. thesis presentation for Purdue University, December 2013.

This study created a framework to quantify and mitigate the error that test administrators introduce to a biometric system during data collection. Prior research has focused only on the subject and the errors they make when interacting with biometric systems, while ignoring the test administrator. This study used a longitudinal data collection, focusing on demographics in government identification forms such as driver’s licenses, fingerprint metadata such as moisture and skin temperature, and face image compliance with an ISO best practice standard. Error was quantified from the first visit, and baseline test administrator error rates were measured. Additional training, software development, and error mitigation techniques were introduced before a second visit, in which the error rates were measured again. The new system greatly reduced the amount of test administrator error and improved the integrity of the data collected. Findings from this study show how to measure test administrator error and how to reduce it in future data collections.

Published in: Technology

(2013) The Role of Test Administrator and Error

  2. STATEMENT OF THE PROBLEM
  • Test administrator error is not currently included in the Human-Biometric Sensor Interaction model, thereby potentially attributing data collection errors to the wrong metric
  3. SIGNIFICANCE
  • The test administrator has been ignored in the Human-Biometric Sensor Interaction (HBSI) model
  • A portion of biometric data collection error is due to the test administrator
  • Test methodology needs to take test administrator errors into account
  • Taking additional performance issues into account will help to meet the criteria of data collection best practices
  5. QUALITY OF BIOMETRIC DATA
  • “Data quality [is] one of the most important factors in the effectiveness of a biometric system” (Hicklin & Khanna, 2006)
  • “Poor data quality is responsible for many or even most matching errors in biometric systems” (Hicklin & Khanna, 2006)
  • Interested in the quality of the data itself, not solely image quality
  6. QUALITY OF METADATA
  • Very important in biometric data collections
  • Connects the biometric sample with the variables that affect the sample
  • Examples include:
    • Gender
    • Fingerprint characteristics such as moisture
    • Number of attempts needed
  7. TEST ADMINISTRATOR
  • Critical to the biometric acquisition process
  • Takes various roles in data collection
  • Used to reduce the amount of poor quality data in a system
  8. BIOMETRIC PERFORMANCE
  • Many factors affect system performance
  • Human factors and usability
  • Studies have shown that the subject has a direct impact on the performance of the system
  • The next step is to analyze the test administrator's impact
  9. TRAINING
  • One method to reduce test administrator error
  • Prevents poor quality data at the source
  • Adhere to ISO 17025
  • Internal auditing checklist
  10. WORKLOAD
  • Test administrators have multiple responsibilities
  • Workload needs to be balanced
  • Use automation when possible
  • Reduce unwanted workload
  • Prevent mental calculations
  11. DESIGNING THE DATA COLLECTION
  • System is designed to provide functionality along with ease of use
  • Cognitively engineered system
  • Usability testing
  12. SYSTEM EASE OF USE
  • Well-made Graphical User Interface (GUI)
  • Free of extraneous information
  • Ease of use for both test administrator and subject
  13. CONTINUOUS IMPROVEMENT
  • Improving the GUI
  • Improving the test
  • Eliminating error
  14. SUMMARY OF RELATED WORK
  • Literature has mentioned the need for a test administrator (Graves et al., 2011; Theofanos et al., 2007)
  • There is a need for test administrator performance metrics
  • The test administrator is not included in the HBSI model
  16. VARIABLES FROM LITERATURE
  • Best practice documentation
  • Corrective Action Requests
  • Preventive Action Requests
  17. VARIABLES FROM FOCUS GROUPS
  • Consulted a group of trained test administrators
  • Recalled events and experiences
  • Recommended changes to the system
  18. VARIABLES FROM ONGOING STUDY
  • Department of Homeland Security (DHS) Aging Study visit 1
  • Biometric samples
  • Biometric metadata
  20. EXPERIMENTAL SETUP
  • Survey data were used to establish the significance of the project
  • Data were analyzed from DHS Aging Study visit 1
  • System changes were put into effect for DHS Aging Study visit 2
  21. PROCEDURE IMPROVEMENTS
  • Based on test administrator error frequencies
  • Recommendations from literature and test administrator surveys
  • Improvements in:
    • Consent (Demographic)
    • Government ID Capture (Demographic)
    • Fingerprint Statistics Capture (Biometric metadata)
    • Face Capture (Biometric data)
  22. CONSENT
  • Created an electronic consent form
  • Eliminated the need for paper documents
  • Documents signed electronically
  • Records saved to the database
  23. GOVERNMENT ID
  • Introduced a procedure to check and enter data directly into the database
  • Subjects with missing or incorrect data were automatically flagged for verification
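The flagging procedure for government ID records can be sketched as follows. This is an illustrative reconstruction, not the study's actual code: the field names, the canonical YYYY-MM-DD date format, and the `flag_for_verification` helper are all assumptions.

```python
import re

# Hypothetical required fields, mirroring the metrics in the Government ID
# results table (date of birth, issue country/date/state, ID type).
REQUIRED_FIELDS = ["date_of_birth", "issue_country", "issue_date", "issue_state", "id_type"]

def flag_for_verification(record: dict) -> list:
    """Return the list of problems that would flag a record for verification."""
    problems = []
    # Blank fields: the GUI highlighted these in red.
    for name in REQUIRED_FIELDS:
        if not str(record.get(name, "")).strip():
            problems.append(f"{name}: blank")
    # Format check on dates (assumed canonical form: YYYY-MM-DD).
    for name in ("date_of_birth", "issue_date"):
        value = str(record.get(name, ""))
        if value and not re.fullmatch(r"\d{4}-\d{2}-\d{2}", value):
            problems.append(f"{name}: incorrect format")
    return problems
```

A clean record yields an empty list; a record with a blank or malformed field is queued for the test administrator to verify rather than silently saved.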
  24. FINGERPRINT STATISTICS
  • Introduced a procedure to enter data directly into the database
  • Made all fields mandatory
  • Corrected the method for collecting oiliness (sebum)
  25. FACE COLLECTION
  • Created standardized camera settings
  • Corrected the test administrator's challenge of checking an external portrait template from a standard distance
  • Integrated the portrait template on the device itself
  26. METHODOLOGY
  • Put all system changes into effect
  • Collected data in visit 2
  • Analyzed data for old and new errors
  • Conducted a post-collection survey for test administrators
  • Recommended further changes as needed
  27. RESULTS
  28. TEST ADMINISTRATOR QUIZ
  • Minimum score of 80% to pass
  • Most commonly missed question:
    • Test administrators thought no changes could be made to the data collection once it had begun
    • Reminded of continuous improvement, allowing changes if they did not jeopardize data integrity
  29. TEST ADMINISTRATOR QUIZ
  • 20 questions
  • Based on laboratory processes and test administrator best practices

  Test Administrator | % Correct
  1 | 90%
  2 | 80%
  3 | 95%
  4 | 80%
  5 | 80%
  6 | 95%
  7 | 100%
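The pass criterion can be checked mechanically against the quiz scores above; the dictionary is transcribed from the table, and the 80% threshold is the one stated on the slide.

```python
# Quiz scores per test administrator, transcribed from the table above.
scores = {1: 90, 2: 80, 3: 95, 4: 80, 5: 80, 6: 95, 7: 100}
PASSING = 80  # minimum score to pass, per the slide

passed = sorted(ta for ta, pct in scores.items() if pct >= PASSING)
print(passed)  # all seven test administrators met the 80% minimum
```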
  31. SUBJECT CHECK-IN
  • Lookup tool for returning subjects
  • Red fields for blank text
  • Calendar tool for date of birth
    • Used to provide a standard format
  • Radio buttons for demographic information
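The standardization a calendar tool enforces can be sketched as a normalization step: free-text dates in a few common shapes are converted to one canonical form before they reach the database. The accepted input formats and the YYYY-MM-DD canonical form are assumptions for illustration, not the study's specification.

```python
from datetime import datetime

# Assumed input formats the tool might accept, and an assumed canonical form.
ACCEPTED = ("%m/%d/%Y", "%d %b %Y", "%Y-%m-%d")
CANONICAL = "%Y-%m-%d"

def normalize_dob(text: str) -> str:
    """Normalize a date-of-birth string to the canonical database format."""
    for fmt in ACCEPTED:
        try:
            return datetime.strptime(text.strip(), fmt).strftime(CANONICAL)
        except ValueError:
            continue  # try the next accepted format
    raise ValueError(f"unrecognized date format: {text!r}")
```

With a graphical calendar widget, the subject never types the date at all; this function shows the equivalent guarantee for typed input.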
  33. GOVERNMENT ID COLLECTION
  • Red fields for blank text
  • Drop-down boxes for categorical fields
  • Calendar tools for date of birth and issue date
  • Comment box for issues or missing identification
  35. GOVERNMENT ID RESULTS

  Metric | Visit One | Visit Two
  Missing Subjects (All Fields Blank) | 25 | 3
  Date of Birth (Blank) | 27 | 1
  Date of Birth (Incorrect Format) | 1 | 0
  Issue Country (Blank) | 1 | 1
  Issue Date (Blank) | 8 | 1
  Issue Date (Erroneous Entry) | 5 | 0
  Issue State (Blank) | 1 | 1
  Issue State (Incorrect Format) | 0 | 1
  ID Type (Blank) | 0 | 1
  Signature Image (Blank) | 3 | 2
  Face Image (Blank) | 0 | 2
  Total Erroneous Fields | 221 | 31
  Percent Erroneous Fields | 28.44% | 5.47%
  36. FINGERPRINT METADATA
  • Red fields for blank text
  • Drop-down boxes for categorical data
  • Error messages for invalid values
  • Corrected methodology for collecting sebum
  • Comment box for issues or concerns
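The "error messages for invalid values" behavior can be sketched as a per-field check that returns a message for the GUI to display. The value ranges below are placeholders; the slides do not give the actual instrument ranges, and the `validate_metadata` helper is hypothetical.

```python
# Assumed valid ranges for a few fingerprint metadata fields; the real
# instrument ranges used in the study are not given in the slides.
VALID_RANGES = {
    "temperature": (20.0, 40.0),   # skin temperature, assumed °C window
    "moisture": (0.0, 100.0),      # assumed meter reading
    "sebum": (0.0, 100.0),         # assumed oiliness reading
}

def validate_metadata(field: str, raw: str):
    """Return an error message for the GUI, or None if the value is acceptable."""
    if not raw.strip():
        return f"{field} may not be blank"       # blank field -> red highlight
    try:
        value = float(raw)
    except ValueError:
        return f"{field}: '{raw}' is not a number"
    low, high = VALID_RANGES[field]
    if not (low <= value <= high):
        return f"{field}: {value} outside expected range {low}-{high}"
    return None
```

Rejecting the value at entry time is what drove the "Sebum (Measured Incorrectly)" count from 99 in visit one to 0 in visit two.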
  38. FINGERPRINT METADATA RESULTS

  Metric | Visit One | Visit Two
  Missing Subjects (All Fields Blank) | 12 | 0
  Temperature (Blank) | 0 | 0
  Skin Texture (Blank) | 0 | 0
  Pigmentation (Blank) | 0 | 0
  Sebum (Measured Incorrectly) | 99 | 0
  Moisture (Blank) | 0 | 0
  Elasticity (Blank) | 0 | 0
  Skin Color (Blank) | 0 | 0
  Keratin (Blank) | 0 | 0
  Total Erroneous Fields | 195 | 0
  Percent Erroneous Fields | 21.96% | 0.00%
  39. ERROR MESSAGES
  • Birthdate
  • Fingerprint Metadata
  40. FACE CAPTURE
  • A template was used to line up the subjects' eyes
  • Instructed that "+" symbols should cover the entire eye
  • Improved the height-to-width ratio of the image
  • Noncompliance in degree of blur and other metrics was also reduced
  41. FACE CAPTURE RESULTS (% Compliant)

  Metric | Visit One | Visit Two
  Eye Separation | 95.34% | 97.21%
  Eye Axis Angle | 97.21% | 99.20%
  Eye Axis Location Ratio | 87.58% | 97.61%
  Centerline Location Ratio | 0% | 0%
  Height to Width Ratio | 50.93% | 100%
  Head Height to Image Height Ratio | 97.52% | 97.61%
  Image Width to Head Width Ratio | 69.26% | 37.85%
  Eye Contrast | 100% | 100%
  Brightness Score | 100% | 100%
  Facial Dynamic Range | 100% | 100%
  Percent Facial Brightness | 100% | 100%
  Percent Facial Saturation | 100% | 100%
  Degree of Blur | 60.56% | 68.13%
  Image Format | 100% | 100%
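Per-metric compliance percentages like those above can be tabulated by testing each image's measured value against the metric's allowed window. The thresholds and sample ratios below are placeholders for illustration, not the ISO best-practice values used in the study.

```python
def percent_compliant(values, low, high):
    """Share (in %) of images whose measured value falls inside [low, high]."""
    passing = sum(1 for v in values if low <= v <= high)
    return 100.0 * passing / len(values)

# Hypothetical height-to-width ratios checked against an assumed window.
ratios = [1.28, 1.31, 1.40, 1.27]
rate = percent_compliant(ratios, low=1.25, high=1.34)
print(f"{rate:.2f}% compliant")  # → 75.00% compliant (3 of 4 inside the window)
```

Each row of the results table is one such computation over all face images collected in a visit.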
  42. TEST ADMINISTRATOR RESPONSIBILITY
  • A drop-down menu added redundancy in case test administrators did not log in
  • Provided accountability for errors
  44. CONCLUSIONS
  • Standardizing training was an essential step in reducing test administrator error
  • Test administrators used the tools established for error documentation
    • 44 Corrective Action Requests
    • 5 Preventive Action Requests
  45. POST-MORTEM
  • Conducted a group session, asking test administrators about their data collection experiences
  • Three focuses:
    • What went well
    • What went poorly
    • What changes would be made if the study were repeated
  46. POST-MORTEM
  • Positive reaction to the GUI
    • "The checklists and tabs in the test administrator GUI decreased my stress level"
  • Scripts were not always used
    • Repetitive in a multi-visit study
  47. FUTURE WORK
  • Single-program solution for most data collection needs
  • GUI is fully modifiable for future data collections
  • CAR and PAR system will continue to be used in error reporting and correction
  48. FUTURE WORK IN HBSI
  • Test administrators create a portion of the error in a biometric data collection
  • Test administrators also influence subjects to create a portion of the error
  • Future work can assign these errors to the HBSI metrics
  50. REFERENCES
  • Anonymous Test Administrator (personal communication, July 12, 2013).
  • Braun, D. (1998). The role of funding agencies in the cognitive development of science. Research Policy, 27(8), 807–821. doi:10.1016/S0048-7333(98)00092-4.
  • Campbell, J., & Madden, M. (2009). ILO Seafarers’ Identity Documents Biometric Interoperability Test (ISBIT-4) Report. ILO (Vol. 2003, pp. 1–162).
  • Database. (n.d.). Merriam-Webster dictionary. Retrieved from http://www.merriam-webster.com/dictionary/database.
  • Druckman, J. N., Green, D. P., Kuklinski, J. H., & Lupia, A. (2011). Cambridge Handbook of Experimental Political Science. Cambridge University Press.
  • Dumas, J., & Loring, B. (2008). Moderating Usability Tests. Elsevier. doi:978-0-12-373933-9.
  • Elliott, S., Kukula, E., & Modi, S. (2007). Issues Involving the Human Biometric Sensor Interface. In S. Yanushkevich, P. Wang, M. Gavrilova, & S. Srihari (Eds.), Image Pattern Recognition: Synthesis and Analysis in Biometrics (Vol. 67, pp. 339–363). Singapore: World Scientific.
  • Elliott, S. J., & Kukula, E. P. (2010). A Definitional Framework for the Human-Biometric Sensor Interaction Model. doi:10.1117/12.850595.
  • Ernst, A., Jiang, H., Krishnamoorthy, M., & Sier, D. (2004). Staff scheduling and rostering: A review of applications, methods and models. European Journal of Operational Research, 153(1), 3–27. doi:10.1016/S0377-2217(03)00095-X.
  • Hales, G. T. (2010). Evaluation of the Indiana Department of Correction Mug Shot Capture Process. M.S. thesis, Purdue University. Major Professor: Dr. Stephen Elliott.
  51. REFERENCES
  • Hicklin, A., & Khanna, R. (2006). The Role of Data Quality in Biometric Systems. White Paper. Mitretek Systems (February 2006), 1–77. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=
  • International Ergonomics Association (IEA). (2006). The Discipline of Ergonomics. Retrieved February 23, 2011, from http://www.iea.cc/01_what/What%20is%20Ergonomics.html.
  • International Organization for Standardization (ISO). (2005). Biometric Performance Testing and Reporting – Part 1: Principles and Framework. ISO/IEC FCD 19795-1.
  • International Organization for Standardization (ISO). (2006). Software engineering – Software product Quality Requirements and Evaluation (SQuaRE) – Common Industry Format (CIF) for usability test reports (ISO/IEC 25062:2006(E)). Geneva: ISO/IEC.
  • International Organization for Standardization (ISO). (2010). Information processing systems – Vocabulary – Part 37: Harmonized Biometric Vocabulary. ISO/IEC DIS 2382.37.
  • International Organization for Standardization (ISO). (2011). Information technology – Biometric performance testing and reporting – Part 6: Testing methodologies for operational evaluation. ISO/IEC FCD 19795-6.2.
  • Kushniruk, A. W., Patel, V. L., & Cimino, J. J. (1997). Usability testing in medical informatics: Cognitive approaches to evaluation of information systems and user interfaces. Proceedings of the AMIA Annual Fall Symposium, 218–222. Retrieved from http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2233486&tool=pmcentrez&rendertype=abstract.
  • Kukula, E., & Elliott, S. (2006). Implementing Ergonomic Principles in a Biometric System: A Look at the Human Biometric Sensor Interaction (HBSI). Proceedings of the 40th Annual 2006 International Carnahan Conference on Security Technology (pp. 86–91). Lexington, KY: IEEE. doi:10.1109/CCST.2006.313434.
  52. REFERENCES
  • Kukula, E. P., & Elliott, S. J. (2009). Ergonomic Design for Biometric Systems. Encyclopedia of Biometrics. Springer US. doi:10.1007/978-0-387-73003-5_184.
  • McCabe, M. R. (1997). Best Practice Recommendation for the Capture of Mug Shots. National Institute of Standards and Technology. Retrieved September 22, 2013, from http://biometrics.nist.gov/cs_links/standard/ansi_2010/Best_Practice_Face_Pose_Value.pdf.
  • Murata, A., & Iwase, H. (1998). Effectiveness of Cognitively Engineered Human Interface Design, 20(5), 7–10. doi:0-7803-5164-9/98.
  • Norman, D. A. (1986). Cognitive engineering. In D. A. Norman & S. W. Draper (Eds.), User Centered System Design. Hillsdale, NJ: Erlbaum.
  • Plan For Biometric Qualified Product List (QPL). (2005).
  • Redman, T. C. (1998). Poor Data Quality on the Typical Enterprise. Communications of the ACM, 41(2), 79–82.
  • Ruthruff, E. (1996). A test of the deadline model for speed-accuracy tradeoffs. Perception & Psychophysics, 58(1), 56–64.
  • Sekaran, U. (2003). Research Methods for Business: A Skill Building Approach.
  • Senjaya, B. (2010). The Impact of Instructional Training Methods on the Biometric Data Collection Agent. M.S. thesis, Purdue University. Major Professor: Dr. Stephen Elliott.
  • Theofanos, M., Stanton, B., Micheals, R., & Orandi, S. (2007). Biometric Systematic Uncertainty and the User. IEEE Conference on Biometrics: Theory, Applications and Systems (pp. 1–6). doi:978-1-4244-1597-7/07.
  53. REFERENCES
  • Wayman, J. (1997). A generalized biometric identification system model. Conference Record of the Thirty-First Asilomar Conference on Signals, Systems and Computers, 1, 291–295. Pacific Grove, CA: IEEE. doi:10.1109/ACSSC.1997.6802.
  • Wickens, C. D., Lee, J. D., Liu, Y., & Gordon-Becker, S. E. (2004). An Introduction to Human Factors Engineering (2nd ed.). Upper Saddle River, NJ: Prentice Hall.