
(2012) The Role of Test Administrator and Error proposal


  1. BIOMETRICS LAB
     Biometric Standards, Performance and Assurance Laboratory
     Department of Technology, Leadership and Innovation
     THE ROLE OF TEST ADMINISTRATOR AND ERROR
     Michael Brockly, March 6, 2013
  2. STATEMENT OF THE PROBLEM
     • Test administrator error is not currently included in the Human-Biometric Sensor Interaction model, thereby potentially attributing data collection errors to the wrong metric.
  3. SIGNIFICANCE
     • The test administrator has been ignored in the Human-Biometric Sensor Interaction (HBSI) model.
     • A portion of biometric data collection error is due to the test administrator.
     • Test methodology needs to take test administrator errors into account.
     • Taking additional performance issues into account will help to meet the criteria of data collection best practices.
  4. BIOMETRICS LAB
     Biometric Standards, Performance and Assurance Laboratory
     Department of Technology, Leadership and Innovation
     REVIEW OF LITERATURE
  5. QUALITY OF BIOMETRIC DATA
     • "Data quality [is] one of the most important factors in the effectiveness of a biometric system" (Hicklin & Khanna, 2006).
     • "Poor data quality is responsible for many or even most matching errors in biometric systems" (Hicklin & Khanna, 2006).
  6. QUALITY OF METADATA
     • Very important in biometric data collections.
     • Connects the biometric sample with the variables that affect the sample (a minimal record sketch follows this slide).
     • Examples include:
       – Gender
       – Fingerprint characteristics such as moisture
       – Number of attempts needed
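
The deck does not specify a record layout; as a minimal sketch, the link between a sample and its metadata might look like the following, where every class and field name is hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SampleMetadata:
    """Variables that affect a biometric sample (hypothetical fields)."""
    subject_id: str
    gender: Optional[str] = None              # demographic variable
    finger_moisture: Optional[float] = None   # e.g., moisture/sebum reading
    attempts_needed: int = 1                  # presentations before a usable capture

@dataclass
class BiometricSample:
    """A captured sample linked to the metadata that describes it."""
    sample_id: str
    modality: str         # e.g., "fingerprint" or "face"
    data_path: str        # location of the raw capture
    metadata: Optional[SampleMetadata] = None
```

Keeping the metadata on the sample record, rather than in a separate spreadsheet, is one way to guarantee that each sample stays connected to the variables that explain it.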
  7. TEST ADMINISTRATOR
     • Critical to the biometric acquisition process.
     • Takes on various roles in data collection.
     • Used to reduce the amount of poor-quality data in a system.
  8. BIOMETRIC PERFORMANCE
     • Many factors affect system performance.
     • Human factors and usability.
     • Studies have shown that the subject has a direct impact on the performance of the system.
  9. HBSI
  10. TEST ADMINISTRATOR ERROR
      • Can occur in biometric data and in metadata.
      • Adversely affects the quality of biometric data.
      • Literature has documented the need for test administrator performance metrics (Hicklin & Khanna, 2006).
  11. TRAINING
      • One method to reduce test administrator error.
      • Prevents poor quality at the source.
      • Adhere to ISO 17025.
        – Internal auditing checklist
  12. QUALITIES OF THE TEST ADMINISTRATOR
      • Knowledge
        – Understanding of the test
        – To correct procedures
      • Leadership
        – To instruct the test subjects
        – To provide assistance if necessary
  13. WORKLOAD
      • Test administrators will have multiple responsibilities.
      • Workload needs to be balanced.
      • Use automation when possible.
        – Reduce unwanted workload
        – Prevent mental calculations
  14. FATIGUE
      • Fatigue, stress, and distractions will affect test administrator performance.
      • The ability to maintain vigilance and attention declines over time (Graves et al., 2011).
  15. STRESS
      • Additional errors and quality problems increase with test administrator workload and stress (Hicklin & Khanna, 2006).
      • Throughput times
        – Time constraints
  16. DESIGNING THE DATA COLLECTION
      • The system is designed to provide functionality along with ease of use.
      • Cognitively engineered system
      • Usability testing
  17. SYSTEM EASE OF USE
      • Well-made Graphical User Interface (GUI)
        – Free of extraneous information
      • Ease of use for both test administrator and subject
  18. CONTINUOUS IMPROVEMENT
      • Improving the GUI
      • Improving the test
      • Eliminating error
  19. IMPACT ON THE SYSTEM
      • Costs associated with errors.
      • Errors that remain unresolved can jeopardize data quality.
      • Impact on the HBSI model.
  20. SUMMARY OF RELATED WORK
      • Literature has mentioned the need for a test administrator (Graves et al., 2011; Theofanos et al., 2007).
      • There is a need for test administrator performance metrics.
      • The test administrator is not included in the HBSI model.
  21. BIOMETRICS LAB
      Biometric Standards, Performance and Assurance Laboratory
      Department of Technology, Leadership and Innovation
      METHODOLOGY
  22. IDENTIFICATION OF VARIABLES
      • From literature
      • From survey and focus groups
      • From an ongoing study
  23. VARIABLES FROM LITERATURE
      • Best practice documentation
      • Corrective Action Requests
      • Preventive Action Requests
  24. SURVEY
      • Quantitative data from Likert questions (a summary sketch follows this slide).
      • Qualitative data from short-answer questions.
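
For the quantitative side, a Likert item is usually summarized with response counts plus a median and mean; a small sketch with made-up responses (the data are illustrative, not from the study):

```python
from collections import Counter
from statistics import mean, median

# Hypothetical responses to one Likert question
# (1 = strongly disagree ... 5 = strongly agree)
responses = [4, 5, 3, 4, 2, 5, 4, 4, 3, 5]

print("counts:", dict(sorted(Counter(responses).items())))
print("median:", median(responses))          # robust central tendency for ordinal data
print("mean:  ", round(mean(responses), 2))  # often reported alongside the median
```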
  25. FOCUS GROUPS
      • Consulting a group of trained test administrators.
      • Recall events and experiences.
      • Recommend changes to the system.
  26. VARIABLES FROM ONGOING STUDY
      • Department of Homeland Security (DHS) Aging Study, visit 1.
      • Biometric samples.
      • Biometric metadata.
  27. TESTING ENVIRONMENT
  28. EXPERIMENTAL SETUP
      • Data from the survey is used to establish the significance of the project.
      • Data is analyzed from DHS Aging Study visit 1.
      • System changes are put into effect for DHS Aging Study visit 2.
  29. PROCEDURE IMPROVEMENTS
      • Based on test administrator error frequencies (a tally sketch follows this slide).
      • Recommendations from literature and test administrator surveys.
      • Improvements in:
        – Consent (Demographic)
        – Driver’s License Capture (Demographic)
        – Fingerprint Statistics Capture (Metadata)
        – Face Capture (Biometric data)
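
The deck does not show how the error frequencies were computed; one plausible sketch is a simple tally over logged test administrator errors, ranked so the most frequent problems are improved first (the error labels below are invented for illustration):

```python
from collections import Counter

# Hypothetical log of test administrator errors observed in visit 1
error_log = [
    "consent_form_missing_signature",
    "license_field_mistyped",
    "fingerprint_stats_field_skipped",
    "license_field_mistyped",
    "face_capture_wrong_distance",
    "fingerprint_stats_field_skipped",
    "fingerprint_stats_field_skipped",
]

# Rank error types by frequency to prioritize procedure improvements
for error, count in Counter(error_log).most_common():
    print(f"{count:2d}  {error}")
```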
  30. CONSENT
      • Creating an electronic consent form (see the database sketch below).
      • Eliminates the need for paper documents.
      • Documents are signed electronically.
      • Records are saved to a database.
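
A minimal sketch of the electronic consent workflow, assuming an SQLite store; the table and column names are assumptions, not the study's actual schema:

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("collection.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS consent (
        subject_id TEXT PRIMARY KEY,
        signed_at  TEXT NOT NULL,   -- ISO-8601 timestamp of the signature
        signature  BLOB NOT NULL    -- captured electronic signature image
    )
""")

def record_consent(subject_id: str, signature_png: bytes) -> None:
    """Store an electronically signed consent record (hypothetical schema)."""
    conn.execute(
        "INSERT INTO consent (subject_id, signed_at, signature) VALUES (?, ?, ?)",
        (subject_id, datetime.now(timezone.utc).isoformat(), signature_png),
    )
    conn.commit()

record_consent("S001", b"\x89PNG...")  # placeholder signature bytes
```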
  31. DRIVER’S LICENSE
      • Introduce a procedure to check and enter data directly into the database.
      • Subjects with missing or incorrect data are automatically flagged for verification (see the sketch below).
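
A sketch of the automatic flagging described above: compare the fields entered from the license against what the protocol requires and return the reasons a record needs manual verification (all field names are hypothetical):

```python
REQUIRED_FIELDS = ("name", "date_of_birth", "license_number")

def flag_license_record(record: dict) -> list[str]:
    """Return reasons a license record needs manual verification."""
    flags = [f"missing:{field}" for field in REQUIRED_FIELDS if not record.get(field)]
    # Example consistency check: an expired license needs a second look
    if record.get("expired", False):
        flags.append("inconsistent:license_expired")
    return flags

entry = {"name": "J. Doe", "date_of_birth": "", "license_number": "X1234"}
print(flag_license_record(entry))  # ['missing:date_of_birth']
```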
  32. FINGERPRINT STATISTICS
      • Introduce a procedure to enter data directly into the database.
        – Mandatory that all fields are entered (see the schema sketch below).
      • Corrected method for collecting oiliness (sebum).
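
One way to make "mandatory that all fields are entered" self-enforcing is to push the rule into the database schema rather than rely on the test administrator; a sketch with assumed column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# NOT NULL on every column makes the mandatory-field rule self-enforcing
conn.execute("""
    CREATE TABLE fingerprint_stats (
        subject_id TEXT NOT NULL,
        finger     TEXT NOT NULL,      -- e.g., "right_index"
        moisture   REAL NOT NULL,      -- sebum/moisture reading
        attempts   INTEGER NOT NULL
    )
""")

try:
    conn.execute("INSERT INTO fingerprint_stats VALUES (?, ?, ?, ?)",
                 ("S001", "right_index", None, 2))  # moisture omitted
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # NOT NULL constraint failed: fingerprint_stats.moisture
```

Rejecting the insert at the schema level means a skipped field surfaces immediately at entry time instead of during later analysis.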
  33. FACE COLLECTION
      • Create standardized camera settings (see the configuration sketch below).
      • Correct the test administrator challenge of looking at an external portrait template for a standard distance.
        – Integrated portrait template on the device itself.
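
Standardized camera settings could live in a versioned configuration file that every capture station loads, so no administrator dials in settings by hand; the parameter names below are illustrative only, not the study's actual configuration:

```python
import json

# Hypothetical standardized face-capture settings, shared by all stations
CAMERA_SETTINGS = {
    "resolution": [1920, 1080],
    "iso": 200,
    "white_balance": "daylight",
    "subject_distance_cm": 100,  # pairs with the integrated portrait template
}

with open("face_camera_settings.json", "w") as f:
    json.dump(CAMERA_SETTINGS, f, indent=2)

with open("face_camera_settings.json") as f:
    assert json.load(f) == CAMERA_SETTINGS  # every station loads the same file
```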
  34. AFTER APPROVAL
      • Put all system changes into effect.
      • Collect data in visit 2.
      • Analyze the data for old and new errors.
      • Conduct a post-collection survey for test administrators.
      • Recommend further changes if necessary.
  35. BIOMETRICS LAB
      Biometric Standards, Performance and Assurance Laboratory
      Department of Technology, Leadership and Innovation
      QUESTIONS?
  36. REFERENCES
      • Braun, D. (1998). The role of funding agencies in the cognitive development of science. Research Policy, 27(8), 807–821. doi:10.1016/S0048-7333(98)00092-4
      • Campbell, J., & Madden, M. (2009). ILO Seafarers’ Identity Documents Biometric Interoperability Test (ISBIT-4) Report. ILO (Vol. 2003, pp. 1–162).
      • Database. (n.d.). In Merriam-Webster dictionary. Retrieved from http://www.merriam-webster.com/dictionary/database
      • Druckman, J. N., Green, D. P., Kuklinski, J. H., & Lupia, A. (2011). Cambridge Handbook of Experimental Political Science. Cambridge University Press.
      • Dumas, J., & Loring, B. (2008). Moderating Usability Tests. Elsevier. doi:978-0-12-373933-9
      • Elliott, S., Kukula, E., & Modi, S. (2007). Issues Involving the Human Biometric Sensor Interface. In S. Yanushkevich, P. Wang, M. Gavrilova, & S. Srihari (Eds.), Image Pattern Recognition: Synthesis and Analysis in Biometrics (Vol. 67, pp. 339–363). Singapore: World Scientific.
      • Elliott, S. J., & Kukula, E. P. (2010). A Definitional Framework for the Human-Biometric Sensor Interaction Model. doi:10.1117/12.850595
      • Ernst, A., Jiang, H., Krishnamoorthy, M., & Sier, D. (2004). Staff scheduling and rostering: A review of applications, methods and models. European Journal of Operational Research, 153(1), 3–27. doi:10.1016/S0377-2217(03)00095-X
  37. REFERENCES
      • Hicklin, A., & Khanna, R. (2006). The Role of Data Quality in Biometric Systems. White paper, Mitretek Systems (February 2006), 1–77. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.110.4351&rep=rep1&type=pdf
      • International Ergonomics Association (IEA). (2006). The Discipline of Ergonomics. Retrieved February 23, 2011, from http://www.iea.cc/01_what/What%20is%20Ergonomics.html
      • International Organization for Standardization (ISO). (2005). Biometric Performance Testing and Reporting – Part 1: Principles and Framework. ISO/IEC FCD 19795-1.
      • International Organization for Standardization (ISO). (2006b). Software engineering – Software product Quality Requirements and Evaluation (SQuaRE) – Common Industry Format (CIF) for usability test reports (No. ISO/IEC 25062:2006(E)). Geneva: ISO/IEC.
      • International Organization for Standardization (ISO). (2010). Information technology – Vocabulary – Part 37: Harmonized Biometric Vocabulary. ISO/IEC 2382-37.
      • International Organization for Standardization (ISO). (2011). Information technology – Biometric performance testing and reporting – Part 6: Testing methodologies for operational evaluation. ISO/IEC FCD 19795-6.
  38. REFERENCES
      • Kushniruk, A. W., Patel, V. L., & Cimino, J. J. (1997). Usability testing in medical informatics: Cognitive approaches to evaluation of information systems and user interfaces. Proceedings of the AMIA Annual Fall Symposium, 218–222. Retrieved from http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2233486&tool=pmcentrez&rendertype=abstract
      • Kukula, E., & Elliott, S. (2006). Implementing Ergonomic Principles in a Biometric System: A Look at the Human Biometric Sensor Interaction (HBSI). Proceedings of the 40th Annual 2006 International Carnahan Conference on Security Technology (pp. 86–91). Lexington, KY: IEEE. doi:10.1109/CCST.2006.313434
      • Kukula, E. P., & Elliott, S. J. (2009). Ergonomic Design for Biometric Systems. Encyclopedia of Biometrics.
      • Kukula, E., & Proctor, R. (2009). Human-Biometric Sensor Interaction: Impact of Training on Biometric System and User Performance. In M. J. Smith & G. Salvendy (Eds.), Human Interface, Part II, HCII 2009 (pp. 168–177). Berlin/Heidelberg: Springer. doi:10.1007/978-3-642-02559-4_19
      • Mansfield, T., Kelly, G., David, C., & Jan, K. (2001). Biometric Product Testing Final Report (pp. 1–22). Teddington. Retrieved from http://www.lgiris.com/download/brochure/uk_report.pdf
  39. REFERENCES
      • Murata, A., & Iwase, H. (1998). Effectiveness of cognitively engineered human interface design, 20(5), 7–10. doi:0-7803-5164-9/98
      • Norman, D. A. (1986). Cognitive engineering. In D. A. Norman & S. W. Draper (Eds.), User Centered System Design. Hillsdale, NJ: Erlbaum.
      • Plan for Biometric Qualified Product List (QPL). (2005).
      • Redman, T. C. (1998). The impact of poor data quality on the typical enterprise. Communications of the ACM, 41(2), 79–82.
      • Ruthruff, E. (1996). A test of the deadline model for speed-accuracy tradeoffs. Perception & Psychophysics, 58(1), 56–64.
      • Sekaran, U. (2003). Research Methods for Business: A Skill Building Approach.
      • Senjaya, B. (2010). The Impact of Instructional Training Methods on the Biometric Data Collection Agent (M.S. thesis). Purdue University. Major Professor: Stephen Elliott.
      • Theofanos, M., Stanton, B., Micheals, R., & Orandi, S. (2007). Biometric Systematic Uncertainty and the User. IEEE Conference on Biometrics: Theory, Applications and Systems (pp. 1–6). doi:978-1-4244-1597-7/07
  40. REFERENCES
      • Wayman, J. (1997). A generalized biometric identification system model. Conference Record of the Thirty-First Asilomar Conference on Signals, Systems and Computers, 1, 291–295. Pacific Grove, CA: IEEE. doi:10.1109/ACSSC.1997.6802
      • Wickens, C. D., Lee, J. D., Liu, Y., & Gordon-Becker, S. E. (2004). An Introduction to Human Factors Engineering (2nd ed.). Upper Saddle River, NJ: Prentice Hall.
