(2010) HBSI and Hand Geometry


In this paper, we examine the role of HBSI and Hand Geometry.


  1. Evaluation of the Human-Biometric Sensor Interaction Using Hand Geometry
     Carnahan Conference | San Jose, CA | October 7, 2010
     Biometric Standards, Performance, and Assurance Laboratory
     Purdue University
     www.bspalabs.org
     www.twitter.com/bspalabs
     www.slideshare.net/bspalabs
     www.linkedin.com/companies/bspa-labs
  2. Agenda
     Current testing standards and norms
     The missing link?
     What performance evaluations should also explain
     Usability & biometrics: our systems should be usable
     The Human-Biometric Sensor Interaction (HBSI)
     HBSI framework
     Applicability to hand geometry
     Questions
  3. Scope – from Mansfield & Grother's "The Wide World of Biometric Testing"
     Have tests been driven by what can be done vs. what should be done?
     What can be done: measure FRR after data collection
     What should be done: observe and count mispresentation effects
  4. Development of a general model
     Wayman: "commonly-held knowledge and oft-repeated descriptions of biometric identification systems are more complicated than necessary because of this lack of a generally accepted system model"
  5. A General Biometric System Model (ISO/IEC 19795-1)
  6. What do our testing standards say?
     Distinctions between technology and scenario evaluations according to ISO/IEC 19795-2
  7. What do our testing standards say?
     What about the:
     Environment – ISO/IEC 1st WD 29197, Evaluation methodology for environmental influence in biometric systems
     Human-sensor interaction – …
  8. Lack of user-centric design
     A. Adams and M.A. Sasse ("Users Are Not the Enemy: Why Users Compromise Security Mechanisms and How to Take Remedial Measures," Comm. ACM, vol. 42, no. 12, 1999, pp. 41–46) explain that typical security deployments ignore, or at least leave until the end, user-centric design.
  9. Aim of the HBSI model
     Provide structure and definitions for the errors observed while conducting biometric tests across various biometric modalities.
  10. What Performance Evaluations Should Also Explain
      Is the algorithm the cause of matching errors?
      Is the application or environment the problem?
      Is the design of the sensor the problem?
      Are the users/agents causing the issue?
      Can users/agents do what the system/sensor is asking of them?
      Do users/agents understand how to use the system/sensor?
      Can users/agents produce repeatable images?
  11. HBSI model and the deployed system
      Understanding how users interact with the system is also important for successful deployment of a biometric system.
      Full habituation is reached when match scores stabilize.
  12. The journey to match score stabilization
  13. Usability & Biometrics
      Usability: the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use (ISO 9241-11:1998, ISO/IEC 25062:2006)
      Failure to Acquire (FTA): the traditional measure of "usability" in biometrics – the proportion of verification or identification attempts for which the system fails to capture or locate an image or signal of sufficient quality (ISO/IEC 19795-1)
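As a concrete illustration of the FTA definition above, here is a minimal sketch of the proportion it describes. The attempt data and function name are hypothetical, not drawn from the study:

```python
# Sketch: Failure to Acquire (FTA) as defined in ISO/IEC 19795-1 --
# the proportion of attempts for which the system fails to capture
# or locate a signal of sufficient quality. Data below is hypothetical.

def failure_to_acquire(attempts):
    """attempts: list of booleans, True if the sample was acquired."""
    if not attempts:
        raise ValueError("no attempts recorded")
    failures = sum(1 for acquired in attempts if not acquired)
    return failures / len(attempts)

# Hypothetical session: 20 presentation attempts, 3 capture failures.
attempts = [True] * 17 + [False] * 3
print(f"FTA = {failure_to_acquire(attempts):.2%}")  # FTA = 15.00%
```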
  14. The Human-Biometric Sensor Interaction (HBSI)
      Derived from multiple research fields to better understand and evaluate the overall functionality and performance of a biometric system
  15. HBSI Framework for Biometric Interactions
      Objective: classify every human-sensor interaction "event" with the resulting biometric system "reaction"
      Event + Reaction = HBSI episode
      Purpose: understand and classify all interactions, movements, and behaviors that occur with a biometric device in order to improve performance, quality, and usability
      Examines a biometric system from two perspectives: the user and the biometric system
  16. HBSI Framework for Biometric Interactions
  17. Incorrect Presentation – Defective Interaction
      A defective interaction occurs when the subject makes an erroneous presentation to the hand geometry machine and the system does not detect it.
  18. Incorrect Presentation – User Concealed Interaction
      An erroneous presentation to the hand geometry machine for which the user is at fault.
  19. Incorrect Presentation – System Concealed Interaction
      The user interacts with the hand geometry machine but does not provide a sample of sufficient quality.
  20. Correct Presentation – System Failure to Detect
      The user correctly places their hand in the hand geometry machine, but the machine does not respond and times out.
  21. Correct Presentation – External Failure to Detect (FTD)
      Factors outside the control of the user impact the system.
  22. Correct Presentation – Failure to Extract
      Occurs after a sample has been collected by the hand geometry machine, but the algorithm fails to extract meaningful data.
  23. Correct Presentation – Successfully Acquired Sample
      The sample has been detected by the system, subsequently extracted, and passed through to the biometric matching subsystem.
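The outcomes defined on the preceding slides can be summarized as a decision sketch. The function below is illustrative only: the boolean flags and their evaluation order are assumptions layered on the slide definitions, not the published HBSI decision logic:

```python
# Illustrative sketch of HBSI outcome classification for a single
# hand geometry presentation. Flags and ordering are assumptions,
# not the published HBSI framework logic.

def classify_interaction(correct_presentation, detected, extracted,
                         user_at_fault=False, external_cause=False):
    if not correct_presentation:
        if not detected:
            return "Defective Interaction"   # erroneous, undetected
        # Erroneous presentation accepted by the system:
        return ("User Concealed Interaction" if user_at_fault
                else "System Concealed Interaction")
    if not detected:
        # Correct placement, but the machine times out:
        return ("External Failure to Detect" if external_cause
                else "System Failure to Detect")
    if not extracted:
        return "Failure to Extract"          # captured, no usable features
    return "Successfully Acquired Sample"

print(classify_interaction(True, True, True))
# Successfully Acquired Sample
```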
  24. Observations
      Interaction videos are valuable because they allow us to segment these errors.
      Once the errors have been identified, we can improve training.
  25. Examples – False Interaction
  26. Examples – Failure to Extract (FTX)
  27. Examples – System Concealed Interaction
  28. Examples – Failure to Detect (FTD)
  29. Example – Successfully Acquired Sample (SAS)
  30. Results – Enrollment
  31. Results – Verification
  32. Analysis and Results
  33. Future Work
      Evaluate more modalities with the framework: physical-interactive, image-based, behavioral
      Refinement of the metrics
      Inter-rater reliability
      A test & evaluation (T&E) standard methodology?
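One common way to quantify the inter-rater reliability mentioned above is Cohen's kappa, which corrects raw agreement between two raters for chance agreement. A minimal sketch with hypothetical HBSI codings (the labels and ratings are invented for illustration, not results from the paper):

```python
from collections import Counter

# Sketch: Cohen's kappa for agreement between two raters coding the
# same interaction videos with HBSI outcome labels. Data is hypothetical.

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters coded identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    expected = sum(counts_a[l] * counts_b[l] for l in labels) / n**2
    return (observed - expected) / (1 - expected)

a = ["SAS", "SAS", "FTD", "CI", "SAS", "FTX"]
b = ["SAS", "SAS", "FTD", "SAS", "SAS", "FTX"]
print(round(cohens_kappa(a, b), 3))  # 0.727
```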
  34. Any Questions?
      Follow the discussion on the research blog after the conference: www.bspalabs.org/
  35. Authors and Primary Contact Information
      Benny Senjaya – Graduate Researcher, BSPA Lab – bennysenjaya@gmail.com
      Stephen Elliott, Ph.D. – BSPA Lab Director & Associate Professor – elliott@purdue.edu
      Eric Kukula, Ph.D. – Visiting Assistant Professor – ekukula@gmail.com
      Mark Wade – Undergraduate Researcher
      Jason Werner – Undergraduate Researcher
      Primary contact: Stephen Elliott, Ph.D., Associate Professor, Director of BSPA Labs, elliott@purdue.edu