(2010) HBSI and Hand Geometry

993
views

Published on

In this paper, we examine the role of the Human-Biometric Sensor Interaction (HBSI) model in the evaluation of hand geometry.

Transcript

  • 1. Evaluation of the Human Biometric Sensor Interaction using Hand Geometry
    Carnahan Conference | San Jose, CA | October 7, 2010
    Biometric Standards, Performance, and Assurance Laboratory |
    Purdue University
    www.bspalabs.org
    www.twitter.com/bspalabs
    www.slideshare.net/bspalabs
    www.linkedin.com/companies/bspa-labs
  • 2. Agenda
    Current Testing Standards and Norms
    The Missing Link?
    What performance evaluations should also explain
    Usability & Biometrics: shouldn't our systems be usable?
    The Human-Biometric Sensor Interaction (HBSI)
    HBSI Framework
    Applicability to Hand Geometry
    Questions
  • 3. Scope – from Mansfield & Grother's The Wide World of Biometric Testing
    Have tests been driven by what can be done rather than by what should be done?
    What can be done: measure FRR after data collection
    vs.
    What should be done: observe and count mispresentation effects
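    For reference, the FRR measured in such tests is a simple proportion (stated here per the conventions of ISO/IEC 19795-1; whether the unit is an attempt or a transaction depends on the test plan):
      FRR = (falsely rejected genuine attempts) / (total genuine attempts)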
  • 4. Development of a general model
    Wayman
    “commonly-held knowledge and oft-repeated descriptions of biometric identification systems are more complicated than necessary because of this lack of a generally accepted system model”
  • 5. A General Biometric System Model (ISO/IEC 19795-1)
  • 6. What do our testing standards say?
    Distinctions between technology and scenario evaluations according to ISO/IEC 19795-2:
  • 7. What do our testing standards say?
    What about the:
    Environment
    ISO/IEC 1st WD 29197, Evaluation methodology for environmental influence in biometric systems
    Human-Sensor Interaction

  • 8. Lack of user-centric design
    A. Adams and M.A. Sasse ("Users are Not the Enemy: Why Users Compromise Security Mechanisms and How to Take Remedial Measures," Comm. ACM, vol. 42, no. 12, 1999, pp. 41–46) explain that typical security deployments ignore user-centric design, or at best leave it until the end.
  • 9. Aim of HBSI model
    Provide a structure and definition to errors that are observed while undertaking biometric tests on various biometric modalities.
  • 10. What Performance Evaluations Should Also Explain
    Is the algorithm the cause of matching errors?
    Is the application or environment the problem?
    Is the design of the sensor the problem?
    Are the users/agents causing the issue?
    Can users/agents do what the system/sensor is asking for?
    Do users/agents understand how to use the system/sensor?
    Can users/agents produce repeatable images?
  • 11. HBSI model and deployed system
    Understanding how users interact with the system is also important for its successful deployment.
    Full habituation arises when match scores have stabilized.
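    One way to make "stabilized" concrete, purely as an illustration (the window size and threshold below are assumptions, not values from this study), is to flag habituation once the rolling spread of a user's genuine match scores falls under a cutoff:
      # Sketch: flag habituation when recent match scores have settled.
      # `window` and `max_std` are illustrative assumptions.
      from statistics import pstdev

      def is_habituated(scores, window=10, max_std=2.0):
          """True once the last `window` match scores vary little."""
          if len(scores) < window:
              return False  # too few visits to judge yet
          return pstdev(scores[-window:]) <= max_std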
  • 12. The journey to match score stabilization
  • 13. Usability & Biometrics “Usability”
    Usability
    The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use (ISO 9241-11:1998, ISO/IEC 25062:2006)
    Failure to Acquire (FTA)
    Traditional measure of “usability” in biometrics
    Proportion of verification or identification attempts for which the system fails to capture or locate an image or signal of sufficient quality (ISO/IEC 19795-1)
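    Like FRR, FTA reduces to a simple proportion:
      FTA = (attempts in which no sample of sufficient quality was acquired) / (total attempts)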
  • 14. The Human-Biometric Sensor Interaction (HBSI)
    Derived from multiple research fields to better understand and evaluate overall functionality and performance of a biometric system
  • 15. HBSI Framework for Biometric Interactions
    Objective
    Classify every human-sensor interaction “event” with the resulting biometric system “reaction”
    Event + Reaction = HBSI episode
    Purpose
    Understand and classify all interactions / movements / behaviors that occur with a biometric device to improve performance, quality, and usability
    Examines a biometric system from 2 perspectives:
    User & Biometric System
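    A minimal sketch of the "event + reaction = HBSI episode" idea as a data structure (all names below are illustrative assumptions, not part of the framework itself):
      # Sketch: an HBSI episode pairs what the user did (event) with
      # what the system did (reaction). Names are illustrative.
      from dataclasses import dataclass
      from enum import Enum

      class Presentation(Enum):
          CORRECT = "correct"
          INCORRECT = "incorrect"

      class Reaction(Enum):
          NOT_DETECTED = "not detected"
          ACCEPTED = "detected, accepted as a sample"
          HANDLED_AS_ERROR = "detected, handled as an error"
          EXTRACTION_FAILED = "detected, extraction failed"

      @dataclass
      class Episode:
          event: Presentation   # the human side of the interaction
          reaction: Reaction    # the system side of the interaction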
  • 16. HBSI Framework for Biometric Interactions
  • 17. Incorrect Presentation – Defective Interaction
    A defective interaction occurs when the subject makes an erroneous presentation to the hand geometry machine that the system does not detect.
  • 18. Incorrect Presentation – User Concealed Interaction
    An erroneous presentation is made to the hand geometry machine and the user is at fault; the system nonetheless accepts the sample.
  • 19. Incorrect Presentation – System Concealed Interaction
    The user interacts with the hand geometry machine, but does not provide a good quality sample.
  • 20. Correct Presentation – Failure to Detect
    The user correctly places their hand in the hand geometry machine, but the machine does nothing and times out.
  • 21. Correct Presentation – External FTD
    Factors outside the control of the user impact the system
  • 22. Correct Presentation – Failure to Extract
    A sample has been collected from the hand geometry machine, but the algorithm fails to extract meaningful data.
  • 23. Correct Presentation – Successfully Acquired Sample
    The sample is detected by the system, successfully extracted, and passed through to the biometric matching system.
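    Read together, slides 17–23 define a small decision tree over each presentation. A sketch of that classification (function and argument names are illustrative assumptions; the user/system split of concealed interactions in slides 18–19 is collapsed into a single CI branch):
      # Sketch: map one presentation to its HBSI outcome per the
      # taxonomy above. Argument names are illustrative.
      def classify_presentation(correct: bool, detected: bool,
                                handled_as_error: bool, extracted: bool) -> str:
          if not correct:
              if not detected:
                  return "Defective Interaction (DI)"
              return ("False Interaction (FI)" if handled_as_error
                      else "Concealed Interaction (CI)")
          if not detected:
              return "Failure to Detect (FTD)"
          if not extracted:
              return "Failure to Extract (FTX)"
          return "Successfully Acquired Sample (SAS)"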
  • 24. Observations
    Interaction videos are valuable because they let us segment these errors
    Once the errors have been identified, we can improve training
  • 25. Examples – False Interaction
  • 26. Examples - FTX
  • 27. Examples – System CI
  • 28. Examples - FTD
  • 29. Example - SAS
  • 30. Results - Enrollment
  • 31. Results - Verification
  • 32. Analysis and Results
  • 33. Future Work
    Evaluate more modalities with the framework
    physical-interactive
    image-based
    behavioral
    Refinement of the metrics
    Inter-rater reliability (e.g., Cohen's kappa; see the sketch after this list)
    T&E Standard Methodology?
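    For the inter-rater reliability item, one standard option (our assumption, not a metric named in the talk) is Cohen's kappa over two raters' HBSI labels for the same interaction videos:
      # Sketch: chance-corrected agreement between two raters' labels.
      from collections import Counter

      def cohens_kappa(labels_a, labels_b):
          assert len(labels_a) == len(labels_b) > 0
          n = len(labels_a)
          observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
          ca, cb = Counter(labels_a), Counter(labels_b)
          expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
          return (observed - expected) / (1 - expected)

      # e.g. cohens_kappa(["DI", "CI", "SAS"], ["DI", "FI", "SAS"]) ≈ 0.57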
  • 34. Any Questions?
    Follow the discussion on the research blog after the conference
    www.bspalabs.org/
  • 35. Authors and Primary Contact Information
    Authors
    Benny Senjaya
    Graduate Researcher at BSPA Lab
    bennysenjaya@gmail.com
    Stephen Elliott, Ph.D.
    BSPA Lab Director & Associate Professor
    elliott@purdue.edu
    Eric Kukula, Ph.D.
    Visiting Assistant Professor
    ekukula@gmail.com
    Mark Wade
    Undergraduate Researcher
    Jason Werner
    Undergraduate Researcher
    Contact Information
    Stephen Elliott, Ph.D.
    Associate Professor
    Director of BSPA Labs
    elliott@purdue.edu