Trust, Privacy and Biometrics

Presentation given to the Biometrics Working Group on 14 May

Upload Details

Uploaded as Microsoft PowerPoint
Usage Rights

© All Rights Reserved

  • - eg use genuinely one-way templates (hardly any of which exist); keep templates on hardware directly under user control (not in verifier databases); do checks on equipment under user control (eg smartcards), or at most on readers certified not to retain biometric data after scanning it and passing it to user-controlled equipment that returns a "yes/no" answer (and hence will need some TPM-style zero-knowledge certified approval and a check against revocation)
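The match-on-card architecture described above can be sketched in miniature. This is a hedged illustration, not a real biometric scheme: the `Smartcard` and `Reader` names are hypothetical, and a plain salted hash stands in for a genuinely one-way template. That substitution only works here because the sample bytes are identical between scans; real biometric samples are noisy, so deployed schemes need fuzzy extractors or secure sketches, and the reader certification, TPM-style attestation and revocation checks are only noted in comments.

```python
import hashlib
import hmac
import secrets

class Smartcard:
    """User-controlled token: the template never leaves the card."""
    def __init__(self, enrolment_sample: bytes):
        # A salted hash stands in for a one-way template here
        # (few real biometric schemes achieve genuine one-wayness).
        self._salt = secrets.token_bytes(16)
        self._template = hashlib.sha256(self._salt + enrolment_sample).digest()

    def verify(self, live_sample: bytes) -> bool:
        # Matching happens on equipment under user control; only a
        # yes/no answer is ever released to the verifier.
        candidate = hashlib.sha256(self._salt + live_sample).digest()
        return hmac.compare_digest(candidate, self._template)

class Reader:
    """Certified reader: scans, forwards to the card, retains nothing."""
    def scan_and_check(self, card: Smartcard, live_sample: bytes) -> bool:
        # No copy of live_sample is stored here; in a real deployment the
        # certification regime (plus TPM-style attestation and a revocation
        # check) would have to guarantee that, not this code.
        return card.verify(live_sample)

card = Smartcard(b"alice-fingerprint-features")
reader = Reader()
print(reader.scan_and_check(card, b"alice-fingerprint-features"))    # True
print(reader.scan_and_check(card, b"mallory-fingerprint-features"))  # False
```

The point of the structure, rather than the toy crypto, is the data flow: the verifier database holds nothing, the reader holds nothing, and the only output crossing a trust boundary is a single bit.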

Trust, Privacy and Biometrics: Presentation Transcript

  • Privacy, trust and biometrics
    Dr Ian Brown, Oxford Internet Institute, University of Oxford
  • Short-term trust
    - Reputation of the organising institution
    - Opinions in the mass media about technologies
    - Attitudes & opinions of friends and family
    - Convenience the system brings

    AM Oostveen (2007) Context Matters: A Social Informatics Perspective on the Design and Implications of Large-Scale e-Government Systems, PhD thesis, Amsterdam University
  • Trust in government
  • Trust is fragile
    “Trust is built over the long term, on the basis not of communication but of action. And then again, trust, once established, can be lost in an instant”
    -Neil Fitzgerald, Chairman, Unilever
  • Longer-term legitimacy
    - Informed, democratic consent
    - Do citizens and their representatives have full information on costs & benefits? Privacy Impact Assessment?
    - Compatibility with human rights (S & Marper v UK, Liberty v UK, I v Finland)
    - Continued legislative and judicial oversight and technological constraint
    - Privacy by Design
  • How not to do it
    - “We really don't know a whole lot about the overall costs and benefits of homeland security” –senior DHS economist Gary Becker (2006)
    - “Policy discussions of homeland security issues are driven not by rigorous analysis but by fear, perceptions of past mistakes, pork-barrel politics, and insistence on an invulnerability that cannot possibly be achieved.” –Jeremy Shapiro (2007)
    - “Finding out other people’s secrets is going to involve breaking everyday moral rules.” –David Omand (2009)
  • Credible impact assessment
    - Risk must be quantified to be meaningful, even for low-probability high-impact events
    - How strong is evidence that “solution” will work?
    - How widely do stakeholders agree that cost < benefit? Include direct cost, inconvenience, enhancement of fear, negative economic impacts, reduction of liberties
    - “Any analysis that leaves out such considerations is profoundly faulty, even immoral”

    John Mueller (2009) The quixotic quest for invulnerability, International Studies Association, New York
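The quantification the slide calls for can be made concrete with a toy expected-loss calculation. Every number below is invented purely for illustration (none are from the talk); the structure, probability times impact on one side, direct plus indirect costs on the other, is the point:

```python
# Illustrative figures only (assumptions, not data from the presentation).
attack_probability_per_year = 1e-4   # low-probability event
attack_impact = 5e9                  # high-impact loss if it occurs
expected_annual_loss = attack_probability_per_year * attack_impact  # 500,000

measure_direct_cost = 2e8   # annual direct cost of the security measure
indirect_costs = 5e7        # inconvenience, fear, economic impact, liberties
risk_reduction = 0.5        # fraction of expected loss the measure averts

benefit = risk_reduction * expected_annual_loss    # 250,000
total_cost = measure_direct_cost + indirect_costs  # 250,000,000

print(benefit < total_cost)  # True: cost exceeds benefit ~1000-fold
```

With these (deliberately unfavourable) assumptions the measure fails the cost-benefit test by three orders of magnitude; omitting the indirect-cost line is exactly the kind of analysis Mueller objects to.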
  • CCTV efficacy
    - Effective only in limited circumstances (e.g. car parks); otherwise reduces crime by about 3% (NACRO)
    - Better street lighting reduces crime by 20% (Home Office)
    - “It's been an utter fiasco: only 3% of crimes were solved by CCTV” -DCI Mike Neville, head of Visual Images, Identifications and Detections Office
  • Efficacy of facial recognition
    - Does it identify terrorists and serious criminals, or pickpockets and ticket touts?
    - How many arrests might we expect?
    - How accurate in typical conditions?
    - Do we have high-quality images of terrorist suspects?
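The accuracy question above has a well-known statistical sting, the base-rate fallacy: when genuine suspects are rare in the screened population, even a good matcher produces mostly false alarms. A sketch with invented, illustrative figures (not measurements of any deployed system):

```python
# Illustrative assumptions for a watchlist system screening a crowd.
population_screened = 1_000_000
suspects_present = 10          # genuine watchlist matches in the crowd
true_positive_rate = 0.90      # sensitivity of the matcher
false_positive_rate = 0.001    # 0.1% false alarms: optimistic for the field

true_alerts = suspects_present * true_positive_rate                # 9.0
false_alerts = (population_screened - suspects_present) * false_positive_rate

# Precision: what fraction of alerts actually point at a suspect?
precision = true_alerts / (true_alerts + false_alerts)
print(round(precision, 3))  # 0.009 — fewer than 1% of alerts are genuine
```

Even with a 90%-sensitive matcher and a one-in-a-thousand false-alarm rate, roughly a thousand innocent people are flagged for every nine suspects, which is why "how accurate in typical conditions?" cannot be answered without also asking how rare the targets are.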
  • What we need for biometrics
    1. Strong evidence base for any biometric proposed for public use
    2. A careful threat analysis and cost/benefit assessment for each proposed use, including damage caused to privacy and other human rights and a comparison with alternative mechanisms, with independent scrutiny
    3. The strict technological limit of the use of biometrics to the minimum required to achieve the security goals of each use
    4. Wherever possible, full consumer choice in the decision to use biometrics at all and then in the issuer (eg Crosby-style private sector leadership and consumer choice with govt playing a minimal standards-setting role)
    5. Full transparency and strict oversight and enforcement of DPA and ECHR principles in the design and operation of systems
  • Conclusions
    - Democratic legitimacy and human rights are critical to ensure security technology supports rather than subverts liberal political values
    - Meaningful transparency, ECHR compliance and oversight are critical
    - Slap-dash risk assessment and spin are extremely corrosive to trust in the long term