Trust, Privacy and Biometrics


Presentation given to the Biometrics Working Group on 14 May

  • Slide note: e.g. use genuinely one-way templates (hardly any of which exist); keep templates on hardware directly under user control (not in verifier databases); do checks on equipment under user control (e.g. smartcards), or at most on readers certified not to retain biometric data after scanning it and passing it to user-controlled equipment that says “yes/no” (and hence will need some TPM-style zero-knowledge certified approval and a check against revocation)
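The match-on-card pattern described in the note above can be illustrated with a minimal sketch. The class names, the toy feature comparison and the threshold are all hypothetical; a real scheme would use secure hardware, proper feature extraction and certified attestation. The point is the interface: the template never leaves user-controlled hardware, the reader retains nothing, and only a yes/no decision crosses the boundary.

```python
class SmartCard:
    """Stands in for user-controlled hardware holding the enrolled template."""

    def __init__(self, enrolled_features):
        self._template = enrolled_features  # never leaves the card

    def verify(self, scanned_features, threshold=0.9):
        # Toy similarity measure: fraction of matching feature values.
        matches = sum(a == b for a, b in zip(self._template, scanned_features))
        score = matches / len(self._template)
        return score >= threshold  # only a yes/no crosses the interface


class CertifiedReader:
    """Certified not to retain biometric data after passing it to the card."""

    def check(self, card, raw_scan):
        features = raw_scan  # real feature extraction elided in this sketch
        decision = card.verify(features)
        del features, raw_scan  # discard scan data; the reader keeps nothing
        return decision


card = SmartCard(enrolled_features=[1, 0, 1, 1, 0, 1, 1, 0, 1, 1])
reader = CertifiedReader()
print(reader.check(card, [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]))  # True
print(reader.check(card, [0, 1, 0, 0, 1, 0, 0, 1, 0, 0]))  # False
```

The design choice being illustrated is that the verifier learns nothing reusable: even a compromised verifier database or logged reader session contains no biometric template to steal.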

    1. Privacy, trust and biometrics
       Dr Ian Brown, Oxford Internet Institute, University of Oxford
    2. Short-term trust
       • Reputation of the organising institution
       • Opinions in the mass media about technologies
       • Attitudes & opinions of friends and family
       • Convenience the system brings
       AM Oostveen (2007) Context Matters: A Social Informatics Perspective on the Design and Implications of Large-Scale e-Government Systems, PhD thesis, Amsterdam University
    3. Trust in government
    4. Trust is fragile
       “Trust is built over the long term, on the basis not of communication but of action. And then again, trust, once established, can be lost in an instant” –Niall FitzGerald, Chairman, Unilever
    5. Longer-term legitimacy
       • Informed, democratic consent
       • Do citizens and their representatives have full information on costs & benefits?
       • Privacy Impact Assessment?
       • Compatibility with human rights (S & Marper v UK, Liberty v UK, I v Finland)
       • Continued legislative and judicial oversight and technological constraint
       • Privacy by Design
    6. How not to do it
       • “We really don't know a whole lot about the overall costs and benefits of homeland security” –senior DHS economist Gary Becker (2006)
       • “Policy discussions of homeland security issues are driven not by rigorous analysis but by fear, perceptions of past mistakes, pork-barrel politics, and insistence on an invulnerability that cannot possibly be achieved.” –Jeremy Shapiro (2007)
       • “Finding out other people’s secrets is going to involve breaking everyday moral rules.” –David Omand (2009)
    7. Credible impact assessment
       • Risk must be quantified to be meaningful, even for low-probability high-impact events
       • How strong is the evidence that the “solution” will work?
       • How widely do stakeholders agree that cost < benefit? Include direct cost, inconvenience, enhancement of fear, negative economic impacts, reduction of liberties
       • “Any analysis that leaves out such considerations is profoundly faulty, even immoral” –John Mueller (2009) The quixotic quest for invulnerability, International Studies Association, New York
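The quantified cost < benefit comparison that slide 7 calls for can be sketched in a few lines. Every figure below is an invented placeholder, not an estimate of any real measure; the structure is the point: expected loss averted per year must be set against the full cost, including the indirect costs Mueller insists on counting.

```python
def expected_annual_benefit(p_event, loss_per_event, risk_reduction):
    """Expected loss averted per year by a security measure."""
    return p_event * loss_per_event * risk_reduction


# Hypothetical low-probability, high-impact event (all numbers invented):
benefit = expected_annual_benefit(
    p_event=0.01,                   # assumed annual probability of the event
    loss_per_event=1_000_000_000,   # assumed loss if it occurs
    risk_reduction=0.10,            # assumed fraction of risk the measure removes
)

direct_cost = 2_000_000
indirect_costs = 500_000  # inconvenience, fear, economic and liberty impacts
total_cost = direct_cost + indirect_costs

print(f"expected annual benefit: {benefit:,.0f}")
print(f"total annual cost:       {total_cost:,.0f}")
print("justified on these numbers?", benefit > total_cost)
```

On these placeholder figures the measure averts an expected 1,000,000 per year against a 2,500,000 cost, so it fails the test; omitting the indirect costs would understate the gap, which is exactly the analytical flaw the slide warns against.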
    8. CCTV efficacy
       • Effective only in limited circumstances (e.g. car parks); otherwise reduces crime by about 3% (NACRO)
       • Better street lighting reduces crime by 20% (Home Office)
       • “It's been an utter fiasco: only 3% of crimes were solved by CCTV” –DCI Mike Neville, head of the Visual Images, Identifications and Detections Office
    9. Efficacy of facial recognition
       • Does it identify terrorists and serious criminals, or pickpockets and ticket touts?
       • How many arrests might we expect?
       • How accurate is it in typical conditions?
       • Do we have high-quality images of terrorist suspects?
    10. What we need for biometrics
        1. A strong evidence base for any biometric proposed for public use
        2. A careful threat analysis and cost/benefit assessment for each proposed use, including damage caused to privacy and other human rights and a comparison with alternative mechanisms, with independent scrutiny
        3. Strict technological limitation of the use of biometrics to the minimum required to achieve the security goals of each use
        4. Wherever possible, full consumer choice in the decision to use biometrics at all and then in the issuer (e.g. Crosby-style private sector leadership and consumer choice, with government playing a minimal standards-setting role)
        5. Full transparency and strict oversight and enforcement of DPA and ECHR principles in the design and operation of systems
    11. Conclusions
        • Democratic legitimacy and human rights are critical to ensure security technology supports rather than subverts liberal political values
        • Meaningful transparency, ECHR compliance and oversight are critical
        • Slapdash risk assessment and spin are extremely corrosive to trust in the long term