Getting the social side of pervasive computing right
Transcript

  • 1. Getting the social side of pervasive computing right
    • Ian Brown, Computer Science, UCL
    • Also works with: Privacy International, European Digital Rights, Foundation for Information Policy Research, Open Rights Group
  • 2. Overview
    • “Dark scenarios”
    • Privacy principles and laws
    • Designing privacy in
  • 3. Pervasive/ubiquitous/ambient/whatever
    • Computing systems that use new sensor technology (RFIDs, smart dust, biometric readers, millimeter wave scanners…) to collect an order of magnitude more environmental information
    • Networked together with novel user interfaces and large data stores to provide history
  • 4. Many positive prospects for “invisible” computing
    • Especially for those who are less comfortable with existing device-centric technology:
      • Seniors
      • Less educated
      • Non information workers
    • Much more information available from environment to improve system decision making
  • 5. Need to address social impacts to ensure trust in new systems
    • Just like security, privacy is much easier to design in from the start than to bolt on at the end
    • Privacy disasters (see RFIDs) are hard to recover from
    • “How would you like it if, for instance, one day you realized your underwear was reporting on your whereabouts?”
      • California State Senator Debra Bowen, at a 2003 hearing
  • 6. Safeguards in a World of Ambient Intelligence (SWAMI)
    • European Commission-funded research to identify four “dark scenarios” that show potential social problems with pervasive computing
    • Based on capabilities of existing and prototype systems and recent news stories
    • Two reports available from swami.jrc.es, conference being held 21-22 March in Brussels
  • 7. SWAMI “threats”
    • Quantity of personal information in circulation will increase greatly
    • Introduction of perceptual and biometric interfaces will transform the qualitative nature of personal information in circulation
    • Personalised services require the tracking and collection of significant portions of users’ everyday activities
  • 8. Day-to-day family life scenario
    • Middle-class teleworkers with two teenage children
    • Security requirements for the home office include video surveillance, presence and biometric sensors
    • Location-based services and shopping agents widely used
  • 9. Day-to-day family life problems
    • Police open investigation into father based on inaccurate profiling of his activities, delaying a promotion at work
    • Shopping agent interrupts mother’s client presentation with inappropriate gift suggestion. Borrowed dress is scanned by criminals, who break into friend’s home
    • Teenage son circumvents home network security, places bets, downloads porn, and rifles through father’s favourite sites and stored data
  • 10. Seniors on a journey scenario
    • Two retirees take a group vacation and a bus accident occurs after a malfunction in traffic light management software
    • Group members use advanced health monitoring systems
  • 11. Seniors on a journey problems
    • Seniors’ families are automatically notified of accident, but have trouble getting in contact while digital communicators are blocked by emergency responders and hospital
    • One trip member dies on way to hospital as her outdated health monitors fail to inform ambulance workers of serious internal injuries
    • Hackers caused the traffic light problem by breaking into the traffic management priority system
  • 12. Corporate malfeasance scenario
    • Data Mining Corporation builds profiles on hundreds of millions of adults based on personal data gathered through ambient intelligence
    • Profiles used by private companies and also government intelligence and immigration agencies
    • Critical infrastructure staff must wear location implants
  • 13. Corporate malfeasance problems
    • 16m records stolen through insider attack
    • Two highly placed employees have disappeared but cannot be tracked via implants. Incidents of ID theft and blackmail soar
    • Lack of ambient intelligence profiles on developing world citizens makes Western nations highly reluctant to allow visits
  • 14. Societal dependence scenario
    • Developed societies become totally dependent on pervasive technologies
    • Individuals have no meaningful choice to “opt out”
    • Resources concentrated on profitable activities such as marketing and profiling, not in areas such as environmental monitoring that could have a wider social impact
  • 15. Societal dependence problems
    • Very few systems are engineered for the 99.999% reliability we expect from the phone system
    • Imagine the chaos that could result when large-scale pervasive systems crash, are infected by viruses, are taken down by DoS attacks…
    • What emergent effects will we see from the interaction of greatly-increased personal area networks and systems?
    • Will privacy be an option?
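As a back-of-the-envelope check on the five-nines figure above, the downtime budget implied by a given availability level can be sketched in a few lines of Python (the arithmetic is standard; the numbers are illustrative):

```python
# Downtime budget implied by a given availability level.
# "Five nines" (99.999%) is the phone-system reliability figure
# cited on the slide above.

MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960

def downtime_minutes_per_year(availability: float) -> float:
    """Minutes of permitted downtime per year at this availability."""
    return MINUTES_PER_YEAR * (1.0 - availability)

print(f"{downtime_minutes_per_year(0.99999):.1f}")  # five nines: ~5.3 min/year
print(f"{downtime_minutes_per_year(0.99):.0f}")     # two nines: ~5260 min/year
```

Five nines allows barely five minutes of outage per year; a system engineered to "merely" 99% availability may be down for the better part of four days.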
  • 16. Overview
    • “Dark scenarios”
    • Privacy principles and laws
    • Designing privacy in
  • 17. OECD Fair Information Practices
    • Collection Limitation Principle
    • Data Quality Principle
    • Purpose Specification Principle
    • Use Limitation Principle
    • Security Safeguards Principle
    • Openness Principle
    • Individual Participation Principle
    • Accountability Principle
  • 18. EU Privacy Directive
    • OECD Fair Information Practices incorporated into 1995 EU Directive
    • All 25 member states must give it force in national law
    • Limits export of personal data to non-compliant jurisdictions
    • Creates Working Party of Commissioners
  • 19. Art. 29 Working Party concerns
    • “Working Party 29 (“Working Party 29”) is concerned about the possibility for some applications of RFID technology to violate human dignity as well as data protection rights. In particular, concerns arise about the possibility of businesses and governments to use RFID technology to pry into the privacy sphere of individuals. The ability to surreptitiously collect a variety of data all related to the same person; track individuals as they walk in public places (airports, train stations, stores); enhance profiles through the monitoring of consumer behaviour in stores; read the details of clothes and accessories worn and medicines carried by customers are all examples of uses of RFID technology that give rise to privacy concerns.” http://europa.eu.int/comm/justice_home/fsj/privacy/docs/wpdocs/2005/wp105_en.pdf
  • 20. Overview
    • “Dark scenarios”
    • Privacy principles and laws
    • Designing privacy in
  • 21. Security not enough
    • Security is necessary but not sufficient for privacy
    • Magical crypto fairy dust will not solve your privacy problems
    • "those who think that their problem can be solved by simply applying cryptography don't understand cryptography and don't understand their problem" (Needham/Lampson)
  • 22. Pharmaceutical RFID trials (Enterprise Privacy Group)
    • Big pharmaceutical company wants to track medicines through the supply chain
    • Use RFID to monitor packages as they transit from factory through distribution chain to individual pharmacies
    • Pharmacy can check authenticity of specific medicines against manufacturer database
  • 23. How do we solve the “Viagra problem”?
    • Disclosure: publicise code of practice to customers
    • Collection limitation: remove tag at point of sale
    • Use limitation: kept entirely separate from loyalty schemes
    • Data quality: integrate into supply chain management
    • Accountability: pharmacist responsible to customers
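A minimal sketch of how the authenticity check and the collection-limitation safeguard above could fit together, assuming a manufacturer-side database of issued tag IDs; every class and method name here is hypothetical, not a real API:

```python
# Hypothetical model of the pharmaceutical RFID safeguards above:
# the pharmacy checks a package's tag against the manufacturer's
# database of issued IDs (authenticity), then "kills" the tag at
# point of sale (collection limitation) so it cannot be read once
# the customer leaves. Names are illustrative only.

class ManufacturerDB:
    def __init__(self) -> None:
        self._issued: set[str] = set()

    def issue(self, tag_id: str) -> None:
        self._issued.add(tag_id)

    def is_authentic(self, tag_id: str) -> bool:
        return tag_id in self._issued

class Pharmacy:
    def __init__(self, db: ManufacturerDB) -> None:
        self.db = db
        self.killed: set[str] = set()   # tags deactivated at sale

    def sell(self, tag_id: str) -> bool:
        if tag_id in self.killed or not self.db.is_authentic(tag_id):
            return False                # reject counterfeit or already-sold tag
        self.killed.add(tag_id)         # tag killed at point of sale
        return True

db = ManufacturerDB()
db.issue("PKG-001")
shop = Pharmacy(db)
print(shop.sell("PKG-001"))   # True: genuine package, tag now killed
print(shop.sell("PKG-999"))   # False: not in manufacturer database
```

The design point is that the kill step happens at the till, so the privacy property does not depend on downstream parties (loyalty schemes, other retailers) behaving well.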
  • 24. Government data sinks
    • If data can be collected about individuals, there will always be government pressure to store and access that information
    • E.g. PATRIOT Act National Security Letters, NSA activities within the US, EU data retention directive
    • Data minimisation is a key requirement for privacy in this legislative environment
    • Encryption is no protection if governments can compel decryption
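Data minimisation can be sketched as a filter applied before anything reaches storage: if the data is never collected, it cannot later be retained, stolen, or compelled. The field names below are invented for illustration:

```python
# Toy data-minimisation filter: keep only the fields the service
# genuinely needs before a record is ever stored. Field names are
# made up for this sketch.

REQUIRED_FIELDS = {"postcode_district", "age_band"}

def minimise(record: dict) -> dict:
    """Drop every field not strictly required by the service."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

raw = {
    "name": "A. Smith",
    "postcode_district": "WC1E",
    "age_band": "30-39",
    "location_trace": ["51.52,-0.13", "51.53,-0.12"],
}
print(minimise(raw))  # only the two required fields survive
```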
  • 25. They have the technology…
    • "Our survey of 128 federal departments and agencies on their use of data mining shows that 52 agencies are using or are planning to use data mining. These departments and agencies reported 199 data-mining efforts, of which 68 are planned and 131 are operational.” –Government Accountability Office
    • “[Techniques that] look at people's behavior to predict terrorist intent are so far from reaching the level of accuracy that's necessary that I see them as nothing but civil liberty infringement engines." –Jeff Jonas, chief scientist, IBM Entity Analytics
  • 26. Conclusions
    • Privacy choices are essential for consumer acceptance of technologies and regulatory compliance
    • Privacy needs to be built in, not slapped on
    • Key design choice: at every stage, minimise personal information collected
