Marc Langheinrich: Privacy in Ubiquitous Computing, 5/29/2010

PRIVACY IN UBIQUITOUS COMPUTING
Marc Langheinrich
University of Lugano (USI), Switzerland

Today's Menu
Understanding Privacy
  Definitions
  1. History and legal aspects
  2. Motivating privacy
Technical Approaches
  Challenges
  1. Location privacy
  2. RFID privacy
Privacy in Ubiquitous Computing
UNDERSTANDING PRIVACY

A Privacy Definition
"The right to be let alone."
  Warren and Brandeis, 1890 (Harvard Law Review)
"Numerous mechanical devices threaten to make good the prediction that
'what is whispered in the closet shall be proclaimed from the housetops'"
Image source: http://historyofprivacy.net/RPIntro3-2009.htm
Technological Revolution, 1888
George Eastman, 1854-1932
Image Source: Wikipedia; Encyclopedia Britannica (Student Edition)

Information Privacy
"The desire of people to choose freely under what circumstances and to
what extent they will expose themselves, their attitude and their
behavior to others."
  Alan Westin, 1967
  Privacy And Freedom, Atheneum
Privacy Facets
Bodily Privacy
  Strip Searches, Drug Testing, ...
Territorial Privacy
  Privacy Of Your Home, Office, ...
Communication Privacy
  Phone Calls, (E-)mail, ...
Informational Privacy
  Personal Data (Address, Hobbies, ...)

Privacy Invasions
When Do We Feel that Our Privacy Has Been Violated?
Perceived privacy violations result from the crossing of "privacy borders"
Privacy Boundaries (Gary T. Marx, MIT)
  1. Natural
  2. Social
  3. Spatial / temporal
  4. Transitory
Privacy Borders (Marx)
Natural
  Physical limitations (doors, sealed letters)
Social
  Group confidentiality (doctors, colleagues)
Spatial / Temporal
  Family vs. work, adolescence vs. midlife
Transitory
  Fleeting moments, unreflected utterances

Privacy in Ubiquitous Computing
1. HISTORY AND LEGAL ISSUES
Privacy Law History
Justices Of The Peace Act (England, 1361)
  Sentences for Eavesdropping and Peeping Toms
"The poorest man may in his cottage bid defiance to all the force of
the crown. It may be frail; its roof may shake; ... - but the king of
England cannot enter; all his forces dare not cross the threshold of
the ruined tenement"
  William Pitt the Elder (1708-1778)
First Modern Privacy Law in the German State Hesse, 1970

Fair Information Principles (FIP)
Drawn up by the OECD ("Organisation for Economic Co-operation and Development"), 1980
  Voluntary guidelines for member states
  Goal: Ease transborder flow of goods (and information!)
Eight Principles
  1. Collection Limitation      5. Security Safeguards
  2. Data Quality               6. Openness
  3. Purpose Specification      7. Individual Participation
  4. Use Limitation             8. Accountability
Core principles of modern privacy laws world-wide
Source: Robert Gellman, "Fair Information Practices: A Basic History", http://bobgellman.com/rg-docs/rg-FIPshistory.pdf
See also http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html
Laws and Regulations
Privacy laws and regulations vary widely throughout the world
US has mostly sector-specific laws, with relatively minimal protections
  Self-regulation favored over comprehensive privacy laws
  Fear that regulation hinders e-commerce
Europe has long favored strong, omnibus privacy laws
  Often a single framework for both public & private sector
  Privacy commissions in each country (some countries have national and state commissions)

US Public Sector Privacy Laws
Federal Communications Act, 1934, 1997 (Wireless)
Omnibus Crime Control and Safe Street Act, 1968
Bank Secrecy Act, 1970
Privacy Act, 1974
Right to Financial Privacy Act, 1978
Privacy Protection Act, 1980
Computer Security Act, 1987
Family Educational Right to Privacy Act, 1993
Electronic Communications Privacy Act, 1994
Freedom of Information Act, 1966, 1991, 1996
Driver's Privacy Protection Act, 1994, 2000
US Private Sector Laws
Fair Credit Reporting Act, 1971, 1997
Cable TV Privacy Act, 1984
Video Privacy Protection Act, 1988
Health Insurance Portability and Accountability Act, 1996
Children's Online Privacy Protection Act, 1998
Gramm-Leach-Bliley Act (Financial Institutions), 1999
Controlling the Assault of Non-Solicited Pornography And Marketing Act (CAN-SPAM), 2003

EU Privacy Law
EU Data Protection Directive 1995/46/EC
Sets a benchmark for national law for processing personal information in electronic and manual files
Expands on OECD Fair Information Practices:
  no automated adverse decisions
  minimality principle
  retention limitation
  special provisions for "sensitive data"
  compliance checks
Facilitates data-flow between Member States and restricts export of personal data to "unsafe" non-EU countries
National Implementation
Directive(s) Transcribed Into National Law(s)
  Fines for countries that fail to meet the deadline
National Laws Can Be Stricter Than Directive
  Directive only sets a baseline privacy level
  Still 27+3 national regimes (EU+EEA)!
Data Protection Commissioner Oversight
  Significantly different powers in each country: some only "advise", others can block legislation
EEA: European Economic Area (Norway, Liechtenstein, Iceland)
EFTA: European Free Trade Association (EEA+Switzerland)
Related EU Directives
Telecommunications Directive 97/66/EC
  Added specific rules for telecommunications systems
Privacy & Electronic Comm. Directive 2002/58/EC
  Updates 97/66 to cover "electronic communications"
Data Retention Directive 2006/24/EC
  Adds provisions for retaining {call, email, Web}-logs
  Data must be stored for 6-24 months
  Member states can go beyond what 2006/24 mandates
  See, e.g., https://wiki.vorratsdatenspeicherung.de/Transposition for current status of transposition

Privacy in Ubiquitous Computing
2. MOTIVATING PRIVACY
Why Privacy?
"A free and democratic society requires respect for the autonomy of
individuals, and limits on the power of both state and private
organizations to intrude on that autonomy... privacy is a key value
which underpins human dignity and other key values such as freedom of
association and freedom of speech..."
  Preamble To Australian Privacy Charter, 1994
"All this secrecy is making life harder, more expensive, dangerous and
less serendipitous"
  Peter Cochrane, Former Head Of BT Research
"You have no privacy anyway, get over it"
  Scott McNealy, CEO Sun Microsystems, 1995

The NTHNTF-Argument
"If you've got nothing to hide, you've got nothing to fear"
  UK Gov't Campaign Slogan for CCTV (1994)
Assumption
  Privacy is (mostly) about hiding (evil/bad/unethical) secrets
Implications
  Privacy protects wrongdoers (terrorists, child molesters, ...)
  No danger for law-abiding citizens
  Society overall better off without it!
"But I've Got Nothing to Hide!"
Do you?
Arson Near Youth House Niederwangen, CH (Dec. 2009)
  At scene of crime: tools from supermarket chain
  Court ordered disclosure of all 133 consumers who bought such items
  on their supermarket loyalty card (8/2004)
  (Arsonist not yet found)
"Give me six lines written by the most honorable of men, and I will
find an excuse in them to hang him"
  Armand Jean du Plessis, 1585-1642 (a.k.a. Cardinal de Richelieu)
Issue: Profiles
Allow Inferences About You
  May or may not be true (re. AOLStalker)!
May Categorize You
  High spender, music aficionado, credit risk
May Offer Or Deny Services
  Rebates, different prices, privileged access
"Social Sorting" (Lyon, 2003)
  Opaque decisions "channel" life choices
Image Sources: http://www.jimmyjanesays.com/sketchblog/paperdollmask_large.jpg and
http://www.queensjournal.ca/story/2008-03-14/supplement/keeping-tabs-personal-data/

Why Privacy Law? (Prof. Lawrence Lessig, Stanford Law School)
As Empowerment
  "Ownership" of personal data
As Utility
  Protection from nuisances (e.g., spam)
As Dignity
  Balance of power ("nakedness")
As Constraint Of Power
  Limits enforcement capabilities of the ruling elite
As By-Product
  Residue of inefficient collection mechanisms
Source: Lawrence Lessig, Code and Other Laws Of Cyberspace.
Example: Search And Seizures
4th Amendment Of US Constitution
"The right of the people to be secure in their persons, houses,
papers, and effects, against unreasonable searches and seizures, shall
not be violated, and no warrants shall issue, but upon probable cause,
supported by oath or affirmation, and particularly describing the
place to be searched, and the persons or things to be seized."

Search & Seizures, 21st Century
All Home Software Configured By Law To Monitor For Illegal Activities
  Fridges detect stored explosives, PCs scan hard disks for illegal data, knives report stabbings
Non-illegal Activities NOT Communicated
  Private conversations and actions remain private
  Only illegal events reported to police
No Nuisance Of Unjustified Searches
Privacy As Utility? Privacy As Dignity?
  Compatible with the 4th Amendment?
Source: Lawrence Lessig, Code and Other Laws Of Cyberspace.
Not Orwell, But Kafka!
The Information Society
"More transactions will tend to be recorded
The records will tend to be kept longer
Information will tend to be given to more people
More data will tend to be transmitted over public communication channels
Fewer people will know what is happening to the data
The data will tend to be more easily accessible
Data can be manipulated, combined, correlated, associated and analysed
to yield information which could not have been obtained without the
use of computers"
  Paul Sieghart: Privacy and Computers. London, Latimer, 1976, pp. 75-76
  (Portrait by Paul Benny)
Ubicomp Privacy Implications
Data Collection ("more transactions")
  Scale (everywhere, anytime)
  Manner (inconspicuous, invisible)
  Motivation (context!)
Data Types ("not without computers")
  Observational instead of factual data
Data Access ("more easily accessible")
  "The Internet of Things"

Changing the Playing Field
Ubicomp: The Reversal of Defaults (Ron Rivest, MIT)
  What was once hard to copy is now trivial to duplicate
  What was once forgotten is now stored forever
  What was once private is now public
Challenges for Society
  New ways of public/private life?
  New balance between the individual and society?
  Who is in charge?
FIP Challenges in Ubicomp
How to inform subjects about data collections?
  Unintrusive but noticeable
How to provide access to stored data?
  Who has it? How much of this is "my data"?
How to ensure confidentiality and authenticity?
  Without alienating the user (think "usability")!
How to minimize data collection?
  What part of the "context" do we really need?
How to obtain consent from data subjects?
  Missing UIs? Do people understand implications?

Border Crossings in Ubicomp
Smart appliances (natural borders)
  "Spy" on you in your own home
Family intercom (social borders)
  Grandma knows when you're home
Consumer profiles (temporal borders)
  Span time & space
"Memory amplifier" (transitory borders)
  Records careless utterances
A Brief Intro To
LOCALIZATION

Why Location Information?
Positioning
  e.g., emergencies ("You Are Here")
Navigation and Routing
  for mobile devices
Logistics
  tracking moving objects, monitoring, ...
Resource optimization, energy savings
  turn down the heating when I am far away
Location-based services
Location slides courtesy of F. Mattern: Ubiquitous Computing Lecture, ETH Zurich
Location Models
Geometric ("physical")
  based on a reference coordinate system ("grid based")
  locations and located objects: points, areas, volumes - sets of coordinate-tuples
Symbolic
  topology (contained, adjacent, ...), typically hierarchically organized (e.g., postal address)
  human-friendly, but needs to be maintained
  names depend on the application domain
  reverse mapping (symbolic to physical) may not be unique
  limited spatial resolution

Location Technologies
Various location technologies exist
No technology is right for every situation; different considerations:
  view
  cost
  geometric accuracy
  scalability
  indoor / outdoor
  private / public
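To make the symbolic model concrete, a hierarchical "contained in" query can be sketched in a few lines. The location names and the `parents` table below are invented for illustration; a real deployment would maintain this mapping per application domain, as the slide notes.

```python
# A minimal sketch of a symbolic location model: locations form a
# hierarchy (e.g., room -> floor -> building -> campus), and
# "contained in" is answered by walking up the parent chain.
parents = {
    "room-101": "floor-1",
    "floor-1": "building-A",
    "building-A": "campus",
}

def contained_in(location, area):
    """True if `location` lies within `area` in the symbolic hierarchy."""
    while location is not None:
        if location == area:
            return True
        location = parents.get(location)
    return False

print(contained_in("room-101", "building-A"))  # True
print(contained_in("floor-1", "room-101"))     # False: containment is directional
```

Note that the reverse mapping (symbolic to physical) would require storing geometry per node, which is exactly the maintenance burden the slide warns about.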
Loc. Technology Characterization
Absolute Positioning
  w.r.t. some reference system
Relative Positioning
  e.g., measure movement of object
Tagged (locate a marker)
  Self-positioning: object knows its position
  Remote positioning: system is aware of object position
Untagged
  e.g., vision

Absolute Positioning: Geometry
Triangulation (AOA)
  by taking the bearings of an object from fixed points
Trilateration (TOA)
  also called "spherical positioning"
  by measuring the distance
Multilateration (TDOA)
  also called "hyperbolic positioning"
  by comparing relative distances
7
8. Marc Langheinrich: Privacy in Ubiquitous Computing 5/29/2010
Angle of Arrival (AOA)
Angle between sender and receiver
Needs only 2 transmitters for 2-D positioning!
Highly range dependent
  Small measurement errors can lead to big inaccuracies at large distances
Examples: VOR (VHF Omnidirectional Range, used in aviation), GSM sector

Time of Arrival (TOA)
Delay between sender and receiver
  propagation time (3 stations for 2-D positioning)
One-way time: time synchronization needed
  accurate, stable clocks
  or 2 signals having different velocity
  or additional time reference
Round-trip time
  no synchronization
Examples: GPS, Radar
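The trilateration geometry behind TOA can be sketched directly: given distances to three non-collinear stations, subtracting the circle equations pairwise yields a small linear system for the position. Station coordinates and the helper name `trilaterate` are illustrative, not from the lecture.

```python
import math

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Solve for (x, y) given three station positions and distances.

    Subtracting circle 1 from circles 2 and 3 cancels the quadratic
    terms, leaving two linear equations in x and y.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero as long as stations are not collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Ground truth (3, 4); distances computed from it, so the solver should
# recover (3, 4) up to floating-point error.
p1, p2, p3 = (0.0, 0.0), (10.0, 0.0), (0.0, 10.0)
truth = (3.0, 4.0)
d = [math.dist(p, truth) for p in (p1, p2, p3)]
print(trilaterate(p1, p2, p3, *d))
```

With noisy real-world distances, more than three stations and a least-squares fit would be used instead of this exact solve.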
Measuring Time-of-Flight: Trilateration with Two Different Velocities
Radio channel is used to synchronize the sender and receiver (over short distances)
Time-of-flight of the acoustic signal is determined by comparing arrival of RF and acoustic signals
  ~3 ns/m for electromagnetic (i.e., RF) signals
  ~3 ms/m for sound (6 orders of magnitude difference!)
(Figure: sender with radio, CPU, and speaker; receiver with radio, CPU, and microphone.)
Source: FU Berlin
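The two-velocity trick reduces to a one-liner: over room distances the RF pulse arrives essentially instantly, so the gap until the acoustic pulse arrives is the acoustic time of flight. This is a sketch under that assumption; the speed of sound constant is approximate and temperature dependent.

```python
# RF + ultrasound ranging sketch: the radio pulse marks "time zero",
# and the delay until the acoustic pulse arrives gives the distance.
V_SOUND = 343.0  # m/s in air (approx., varies with temperature)

def distance_from_gap(rf_arrival_s, acoustic_arrival_s):
    """Distance in meters from the RF-to-acoustic arrival gap."""
    time_of_flight = acoustic_arrival_s - rf_arrival_s
    return time_of_flight * V_SOUND

# A 10 ms gap corresponds to ~3.43 m, matching the slide's rule of
# thumb of roughly 3 ms per meter for sound.
print(distance_from_gap(0.0, 0.010))  # ≈ 3.43
```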
Time Difference of Arrival (TDOA)
Receivers compare the time difference of arrival of a signal sent from an unknown location X
In 2D: at least 2 hyperbolae required
  needs 3 receivers A, B, C
Synchronization between reference stations required
Example: mobile phones

Received Signal Strength Indicator (RSSI)
Signal Strength As Distance Measure?
In theory, signal strength (RSSI) correlates with distance
But: various sources of errors (multipath, fading, etc.)
  not a monotonic function!
Source: Victor Bahl, Microsoft Research
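The "RSSI correlates with distance" idea usually means the log-distance path-loss model: received power in dBm falls linearly with log10 of distance, so the model can be inverted to estimate range. The reference RSSI and path-loss exponent below are invented calibration values; in practice both depend heavily on the environment, which is exactly why the slide warns about error sources.

```python
# Log-distance path-loss sketch: invert RSSI (dBm) to a distance
# estimate. RSSI_0 and N_EXP are illustrative calibration constants.
RSSI_0 = -40.0  # RSSI measured at the 1 m reference distance (dBm)
N_EXP = 2.5     # path-loss exponent (~2 in free space, 3-4 indoors)

def distance_from_rssi(rssi_dbm):
    """Estimated distance in meters under the log-distance model."""
    return 10 ** ((RSSI_0 - rssi_dbm) / (10 * N_EXP))

print(distance_from_rssi(-40.0))            # → 1.0 (the reference distance)
print(round(distance_from_rssi(-65.0), 2))  # → 10.0
```

Multipath and fading make real measurements scatter widely around this curve, which motivates the fingerprinting approach described next.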
8
9. Marc Langheinrich: Privacy in Ubiquitous Computing 5/29/2010
Practical Difficulties with RSSI
Path loss characteristics depend on environment (1/r^n)
Shadowing depends on environment
Short-scale fading due to multipath
  adds random high-frequency component with huge amplitude (30-60 dB)
  mobile nodes might average out fading, but static nodes can be stuck in a deep fade
(Figure: signal strength vs. distance, decomposed into path loss, shadowing, and fading.)

Fingerprinting
Have I seen this before?
  correlation with past observations
  need to keep track of environmental properties
  works with: vision, radio signals, etc.
Requires learning phase
  connect true locations with observations in database
  e.g., tabulate <location, signal strength> information
Recognition phase
  make observations (e.g., signal strength from base stations)
  find entry that "best" matches the observation
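The learning and recognition phases above can be sketched as a nearest-neighbor lookup in signal space. The location names and RSSI vectors (one value per base station) are made-up illustrative values for the tabulated `<location, signal strength>` database the slide describes.

```python
# Fingerprinting sketch. Learning phase: a table of
# <location, signal-strength vector> entries. Recognition phase:
# return the entry closest to a new observation (nearest neighbor).
fingerprints = {
    "lobby":     (-45, -70, -80),
    "office":    (-75, -50, -85),
    "cafeteria": (-80, -82, -48),
}

def locate(observation):
    """Best-matching location for an observed RSSI vector."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(fingerprints, key=lambda loc: dist2(fingerprints[loc], observation))

print(locate((-44, -72, -79)))  # → lobby
```

Because the database encodes the environment itself, multipath is no longer a pure error source; but as the next slide notes, the map must be periodically retrained as the environment changes.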
Signal Strength Fingerprinting
Map of signal distribution
  measured, or model-calculated
Errors
  obstacles
  multipath
Must be periodically retrained, as environment changes (base stations)

Summary: Absolute Positioning
TOA - time of arrival
TDOA - time difference of arrival
AOA - angle of arrival
Absolute positioning methods do not rely on knowledge of previous positions
Relative Positioning
Distance
  distance itself (odometer)
  velocity (speedometer)
  acceleration: x = ∫∫ a(t) dt dt
  height (e.g., barometer)
Orientation in space
  gyroscope (rigid in space)
Inertial Navigation System (INS)
  used in aviation
Car navigation
  (also uses compass)

Privacy in Ubiquitous Computing
LOCATION PRIVACY
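The double integral x = ∫∫ a(t) dt dt can be sketched numerically with simple Euler steps, which also shows why inertial navigation drifts: any sensor bias is integrated twice and grows quadratically. Sample values are illustrative.

```python
# Dead-reckoning sketch: integrate acceleration twice (Euler method)
# to recover position along one axis.
def dead_reckon(accels, dt):
    """Position after integrating a list of acceleration samples."""
    velocity, position = 0.0, 0.0
    for a in accels:
        velocity += a * dt       # first integration:  a -> v
        position += velocity * dt  # second integration: v -> x
    return position

# 1 m/s^2 constant for 1 s, sampled at 100 Hz: the analytic answer is
# x = 0.5 * a * t^2 = 0.5 m; Euler integration lands close (≈ 0.505).
samples = [1.0] * 100
print(dead_reckon(samples, 0.01))
```

A real INS combines this with gyroscope data to rotate accelerations into a world frame, and periodically corrects drift against an absolute fix (e.g., GPS).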
Location Privacy
"... the ability to prevent other parties from learning one's current
or past location." (Beresford and Stajano, 2003)
"It's not about where you are... it's where you have been!"
  Gary Gale, Head of UK Engineering for Yahoo! Geo Technologies

Motivating Disclosure
Why Share Your Location?
  By-product of positioning technology (e.g., cell towers, WiFi, ...)
  Required to use service (local search, automated payment for toll roads, ...)
  Social benefits (let friends and family know where I am, finding new friends, ...)
Why NOT Share Your Location?
  Location profiles reveal/imply activities, interests, identity
Location Implications
Places I Go
  Where I Live / Work
  Who I Am (Name)
  Hobbies/Interests/Memberships
People I Meet
  My Social Network
Profiling, e.g.,
  ZIP-Code: implies income, ethnicity, family size

Location Privacy Technology
Many Proposals
  Laws/regulations and audits (enterprise privacy)
  Anonymization ("k-anonymity")
  Obfuscation
  Rule-based access control
Privacy Model?
  Assumption: Less location disclosure means more privacy
(Krumm, 2008) provides an overview of the state of the art (John Krumm, Microsoft Research)
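The "k-anonymity" idea listed above is often realized by spatial cloaking: report a grid cell instead of exact coordinates, coarsening the cell until at least k users fall inside it. The positions, grid scheme, and helper name `cloak` below are an illustrative sketch, not the specific algorithm from any cited paper.

```python
# Spatial cloaking sketch for location k-anonymity: grow the reported
# grid cell until it contains at least k users.
def cloak(user_pos, all_positions, k, start_cell=1.0, max_cell=1024.0):
    cell = start_cell
    while cell <= max_cell:
        cx, cy = int(user_pos[0] // cell), int(user_pos[1] // cell)
        inside = [p for p in all_positions
                  if int(p[0] // cell) == cx and int(p[1] // cell) == cy]
        if len(inside) >= k:
            # report the whole cell (origin + size), not the exact point
            return (cx * cell, cy * cell, cell)
        cell *= 2  # coarsen and retry
    return None  # cannot guarantee k-anonymity within max_cell

positions = [(1.2, 1.1), (1.9, 1.4), (3.6, 2.8), (9.5, 9.1)]
print(cloak((1.2, 1.1), positions, k=3))  # → (0.0, 0.0, 4.0)
```

The trade-off is visible directly: larger k forces coarser cells, degrading service quality, which is the "less disclosure means more privacy" assumption in action.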
Location Anonymity [Naïve Approach]
Use random IDs that change periodically
  Trivial to trace
Might As Well Use Pseudonyms
  Since the naïve approach is trivial to pseudonymize anyway - any better?
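Why periodically changing random IDs is "trivial to trace" can be sketched directly: if sightings are frequent relative to movement, an observer links each old ID to the nearest new one, effectively reconstructing a stable pseudonym. The IDs and coordinates below are invented illustrative values.

```python
# Linking attack sketch: match old pseudonyms to new ones by spatial
# proximity between consecutive location updates.
def link(last_positions, new_sightings):
    """Map each old pseudonym to the nearest newly-seen pseudonym."""
    links = {}
    for old_id, (px, py) in last_positions.items():
        links[old_id] = min(
            new_sightings,
            key=lambda nid: (new_sightings[nid][0] - px) ** 2
                          + (new_sightings[nid][1] - py) ** 2)
    return links

# Two users change IDs between updates, but barely move in between.
before = {"id-17": (0.0, 0.0), "id-42": (100.0, 100.0)}
after = {"id-93": (100.5, 99.8), "id-08": (0.3, 0.1)}
print(link(before, after))  # → {'id-17': 'id-08', 'id-42': 'id-93'}
```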
Why Pseudonyms Don't Work
Observation Identification (OI) Attack
  Correlate a single identifiable observation with a location pseudonym
  ATM use @ location -> Name for pseudonym
Restricted Space Identification (RSI) Attack
  Works without direct observations
  Uses known mapping from place to name
  Home location -> Home address -> Name (Phonebook)
Pseudonymous User Trace
Img src: [Beresford, Stajano 2003]
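The Restricted Space Identification attack can be sketched in a few lines: the most frequent night-time position in a pseudonymous trace is very likely the user's home, and home addresses map to names via a phonebook-style directory. Timestamps, coordinates, and the helper name `probable_home` are illustrative.

```python
from collections import Counter

# RSI attack sketch: find the likely home location of a pseudonymous
# trace by taking the most common position during night hours.
def probable_home(trace, night=(0, 6)):
    """`trace` is a list of (hour_of_day, position) sightings."""
    night_cells = [pos for hour, pos in trace if night[0] <= hour < night[1]]
    return Counter(night_cells).most_common(1)[0][0]

trace = [(2, (47.37, 8.54)), (3, (47.37, 8.54)), (9, (47.41, 8.50)),
         (14, (47.41, 8.50)), (1, (47.37, 8.54))]
print(probable_home(trace))  # → (47.37, 8.54); a reverse address lookup
                             #   would then yield a name
```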
Location Mix Zones
Address Restricted Space Identification Attacks
How to change pseudonyms?
Idea: Designate "Mix Zones" With No Tracking / LBS Active
  Change pseudonyms only within a mix zone
  (Beresford and Stajano, 2003) offer a probabilistic model for unlinkability in mix zones
Challenge: Where to set up such Mix Zones? And what if no one's there (late night)?

Location Obfuscation
Adding noise, perturbation, dummy traffic to location data
  Protects against attackers, but degrades service use
  (Krumm, 2007) showed that LOTS of obfuscation is needed
  Typically combined with rules to selectively adjust accuracy
Image Source: Krumm, J., Inference Attacks on Location Tracks, in Fifth International Conference
on Pervasive Computing (Pervasive 2007). 2007: Toronto, Ontario, Canada. pp. 127-143.
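The noise-based variant of obfuscation can be sketched by perturbing reported coordinates with Gaussian noise; the noise scale is the privacy/utility knob, and the Krumm (2007) result above says this scale must be large to defeat inference attacks. All values below are illustrative.

```python
import math
import random

# Obfuscation sketch: report a Gaussian-perturbed position instead of
# the true one. sigma_m trades privacy against service quality.
def obfuscate(x, y, sigma_m, rng):
    return x + rng.gauss(0, sigma_m), y + rng.gauss(0, sigma_m)

rng = random.Random(42)  # seeded so the sketch is reproducible
true_pos = (100.0, 200.0)
noisy = obfuscate(*true_pos, sigma_m=50.0, rng=rng)
error = math.dist(true_pos, noisy)
print(f"reported {noisy}, error {error:.1f} m")  # service sees a point
                                                 # roughly sigma away
```

Averaging many such reports still reveals the true position, which is one reason simple noise alone offers weaker protection than it first appears.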
Track Obfuscation
Location tracks are more difficult to fake! Faking requires
  Believable speeds (existing speed limits)
  Realistic start/end-points, trip times (duration, days)
  Suboptimal routes (human driver vs. route planner)
  Expected GPS noise (higher in urban environments)
Krumm, J., Realistic Driving Tracks for Location Privacy. In 7th International Conference on
Pervasive Computing (Pervasive 2009), Nara, Japan, Springer.

Summary: Location Privacy
Location is popular information to share
  Location-based Web search
  Friend finder, local recommendations
Location traces as a source for profiling
  Imply activities, interests, friends, $$, ...
Simple anonymization does not work
  Observation Identification Attack
  Restricted Space Identification Attack
Solutions? Mix Zones, Obfuscation, Dummy Traffic, ...
Privacy in Ubiquitous Computing
RFID PRIVACY
Img src: www.flickr.com/photos/nomeacuerdo/431060441/

Summary and Outlook
Beware the Techno Fallacies! (Gary T. Marx, MIT)
"if some is good, more is better"
"only the computer sees it"
"that has never happened"
"facts speak for themselves"
"if we have the technology, why not use it?"
"technology is neutral"
  Technology Is Neither Good Nor Bad. Nor Is It Neutral.
  Melvin C. Kranzberg, Georgia Tech (1917-1995)
Source: G. Marx, "Some Information Age Techno-Fallacies," Contingencies and Crisis Management, 11(1), March 2003, pp. 25-31.
See also http://www.spatial.maine.edu/~onsrud/tempe/marx.html

Take Home Message
Privacy is Not Just Secrecy and Seclusion!
  Privacy is a process, not a state
  Solution requires good understanding of the social, legal, and policy issues involved
Ubiquitous Computing Offers New Challenges
  Invisible, comprehensive, sensor-based, ...
Ubicomp (Privacy) Challenges
  User interface (notice, choice, consent)
  Protocols (anonymity, security, access)
  Social compatibility (privacy boundaries)
Ubiquitous Computing
John Krumm (ed.): Ubiquitous Computing Fundamentals. Taylor & Francis, 2009
  With contributions from: Roy Want, Jakob Bardram and Adrian Friday,
  Marc Langheinrich, A.J. Bernheim Brush, Alex S. Taylor, Aaron Quigley,
  Alexander Varshavsky and Shwetak Patel, Anind K. Dey, John Krumm

General Privacy Reading
David Brin: The Transparent Society. Perseus Publishing, 1999
Simson Garfinkel: Database Nation - The Death of Privacy in the 21st Century. O'Reilly, 2001
Lawrence Lessig: Code and Other Laws of Cyberspace. Basic Books, 2006. http://codev2.cc/
Privacy Law
Rotenberg: The Privacy Law Sourcebook 2004. EPIC, 2004
Privacy & Human Rights 2006. EPIC
Solove, Schwartz: Information Privacy Law. 3rd edition, Aspen, 2009

Privacy and Technology
Deborah Estrin (ed.): Embedded, Everywhere: A Research Agenda for Networked Systems of
Embedded Computers. National Academies Press, 2001. http://www.nap.edu/openbook.php?isbn=0309075688
Waldo, Lin, Millett (eds.): Engaging Privacy and Information Technology in a Digital Age.
National Academies Press, 2007.
Wright, Gutwirth, Friedewald, et al.: Safeguards in a World of Ambient Intelligence. Springer, 2008