Eran Toch: Designing for Privacy
A lecture at Microsoft Herzliya, Out-of-the-Box week. The lecture revolves around privacy threats in mobile computing and their remedies.

Usage Rights: CC Attribution-NonCommercial-NoDerivs License

Presentation Transcript

  • Designing for Privacy. Department of Industrial Engineering, http://toch.tau.ac.il/. Microsoft Herzliya, April 2013.
  • A Brief History of Privacy, from 300 BC to 1890: "the right to be let alone" (Samuel D. Warren and Louis D. Brandeis); controlling information and accessibility to others (Ruth Gavison).
  • Agenda: ① Privacy disasters ② The mobile privacy landscape ③ Is privacy important? ④ The privacy toolbox
  • 1. Privacy Disasters: What's the worst that can happen?
  • Remember Google Buzz?
  • Followers in Buzz ‣ Google suggested a list of followers to new users. ‣ The suggestions were the people who corresponded most with the user. ‣ By default, the list was open to the public and accessible through the user's profile page.
  • After 4 Days… ‣ Google canceled the automatic follower list, ‣ and removed Buzz's public profile completely.
  • After a Week… ‣ Lawsuits and FTC complaints were filed. ‣ Users abandoned Buzz quickly. ‣ Google agreed to pay $8.5 million and accepted considerable restrictions on its use of user data. ‣ Buzz was cancelled a year later.
  • 2. The Mobile Privacy Landscape
  • Privacy Spheres in Mobile Computing ‣ Physical Privacy: interference with the user's physical environment and attention. ‣ Data Privacy: collecting and using the information gathered in the user's action sphere.
  • Information Threats ‣ Can other people find where the person is? ‣ And physically threaten the user or her property?
  • Identity Threats ‣ With only 4 locations of a person and a census database, 95% of the population can be uniquely identified. (Yves-Alexandre de Montjoye, César A. Hidalgo, Michel Verleysen, and Vincent D. Blondel. Unique in the Crowd: The privacy bounds of human mobility. Nature, 2013.)
  • Social Threats ‣ A location can reveal what the user does and who the user meets. ‣ This information is shared with the social network.
  • Physical Privacy: the extent to which the phone interferes with the user's physical context, or draws the attention of the user or the environment. Examples: sounds and notifications, Vellux, beepers.
  • Concerns in Information Privacy (Tsai, Janice, Patrick Kelley, Lorrie Cranor, and Norman Sadeh. "Location-sharing technologies: Privacy risks and controls." TPRC, 2009.)
  • 3. Is Privacy Important Anymore?
  • "You already have zero privacy anyway. Get over it." Scott McNealy, Sun Microsystems CEO, 1999.
  • Do Users Actually Care? Shoppers at a mall were offered a $10 discount card, and an extra $2 discount if they agreed to share their shopping data. 50% declined the extra offer. (Source: The New York Times, http://www.nytimes.com/2013/03/31/technology/web-privacy-and-how-consumers-let-down-their-guard.html?smid=pl-share)
  • But Wait… Shoppers were offered a $12 discount card and the option of trading it in for a $10 card to keep their shopping record private. 90% chose to trade privacy for $2.
  • Privacy is not Abstract Anymore: Google Buzz, Facebook, Path. People care about concrete privacy threats that impact their actual lives.
  • What do users actually do? Facebook users in an American university.
  • Professional and Ethical Duty
  • Legal Duty
  • It is a Basic Human Need: it's impossible to live without a safe space for experimentation, growth, and personal expression.
  • 4. The Privacy Toolbox
  • Types of Tools, on a spectrum from policy-based to architecture-based: Notice, Choice, Access and Recourse, Data Minimization, Privacy Guarantee. (Source: Marc Langheinrich. 2001. Privacy by Design: Principles of Privacy-Aware Ubiquitous Systems. In Proceedings of the 3rd International Conference on Ubiquitous Computing (UbiComp 2001).)
  • Notice ‣ Be open with the user. ‣ Tell the user what happens to the data, at the right moment and in the right context.
  • What is a Good Notice? ‣ A good notice enables the user to make an informed decision. ‣ We need to ask: what is the default? What are the implications? Is there an undo?
  • Notice: tell the user what happens to the data. http://cups.cs.cmu.edu/privacyLabel (Privacy as Part of the App Decision-Making Process. Patrick Gage Kelley, Lorrie Faith Cranor, and Norman Sadeh. CHI 2013.)
  • Choice ‣ Provide the user with meaningful control over the information: discriminative, easy to use, and working out of the box. ‣ A simple test: the data belongs to the user. Can she effectively exercise her ownership?
  • Discriminative Control: the Do Not Track (DNT) header requests that a web application disable its tracking or cross-site user tracking. http://ie.microsoft.com/testdrive/browser/donottrack/
  • Do Not Track: http://ie.microsoft.com/testdrive/browser/donottrack/
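Below is a minimal sketch of honoring the DNT header on the server side, written as a plain Python WSGI application. This is an illustration, not the talk's or Microsoft's implementation; the `record_analytics` hook is hypothetical.

```python
# A minimal sketch of respecting the DNT request header in a WSGI app.
# Run with: python -m wsgiref.simple_server (or any WSGI server).

def app(environ, start_response):
    # WSGI exposes the "DNT" request header as HTTP_DNT.
    dnt = environ.get("HTTP_DNT")
    tracking_allowed = dnt != "1"  # "1" means the user opted out of tracking

    if tracking_allowed:
        record_analytics(environ)  # hypothetical tracking hook

    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"tracking enabled" if tracking_allowed else b"tracking disabled"]

def record_analytics(environ):
    # Placeholder: a real application would log the visit here.
    pass
```

The point of the design is that honoring the header costs one conditional: the decision is the user's, expressed once in the browser, rather than negotiated per site.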
  • Non-Discriminative: Access to Locations ‣ Application-level limitations: ‣ not all locations are the same, ‣ not all situations are the same, ‣ not all information destinations are the same. ‣ The default is overpowering.
  • Control is Tough ‣ What happens when we ask the user to control complex sharing preferences? ‣ How can we balance usability and privacy?
  • Crowdsourcing Privacy Preferences ‣ Aggregator: collecting preferences and their underlying context. ‣ Modeler: building a model for the preferences according to a context. ‣ Personalizer: personalizing the model for a specific, given user. ‣ Application: using the preference model in a specific application. (From: Eran Toch, Crowdsourcing Privacy Management in Context-Aware Applications, Personal and Ubiquitous Computing, 2013.)
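The paper describes this pipeline conceptually; the following Python sketch is my own simplified rendering of the four stages, with all function names, the blending weight, and the threshold invented for illustration.

```python
# A rough sketch of the crowdsourcing pipeline: aggregate preferences
# per context, model them as a share rate, then personalize for one user.

from collections import defaultdict

def aggregate(observations):
    """Aggregator: collect (context, shared) observations per context."""
    by_context = defaultdict(list)
    for context, shared in observations:
        by_context[context].append(shared)
    return by_context

def model(by_context):
    """Modeler: estimate the crowd's share rate for each context."""
    return {ctx: sum(v) / len(v) for ctx, v in by_context.items()}

def personalize(crowd_model, user_votes, weight=0.5):
    """Personalizer: blend the crowd model with the user's own history."""
    personal = dict(crowd_model)
    for ctx, votes in user_votes.items():
        user_rate = sum(votes) / len(votes)
        personal[ctx] = weight * crowd_model.get(ctx, user_rate) + (1 - weight) * user_rate
    return personal

def should_share(personal_model, context, threshold=0.5):
    """Application: use the personalized model to decide on sharing."""
    return personal_model.get(context, 0.0) >= threshold

# Usage: 1 = agreed to share in that context, 0 = declined.
crowd = aggregate([("office", 1), ("office", 1), ("home", 0), ("home", 1)])
personal = personalize(model(crowd), {"home": [0, 0]})
print(should_share(personal, "home"))  # False: this user keeps home private
```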
  • Our User Study ‣ 30 users, 2 weeks. ‣ Smart-Spaces: tracking locations and activities. ‣ Participants were surveyed three times a day, ‣ asked about their willingness to share their location on a Likert scale.
  • Place Discrimination: some places are shared by almost everybody; some places are considered private. [Likert scale 1 to 5, from less likely to share to more likely to share]
  • Accuracy of Decision Strategies [Plot: accuracy vs. threshold for strategies A, M, and SM]
  • Defaults are Enormously Important ‣ People have a tendency to stick to the defaults: ‣ organ donation choices, ‣ access control policies, ‣ browser selection.
  • Generating Defaults (Oded Maimon, Ron Hirschprung, Eran Toch. Evaluating Bi-Directional Data Agent Applicability and Design in Cloud Computing Environment. In Proceedings of the 17th Industrial Engineering Conference, 2012.)
  • Testing the Defaults
  • Access and Recourse ‣ Privacy is a long-term relationship. ‣ Applications need to provide ongoing access to privacy data and controls. ‣ Meaningful recourse (helping with problems) is crucial for the user's security and trust.
  • Personal Data Centers
  • Privacy through Time ‣ Digital information is hardly ever erased. ‣ With search engines and timelines, it becomes more and more accessible. ‣ What are the consequences for user-controllable privacy?
  • Our Study ‣ Between-subjects user study (n=298). ‣ Analyzing differences between users, randomly assigned to four conditions: ‣ one month, ‣ one year, ‣ two years, ‣ more than two years. ‣ Using a custom Facebook application. (Eran Toch and Oshrat Rave-Ayalon. Understanding the Temporal Aspects of Sharing Preferences in Online Social Networks. Submitted to SOUPS 2013.)
  • Willingness to Share Over Time
  • Implications for Design: a default expiration time of 1.5 years.
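As a hedged illustration of what such a default could look like in code (the function and constant names are my assumptions, not an artifact of the study):

```python
# An illustrative sketch of the suggested default: content stops being
# shared 1.5 years after posting, unless the user chose otherwise.

from datetime import datetime, timedelta, timezone

DEFAULT_EXPIRATION = timedelta(days=int(365 * 1.5))  # roughly 1.5 years

def is_still_shared(posted_at, now=None, expiration=DEFAULT_EXPIRATION):
    """Return True while the post is within its sharing window."""
    now = now or datetime.now(timezone.utc)
    return now - posted_at < expiration

post_time = datetime(2013, 4, 1, tzinfo=timezone.utc)
print(is_still_shared(post_time, now=datetime(2014, 4, 1, tzinfo=timezone.utc)))  # True
print(is_still_shared(post_time, now=datetime(2015, 4, 1, tzinfo=timezone.utc)))  # False
```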
  • Data Minimization ‣ The best solution for privacy is trying not to know anything about the user. ‣ In most interesting applications, that's not possible. ‣ However, analyzing the minimal data requirements of an application is always worthwhile.
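One concrete minimization tactic, sketched below under the assumption that city-level accuracy suffices for the feature at hand, is to coarsen location on the device before it is ever transmitted:

```python
# A hedged example of data minimization: coarsen coordinates client-side
# if the application only needs approximate location.

def coarsen_location(lat, lon, decimals=2):
    """Round coordinates; 2 decimal places is roughly 1 km of precision."""
    return round(lat, decimals), round(lon, decimals)

# A precise GPS fix near Tel Aviv University, reduced before upload:
print(coarsen_location(32.113385, 34.804780))  # (32.11, 34.8)
```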
  • Anonymity Levels: a spectrum from Identified (more recognition) through Pseudo-anonymous to Anonymous (less recognition, stronger privacy guarantee).
  • Pseudo-anonymous Profiles
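A minimal sketch of what a pseudo-anonymous profile might look like, assuming a randomly generated token stands in for real-world identity; the structure is illustrative, not a design from the talk:

```python
# Personalization keyed on a random pseudonym, never a real identity.

import secrets

def new_pseudonym():
    """Generate an unlinkable random identifier for this profile."""
    return secrets.token_hex(16)

profiles = {}  # pseudonym -> preferences, with no real identity attached

pseudonym = new_pseudonym()
profiles[pseudonym] = {"language": "en", "share_location": False}
print(pseudonym, profiles[pseudonym])
```

The service can still personalize against the token; what it loses, by design, is the ability to tie the profile back to a person.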
  • Managing Identity ‣ Don't ask users to identify themselves. ‣ If users need a personalized service, rely on pseudo-anonymous identification. ‣ Use k-anonymity, l-diversity, t-closeness, and differential privacy to release user information.
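As a small worked example of the first of these techniques, the sketch below checks whether a table satisfies k-anonymity over a chosen set of quasi-identifiers; the record fields are made up for illustration.

```python
# Checking k-anonymity before releasing data: every combination of
# quasi-identifier values must occur in at least k records.

from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if each quasi-identifier combination appears in >= k records."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

records = [
    {"age_range": "20-30", "zip3": "691", "visits": 4},
    {"age_range": "20-30", "zip3": "691", "visits": 7},
    {"age_range": "30-40", "zip3": "691", "visits": 2},
]
print(is_k_anonymous(records, ["age_range", "zip3"], k=2))  # False: one lone record
```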
  • Architectural Choices [Diagram: a central server, labeled "the privacy bottleneck", connected to many clients]
  • Eran Toch, Department of Industrial Engineering, Tel Aviv University, Israel. http://toch.tau.ac.il/ erant@post.tau.ac.il