
Eran Toch: Designing for privacy


A lecture at Microsoft Hertzelia, Out-of-the-Box week. The lecture revolves around the privacy threats in mobile computing and their remedies.

Published in: Technology


  1. Designing for Privacy. Department of Industrial Engineering. Microsoft Hertzelia, April 2013.
  2. A Brief History of Privacy. 1890: “the right to be let alone” (Samuel D. Warren and Louis D. Brandeis). Privacy as controlling information and accessibility to others (Ruth Gavison).
  3. Agenda: ① Privacy disasters ② The mobile privacy landscape ③ Is privacy important? ④ The privacy toolbox
  4. 1. Privacy Disasters. What’s the worst that can happen?
  5. Remember Google Buzz?
  6. Followers in Buzz ‣ Google suggested a list of followers to new users. ‣ The suggestions were the people who corresponded most with the user. ‣ By default, the list was open to the public and accessible through the user’s profile page.
  7. After 4 Days… ‣ Google canceled the automatic follower list ‣ and removed Buzz’s public profile completely.
  8. After a Week… ‣ Lawsuits and FTC complaints were filed. ‣ Users abandoned Buzz quickly. ‣ Google agreed to pay $8.5 million and was restricted considerably with regard to user data. ‣ Buzz was cancelled a year later.
  9. 2. The Mobile Privacy Landscape
  10. Privacy Spheres in Mobile Computing. Physical privacy: interference with the physical environment and the user’s attention. Data privacy: collecting and using the information collected in the user’s action sphere.
  11. Information Threats ‣ Can other people find where the person is? ‣ And physically threaten the user or her property?
  12. Identity Threats ‣ With only 4 locations of a person ‣ and a census database, ‣ 95% of the population can be uniquely identified. (Yves-Alexandre de Montjoye, César A. Hidalgo, Michel Verleysen and Vincent D. Blondel, “Unique in the Crowd: The privacy bounds of human mobility”, Nature, 2013.)
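The flavor of the de Montjoye et al. result can be illustrated with a small sketch (the data, function, and sampling scheme are mine for illustration, not the paper's method): given a set of user traces, count how many are uniquely pinned down by a handful of their own spatio-temporal points.

```python
import random

def uniqueness(traces, k, seed=0):
    """Fraction of users whose trace is uniquely identified by k
    randomly sampled (place, hour) points drawn from that trace."""
    rng = random.Random(seed)
    unique = 0
    for trace in traces:
        points = rng.sample(sorted(trace), min(k, len(trace)))
        # How many traces in the dataset contain all k sampled points?
        matches = sum(1 for t in traces if all(p in t for p in points))
        if matches == 1:
            unique += 1
    return unique / len(traces)

# Toy traces of (antenna_cell, hour) observations.
traces = [
    {("A", 8), ("B", 12), ("C", 18), ("D", 22)},
    {("A", 8), ("B", 12), ("E", 18), ("F", 22)},
    {("G", 9), ("B", 12), ("C", 18), ("H", 23)},
]
print(uniqueness(traces, k=2))
```

Even in this toy dataset, most traces become unique after only a couple of points, which is the intuition behind the 95% figure.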
  13. Social Threats ‣ A location can tell: ‣ what the user does ‣ who the user meets. ‣ Information is shared with the social network.
  14. Physical Privacy: the extent to which the phone interferes with the physical context of the user, or draws the attention of the user or the environment. Examples: sounds and notifications, Vellux, beepers.
  15. Concerns in Information Privacy. (Tsai, Janice, Patrick Kelley, Lorrie Cranor, and Norman Sadeh. “Location-sharing technologies: Privacy risks and controls.” TPRC, 2009.)
  16. 3. Is Privacy Important Anymore?
  17. “You already have zero privacy anyway. Get over it.” Scott McNealy, Sun Microsystems CEO, 1999.
  18. Do Users Actually Care? Shoppers at a mall were offered a $10 discount card, and an extra $2 discount if they agreed to share their shopping data. 50% declined the extra offer. (Source: The New York Times.)
  19. But Wait… Shoppers were offered a $12 discount card and the option of trading it in for a $10 card to keep their shopping record private. 90% chose to trade privacy for $2.
  20. Privacy is not Abstract Anymore. Google Buzz, Facebook, Path. People care about concrete privacy threats that impact their actual lives.
  21. What Do Users Actually Do? Facebook users at an American university.
  22. Professional and Ethical Duty
  23. Legal Duty
  24. It is a Basic Human Need. It’s impossible to live without a safe space for experimentation, growth, and personal expression.
  25. 4. The Privacy Toolbox
  26. Types of Tools, on a spectrum from policy-based to architecture-based: Notice, Choice, Access and Recourse, Data Minimization, Privacy Guarantee. (Source: Marc Langheinrich. 2001. “Privacy by Design: Principles of Privacy-Aware Ubiquitous Systems.” In Proceedings of the 3rd International Conference on Ubiquitous Computing (UbiComp 2001).)
  27. Notice ‣ Be open with the user. ‣ Tell the user what happens to the data, at the right moment and in the right context.
  28. What is a Good Notice? ‣ A good notice enables the user to make an intelligent decision. ‣ We need to ask: What is the default? What are the implications? Is there an undo?
  29. Notice as Part of the App Decision-Making Process: tell the user what happens to the data. (Patrick Gage Kelley, Lorrie Faith Cranor, and Norman Sadeh. CHI 2013.)
  30. Choice ‣ Provide the user with meaningful control over the information: ‣ discriminative ‣ easy to use ‣ works out of the box. ‣ A simple test: the data belongs to the user. Can she effectively exercise her ownership?
  31. Discriminative Control. The Do Not Track (DNT) header requests that a web application disable its tracking or cross-site tracking of the individual user.
  32. Do Not Track
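As a concrete (if minimal) sketch of honoring DNT on the server side, a request handler can inspect the header before enabling any tracking. The WSGI app below is illustrative only; the handler names and response bodies are mine:

```python
# Minimal WSGI sketch: respect the client's "DNT: 1" request header.
def honors_dnt(environ):
    """True if the client asked not to be tracked (sent 'DNT: 1')."""
    return environ.get("HTTP_DNT") == "1"

def app(environ, start_response):
    if honors_dnt(environ):
        body = b"Tracking disabled for this request."
    else:
        body = b"Tracking enabled (no DNT: 1 header)."
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]
```

Note that DNT is exactly the kind of non-discriminative, all-or-nothing switch the next slide criticizes: the user cannot say "track me here but not there."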
  33. Non-Discriminative: Access to Locations ‣ Application-level limitations: ‣ not all locations are the same ‣ not all situations are the same ‣ not all information destinations are the same. ‣ The default is overpowering.
  34. Control is Tough. What happens when we ask the user to control complex sharing preferences? How can we balance usability and privacy?
  35. Crowdsourcing Privacy Preferences. A pipeline of four stages: Aggregator (collecting preferences and their underlying context), Modeler (building a model for the preference according to a context), Personalizer (personalizing the model for a specific, given user), Application (using the preference model in a specific application). (From: Eran Toch, “Crowdsourcing Privacy Management in Context-Aware Applications,” Personal and Ubiquitous Computing, 2013.)
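The four stages on the slide can be sketched as a toy pipeline. The stage names come from the diagram; the data shapes (context strings, 0/1 sharing decisions) and the simple blending scheme are my assumptions for illustration:

```python
from collections import defaultdict

def aggregate(observations):
    """Aggregator: group crowd sharing decisions (0/1) by their context."""
    by_context = defaultdict(list)
    for context, decision in observations:
        by_context[context].append(decision)
    return by_context

def build_model(by_context):
    """Modeler: per-context probability that the crowd would share."""
    return {c: sum(d) / len(d) for c, d in by_context.items()}

def personalize(crowd_model, user_bias, weight=0.5):
    """Personalizer: blend the crowd model with the user's own tendency."""
    return {c: weight * p + (1 - weight) * user_bias
            for c, p in crowd_model.items()}

def decide(personal_model, context, threshold=0.5):
    """Application: share in this context if the predicted preference is high."""
    return personal_model.get(context, 0.0) >= threshold

# Toy crowd data: (context, shared?) pairs.
obs = [("home", 0), ("home", 0), ("work", 1), ("work", 1), ("work", 0)]
m = personalize(build_model(aggregate(obs)), user_bias=0.8)
print(decide(m, "work"), decide(m, "home"))
```

The point of the architecture is that a new user inherits sensible crowd-derived defaults immediately, and the Personalizer refines them as the user's own decisions accumulate.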
  36. Our User Study ‣ 30 users, 2 weeks. ‣ Smart-Spaces: tracking locations and activities. ‣ Participants were surveyed three times a day ‣ and asked about their willingness to share their location on a Likert scale.
  37. Place Discrimination. Some places are considered private by almost everybody; some places are shared by almost everybody. (Likert scale from 1 = less likely to share to 5 = more likely to share.)
  38. Accuracy of Decision Strategies. [Plot: accuracy (0–1) as a function of the decision threshold, for strategies A, M, and SM.]
  39. Defaults are Enormously Important ‣ People have a tendency to stick to the defaults: ‣ organ donation choices ‣ access control policies ‣ browser selection.
  40. Generating Defaults. (Oded Maimon, Ron Hirschprung, Eran Toch. “Evaluating Bi-Directional Data Agent Applicability and Design in Cloud Computing Environment.” In Proceedings of the 17th Industrial Engineering Conference, 2012.)
  41. Testing the Defaults
  42. Access and Recourse ‣ Privacy is a long-term relationship. ‣ Applications need to provide ongoing access to privacy data and controls. ‣ Meaningful recourse (helping with problems) is crucial for the user’s security and trust.
  43. Personal Data Centers
  44. Privacy through Time ‣ Digital information is hardly ever erased. ‣ With search engines and timelines, it becomes more and more accessible. ‣ What are the consequences for user-controllable privacy?
  45. Our Study ‣ Between-subjects user study (n=298). ‣ Analyzing differences between users, randomly assigned to one of four conditions: ‣ one month ‣ one year ‣ two years ‣ more than two years. ‣ Using a custom Facebook application. (Eran Toch and Oshrat Rave-Ayalon. “Understanding the Temporal Aspects of Sharing Preferences in Online Social Networks.” Submitted to SOUPS 2013.)
  46. Willingness to Share Over Time
  47. Implications for Design: a default expiration time of 1.5 years.
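The suggested default could be wired into a sharing application roughly as follows. Everything here except the 1.5-year figure is illustrative (function names, the visibility rule, and the day-count conversion are mine):

```python
from datetime import datetime, timedelta, timezone

# Default expiration for shared items, per the slide's 1.5-year suggestion.
DEFAULT_EXPIRATION = timedelta(days=int(365.25 * 1.5))

def is_visible(shared_at, now=None, expiration=DEFAULT_EXPIRATION):
    """A shared item stays visible until its (default) expiration elapses.
    The user can still override the default per item."""
    now = now or datetime.now(timezone.utc)
    return now - shared_at < expiration

posted = datetime(2013, 1, 1, tzinfo=timezone.utc)
print(is_visible(posted, now=datetime(2013, 6, 1, tzinfo=timezone.utc)))  # within 1.5 years
print(is_visible(posted, now=datetime(2015, 1, 1, tzinfo=timezone.utc)))  # past 1.5 years
```

Because people stick to defaults (slide 39), making expiration the default, rather than an opt-in, is what gives this design its effect.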
  48. Data Minimization ‣ The best solution for privacy is trying not to know anything about the user. ‣ In most interesting applications, that’s not possible. ‣ However, analyzing the minimal data requirements of an application is always worthwhile.
  49. Anonymity Levels: a spectrum from Anonymous through Pseudo-anonymous to Identified. The privacy guarantee weakens, and recognition grows, as we move toward Identified.
  50. Pseudo-anonymous Profiles
  51. Managing Identity ‣ Don’t ask users to identify themselves. ‣ If users need a personalized service, rely on pseudo-anonymous identification. ‣ Use k-anonymity, l-diversity, t-closeness, and differential privacy to release user information.
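Of these, k-anonymity is the simplest to check: every combination of quasi-identifier values in the released table must be shared by at least k records. A minimal checker (the toy records and column names are made up for illustration):

```python
from collections import Counter

def k_anonymity(records, quasi_ids):
    """Smallest equivalence-class size over the quasi-identifier columns.
    A release is k-anonymous iff this value is >= k."""
    groups = Counter(tuple(r[c] for c in quasi_ids) for r in records)
    return min(groups.values())

# Toy release with generalized (starred) ages as a quasi-identifier.
records = [
    {"zip": "47677", "age": "2*", "disease": "flu"},
    {"zip": "47677", "age": "2*", "disease": "cold"},
    {"zip": "47602", "age": "3*", "disease": "flu"},
    {"zip": "47602", "age": "3*", "disease": "cancer"},
]
print(k_anonymity(records, quasi_ids=["zip", "age"]))  # 2, i.e. 2-anonymous
```

l-diversity, t-closeness, and differential privacy each strengthen this guarantee in turn, since k-anonymity alone says nothing about the diversity of the sensitive values within a group.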
  52. Architectural Choices. The server is the privacy bottleneck. [Diagram: a single server connected to many clients.]
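One architecture-based remedy implied by the diagram is to keep raw data on the client and send the server only a minimized summary. A sketch (the coarsening rule, precision, and names are illustrative assumptions, not a method from the talk):

```python
# Client-side minimization: the server never receives exact location fixes.
def coarsen(lat, lon, precision=1):
    """Round coordinates on the client so only a coarse cell is revealed."""
    return (round(lat, precision), round(lon, precision))

def client_payload(fixes, precision=1):
    """Send only the distinct coarse cells, not the raw trajectory."""
    return sorted({coarsen(lat, lon, precision) for lat, lon in fixes})

# Three exact GPS fixes collapse into two coarse cells before upload.
fixes = [(32.0853, 34.7818), (32.0861, 34.7822), (31.7683, 35.2137)]
print(client_payload(fixes))
```

Shifting computation to the client this way shrinks the bottleneck: a breach of the server now exposes only coarse cells, not full movement traces.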
  53. Eran Toch, Department of Industrial Engineering, Tel Aviv University, Israel.