
Introduction to usable privacy

An introduction to usable privacy, covering some privacy definitions, usability objectives, why usable privacy is hard, and some examples of usable privacy experiences.


  1. Usable Privacy. Eran Toch, September 2017
  2. Eran Toch, co-director of the IWiT lab, Department of Industrial Engineering, Tel Aviv University, Israel. http://toch.tau.ac.il/ erant@post.tau.ac.il Twitter: @erant
  3. Worst-case Scenario. 2010: Google Buzz embedded in Gmail. Public profile by default, including followers. Follower suggestions based on Gmail correspondence. Why nobody remembers Google Buzz.
  4. Aftermath ‣ Google had missed its chance to get into the social networking market ‣ The FTC seriously constrained Google’s ability to collect data. The UI was changed after four days; the service was canceled a year afterwards.
  5. Privacy Interactions Today ‣ Why do users interact? Social networks, active user sharing, changing privacy norms, and regulation ‣ How to create usable privacy experiences?
  6. Agenda 1. Some privacy definitions 2. Usability objectives 3. Why is usable privacy hard? 4. Usable privacy experiences
  7. Some Privacy Definitions
  8. “What is more holy, what is more carefully fenced round with every description of religious respect, than the house of every individual citizen? Thus is the asylum of everyone.” Marcus Tullius Cicero (106–43 BC), De Domo Sua 109
  9. Privacy is a Basic Human Need. Privacy is a basic psychological need that allows us to experiment and to change (S.M. Jourard, 1966)
  10. ‣ Privacy is necessary for managing social ties with different levels of intimacy (Chaikin, 1977) ‣ Privacy is critical for flexibly projecting and designing our self to others (Goffman, 1959; Vitak, 2015)
  11. Privacy as Control. “The claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.” Alan Westin, Privacy and Freedom, 1970, The Bodley Head Ltd.
  12. Privacy as Contextual Integrity. A privacy violation is a breach of appropriate information flows, which depend on informational norms: the social context of the flow and the roles of the subject, the sender, and the recipient. Helen Nissenbaum, Privacy as Contextual Integrity, 2004
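To make the contextual-integrity idea above concrete, here is a minimal sketch of my own (not Nissenbaum's framework or any formalization of it): an information flow is described by its sender, recipient, subject, information type, and transmission principle, and is treated as appropriate only if it matches an entrenched norm of the context. The context, roles, and norms below are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One information flow, described by five contextual-integrity parameters."""
    sender: str
    recipient: str
    subject: str
    info_type: str
    transmission_principle: str

# Hypothetical informational norms for a healthcare context.
HEALTHCARE_NORMS = {
    Flow("patient", "physician", "patient", "symptoms", "confidentiality"),
    Flow("physician", "specialist", "patient", "medical record", "patient consent"),
}

def respects_contextual_integrity(flow: Flow, norms: set) -> bool:
    # A flow is appropriate when it matches an entrenched norm of the context.
    return flow in norms

# A physician selling a medical record to an advertiser matches no norm: a violation.
leak = Flow("physician", "advertiser", "patient", "medical record", "sale")
print(respects_contextual_integrity(leak, HEALTHCARE_NORMS))  # False
```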
  13. Informed Consent ‣ Informed consent is the main instrument for allowing data access while protecting privacy ‣ EU GDPR ‣ US FTC FIPPs ‣ OECD guidelines ‣ …
  14. Summary ‣ User control enables data processing in systems in which users are identified ‣ It explicitly requires some interaction with the user ‣ But how should this interaction be carried out?
  15. Usability Objectives
  16. The Definition of Usability. The effectiveness, efficiency, and satisfaction with which specified users achieve specified goals in particular environments. ISO 9241, Ergonomic requirements, Part 11: Guidance on usability specification and measures.
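To make the ISO 9241-11 components measurable, usability tests typically operationalize them per task. The sketch below uses made-up session data and my own metric choices (completion rate, time on task for successful attempts, a post-task rating); these are common operationalizations, not ones mandated by the standard.

```python
from statistics import mean

# Made-up sessions from a test of a privacy-settings task:
# (task_completed, seconds_on_task, satisfaction_rating_on_a_1_to_7_scale)
sessions = [
    (True, 42.0, 6), (True, 55.5, 5), (False, 120.0, 2),
    (True, 38.2, 7), (False, 97.3, 3),
]

successes = [s for s in sessions if s[0]]
effectiveness = len(successes) / len(sessions)   # share of users who achieved the goal
efficiency = mean(t for _, t, _ in successes)    # mean time on task among successes
satisfaction = mean(r for *_, r in sessions)     # mean post-task rating

print(f"effectiveness={effectiveness:.0%}, efficiency={efficiency:.1f}s, "
      f"satisfaction={satisfaction:.1f}/7")
```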
  17. Easy to Recognize when it’s Absent. Misleading clues; clearness and aesthetics; mismatch between form and function
  18. An Intuitive Definition ‣ Usability is the end-user’s view of system quality ‣ Learnability ‣ Efficiency (productivity) ‣ Memorability ‣ Lack of errors ‣ Satisfaction ‣ Pleasurable. Objects with clear affordance
  19. User-Centered Design. Investigate users (through personas), uses, and contexts; design at various stages and fidelities; test and monitor. Example persona: Name: Patricia. Age: 31. Occupation: Sales manager, IKEA store. Hobbies: painting, fitness/biking, taking son Devon to the park. Goals: finalize banking orders as soon as possible; do work on the go (buses, driving). Level of expertise: high technical skill, medium-low banking skills.
  20. Summary ‣ Users expect systems to “just work” ‣ Therefore, systems should have clear affordances and be designed for their users ‣ What are the challenges in privacy?
  21. Why is Usable Privacy Hard?
  22. Major Challenges ‣ Notice and choice complexity ‣ The privacy paradox ‣ Choice architectures ‣ Temporal aspects
  23. Notice and Choice Complexity
  24. Notice Complexity: Privacy Policies ‣ An average Internet user would need about 244 hours per year to read all website privacy policies (McDonald and Cranor, 2008) ‣ And would need a legal degree to understand them (Cate, 2010)
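The order of magnitude is easy to reproduce with a back-of-envelope calculation. The inputs below are illustrative assumptions of mine (sites visited per year, policy length, reading speed), not figures taken from the cited paper.

```python
# Back-of-envelope estimate of annual policy-reading time (all inputs are assumptions).
unique_sites_per_year = 1_450   # assumed number of distinct websites visited in a year
words_per_policy = 2_500        # assumed average privacy-policy length in words
reading_speed_wpm = 250         # assumed reading speed, words per minute

minutes_per_policy = words_per_policy / reading_speed_wpm        # about 10 minutes each
hours_per_year = unique_sites_per_year * minutes_per_policy / 60
print(f"roughly {hours_per_year:.0f} hours per year")            # on the order of 240 hours
```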
  25. The Problem with Privacy Policies ‣ Policies are both notices and binding legal contracts ‣ Therefore, they are written as such ‣ When we investigate usability, we need to ask ourselves about the context in which users are working
  26. Information Architectures are Complex. It is difficult to understand the privacy landscape, to understand one’s preferences, and to express them
  27. Overwhelming Technology: surrounding devices and user devices
  28. The Privacy Paradox
  29. The Privacy Paradox. “Ask 100 people if they care about privacy and 85 will say yes. Ask those same 100 people if they’ll give you a DNA sample just to get a free Big Mac, and 85 will say yes.” Austin Hill. http://bigthink.com/david-ryan-polgar/the-privacy-paradox-an-interview-with-manoush-zomorodi
  30. Privacy Decisions are very Sensitive to Context. Shoppers at a mall were offered a $10 discount card, and an extra $2 discount if they agreed to share their shopping data. 50% declined the extra offer. Source: The New York Times, http://www.nytimes.com/2013/03/31/technology/web-privacy-and-how-consumers-let-down-their-guard.html?smid=pl-share
  31. But Wait. Shoppers were offered a $12 discount card and the option of trading it in for a $10 card to keep their shopping record private. 90% chose to trade privacy for $2. Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514.
  32. Privacy as a Secondary Task. “Users do not want to be responsible for, nor concern themselves with, their own security.” Blake Ross, co-creator of Firefox
  33. Other Explanations ‣ Consequences are abstract, uncertain, and far into the future ‣ People often trust service providers ‣ People do not get all the information when making the decision
  34. The Limits of the Privacy Paradox ‣ Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514. ‣ Stutzman, F., & Kramer-Duffield, J. (2010). Friends Only: Examining a Privacy-Enhancing Behavior in Facebook. SIGCHI Conference on Human Factors in Computing Systems, ACM, Atlanta, pp. 1553–1562. ‣ Gross, R., & Acquisti, A. (2005). Information Revelation and Privacy in Online Social Networks. ACM Workshop on Privacy in the Electronic Society, New York, pp. 71–80.
  35. Teens and Privacy. Pew Report, Teens, Social Media, and Privacy, May 21, 2013, http://www.pewinternet.org/Reports/2013/Teens-Social-Media-And-Privacy.aspx
  36. Choice Architectures
  37. Choice Architectures ‣ The layout, order, defaults, and information presentation of the decision-making ‣ Defaults, for example, frame the way users perceive their options (Tversky & Kahneman, 1986; Rothman & Salovey, 1997) ‣ Relevant in many fields, including privacy (Knijnenburg et al., 2013)
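As a small illustration of what a choice architecture is, the sketch below (my own example, not taken from the slides) models a sharing dialog as the set of audience options, the order in which they are shown, and the preselected default; changing only the default changes what most users end up with, even though the option set is identical.

```python
from dataclasses import dataclass, field

@dataclass
class SharingChoiceArchitecture:
    """A sharing dialog reduced to its choice architecture: options, order, default."""
    options: list = field(default_factory=lambda: ["Public", "Friends", "Only me"])
    default: str = "Friends"

    def render(self) -> str:
        # The listing order and the preselected option are both part of the framing.
        return "\n".join(
            f"{'(x)' if opt == self.default else '( )'} {opt}" for opt in self.options
        )

print(SharingChoiceArchitecture().render())
print()
print(SharingChoiceArchitecture(default="Public").render())  # same options, different framing
```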
  38. Sometimes, no choices at all
  39. (image-only slide)
  40. Empirical Analysis ‣ An analysis of 21,950 posts by 266 Facebook users ‣ 2 optimized defaults can serve the population similarly to Facebook’s 3 defaults ‣ 3 optimized defaults fare even better. Hirschprung, R., Toch, E., Schwartz-Chassidim, H., Mendel, T., & Maimon, O. (2017). Analyzing and optimizing access control choice architectures in online social networks. ACM Transactions on Intelligent Systems and Technology (TIST), 8(4), 57.
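The paper's optimization is more elaborate, but the intuition can be sketched in a few lines: given the audience each user actually chose per post, pick the k defaults that leave the fewest posts needing a manual adjustment. The data and the greedy selection below are my own illustration, not the study's method or results.

```python
from collections import Counter

# Hypothetical audience choices, one per post (the study analyzed 21,950 real posts).
post_audiences = ["friends", "friends", "public", "close friends", "friends",
                  "public", "only me", "friends", "close friends", "public"]

def coverage(defaults, audiences):
    """Share of posts whose chosen audience is already one of the offered defaults."""
    return sum(a in defaults for a in audiences) / len(audiences)

def greedy_defaults(audiences, k):
    """Pick the k most frequently chosen audiences as defaults (a simple stand-in
    for the paper's optimization, not its actual algorithm)."""
    return {a for a, _ in Counter(audiences).most_common(k)}

for k in (2, 3):
    defaults = greedy_defaults(post_audiences, k)
    print(k, sorted(defaults), f"coverage={coverage(defaults, post_audiences):.0%}")
```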
  41. Temporal Aspects
  42. Privacy through Time ‣ Digital information is almost never erased ‣ With search engines and timelines, it becomes more and more accessible ‣ What are the consequences for privacy?
  43. Willingness to Share Over Time. Willingness to share decreases with time (ρ = -0.21, p < 0.0001, Spearman correlation test). Ayalon, O., & Toch, E. (2017). Not Even Past: Information Aging and Temporal Privacy in Online Social Networks. Human–Computer Interaction, 32(2), 73–102. Ayalon, O., & Toch, E. (2013). Retrospective Privacy: Managing Longitudinal Privacy in Online Social Networks. Proceedings of the Ninth Symposium on Usable Privacy and Security (SOUPS), ACM.
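The reported result is a Spearman rank correlation between how old a post is and how willing its author is to keep sharing it. The sketch below only shows how such a test is run; the data are synthetic, so the coefficient will not match the study's ρ = -0.21.

```python
# Spearman rank correlation between post age and willingness to share (synthetic data).
from scipy.stats import spearmanr

post_age_years = [0.1, 0.5, 1, 2, 3, 4, 5, 6, 8, 10]
willingness = [6.8, 6.5, 6.1, 5.9, 5.5, 5.6, 5.0, 4.8, 4.4, 4.1]  # ratings on a 1-7 scale

rho, p_value = spearmanr(post_age_years, willingness)
print(f"rho={rho:.2f}, p={p_value:.4f}")  # negative rho: willingness drops as posts age
```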
  44. Life Changes. Some life changes decrease the willingness to share (p = 0.0226, Kruskal-Wallis test)
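The life-changes result comes from a Kruskal-Wallis test, which compares a rating across several independent groups without assuming normality. The groups and ratings below are synthetic and only meant to show the shape of the analysis.

```python
# Kruskal-Wallis test: willingness-to-share ratings across life-change groups (synthetic).
from scipy.stats import kruskal

no_change = [6, 5, 6, 7, 5, 6, 6]
new_job = [5, 4, 5, 6, 4, 5]
ended_relationship = [3, 4, 2, 4, 3, 3]

h_stat, p_value = kruskal(no_change, new_job, ended_relationship)
print(f"H={h_stat:.2f}, p={p_value:.4f}")  # a small p suggests the groups differ
```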
  45. Summary ‣ Notice and choice complexity ‣ The privacy paradox ‣ Choice architectures ‣ Temporal aspects
  46. Usable Privacy Experiences
  47. A User-Centered Design Approach 1. Aim to understand users 2. Design interfaces which are consistent, timely, and contextualized 3. Think about the choice architecture 4. Investigate groundbreaking interfaces
  48. Understanding Users ‣ Which different privacy “personas” can be found among your users? (Spears and Erete, 2014) ‣ Westin identified three clusters of people with different attitudes toward privacy: fundamentalists (~25%), unconcerned (~10%), and pragmatists (~65%)
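One way to look for privacy "personas" in your own user base is to cluster survey responses. The sketch below uses synthetic Likert-scale answers and k-means; it is my illustration of the idea, not how Westin built his segmentation (which relied on index questions rather than clustering).

```python
# Clustering synthetic privacy-attitude survey responses into three "personas".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# 300 respondents x 4 Likert items (1-5), generated around three attitude profiles.
responses = np.vstack([
    rng.normal(4.5, 0.4, (75, 4)),    # highly concerned
    rng.normal(1.8, 0.4, (30, 4)),    # unconcerned
    rng.normal(3.2, 0.5, (195, 4)),   # in between
]).clip(1, 5)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(responses)
for cluster in range(3):
    print(f"cluster {cluster}: {np.mean(labels == cluster):.0%} of respondents")
```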
  49. Crowdsourcing Privacy Critique ‣ Privacy experiences and interfaces are presented to crowd workers ‣ Allowing multivariate testing of privacy designs. Oshrat Ayalon and Eran Toch, Crowdsourcing Privacy Design Critique: An Empirical Evaluation of Framing Effects, accepted to HICSS 2018
  50. Designing Privacy Experiences. Example: HP Enterprise privacy UX guidelines
  51. New Privacy Interactions: privacy nutrition labels (Kelley et al., 2012), privacy scores, temporary interactions
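A privacy nutrition label is essentially a small grid of data types against the ways they are used, so practices can be scanned at a glance. The sketch below renders a toy version with made-up categories; it is not the layout or taxonomy from the cited work.

```python
# A toy text rendering of a privacy "nutrition label" (categories are made up).
label = {
    "contact info":  {"provide service": "yes", "marketing": "opt-out", "sold": "no"},
    "location":      {"provide service": "yes", "marketing": "no",      "sold": "no"},
    "browsing data": {"provide service": "yes", "marketing": "yes",     "sold": "opt-in"},
}

uses = ["provide service", "marketing", "sold"]
print(f"{'data type':<15}" + "".join(f"{u:<17}" for u in uses))
for data_type, row in label.items():
    print(f"{data_type:<15}" + "".join(f"{row[u]:<17}" for u in uses))
```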
  52. Summary 1. Privacy as contextualized control over data 2. The human traits that make usable interfaces hard to design 3. A user-centered approach to privacy
  53. Thanks! ‣ Oshrat Ayalon ‣ Rony Hirschprung ‣ Israel Science Foundation. http://toch.tau.ac.il/ erant@post.tau.ac.il Twitter: @erant
