CSCW networked privacy workshop - keynote

My keynote presentation to the CSCW networked privacy workshop

Transcript

  • 1. Is privacy a matter of transparency and control? Towards a Privacy Adaptation Procedure
  • 2. Outline
    1. Beyond transparency and control: privacy calculus, more paradoxes, and bounded rationality
    2. Privacy nudging: a solution inspired by decision sciences... with some flaws
    3. Privacy Adaptation Procedure: adaptive nudges based on a contextualized understanding of users’ privacy concerns
    INFORMATION AND COMPUTER SCIENCES
  • 3. Beyond transparency and control: privacy calculus, more paradoxes, and bounded rationality
  • 4. The state of privacy, 2013
  • 5. A model by Smith et al. 2011. [Diagram, with the Transparency and Control links highlighted.] Why aren’t these more strongly related?
  • 6. Transparency and control.
    Transparency = informed consent: “companies should provide clear descriptions of [...] why they need the data, how they will use it”
    Control = user empowerment: “companies should offer consumers clear and simple choices [...] about personal data collection, use, and disclosure”
  • 7. Examples from your work.
    Na Wang, Heng Xu, and Jens Grossklags: “Our new designs encompass control and awareness as the essential dimensions of users’ privacy concerns in the context of third-party apps on Facebook.”
    Jessica Vitak: “Users may also employ advanced privacy settings to segregate audiences, so they can still share relevant content with their various connections”
  • 8. Examples from your work.
    Stacy Blasiola: “By drawing awareness to the issue, users will be better equipped to understand the vulnerabilities posed by third party applications”
    Ralf De Wolf and Jo Pierson: “different audience management strategies [...] can be used as a framework for access control models and/or feedback and awareness tools”
  • 9. The Transparency Paradox. Transparency is useful for concerned users, but bad for others: it makes them more fearful. Any mention of privacy, whether favorable or not, triggers privacy concerns.
  • 10. Example: Website A/B testing
  • 11. The Control Paradox. Decisions are too numerous: most Facebook users don’t know the implications of their own privacy settings! Decisions are difficult: outcomes are uncertain and delayed. Control gives a false sense of security.
  • 12. Beyond control.
    Eden Litt: “while many sites give users a variety of buttons and dashboards to help them technologically enforce their privacy, these features are only useful if users are aware that they exist, know where to find them, and use them effectively”
    Ralf De Wolf and Jo Pierson: “in an online environment managing privacy becomes a time-consuming chore”
  • 13. Example: Facebook. A “bewildering tangle of options” (New York Times, 2010); “labyrinthian” controls (U.S. Consumer Magazine, 2012)
  • 14. Example: Knijnenburg et al. Introducing an “extreme” sharing option. Sharing options ran from Nothing to City to Block (more sharing = more benefits, less privacy). Add the option Exact. Expected: some will choose Exact instead of Block. Unexpected: sharing increases across the board! bit.ly/chi2013privacy
  • 15. Bounded rationality. People’s decisions are inconsistent and seemingly irrational: framing effects, default effects, order effects. Transparency leads to information overload; control leads to choice overload.
  • 16. Example: Acquisti et al. Foot in the door (innocuous requests first) vs. door in the face (risqué requests first).
  • 17. Example: Acquisti et al. [Chart: cumulative admission rates (in percentages) across 30 questions, for the Decreasing, Increasing, and Baseline conditions.]
  • 18. Example: Lai and Hui. Subjects were assigned one of four checkbox framings on a registration page (their Figure 4), with these participation rates:
    A. “Please send me Vortrex Newsletters and information.” 25%
    B. “Please do not send me Vortrex Newsletters and information.” 37%
    C. “Please send me Vortrex Newsletters and information.” 53%
    D. “Please do not send me Vortrex Newsletters and information.” 0%
  • 19. Summary of part 1. We need to move beyond control and transparency: rational privacy decision-making is bounded, and transparency and control increase choice difficulty.
  • 20. Privacy nudging: a solution inspired by decision sciences... with some flaws
  • 21. A new model.
    Jessica Vitak: “it is likely that users employ a number of strategies when making decisions regarding what and with whom to share content online”
    These strategies are not rational, therefore:
    - People do not always choose what is best for them
    - There is significant leeway to influence people’s decisions
    - Being objectively neutral is impossible
  • 23. A new model. [Diagram: default nudges (request order and default value) act on decision strategies; justification nudges act on the privacy calculus (risks/costs vs. benefits); together these shape behavioral reactions, including disclosures.]
  • 24. A new model.
    Justification: a succinct reason to disclose (or not disclose) a piece of information. It makes it easier to rationalize the decision and minimizes the potential regret of choosing the wrong option.
    Default value: relieves users from the burden of making decisions. Path of least resistance; implicit normative cue (what I should do); endowment effect (what I have is worth more than what I don’t have).
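The two nudge types can be sketched as a tiny data structure. This is a minimal illustration, not anything prescribed by the slides: the class name, field names, and the example statistic in the justification string are all invented.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DisclosureRequest:
    """One disclosure request, carrying both nudge types from the model."""
    item: str                        # what is being asked, e.g. "phone number"
    default_share: bool              # default-value nudge: the pre-selected answer
    justification: Optional[str]     # justification nudge: a succinct reason, or None

    def render(self) -> str:
        # A pre-checked box exploits the path of least resistance;
        # the justification makes the decision easier to rationalize.
        box = "[x]" if self.default_share else "[ ]"
        line = f"{box} Share your {self.item}"
        if self.justification:
            line += f"  ({self.justification})"
        return line

# Hypothetical usage: a pre-checked request with a normative justification.
req = DisclosureRequest("phone number", default_share=True,
                        justification="87% of users shared this")
```

The user still decides, but both the starting state and the framing of the request push toward disclosure.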
  • 25. Example: Acquisti et al. [Chart, shown again: cumulative admission rates (in percentages) across 30 questions, for the Decreasing, Increasing, and Baseline conditions.]
  • 26. Example: Knijnenburg & Kobsa. [Chart: demographics disclosure and context disclosure, by request order (context first vs. demographics first).] bit.ly/tiis2013
  • 27. Example: Lai and Hui. [Shown again: the four checkbox framings on a registration page and their participation rates: A 25%, B 37%, C 53%, D 0%.]
  • 28. Example: Brown & Krishna. [Chart: percentage-point increase in choosing “High” compared to baseline.] A high default works, but there is reactance when users are aware of the motives behind it.
  • 29. Example: Knijnenburg & Kobsa. Five justification types: none, “useful for you”, “number of others”, “useful for others”, “explanation”. bit.ly/tiis2013
  • 30. Example: Knijnenburg & Kobsa. [Chart: demographics and context disclosure by justification type (none, useful for you, # of others, useful for others, explanation) and request order, with significant differences marked.] bit.ly/tiis2013
  • 31. Example: Knijnenburg & Kobsa. Satisfaction with the system, measured as anticipated satisfaction / intention to use (6 items, e.g. “I would recommend the system to others”): lower for any justification! bit.ly/tiis2013
  • 32. Problems with Privacy Nudging. What should be the purpose of the nudge?
    “More information = better, e.g. for personalization”: techniques to increase disclosure cause reactance in the more privacy-minded users.
    “Privacy is an absolute right”: it becomes more difficult for less privacy-minded users to enjoy the benefits that disclosure would provide.
  • 33. Problems with Privacy Nudging. Smith, Goldstein & Johnson: “What is best for consumers depends upon characteristics of the consumer: An outcome that maximizes consumer welfare may be suboptimal for some consumers in a context where there is heterogeneity in preferences.”
  • 34. Summary of part 2. Nudges work: defaults and justifications can influence users’ decisions. But we cannot nudge everyone the same way! Users differ in their disclosure preferences, and nudges should respect these differences.
  • 35. Privacy Adaptation Procedure Adaptive nudges based on a contextualized understanding of users’ privacy concerns
  • 36. Contextualized preferences.
    Sameer Patil et al.: “One of the factors contributing to this ‘privacy paradox’ is the decoupling of the circumstances in which privacy-affecting behaviors occur from the time at which privacy concerns are expressed”
    Sam McNeilly, Luke Hutton, and Tristan Henderson: “Participants were willing to share different types of information in different ways”
  • 37. Contextualized preferences.
    Pamela Wisniewski and Heather Richter Lipford: “By operationalizing SNS desired privacy level at a more granular level, we were able to unpack, if not disprove, aspects of the privacy paradox”
    Sam McNeilly, Luke Hutton, and Tristan Henderson: “privacy settings are fairly robust to capturing people’s contextual norms over time”
  • 38. Contextualized preferences. Contextualize the privacy decision to different users and different contexts. The optimal justification and default may depend on: the type of info (what), user characteristics (who), the recipient (to whom), etc.
  • 39. Example: Knijnenburg et al. “What?” = four dimensions of data types (bit.ly/privdim):
    Facebook activity: 1 Wall, 2 Status updates, 3 Shared links, 4 Notes, 5 Photos
    Location: 6 Hometown, 7 Location (city), 8 Location (state/province), 9 Residence (street address)
    Contact info: 11 Phone number, 12 Email address
    Life/interests: 13 Religious views, 14 Interests (favorite movies, etc.), 15 Facebook groups
  • 40. Example: Knijnenburg et al. “Who?” = five disclosure profiles: 159 pps tend to share little information overall (LowD); 26 pps tend to share activities and interests (Act+IntD); 50 pps tend to share location and interests (Loc+IntD); 65 pps tend to share everything but contact info (Hi-ConD); 59 pps tend to share everything. bit.ly/privdim
  • 41. Example: Knijnenburg et al. Detect class membership. bit.ly/privdim
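One way to picture “detecting class membership” is assigning a user’s observed disclosure tendencies to the closest of the five profiles from the previous slide. This is only an illustrative sketch: the centroid numbers are invented, and the cited work fits a latent-class model rather than this nearest-centroid shortcut.

```python
# Each vector: tendency (0..1) to share [activity, location, contact, interests].
# Centroid values are hypothetical; profile names come from the slide.
PROFILES = {
    "LowD":     [0.1, 0.1, 0.1, 0.1],  # shares little overall
    "Act+IntD": [0.8, 0.2, 0.1, 0.8],  # activities and interests
    "Loc+IntD": [0.2, 0.8, 0.1, 0.8],  # location and interests
    "Hi-ConD":  [0.8, 0.8, 0.1, 0.8],  # everything but contact info
    "FullD":    [0.8, 0.8, 0.8, 0.8],  # everything
}

def detect_profile(user_vector):
    """Return the profile whose centroid is closest (squared Euclidean distance)."""
    def dist(centroid):
        return sum((u - c) ** 2 for u, c in zip(user_vector, centroid))
    return min(PROFILES, key=lambda name: dist(PROFILES[name]))
```

Once a profile is detected, it can drive which defaults and justifications the user is shown.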
  • 42. Example: Knijnenburg & Kobsa. “I care about the benefits” vs. “I do whatever others do”
  • 43. Example: Knijnenburg & Kobsa (bit.ly/iui2013). Requesting context data first leads to less threat and more trust. Since it is best to ask demographics first to increase demographics disclosure, and context first to increase context disclosure, increasing total disclosure asks for a compromise.
    Table 2: Best strategies to achieve high overall disclosures.
    - Males with low disclosure tendency: context first, the ‘useful for you’ justification gives the highest demographics disclosure; demographics first, providing no justification gives the highest context disclosure.
    - Females with low disclosure tendency: context first, providing no justification gives the highest demographics disclosure; demographics first, the ‘explanation’ justification keeps context disclosure on par.
    - Males with high disclosure tendency: context first, the ‘useful for others’ justification keeps demographics disclosure almost on par; demographics first, the ‘useful for you’ justification keeps context disclosure on par.
    - Females with high disclosure tendency: context first, providing no justification gives a high demographics disclosure; demographics first, the ‘useful for you’ justification gives the highest context disclosure.
    Table 3: Best strategies to achieve high user satisfaction.
    - Males with low disclosure tendency: demographics first with ‘useful for you’.
    - Males with high disclosure tendency: the ‘useful for you’ justification in any order.
    - Females with low disclosure tendency: context first with ‘useful for you’.
    - Females with high disclosure tendency: context first with no justification, but ‘useful for you’ is second best.
  • 44. The Privacy Adaptation Procedure. Determine the item-, user-, and recipient-type, then select the default and justification that fit best for this context. p(share) = α + β_itemtype + β_usertype + β_recipienttype. INPUT: {user, item, recipient}; OUTPUT: {defaults, justification}.
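The selection step can be sketched as code. All coefficient values, type labels, and the 0.5 threshold below are made up for illustration; the slide only gives the linear form of the model.

```python
# Hypothetical coefficients for p(share) = alpha + b_item + b_user + b_recipient.
ALPHA = 0.10
BETA_ITEM = {"location": 0.05, "contact": -0.20, "interests": 0.15}
BETA_USER = {"LowD": -0.25, "Hi-ConD": 0.10, "FullD": 0.30}
BETA_RECIPIENT = {"friends": 0.20, "third-party app": -0.15}

def p_share(item_type: str, user_type: str, recipient_type: str) -> float:
    """Estimate the probability that this user shares this item with this recipient."""
    p = (ALPHA
         + BETA_ITEM[item_type]
         + BETA_USER[user_type]
         + BETA_RECIPIENT[recipient_type])
    # The linear form can leave [0, 1], so clamp it to stay a valid probability.
    return min(max(p, 0.0), 1.0)

def adapt(item_type: str, user_type: str, recipient_type: str):
    """Map the estimate to the output of the procedure: a default and a justification."""
    p = p_share(item_type, user_type, recipient_type)
    default = "share" if p >= 0.5 else "withhold"
    justification = "useful for you" if p >= 0.5 else None
    return default, justification
```

The point of the adaptation is visible in the two branches: users and contexts likely to share get a sharing default with a supporting justification, while privacy-minded contexts get a conservative default and no disclosure-promoting nudge.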
  • 45. The Privacy Adaptation Procedure. Practical use: automatic initial defaults in line with the user’s “disclosure profile”, and personalized disclosure justifications. This relieves some of the burden of the privacy decision: the right privacy-related information, and the right amount of control. “Realistic empowerment”
  • 46. Summary of part 3. Smith, Goldstein & Johnson: “the idea of an adaptive default preserves considerable consumer autonomy [...] and strikes a balance between providing more choice and providing the right choices.”
  • 47. Final summary
    1. Beyond transparency and control: rational privacy decision-making is bounded, and transparency and control only increase choice difficulty
    2. Privacy nudging: needs to move beyond the one-size-fits-all approach
    3. Privacy Adaptation Procedure: the optimal balance between nudges and control