Crowdsourcing privacy: Bay Area Talks

This talk summarizes several works that try to create new ways to help users manage their privacy in online social networks and in other privacy-sensitive applications.

Transcript of "Crowdsourcing privacy: Bay Area Talks"

  1. Crowdsourcing Privacy Preferences through Time and Space. Eran Toch, Bay Area Talks, June 2013.
  3. Privacy: an Old Idea. Privacy in the year 500 BC.
  4. The Mishnah, Baba Batra 2a-b (500 BC - 200 CE) rules: "A person should not open a window to a common yard... A person would not open to a common yard a door against door and a window against window." The Rashbam (1080-1160 CE) defines Hezek Re'iya ("the damage of being seen"). The Talmud contextualizes the text (Bible, Numbers 24-10): "and whose eyes are opened: How beautiful are your tents, O Jacob, your dwelling places, O Israel." "What did Balaam see? He saw that their tents' openings were not directed at each other, and said: they deserve that the divine spirit will be on them."
  5. Houses and Systems. Privacy challenges in the design of people's houses are not that different from the challenges in the online privacy of information-sharing systems.
  6. Information Sharing Systems: social networks, multi-user cloud services, crowdsourcing apps, enterprise interoperability.
  7. The Architecture of Privacy: traditional systems versus information sharing systems (system-user diagrams).
  8. What can go wrong?
  9. Expressing Preferences. Users find it challenging to understand the privacy landscape, to understand their preferences, and to express them.
  10. Agility and Changes. Sharing preferences will change over time, while the content is still accessible.
  11. Where Should Users Start?
  12. Agenda: ‣ Crowdsourcing preferences. ‣ Predictions and defaults. ‣ Modeling longitudinal privacy. ‣ The future: design and theory.
  13. Crowdsourcing Privacy Preferences. Eran Toch, Crowdsourcing Privacy Management in Context-Aware Applications, Personal and Ubiquitous Computing, 2013.
  14. Crowdsourcing and Privacy: 1. Modeling user behavior by analyzing a crowd of users. 2. Identifying meaningful context and recognizing personal differences. 3. Building mechanisms that can help individuals manage preferences.
  15. The Question: ‣ Can we model and predict users' preferences for a given scenario? ‣ In our case, a combination of the location reported and the target of the information. ‣ Can we build mechanisms that help users manage their access control using the predictions?
  16. Crowdsourcing Privacy Preferences architecture: an Aggregator collects preferences and their underlying context, a Modeler builds a model for the preferences according to context, a Personalizer personalizes the model for a specific, given user, and an Application uses the preference model in a specific setting.
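As an illustration of this pipeline, here is a minimal Python sketch. The class and field names (Preference, Aggregator, Modeler, Personalizer) and the simple per-user bias are my own assumptions for illustration, not the system described in the paper.

```python
from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

@dataclass
class Preference:
    user: str
    place_semantic: str   # e.g. "Home", "Work", "Public"
    audience: str         # target of the information, e.g. "everybody"
    willingness: int      # 0 (low) .. 4 (high)

class Aggregator:
    """Collects preferences and their underlying context from the crowd."""
    def __init__(self):
        self.preferences = []
    def collect(self, pref: Preference):
        self.preferences.append(pref)

class Modeler:
    """Builds a crowd model: mean willingness per (semantic, audience) context."""
    def build(self, prefs):
        buckets = defaultdict(list)
        for p in prefs:
            buckets[(p.place_semantic, p.audience)].append(p.willingness)
        return {ctx: mean(vals) for ctx, vals in buckets.items()}

class Personalizer:
    """Shifts the crowd model by a per-user bias (a rough personal tendency)."""
    def personalize(self, crowd_model, prefs, user):
        own = [p.willingness for p in prefs if p.user == user]
        crowd_avg = mean(crowd_model.values())
        bias = (mean(own) - crowd_avg) if own else 0.0
        return {ctx: val + bias for ctx, val in crowd_model.items()}

# Application: use the personalized model to predict a preference.
agg = Aggregator()
agg.collect(Preference("alice", "Home", "everybody", 1))
agg.collect(Preference("bob", "Work", "everybody", 3))
agg.collect(Preference("alice", "Work", "everybody", 2))
crowd_model = Modeler().build(agg.preferences)
alice_model = Personalizer().personalize(crowd_model, agg.preferences, "alice")
print(alice_model.get(("Work", "everybody")))
```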
  17. Our User Study: ‣ 30 users, 2 weeks. ‣ Smart-Spaces: tracking locations and activities. ‣ Participants were surveyed three times a day. ‣ Asked about their willingness to share their location on a Likert scale.
  18. Meta-Data Survey.
  19. Place Discrimination (scale 1-5, from less likely to more likely to share): some places are considered private, while some places are shared by almost everybody.
  20. Distribution by Semantics: density of willingness to share location (0 = low, 4 = high), broken down by place semantics: Home, None, Public, Transit, Travel, Work.
  21. Regression Model: ‣ Predictions for a user u regarding a place k are learned linearly, combining prediction by place, prediction by semantics, and personal tendency. ‣ Easy to model and to compute. ‣ Provides insight into variability.
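One plausible form of such a linear model, written here as a sketch (the symbols and exact decomposition are assumptions of mine, not taken from the slide): the predicted willingness of user u for place k sums a place term, a semantic term, and a personal bias.

```latex
\hat{y}_{u,k} = \beta_0 + \beta^{\text{place}}_{k} + \beta^{\text{sem}}_{s(k)} + b_u
```

where s(k) denotes the semantic category of place k (home, work, public, and so on) and b_u captures the user's personal tendency.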
  22. Individual Differences: ‣ People can be categorized according to their privacy approach: ‣ Privacy "Unconcerned" ‣ Privacy "Pragmatic" ‣ Privacy "Fundamentalist". (Cluster plot: willingness to share with the university vs. with everybody, three clusters.)
  23. Prediction Methods, compared by squared error: (a) Simple, predicting by place; (b) Semantic, predicting by semantics; (c) Semantic/Biased, semantics and personal tendency.
  24. Applications of Crowdsourcing.
  25. Applications: ‣ Aggregating user models for: ‣ Automatic decision making. ‣ Preference prediction for semi-manual suggestions. ‣ Defaults for new users. ‣ Providing possibilities for rule modifications.
  26. Simulated Access Control: location requests flow into a decision engine that either discloses the location, denies the request, or defers to a manual decision.
  27. Access Control Performance (accuracy vs. automation threshold, per strategy): if we simulate an access control mechanism, we reach 80% accuracy in predicting what a user would do, and 93% accuracy when allowing the worst predictions to be examined by the user.
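A minimal sketch (hypothetical data and function names, not the study's simulation code) of how such a threshold-based mechanism can be simulated: requests whose prediction is confident enough are decided automatically, the rest are deferred to the user and counted as handled correctly, and accuracy is measured across all requests.

```python
# Sketch of a threshold-based access-control simulation (illustrative only).

def simulate(requests, threshold):
    """Each request is (predicted_willingness, actual_share), with predicted
    willingness in [0, 1] and actual_share a boolean ground truth."""
    automated = correct = 0
    for predicted, actual in requests:
        confidence = abs(predicted - 0.5) * 2      # distance from the decision boundary
        if confidence >= threshold:                 # confident enough: automate
            automated += 1
            decision = predicted >= 0.5
            correct += (decision == actual)
        else:                                       # otherwise defer to manual review
            correct += 1                            # the user decides correctly
    return automated / len(requests), correct / len(requests)

# Hypothetical predictions vs. ground truth.
requests = [(0.9, True), (0.1, False), (0.6, False), (0.45, True), (0.8, True)]
for t in (0.0, 0.5, 0.9):
    automation, accuracy = simulate(requests, t)
    print(f"threshold={t:.1f} automation={automation:.2f} accuracy={accuracy:.2f}")
```

Raising the threshold trades automation for accuracy, which is the curve the slide reports.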
  28. The Price of Privacy: how much should businesses pay for a location? Omer Barak, Gabriella Cohen, Alla Gazit and Eran Toch. The Price Is Right? Economic Value of Location Sharing, submitted to MCSS: Workshop on Mobile Systems for Computational Social Science, Ubicomp 2013.
  29. Creating Defaults. People have a tendency to stick to the defaults: ‣ Organ donation choices ‣ Access control policies ‣ Enterprise calendars (L. Palen, 1999). Johnson, Eric J. and Goldstein, Daniel G., Do Defaults Save Lives? Science, Vol. 302, 2003.
  30. Selecting Defaults: can we create defaults that would reflect the decision space of existing users? Eran Toch, Norman M. Sadeh, Jason I. Hong: Generating default privacy policies for online social networks. CHI Extended Abstracts 2010: 4243-4248.
  31. Clustering Defaults. Policy a: location within the campus; policy b: location outside of the campus.
  32. Generalizing Defaults: developing clustering methods that map a large set of preferences into a manageable number of default preferences. Ron Hirschprung, Eran Toch, and Oded Maimon. Evaluating Bi-Directional Data Agent Applicability and Design in Cloud Computing Environment, submitted to Information Systems Research.
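A minimal sketch of the general idea, with assumed data and parameters: each user's preferences form a vector, the vectors are clustered, and each cluster center becomes a candidate default policy. It uses plain k-means and is not the clustering method developed in the cited work.

```python
# Clustering preference vectors into a handful of default policies (illustrative).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical preference matrix: 200 users x 6 sharing contexts, values 0..4.
preferences = rng.integers(0, 5, size=(200, 6))

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(preferences)

# Round each cluster center back to the 0..4 scale to get readable default policies.
defaults = np.clip(np.rint(kmeans.cluster_centers_), 0, 4).astype(int)
for i, policy in enumerate(defaults):
    size = int(np.sum(kmeans.labels_ == i))
    print(f"default policy {i} (covers {size} users): {policy.tolist()}")
```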
  33. Testing the Defaults: using two user studies (n=499), we have evaluated the performance of the clustering methods.
  34. Privacy through Time. Oshrat Rave-Ayalon and Eran Toch. Retrospective Privacy: Managing Longitudinal Privacy in Online Social Networks, accepted to the Symposium on Usable Privacy and Security (SOUPS), 2013.
  35. Privacy through Time: ‣ Digital information is almost never erased. ‣ With search engines and timelines, it becomes more and more accessible. ‣ What are the consequences for privacy?
  36. A Note About Forgetting. In the past, forgetting was the default; now, it's the other way around. "Forgetting thus affords us a second chance, individually and as a society, to rise above our past mistakes and misdeeds, to accept that humans change over time."
  37. The Question: ‣ How does information aging impact users' sharing preferences in Online Social Networks? ‣ Providing a quantitative model and proof for longitudinal privacy. (Timeline: publication time t0, 1 month, 1 year; anticipated privacy preferences vs. retrospective privacy.)
  38. The Question, cont'd: ‣ Guiding the design of mechanisms for longitudinal privacy: ‣ Retrospective mechanisms. ‣ Future-facing mechanisms.
  39. Our Studies: ‣ A within-subject user study (n=193). ‣ A between-subject user study (n=298). ‣ Analyzing differences between users, randomly assigned to four conditions: 0-1 years, 1-2 years, 2+ years, and a control of 0-2+ years. ‣ Using a custom FB application.
  40. Willingness to Share Over Time: willingness to share decreases with time (Spearman correlation test, ρ = -0.21, p < 0.0001).
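For readers who want to reproduce this style of analysis, here is a minimal sketch with hypothetical data (not the study's dataset or analysis code) of running a Spearman rank correlation between post age and willingness to share.

```python
# Spearman rank correlation between post age and willingness to share (illustrative).
from scipy.stats import spearmanr

# Each pair is (post age in months, willingness to share on a 1..5 scale).
observations = [(1, 5), (2, 5), (3, 4), (6, 4), (12, 3), (18, 3), (24, 2), (36, 2)]
ages, willingness = zip(*observations)

rho, p_value = spearmanr(ages, willingness)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")  # a negative rho indicates decline
```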
  41. Growth of Variability: the variability of sharing preferences grows considerably after 1 month (Levene test for variance, F = 10.69, p < 0.00001).
  42. Time and Self-Representation: a post becomes less and less representative of the user over time (Kruskal-Wallis test, p < 0.0001).
  43. The Decay of Content: irrelevancy is the major reason for considering hiding a post (61% of the cases in which users declare they wish to hide the post). (Chart: counts of reasons for hiding the post from friends: irrelevant, change, other, inappropriate, offend.)
  44. Life Changes: some life changes decrease the willingness to share (p < 0.05, Kruskal-Wallis test).
  45. Expiry Date for Information: a default expiration time of 1.5 years.
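A minimal sketch, with assumed field names, of how a 1.5-year default expiration could be enforced on posts while still letting a user override it per post; this is illustrative, not a mechanism proposed in the talk.

```python
# Applying a 1.5-year default expiration to posts (illustrative, assumed fields).
from datetime import datetime, timedelta

DEFAULT_EXPIRY = timedelta(days=int(1.5 * 365))  # the 1.5-year default from the slide

def is_visible(post, now=None):
    """post is a dict with 'created_at' (datetime) and an optional 'expires_at'."""
    now = now or datetime.now()
    expires_at = post.get("expires_at") or post["created_at"] + DEFAULT_EXPIRY
    return now < expires_at

posts = [
    {"created_at": datetime(2011, 1, 1)},                                      # expired by default
    {"created_at": datetime(2013, 1, 1)},                                      # still visible
    {"created_at": datetime(2010, 1, 1), "expires_at": datetime(2020, 1, 1)},  # user override
]
now = datetime(2013, 6, 1)
print([is_visible(p, now) for p in posts])  # [False, True, True]
```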
  46. Retrospective Mechanisms.
  47. What Would an Engineer Do?
  48. Privacy-by-Design: ‣ What should an engineer do? ‣ Classic PbD solutions such as k-anonymity and Differential Privacy are partial. ‣ A new engineering toolbox is needed for Info-Share systems: Privacy-by-Design: Identifying Gaps and Overcoming Barriers in the Intersection of Law and Engineering, an Israel Science Foundation project with Prof. Michael Birnhack (TAU) and Dr. Irit Hadar (Haifa).
  49. The Info-Share Toolbox: an extension of Langheinrich's Privacy-by-Design framework, spanning policy-based and architecture-based measures: privacy guarantee, data minimization, choice, notice and nudge, coaching, and recourse.
  50. Coaching: ‣ Ongoing project: Privacy-Peer-Pressure. ‣ Empowering users to coach their friends toward better privacy behavior. ‣ Allowing users to copy their friends, learn from their friends, and spread good ideas around.
  51. Notice. Patrick Gage Kelley et al., Privacy as Part of the App Decision-Making Process. CHI 2013. (Screenshot of a privacy label, http://cups.cs.cmu.edu/privacyLabel, summarizing the types of information an example site collects and how it uses and shares them.)
  52. Nudging. Wang, Yang, et al. "Privacy nudges for social media: an exploratory Facebook study." WWW'13, 2013.
  53. Choice.
  54. Theory: ‣ We need effective theory to think about privacy. ‣ Softer models of privacy: privacy by obscurity (Hartzog and Stutzman, 2012). ‣ Behavioral economic approaches (Acquisti's work). ‣ Models that analyze privacy within a social context (e.g., social capital and privacy, Ellison et al., 2011). ‣ Models that analyze the relations between code and norms.
  55. A New (Old) Metaphor.
  56. Acknowledgments: ‣ Israel Science Foundation ‣ Israel Ministry of Science ‣ Israel Cyber Bureau ‣ Tel Aviv University. http://toch.tau.ac.il/ erant@post.tau.ac.il