Designing for Usable Security and Privacy

Overview of key concepts in usable security and privacy for UX designers, chiefly: Threat modeling; Fair Information Practices; 3-pronged approach to usable security + privacy; Learning science principles; Communication-Human Information Processing model for warnings. Guest lecture in Programming Usable Interfaces, Spring 2020, Carnegie Mellon University.

Designing for Usable Security and Privacy

  1. 1. Designing 4 Security + Privacy Cori Faklaris April 15, 2020 Programming Usable Interfaces, Spring 2020 Human-Computer Interaction Institute
  2. 2. About me @heycori ● 3rd -year PhD researcher at Carnegie Mellon Univ. Human-Computer Interaction Institute, advised by Laura Dabbish and Jason I. Hong ○ M.S., Human-Computer Interaction, Indiana University School of Informatics and Computing ● Industry career in news + design, mainly at Indianapolis Star / IndyStar.com / Gannett ○ Engagement Producer, News Designer, Systems Analyst, Software Trainer, Copy Editor, Reporter, “Doer of Things No One Else Wants to Do” (IT, UX) ● Social Media Editor and Consultant Cori Faklaris - Carnegie Mellon University - Page 2
  3. 3. My research at Carnegie Mellon HCII 3Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 3
  4. 4. Agenda for this lecture 4 ● Why care about designing for usable security + privacy ● Differences between security and privacy ○ Pessimistic vs. optimistic orientation to security ○ Data privacy vs. personal privacy ● Three-pronged approach to usable security + privacy ○ Make it invisible (where possible) ○ Offer better user interfaces (affordances, mappings, mental models, etc) ○ Train users (where necessary) ● Research that makes use of this approach Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 4 Slides largely based on materials from Prof. Jason I. Hong - many thanks to him!
  5. 5. Agenda for this lecture 5 ● Why care about designing for usable security + privacy ● Differences between security and privacy ○ Pessimistic vs. optimistic orientation to security ○ Data privacy vs. personal privacy ● Three-pronged approach to usable security + privacy ○ Make it invisible (where possible) ○ Offer better user interfaces (affordances, mappings, mental models, etc) ○ Train users (where necessary) ● Research that makes use of this approach Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 5 Slides largely based on materials from Prof. Jason I. Hong - many thanks to him!
  6. 6. ‘Unusable’ security + privacy is all around us ... 6Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 6 What are some examples that you can think of?
  7. 7. ‘Unusable’ security + privacy is all around us ... 7Cori Faklaris - Designing for Usable Privacy and Security, April 15, 2020 - Carnegie Mellon University - Page 7 Taylor Lorenz. 2020. “Zoombombing”: When Video Conferences Go Wrong. The New York Times. Retrieved April 13, 2020 from https://www.nytimes.com/2020/03/20/style/zoombombing-zoom-trolling.html
  8. 8. ‘Unusable’ security + privacy is all around us ... 8Cori Faklaris - Designing for Usable Privacy and Security, April 15, 2020 - Carnegie Mellon University - Page 8 Also see https://www.bogleheads.org/forum/viewtopic.php?t=278973
  9. 9. ‘Unusable’ security + privacy is all around us ... 9Cori Faklaris - Designing for Usable Privacy and Security, April 15, 2020 - Carnegie Mellon University - Page 9 https://www.extremetech.com/extreme/262166-hawaiis-missile-scare-driven-terrible-ui-fcc-launches-investigation
  10. 10. ‘Unusable’ security + privacy is all around us ... 10Cori Faklaris - Designing for Usable Privacy and Security, April 15, 2020 - Carnegie Mellon University - Page 10 https://www.extremetech.com/extreme/262166-hawaiis-missile-scare-driven-terrible-ui-fcc-launches-investigation
  11. 11. Norman’s Gulfs of Evaluation + Execution 11 ● “Mismatch between our internal goals on the one side, and, on the other side, the expectations and the availability of information specifying the state of the world (or an artifact) and how we may change it.” Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 11 https://www.interaction-design.org/literature/book/the-glossary-of-human-computer-interaction/gulf-of-evaluation-and-gulf-of-execution https://medium.com/@gazdgabr/the-gulf-of-execution-and-evaluation-890fca716bb7
  12. 12. ‘You are not the user’ - experts vs. nonexperts 12Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 12 Iulia Ion, Rob Reeder, and Sunny Consolvo. 2015. “... No one Can Hack My Mind”: Comparing Expert and Non-Expert Security Practices. In Symposium on Usable Privacy and Security (SOUPS) 2015, 1–20. Retrieved from https://www.usenix.org/sites/default/files/soups15_full_proceedings.pdf#page=349 What do you do to keep your data and accounts safe?
  13. 13. Security actions differ for experts vs. nonexperts 13Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 13 Iulia Ion, Rob Reeder, and Sunny Consolvo. 2015. “... No one Can Hack My Mind”: Comparing Expert and Non-Expert Security Practices. In Symposium on Usable Privacy and Security (SOUPS) 2015, 1–20. Retrieved from https://www.usenix.org/sites/default/files/soups15_full_proceedings.pdf#page=349
  14. 14. Designers must address laws + regulations 14Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 14
  15. 15. IoT security + privacy tensions are multiplying 15Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 15
  16. 16. Consumers growing more wary about privacy 16 2015 Pew Research survey found: ● 60% of people chose not to install an app when they discovered how much personal info it required ● 43% uninstalled app after download, for the same reason Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 16 2015. Apps Permissions in the Google Play Store. Pew Research Center: Internet, Science & Tech. Retrieved April 14, 2020 from https://www.pewresearch.org/internet/2015/11/10/apps-permissions-in-the -google-play-store/
  17. 17. ‘Social’ cyberattacks rising with mobile usage 17 ● Verizon data: from 2013 to 2018, the number of cybersecurity breaches in which attackers used “social” methods increased from 17% to 35%. ● The involvement of human assets in these breaches rose from 19% to 39% over the same time period. Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 17 Results and Analysis, 2019 Verizon Data Breach Investigations Report, available at https://enterprise.verizon.com/resources/reports/dbir/2019/results-and-analysis/
  18. 18. Agenda for this lecture 18 ● Why care about designing for usable security + privacy ● Differences between security and privacy ○ Pessimistic vs. optimistic orientation to security ○ Data privacy vs. personal privacy ● Three-pronged approach to usable security + privacy ○ Make it invisible (where possible) ○ Offer better user interfaces (affordances, mappings, mental models, etc) ○ Train users (where necessary) ● Research that makes use of this approach Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 18
  19. 19. Security vs. Privacy - Different but intertwined 19Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 19 What do you think is the difference between them?
  20. 20. Security vs. Privacy - Different but intertwined 20 ● Security ○ “CIA” model: confidentiality, integrity, availability - originally, for guarding information ○ New desired properties emerging (ex. safety) Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 20 https://cryptiot.de/iot/security/security-solution-iot-com-protocol/
  21. 21. Security vs. Privacy - Different but intertwined 21 ● Security ○ Nowadays, many people talk about security more as a process, tied to a particular use context (workgroups vs. publics), than as a binary state Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 21 https://cryptiot.de/iot/security/security-solution-iot-com-protocol/ Still might not be secure?
  22. 22. Security vs. Privacy - Different but intertwined 22 ● Security ○ Nowadays, many people talk about security more as a process, tied to a particular context (workgroups vs. publics), than as a binary state Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 22 https://cryptiot.de/iot/security/security-solution-iot-com-protocol/ Still might not be secure? Users’ Security Attitudes + Recalled Security Actions ● Cori Faklaris, Laura Dabbish and Jason I. Hong. 2019. A Self-Report Measure of End-User Security Attitudes (SA-6). In Proceedings of the Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019). USENIX Association, Berkeley, CA, USA. Available at: https://www.usenix.org/system/files/soups2019-faklaris.pdf Users’ Security Behavior Intentions ● Serge Egelman and Eyal Peer. 2015. Scaling the Security Wall: Developing a Security Behavior Intentions Scale (SeBIS). In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI ’15). Association for Computing Machinery, New York, NY, USA, 2873–2882. DOI: https://doi.org/10.1145/2702123.2702249
  23. 23. Security vs. Privacy - Different but intertwined 23 ● Privacy ○ Security necessary but not sufficient for privacy ○ Generally, appropriate use of sensitive data (& same data could also be used inappropriately, which makes this tricky!) ■ Personal privacy: Perception, how users feel, manage their data and devices ■ Data privacy: How orgs handle personal data ○ Subjectively defined, difficult to measure Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 23 https://support.apple.com/en-us/HT208650
  24. 24. Security vs. Privacy - Different but intertwined 24 ● Privacy ○ Security necessary but not sufficient for privacy ○ Generally, appropriate use of sensitive data (& same data could also be used inappropriately, which makes this tricky!) ■ Personal privacy: Perception, how users feel, manage ■ Data privacy: How orgs handle personal data ○ Subjectively defined, difficult to measure Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 24 https://support.apple.com/en-us/HT208650 Very Short Primer for Conceptualizing Tech + Privacy ● Brandeis’ “right to be left alone” from time of photography’s introduction, established US privacy standard (https://en.wikipedia.org/wiki/The_Right_to_Privacy_(article)) ● Altman’s Privacy Regulation Theory articulates five dimensions, such as desired vs. actual privacy, bi-directional nature (https://en.wikipedia.org/wiki/Privacy_regulation_theory) ● Altman’s work is adapted for HCI in Leysia Palen and Paul Dourish. 2003. Unpacking “privacy” for a networked world. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’03), 129–136. https://doi.org/10.1145/642611.642635
  25. 25. Threat modeling is important in security design 25 ● What are you trying to protect? ● How important is it to you? ● How much are you willing to spend to protect it? ● Who are you concerned about? ○ Honest but curious, prankers, ex-partners, ex-coworkers, script kiddies, cybercriminals, insider attack, nation state ● How will they attack you? Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 25 https://www.microsoft.com/en-us/securityengineering/ sdl/threatmodeling
  26. 26. Threat modeling is important in security design 26 ● What are you trying to protect? ● How important is it to you? ● How much are you willing to spend to protect it? ● Who are you concerned about? ○ Honest but curious, prankers, ex-partners, ex-coworkers, script kiddies, cybercriminals, insider attack, nation state ● How will they attack you? Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 26 https://www.microsoft.com/en-us/securityengineering/ sdl/threatmodeling
  27. 27. Threat modeling is important in security design 27 ● What are you trying to protect? ● How important is it to you? ● How much are you willing to spend to protect it? ● Who are you concerned about? ○ Honest but curious, prankers, ex-partners, ex-coworkers, script kiddies, cybercriminals, insider attack, nation state ● How will they attack you? Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 27 https://www.microsoft.com/en-us/securityengineering/ sdl/threatmodeling
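The threat-modeling questions on this slide can be captured as a simple checklist structure. Below is a minimal, hypothetical sketch; the class name and fields are my own illustration, not part of Microsoft's SDL tooling or any standard threat-modeling framework:

```python
# Hypothetical record of the answers to the slide's threat-modeling questions.
# All names and example values here are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ThreatModel:
    asset: str                  # What are you trying to protect?
    importance: str             # How important is it to you?
    budget: str                 # How much are you willing to spend to protect it?
    adversaries: List[str] = field(default_factory=list)     # Who are you concerned about?
    attack_vectors: List[str] = field(default_factory=list)  # How will they attack you?

# Example of filling one out for a hypothetical system:
model = ThreatModel(
    asset="customer payment records",
    importance="critical",
    budget="moderate",
    adversaries=["cybercriminals", "insider attack"],
    attack_vectors=["phishing", "credential stuffing"],
)
print(model.adversaries)
```

Writing the answers down this way makes the "who" explicit, which is what drives the choice between the pessimistic (prevent) and optimistic (detect + respond) approaches discussed on the following slides.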
  28. 28. Security practices - Experts vs. nonexperts 28Cori Faklaris - Designing for Usable Privacy and Security, April 15, 2020 - Carnegie Mellon University - Page 28
  29. 29. Threat model will help determine your approach 29 ● Prevent problems from happening ○ Ex. Access control, firewalls, IP blocking, blacklists ○ Ex. Better programming tools, better OS ○ Ex. Require strong passwords or 2FA, user training Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 29 ● Detect + respond to problems after the fact ○ Ex. Intrusion detection systems (machine learning) ○ Ex. Takedown of malicious posts, call the FBI ○ Ex. Notifying users of logins on new devices PESSIMISTIC OPTIMISTIC
  30. 30. Tradeoff - ‘wall out’ harm vs. ‘open door’ policy 30 ● Choose prevention when possible if needs high enough ○ Ex. CMU payroll system ● Can be hard to figure out all cases beforehand ● (-) Cost can be high to make sure you got it right Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 30 ● Choose when access is paramount & you trust people ○ Ex. Hospitals need access to supplies, assume wise usage ● Cost to fix problems is cheap ○ Ex. Wikipedia revert ○ (-) User frustration/trauma ● Configuration costs can be lower PESSIMISTIC OPTIMISTIC
  31. 31. Security practices - Experts vs. nonexperts 31Cori Faklaris - Designing for Usable Privacy and Security, April 15, 2020 - Carnegie Mellon University - Page 31
  32. 32. Data privacy is different than personal privacy 32 ● Primarily about how orgs collect, use, and protect sensitive data, beyond a single product or service ● Focuses on Personally Identifiable Information (PII) ○ Ex. Name, address, unique IDs, pictures ● Rules about data use, privacy notices Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 32 https://www.trulioo.com/blog/managing-personally-identifiable-information/
  33. 33. Data privacy is different than personal privacy 33 ● Even more procedurally oriented than personal privacy ○ Did you follow this set of rules? ○ Did you check off all of the boxes? ● Contrast to outcome- oriented, hard to measure too (Better? Worse?) Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 33 https://www.trulioo.com/blog/managing-personally-identifiable-information/
  34. 34. Fair Information Practices (FIPs) - FTC version 34 1. Notice / Awareness 2. Choice / Consent 3. Access / Participation 4. Integrity / Security 5. Enforcement / Redress https://en.wikipedia.org/wiki/FTC_fair_ information_practice#cite_note-FIPNot ice-10 Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 34
  35. 35. Fair Information Practices (FIPs), continued 35 ● Many laws embody the Fair Information Practices ○ GDPR, CCPA, HIPAA, Financial Privacy Act, COPPA, FERPA ● But, enforcement is a weakness here ○ If an org violates, can be hard to detect ○ In practice, limited resources for enforcement Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 35
  36. 36. IoT security + privacy tensions multiplying … 36Cori Faklaris - Designing for Usable Privacy and Security, April 15, 2020 - Carnegie Mellon University - Page 36 Keyword Team. 2020. Apple and Google partner on COVID-19 contact tracing technology. Google. Retrieved April 14, 2020 from https://blog.google/inside-google/company-announcements/apple-and-google-partner-covid-19-contact-tracing-technology/
  37. 37. 37
  38. 38. Agenda for this lecture 38 ● Why care about designing for usable security + privacy ● Differences between security and privacy ○ Pessimistic vs. optimistic orientation to security ○ Data privacy vs. personal privacy ● Three-pronged approach to usable security + privacy ○ Make it invisible (where possible) ○ Offer better user interfaces (affordances, mappings, mental models, etc) ○ Train users (where necessary) ● Research that makes use of this approach Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 38
  39. 39. 3-prong approach to usable security + privacy 39Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 39 1. Make it invisible (where possible) 2. Offer better user interfaces (affordances, mappings, mental models, etc) 3. Train users (where necessary) https://www.youtube.com/watch?v=p03TIGqEc8o
  40. 40. 3-prong approach to usable security + privacy 40Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 40 1. Make it invisible (where possible) 2. Offer better user interfaces (affordances, mappings, mental models, etc) 3. Train users (where necessary) My Work
  41. 41. Good ‘invisible’ security means user is weak pt 41Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 41
  42. 42. Security focus shifts to UX solutions and training 42Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 42
  43. 43. User education is a challenge (pessimistic view) 43 ● Users are not motivated to learn about security ● Security is a secondary task ● Difficult to teach people to make right online trust decision without increasing false positives Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 43 “User education is a complete waste of time. It is about as much use as nailing jelly to a wall…. They are not interested…they just want to do their job.” Martin Overton, IBM security specialist http://news.cnet.com/21007350_361252132.html
  44. 44. User education is a challenge in this work 44 ● Users are not motivated to learn about security ● Security is a secondary task ● Difficult to teach people to make right online trust decision without increasing false positives ● “User education is a complete waste of time. It is about as much use as nailing jelly to a wall…. They are not interested…they just want to do their job.” - Martin Overton, IBM security specialist http://news.cnet.com/21007350_361252132.html Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 44
  45. 45. Actually, users ARE trainable (optimistic view) 45 ● Users want to keep themselves - and those they care about - safe ● Users can learn to protect themselves from phishing… if you can get them to pay attention to training ○ Create “teachable moments” ○ Make training fun ○ Use learning science principles Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 45 Ponnurangam Kumaraguru, Steve Sheng, Alessandro Acquisti, Lorrie Faith Cranor, and Jason Hong. 2010. Teaching Johnny not to fall for phish. ACM Trans. Internet Technol. 10, 2, Article 7 (June 2010), 31 pages. DOI: https://doi.org/10.1145/1754393.1754396
  46. 46. Nova Cybersecurity Lab and Game 46Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 46 https://www.pbs.org/wgbh/nova/labs/lab/cyber/ Great example of creating “teachable moments” and also injecting light-hearted humor and design with the simple game mechanics and lessons.
  47. 47. Apps vs. Hackers (Ongoing research) 47Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 47 https://apps-vs-hackers.firebaseapp.com/classic Adapting “Plants vs. Zombies” game to a cybersecurity context
  48. 48. Hacked Time (Ongoing research) 48Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 48 http://www.tianyingchen.com/hackedtime/ Choose Your Own Adventure, based in Self-Efficacy Theory, narrative immersion
  49. 49. These make use of principles to boost learning 49 ● Learning by doing – like our labs, get hands on practice ● Immediate feedback – better quickly than later ● Conceptual-procedural – Interleave abstract principles with concrete examples (like we’re doing right now!) Help people understand the principle, and offer examples to help people understand specifics, then back to principle to generalize ● Reflection – thinking about why you did something helps with retention (which is why we have this for homeworks) ● Multimedia – images, text, sound Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 49
  50. 50. User studies help evaluate learning outcomes 50 ● Evaluation of PhishGuru system - is embedded training effective? ○ Study 1: Lab study, 30 participants ○ Study 2: Lab study, 42 participants ○ Study 3: Field trial at company, ~300 participants ○ Study 4: Field trial at CMU, ~500 participants ● Studies showed statistically significant decrease in falling for phish, increased ability to retain what they learned Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 50 Ponnurangam Kumaraguru, Yong Rhee, Alessandro Acquisti, Lorrie Faith Cranor, Jason Hong, and Elizabeth Nunge. 2007. Protecting people from phishing: the design and evaluation of an embedded training email system. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’07). Association for Computing Machinery, New York, NY, USA, 905–914. DOI:https://doi.org/10.1145/1240624.1240760
  51. 51. Good interfaces for security + privacy are hard! 51 ● Lots of security terminology ○ Ex. You have digital keys to “encrypt” things ○ Ex. You can also use digital keys to sign things ● Lots of complexity ○ Ex. Might have multiple sharing policies ○ Ex. Some tasks might need to be harder to prevent attacks (account creation) ● Security is a secondary task ○ Ex. You don’t go to Dropbox to do security Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 51
  52. 52. 52
  53. 53. 53 Sauvik Das, Gierad Laput, Chris Harrison, and Jason I. Hong. 2017. Thumprint: Socially-Inclusive Local Group Authentication Through Shared Secret Knocks. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17). Association for Computing Machinery, New York, NY, USA, 3764–3774. DOI: https://doi.org/10.1145/3025453.3025991
  54. 54. Communication-Human Information Processing Model 54 ● See the warning? ● Understand? ● Believe it? ● Motivated? ● Can and will act? Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 54 Serge Egelman, Lorrie Faith Cranor, and Jason Hong. 2008. You’ve been warned: an empirical study of the effectiveness of web browser phishing warnings. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1065–1074. https://doi.org/10.1145/1357054.1357219 Make a recommendation, but leave it to the user to act
  55. 55. Agenda for this lecture 55 ● Why care about designing for usable security + privacy ● Differences between security and privacy ○ Pessimistic vs. optimistic orientation to security ○ Data privacy vs. personal privacy ● Three-pronged approach to usable security + privacy ○ Make it invisible (where possible) ○ Offer better user interfaces (affordances, mappings, mental models, etc) ○ Train users (where necessary) ● Research that makes use of this approach Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 55
  56. 56. Communication-Human Information Processing Model 56 ● See the warning? ● Understand? ● Believe it? ● Motivated? ● Can and will act? Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 56 Serge Egelman, Lorrie Faith Cranor, and Jason Hong. 2008. You’ve been warned: an empirical study of the effectiveness of web browser phishing warnings. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1065–1074. https://doi.org/10.1145/1357054.1357219
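The C-HIP stages behave like a leaky pipeline: a warning only changes behavior if the user clears every stage in sequence. A minimal sketch of that idea, using the stage names from the slide (the function and the example data are hypothetical illustrations, not from the Egelman et al. study):

```python
# C-HIP as a sequential filter: failure at any stage blocks the final action.
# Stage names follow the slide; this helper is a hypothetical illustration.
CHIP_STAGES = ["see", "understand", "believe", "motivated", "can_and_will_act"]

def warning_effective(stage_results):
    """Return True only if the user passes every C-HIP stage."""
    return all(stage_results.get(stage, False) for stage in CHIP_STAGES)

# A user who sees and understands a phishing warning but does not believe it:
results = {"see": True, "understand": True, "believe": False,
           "motivated": True, "can_and_will_act": True}
print(warning_effective(results))  # prints False
```

This is why warning designers diagnose failures stage by stage: a perfectly visible, well-worded warning still fails if users do not believe it applies to them or cannot act on it.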
  57. 57. SA-6 Measures a User’s Security Attitude 57Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 57 Cori Faklaris, Laura Dabbish and Jason I. Hong. 2019. A Self-Report Measure of End-User Security Attitudes (SA-6). In Proceedings of the Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019). USENIX Association, Berkeley, CA, USA. Available at: https://www.usenix.org/system/files/soups2019-faklaris.pdf On a scale of 1=Strongly Disagree to 5=Strongly Agree, rate your level of agreement with the following: ● Generally, I diligently follow a routine about security practices. ● I always pay attention to experts’ advice about the steps I need to take to keep my online data and accounts safe. ● I am extremely knowledgeable about all the steps needed to keep my online data and accounts safe. ● I am extremely motivated to take all the steps needed to keep my online data and accounts safe. ● I often am interested in articles about security threats. ● I seek out opportunities to learn about security measures that are relevant to me.
  58. 58. SA-6 Measures a User’s Security Attitude 58Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 58 Cori Faklaris, Laura Dabbish and Jason I. Hong. 2019. A Self-Report Measure of End-User Security Attitudes (SA-6). In Proceedings of the Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019). USENIX Association, Berkeley, CA, USA. Available at: https://www.usenix.org/system/files/soups2019-faklaris.pdf On a scale of 1=Strongly Disagree to 5=Strongly Agree, rate your level of agreement with the following: ● Generally, I diligently follow a routine about security practices. ● I always pay attention to experts’ advice about the steps I need to take to keep my online data and accounts safe. ● I am extremely knowledgeable about all the steps needed to keep my online data and accounts safe. ● I am extremely motivated to take all the steps needed to keep my online data and accounts safe. ● I often am interested in articles about security threats. ● I seek out opportunities to learn about security measures that are relevant to me. TAKE THE QUIZ AT http://bit.ly/sa6quiz
  59. 59. SA-6 Measures a User’s Security Attitude 59Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 59 Cori Faklaris, Laura Dabbish and Jason I. Hong. 2019. A Self-Report Measure of End-User Security Attitudes (SA-6). In Proceedings of the Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019). USENIX Association, Berkeley, CA, USA. Available at: https://www.usenix.org/system/files/soups2019-faklaris.pdf On a scale of 1=Strongly Disagree to 5=Strongly Agree, rate your level of agreement with the following: ● Generally, I diligently follow a routine about security practices. ● I always pay attention to experts’ advice about the steps I need to take to keep my online data and accounts safe. ● I am extremely knowledgeable about all the steps needed to keep my online data and accounts safe. ● I am extremely motivated to take all the steps needed to keep my online data and accounts safe. ● I often am interested in articles about security threats. ● I seek out opportunities to learn about security measures that are relevant to me. SEE RESPONSES AT http://bit.ly/sa6charts
  60. 60. How to Use the SA-6 Psychometric Scale 60Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 60 Cori Faklaris, Laura Dabbish and Jason I. Hong. 2019. A Self-Report Measure of End-User Security Attitudes (SA-6). In Proceedings of the Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019). USENIX Association, Berkeley, CA, USA. Available at: https://www.usenix.org/system/files/soups2019-faklaris.pdf Answer practical research questions such as: ● How attentive to security advice is a certain user group likely to be? ● Does a new awareness campaign or usability tool help or hurt a user’s attitude toward security compliance? Conduct theory-motivated research on human factors: ● Measure attitude in Elaboration Likelihood Model ● Measure motivation in Self-Determination Theory ● Measure coping appraisal in Protection Motivation Theory
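Scoring the six items above can be sketched in a few lines. Averaging the item ratings is a common convention for short Likert scales, but this helper is my own illustration; consult the SOUPS 2019 paper for the authors' exact scoring procedure before using it in research:

```python
# Hypothetical helper for scoring one respondent's SA-6 answers.
# Each of the six items is rated 1 (Strongly Disagree) to 5 (Strongly Agree);
# this sketch averages them, a common (but here assumed) scoring convention.

def score_sa6(ratings):
    """Return the mean of six 1-5 Likert ratings."""
    if len(ratings) != 6:
        raise ValueError("SA-6 requires exactly six item ratings")
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("each rating must be between 1 and 5")
    return sum(ratings) / 6

print(score_sa6([4, 3, 2, 4, 5, 3]))  # prints 3.5
```

A single number per respondent makes it easy to compare groups (e.g., before and after an awareness campaign) in the ways the slide describes.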
  61. 61. Social Contexts of Security Behavior 61Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 61 Yunpeng Song, Cori Faklaris, Zhongmin Cai, Jason I. Hong, and Laura Dabbish. 2019. Normal and Easy: Account Sharing Practices in the Workplace. In Proceedings of the ACM: Human-Computer Interaction, Vol. 3, Issue CSCW, November 2019. ACM, New York, NY, USA. Available at: https://drive.google.com/file/d/17xb07vuKjPrgoKNzBSGouTgqNNEeACF0/view Workplace cybersecurity: Sharing accounts and devices to collaborate on tasks and to keep costs down. ● Workarounds are norm (ex: password taped to PC) ● Difficult to share and to control access with systems that presume one user at a time ● Lack of accountability and awareness of one person’s activities by others
  62. 62. Social Contexts of Security Behavior 62Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 62 Cheul Young Park, Cori Faklaris, Siyan Zhao, Alex Sciuto, Laura Dabbish and Jason I. Hong. 2018. Share and Share Alike? An Exploration of Secure Behaviors in Romantic Relationships. In Proceedings of the Fourteenth Symposium on Usable Privacy and Security (SOUPS 2018). USENIX Association, Berkeley, CA, USA. Available at: https://www.usenix.org/system/files/conference/soups2018/soups2018-park.pdf Romantic cybersecurity: Sharing accounts and devices as relationships and households form and while working through the end of a relationship. ● Account sharing is both functional and emotional ● Usability challenges for romantic couples that share accounts and devices (such as 2FA tied to only one person’s device, breakups lead to data breaches)
  63. 63. 63Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 63 Safesea browser plugin for Google Chrome ● Helps Facebook users navigate privacy and security settings. ● Displays crowd and expert suggestions for settings. Social Contexts of Security Behavior
  64. 64. Social Contexts of Security Behavior 64Cori Faklaris - Designing for Usable Security and Privacy, April 15, 2020 - Carnegie Mellon University - Page 64 ‘Fitness’ Tracking for cybersecurity: Could be used for contests or for sharing and displaying behavior changes, just like with physical fitness tracking
  65. 65. Key takeaways for design ● Threat modeling - pay attention to “who” ○ Prevent Problems vs. Detect + Respond ○ Personal privacy vs. data privacy ● Fair Information Practices ● 3-pronged approach to usable security + privacy ○ Make it invisible, Better UIs, Train ● Learning science principles ● C-HIP model for warnings ○ Useful for non-security warnings too! Cori Faklaris - Carnegie Mellon University - Page 65
  66. 66. Key takeaways for YOU ● Use a password manager & install all legit software updates ● Sense of urgency is probably fake ● You’re not too smart to get fooled ● DON’T CLICK ANYTHING (google) ● Choose security questions that aren’t easily guessable ● No free lunch Cori Faklaris - Carnegie Mellon University - Page 66
  67. 67. 67 https://socialcybersecurity.org
