UX Edinburgh Meetup deck - Privacy UX - March 2024.pdf
Privacy UX
UX Meetup | March 2024
Privacy as human and user experience
Gintare Venzlauskaite
gintare@uservision.co.uk
Duration: 60 min
A bit about me
► UX Consultant at User Vision
► Teacher in Higher Education
  • Research methods
  • Critical thinking skills
  • History of the Soviet Union
► Social & political science researcher
  • Forced migration, diaspora
  • Collective memory, cultural trauma
  • Data brokers, anonymity, and digital privacy
► Privacy enthusiast (but not an expert!)
What I am going to talk about
► Why privacy matters
► Current concerns around digital privacy
► How responses to privacy concerns are evolving
► What design and UX have to do with digital privacy
► Principles of Privacy UX
► How to practice privacy personally and professionally
► What's on the horizon for digital privacy
Definitions
Privacy [prahy-vuh-see; British also priv-uh-see]
► Someone's right to keep their personal matters and relationships secret (Cambridge Dictionary).
► The right to be let alone, or freedom from interference or intrusion (IAPP).
Informational privacy
► The right to have some control over how your personal information is collected and used, and under what circumstances (IAPP; Law Insider).
Right to privacy
► Privacy is among the human rights enshrined in the Universal Declaration of Human Rights, the European Convention on Human Rights, and the EU Charter of Fundamental Rights (European Data Protection Supervisor).
Your relationship with digital privacy
Please scan this QR code to enter the Mentimeter questionnaire
Even when you think you've done it all, they still get you…
Sitcom "Frasier"
Users as data generators
How much data do we generate?
► According to DOMO (2020), an average internet user generates 1.7 MB of data per second
How do we generate data?
► The data trail accumulates through our interactions with the digital technologies we use, wear, register for, check in at, etc.
Users as data generators
What happens to our data?
► Data can be harvested through tracking, scraping, and proxies, then mined and aggregated
► It benefits us: efficiency, convenience, discounts, accuracy, scientific development
So what?
► It makes us data subjects whose information is collected and used for profit
► This data is monetised by first parties and may also be sold to third parties, e.g. data brokers – companies who then sell it on to other market actors
► It makes us vulnerable: the real-life short- and long-term implications are difficult to anticipate, grasp, or control
Digital privacy as human experience
Real-life consequences of the data economy
► Individual (e.g., exploited vulnerabilities, denial of opportunities, physical and digital harassment, identity theft)
► Socio-political impact (e.g., targeted political advertising, social polarisation)
► National security (e.g., cyber attacks, surveillance)
Privacy concerns, awareness and regulations are on the rise
Legislation
► GDPR
► UK Data Protection Act
► US state privacy laws
► 71% of countries have some kind of legislation around data protection and/or privacy
Research
► Academic literature
► Journalism
► Guidelines and advice for users
Products and services
► Browsers
► Automatic opt-out engines
► Removal of data from commercial and data broker databases
► Tracker/cookie blockers
► Data breach alerts
► Product reviews
Recent fines related to privacy violations
► $275 mil. for collecting personal information from children under 13
► $245 mil. for deploying deceptive design that drove unintended purchases
► €746 mil. for tracking without consent
► $30 mil. for Alexa/Ring invasion of privacy
► €159 mil. for not giving users an easy way to refuse cookies on Google and YouTube
► $50 mil. for poorly structured privacy consent agreements
► $7.8 mil. for sharing users' sensitive data with outside companies
► £12.7 mil. for illegally processing data of children under 13 without checking for parental consent
► €345 mil. for breaking EU data law on children's accounts
► €60 mil. for not giving users an easy way to refuse cookies on Facebook
► €225 mil. for unclear privacy policies and lack of transparency
► €405 mil. for mishandling teenagers' data on Instagram
► €390 mil. for forcing consent
► €265 mil. for exposure to data scraping
► €1.2 bil. for unlawful transfer of EU citizens' data to the US
Sources: Termly; TechCrunch; US Federal Trade Commission
Data privacy concerns on popular TV
► Last Week Tonight with John Oliver on data brokers
► Parks and Recreation on cookies
► Documentary exploring social media through the case of Cambridge Analytica
► The final season of "Succession" dropping a few references to the data economy, people's lack of understanding of how it works, and business overtures to bypass governmental regulation
► Black Mirror, season 6 (2023), on the implications of sneaky T&Cs
It is hard to know what you don't know
The internet and technology have not been kind to our privacy
► Tech companies are never explicit or specific about whether and how our personal and behavioural data will be used
► Companies update their privacy notices all the time
► The data economy prompts practices aimed at collecting more data
► The limits of our knowledge cause digital resignation
► Putting so much of the responsibility on individuals is unfair (Daniel Solove)
It is hard to know what you don't know
Design is part of the problem
► Design contributes to maximising data collection
► Difficult language, lack of visibility, ambiguous choices, deceptive patterns and a lack of user-focused controls are all utilised, and in turn design becomes part of the problem
UX design paradox
► The UX field prides itself on its human-centred philosophy and commitment to serving user needs through an optimal experience
► If UX designers are involved in privacy-compromising design, that's problematic
► Balancing business needs and user needs in a data-driven world can lead well-meaning teams into the grey area of deceptive design practices
Question time
Do you ever take note of privacy-related interfaces and designs?
Feel free to give examples.
What do you think makes a design privacy-unfriendly?
Feel free to give examples.
Privacy UX
Privacy UX generally refers to:
► An approach promoting privacy by design and by default
► Practices bridging data privacy protection regulations and their translation into user-friendly interfaces, journeys and content
► The application of UX principles to the user-facing parts of products and services relevant to privacy by:
  • being transparent
  • tailoring explanations
  • providing options
  • enabling users to make meaningfully informed decisions about what they share/provide/act upon
► An acknowledgement that privacy is an important element of user experience (as broadly understood) and should not be exploited through deceptive design practices
Deceptive patterns
Definition
► Deceptive patterns (also known as "dark patterns") are tricks used in websites and apps that make you do things that you didn't mean to, like buying or signing up for something (deceptive.design).
Deceptive patterns relevant to privacy
► Preselection
► Forced action
► Privacy zuckering
► Confirm-shaming
► Hard to cancel
► Visual interference
► Decisional interference
► Hindering
Deceptive patterns relevant to privacy
Preselection
Employing defaults: options that are already chosen for you.
Most often refers to preselected boxes or steps, which manipulate people's awareness; if users don't notice them, their choice and autonomy are undermined (Deceptive Design)
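The contrast with privacy by default can be made concrete with a toy model. This is an illustrative sketch, not from the deck; the `ConsentForm` class and the purpose names are hypothetical. The point is that pre-ticked optional purposes turn user inaction into "consent", while unticked defaults share nothing unless the user acts.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentForm:
    # Maps each optional purpose (e.g. "marketing") to its default state.
    purposes: dict = field(default_factory=dict)

    def granted(self):
        """Purposes the user ends up consenting to if they change nothing."""
        return [purpose for purpose, on in self.purposes.items() if on]


# Preselection: boxes arrive pre-ticked, so doing nothing becomes "consent".
preselected = ConsentForm({"marketing": True, "third_party_sharing": True})

# Privacy by default: inaction shares nothing; consent requires a deliberate act.
privacy_by_default = ConsentForm({"marketing": False, "third_party_sharing": False})

print(preselected.granted())         # ['marketing', 'third_party_sharing']
print(privacy_by_default.granted())  # []
```

The two forms offer identical choices; only the defaults differ, which is exactly why preselection undermines autonomy when users do not notice it.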
Deceptive patterns relevant to privacy
Forced action
Transactional pressure to choose an option that is better for the provider in return for something the user wants.
It can also come in the form of "bundled consent", whereby in agreeing to or providing something mandatory, the user also agrees to something else (Deceptive Patterns).
Source: Luiza Jarovsky @Privacy Whisperer
Deceptive patterns relevant to privacy
Privacy zuckering
Linked to forced action, this refers to tricking users
into sharing more personal information than
intended (e.g., photos, address, phone number,
contacts, preferences, date of birth)
Source: Uxcel
Deceptive patterns relevant to privacy
Confirm-shaming
Influencing the user's decision-making by triggering uncomfortable emotions around the choice that would be more beneficial for them (Deceptive Patterns).
Source: The Mobiversal blog
Deceptive patterns relevant to privacy
Hard to cancel or opt out (roach motel)
Typically, it is easy to subscribe or sign up to a service but very difficult to cancel.
This can cause user resignation, leaving people with the service longer than intended (Deceptive Patterns).
Source: noyb.eu
Deceptive patterns relevant to privacy
Visual interference
Purposefully hiding, disguising or obscuring choices through lower contrast, smaller text or the reduced prominence of associated design components (Deceptive Patterns).
On cookie banners
► Introduced to give users the opportunity to consent to tracking, and companies a way to obtain that consent
► Criticised for becoming a pain point: a barrier between site visitors and the content they are interested in
► Have a bad reputation for deceptive design practices
► A good example of the user- versus human-experience dilemma (quick acceptance removes the barrier but contributes to long-term harm)
Deceptive patterns relevant to privacy
Decisional interference
Limited or absent choices or alternatives that would otherwise be preferable to an individual (Privacy Wiki).
Deceptive patterns relevant to privacy
Hindering
Delaying, hiding or making it difficult for the user to adopt privacy-protective actions (difficult rejection, complex settings, hidden opt-out, deletion or revocation of consent, privacy-negligent defaults).
Demo: 12 minutes of unchecking "legitimate interest" boxes on a cookie banner (I gave up and left the website)
Deceptive patterns relevant to privacy
Hindering is common in privacy settings
► There is rarely one place where you can control all privacy-related settings
► While Meta has now refreshed its privacy controls, it is still challenging to navigate and manage them with ease
Example: adjusting audience-based ad settings on Facebook. It is possible to refuse to be shown ads from particular advertisers, but I need to opt out one advertiser at a time instead of being able to disallow targeting from all of them.
Similarly, imagine you discovered that there is a way of severing ties with the advertisers that share your off-Facebook activity with Facebook in order to target you with ads.
There is no "select all" or "disconnect" option.
Compliant does not mean user- or human-friendly
The above examples show that laws and regulations are not enough (at least not in their current form)
► Just because a company complies does not mean that it does so with the user's best interests at heart
► How well your data and privacy are legally protected depends on where you live
► It is hard to check whether a company honours your choices
► For big companies, paying the penalty bill may be seen as a smaller price than the profits they get from commodifying people's data

Businesses are doing their best to make money, which means that loopholes are likely to be exploited, including in the form of deceptive design.
– Harry Brignull
– Richie Koch, Proton blog
Question time
So, shouldn't designers know better?
What do you think are the challenges UX professionals face when it comes to creating with privacy in mind?
Have you ever received any training around data privacy (either in education and/or your organisation)?
Feel free to give examples.
Challenges of practicing Privacy UX
What limits designers
► Lack of privacy-forward requirements and direction from the upper echelons of the organisation/company
► Lack of voice in decision-making
► Lack of awareness of the relevant regulations
► Lack of education and training
► Lack of collaboration with other privacy stakeholders (lawyers, engineers)
► Lack of guidelines and standardised practices
► User-unfriendly legal documents
Your priva-Cs: how to practice Privacy UX
Choice · Clarity · Control · Consciousness · Curiosity · Championing · Cautioning · Compliance · Collaboration · Contributing
Adapted from IAPP; Fairpatterns; ICO; Alexandra Schmidt's book "Deliberate Intervention […]"
Your priva-Cs: how to practice Privacy UX (in words)
On a professional level:
► Be aware of data privacy regulations and your organisation's policies
► Be curious about privacy-forward solutions, privacy-friendly businesses, guidelines, practices, tips and advice
► Be a champion in your personal and professional circles: if you know something, spread the word
► Caution the team when you anticipate a potentially harmful (or unlawful) design practice. Seek support or report malpractice
► Contribute to the development and sharing of good practice
► Seek collaboration with other teams (engineering, legal, etc.)
On a practical level, when it comes to hands-on work:
► Provide users with a choice, and inform them that they have a choice (e.g., to opt out) in a meaningful way
► Present options in a balanced and fair manner (avoid framing one choice over another)
► Users should be able to find, understand and control their privacy settings. Having control ensures autonomy. These controls should be granular and easy to manage
► Ensure users can revoke consent or opt-ins
► Formulate privacy-related interfaces and content clearly
► Websites/apps should inform users about purposes and choices in a clear fashion
► Be transparent about how and why data is collected, and what the value of the transaction is
Adapted from IAPP; Fairpatterns; ICO; Alexandra Schmidt's book "Deliberate Intervention […]"
Question time
What do you know about changes in the data privacy landscape?
Regulations? Tools? Projects? Approaches? Innovations?
Are you hopeful or doubtful?
Is privacy too hard to do?
Developments in sight signal hope
Growing consumer awareness
► An ICO survey in 2022 showed that 90% of respondents were concerned about organisations using their personal information without their permission
► KPMG in 2021 revealed that 86% of respondents reported growing concerns about their data privacy
► 63% of consumers worldwide think companies aren't honest about how they use their personal information, and nearly 48% have stopped purchasing from companies because of privacy worries (Tableau)
Developments in sight signal hope
Emerging laws and regulations
► Steady and continuous growth in global privacy laws, by 10% every year
  • There are at least 157 countries with some kind of data privacy law, 50% more than 10 years ago
  • In the US, the number of state privacy laws is also growing
  • Further development of regulations on deceptive patterns (India banned them in 2023)
► The EU's AI Convention
(Source: IAPP)
Developments in sight signal hope
Growing business awareness
► We see more pledges of commitment to privacy
► Privacy is increasingly seen as the next component of the ESG framework for gaining and maintaining consumer trust and loyalty (Brian Lesser)
► Companies see an average return on investment of 1.8x on their privacy-related expenditures, and 92% acknowledge they have a moral obligation to use consumer data honestly and transparently (Cisco)

"Businesses now understand that if they want to keep the customers they have and attract new opportunities, they're going to have to sell privacy as part of what they are offering."
– Ann Cavoukian
Developments in sight signal hope
Adoption of standards and initiatives for guidelines
► ISO adopted a privacy-by-design standard (ISO 31700, February 2023)
► Draft report with opinions and advice on cookie banners (EU)
► Good-practice initiative for cookie banner consent management (Germany)
► Privacy-enhancing design heuristics
► New rules for apps to boost consumer security and privacy (UK)
► ICO and CMA joint position paper on harmful design in digital markets, and a stakeholder workshop on design guidelines
Developments in sight signal hope
Demand for privacy professionals
► The privacy engineering field is growing
► Efforts to define the field, as well as the roles and concepts for privacy by design
► More courses, programmes and literature focusing on how to protect personal data in the digital environment
Developments in sight signal hope
Communities and services of practice
► We see more initiatives, individuals and communities dedicated to advocating for privacy
► More examples of privacy-by-design services, off-the-shelf solutions, and consulting firms providing fairer privacy design advice, assessment and products
Developments in sight signal hope
Solutions and products for consumers that:
► Opt out of data collection and targeting on your behalf
► Remove your data from commercial databases
► Request that data brokers delete your data
► Block trackers/cookies
► Let you know when your data has been compromised
► Review products and websites for privacy
Examples of privacy-forward products
Developments in sight signal hope
Privacy technologies
► The number of privacy tech companies has increased by 777% since 2017, from 44 in 2017 to almost 370 in 2022 (IAPP)
► A growing trend of real-world application of privacy-enhancing technologies (synthetic data, homomorphic encryption, federated learning, data minimisation)
Developments in sight signal hope
Shift in approach to personal data and privacy
► Belief that technology is not only part of the problem but also part of the solution (Jaap-Henk Hoepman)
► Ideas around your data working for you, not against you (Prifina)
... But it is still an uphill battle
Surveillance goes on
► Every day we keep feeding companies our data, and the tracking, profiling and behavioural-data exploitation go on
► We keep seeing (and falling victim to) unsavoury, privacy-undermining practices and data breaches
► Companies say they embrace privacy, but there is also a lot of privacy washing and privacy branding alongside the continuous profiling of personal data by organisations of all sizes
... But it is still an uphill battle
Limited control and scrutiny
► No global data privacy standards
► Gaps and flaws in existing laws
► The approach from regulators is largely reactive rather than proactive
► Too much industry appeasement and too many reservations about disrupting business models (Wolfie Christl)
► Limited resources for watching what companies are doing
... But it is still an uphill battle
Pushback and/or manoeuvring from big tech
... But it is still an uphill battle
Regulation vs innovation
► The view that regulation is slowing down and curbing innovation
► Ground-breaking developments in technology distract from difficult debates
► Limited public knowledge about other areas of privacy, e.g., cognitive and biometric privacy
The key takeaways
Privacy…
► matters, and is part of our human and user experience
► has been undermined by the development of digital technology
► is everyone's business to care about and advocate for
► is becoming an important value and component of business strategy
► should not be sacrificed for the sake of aggressive innovation
► is part of user experience, and therefore should be part of responsible UX design

"Do the best you can until you know better. When you know better, do better."
– Maya Angelou
Before we part ways: personal tips
Ask yourself:
► What information am I comfortable giving? (Even the most basic information can be sensitive, and not all information must be accurate)
► How can I reduce the exposure of my personal data?
► The apps on your phone: are you using all of them?
► Does the website/app need as much information as it is asking for?
► Does the website/app provide the privacy controls it is committed to by law (e.g., consent, the right to be forgotten, the right to delete or correct personal information)? Should I report them to the local data privacy watchdog?
Beware of:
► Default settings: you may be sharing more (and more publicly) than intended
► Deceptive design (aka dark patterns)
► Tracking
► Apps that require a login: if you don't log out, they may be running in the background and collecting your data
For families and parents/guardians:
► Be proactive about family privacy settings on your devices
► Check parental controls and privacy settings for children (Parent Club has a great set of information and resources about online safety)
► Pause before "sharenting"
► Look for privacy guides for parents
References (cont'd)
► "Privacy-washing: What Is It And How To Stop It From Happening To Your Company", California Lawyers Association: https://calawyers.org/business-law/privacy-washing-what-is-it-and-how-to-stop-it-from-happening-to-your-company/
► "The State of Data Privacy in 2023" (video conference), Yes We Trust, 19 January 2023: https://app.livestorm.co/didomi/the-state-of-data-privacy-around-the-world-in-2023/live?s=ef20f40e-608d-49e2-bebf-820190c983bc
► "Three years of GDPR: the biggest fines so far", BBC News, 24 May 2021: https://www.bbc.co.uk/news/technology-57011639
► Anant, V., Donchak, L., Kaplan, J., Soller, H. "The consumer-data opportunity and the privacy imperative", McKinsey & Company, 27 April 2022: https://www.mckinsey.com/capabilities/risk-and-resilience/our-insights/the-consumer-data-opportunity-and-the-privacy-imperative
► Auxier, B., Rainie, L., Anderson, M., Perrin, A., Kumar, M. and Turner, E. (2019) "Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information", Pew Research Center: https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/
► Baines, J. "ICO warns of fines for companies who do not get cookie banners right", Mishcon de Reya, 15 June 2023: https://www.mishcon.com/news/ico-warns-of-fines-for-companies-who-do-not-get-cookie-banners-right
► Beens, R. E. G. "The Role of Digital Privacy in Brand Trust", Forbes, 8 January 2021: https://www.forbes.com/sites/forbestechcouncil/2021/01/08/the-role-of-digital-privacy-in-brand-trust/
► Brignull, H. "Dark Patterns overview" (video), Vimeo: https://vimeo.com/776232138
► Christl, W. (2022) Digital Profiling in the Online Gambling Industry: A report on marketing and risk surveillance by the UK gambling firm Sky Betting and Gaming, TransUnion, Adobe, Google, Facebook, Microsoft and other data companies: https://cdn.sanity.io/files/btrsclf0/production/e23ea75fe93f775d9f9ed795427f4b5ed8d67016.pdf
► Deceptive Design: https://www.deceptive.design/
References (cont'd)
► Jarovsky, L. "Understand Privacy-Enhancing Design and How it Can Be a Game Changer for Data Protection", LinkedIn, 15 June 2022: https://www.linkedin.com/pulse/understand-privacy-enhancing-design-how-can-game-changer-jarovsky/?trackingId=F5E466iUQsK29mvK3wXYGw%3D%3D
► FairPatterns by Amurabi: https://fairpatterns.com/
► Hoepman, J.-H. Privacy is Hard and Seven Other Myths: Achieving Privacy Through Careful Design, The MIT Press, Cambridge, Massachusetts; London, England.
► Ketch, J. J. "Data privacy truly matters to your customers. It's time to make it a core business value", VentureBeat, 31 August 2022: https://venturebeat.com/security/data-privacy-truly-matters-to-your-customers-its-time-to-make-it-a-core-business-value/
► Koch, R. "Big tech has already made enough money to pay all its 2023 fines", Proton blog, 8 January 2024: https://proton.me/blog/big-tech-2023-fines-vs-revenue
► Komnenic, M. "51 Biggest GDPR Fines & Penalties So Far", Termly, 31 March 2022: https://termly.io/resources/articles/biggest-gdpr-fines/
► Lance, W. (2021) "Data privacy is a growing concern for more consumers", TechRepublic: https://www.techrepublic.com/article/data-privacy-is-a-growing-concern-for-more-consumers/
► Lomas, N. "Facebook data-scraping breach triggers GDPR enforcement lawsuit in Ireland", TechCrunch, 10 January 2023: https://techcrunch.com/2023/01/10/digital-rights-ireland-gdpr-lawsuit-facebook-data-scraping-breach/
► Lupiáñez-Villanueva, F., Boluda, A., Bogliacino, F., Liva, G., Lechardoy, L., Rodríguez de las Heras Ballell, T. (2022) "Behavioural study on unfair commercial practices in the digital environment: dark patterns and manipulative personalisation. Final Report", Directorate-General for Justice and Consumers, EU Consumer Programme: https://op.europa.eu/en/publication-detail/-/publication/606365bc-d58b-11ec-a95f-01aa75ed71a1/language-en/format-PDF/source-257599418
► Nelissen, L. & Funk, M. (2022). Rationalizing dark patterns: Examining the process of designing privacy UX through speculative enactments. International Journal of Design, 16(1), 77-94. https://doi.org/10.57698/v16i1.05