The document discusses best practices for designing digital experiences that protect user privacy, including avoiding "dark patterns" that trick users, being transparent about what personal data is collected and how it is used, and obtaining explicit user consent before collecting or sharing sensitive personal information. It emphasizes designing with privacy as the default setting and embedding privacy into the design from the start.
2. • Increasingly, companies need to
consider the privacy of their users’
data, their content, even their
browsing behavior, for their clients’
benefit and safety
• But they also do it for their own
personal and financial self-interest
• So how do we design digital
experiences — apps, websites, etc. —
for these companies in ways that
ensure people’s privacy?
Introduction
Photo by Maximalfocus, Unsplash
3. Privacy and security are different concepts:
Privacy: Your ability to control your personal information and
how it’s used
Security: How your personal information is protected by those
holding on to it
These concepts often overlap, so we’ll refer to both
Our focus: How we can ensure people’s privacy is maintained
as we design experiences for them
Purpose
5. “Arguing that you don't
care about the right to
privacy because you
have nothing to hide is
no different than saying
you don't care about
free speech because
you have nothing to
say.”
— Edward Snowden, former CIA
employee, infamous NSA leaker
Why Privacy?
6. • Even if we’re not concerned with a particular privacy
issue, we’re not designing for ourselves
• If we’re designing with empathy, if we’re designing
to be inclusive, we’ll consider the needs of people
not like ourselves — people with different
backgrounds and experiences
• That means researching and understanding privacy
issues, but also interviewing or talking to people
with diverse backgrounds and lived experiences
Why Privacy?
7. Example:
DayOne, a non-profit that provides
services for young people in abusive
dating relationships.
Online privacy is a very real issue for
these clients, who may be worried
about their partners tracking their
online activity or even stalking them in
real life. Something we’d consider
when designing a site for them.
Why Privacy?
Screenshot from DayOne’s website
8. Similarly, LGBTQ youth need to
feel their privacy is secure when
reaching out for help online.
In this sense, privacy issues are
very often also diversity issues.
Privacy is a key consideration
for inclusive design.
Why Privacy?
Screenshot from The Audre Lorde Project’s Facebook page
10. Wall Street Journal, 2019
Security experts believe quantum
computers will break through our
existing encryption technology within a
decade.
Experts must determine new ways to
protect our data—and quickly.
Imagine if all your passwords suddenly
became useless to protect your identity
and your personal information.
Data Security
11. In early April, we learned that
Facebook—the largest, most
popular social media platform
on the planet—was hacked.
533 million users’ phone
numbers and personal data were
leaked online.
Data for half a billion people.
Data Security
12. • Fraud and identity theft rose
during the pandemic
• FTC: 1.4 million reports of identity
theft in 2020 — double the 2019 total
• Identity thieves targeted
government funds earmarked to
help people hit by the pandemic
• Leaks of personal data can be
catastrophic to people’s lives
• Cleaning up the mess created by
identity theft can take years
Fraud & Identity
Theft
Photo by Kyle Glenn
13. Apps can do all sorts of fun stuff:
make you look older, younger,
etc.
But what else are they doing
with that very personal data
about your face?
While we’re taking selfies for fun
with these “free” apps, privacy
experts worry about what could
be done with that sort of data in
the future.
Facial
Recognition
14. Clearview.ai, a facial recognition platform,
offers services to law enforcement.
They downloaded over 3 billion photos of
people from the Internet and social media
and used them to build facial recognition
models for millions of people without their
permission.
The SAND Lab at the University of Chicago
developed Fawkes, a tool that subtly alters
(“cloaks”) individuals’ photos before they’re
posted, limiting third parties’ ability to build
facial recognition models from their publicly
available photos.
Facial
Recognition
15. Stores such as Albertsons,
Rite-Aid, Macy’s, and ACE
Hardware are using facial
recognition programs to
identify customers.
Some also use apps to track
customers around their stores
to present them with ads
online later.
Facial
Recognition
16. Critics of Facebook’s new
wearable collab with Ray-Ban
point out that indicators of its
recording feature are so
subtle, they may go unnoticed.
Many expressed similar
concerns over Google Glass,
when it was released in 2013.
Facial
Recognition
17. Deep fakes use deep learning technology to
take an existing image or video and add
another person’s likeness to it.
Deep fake videos exist of Tom Cruise, Barack
Obama and other famous people.
But emerging deep fakes depict everyday
people saying and doing things they never did.
Deep Fakes
DeepTomCruise: deepfake version of Tom Cruise
on TikTok uses an impersonator to create short
skits
18. Early this year, delivery
drivers were required to
sign consent forms, which
allowed Amazon to collect
their biometric data* and to
use AI cameras to monitor
their location, movement, and
driving patterns.
At least one driver quit over
this form of “AI surveillance.”
*Information about your face, your
expressions, body movements, etc
Biometric Data
19. In April, The New York
Times reported on how
the donation site for
Donald Trump deployed
“dark patterns” to trick
supporters into agreeing
to recurring donations,
earning the campaign a
huge spike in
contributions
Dark Patterns
20. • Designers rolled out different iterations of
this feature
• Each with increasingly confusing language,
fine print, bold text, all-caps, and a pre-
selected check box
• They referred to the feature internally as a
“money bomb”
• Donations grew astronomically — but so did
fraud complaints to banks and credit
companies from angry supporters
• One 78-year-old supporter summed up his
thoughts on the feature: “Bandits!”
• Others referred to it as a scam
Dark Patterns
21. • Demand for personalized content which benefits from personal data
seems higher than ever
• People say they want personalized ads, so you’d think they enjoy
sharing their data: let a company follow you around the internet and,
theoretically, you get better content and better advertising
• But a 2019 survey by network security company RSA found only 17%
of respondents said it was ethical to track their online activity to
personalize ads
• Earlier, Pew Research found 91% of adults believe consumers have lost
control over how their personal information is collected and used by
companies
Data Sharing
22. Data Sharing
Sign of the times:
Apple rolled out a new iPhone
privacy feature called “App Tracking
Transparency,” an anti-tracking
shield, which prevents apps from
snooping and shadowing you
across the internet.
They have to ask first.
The feature is hugely popular in the US:
only about 20% of iOS users have allowed
apps to track them so far.
23. Data Sharing
The oncoming “cookie
apocalypse”
• Google plans to update its
Chrome browser by 2023
• Will prevent companies from
tracking your wanderings around
the internet via third-party
cookies
24. • Some companies aren’t happy
with these developments
• Much of this is about
competition among big
companies like Apple, Google
and Facebook
• But it’s also a result of
consumers’ increasing
concerns about the privacy —
and security — of their data
and their activity online
Data Sharing
25. We just discussed a few common privacy and security issues, but in 2019, Smashing Magazine identified
24 top user privacy concerns. Those included many of the things we just discussed but also …
• Convoluted privacy policy changes
• Unwanted notifications and marketing emails
• Lack of proper control of your personal data
• Making it difficult to delete personal details or cancel your account
• Social profiling by potential employers
• Hidden fees
• Automatically importing the contact information of your friends
• Hacked email and social media accounts
• And so on
Top Privacy Concerns
These are real issues, which matter to real people.
27. GDPR stands for …
The General Data Protection Regulation
Law finalized in 2016, came into effect in
2018
Regulates how apps and sites can gather
and transfer or process personal data when
working within the European Union
It also governs what happens to that data
when it’s transferred outside the EU
Impact of Regulations
Remember a while back
when you suddenly got a
gazillion emails from
companies telling you they
had updated their privacy
policies?
That was a result of the
GDPR.
28. Some things GDPR requires …
• Ask people to opt in to sharing their data
• Communicate to people in the moment, when
you’re collecting their personal data
• Be transparent about what you’re doing with it
• Allow people to download their data and
delete it — a “right to erasure” or “right to be
forgotten”
Impact of Regulations
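As a rough illustration of the opt-in requirement, here is a minimal sketch of what GDPR-style, privacy-by-default consent state could look like in code. The names (`ConsentState`, `optIn`, `requestErasure`) are hypothetical, not taken from any real consent-management library.

```typescript
// Hypothetical sketch: GDPR-style consent state. Nothing is shared until
// the user explicitly opts in, and erasure resets everything.

type ConsentState = {
  analytics: boolean;
  marketing: boolean;
  thirdPartySharing: boolean;
};

// Privacy by default: every category starts opted OUT.
function defaultConsent(): ConsentState {
  return { analytics: false, marketing: false, thirdPartySharing: false };
}

// Opting in requires an explicit user action, one category at a time.
function optIn(state: ConsentState, category: keyof ConsentState): ConsentState {
  const next = { ...state };
  next[category] = true;
  return next;
}

// "Right to erasure": a real system would also delete the stored personal
// data; here it simply resets consent to the opted-out default.
function requestErasure(_state: ConsentState): ConsentState {
  return defaultConsent();
}
```

The point of the sketch is the shape of the defaults: a user who does nothing shares nothing.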
29. In 2018, California passed its own version of the
GDPR — the California Consumer Privacy Act — to
give Californians more control over how their
personal data is used
Requirements very similar to those in the GDPR.
But CCPA differs in that it (currently) allows
businesses to collect your information by default—
though they still have to offer the ability to opt out
California Consumer Privacy Act 2018
Impact of Regulations
30. In March, California announced it’s
banning “dark patterns.”
It also unveiled a new “Privacy Options” icon
for businesses to show you where to opt out of
data collection.
The icon was designed by Carnegie Mellon’s
CyLab and the University of Michigan’s School
of Information.
Impact of Regulations
31. New York, Maryland, Massachusetts and Hawaii are
developing their own privacy laws, too.
So, if you’re designing for GDPR and California privacy laws
and more, then you may as well design for everyone —
design for the highest common good.
And assume things will continue to change with more of this
emphasis upon privacy, security and transparency.
Impact of Regulations
33. What’s our role then?
Our Role
“You were not hired to get approval
or to have your work pinned to the
company fridge.”
“People hire you to be the expert, so
you might as well be the expert.”
—Mike Monteiro, designer, co-founder of Mule Design
in Ruined by Design
34. More specifically?
We have a responsibility to act as the
advocate for users — but even that’s
too abstract.
The term “user” tends to strip people of
their individual circumstances, their
personality, their history, even their
lives.
We have a responsibility to real human
beings.
We may need to push back where
necessary in terms our clients
understand.
Our Role
Photo by Vince Fleming
35. We may have to explain to our clients the impacts of ignoring privacy and security concerns.
What are these impacts, specifically?
• Civic responsibility: As user-centered designers, we really should be encouraging our clients to treat
their “end users” as human beings, who are members of their community
• Reputation management: We may have to remind our clients that what companies do can
undermine their brands
• Using dark patterns may anger people and cause them to abandon your site in favor of another with
a more transparent experience
• Data breaches and sloppy treatment of data may lead to the loss of their user base — likely affecting
their profits
• Financial consideration: Keep in mind the increasing number of laws and regulations and the
resulting fines for not following them
Even if there’s an up-front cost to designing for privacy and security, the long-term costs can be
devastating
Our Role
36. In the 1940s, a Frenchman, Rene Carmille, was working on the
French Census.
He and his team have been dubbed the first “ethical
hackers.” They decided to sabotage their own machines so
the punch cards couldn’t register people’s religion properly.
The team was discovered, arrested by the Nazis and
tortured. Carmille died at Dachau.
But they prevented the Nazis from discovering the identities
of tens of thousands of Jewish people living in France, saving
their lives in the process.
They did so by changing an experience to maintain people’s
privacy.
Rene Carmille
37. In 2019, five employees quit their jobs at
GitHub, a major San Francisco tech
company, after learning the company
shared its data with Immigration and
Customs Enforcement (ICE), the
government agency that has been
accused repeatedly of human rights
violations — especially related to its
treatment of immigrants.
It might be scary to speak up in such a
situation, but we got into this business to
help people — and what we do has a real-
world impact.
Our Role
39. In her Privacy by Design manifesto, Dr. Ann
Cavoukian lays out 7 foundational principles for
implementing and mapping Fair Information
Practices.
She recommends making privacy the “default
setting” in our designs, for example, and says
privacy should be “embedded” into design.
So, what are some practical ways to ensure
we’re doing that?
Best Practices
Self Study:
“Privacy by Design: The 7 Foundational Principles”
by Dr. Ann Cavoukian
Founder of Global Privacy & Security by Design and the former Information and Privacy Commissioner
for the Canadian province of Ontario
41. Dark Patterns
UX designer Harry Brignull coined
the term “dark pattern” in 2010
He defines a dark pattern as a “user
interface that has been carefully
crafted to trick users into doing
things” they didn’t mean to do —
like buying or signing up for
something
Another researcher described dark
patterns as supplanting user value “in
favor of shareholder value”
42. Brignull identified about a dozen types of
dark patterns.
Bait and Switch – You set out to accomplish
one thing but something else completely
undesirable happens.
Confirmshaming – You try to unsubscribe
from something, for example, and the
feature to opt out uses language to guilt
you out of taking action.
Friend spamming – A site asks to access
your contacts, so you can find your friends,
then it emails all your friends without your
permission.
Dark Patterns
Example of confirmshaming
43. Dark Patterns
“Dark patterns are the canaries in the
coal mine of unethical design.
A company who’s willing to keep a
customer hostage is willing to do
worse.”
— Mike Monteiro, Ruined by Design
44. Dark patterns can expose users’ personal
information
When you make a payment on Venmo, it
defaults to public, so you automatically share
your payments with … everyone
The opposite of designing with privacy as a
default
Somebody created Vicemo, which scraped
payments listed with words associated with
drugs, alcohol or sex and posted them online
for all to see
Dark Patterns
45. Similarly, Strava, a popular app for runners, automatically
tagged other runners when you passed them if they
didn’t change their settings.
This feature even had a name: Flyby.
If you clicked on a face, it showed the user’s full name,
picture and a map of their running route — effectively
revealing where they lived.
This happened without you following users and without
them knowing they were sharing their activity.
After receiving criticism, Strava did change the default
setting to private.
But it should have always been private.
“Stalkerware” — apps that allow people to be
tracked, intentionally or not
Dark Patterns
46. In this example from a major airline, the
customer has already chosen Basic
Economy but "Move to Main Cabin”—
which costs $100 more — is placed as a
large red button in the place you’d
typically find a "Next" button.
To keep your first selection, you have to
click on "Continue with Basic Economy” —
the smallest, most low contrast copy on
the module.
Patterns like this interrupt users’
experience, but they also undermine
people’s trust and even anger them.
Dark Patterns
Here the pattern is used to trick people into an
upsell.
But the same pattern is used to trick people into
sharing their personal information in ways they
didn’t intend to.
47. Exercise:
If you were in a meeting with a
client and they asked you to
design a module this way,
what could you say to them to
convince them not to?
What alternative solutions
could you suggest?
Dark Patterns
49. It’s important to be very specific — especially
when sharing PII.
Personally identifiable information — data points
such as name, email, phone number, social
security number, mother’s maiden name, which
can be used to steal people’s identities and to
commit fraud
87% of the U.S. population can be uniquely
identified by just their date of birth, gender, and ZIP
code. (Those items aren’t even considered PII.)
Imagine how much damage a bad actor can do
with just 3 data points of PII.
The more personal information someone can
collect about an individual, the greater chance
they can do a lot of harm.
What Data Is Used?
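The 87% statistic works by grouping people on those three "quasi-identifiers." This toy sketch, with made-up records, shows the mechanics: any (birth date, gender, ZIP) combination that maps to exactly one record uniquely identifies a person, no "real" PII required.

```typescript
// Toy illustration with made-up records: how few quasi-identifiers
// (birth date, gender, ZIP) it can take to single someone out.
type Person = { name: string; birthDate: string; gender: string; zip: string };

const people: Person[] = [
  { name: "Ana", birthDate: "1990-03-12", gender: "F", zip: "10001" },
  { name: "Ben", birthDate: "1990-03-12", gender: "M", zip: "10001" },
  { name: "Cam", birthDate: "1985-07-01", gender: "M", zip: "10001" },
  { name: "Dee", birthDate: "1990-03-12", gender: "F", zip: "10002" },
  { name: "Eve", birthDate: "1990-03-12", gender: "F", zip: "10001" },
];

// Group by the quasi-identifier triple; a group of size 1 means that
// combination points to exactly one person in the dataset.
function uniquelyIdentified(records: Person[]): string[] {
  const groups = new Map<string, Person[]>();
  for (const p of records) {
    const key = `${p.birthDate}|${p.gender}|${p.zip}`;
    groups.set(key, (groups.get(key) ?? []).concat(p));
  }
  return Array.from(groups.values())
    .filter((g) => g.length === 1)
    .map((g) => g[0].name);
}
```

Here Ana and Eve share a triple, so they stay ambiguous; everyone else is pinned down by three seemingly harmless data points.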
51. Consider this an opportunity to explain the benefits of sharing
their data:
• Does it ensure a better experience in the future?
• Does it personalize ads and offers for them?
Be prepared to explain those benefits in detail.
And if you can’t, consider whether you’re designing the right
sort of product.
Why Is Data Used?
52. Why Is Data Used?
The home insurance app Lemonade
sets a great standard for digestible
privacy policies.
They include an itemized, detailed
explanation of what personal
information you’re sharing, and they
also explain why.
They also promise never to sell your
information to third parties.
“TL;DR: We will never, ever, sell your data to anyone.”
54. Clear Language
A couple of years ago, The New York Times studied
150 privacy policies from various tech and media
platforms. They described what they found as an
“incomprehensible disaster.”
They described AirBnB’s privacy policy as
“particularly inscrutable.”
“This information is necessary for the adequate performance of
the contract between you and us and to allow us to comply with
our legal obligations.”
Vague language and jargon allow for a wide range
of interpretation, making it easy for companies to
defend their practices in a lawsuit while making it
harder for us to understand what’s really going on
with their data.
55. Twitter advises you to read their
privacy policy in full but highlights
key aspects of it up front — in a
dedicated section — advising you
to pay attention to those particular
things
Clear Language
56. Some guidelines:
• Avoid legalese and jargon: Even your
terms and conditions content doesn’t
have to sound like legal content
• Consider different age groups and levels
of savviness
• Most adult Americans read at about a
basic or intermediate literacy level
• 50% can’t read a book written at an 8th
grade level
• The Content Marketing Institute
recommends writing for about a 14- or
15-year-old (about the 8th grade)
• Carefully crafted personas can help
determine if an experience’s reading level
should vary from that range
Clear Language
Photo by John-Mark Smith
58. User Controls
Google offers a Privacy Checkup with high
level descriptions of how your personal data is
being used and why.
This links to specific Privacy Controls, which
allow you to adjust how that data is accessed.
They allow you to turn off activity tracking,
location history, your YouTube history, your
Google photo settings, check which third
parties have access to your account
information, and access other key settings all
in one sort of privacy dashboard.
59. A good moment to recall Dr. Cavoukian’s
maxim:
Keep these settings private by default
User Controls
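In code terms, Dr. Cavoukian's maxim simply means the most restrictive value is the default, so sharing always requires a deliberate choice. A tiny hypothetical sketch (the types and defaults here are illustrative, not any real app's API):

```typescript
// Hypothetical sketch: activity visibility, but with "private" as the
// built-in default rather than "public".
type Visibility = "private" | "friends" | "public";

interface Activity {
  id: number;
  visibility: Visibility;
}

// A user who never touches the setting stays private.
function createActivity(id: number, visibility: Visibility = "private"): Activity {
  return { id, visibility };
}
```

Contrast this with the Venmo and Strava examples earlier, where the equivalent default was effectively public.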
60. This module shows the site is keeping your
cookie settings to a minimum by default.
In this example, the messaging is prominent,
but …
Are these cookies set to the minimum?
Or the maximum?
You can’t be sure without opening the
“Manage Cookies” link to find out. They do
make it super easy for you to “Accept &
Continue,” though.
User Controls
Exercise:
This version looks a bit clunky.
How might you design a more streamlined version,
which still makes the options clear?
62. Easy to Find
This important information
shouldn’t be placed in
8-point font …
buried in the Terms &
Conditions …
hidden in the footer …
or several levels deep in your
app’s navigation
But that’s often where we find
it
This is where a feature like
California’s new “Privacy
Options” icon would come
in handy, too – to really
draw additional attention to
these privacy options.
63. Easy to Find
Contextual and easy to find also means …
Onboarding — Explaining in detail how
you use people’s data when they’re using
your app for the very first time.
“Just in time” alerts – Alerting users in
the moment—when they’re about to
share data in a new way—even if they
have a history of using your experience.
64. Easy to Find
Mozilla displays robust
Privacy information by
default in a dedicated tab
when you download and
open their Firefox browser
for the first time.
65. Easy to Find
Facebook offers a Privacy Checkup
one or two clicks away everywhere
on its desktop site. It’s pretty easy
to find on the app as well.
(If you know to look for it.)
67. Remind users regularly about their privacy
options
And actively encourage them to take
advantage of them
Reminders
68. Reminders
Facebook allows you to set
reminders to do a privacy
checkup every week, month, 6
months or year
Google also has a feature that
will send you a reminder to check
your privacy settings.
69. Final point:
Never change users’ privacy settings
without telling them in advance.
They should also have the option to opt
out of such changes.
Never Change Without Notice
70. A few years ago, Facebook made users’ “likes” visible overnight,
which consequently may have outed some people in the
LGBTQ community or revealed people’s personal, political or
religious beliefs.
When I asked an employee how they justified this change, they
responded that Facebook valued transparency and wanted
people to be transparent about their interests.
The company’s founder, Mark Zuckerberg, had even famously
said privacy was no longer a “social norm.”
Never Change Without Notice
71. We don’t have the right to make decisions about other
people’s personal data and interests on their behalf.
Assuming everyone’s information can safely be made
public is a belief that comes from a position of privilege.
We should never make such decisions, which can
profoundly affect people’s privacy, without their explicit
consent.
Never Change Without Notice
73. We talk a lot about “empathy” in
design.
If we design with empathy, we
won’t design experiences we
wouldn’t want to use ourselves.
And we won’t design using “dark
patterns” either.
Conclusion
Photo by Josh Calabrese
74. Privacy is not about secrecy.
It’s all about control.
— Dr. Ann Cavoukian
If we want to ensure people have control over their
own personal information
If we want to ensure the experiences we design are
user friendly and truly “user-centered”
We’ll keep these best practices for privacy in mind
Conclusion
Photo by Zanardi, Unsplash
76. Further Study
• California Consumer Privacy Act
• GDPR.eu
• “Privacy by Design: The 7 Foundational Principles” -
Dr. Ann Cavoukian
• The Privacy Project – New York Times
• “We Read 150 Privacy Policies. They Were an
Incomprehensible Disaster”– Kevin Litman-Navarro,
New York Times
• “Privacy UX - Common Concerns and Privacy in Web
Forms” – Vitaly Friedman, Smashing Magazine
• “What GDPR Means for UX” – Claire Barrett
• www.darkpatterns.org – Harry Brignull
• “How Dark Patterns Trick You Online” – YouTube
• Ruined by Design – Mike Monteiro
Design for Privacy & Security – Presentation for Early Careers Experience new employees by Robert Stribley
As recorded and presented on 21 September 2021
Slide Notes: Sources & Image Credits
• Photo by Maximalfocus, Unsplash - https://unsplash.com/photos/QtZDb5QJOFM
• Image by Jack Ferrentino for NPR - https://www.npr.org/2020/10/09/922262686/your-technology-is-tracking-you-take-these-steps-for-better-online-privacy
• WSJ – Quantum computing likely to defeat encryption - https://www.wsj.com/articles/the-race-to-save-encryption-11559646737
• Business Insider – Stolen data of 533 million Facebook users leaked online - https://www.businessinsider.com/stolen-data-of-533-million-facebook-users-leaked-online-2021-4
• Photo by Kyle Glenn, Unsplash - https://unsplash.com/photos/MbPDSi0ILMo
• National Post – FaceApp makes you look older — what else is it doing with your face? - https://nationalpost.com/news/world/faceapp-makes-you-look-older-what-else-is-it-doing-with-your-face
• NYT – How Trump Steered Supporters Into Unwitting Donations - https://www.nytimes.com/2021/04/03/us/politics/trump-donations.html
• NYT – Apple’s anti-tracking shield - https://www.nytimes.com/2021/09/16/technology/digital-privacy.html
• The Verge – Google and the “cookie apocalypse” - https://www.theverge.com/2021/6/24/22547339/google-chrome-cookiepocalypse-delayed-2023
• The Conversation – Google’s scrapping third-party cookies - https://theconversation.com/googles-scrapping-third-party-cookies-but-invasive-targeted-advertising-will-live-on-156530
• GDPR - https://gdpr.eu/data-privacy/
• Gizmodo – California bans dark patterns - https://gizmodo.com/california-passes-new-regulation-banning-dark-patterns-1846482961
• California AG press release - https://oag.ca.gov/news/press-releases/attorney-general-becerra-announces-approval-additional-regulations-empower-data
• Carnegie Mellon CyLab – “Do Not Sell” icon - https://www.cylab.cmu.edu/news/2020/12/11-donotsell.html
• Vice – Twitter bot surfacing Venmo payments - https://www.vice.com/en_us/article/qvmkvx/twitter-bot-venmo-buying-drugs-photo-names
• Photo by Vince Fleming, Unsplash - https://unsplash.com/photos/Vmr8bGURExo
• Photo/artwork of Rene Carmille from “Hacking the Holocaust” by Ava Ex Machina, Medium - https://medium.com/@silicondomme/hacking-the-holocaust-abcd332947ae
• Airline dark pattern example: Delta
• PII illustration - https://www.imperva.com/learn/data-security/personally-identifiable-information-pii/
• NYT – We Read 150 Privacy Policies. They Were an Incomprehensible Disaster by Kevin Litman-Navarro - https://www.nytimes.com/interactive/2019/06/12/opinion/facebook-google-privacy-policies.html
• Tru Luv – “data collection” notice and privacy policy
• Photo by John-Mark Smith, Unsplash - https://unsplash.com/photos/F_cHIM0Kcy4
• Onboarding example from Babbel
• The Guardian – Facebook privacy changes - https://www.theguardian.com/technology/2009/dec/10/facebook-privacy
• Image by Margarida CSilva, Unsplash - https://unsplash.com/photos/cQCqoTjr0B4
• Photo by Myke Simon, Unsplash - https://unsplash.com/photos/atsUqIm3wxo
• Photo by Josh Calabrese, Unsplash - https://unsplash.com/photos/XXpbdU_31Sg
• Photo by Zanardi, Unsplash - https://unsplash.com/photos/GJY1eAw6tn8
• Photo by Philippe Bout, Unsplash - https://unsplash.com/photos/93W0xn4961g