The Illusion of Privacy: Unpacking the Perception Gap in Data Protection on Digital Platforms
Subtitle: A Critical Analysis of User Awareness, Behavioral Contradictions, and Platform Transparency in the Age of Surveillance Capitalism
2.
User vs. Platform Understanding of Privacy
• Users often believe their data is private
or only used for specific, limited
purposes (e.g., improving service).
• Platforms often define privacy in terms
of compliance (e.g., legal terms, user
consent), not ethical considerations or
user expectations.
• Gap: Users assume privacy means
control and confidentiality, but platforms
treat it as terms of service compliance.
3.
Complex and Opaque Policies
• Privacy policies are long, legalistic, and hard to understand.
• Users are often unaware of how much data is collected, with whom it is shared, or how it is monetized.
• Contributing factors: lack of awareness or digital education, trust in popular brands and platforms, and not seeing the risks clearly.
• Gap: Platforms disclose practices legally, but not transparently.
4.
Privacy Paradox
• Many users claim to care about privacy but still use services that exploit their data.
• This may be due to a lack of alternatives, convenience, or misunderstanding of the risks.
5.
Inadequate Consent Mechanisms
• Many platforms use dark patterns or confusing interfaces to get users to consent to data sharing.
• Consent is often bundled or pre-checked, undermining meaningful choice.
• Gap: Platforms obtain legal consent, but not informed or freely given consent.
6.
Which Types of Digital Platforms Are Used, and How Much Time Do Users Spend on Them?
7.
Do Users Change Their Privacy Settings?
| Action Taken                              | Percentage |
| ----------------------------------------- | ---------- |
| Reviewed/Updated Privacy Settings         | 67%        |
| Use Multi-Factor Authentication           | 68%        |
| Use Password Manager                      | 61%        |
| Submitted Data Access/Correction Requests | 36%        |

• These figures are based on the Cisco 2024 Consumer Privacy Survey.
9.
Misuse or Overreach of Data
• Data collected for one purpose (e.g., personalization) is often used for another (e.g., advertising, profiling).
• Breaches, leaks, and misuse (e.g., Cambridge Analytica) have eroded public trust.
• Gap: Platforms assume broad data rights; users expect narrow, purpose-bound use.
10.
Facebook Fined £500,000 for Cambridge Analytica Scandal
• Dr Aleksandr Kogan and his company GSR used a personality quiz app, "This Is Your Digital Life", to harvest the Facebook data of up to 87 million people.
• Some of this data was shared with Cambridge Analytica, which used it to target political advertising in the US.
11.
Lack of User Empowerment
• Users often can't easily see, control, or delete their data. Even where legal rights exist (such as under the GDPR), exercising them can be difficult.
• Gap: Users expect control; platforms offer limited, technical, or burdensome tools.
Future of Data Privacy
• AI and data protection.
• Privacy-enhancing technologies (PETs).
• Rise of data sovereignty and user-centric models.
14.
How to Close the Gap
• Make privacy policies simpler and more honest.
• Teach people about digital safety.
• Give users more control over their data.
• Hold companies accountable.
15.
Conclusion
• Digital privacy is often an illusion.
• People believe they are protected, but the reality is different.
• We need better rules, tools, and awareness to protect data.