The Illusion of Privacy: Unpacking the Perception Gap in Data Protection on Digital Platforms
Subtitle: A Critical Analysis of User Awareness, Behavioral Contradictions, and Platform Transparency in the Age of Surveillance Capitalism
User vs. Platform Understanding of Privacy
• Users often believe their data is private or used only for specific, limited purposes (e.g., improving the service).
• Platforms often define privacy in terms of compliance (e.g., legal terms, user consent), not ethical considerations or user expectations.
• Gap: Users assume privacy means control and confidentiality, but platforms treat it as compliance with their terms of service.
Complex and Opaque Policies
• Privacy policies are long, legalistic, and hard to understand.
• Users are often unaware of how much data is collected, who it is shared with, or how it is monetized.
• Contributing factors: lack of awareness or digital education, trust in popular brands and platforms, and not seeing the risks clearly.
• Gap: Platforms disclose their practices legally, but not transparently.
Privacy Paradox
Many users claim to care about privacy but still use services that exploit their data. This may be due to a lack of alternatives, convenience, or a misunderstanding of the risks.
Inadequate Consent Mechanisms
• Many platforms use dark patterns or confusing interfaces to nudge users into consenting to data sharing.
• Consent is often bundled or pre-checked, undermining meaningful choice (see the sketch below).
• Gap: Platforms obtain legal consent, but not informed or freely given consent.
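To make the bundled/pre-checked problem concrete, here is a minimal Python sketch of a consent-validity check. The `ConsentRequest` model and `is_valid_consent` rule are hypothetical illustrations, loosely inspired by the GDPR's requirement that consent be freely given, specific, informed, and unambiguous; they do not reflect any platform's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class ConsentRequest:
    """One consent checkbox shown to a user (hypothetical model)."""
    purpose: str             # e.g., "ad_personalization"
    pre_checked: bool        # was the box ticked before the user acted?
    bundled_with: list[str]  # other purposes tied to the same checkbox
    user_accepted: bool      # did the user end up with the box ticked?

def is_valid_consent(req: ConsentRequest) -> bool:
    """GDPR-style test: consent must be an affirmative act (not
    pre-checked) and specific to a single purpose (not bundled)."""
    if req.pre_checked:    # inaction is not consent
        return False
    if req.bundled_with:   # one checkbox must cover one purpose
        return False
    return req.user_accepted

# A pre-checked, bundled box fails even though the user "agreed" ...
dark = ConsentRequest("ad_personalization", True, ["analytics"], True)
# ... while an unchecked, single-purpose box the user actively ticks passes.
fair = ConsentRequest("ad_personalization", False, [], True)
print(is_valid_consent(dark))  # False
print(is_valid_consent(fair))  # True
```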
Which Types of Digital Platforms Do Users Use, and How Much Time Do They Spend?
Do Users Change Their Privacy Settings?

| Action Taken                               | Percentage |
| ------------------------------------------ | ---------- |
| Reviewed/updated privacy settings          | 67%        |
| Use multi-factor authentication            | 68%        |
| Use a password manager                     | 61%        |
| Submitted data access/correction requests  | 36%        |

These figures are based on the Cisco 2024 Consumer Privacy Survey.
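For readers who want to reproduce the visual, here is a minimal matplotlib sketch that charts the four survey figures above; the labels and figure size are illustrative choices, not part of the Cisco survey itself.

```python
import matplotlib.pyplot as plt

# Survey figures as reported above (Cisco 2024 Consumer Privacy Survey).
actions = {
    "Reviewed/updated privacy settings": 67,
    "Use multi-factor authentication": 68,
    "Use a password manager": 61,
    "Submitted data access/correction requests": 36,
}

fig, ax = plt.subplots(figsize=(8, 4))
ax.barh(list(actions.keys()), list(actions.values()))
ax.set_xlabel("Respondents taking the action (%)")
ax.set_xlim(0, 100)
ax.set_title("Do users change their privacy settings?")
plt.tight_layout()
plt.show()
```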
Misuse or Overreach of Data
• Data collected for one purpose (e.g., personalization) is often used for another (e.g., advertising, profiling).
• Breaches, leaks, and misuse (e.g., Cambridge Analytica) have eroded public trust.
• Gap: Platforms assume broad rights over data; users expect narrow, purpose-bound use (see the sketch below).
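As a minimal illustration of what "purpose-bound use" could mean in code, the hypothetical sketch below tags each stored record with the purposes declared at collection time and refuses any other use. The `DataRecord` model and `use_data` check are assumptions for illustration, not a real platform's data layer.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataRecord:
    """A stored user attribute tagged with its declared purposes
    (hypothetical model of purpose-bound storage)."""
    value: str
    allowed_purposes: frozenset

def use_data(record: DataRecord, purpose: str) -> str:
    """Purpose-limitation check: deny any use outside the purposes
    declared when the data was collected."""
    if purpose not in record.allowed_purposes:
        raise PermissionError(f"'{purpose}' exceeds the declared purposes")
    return record.value

email = DataRecord("user@example.com", frozenset({"service_delivery"}))
print(use_data(email, "service_delivery"))  # allowed: declared purpose

try:
    use_data(email, "ad_profiling")  # never declared at collection time
except PermissionError as err:
    print("Blocked:", err)
```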
Facebook Fined £500,000 for the Cambridge Analytica Scandal
• Dr Aleksandr Kogan and his company GSR used a personality quiz app ("This Is Your Digital Life") to harvest the Facebook data of up to 87 million people.
• Some of this data was shared with Cambridge Analytica, which used it to target political advertising in the US.
Lack of User Empowerment
• Users often can't see, control, or delete their data easily. Even where legal rights exist (e.g., under the GDPR), exercising them can be difficult.
• Gap: Users expect control; platforms offer limited, technical, or burdensome tools.
Future of Data Privacy
• AI and data protection.
• Privacy-enhancing technologies (PETs), such as differential privacy (see the sketch below).
• Rise of data sovereignty and user-centric models.
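As one concrete example of a PET, here is a minimal sketch of differential privacy's Laplace mechanism: a statistic is published with calibrated noise so that no single individual's record can be inferred from the answer. The epsilon value and the MFA-count scenario are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float,
                      epsilon: float) -> float:
    """Release a statistic with epsilon-differential privacy.

    Noise scale sensitivity/epsilon is the standard Laplace-mechanism
    calibration: smaller epsilon means stronger privacy, noisier output.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

# Example: publish how many users enabled MFA without revealing whether
# any single user did. A counting query changes by at most 1 when one
# person's record is added or removed, so its sensitivity is 1.
true_count = 6800
print(laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5))
```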
How to Close the Gap
• Make privacy policies simpler and more honest.
• Teach people about digital safety.
• Give users more control over their data.
• Hold companies accountable.
Conclusion
• Digital privacy is often an illusion.
• People believe they are protected, but the reality is different.
• We need better rules, tools, and awareness to protect data.