Another way to look at it is to consider what data Facebook discloses by default when you sign up. In other words, if you create a profile or do things on the site, how much of it is public before you, the user, do anything? Again, this keeps changing. Another useful resource: Matt McKeon, a private user, used data gathered by the US digital rights organisation the EFF to produce some very useful graphics tracking how much data the privacy policy required the user to disclose as the policy changed from 2005 through to 2010. He has one for every year, but I'll just show you 2005 and 2010.
In December 2009 Facebook changed the site's defaults so that much data that was once private by default became public – even some one-to-one chats between users – and some data was shared automatically with other websites when users visited them. This is why some of the people on OpenBook are saying things they probably never expected anyone except their friends to read: they don't realise the terms of using the site have changed and that some of their very personal data is now public. Protecting your privacy on Facebook – ie changing the default settings, and keeping on top of new changes – is complicated enough that people are writing software to do it for you, tools like ReclaimYourPrivacy.Org and SaveFace. Can it really be this difficult, you ask? See the NYT graphic.
Defaults? Spotify sharing to FB – Fitbit sharing logged sexual activity
Big successes: Google Flu Trends predicted in real time where a flu epidemic was likely to be at its worst, from search terms entered (Google didn't pick the terms itself; it just used government figures to correlate with prior outbreaks and "learnt" the 45 search terms most correlated). Drugs – from existing abstracts of all existing chemical compounds -> new ones, eg …
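The correlate-and-rank approach behind Flu Trends can be sketched roughly as follows. Everything here is synthetic and invented for illustration (the term names, the "official" figures, the noise model); Google's actual pipeline is not public in this detail.

```python
# Rough sketch of the Google Flu Trends idea: rank candidate search terms by
# how well their weekly volumes correlate with official flu figures, then
# keep the most correlated ones as predictors. All data below is synthetic.
import random

random.seed(0)
WEEKS = 52

# Synthetic "official" flu activity: a simple seasonal wave standing in for
# government surveillance data on prior outbreaks.
flu = [abs((w % 26) - 13) / 13.0 for w in range(WEEKS)]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Candidate search terms: a few genuinely track flu activity (plus noise),
# the rest are unrelated.
terms = {}
for i in range(5):
    terms["flu_term_%d" % i] = [f + random.gauss(0, 0.1) for f in flu]
for i in range(15):
    terms["noise_term_%d" % i] = [random.random() for _ in range(WEEKS)]

# "Learn" the terms most correlated with past official figures.
ranked = sorted(terms, key=lambda t: pearson(terms[t], flu), reverse=True)
print(ranked[:5])
```

The point of the sketch is that the selection is purely statistical: no human decides which terms mean "flu", the correlation with government figures does.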
1. The Death of Data Protection?
Professor of Internet Governance
University of Strathclyde
Goettingen, July 2013
3. Q. Do people still care about privacy?
JAN 2010: Zuckerberg : “People have
really gotten comfortable not only
sharing more information and
different kinds, but more openly and
with more people.. privacy is no longer
a ‘social norm’ .”
JUNE 2013: Washington is using "American-style Stasi methods," said Markus Ferber MEP, of Chancellor Angela Merkel's Bavarian sister party. "I thought this era had ended when the …"
4. PrivacyMemes, June 11 2013
5. Viviane Reding: Prism "shows why a clear legal framework for the protection of personal data is not a luxury but a necessity"
Ron Paul: What most undermines the claims of the Administration and its
defenders about this surveillance program is the process itself. First the
government listens in on all of our telephone calls without a warrant and then if
it finds something it goes to a FISA court and gets an illegal approval for what it
has already done! This turns the rule of law and due process on its head.
6. In Europe, even UK(!) rising online privacy concerns
c 2010 on
C4, May 2010
7. Attitudes towards data protection
• 60% of Europeans who use the internet (40% of all EU citizens) shop or sell things online and use social networking sites.
• People disclose personal data, including biographical
information (almost 90%), social information
(almost 50%) and sensitive information (almost
10%) on these sites.
• 70% said they were concerned about how companies
use this data and they think that they have only
partial, if any, control of their own data.
• 74% want to give their specific consent before their
data is collected and processed on the Internet.
EC citizen attitudes towards data privacy …
8. Reform of the Data Protection Directive (DPD) ?
January 2012 Draft General Regulation
• Main issues
– Combine rules on DP in the police & LEA sector with existing rules for
“civilian” data controllers? (in fact kept separate)
– Address globalisation better – data flows out of EU
– Improve harmonisation within EU (binding interpretation by Art …)
– Strengthen Data Subject’s rights/ enhancing control over PD eg,
online subject access, clarifying definitions of consent
– Reduce red tape for Data Controllers – multinationals only to be
regulated by 1 EC DPA - saving 2.3 billion Euros for EU industry -
quid pro quo?
– Make DCs more accountable, eg, must have a CPO; audit trails of
processing; “privacy by design” (?)
– Clarify rules on jurisdiction, applicable law and DP (eg Facebook?)
9. Fiddling round the edges while privacy …
OECD Privacy Principles, 1980 / “FIPs” / “notice and choice”
• Collection Limitation Principle
• Data Quality Principle
• Purpose Specification Principle
• Use Limitation Principle
• Security Safeguards Principle
• Openness Principle
• Individual Participation Principle
• Accountability Principle
10. Data Protection Principles (DPD, art 6)
1. Personal Data shall be processed lawfully and fairly (“collection limitation”)
2. Personal Data shall be obtained only for one or more specified and lawful purposes, and shall not be further processed in a manner incompatible with those purposes (“purpose/use limitation”)
3. Personal data shall be adequate, relevant and not excessive in
relation to the purpose for which it was processed (add “data
minimisation” principle? – DP Reg)
4. Personal data shall be accurate and kept up to date if necessary
5. Personal data shall not be kept for a longer time than it is
necessary for its purpose. (“retention”)
6. Personal data can only be processed in accordance with the rights
of the data subjects (“openness”)
7. Appropriate technical and organisational measures shall be taken
against unauthorised or unlawful processing (“security”).
8. Data export principle (and DP Reg may add “accountability”)
11. Fundamental challenges not addressed
A. Decline of real and informed consent
B. Ubiquitous computing/ambient
intelligence/the Internet of Things
C. Big Data and profiling
• In each case fundamental elements of
the “notice and choice” paradigm are
elided or destroyed
12. A. Consent
• Existing DPD: Art 7 – grounds for fair processing
(1st DP principle)
– Consent of Data subject.
– Necessary to perform contract DS is party to or for DS to
enter a contract.
– Necessary to comply with a legal obligation of the data controller.
– Necessary to protect DS’s “vital interests”.
– Processing is necessary for judicial purposes, public acts or
acts of crown.
– Necessary for “legitimate interests” of DC unless contrary
to human rights of DS.
13. Consent as it’s meant to be
• DPD , Art 2 “any freely given, specific and
informed indication of his wishes by which the
data subject signifies his agreement to personal
data relating to him being processed.”
• Art 7 as ground for fair processing,
• Art 8(2)(a) as ground for processing of sensitive data
• Freely given? Standard terms? Employees?
Consumers? Social Networks?
• Art 29 WP reports questioned quality of consent in privacy policies and in some relationships, esp employment surveillance (social media …)
14. Consent online in real life
• Privacy policies largely unreadable by non-lawyers
• Users prize immediate gains (social inclusion) over future dangers (data leakage, employers, NSA etc) -> faulty risk assessment
• Constant change of T&C and defaults requires continuing vigilance and skill from users
• Lock-in/network effects => non-competitive market on user rights (social death not to be on Facebook; who knows about DuckDuckGo?)
• -> Market failure in respect of privacy on SNSs – so why would anybody bother checking privacy policies?
• SNS economic incentives are to encourage disclosure, not privacy (changing?) (but even mentioning privacy reduces revenues – Bonneau)
15. Consent in real life – complexity, legalese
16. Consent does … T&C and …
17. Consent: DP Reg Solution?
• Change of definition to “freely given, specific, informed and
explicit” – meaning “based either on a statement or on a clear
affirmative action” (new recital 24) – but does this make any
difference in online standard form consumer contracts?
• Consent doesn’t count where there is a “significant imbalance”
between Data Subject and Data Controller (eg employee)
• Largely no restrictions on what can be consented to – no attempt at
a consumer protection/unfair terms regime approach re unread
adhesion contracts – “regulated contracts”
• No restrictions on use of "legitimate business purposes" as alternative to consent for legalising processing (and one report suggests this should enable uses incompatible with the original grant of consent)
• Conclusion – not much help?
18. B. Ubiquitous computing: RFID and the Internet of Things
19. Example: Location data
• Richard Stallman, March 2011
• “It's Stalin's dream. Cell phones are tools of Big Brother. I'm not going to carry a tracking device that records where I go all the time, and I'm not going to carry a surveillance device that can be turned on to …”
• Art 29 WP 13/2011
• Some attempt to provide enforceable rights to “turn
off” location data collection in PECD – how effective?
Eg recent UK EE subscriber location data sales by Ipsos
Mori to Met Police (anonymised?)
20. “Ambient” intelligence/sensor data collection by …
21. Barcelona clubbers get chipped
BBC Science producer Simon Morton
goes clubbing in Barcelona with a
microchip implanted in his arm to
pay for drinks.
Imagine having a glass capsule measuring 1.3mm by 1mm, about the size of a large grain of rice, injected under the skin.
Last week I headed for the bright lights of
the Catalan city of Barcelona to enter the
exclusive VIP Baja Beach Club.
The night club offers its VIP clients the
opportunity to have a syringe-injected
microchip implanted in their upper arms
that not only gives them special access to
VIP lounges, but also acts as a debit
account from which they can pay for drinks.
Data collection from the …
22. Volunteered data about real world
23. London advertisement targets
consumers by gender, with facial
recognition, Feb 20 2012
- Plan UK (charity)
Non volunteered data?
Cas “Ubiquitous Computing, Privacy and DP”, 2009: “Biometric
procedures replace the need to remember passwords or actively
prove authorisation.. [ambient intelligence environments] require
previously inconceivable levels of knowledge about the inhabitants”
Chinese face recognition enabled door – on sale …
24. The future of ambient environments
and the death of notice and choice?
• Ubiquity = “invisible and seamlessly adaptive” (Spiekermann and Pallas) - always
on, always collecting data
• Weiser – ICTs weaving themselves “into the fabric of everyday life until they are
indistinguishable from it”
• The more useful, the less obvious and the less controlled by individual notice and choice
• Adaptive – learn from ambient total data collection, data not forgotten otherwise
usefulness constrained– eg domestic or hospital care robots
• How can this match DP idea of privacy as individual right to control collection of
data? Notions of data minimisation in collection, limitation of purpose and use?
• Note that ambient environments also often collect data about those most vulnerable
and unable to exercise control – young, sick, geriatric, Alzheimer's (eg the iPot, smart
chairs, robots, geo-tagged schools and libraries)
• Cas “ubiquitous computing will erode all central pillars of current privacy protection”
– Default off – but what happens to social gain?
– Controls on use rather than collection – how to enforce? (anonymisation – see later)
– “Negotiation”? Eg wearing hoodies round CCTV; injecting false info (“noise”) into social networks
etc – what is equivalent for ubicomp?
– Privacy impact assessments prior to building systems plus privacy by design? Spiekermann's RFID …
25. Big Data
What is Big Data?
• “about applying maths to huge amounts of data to infer probabilities… The key is these systems perform well because they are fed with lots of data on which to base their predictions”
– Eg Google Flu Trends – most common 50m search query terms analysed
• “big data refers to things one can do at a large scale that cannot be done at a smaller one”
• “in a Big Data age, most innovative secondary uses haven't been imagined when the data is first collected”
– Eg Captcha -> ReCaptcha
• Internet industries produce these huge amounts of data: Google, 24 petabytes/day; FB, 10m photos uploaded/hr; 400m tweets/day (2012)
• “there is a treasure hunt underway” (p …)
26. Effect on DP/FIPs?
• “How can companies provide notice for a purpose that has yet to
exist? How can individuals give informed consent to an unknown?”
• Seeking new consent for each re-use at big data scale seems impractical
• Seeking blanket consents for any re-use destroys the whole point of consent as effective control
– Yet heading this way?: eg Google combining all its privacy consents
(policies) to mail, video, search, blogging etc , Jan 24 2012
• Anonymisation of data collected? A common excuse. Yet re-identification is ever easier, esp with big data recombined – see Ohm, “Broken Promises” (2010) – AOL, Netflix scandals.
– Eg anonymise FB data: re-identification from friends, and friends of friends – the “social graph” – is often easy.
• Ohm “Utility and privacy are, at bottom, two goals at war with one
another” (p 1752)
• M-S and Cukier: “From privacy to accountability” – abandon
dependency on individual consent at time of collection & hold data
users (controllers) accountable (p 173)
– Means what?
– Risk assessment by users of whether data products should be
issued? External/internal audit by “algorithmists”?
– Prior privacy impact assessments for “risky” processing?
– Privacy by design – eg “differential privacy”, fuzzing some of the data
– Justified by benefits of big data to users - Paternalistic trust?
• What would legal liability be for getting it wrong? Strict liability?
Causation? Slamming door after horse has bolted?
• My own “thought experiment” on “privacy tax” on data
users, 2004, “The Problem with Privacy” (SSRN)
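One concrete form of the "differential privacy" fuzzing mentioned above is the Laplace mechanism: answer aggregate queries with calibrated random noise so that any single individual's record barely changes the distribution of answers. This is a minimal sketch with an invented dataset; the parameter names (epsilon, sensitivity) follow the standard differential-privacy literature.

```python
# Minimal sketch of differentially private "fuzzing": the Laplace mechanism
# adds noise scaled to sensitivity/epsilon, so one person's presence or
# absence barely shifts the answer's distribution. Dataset is synthetic.
import math
import random

def laplace_noise(scale):
    """One sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon):
    """Noisy answer to 'how many records satisfy predicate?'.

    A counting query changes by at most 1 when one record is added or
    removed (sensitivity 1), so the noise scale is 1/epsilon.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

random.seed(42)
people = [{"age": random.randint(18, 90)} for _ in range(1000)]

true_count = sum(1 for p in people if p["age"] > 65)
noisy = private_count(people, lambda p: p["age"] > 65, epsilon=0.5)
print(true_count, round(noisy, 1))
```

Smaller epsilon means more noise and stronger privacy, so Ohm's point that "utility and privacy are at war" shows up here directly as a single tunable parameter.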