Be careful what you wish for! Even now it has been finalised, the GDPR may not solve the tech community's key problem: what counts as personal data, and what counts as anonymised or pseudonymous data.
The GDPR for Techies
1. Be Careful What You Wish For!
The Great Data Protection Law Reform Saga of 2012–16
Lilian Edwards
Professor of E-Governance
University of Strathclyde
Lilian.edwards@strath.ac.uk
@lilianedwards
2. A. Europe: from the DPD to the GDPR
• Directive 95/46/EC of EU on the protection of individuals with regard
to the processing of personal data and on the free movement of such
data. Human rights based. Much case law now draws on Charter of
Rights and ECtHR as well as European Court of Justice.
• 1998 - intended to address computerisation/databases but NOT the
Internet
• DPD extended to deal with technology challenges, eg spam and cookies, by
the Privacy and Electronic Communications Directive 2002/58/EC, revised
Oct 2009, in force May 2011 (the “cookie” or E-Privacy Directive)
• Proposed reform as Regulation (GDPR), plus Directive on policing, plus
more – draft out, Jan 25 2012;
• Final compromise, Jan 2016; text April 2016
• 2 yrs implementation then DIRECT EFFECT.
3. Technological challenges to privacy/DP law
• 1995
• Volume of personal data processed, and number of data controllers,
enormous
• Data flows globally but lack of global harmonisation on DP laws
• Lack of public consumer awareness about privacy regulation
• Lack of compliant major actors in web 1.0 (SMEs, spammers, scams
etc)
• -> huge enforcement problems
• 2000 on
• “Consent” as perceived primary protection no longer works well in
web 2.0 click-wrap world (standard terms, privacy policies )
• Post 9/11 politics & low tech costs favour default surveillance and
data retention and mining – if you can do it, why not do it? ->
• Snowden revelations, June 2013, of mass extra-legal surveillance by
public/private entities – Safe Harbor and Data Retention Directive struck down
• New innovative tech nearly always involves networking and data
collection eg robots; music online services; social media; e-voting
• The Cloud – signifies loss of control and visibility as to how/where
data processed
=> Public loss of confidence in privacy law
4. Attitudes to privacy protection - EU
• June 2011 Eurobarometer
• Just over a quarter of social network users (26%) and even
fewer online shoppers (18%) feel in complete control [of
their PD]
• Less than one-third trust phone companies, mobile phone
companies and Internet service providers (32%); and just
over one-fifth trust Internet companies such as search
engines, social networking sites and e-mail services (22%)
• 70% of Europeans are concerned that their personal data
held by companies may be used for a purpose other than
that for which it was collected.
• Only one-third of Europeans are aware of the existence of a
national public authority responsible for protecting their
rights regarding their personal data (33%).
5. Reform of the DPD? Nov 2010
consultation
• Main aims :
– Strengthen Data Subject’s (DS) rights/ trust – eg enhancing
control over PD eg “right to be forgotten”
– Reduce red tape for Data Controllers (DC) -> dump notification;
“one stop shop” national DP regulator
– BUT make DCs more accountable, eg, must have a DPO (data protection officer);
– Give DP more teeth; higher penalties, security breach
notification
– Address global flows of data better, eg, to US cloud
providers
– Improve harmonisation within EU (binding interpretation
across EU DPAs via EU DP Board; Regulation not Directive)
6. DPD art 2(a): Personal data is “information relating to an
identified or identifiable natural person ('data subject'); an
identifiable person is one who can be identified, directly or
indirectly,
• ..in particular by reference to an identification number or to
one or more factors specific to his physical, physiological,
mental, economic, cultural or social identity” + see recital 26
[italics added]
Q. What of IP addresses; cookies, profiled data as collected by
FB, Google, police, insurers? Are they PD?
• Increasing problem in era of Big Data – reidentification
possibility increases – “mosaic” effect and persistent
identifiers like photo icons – tech driven by marketing and
surveillance needs
• When is “anonymization” sufficient to make sure NOT PD?
1. Personal data – scope of GDPR
7. Personal data definition problems
• GDPR Art 4 (1) – almost identical to DPD – adds “by
reference to .. location data, an online identifier..”
• But GDPR recital 26: “to determine whether a person is
identifiable, account should be taken of all the means
reasonably likely to be used, such as singling out either
by the controller or any other person to identify the
individual” [italics added]
• NB recital 30: “traces” left by IP addresses, cookies and
RFID tags, when “combined with unique identifiers”,
may create profiles of natural persons and identify
them
• Contextual test – may depend on which DPA gets to
decide (though harmonisation will prevail)
• NB special rules for consent to cookies exist in the PECD because in
2002 cookies were not clearly regarded as personal data AND it was felt
consent was required, with no alternative grounds.
8. 2. Anonymisation and pseudonymisation
Much “profile data” used to finance the Web – targeted ads – is presented
as “anonymous.” Therefore can be used and reused without DP constraint.
• Arguments over “effective” anonymization
– Privacy fundamentalist – everything can be re-identified with enough
data and time
– High degree of diligence – EU A29 WP
– “risk assessment” – UK approach – ICO Code
• Which won in GDPR?
– No definition of anonymous data, but pseudonymous data is encouraged
(GDPR art 4(5) and recitals 23-23a)
– “pseudonymisation” means processing such that the data can no
longer be attributed to a specific data subject without the use of
additional information so long as such info is “kept separately” and
held securely to ensure this
– Still personal data – but relaxed rules, eg no security breach notification
necessary; POSSIBLY easier to re-use for “compatible” purposes (art 6(4)(e));
and a plus for “privacy by design”
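The art 4(5) definition above can be illustrated with a minimal sketch (field and function names are hypothetical): records keep a random token instead of the direct identifier, and the token-to-identity lookup table is the “additional information” that must be “kept separately” and held securely.

```python
import secrets

# Hypothetical illustration of art 4(5)-style pseudonymisation.
# The lookup table re-identifies the records, so it should live on a
# separate, access-controlled system; the pair as a whole is still
# personal data under the GDPR.

def pseudonymise(records, id_field="email"):
    """Replace id_field with a random token; return (records, lookup)."""
    lookup = {}  # to be stored separately and securely
    out = []
    for rec in records:
        token = secrets.token_hex(16)
        lookup[token] = rec[id_field]
        new_rec = dict(rec)
        new_rec[id_field] = token
        out.append(new_rec)
    return out, lookup

data = [{"email": "alice@example.com", "purchases": 3}]
pseudo, key_table = pseudonymise(data)
# pseudo now contains no direct identifier; only key_table links the
# tokens back to real people.
```

Without `key_table` the records cannot be attributed to a specific data subject, which is exactly the relaxation the slide describes.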
9. 3. Consent
DPD, art 2(h): “any freely given specific and informed indication of
his wishes by which the data subject signifies his agreement to
personal data relating to him being processed.”
GDPR art 4(11) adds “unambiguous”,
and revocability as a key aspect of valid consent (GDPR art 7(3)),
and “a clear affirmative action”, ie silence is not acceptance.
Arguably new(er) requirements in GDPR (art 7(2) and (4))
– written consent to processing should not be “bundled” ie one
consent to everything at once
- consent not free if tied to providing a service but the processing
not necessary for that service(cf FB etc)
BUT
NOT required that all consent be “explicit” – sensitive PD only
NOT explicit that consent is void if “significant imbalance of power”
Children's consent – 13 lowest, 16 highest, depending on EU state – is
messy
Privacy icons NOT required for policies but are encouraged
10. 4. New user rights – the “Right to Be
Forgotten”
• Right to be forgotten (RTBF) – GDPR, art 17. Right of DS to “obtain from the
DC the erasure of personal data” if
– data no longer necessary for original purpose
– DS withdraws consent
– DS objects to their PD being used for profiling
– They have been “unlawfully processed”
• Aimed at hosts/publishers, esp social networks. Intended to protect children
from their own folly! NOT JUST SEARCH ENGINES – see Google Spain v Costeja.
• DC also has further duties when data passed to 3rd parties to process: “shall
take reasonable steps, including technical measures, to inform controllers which are
processing the personal data that the data subject has requested the erasure” (GDPR
art 17(2a))
• Implications for cloud service providers?? Not always controllers.
• Exceptions – see art 17(3).
– Freedom of expression
– Archives, historical, statistical and scientific research? (cf Wikipedia on criminal convictions)
– For proof in legal claims
11. Right to data portability
• Right to data portability, ie, for DS to get a copy of their data to
take elsewhere (GDPR art 20) – “in a structured, commonly used
and machine-readable format”
• Also right to have such data transmitted directly from co A to B
“where technically feasible”
– Aimed at breaking “lock in” to sites like Facebook – network
effects
– But some see as additional burden for service providers OR as
new market opportunity for infomediaries
– UK MiData initiative has already kicked off – mainly re energy cos,
also banks, mobile phone cos – see Enterprise & Regulatory Reform
Act 2013 – powers in reserve, not yet implemented
– Not a right to interoperability
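The GDPR does not say what a “structured, commonly used and machine-readable format” is; JSON (or CSV/XML) is one obvious candidate. A minimal sketch, with hypothetical record and function names:

```python
import json

# Hypothetical sketch of an art 20 export: bundle a data subject's
# records into a single machine-readable JSON document that they (or
# an infomediary) can take to another provider.

def export_subject_data(profile, posts):
    """Bundle a data subject's records as a JSON document."""
    bundle = {
        "format_version": 1,
        "profile": profile,
        "posts": posts,
    }
    return json.dumps(bundle, indent=2, ensure_ascii=False)

doc = export_subject_data(
    {"name": "Alice", "joined": "2014-02-01"},
    [{"id": 1, "text": "hello"}],
)
# The same document could then be sent directly to another provider
# "where technically feasible" - portability, not interoperability:
# nothing obliges the recipient to understand the schema.
```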
12. 5. Increased enforcement - 1
• Mandatory security breach notification (GDPR art 33-34).
• Already introduced for telcos/ISPs in PECD art 4 (as amended 2009)
• Aim is naming and shaming to prevent breaches; also notice
to public enables them to get remedies, take protective steps
• Devil in the details:
– what triggers (all PD breaches “unless the personal data breach is unlikely
to result in a risk to the rights and freedoms of natural persons” – data
encrypted or pseudonymised?);
– Tell DPA – for UK, ICO
– Communication to individual DSs only if “high risk” as above
– Public announcement only necessary if too hard to notify individuals in high
risk cases
– how long to fix before notifying (within 72 hours if feasible)
– Parallel notification under EU Network Information Security Directive (NIS)
likely (affects non PD breaches as well)
• How effective? US and Japanese experience found SBN not that
helpful. EU lacks US-style class action rules.
• In UK Vidal-Hall v Google may help DSs in collective claims in
allowing action for DP breach even where harm not economic
13. Heavier penalties
• GDPR originally suggested penalties of up to €1 million or
up to 2% of the global annual turnover of a company. EU
Parl suggested 5% turnover, up to 100 mn Euros.
• Final GDPR – two levels
– Up to 10 mn Euros or 2% annual global turnover
– Up to 20 mn Euros or 4% global turnover for more severe
infringements
• Cf USA –big privacy breach cases, FTC large fines – 2012,
Google fined $22.5m (but < 1 day’s profit) ; FB, 2012, no
fine but $16,000/day per violation of agreed privacy
settlements & 20 years audit
• Smaller but more effective remedies? Disqualification from
company directorship??
• Competition remedies to break up info-monopolies??
14. Preventing breaches?
• More guidance on security obligation, art 32, inc using
pseudonymisation and encryption, restoring access in timely
fashion, adhering to codes of conduct or certificates/seals
• “Privacy by design and default”
• Mandatory! “the controller shall.. having regard to the state of the
art and the cost of implementation” (art 25)
– Implement “technical and organisational” measures to implement DP
principles
– Pseudonymisation and data minimisation specifically mentioned
– “privacy by default” – only collect the data necessary for each specific purpose
– Art 35; DP impact assessments – if “high risk” processing, esp using “new
technologies”, DPIA to be carried out before processing
– Esp likely for automated profiling systems, or “systematic monitoring of
public areas”
– UK ICO has much guidance on PIAs but little use in private sector
– Lists of likely systems needing DPIAs to be issued by DPAs
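The “privacy by default” bullet above can be sketched in a few lines (purpose names and fields are hypothetical): a form handler keeps only the fields declared necessary for its purpose and discards everything else, however much the client submits.

```python
# Hypothetical sketch of art 25 "data protection by default":
# a whitelist of fields per declared purpose, applied before storage.

NEEDED = {
    "newsletter": {"email"},
    "delivery": {"name", "address", "email"},
}

def minimise(purpose, submitted):
    """Keep only the fields necessary for the declared purpose."""
    allowed = NEEDED[purpose]
    return {k: v for k, v in submitted.items() if k in allowed}

signup = minimise("newsletter",
                  {"email": "a@b.c", "dob": "1990-01-01", "phone": "x"})
# signup == {"email": "a@b.c"} - dob and phone were never stored
```

Making the whitelist, rather than the full submission, the default path is the “technical and organisational” measure art 25 has in mind.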
Editor's Notes
Move from mainframes to client/server technology, + web 2.0, => millions of “data controllers” – private as well as state/big commerce; mice not elephants
Sheer amount of data processed + traffic data, profiling, data mining – the “database nation”
Internet/digitisation allows global rapid spread of data
But – lack of harmonisation in transnational cyberspace/outside EU (also lack of harmonisation WITHIN EC – see definitions)
=> DP does not fit corporate data sharing models and globalisation/out sourcing
Lack of public pressure/knowledge of rights – dullness! => Lack of enforcement resources
Review by E Comm overdue.
“identification numbers, location data, online identifiers or other specific factors as such should as a rule be considered personal data.“ (draft GDPR, recital 24), removed in final compromise