DATA PROTECTION MAGAZINE
spring 2018
issue 1
Published by Data Protection World Forum
Editor: Michael Baxter, michael@dataprotectionworldforum.com
Contributions from: Laura Edwards, Ardi Kolah, Adil Akkus, Deborah Dillon, Jenna Banks, Nathan Sykes, Jason Choy
Design: Hannah Richards, Amplified Business Content
Lawfulness of processing
Before you go any further, take note of this; it seems to be the most common point of discussion whenever the topic of GDPR
comes up.
Article 6 GDPR, Lawfulness of processing
Processing shall be lawful only if and to the extent that at least one of the following applies:
•	 (a) the data subject has given consent to the processing of his or her personal data for one or more specific purposes;
•	 (b) processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
•	 (c) processing is necessary for compliance with a legal obligation to which the controller is subject;
•	 (d) processing is necessary in order to protect the vital interests of the data subject or of another natural person;
•	 (e) processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
•	 (f) processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.
Point (f) of the first subparagraph shall not apply to processing carried out by public authorities in the performance of their tasks.
a note from the editor
Welcome to Data Protection Magazine.
We are all data protection people now.
Privacy and data protection have moved from an obscure corporate
cubbyhole to the front pages of newspapers, and it feels as if
everyone has an opinion.
The role of the data protection officer, or DPO, has been
transformed. In the post-GDPR era, DPOs don’t simply take
instructions. They report to the highest level of management
authority and are experienced in business continuity, risk
and technology.
Read the media, and you may come away with the impression
that the DPO has gone from the person of last redress to
rock star. In fact, the change is more nuanced than that.
The role of the DPO under the GDPR is radically and legally
different.
But caring about data protection is not just a job for the
DPO. It needs to run deep within a company and become a key
consideration for senior management.
The C-suite does not merely need to become data protection
aware; the ladies and gentlemen who make up the higher echelons
of the corporate hierarchy need to breathe it. It needs to be
embedded into their corporate mindset.
As the editor of this publication, I too have been on a
data protection and privacy journey. My background is as an
economic and disruptive technology writer, but until a year
or so ago, thinking deeply, or indeed narrowly, about privacy
was barely on my radar.
I had this notion that we may be entering an era which will see
the end of secrets - that this is the inevitable consequence
of creating transparency.
But the truth is there is a dividing line. Transparency is
good, but we do not want a transparent wall in our bathroom.
Teenagers crave privacy from their parents. The mums and dads
want transparency from their kids.
Creating privacy in the age of transparency is one of the most
important challenges of the digital economy.
Data Protection Magazine is aimed at the business generalist
rather than the data protection or privacy professional and
will be released on a quarterly basis. Enjoy!
spring 2018
issue 1 contents
02. You haven’t seen anything yet!
05. GDPR is good for you
06. Goodbye darkness hello friend
10. GDPR and the United States
12. GDPR and how the millennial generation manage privacy
14. GDPR – a quick summary
20. Privacy by design and default
22. From Grim Reaper to hero
26. The Canadian experience
28. Interview with Abigail Dubiniecki
30. GDPR and the Fourth Industrial Revolution
34. AI, GDPR and the anonymisation of data
38. Selling your data
40. Anatomy of a breach in trust, the Facebook story
44. Blockchain and GDPR
46. The GDPR and small businesses
50. The Data Protection World Forum
52. Breaches, incidences and security
56. Let’s put our head in the clouds
58. Everything you need to know about cybersecurity
60. The cardless society
62. Subject access requests
65. The wallflower of compliance grows thorns
The General Data Protection Regulation is here.
We sat down with Nicola McKilligan-Regan, one of
the UK’s leading experts on data privacy, a woman
who literally wrote the book about the Data
Protection Act of 1998, to give us the benefit of
her wisdom.
Nicola herself has been working in privacy
for 20 years, she admits reluctantly.
She began working at the Information
Commissioner’s Office (ICO) before it was
called that. Instead, it went by the name of
the Data Protection Registrar. She worked as
a Strategic Policy and International Officer.
In 2000, after the Data Protection Act
1998 came into force, she left to focus on
helping organisations with the new law.
In 2004, her book, A Pocket Guide to the
Data Protection Act, was published. The
ICO once said of the book that it was the
sort of guidance it wished it had written.
Today, she is the Senior Partner at the
Privacy Partnership, as well as the founder
and CEO of Smart Privacy, which provides
privacy tools for businesses struggling
to implement GDPR. And between now
and when her book was published, she
has been the Vice President Privacy at
Thomson Reuters, and has enjoyed a
glittering career in data privacy. The latest
version of her book, The Pocket Guide to
the Data Protection Act, GDPR and DPA
2018, is being published later this year.
And we began by talking about whether
GDPR is over-hyped, like the millennium
bug 18 years earlier, soon to be forgotten,
just as May 25th 2018 itself will be
relegated to being just another ticked off
date on the calendar.
Is GDPR like the millennium bug, the so-called Y2K?
Nicola: “No. We didn’t know what effect
the millennium bug would have, but before
GDPR is even implemented, regulators
have begun flexing their muscles.”
She poses a question. “If GDPR is
just like the millennium bug, why has
Facebook moved the accounts of 1.5
billion users from Ireland to the US?”
“The new regulations give regulators a
lot more scope to take a hard line against
businesses that don’t treat our personal data
and privacy with the respect it deserves.”
And then she sums it all up in one
phrase: “You haven’t seen anything yet!”
Why does GDPR matter?
We can all get lost in detail: GDPR is
such a vast topic that it can be devilishly
difficult to sum it up in a way that captures
the imagination. We asked Nicola to do
just that.
“The scale of the penalties indicates that governments now see protecting privacy as important an issue as being able to stop people from lending money to criminals or selling arms to rogue states.”
•	 Why it’s not like the millennium bug
•	 Why does GDPR matter?
•	 Is it good or bad?
•	 Message for the CEO
•	 Message for the DPO
•	 First steps
•	 GDPR and the lean start-up
Nicola: “It’s the biggest change in data
legislation in 18 years, as it brings data
protection legislation up to speed with the
internet age.
“The scale of the penalties indicates that
governments now see protecting privacy
as important an issue as being able
to stop people from lending money to
criminals or selling arms to rogue states.
“For the first time, privacy regulators
have really strong powers. The law
now applies more directly to the latest
technology and is supported by a
compliance regime which is the equivalent
of major compliance regimes such as AML
– anti-money laundering.
“If privacy was like a poor second cousin
to AML compliance, it’s not like that now.
“It’s about public interest. What
governments think is important has
changed from 18 years ago.”
And is GDPR good or bad?
So, given this, we asked whether the
regulation is good or bad.
Nicola: “It’s both. It was needed and I
am a supporter of the general provisions.
I am not so keen on the bureaucracy. It
means a lot of record keeping, a lot of
work for lawyers and consultants. But the
trouble is, there is a danger that you end
up paying less attention to the real risks
to individuals - bureaucracy may act as a
distraction from real issues.”
Nicola cites Facebook as an example:
“It has a compliant privacy policy, but
whether it is fair in the way it processes
its users’ data is another matter.”
A message for CEOs
What advice does Nicola have for the CEO?
“It is no longer possible, as Mark
Zuckerberg found out, to hide behind
saying ‘I didn’t know.’ Make sure you
understand how privacy affects the
business. CEOs are always busy, but they
must find the time. If their organisation is
large, they may have a General Counsel
and a data protection officer. Pay attention
to them. CEOs who are not aware of the
key issues will be hung out to dry.”
A message for the Data Protection Officer
And what about the data protection officer, or
an external advisor: how do you convince
the board to take GDPR seriously?
“Make sure your advice is very specific
to the business model. Make sure your
leadership fully understand how GDPR
affects the business.
“Remember, if you can stand out as
an organisation that can solve GDPR
problems or provide data privacy
safeguards for customers and clients, that
can be a commercial advantage. CEOs like
to hear about that.
“If instead, you focus on the fines, you
risk the CEO saying ‘but, that’s what we
pay you to avoid.’”
In short, Nicola says, “engage the board on their
level by focusing on the business advantage”.
The DPO
As you read this magazine you will find
our article ‘Grim Reaper to Hero’. The role
of the data protection officer has changed.
But Nicola sees a challenge ahead: there is
a shortage of people with the necessary skills.
“To be a DPO you need to have experience
to deal with practical issues - just sending
someone on a course won’t cut it.”
She also foresees a conflict of interest
problem. “You can’t be the problem solver
and advisor like you used to be. You are
more like a policeman now. That means
you are going to have to say ‘no’ a lot more.”
The first steps
It’s a bit late to be worrying about the first
steps now. But Nicola does have some
specific advice on your general approach
to GDPR. “If you do nothing else, train
your staff. If you have an informed workforce
it will reduce your risk.”
The lean start-up
Nicola also has some advice for the start-
up, maybe a company applying the ideas
of a lean start-up.
“Elizabeth Denham, the UK’s Information
Commissioner, has said that one of her
big focuses is supporting small businesses.
“I would say to a small business, talk to
the regulator, either on a no name basis or
get written advice.
“The ICO has specialist teams in
different sectors. And going to the ICO is
cheaper than going to a lawyer.
“My advice to a lean start-up, talking
to the ICO, is to provide it with as much
information as possible. If you get written
advice, keep it; if you get advice from an
ICO help-line, take accurate notes: the
name of the person you spoke to and the
date and time of the call.
“But you need to know enough about
GDPR to recognise you have a problem.”
Three take-homes
Finally, we asked her to sum it all up and
tell us the three most important things
about GDPR.
First off, she talked about consent. “I
have always had a problem with consent
being used as grounds for processing
information, because people were not
really given a choice. Now companies can
no longer say ‘yes we have consent’ if they
make it a condition of providing a service.
And that is a big change. Now consent has
to be totally free and informed.”
Second, she turned to legitimate
interests. Consent and legitimate interests
are two of the legal bases for processing data
under GDPR. “Organisations can no longer
tick a box, they have to document their
argument for why their legitimate interests
are not a threat to individuals’ privacy. So,
they have to conduct a balancing test. You
can’t just process information because it is a
good commercial idea, you have to make sure
it is balanced against the rights of individuals.
And you have to document it, prove it and
have a response if you are challenged.”
Thirdly, “what you now see is a
confident regulator, who has the power to
take on large companies and is not afraid
to act. Regulators now feel that they have
the tools to act against Big Tech.
“GDPR is like a gun aimed at the temple of
the big technology companies. Other smaller
businesses are caught in the crossfire.”
The rest is history
The build-up to GDPR is over. That is now
history; the hard work is ongoing.
In producing this magazine, we have
spoken to experts on GDPR and data
privacy, many of whom have been
speakers at our GDPR Summits.
We have more in the diary.
And don’t forget, the Data Protection
World Forum this November, where many
of the experts we have been talking to for
this publication, including Nicola herself,
will be speaking.
By MICHAEL BAXTER
03.
GDPR IS GOOD FOR YOU
The world’s largest company boasts a privacy policy that is a joy
to read.
What’s good for the customer is good for the business that
serves that customer and GDPR is good for both.
Let’s face it, privacy policies are not always fun to peruse. If
someone wrote a book: “The world’s riveting privacy policies” it
may be thick enough to act as a door stop, but only in a dolls’ house.
Yet Apple, the company whose iTunes terms and conditions
make GDPR read like romantic fiction, takes a pretty impressive
stab at it.
Maybe it is a coincidence that the company with such a privacy
policy is also valued at $845 billion. But then there is no doubt
that many millions of users choose its phones for the perception
of privacy and data security. Maybe it is unfair to describe the
company thus, Samsung will undoubtedly claim its own policies are
just as robust, and maybe it would be right to do so. But perception
is everything, and Apple is perceived in a certain way, this is at
least one of the reasons why it is so successful and valuable.
But to get to the nub of the matter, cast your mind back to 1949,
when a book was published whose first sentence should be etched
onto the minds of data professionals: “It was a bright cold day in
April and the clocks were striking thirteen.”
So began George Orwell’s 1984, a dystopian vision which
gave the world such memes as Big Brother, Room 101 and
doublethink. In the vision, privacy is no more. Not even our
thoughts belong to us.
Thirty-nine years later, Apple announced its new computer to
the world with a now famous ad featuring an Orwellian world:
“On January 24th, Apple Computer will introduce Macintosh and
you will see why 1984 won’t be like ‘1984.’”
Fifteen years after that, Scott McNealy, the then chief executive
of Sun Microsystems, said: “You have zero privacy anyway. Get over
it,” and famously in 2010, Mark Zuckerberg himself said: “People have
gotten really comfortable, not only sharing more information
and different kinds, but more openly and with more people.” He
suggested that such lack of privacy has become a “social norm”.
Zuckerberg is often quoted as saying privacy is dead.
But there is no evidence he actually said that. Soon after he
uttered those words about lack of privacy becoming a social
norm, a spokesperson for Facebook said: “His remarks were
mischaracterised,” and added: “A core part of Facebook’s mission
has always been to deliver the tools that empower people with
control over their information.”
She continued: “If the assertion is that anything Mark chooses
to make private is inconsistent with his remarks last week, here
are a few other hypocritical elements of his life: he hides his
credit card numbers in his wallet, he does not post the passwords
to his online accounts, and he closes the door behind him when
he goes to the toilet.”
Yet, look at how the Facebook share price slid after the revelations
of its deal with Cambridge Analytica were outed by The Guardian
and New York Times; and look at the value of Apple.
That is why there are big bucks in data protection, and it is why
GDPR is not merely a good thing, it’s good news for the world.
If, as The Economist suggested last year, big data is really the new
oil, and not the new asbestos, as cynics might suggest, then that
counts for nothing if people don’t trust companies with their data.
A recent survey by ForgeRock found that 57 per cent of UK
consumers are worried that they have shared too much personal
data online. Fifty-three per cent said they would not be comfortable
if their personal data was shared without their permission, and
a further 58 per cent said they would stop using a company’s
services completely if it shared data without permission.
If data is the means by which the digital economy can create
wealth and give the so called Fourth Industrial Revolution
the impetus it needs, then without trust, without the public’s
confidence that their privacy is respected, the data revolution will
stutter and then collapse.
That is why we need GDPR: not as a piece of regulation to
scare companies, not through some sense of anti-Orwellian
sentiment, and not so that regulators can strut the privacy stage
like thought police, punishing the slightest transgression with
fines that bankrupt well-meaning companies. We need it because,
without trust, the data revolution may not happen - and that would be a disaster.
PS: This article was written before Apple announced a new
upgrade to its iOS operating system and OS for the Mac, designed
to offer improved privacy. It seems that Apple sees data privacy
as a core strength, a feature that can give its products an edge.
05.
Goodbye darkness,
hello friend
GDPR is the general regulation, but contrary
to many media reports, other rules and
directives such as MiFID II, Open Banking,
the ePrivacy Regulation and the Directive
on Security of Network and Information
Systems do not muddy the water; in fact,
they fit together.
People too easily forget that the financial crisis of 2008 was so severe
that at one point it seemed as if capitalism itself was tottering.
The truth is, that in the build up to that crisis, if you listened to
the noise coming out of regulators and august bodies such as the
IMF, then to borrow words from Simon and Garfunkel, you heard
the sound of silence.
The famous singing duo also said: “Hello darkness my old friend”,
but the new digital single market and other regulations slowly being
unveiled by the EU do not create a new barrier to business, as some
might say, rather they are designed to bring “down barriers to online
opportunity,” or so says the European Commission.
New regulations are also about transparency, and especially in
the case of GDPR, responsibility and accountability. Above all,
they are also about creating trust.
Take MiFID II - Markets in Financial Instruments Directive
version 2 - the media, always keen to spot a scandal, say that the
new financial instruments directive is not compatible with GDPR
- that the transparency required by MiFID II is at odds with the
privacy that GDPR is designed to support.
But this is simply not true. The detail does not reveal the devil,
it reveals light.
MiFID II is designed to force financial organisations to act in
the best interests of their clients, and not, as happened prior
to 2008, occasionally bet against clients, or recommend products
simply because they pay a higher commission.
So, for example, out goes the practice of giving away free
research as an inducement to trade via an organisation - instead
research must either be charged for or given away to all who
want it, regardless of whether they are customers.
High frequency trades must be time stamped in an effort to put an
end to the practice of ordering a trade in an attempt to manipulate
the market, and then cancelling the order before it is executed.
But perhaps most significantly of all, MiFID II requires the
holding of data concerning staff’s trading activities. It does this by
saying that for every trade that your bank or broker makes on your
behalf, there must be a detailed record. It must be possible to pull
up all the data and information that goes with the trade, regardless
of the medium, for example, a recorded phone call with someone
giving instructions. Or it could be a text, an email, anything that
relates to that trade - creating a paper trail, so there is a clear
record showing what went into the decision to make that trade
and whether it was in the best interests of the customer.
So MiFID II is all about empowering the consumer to make the
right choices.
But some argue that they have spotted a contradiction with GDPR.
MiFID II requires the processing of data related to individuals,
GDPR imposes rules on the processing of personal data.
But as Abigail Dubiniecki, Associate at Henley Business
School’s GDPR Transition Programme and Specialist in data
privacy at My Inhouse Lawyer, says: “There may seem to be a
contradiction between MiFID II and GDPR but that’s not the
case. Indeed, the FCA (Financial Conduct Authority) and ICO
(Information Commissioner’s Office) have gone on record saying
they are not incompatible, because under article six of GDPR,
By Michael Baxter
06.
GDPR means you can only use
the data for the purpose for
which it was given. So if they
share it with someone else, or
try and sell another service,
that is a no no, unless they
find a legitimate interest or
they get my consent or make use
of one of the other lawful bases.
one of the lawful bases for processing personal data is legal
obligation – you have a legal obligation, under MiFID II, to suck
up all that data related to that trade.”
Of course, it is nuanced. As Ms Dubiniecki added: “Could you
suck up all that data and keep it forever, or not use it for the
purpose for which you gathered it? Absolutely not. Under GDPR
you have limited retention periods, you are not allowed to hold
onto it for longer than is necessary for the purpose for which you
got it, under the legal bases you used.”
Returning to the Digital Single Market, another new area is
Open Banking, or PSD2. The idea here is simple enough, and
at least in part the regulation was created from one of the
lessons of the 2008 crash: the risk of ‘too big to fail’ banks.
It’s about data portability for the financial services sector, it
is saying: ‘I should be able to take my personal data related to
the banking that I have been doing with a particular bank, in a
machine readable format that is common to everyone else, and
plug it in somewhere else to see if I can get better rates, or a
better deal.’
Some fintechs may provide more information to you. Under
Open Banking, it will be possible for an app to take your
banking data and provide you with a detailed online view of your
spending, updated in real time, project how much money you
will have at the end of the month, and search for better deals for
you on financial products.
But will the consumer have sufficient trust in how her or his data
is processed? If the consumer is reluctant to let a third party take
control of their data, then PSD2 will not achieve its desired outcome.
There are lots of safeguards built into PSD2, rules about how
data is used, but perception matters - the barriers to online
opportunities will not be overcome unless this trust can be earned.
That is why GDPR is so vital - the General Data Protection
Regulation is what its name suggests: general. And as Abigail
Dubiniecki said: “PSD2 does provide protections, but GDPR
means you can only use the data for the purpose for which it
was given. So if they share it with someone else, or try and sell
another service, that is a no no, unless they find a legitimate
interest or they get my consent or make use of one of the other
lawful bases.”
Open banking is a fundamental part of creating a digital market
that serves the best interests of customers, but without trust it
won’t get the take-up. GDPR is vital for creating that trust.
Another area to watch is the new ePrivacy Regulation. At the
moment, GDPR and the Privacy and Electronic Communications
Directive or PECR, work hand in hand. A marketer, for example,
seeking a legal basis for emailing customers may feel that
the requirements for consent, under GDPR, are too onerous,
and instead choose legitimate interest as their legal basis for
processing personal data.
GDPR is clear; Recital 47 states it in black and white: “The
processing of personal data for direct marketing purposes may
be regarded as carried out for a legitimate interest.” It is just
that PECR imposes additional considerations. For one thing,
it requires that in most cases, people have to give consent to
receive emails, but, as John Mitchison at the Direct Marketing
Association pointed out in a discussion with us: “There is a line
that refers to soft opt-in. So if you have collected someone’s email
in the course of doing business, and there is an opt-out option,
you can send them emails.”
PECR is soon to be replaced by the new e-Privacy Regulation.
We are not sure what the final regulation will say yet, but there
is one thing we can say for sure: it will dovetail with GDPR, the
regulations will support each other and provide more light.
Finally, there is The Directive on Security of Network and
Information Systems (NIS Directive). EU countries have until
9 May 2018 to transpose this into national law. GDPR is
unambiguous on the subject of security, and what to do in
the event of a data breach: Article 33 states: “In the case of a
personal data breach, the controller shall without undue delay
and, where feasible, not later than 72 hours after having become
aware of it, notify the personal data breach to the supervisory
authority competent in accordance with Article 55, unless the
personal data breach is unlikely to result in a risk to the rights
and freedoms of natural persons. Where the notification to the
supervisory authority is not made within 72 hours, it shall be
accompanied by reasons for the delay.”
GDPR then goes into further detail on cybersecurity. The NIS
Directive’s objective is to “increase cooperation between member
states and lay down security obligations for operators of essential
services and digital service providers.” While GDPR focuses on
cybersecurity in the context of data privacy, the NIS Directive focuses on the
sharing of information on cybersecurity threats, and on improving
security safeguards. The two measures are once again designed
to dovetail - and if there is any superficial indication that the
sharing of information required by NIS is at odds with GDPR’s
requirement for privacy, such a contradiction breaks down
when you consider that one of the lawful bases for processing
personal data under GDPR is legal obligation. Moreover, the
NIS Directive specifically states that any information sharing in
response to an incident must be consistent with GDPR
when personal data is involved.
The digital single market, big data, and the opportunity it
all brings can create wealth and prosperity - but only if it can
illuminate where currently there is darkness, bring transparency
where there is a lack of visibility, and trust where there
is suspicion. GDPR, together with a raft of other measures, is
designed to bring lightness and make digital technology and data
the customers’ friend.
Goodbye darkness,
hello friend
08.
GDPR IS A JOURNEY,
NOT A DESTINATION
£50 OFF
BOOK NOW
PROMO CODE: DPMAG
GDPRSUMMIT.LONDON
THE UK’S LEADING GDPR EVENT SERIES
UNDERSTAND WHAT GDPR
MEANS FOR THEIR BUSINESS
HEAR FROM THE LEADING
GDPR EXPERTS
BENEFIT FROM INTERACTIVE
IN-DEPTH SESSIONS
GET ANSWERS TO THEIR
BURNING QUESTIONS
GAIN AN ACTIONABLE ROADMAP
TO COMPLIANCE AND BEYOND
SUPPORTED BY
ATTENDEES WILL:
GDPR is an EU measure, but the principles
that lie behind the regulation are
gaining traction around the world.
George Soros, the billionaire investor and
speculator, who famously bet against the
Bank of England in 1992 and won, is an
arch critic of the way big tech companies
such as Alphabet and Facebook use data.
Maybe it has something to do with Soros’
roots - growing up in Hungary under Nazi
occupation. “Social-media companies are
inducing people to give up their autonomy,”
he said, “the power to shape people’s
attention is increasingly concentrated in
the hands of a few companies threatening
the concept of ‘the freedom of mind.’”
For that reason, he is a big supporter
of GDPR. “Europe has much stronger
privacy and data-protection laws than
America,” he said when speaking at the
World Economic Forum in Davos, earlier
this year. He continued, “Commissioner
Vestager (European Commissioner for
Competition) is the champion of the
European approach. It took the EU seven
years to build a case against Google, but
as a result of her success, the process
has been greatly accelerated. Due to her
proselytizing, the European approach has
begun to affect attitudes in the United
States as well.”
Yet there is a perception that the US is
too far behind.
Commenting on the ongoing issues
with Facebook, Věra Jourová, EU
Commissioner for Justice, Consumers and
Gender Equality, recently told the BBC:
“I can tell you that now we have much
stronger arguments to say that we did
the right thing to adopt the General Data
GDPR and the United States
by Michael Baxter
11.
Protection Regulation. . . So there will be
a much stricter regulatory framework
in the EU. . . But I have just come back
from Washington and when I look at the
American legislation, I don’t see such a
robust legislative framework so it will be
interesting to look at the United States
to see whether they will come up, in the
future, with stricter rules.”
Could this be changing?
“The Americans have famously over time
been the people who will give away data
all day long,” says Eve Maler, the VP of
Innovation & Emerging Technology at
ForgeRock. But this could be changing:
speaking to us from Seattle, she said:
“Americans have become sensitised and
more cynical.”
A recent survey from The Economist
Intelligence Unit (EIU) and sponsored
by ForgeRock, which describes itself as
a digital identity management company,
found that Americans are the most
concerned about misuse of data, with the
survey including people from Europe and
Asia Pacific as well as the US.
Eve Maler speculates that, in what
she calls ‘the post, post-Snowden era’,
people are beginning to have strategies
around their privacy rights. She suggests
that following various breaches, United
Healthcare and Equifax, for example,
American attitudes have shifted.
Věra Jourová said: “I have a very
strong feeling that the tiger got out of the
cage.” Referring to the recent furore over
Facebook, she continued: “Something that
is serious happened. That will have opened
all of our eyes. I see that American people
are more relaxed about their privacy and
they are not looking to impose such strong
pressure as the Europeans. This strong
pressure from Europeans resulted in the
stricter rules, coming into force in May
(GDPR). And I expect something like that
in the United States.”
What happens in the US regarding
data privacy is relevant to companies
trying to comply with GDPR, not least
because under GDPR, a data controller
is responsible for the personal data it has
collected, even when it is processed by
third parties including third parties not
based in the EU. See this in the context of uploading data onto the cloud: as far as GDPR is concerned, when data gathered by a company that falls under the jurisdiction of GDPR is uploaded into the cloud, and that data is held on servers in the US, maybe owned by Amazon, Microsoft or Alphabet, then that data is regarded as having been transferred out of the EU. Or so Anthony Lee, a privacy lawyer and Partner at DMH Stallard, told us.
Mr Lee explained: “There is a general
prohibition of personal data exported out
of the European Economic Area, unless
it is going to a country which has been
designated by the EU commission as
having suitably robust privacy laws in
place. And for example, the US is not one
of those countries.”
The data privacy story in the US is a winding one.
There used to be Safe Harbour, until the Austrian lawyer and privacy activist, Max Schrems, with half a mind on the Edward Snowden revelations, took on Facebook.
Safe Harbour was a voluntary scheme that had been set up by the US government in conjunction with the EU Commission: if a company was a member, you could tick the adequacy box required for the export of data.
Schrems objected to the way US state agencies could, as Anthony Lee put it, “essentially help themselves to data under the Patriot Act - creating the impression of mass surveillance.” Mr Schrems took the matter up with the Irish Data Protection Commissioner, who dismissed his case as “frivolous and vexatious.”
Schrems was not finished, however, taking the case to the Irish High Court, which referred the matter to the European Court of Justice. The Court sided with Schrems, ruling that Safe Harbour was unlawful and could no longer be used.
The eventual result was the Privacy Shield, the voluntary system currently in place. According to Anthony Lee, under Obama there was an understanding that there would be restraints on the extent to which state agencies could help themselves to data. The Trump administration, however, has made it clear that US rules are there to protect US residents and not European citizens.
Sylvia Kingsmill is a Partner at KPMG,
heading up digital privacy and compliance
nationally for the firm. Canada, and in
particular, Ontario, is the home of privacy
by design and default, a fundamental part
of GDPR, and Ms Kingsmill worked with
the Ontario privacy commissioner in the
early days of privacy by design.
She says: “The problem with the US is that they do not have a national omnibus data protection law that regulates data privacy, but they do have a very heavy-handed regulator, the Federal Trade Commission, which exercises its powers under section five of the Federal Trade Commission Act against any practices that are misleading and deceptive. Privacy falls under that umbrella. That’s where the FTC flexes its muscles, against the tech giants like Facebook and Google, but they do need a national standard, so there is a uniform approach.”
However, Ms Kingsmill says that sectors like healthcare and banking have been regulated for some time, and “I know that the US gets a lot of flak because the tech giants aren’t regulated enough, but certain sectors take privacy very seriously.”
Another important element in US data
privacy relates to class action law suits,
which are far more prevalent in the US.
“Facebook could be the game changer,
though,” suggests Ms Kingsmill.
And that takes us to how the US, with its own particular priorities, and the EU, via GDPR, view data privacy. The differences are clear. Maybe in the EU, the experience of the World War II period strikes a stronger resonance. It may be no coincidence that within the EU, the most vociferous defender of data privacy, and of how it relates to human rights, is Germany.
But if attitudes in the US, as the ForgeRock-sponsored EIU survey suggests, are changing, then the mood may be ripe for a GDPR-type framework to be applied there too. Maybe the eruption of media attention relating to Facebook and data privacy will be the catalyst.
Millennials are different, or so they say.
The generation born in the decade or two before the end
of the last century, or so we keep hearing, have a different
way of thinking.
Take their attitude towards personal data. A study by
the USC Annenberg Center for Digital Future and Bovitz
Inc found that 56 per cent of the millennial generation
would be willing to share their location with a nearby
company in return for a relevant coupon or promotional
deal. By contrast, only 42 per cent of users of 35 years and
older agreed they would share their location.
Jeffrey I. Cole, the director of the USC Annenberg Center
for the Digital Future said: “Millennials recognise that giving
up some of their privacy online can provide benefits to them.
This demonstrates a major shift in online behaviour.”
But is it really like that? The millennial generation and their juniors, Generation Z, are what they call digital natives - they are digitally savvy.
But how many kids do you know who share only minimal information on Facebook with their parents? Don’t they know precisely how to change privacy settings, determining who can see certain posts?
Abigail Dubiniecki, data privacy specialist lawyer,
recalls “being at a conference where people were talking
about 14-year-olds using Instagram, setting up multiple
user profiles, deleting stuff when it no longer supports
the narrative they put forward to friends. Whether it
is cheetah photos or skateboarding, they are actively
developing their brands on social media.”
She says: “I think this is far more natural to them than
to us; I did not grow up being seen by the whole world. If
something embarrassing at high school happened, maybe
most of the people at the school would know about it, but
not someone a couple of blocks away, or half an hour
away. But they have found very creative ways to control
their own narrative.”
It’s what Dr Ann Cavoukian calls informational
self-determination. It’s about ‘when I want to share, I
do, when I don’t want to, I don’t. If I share it’s for this
purpose not that’. Abigail says: “That’s what privacy is.”
Valerie Steeves, Professor of Criminology at the University of Ottawa, conducted a study into this very area. It seems that privacy considerations come almost as second nature to this generation.
In the study, she found that participants engaged in a
number of different strategies to manage their privacy.
For example, a small number of photos were kept
entirely private, while priority was given to efforts aimed
at controlling who sees particular photos and preventing
them from being spread to unintended audiences. Some topics were seen as private and not appropriate to share.
Snapchat and Instagram were used to ensure audiences
saw particular photos. Snapchat was seen as a platform for sharing with friends; Instagram was seen as a platform for building a persona and thus required more careful curation.
Some created multiple accounts to limit which
audiences saw which content, for example, ‘spamming
friends using one Instagram with less formal images,’
but these accounts only have a very small number of
followers, such as close friends.
Snapchat has a feature that tells users if a screen shot
is taken of one of their photos, and this was cited by
some as an important feature.
More than half expected friends to ask permission
before posting an image with them in it. There was a
lot of emphasis on retrospective consent, asking peers to
delete photos they didn’t want shared.
Data privacy seems to be something
that is implicitly understood by
this generation.
GDPR and how the millennial generation manages privacy
By Michael Baxter
GDPR:
A QUICK SUMMARY
BY ARDI KOLAH,
Executive Fellow and Director of the GDPR Transition Programme at Henley Business School
Introduction
GDPR requires a re-boot in our thinking about data protection,
privacy and security for the digital age. It represents the biggest
shake up in European data protection and privacy laws for over
two decades.
The genesis of GDPR is the 2012 proposal by the European
Commission for a modern legal framework for the world’s largest
digital single market of 500 million consumers. This was more
about lowering the barriers to market entry for new entrants,
increasing competition and choice of products and services for
consumers and stimulating economic activity and employment
across the European Economic Area (EEA).
Since that time, the pendulum has swung in the other direction
and there’s now an obsessive focus on data protection, privacy
and security in the wake of an exponential increase in cyber-
crime and the misuse of personal data on a global scale.
Although this is extremely important it has clouded judgment
on seeing GDPR as a significant opportunity, not a regulatory
threat for companies and organisations.
The open competition aspects of GDPR remain in place,
and the European Commission is actively leading the effort to
encourage companies and organisations to create a deeper level
of digital trust in order to do more, not less, with personal data.
Fully enforceable across all 28 EU Member States from 25 May
2018, GDPR aims to deliver a high degree of consistency, certainty
and harmonisation in the application of data protection, privacy
and security laws across the EU, replacing the Data Protection
Directive 95/46/EC and other Member State legislation like the
Data Protection Act 1998 in the UK, in its wake.
Within this new landscape, there’s limited ‘wriggle room’
for Member States to pass laws that impact the processing of
personal data seen only through the lens of national self-interest.
Many commentators have pointed to this as evidence of a lack
of harmonisation of data protection, privacy and security laws
applying across the EU, given the differences in the way some
aspects of GDPR will work on a country-by-country basis.
However, the reality is that such differences are largely
confined to a relatively small number of operational areas for
companies and organisations within the EU.
Many countries, including the UK, have passed their own data
protection laws in alignment with GDPR and globally, this trend
is gaining momentum.
In the case of the UK, certain standards under GDPR are being
raised under Member State laws.
From a commercial perspective, companies and organisations that conduct cross-border personal data processing will be primarily regulated by the local supervisory authority in the jurisdiction in which they have their main establishment.
Data protection principles
GDPR retains the core principles of the Data Protection Directive
95/46/EC, but has beefed them up. The core rules may look familiar
to experienced privacy practitioners and senior managers, but
this is a trap for the unwary as there are many important new
obligations as well as a tougher regime of sanctions and fines
for getting this wrong. There are seven data protection principles and the data controller and data processor must ensure that they comply with all of them:
1. Lawfulness, fairness and transparency
Personal data must be processed lawfully, fairly and in a
transparent manner.
2. Purpose limitation
Personal data must be collected for specified, explicit and
legitimate purposes and not further processed in a manner that’s
incompatible with those purposes (with exceptions for public
interest, scientific, historical or statistical purposes).
GDPR is complex, right? Even if you feel you know the basics, the Regulation can be confusing. Here, GDPR is summarised by Ardi Kolah, Director of the GDPR Transition Programme at Henley Business School, and Editor-in-Chief at the Journal of Data Protection and Privacy. This summary comes from Ardi’s book, The GDPR Handbook, which is available on Amazon from 3 June 2018.
3. Data minimisation
Personal data must be adequate, relevant and limited to what’s
necessary in relation to purposes for which they are processed.
4. Accuracy
Personal data must be accurate and where necessary, kept up-
to-date. Inaccurate personal data should be corrected or deleted.
5. Retention
Personal data should be kept in an identifiable format for no
longer than is necessary (with exceptions for public interest,
scientific, historical or statistical purposes).
6. Integrity and confidentiality
Personal data should be kept secure.
7. Accountability
The data controller should be able to demonstrate, and in some cases verify, compliance with GDPR.
What to do now?
Double check that all policies, processes and procedures are in place and that they deliver the seven data protection principles. Ensure that the board supports company and organisation-wide awareness and training programmes, which should be short and informative (not boring!), and that all of this is recorded and logged.
Security of processing
GDPR requires the data controller and the data processor to
keep personal data secure. This obligation is expressed in general
terms but does indicate that some enhanced measures, such as
encryption and pseudonymisation, may be required.
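As an illustration of what pseudonymisation can look like in practice, here is a minimal sketch, assuming a keyed hash (HMAC) is used to replace direct identifiers with stable tokens. The key value and record fields are hypothetical; a real deployment would also need proper key management, since whoever holds the key can link pseudonyms back to individuals.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice it would come from a secrets manager
# and be stored separately from the pseudonymised data set.
SECRET_KEY = b"replace-with-a-key-from-a-secrets-manager"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a stable
    pseudonym. The same input always maps to the same token, so records
    can still be linked for analysis, but the original value cannot be
    recovered without the key."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "purchases": 3}
# The analytical value (purchase count) is preserved; the identifier is not.
safe_record = {**record, "email": pseudonymise(record["email"])}
```

Note that under GDPR pseudonymised data is still personal data; this technique reduces risk, it does not remove the data from scope.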
The data controller must report personal data breaches to the supervisory authority within 72 hours of discovering they have happened, unless the breach is unlikely to result in a risk to the rights and freedoms of individuals.
What to do now?
Undertake a full review of technical security measures that
are appropriate for the type of personal data processing being
carried out at the data controller and data processor. Seek expert
guidance and support.
Data Protection Officer (DPO)
A data controller, joint data controller and a data processor may
be required to appoint a data protection officer (DPO). This
depends on what processing of personal data is being carried out.
Certain private and most public-sector organisations will be
required to appoint a DPO to oversee their data processing operations.
A DPO will be required where: the processing is carried out by a public authority or body; the core activities of the data controller or data processor consist of processing which requires regular and systematic monitoring of data subjects on a large scale; the core activities consist of processing special categories of personal data on a large scale; or where required by Member State law.
The DPO must be involved in all data protection and security issues and can’t be dismissed or penalised for performing their duties and responsibilities. The DPO must report directly to the highest level of management within the company or organisation, though they don’t have to physically report to the CEO. The reports they write must be considered by the board.
What to do now?
Ensure that a suitable senior manager within the company and organisation has been identified and can be trained independently to fulfil the duties and responsibilities of the DPO. They need to be adequately resourced; otherwise this in itself is a breach of GDPR. Other alternatives include using a consultant as a DPO or an outsourced DPO service.
Data Controller and pan-European data breach notification obligations
The data controller is responsible for deciding the purposes and means of the processing of personal data and can be a private company and organisation, a charitable or voluntary body, operate in the public sector or be a government department.
The responsibility and liability of the data controller for any processing of personal data, including that done on its behalf by the data processor, needs to be established.
The data controller is obliged to implement appropriate and
effective organisational and technical measures to mitigate very
high risks of processing personal data that could cause harm or
damage to data subjects.
It’s important that the data controller is able to demonstrate
compliance of personal data processing activities with GDPR. Those
organisational and technical measures should take account of the
nature, scope, context and purposes of the personal data processing
and the risks to the rights and freedoms of natural persons.
GDPR imposes stricter obligations with respect to data security
and a specific data breach notification process.
In the event of a personal data breach, the data controller must notify the supervisory authority within 72 hours of becoming aware of the breach and, where the breach is likely to result in a high risk to the rights and freedoms of data subjects, also notify the affected data subjects. Notice needs to be given “without undue delay” and must contain prescribed information.
What to do now?
Prepare for personal data breaches and practise the way in which the company and organisation will respond to the real thing. Double check that policies, processes and procedures are fit for purpose, and establish how the company and organisation will react and whom to notify, including the DPO. Implement staff training on what to do in such instances, as well as on how to reduce the risk of incidents and personal data breaches occurring in the first place.
Extra-territorial reach
GDPR primarily applies to companies and organisations established
in the EU. It will also apply to businesses based outside the EU that
offer goods and services to, or monitor the behaviour of individuals in
the EU.
What to do now?
These businesses based outside of the EU will need to appoint a representative in the EU, subject to certain limited exemptions. The representative may have to accept liability for breaches of GDPR and can act only under specific instructions from the data controller. The representative effectively has all of the downsides but no corresponding upsides, so it’s a difficult relationship from the start.
Cross-border data transfer rules
GDPR prohibits the transfer of personal data outside the EU unless certain conditions are met. Those conditions are broadly the same as those under the Data Protection Directive 95/46/EC, but GDPR adds new ones, such as certification mechanisms and codes of conduct, as well as a new, very limited derogation for occasional transfers based on legitimate interest. Country-specific authorisation processes will no longer be needed (with some exceptions) and Binding Corporate Rules (BCRs) are formally recognised in GDPR.
What to do now?
Double check that there’s an up-to-date inventory of cross-border
data flows; check the strategy in the light of any decisions from the
European Commission, the Court of Justice of the European Union and
EU-US Privacy Shield (if appropriate).
Data mapping
The data controller and data processor must maintain an up-to-date
record of processing activities. Under GDPR, detailed information
must be kept and could be inspected.
What to do now?
The DPO must establish and document what personal data the data controller actually holds, any intentions to transfer this personal data, and how such data flows internally through the company and organisation.
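The kind of record of processing activities GDPR expects can be sketched as a simple data structure. The field names below are illustrative, not an official schema; a real inventory would track more detail, such as the legal basis for each activity and the sources of the data.

```python
from dataclasses import dataclass, field

# A hypothetical, minimal record of a single processing activity,
# loosely modelled on what a data-mapping exercise captures.
@dataclass
class ProcessingRecord:
    activity: str                # e.g. "payroll"
    purpose: str                 # why the personal data is processed
    categories_of_data: list     # what personal data is involved
    recipients: list             # who the data is disclosed to
    transfers_outside_eea: bool  # flags cross-border transfer questions
    retention_period: str        # how long the data is kept
    security_measures: list = field(default_factory=list)

# The inventory is simply a list of such records, one per activity.
inventory = [
    ProcessingRecord(
        activity="payroll",
        purpose="paying staff",
        categories_of_data=["name", "bank details", "salary"],
        recipients=["payroll bureau"],
        transfers_outside_eea=False,
        retention_period="6 years after employment ends",
        security_measures=["encryption at rest", "role-based access"],
    ),
]
```

Keeping the inventory in a structured form like this makes it straightforward to answer the questions a supervisory authority, or a data subject, might ask.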
Data Processor obligations
GDPR expands the list of provisions that data controllers must
include in their contracts with data processors.
Some aspects of GDPR are directly applicable to processors.
This will be a major change for some suppliers who have avoided
direct regulation under the Data Protection Directive 95/46/EC
by setting themselves up as data processors.
Processors will be jointly and severally liable with the relevant
controller for compensation claims by individuals.
What to do now?
Ensure that new obligations under GDPR are understood and observed. GDPR imposes compliance obligations directly on the data processor, such as implementing security measures, notifying the data controller of personal data breaches, appointing a DPO (if applicable), maintaining records of processing activities, etc. The data processor will be directly liable in case of non-compliance and may be subject to direct enforcement action.
The data controller and data processor will be required to enter into detailed data processing agreements or renegotiate existing ones.
Data Subject’s rights
GDPR largely preserves the existing rights of individuals to
access their own personal data, rectify inaccurate data and
challenge automated decisions about them. The Regulation also
retains the right to object to direct marketing.
There are also potentially significant new rights for individuals, including the “right to be forgotten” and the right to data portability. The new rights are complex and it isn’t clear how they will operate in practice. In addition, the data controller will also be required to provide significantly more information to data subjects about their personal data processing activities.
What to do now?
The data controller must implement appropriate processes and
infrastructure to be able to address data subjects’ rights and
requests. It must also ensure that an adequate Data Privacy
Notice is provided prior to commencement or within one month
where the personal information comes from a third party.
Quality of consent
Obtaining consent from an individual is just one way to justify
processing their personal data. There are other justifications.
It will be much harder for the data controller to obtain a
valid consent under GDPR. Individuals can also withdraw their
consent at any time.
As under the Data Protection Directive 95/46/EC, consent to
process special (sensitive) personal data must be explicitly given.
Consent to transfer personal data outside the Union must now
also be explicit.
What to do now?
Consent is retained as a processing condition but GDPR is more
prescriptive than the Directive 95/46/EC when it comes to the
conditions for obtaining valid consent, so this needs to be very
carefully understood.
The key change is that consent will require a statement or clear
affirmative action of the data subject. Silence, pre-ticked boxes
and inactivity will not be sufficient. GDPR clarifies cases where
consent won’t be freely given (e.g. no genuine choice to refuse,
clear imbalance between the data subject and the data controller
such as in the workplace). Data subjects must be informed of their
right to withdraw consent.
One-Stop Shop
GDPR applies to the processing of personal data by data
controllers and data processors established in the EU, as well as
by data controllers and data processors outside the EU where
their processing activities relate to the offering of goods or
services (even for free) to data subjects within the EU, or to the
monitoring of their behaviour.
The supervisory authority in the jurisdiction of the main or single
establishment of the data controller/data processor will be the lead
authority for cross-border processing (subject to derogations).
What to do now?
Assess whether – as a non-EU data controller or data processor – data processing activities relating to data subjects are within the scope of GDPR. This means determining where the ‘main establishment’ is located, based on data processing activities.
Profiling and Profiling-based decision making
GDPR includes a wide range of existing and new rights for data
subjects. Amongst these are the right to data portability (right to
obtain a copy of one’s personal data from the controller and have
them transferred to another controller), right to erasure (or ‘right
to be forgotten’), right to restriction of processing, right to object
to certain processing activities (profiling) and to automated
processing decisions. The data controller will also be required
to provide significantly more information to data subjects about
their processing activities.
What to do now?
Data subjects have the right not to be subject to a decision based
solely on automated processing, including profiling, which
produces legal effects concerning them or similarly significantly
affects them. Individuals will also have an express right to ‘opt out’ of profiling and automated processing in a wide range of situations. If the data controller is engaging in profiling activities, then it must consider the best way to implement suitable safeguards.
Data Protection by Design and by Default
These concepts are codified in GDPR and require the data
controller to ensure that individuals’ privacy is considered from
the outset of each new processing, product, service or application,
and that, by default, only minimum amounts of personal data as
necessary for specific purposes are collected and processed.
What to do now?
Implement measures, such as pseudonymisation or data
minimisation designed to implement data protection principles
from the outset of any project.
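In code terms, data minimisation often amounts to an allow-list applied at the point of collection, so excess data is never stored in the first place. A minimal sketch, assuming a hypothetical newsletter sign-up form where only two of the submitted fields are actually needed:

```python
# The allow-list of fields required for this specific purpose.
# The field names and the newsletter scenario are hypothetical.
ALLOWED_FIELDS = {"email", "country"}  # all a newsletter sign-up needs

def minimise(submitted: dict) -> dict:
    """Keep only the fields required for the stated purpose; everything
    else is dropped before the record is ever persisted."""
    return {k: v for k, v in submitted.items() if k in ALLOWED_FIELDS}

form_data = {
    "email": "jane@example.com",
    "country": "IE",
    "date_of_birth": "1990-01-01",  # not needed for a newsletter
    "phone": "555-0100",            # not needed either
}
stored = minimise(form_data)  # date_of_birth and phone are never stored
```

Applying the filter at collection, rather than cleaning up later, is exactly the "from the outset" discipline the principle describes.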
Data Protection Impact Assessment ‘Lite’ and DPIA
The data controller will be required to perform a Data Protection Impact Assessment (DPIA) where the processing of personal data (particularly when using new technologies) is likely to result in a high risk to the rights and freedoms of individuals.
At Henley Business School we conceived of DPIA ‘Lite’, which is a helicopter view of what is being done that’s compliant, what’s currently being done that isn’t compliant and what the data controller needs to consider starting to do in order to be compliant with GDPR.
What to do now?
DPIAs will particularly be required in cases of: an evaluation of personal aspects based on automated processing, including profiling; processing on a large scale of special categories of personal data; and systematic monitoring of a publicly accessible area. Make DPIAs part of standard procedures for all personal data processing operations, so that they are easier to implement as an everyday task. The DPO should ensure that staff have been trained to carry out a DPIA, and these need to be documented very carefully.
Sanctions and fines
There is a step change in sanctions and fines: supervisory authorities can impose fines of up to 4% of annual worldwide turnover or €20 million, whichever is greater. They also have other powers at their disposal, including audit rights, issuing warnings and imposing a temporary or permanent ban on the processing of personal data (known as a ‘STOP order’).
What to do now?
Understand and keep under review the type of personal data
being processed, whether it is very high or high risk and what
mitigation measures are necessary to reduce this to a residual risk
that doesn’t cause harm or damage to data subjects and record
the organisational and technical measures deployed within the
company and organisation.
The EU General Data Protection Regulation (GDPR)
comes into force in May 2018, and will impact every
organisation that trades with any EU country.
The new regulations will disrupt how organisations store, manage and process data. GDPR is the biggest shake-up in data protection for a generation.
Our half-day and full-day courses provide a practical and efficient staff training programme, presenting staff with the knowledge, understanding and practical tools to implement and maintain GDPR-compliant processes.
Contact Leon for more details:
leon@dataprotection.media
0203 515 3015
Half Day : £2,000
Full Day : £3,500
Privacy by design and default
Where it began:
Dr Ann Cavoukian is passionate about
data privacy. Appointed Information and
Privacy Commissioner of Ontario, Canada
in 1997, today she is the Distinguished
Expert-in-Residence, leading the Privacy
by Design Centre of Excellence at Ryerson
University, Canada.
And when it comes to privacy by design
and default - she is the boss, the governor
- the person who brought the concept to
the world.
Privacy by design and default is a key,
maybe the single most vital, part of GDPR
- it means privacy is part of the foundation
of a new product or service, an integral
part - not a bolt-on extra.
And when Dr Cavoukian talks on the subject, her passion is palpable - and contagious.
Back in 2016, she expanded on the idea
at the SAI Computing Conference, London.
You can see a video of her talk on the
Ryerson website. It’s worth watching.
But when you hear her story, you begin
to get it. Privacy, and by extension, because
it is so essential to it, privacy by design and
default, is about freedom - a human right.
The good doctor herself has a personal tale which may point to why this concept is so important to her. She is from an Armenian family and was born in Egypt - her parents fled the country in the 1950s in the dead of night. They had lived charmed lives, but still they left. Why? She asked her parents that very question. “We moved for you, for your freedom,” they said.
But then maybe a similar story can be
applied to the German people as a whole.
Today they have a phrase to describe it.
Germans call it informationelle Selbstbestimmung - informational self-determination. It was recognised by Germany’s Constitutional Court in 1983 and Dr Cavoukian says “Germany is now the leading privacy and data protection country in the world.”
But there is a myth about privacy - it
does not mean secrecy.
Not zero sum
Some people fear that, in a healthcare
setting, data privacy could hold back
progress, cost safety and lives. Data is
helping the fight against disease - it was
once estimated that if the data gathered
by the British NHS was captured and
monetised it would create an organisation
worth over a trillion dollars.
But privacy does not have to come at the expense of the benefits of data. It’s not a zero-sum game - privacy does not mean less of another benefit; rather, privacy can enhance, creating a positive sum.
Dr Cavoukian says: “Privacy can work with safety, with data analytics, marketing - it’s not an either/or, you can have both. Privacy does not equal secrecy; it’s about personal control, which belongs to you. Privacy does not equal secrecy.”
By Michael Baxter
Adil Akkus has 24 years’ experience in IT and Business change delivery. He holds a B.Sc.
degree in Computer Science as well as PRINCE2, MSP and MoP Practitioner certifications.
He has successfully implemented a wide variety of multi-site and multi-million-pound
programmes for industry leaders such as Mercer, Sky, IBM, Virgin Media, BT, AT&T Wireless
and T-Mobile. With his CIPP/E privacy and data protection certification, he has set up
and implemented 3 multi-national GDPR programmes. Here he introduces privacy by design.
Equifax
In September 2017, Equifax hit the news
headlines with reports suggesting that
over 140 million people’s personal data
was breached.
It transpired that Equifax had failed to install, in a timely manner, a security patch for a tool used to build web applications.
Some proactive action, and securing data at the design stage, would have saved Equifax a lot of money and bad press.
Facebook, WhatsApp data sharing for
ad targeting
After Facebook purchased the WhatsApp
messaging platform, it set out to
change WhatsApp’s terms and conditions
to allow data sharing with Facebook
for ad-targeting purposes.
The news provoked a public uproar.
Even though most of us use both apps
separately, a significant percentage of the
users changed their settings to prevent
this. In March 2018, the ICO convinced
WhatsApp to sign a public commitment
not to share personal data with Facebook
until data protection concerns are
addressed.
Setting the users’ privacy to the strictest
mode (i.e. privacy by default) would surely
have prevented all this hassle.
Conclusion
With the undeniable influence of GDPR,
the public and regulators expect “privacy
by design” to prevent such high-profile
cases and to increase public confidence
in the organisations that hold their data.
Until recently, data protection teams had to
fight hard to convince product, service
and technology teams to pay attention to
privacy concerns. GDPR’s emphasis on
“privacy by design” will transform our
ways of working and thinking.
1. Proactive not Reactive; Preventative
not Remedial
This principle encourages proactive
rather than reactive measures. It requires
organisations to anticipate risks and
prevent privacy incidents before they occur.
2. Privacy as the Default Setting
(referred to as Privacy by
Default)
This principle promotes maximum
privacy protection as a baseline. It requires
any new service or product to use the strictest
privacy settings by default, without any
manual input from the end user. Some
privacy experts argue that GDPR’s push
for “explicit” consent can be traced back
to this principle.
3. Privacy Embedded into Design
This principle demands that privacy
measures not be bolted on as add-ons
after the fact. It drives privacy to become
an essential component of the core service,
product or functionality being
delivered. As a result, privacy becomes
an integral part of the product or service,
instead of a functionality trade-off.
4. Full Functionality – Positive-Sum,
Not Zero-Sum
This principle encourages creating a
privacy-by-design culture where privacy
policies drive rather than hinder growth
and revenue.
5. End-to-End Security with Full
Lifecycle Protection
Having embedded privacy into systems
under the previous principles, this
one requires data to be protected cradle
to grave - wherever it goes, from the moment it
is created, through being shared, to being finally archived.
6. Visibility and Transparency –
Keep it Open
This principle is all about building
consumer trust. Information about your
privacy practices should be out in the
open and written in plain language.
7. Respect for User Privacy – Keep
it User-Centric
This principle puts consumers in charge of
their data. It requires that users be given the
ability to update and correct any data held
about them, and that users be the
only ones who can grant and revoke
access to their data.
Over recent years, we have
witnessed some of the highest-profile data
breaches and incidents, from Yahoo to
Uber, from Equifax to TalkTalk.
Some of these breaches remained
undetected, unreported or unknown for
various reasons.
A good proportion of these incidents could
have been prevented or handled in a much
less costly and much less distressing way by
using the privacy by design principles.
Background
Even though privacy by design has
become very popular over the last few
years, thanks to GDPR, it has been
around since the 1990s.
Privacy by design holds that
the future of privacy cannot be assured
solely by compliance with legislation
and regulatory frameworks; privacy
assurance must become an
organisation’s default mode of operation.
Dr. Ann Cavoukian, who is hailed as the
‘Godmother of Privacy by Design,’ set out
the foundations across seven principles:
The 7 Foundational Principles
of Privacy by Design
By Adil Akkus
From Grim Reaper to hero
How the data protection officer has become an enabler - and is
no longer seen as a block to new ideas.
Once, the data protection officer was the last port of
call. So you had an idea for a product, you researched
it, tested it, tweaked it, honed it some more, the PR
people were ready, the advertising agency was briefed,
the design was just so - and then it was compliance -
the only barrier.
To some it felt like the kiss of death to their brilliance.
Deborah Dillon, a privacy protection officer of
considerable experience, says that is how it
used to feel to her.
Today, she is the Data Privacy Lead at Atos. Before
that, she worked for a ‘large bank.’
“Data privacy,” she recalls, “was like an
afterthought. There had been no consideration of data
development at all, until they came to me needing the
information governance. And I often had to put the
dampeners on projects.”
It is not like that now, of course, and privacy by
design is one of the main reasons, at least for Deborah.
There’s a growing realisation that privacy by design
is a key part of a product or service. And the data
protection officer, by being the person who is integral
to that, has been transformed.
“I used to feel like I was seen as a grim reaper, now
I am seen as an enabler,” says Deborah. Maybe she is
being too modest.
In the era of digital transformation and big data,
the data protection officer is a hero, the one whose
insights can help create trust between data subject,
and controller or processor.
Abigail Dubiniecki, Associate at Henley Business
School’s GDPR Transition Programme and Specialist
in data privacy at My Inhouse Lawyer, says that
GDPR, and privacy by design in particular, also
present an opportunity to do some housekeeping.
It’s a kind of nudge. It is in our own interests to
wear seat belts when driving, but until it became
compulsory - the law - many of us ignored it. How
many lives were saved when the law was changed?
The numbers are probably countless. In fact, the
Royal Society for the Prevention of Accidents has
estimated that 50,000 lives have been saved in the
UK alone since the law making it compulsory to wear
seat belts in the front of a car was introduced in the
country some 30 years ago.
Indeed, behavioural economists have a word for it:
‘nudge’, advanced by Cass Sunstein and the Nobel
Prize-winning Richard Thaler - the idea that we can
change our behaviour and act in our own best interests
through quite subtle influences.
Maybe GDPR and privacy by design are like that.
Companies are run by people - and some tasks are
seen as more fun. Efficiently managing data, ensuring
it’s not kept unnecessarily eating up valuable storage
space, may not feel sexy, but it’s still important.
Or for that matter, since data is valuable, knowing
what you have is vital.
As Ms Dubiniecki continued: “People are not
leveraging their data because they don’t know what
they have. So they are not unlocking the value from
it, they are just hoarding it, and putting it into lakes
because they are hoping that some day they will find
something in that closet or lake - so GDPR is forcing
you to know exactly what you have, which gives you
an opportunity to really harvest it.”
But privacy by design and default is about creating
trust. Ms Dubiniecki explained: “You can do more
with the data because people already know, they are
not going to be shocked or annoyed, they are going
to share more, they are going to share good quality
accurate data because they will see that there is
a benefit to it. And they trust you.” And that is the
essence of privacy by design and default.
When she was working at the ‘large bank’,
Deborah Dillon says “I brought in privacy by design,
in anticipation of GDPR, working with system
developers, products teams, and building it right at
the start of a product. And it worked so well, and data
was captured so well, that systems developers were
talking to each other about the datasets they needed
to collect to apply privacy principles.”
She says it was so refreshing to hear, “really turning
data privacy on its head. I am bringing it in with clients
now, it’s great for compliance and transparency.”
If you can get privacy right, rather than becoming
an after-thought, a necessary hurdle to overcome, it
can enhance a product, make it more appealing to the
customer, a reason to buy your product or service.
She recalls an occasion, at the bank, how she
managed to build data protection into a Christmas
competition in a way that made it more interesting.
It involved a Christmas tree in every branch - the idea
was for customers to nominate a neighbour for an award,
as someone who had done a lot for the local community,
and put their nomination on the tree. But the privacy
implications were enormous - and potentially damaging
to the bank’s reputation with customers.
Deborah explained: “Privacy by design is not just
the big systems. In this particular case, by using
personal privacy as an advantage, we kept the
identity of the person nominated anonymous - instead
they were described without revealing who they were -
and the person who nominated them put their own
email address into the competition. In the event
their nominee won, the neighbour was to be told and
in turn inform the winner.”
Digital transformation: the missing link
Digital transformation has become the big buzzword
of the moment - a lot of companies are looking to
transform their businesses - they may have a good
budget, but, as Deborah says: “They have to get their
house in order first, for me GDPR comes before
digital transformation.”
She says that if they know data is accurate, and
relevant, in the right place at the right time, then
when they are looking at digital transformation, the
process can become a whole lot more effective.
She gives as an example, uploading data onto the
cloud: if they have classification of documents, then if
there is highly sensitive information and data, it stays
in the private cloud, but internal information might
go into the public cloud.
“So for me, GDPR is the missing link to get to
companies before they do digital transformation.”
Atos is the European leader in the payment
industry. It focuses on big data services, cybersecurity,
digital workplace, cloud services, infrastructure
& data management, and business & platform solutions,
as well as transactional services through Worldline.
Given what it does, it may come as no surprise to
learn that Deborah says: “At Atos, privacy by design
is a must for our customers.”
The benefit
The headlines focus on the fines associated with
GDPR, the potentially heavy cost of not complying.
But for many large companies a fine, even one that
could be four per cent of turnover, is just another
problem - and not necessarily even one of the more
serious challenges.
Maybe that will change when the first fines are
handed out, but at the moment there is a perception
that regulators across the world are thinly stretched,
have multiple responsibilities, and that in practice,
fines will be rare - a risk to a company for sure, but
maybe not a great one.
The latest revelations concerning Facebook add
another dimension: the damage to the company’s
reputation, and the falls in the share price that
knocked over $70 billion off the company’s market
capitalisation, show how data privacy and respecting
customers’ data is not just about trying to avoid fines,
but about protecting reputation too - something that can
be worth far more than a few per cent of turnover.
So, while the threat of a fine may be seen as a stick
to ensure data protection is taken more seriously,
the reputational damage that can arise from a data
breach, or from allegations that governance is not what
it should be, is potentially a much more fearsome stick.
But privacy by design and default can provide a
carrot too: it can help a company manage costs by
making more efficient use of data, and above all it can
be a superb communication message to the customer.
Privacy policies say: “we take your privacy seriously”
but do people believe it? Do they just assume the
policy is there to provide legal protection? By contrast,
if privacy can become a core part of a product, if it
screams out at the customer ‘you can trust us, we are
on your side, protecting your privacy is fundamental
to who we are, a human right’ then all of a sudden
your image is transformed - the customer will be more
willing to agree to their data being processed, and your
reputation is given the kind of boost that can make the
difference between success and failure, that can give
you a decisive edge over rivals.
That is why privacy by design has transformed
the data protection and privacy officer - from grim
reaper, who was seen as a hurdle that could prove
impossible to pass, to hero - someone whose insights
can transform the reputation of a company, or public
service, creating a bold, empowering and successful
relationship with the customer.
By Michael Baxter
DATA PROTECTION WORLD FORUM
20th & 21st November 2018, ExCeL London
SPEAKERS INCLUDE: Jamie Bartlett, Pernille Tranberg, Rohit Talwar, Ardi Kolah
2018 WILL BE A LANDMARK YEAR FOR DATA PROTECTION AND PRIVACY
WWW.DATAPROTECTIONWORLDFORUM.COM
LEON@DATAPROTECTIONWORLDFORUM.COM | 0203 515 3015
The Canadian experience...
By Michael Baxter
“We exported privacy by design to the
rest of the world” says Sylvia Kingsmill,
a Partner at KPMG Canada, and the
Canadian Digital Privacy & Compliance
Leader, “now we are importing it back.”
The Canadian experience is
illuminating: unlike in the EU, where
under GDPR privacy by design and
default is a requirement, in Canada it has
been voluntary. But companies and other
organisations still apply it.
The Canadian view - and Dr Ann
Cavoukian’s view - is that it is just common
sense. “We in Canada,” says Ms Kingsmill,
“took Ann’s seven foundational principles,
which are the lynchpin. It is one thing to
have it as a philosophy, another to actually
do it and operationally track it.”
Ms Kingsmill worked with Dr Ann
Cavoukian and her team at Ryerson
University to develop a privacy-by-design
methodology and framework for assessing
new technology and services against the
principles of privacy by design.
What does this mean? “What do
controls look like from the perspective of a
company or entity? How do we bake in the
right controls, so they can demonstrate,
whether it is in the EU or Canada,
that they are taking the right steps to
directionally protect our consumer rights
and freedoms?”
In Canada, you “can get certified to say
look we are doing the right things to your
data, we are listening, we are following
best practice, we are following guidelines,
we are listening to regulators, we don’t
want to just appease them.”
There is another way of putting it that
really does sum it all up: “We satisfied a
market need, and that need is to re-assure
the public.”
She cites as an example CANTATICS,
a Canadian organisation that produces an
anti-fraud solution for the insurance industry.
Privacy by design is a core part of the product.
The data generated by insurance claims
across companies can be used to spot
fraud - serial claimants who, in order
to avoid detection, spread their fraudulent
claims across multiple companies. The
result of such fraud is higher insurance
premiums, so it is very much in the
interests of customers to come up with
a fix.
So CANTATICS applied privacy by
design. It showed how processing
customers’ data worked to their benefit,
and promoted how the data would only be
used for the specific purpose stated and
kept no longer than necessary. In short,
privacy by design became a feature, a key
part of what CANTATICS is about.
There was no need for an enforcement
body - it was there because it was
important, a key part of creating trust,
building and maintaining a reputation.
But rules are being introduced. “We had a
Parliamentary report recommending 19 areas
to reform to maintain our adequacy standards,
and the main topic is privacy by design.”
But then Sylvia Kingsmill reckons this is how
things are going worldwide. “International
regulators are collaborating; there is a global
privacy enforcement network,” she says.
GDPR then, or at least privacy by design
and default, may have had its roots in Canada,
but now it is being applied worldwide.
GDPR is about “transparency,
accountability and control.”
“It is actually pretty brilliant the way
they have put together GDPR, as awful
as it looks, I just wish they had used clear
concise intelligible language like they say
we should. The architecture of that law is
really quite impressive, and I say this as
someone who has read a lot of laws.”
Abigail Dubiniecki is an Associate
at Henley Business School’s GDPR
Transition Programme and a Specialist in
data privacy at My Inhouse Lawyer. To put
it mildly, she knows her GDPR. And, like
so many experts on GDPR and advocates
of privacy by design and default, she is
Canadian. When she talks about GDPR
her passion shines through, she tells it
like privacy is one of the most important
issues of the day - of the digital world, but
then given how data and personal data
is shaping the economy, business and
impacting upon us all, it probably is.
“Before I came here, I was already doing
privacy by design without calling it that,”
she says, referring back to when she was
the privacy lead on an in-house counsel
team for a Canadian crown corporation.
Cardinal rule
“Under GDPR, one of the cardinal rules is
‘know the data’,” she explains:
•	 what data you have
•	 why you have it
•	 what you are doing with it
•	 do you have a lawful basis for having it
•	 how long you need to keep it
•	 who you are sharing it with
•	 and for what purpose
But then look at the detail: the devil may not
emerge, but you must be devilishly careful:
•	 Are there secondary purposes for
which you are using it?
•	 if so, are they sufficiently linked?
•	 is this something that requires you to
go back to the drawing board?
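The checklist above is, in effect, a data-inventory record. As an illustrative sketch only (the field names and the `needs_review` logic are assumptions, not anything GDPR prescribes), it might be captured like this:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingRecord:
    """One data-inventory entry, mirroring the 'know the data' questions."""
    data_category: str    # what data you have
    purpose: str          # why you have it / what you are doing with it
    lawful_basis: str     # e.g. consent, contract, legitimate interests
    retention: str        # how long you need to keep it
    shared_with: list = field(default_factory=list)         # who you share it with, and why
    secondary_purposes: list = field(default_factory=list)  # other uses of the same data

    def needs_review(self) -> bool:
        # A secondary purpose that is not sufficiently linked to the
        # original one is the cue to go back to the drawing board.
        return any(not p.get("linked", False) for p in self.secondary_purposes)

record = ProcessingRecord(
    data_category="customer email addresses",
    purpose="order confirmations",
    lawful_basis="contract",
    retention="6 years",
    secondary_purposes=[{"purpose": "marketing", "linked": False}],
)
print(record.needs_review())  # True: the marketing use needs its own lawful basis
```

Even a simple structure like this makes the gaps visible: every entry must name a lawful basis and a retention period, and any unlinked secondary purpose stands out.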
“It’s not a legal obligation in Canada to
apply privacy by design, but those basic
principles are there,” says Dubiniecki.
She adds: “What GDPR does and what
privacy by design asks you to do is take
data life cycle management practices,
cyber security best practices and fair
information principles, and make them a
legal obligation.”
She turns to the public sector: “We
had this constant interaction with our
regulators, so when something comes up,
we are supposed to report that, not just to
the regulators and the privacy commissioner’s
office, but also to the Treasury Board
Secretariat. So accountability has been
built in; it’s just a reflex.
“And although no one would say it’s a legal
obligation to apply privacy by design, much
of that is already there, if not articulated.”
To adapt Shakespeare: that which we
call ‘privacy by design’ by any other phrase
would smell as sweet. Not to mention help
confer a human right.
In Canada there is no formal right to
be forgotten law, but there is the Right to
Rectify. “You can correct something that
is wrong, and if something is unlawfully
processed that’s a problem, but where we
differ is that we don’t have the same teeth
for enforcement.”
Privacy by design may have been
invented in Canada, but she says “we
only started to take it seriously once the
EU adopted it”, but that seems to be the
Canadian way. If you are a Canadian
rock star, or Canadian hip hop artist or
painter, no one is going to care about you
in Canada until you go somewhere else
and become really popular, then they call
you Canadian.
“It is being looked at in many other
countries, such as in Latin America and
in countries that are eager to trade with
the EU, such as Japan, where there is a
lot of interest. And now, finally, Canadians
are paying attention to it too, as they
seek to reap the benefits of greater
trade with Europe under the Canada-
EU Comprehensive Economic and Trade
Agreement (CETA).”
Examples of privacy by
design and default
Dubiniecki cites as an example SecureKey
and its product, SecureKey Concierge -
technology that enables you to ‘bring
your own credentials’ to prove your
identity. Abigail explained: “You can
access 80 government services using
banking credentials,” and it’s a service in
which privacy by design is an integral
part. “The idea is that you don’t have to
have loads and loads of different data, in
different treasure troves - my company
and your company, and government - so
that you can be attacked from any angle.
Now we have this system to verify when
you need to be verified. This has been a big
trend, putting control in the hands of the
individual, giving them greater security
because there are fewer places where your
identity is spread about, creating a richer,
more reliable way to confirm someone’s
identity, because you have much richer
dataset than you would have if you just
have passwords, and date of birth and
other things that someone could steal.”
GDPR is about transparency, accountability and control:
Interview with Abigail Dubiniecki, Associate at Henley Business
School’s GDPR Transition Programme and Specialist in
data privacy at My Inhouse Lawyer
End-to-end and the seven
foundation principles
With privacy by design, we talk about end-to-end
protection, rights, and putting the user
first - all those things are baked into GDPR.
So as a starting point, you have to
meet the seven personal data processing
principles, in every activity that you do. You
have to make sure that your default choice
is the most privacy preserving choice.
So if it is social media, the default is to
share information with the people I want
to share with, not with the public. You
can’t default to ‘share with the public’
and roll it back; you have to default to
something more defensible that makes
sense in the light of our exchange, and the
reason why I am coming to your platform.
You have to meet those principles. It
must be lawful. You have to choose which
lawful basis you are relying on. If you are
relying on consent, you have to be able to
let someone withdraw consent as easily as
they give it, and ensure it is freely given.
In an employment situation you cannot
rely on consent, because it can’t be freely
given. Therefore, you have to look at other
lawful bases.
Then you have to go through all the
principles which must be baked into
everything you are doing.
People have these seven rights under
GDPR and you have to make good on
them, which means, however you design
those systems, you have to be able to find
that needle in a haystack. So if, say, Rob
Lowe makes a subject access request, you
have to design your system so that you can
find everything that is said about Rob
Lowe in everything you have done or
shared with his data.
Be warned
“Lots of companies are doing back flips:
‘oh, this does not apply to me because of
Brexit’, ‘it does not apply to me because I
am just a processor for a telecoms company
and I don’t actually see anything’, ‘I am
not even interested in what’s coming in
there’ - but it absolutely does. If there is
personal data in there and you don’t need it,
then don’t have it. Or anonymise it so you
reduce your compliance footprint.
“If they say, ‘I am not going to do anything
with it, so I should be okay,’ they will be
in trouble. If not under GDPR, then under the
ePrivacy regulation and the Cyber Security
Act that is being proposed. Or under the
‘Directive on the Security of Networks and
Information Systems’. There is something in
that digital single market strategy that will hit
you. So, pay attention.”
GDPR and the Fourth Industrial Revolution
by Michael Baxter
There is a nice analogy with a chess board that illustrates the
point. Put a grain of rice on the top left-hand square. Moving to
the right, put two on the square next to it, then four, doubling with
each square as you work your way across and down the board until
you reach square 64, at the bottom right-hand corner.
At the end of the first row, you would have 128 grains of rice.
At the end of row two, around 32,000 grains; eight million by the
end of row three; and on the last square, more than nine quintillion grains of rice.
This, goes the idea, serves as a metaphor for the way technology
is accelerating.
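The arithmetic is easy to check. As a sketch, square n of the board holds 2^(n-1) grains:

```python
# Grains of rice on an 8x8 chessboard, doubling from one grain on square 1.
def grains_on_square(n: int) -> int:
    """Grains on square n (1-indexed): 2**(n-1)."""
    return 2 ** (n - 1)

for square in (8, 16, 24, 64):
    print(f"square {square:2d}: {grains_on_square(square):,} grains")

# square  8 (end of row one):   128
# square 16 (end of row two):   32,768
# square 24 (end of row three): 8,388,608
# square 64 (last square):      9,223,372,036,854,775,808 (over nine quintillion)
```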
Whether this is accurate or not, one can say, without doubt,
that technology is developing at an extraordinary pace.
Moore’s Law
Moore’s Law relates to an observation made by Gordon Moore, co-
founder of Intel, in 1965, that the number of transistors on an
integrated circuit doubles every two years. Today, Moore’s Law is
said to refer to computers doubling in speed every 18 months to
two years. It’s the classic example of accelerating technology, and
the chess board analogy seems to describe it perfectly.
The economist Brad DeLong estimated that if someone had built a
computer with the processing power of the iPhone X in 1957,
it would have cost 150 trillion of today’s dollars; the device would
have occupied a hundred-storey building, 300 metres high
and 3 kilometres long and wide, and would have drawn 150 terawatts
of power - 30 times the world’s generating capacity at that time.
There is a view that Moore’s Law is now slowing down, because
of the physical limits in cramming more transistors on a chip.
But newer technologies, such as quantum computers, applying
the super material graphene instead of silicon, or photonic chips
could give Moore’s Law a new lease of life.
Metaphorical Moore’s Law
But the idea of Moore’s Law can be applied in other areas
- not literally Moore’s Law, but a similar trajectory.
Examples include Butters’ Law, which predicts a doubling of
data transmission over a fibre-optic cable every nine months, or
genome sequencing, which has seen the cost of sequencing the
human genome fall from $2.7 billion when it was first sequenced
as part of the human genome project, at the start of the century,
to less than $1,000 today.
Renewable energy also seems to be falling in cost at an
exponential pace, although the driver here seems to be the
learning rate: a doubling in the user base leads to a proportional
fall in cost. Lithium-ion batteries, for example, have fallen in cost
from $1,000 a kilowatt-hour in 2008 to around $200 in 2016.
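The figures quoted above imply steep exponential declines. A rough sketch of the arithmetic (the 17-year window for genome sequencing is an assumption based on "the start of the century to today"):

```python
import math

def annual_decline(start_cost: float, end_cost: float, years: float) -> float:
    """Average annual fractional fall in cost implied by two price points."""
    return 1 - (end_cost / start_cost) ** (1 / years)

def halving_time(start_cost: float, end_cost: float, years: float) -> float:
    """Years for the cost to halve, assuming a constant exponential decline."""
    return years / math.log2(start_cost / end_cost)

# Lithium-ion batteries: $1,000/kWh (2008) to around $200/kWh (2016)
print(f"battery cost fall: {annual_decline(1000, 200, 8):.0%} per year")

# Genome sequencing: $2.7bn to under $1,000, over an assumed 17 years
print(f"sequencing cost halves roughly every {halving_time(2.7e9, 1000, 17):.2f} years")
```

On these figures, battery costs fell by roughly 18 per cent a year, and sequencing costs halved faster than once a year - quicker than Moore's Law itself.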
Convergence
But it’s when we take into account convergence that things get
exciting. Technologies to watch include:
•	 The Internet of Things
•	 Robotics
•	 Artificial Intelligence (AI)
•	 Augmented reality
•	 Virtual reality
•	 Super materials such as graphene
•	 Quantum computing
•	 3D printing/additive manufacturing
•	 5G and faster internet speeds
•	 Genome sequencing
•	 CRISPR/cas-9 - DNA editing
•	 Mobile computing/the cloud
•	 Blockchain
•	 Drones
•	 Energy storage
•	 Renewable energies
Convergence, combined with Moore’s Law and metaphorical
Moore’s Law, are causing things to develop at an extraordinary pace.
Faster computers are helping to support advances in AI.
Developments in screen technology, advances in processing
power, and the components that make up smartphones, such
as accelerometers - which tell a phone which way it is being held
- are supporting advances in virtual reality.
The combination of AI, with sensors that make up the Internet
of Things inside wearable devices, and genome sequencing is set
to create a revolution in health tech.
Data
But data is a key part of the revolution. The aggregation of data from
genome sequencing and from sensors monitoring the body, when analysed
by AI, may create new insights into the treatment of diseases.
According to a study by IBM, 90 per cent of the information on
the internet has been created since 2016.
Eric Schmidt, chair at Alphabet, once said that every two days
we create as much information as we did up to 2003. But it seems
to be getting quicker.
Another way to put it is to say technology has never changed
so fast, but it will never again change as slowly as it is now.
Fourth Industrial Revolution
The ‘Fourth Industrial Revolution’ was so named by Klaus Schwab,
founder of the World Economic Forum, which famously meets in Davos
every January. In 2016, it was the theme of the Davos conference.
It follows the first industrial revolution of the 18th century,
dominated by steam power and textiles; the second, from the mid-19th
century to 1914, led by the use of electricity, the rise of the motor car,
flight and mass production; and then the more recent IT revolution.
Personal data
It is clear that personal data will be part of this revolution
and the processing of it may create extraordinary insight and
make incredible efficiencies. But there is risk. Will it lead to a
surveillance state? The end of secrets does not sound so bad, but
the end of privacy does.
Maybe we are heading for a time when we sell our data. Can
AI take our data, even anonymised data, and, by comparison with
our data streams, de-anonymise it?
These are among the biggest issues of the day. GDPR takes us to
the heart of this matter. It has its critics, but the regulation may
represent a key moment in the battle to ensure the results of the
Fourth Industrial Revolution are benign.
Technology, as they say, is accelerating.
More to the point, it is also converging.
“Our research shows that around two thirds of people are comfortable
with the data exchange part of a modern economy,” says Chris Combemale.
The Direct Marketing Association (DMA) recently published
research into what the consumer really thinks about data privacy.
Data isn’t the only thing making the Fourth Industrial
Revolution possible, but without it, the Revolution could fizzle
out and die. And the DMA research finds that more needs to be
done to create the trust required for consumers to be
comfortable with their data being processed.
The DMA survey also found that 54 per cent say “trust in an
organisation is the single most important reason why they will
share data”, but 78 per cent say that the company gets the best
value of the exchange, 86 per cent say they want more control
over what companies do with their data, and 88 per cent say
they would like more transparency.
“Unless there is trust that data is held securely, and trust that
companies will only do with it what they say they are going to
do, and tell you openly and honestly, then we will have a real issue
going forward,” suggests Chris.
Chris began his working life in TV production before entering
the world of advertising and marketing, joining Young and
Rubicam in New York. He worked for the famous advertising and
marketing agency for 12 years, with a stint in Asia, working in
Hong Kong and Singapore. Today, he is CEO of the DMA Group,
comprising the DMA, IDM (Institute of Direct and Digital
Marketing) and TPS (Telephone Preference Service).
He sees a parallel between GDPR and the Advertising
Standards Authority. “The ASA ensures that, on a voluntary,
self-regulated basis, ads are truthful and meet societal standards.
If ads are found to be untruthful, they have to be removed. This
is really important, as it means the consumer can trust that
the ads they see are truthful, and doesn’t have to worry about
deciding which ones are telling lies and which ones are not. The
same applies to data: companies have to be open, honest and
transparent in their communications about what they are doing
with their data. GDPR will mandate this.”
Referring to the DMA survey, he said: “We have some way to go
if we are going to create the ecosystem and environment that is
right for the modern future.
“GDPR can play an important role, as it mandates that
companies, and government for that matter, give people more
control and greater transparency.
“If companies do a good job, they will articulate the benefits in
a better way than they have done, so that people start to see the
value exchange as a balance between the great benefit they get,
and the benefit the company gets.”
Yet, in some respects, he says that this perception that the
company achieves the greater benefit from the value exchange
is almost inexplicable. “When you think of some of the value we
get every day in our lives, think of the number of people who
use Google Maps for free, several times a day; it is not only
free, it contains minimal advertising.”
“Do we just take it for granted, is that why we won’t see value,
why 78 per cent say the company benefits from the value exchange?”
“GDPR is going to be helpful, not perfect, but it was drafted
with the idea of increasing consumer confidence in the future
technological and data driven economy. In a similar way to
how self-regulated watchdogs such as the ASA have created
confidence in the traditional advertising market.”
Advertising and the Fourth Industrial Revolution
The media is not like it used to be. There was a time when we
instinctively knew the difference between the media and, say, a
retailer or taxi operator. “But,” suggests Chris, “technology has
evolved such that the new products and services that have become
available, which make it easier and more efficient for people to live
their lives, are all essentially media channels that rely on the use of
data to deliver them. And that is true whether it be an Uber, which
uses data to work out where to collect you and drop you
off, or websites that sell you products or access to
services providing content.
“And increasingly, as we move into a world of AI and big data
analytics, more and more we will combine technology and data
to create new services of the future. But it is really important,
that people trust the technologies and how they use the data.”
The opportunity and threat from data
So, the point is hammered home. Trust is vital, and GDPR can be
a key part of creating that trust.
“Big data could be the new oil or the new asbestos. There is a
huge amount of opportunity for businesses, people and society
in the era of big data, but there is an equal amount of risk. That
is why it is important we have a set of rules on our playing field
which help us mitigate the risk and empower the opportunity.
“And that is what GDPR is trying to do. It is important we
think about the risk as well as the opportunities.
“At the DMA, we say we need to move from responsible
marketing and codes of practice, to ethical marketing, where
people think about the impact on society of these technologies
that are coming forward and understand how your business can
use them for positive purposes and mitigate the risk.
“Anyone who is wholly worried or wholly optimistic is
missing the point; optimism and worry must be in balance. And
the benefits cannot accrue if you don’t mitigate the risks.”
And that is why GDPR may prove to be a vital tool or even
weapon in the battle to ensure that the final outcome of the
Fourth Industrial Revolution is benign.
Elon Musk reckons AI poses an
existential threat to society;
the late Stephen Hawking was
similarly pessimistic.
This magazine is about privacy and data
protection. If you fret that AI may create
super intelligent artificial life-forms that
represent a threat to our existence, like a
Terminator, then you may or may not be
right, but that is not an issue for here.
How AI may use data, perhaps to
manipulate us, is an issue for here.
Artificial intelligence is a vague concept.
Many of us think we know what it means,
but if you immediately get thoughts of
The Terminator or HAL from Stanley
Kubrick’s 2001: A Space Odyssey, then
your view is quite different from that held
by many who work in AI itself.
Some experts define it as machine
learning - others say machine learning
is a subset of AI. But for all intents and
purposes, when we talk about AI in the
context of the Fourth Industrial Revolution,
we are talking about machine learning
and its close cousin, deep learning.
Machine learning relates to the analysis
of a data stream, often a stream containing
huge volumes of data, to extrapolate
insights. It will often do this via neural
networks, a computer configuration which
is meant to mirror how the brain works,
with neurons that form links or synapses
with each other.
Deep learning takes this concept a
little further, maybe analysing data from
multiple data streams.
The applications could include creating
medical insights: spotting links between
certain behaviour, symptoms and data
from genome sequencing and certain
conditions. The result could be more
accurate and speedier diagnosis and
treatment of disease.
One of the paradoxes of big data is the
personalisation of solutions - for example
analysis of data to pinpoint the ideal
treatment for individuals based on their
circumstances and genetic data.
But data can also be used to support
marketing, targeting advertising, for
example, with extraordinary accuracy,
or it could be used by retail to help
in merchandising, stock control and
matching stock with customer demand.
Or AI could, for example, create data
to help with traffic monitoring, control
and flow.
AI, GDPR and the anonymisation of data
GDPR and AI
GDPR is quite specific. Whatever legal
basis you use for processing personal
data, whether, for example, it is consent,
legitimate interests or contract, the
data used must be specific to its purpose
and it cannot be held longer than is
absolutely necessary.
Anonymisation
But while data can provide important
insights, it usually does not need to name
an individual. Big data can be anonymous
and only applied to an individual when
the insights from that data can benefit that
individual specifically.
De-identification of data
In fact, under GDPR, there is a spectrum
of de-identification: pseudonymised
personal data, and anonymised data.
The challenge here relates to the potential
for AI to take multiple data streams, and
from that identify who individuals are, even
though someone has gone to the trouble of
de-linking data to individuals.
So, say a clinician doing research receives
de-identified data: they shouldn’t then be
able to identify the individuals. This is an
example of pseudonymised data if, with
information from other sources, there is a
way to link back.
With anonymised data, no such link
back is possible.
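The distinction can be made concrete. Below is a minimal sketch (in Python, with invented patient records and a hypothetical secret key, not any real system) of pseudonymisation by keyed hashing: whoever holds the key can re-derive the link to individuals, so the output remains personal data under GDPR; only if that link is genuinely severed could the data be called anonymous.

```python
import hmac
import hashlib

# Held by the data controller only; anyone with this key can re-link.
SECRET_KEY = b"keep-this-out-of-the-research-dataset"

def pseudonymise(patient_id: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    The mapping is repeatable for the key holder, so the result is
    pseudonymised personal data, not anonymous data.
    """
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

records = [
    {"patient_id": "NHS-123", "condition": "asthma"},
    {"patient_id": "NHS-456", "condition": "diabetes"},
]

# What the researcher sees: no direct identifiers, but linkable by key.
research_view = [
    {"subject": pseudonymise(r["patient_id"]), "condition": r["condition"]}
    for r in records
]
```

The point of the sketch is the asymmetry: the researcher cannot reverse the hash, but the controller can regenerate it at will, which is exactly why GDPR still treats the data as personal.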
In practice, Abigail Dubiniecki tells us,
it is extremely difficult, if not impossible,
to stop the link back again, “because there
are a thousand points of light, so much
personal data and so many different data
elements pushing out constantly about us.”
Dread Pirate Roberts and the
tax investigator who brought
him down by de-anonymising data
Gary Alford was a tax inspector. He didn’t
have access to any unique information; he
just had the channels available to us all.
Even so, in June 2013 he brought down
Ross William Ulbricht, the so-called
kingpin Dread Pirate Roberts, the man
who operated the Silk Road marketplace
on the dark web.
Selling drugs was just part of the story:
Silk Road was an online black market.
And most of the sales were indeed for
drugs, including stimulants, psychedelics,
prescription drugs, precursors, opioids,
ecstasy and dissociatives, but there were
other products too: fake driving licences
and, supposedly, even the option to take
out a contract on someone, to hire a hit man.
Libertarians point to the philosophy behind
Silk Road and the ideology of its founder, a
fan of the free-market economist Ludwig
von Mises. Its terms of service prohibited the
sale of certain items whose purpose was to
‘harm or defraud’: child pornography, stolen
credit cards, assassinations and weapons of
any type were banned, making a hero
of Ulbricht in the eyes of many.
The name Dread Pirate Roberts came from
Ulbricht’s login details. The FBI investigated,
but his identity remained a secret.
But Gary Alford was able to track him
down, ascertain his identity, by spending
time on his own computer, reading through
chat rooms, and old blog posts.
His main tool was Google, using advanced
search to focus on references to Silk Road
during a certain time period: the early days.
He was able to track down early messages on
forums that referred to Silk Road and found
one particular communication on a forum
asking for help with programming, from the
email address rossulbricht@gmail.com. The
original message had in fact been deleted,
but Alford was able to find a record of the
communication in someone else’s response.
By doing this, Alford was able to
take anonymous data and reveal the
identity of Dread Pirate Roberts,
with Ulbricht eventually receiving a
life sentence.
It just goes to show: if an individual,
making clever use of Google, can do that,
consider how AI, with access to a gigantic
array of data, can de-anonymise data.
In the study Unique in the Crowd: The
privacy bounds of human mobility, Yves-
Alexandre de Montjoye, César A. Hidalgo,
Michel Verleysen and Vincent D. Blondel
found that “in a dataset where the location
of an individual is specified hourly, and
with a spatial resolution equal to that given
by the carrier’s antennas, four spatio-
temporal points are enough to uniquely
identify 95 per cent of the individuals.”
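The paper’s uniqueness test is easy to reproduce in miniature. The Python sketch below uses three invented mobility traces (not the study’s data) and asks: given k known space-time points about a person, how often do those points match exactly one trace in the dataset?

```python
import random

# Toy mobility traces: person -> set of (hour, cell-tower) points.
traces = {
    "alice": {(8, "A"), (9, "B"), (13, "C"), (18, "A")},
    "bob":   {(8, "A"), (9, "B"), (13, "D"), (19, "E")},
    "carol": {(7, "F"), (9, "B"), (14, "C"), (18, "A")},
}

def is_unique(user: str, k: int, rng: random.Random) -> bool:
    """Pick k known points from `user`'s trace and check whether
    those points match exactly one trace in the whole dataset."""
    known = set(rng.sample(sorted(traces[user]), k))
    matches = [u for u, t in traces.items() if known <= t]
    return matches == [user]

def uniqueness(k: int, trials: int = 1000) -> float:
    """Fraction of random k-point samples that single out their owner."""
    rng = random.Random(42)
    hits = sum(is_unique(u, k, rng) for _ in range(trials) for u in traces)
    return hits / (trials * len(traces))
```

Even in this three-person toy world, one shared point is rarely identifying, but three points pin every trace to exactly one person, which is the paper’s finding in miniature.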
Vijay Pandurangan, a former Google
engineer and founder and CEO of a
company called Mitro, analysed 20
gigabytes of trip and fare data
involving New York cabs, comprising
more than 173 million individual rides.
The data was supposedly anonymous. His
conclusion: “This anonymisation is so poor
that anyone could, with less than two hours’
work, figure out which driver drove every
single trip in this entire data set. It would
even be easy to calculate drivers’ gross
income or infer where they live.”
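The failure mode here is a common one: obscuring a small identifier space with a plain, unkeyed hash. The Python sketch below is a hypothetical reconstruction (the medallion format and values are invented for illustration, and this is not Pandurangan’s actual code): hash every possible medallion once, and the “anonymisation” reverses instantly.

```python
import hashlib
import string
from itertools import product

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

# Hypothetical medallion format: digit, letter, digit, digit (e.g. "5X44").
# Only 10 * 26 * 10 * 10 = 26,000 possibilities, so hash them all.
table = {}
for d1, letter, d2, d3 in product(string.digits, string.ascii_uppercase,
                                  string.digits, string.digits):
    plate = f"{d1}{letter}{d2}{d3}"
    table[md5_hex(plate)] = plate

# A supposedly anonymised trip record: the medallion is "hidden" by hashing.
record = {"medallion": md5_hex("5X44"), "fare": 12.50}

# Reversing it is a single dictionary lookup.
recovered = table[record["medallion"]]
```

Because the whole identifier space fits in memory, an unkeyed hash is just a renaming, not anonymisation; a keyed hash, or removing the field entirely, avoids this particular trap.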
Khaled El Emam is the Canada Research
Chair in Electronic Health Information
at the University of Ottawa, an Associate
Professor in the Department of Pediatrics.
In an article for the BMJ he said: “Data with the
individual names and addresses removed
will have a high risk of re-identification.”
He said that pseudonymised data should
be treated as personal information.
As Dubiniecki says: “It is very hard to
be anonymous. You can anonymise the
data as effectively as you can through
techniques that are tried and true, but as an
individual layperson, it is not for you to
think you have truly done it; you really do
need the help of a professional or a tool,
and in addition to technical measures you
need organisational methods.”
For every anonymisation or pseudonymisation
that you do, you have to do a re-identification risk
assessment to make sure you have identified the
different ways the data can be re-identified, and
then you put in measures to mitigate those risks
and protect the data.
And then, before you share data, you
must put contractual and organisational
protections around that sharing.
“Never throw anonymised data into the
real world for anyone to use like Netflix
did in 2006” says Ms Dubiniecki, “because
then there is no control over the thousand
points of light that can be built in.”
One method being tried involves the
process of adding noise, making it harder
to de-anonymise data, using what is called
differential privacy. Abigail says: “The
process involves taking a data set, running
a parallel one, and adding some noise so
that you can run your analysis and get
something that is ‘pretty good’. But the more
noise you add, the less accurate it is, so it is
questionable how effective this can be.”
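For the curious, the core of differential privacy can be sketched in a few lines. This toy Python example (the ages and the choice of epsilon are invented for illustration) answers a counting query, “how many people in this list are 40 or over?”, with Laplace noise added; a smaller epsilon means more noise, more privacy and, as Abigail warns, less accuracy.

```python
import math
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    """Sample Laplace-distributed noise via inverse-transform sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon: float, rng: random.Random) -> float:
    """A counting query released via the Laplace mechanism.

    A count has sensitivity 1 (adding or removing one person changes it
    by at most 1), so Laplace noise of scale 1/epsilon gives epsilon-DP.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon, rng)

ages = [23, 37, 45, 52, 29, 61, 48, 33]  # invented survey responses
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5, rng=random.Random(7))
```

The trade-off Abigail describes is visible in the scale parameter: halve epsilon and the typical noise doubles, so each released answer drifts further from the true count of 4.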
Apple says it is using differential
privacy, but the community is sceptical.
Facebook has talked about using it too.
GDPR puts emphasis on the anonymisation
and pseudonymisation of data, but being
100 per cent confident it cannot be
de-anonymised may not be possible.
But GDPR is a risk-based regulation. The
key lies in assessing the risks, addressing
high risks and reducing them to residual
risks, documenting the measures you
have taken, applying re-identification
risk assessments if you are anonymising
or pseudonymising, employing outside
expertise, and ensuring the data you do
process is relevant to the purpose and you
do not hold onto it longer than you need to.
By Michael Baxter
SELLING YOUR DATA...
“I give you permission, in exchange for a sum of money, say $100 a month to
process my data.” Is that a possible future, and what might GDPR
have to say about it?
In a feature by Stephan Talty in the Smithsonian magazine, a
future was envisaged, circa 2065, in which computers and robots
have taken our jobs, but most of us enjoy universal basic income,
a set monthly amount, freeing people up to spend their time
making movies, volunteering and travelling to the far corners of
the earth.
“There will be Christian, Muslim and Orthodox Jewish districts
in cities such as Lagos and Phoenix and Jerusalem, places where
people live in a time before AI, where they drive their cars and
allow for the occasional spurt of violence, things almost unknown
in the full AI zones. The residents of these districts retain their
faith and, they say, a richer sense of life’s meaning. Life is hard,
though. Since the residents don’t contribute their data to the AI
companies, their monthly UBI is a pittance.”
Sylvia Kingsmill, a Partner at KPMG Canada, and the Canadian
Digital Privacy & Compliance Leader, blanched at the idea when
we put it to her. “What kind of society do we want to have?
If we look at privacy as a right, a freedom, as Europeans do, I
would disagree. You only understand what the implications of
a data breach are if you have been affected and your reputation
has been affected by this.” She does concede, however, that it
“depends on your cultural background.”
Forget about the year 2065, who knows what the equivalent of
GDPR will say then. What about selling your data today?
Chris Combemale at the DMA said: “There are some models
today in which people are selling their data in exchange for
services. But there are many other models in which peoples’
data is used to pay for services. Data will be at the heart of
everything, and people will have to become more comfortable
with a greater level of transparency within their own lives and
accept that companies are governments in providing services will
know quite a lot about them. But it must always be clear how this
operates and people must be given choice.”
But the issue of paying for data matters now, and GDPR is most
certainly relevant. It all boils down to the legal basis for processing
data. Abigail Dubiniecki’s view is that under consent there is no
option to sell your own data: “If you are making the lawful basis
consent, it has to be freely given, and the moment you make it
conditional, it is not freely given.”
But other legal bases may make it possible. “Contractual
necessity is another legal basis. If the purpose of the contract
is data exchange, so you can do X with the data and I can make
money off that, then perhaps you could make an argument that
you can do that on a contractual necessity basis. If that’s the
essence of the contract.”
She pauses; Abigail clearly feels uneasy about the concept. “I
am not selling this idea, I think privacy is more fundamental
than that. The right to privacy and to protection
of personal data isn’t a mere property right to be bought and
sold. It’s a fundamental human right and it’s tied to so many
other rights, like the right to be free from discrimination.
Discrimination is one of the top harms listed in the GDPR
recitals and a consequence you must consider when applying
the GDPR’s risk-based approach. Under Quebec law (a civil
law province in Canada), the right to privacy is an inalienable
right. You can never fully and permanently sign it away. That
said, there could be instances where you might transact with a
business using your personal data as currency.
“So the contract might state that this is the personal data that
is necessary for the purpose of the contract, and the contract
involves crunching up the data and putting it with other people’s
data, in exchange for the consideration (as we say under common
law), say a sum of money or a token. And then, when it is no
longer necessary for the performance of the contract, they stop
using it.” People.io uses this model, and, to a certain extent, so
does the Hub of All Things.
But it seems such an arrangement may only apply to a specific
exchange, not a relationship where you are constantly feeding in
data. She draws an analogy with a contract to buy a coffee: “They
might charge me £2.20; if I accept that offer but they supply
a frog, that is a breach of the contract. And if they say that in
addition to the money I have to give them everything in my purse,
that is not contractually necessary to get this coffee.
“But in the digital Wild West, there is this surface relationship,
the view that ‘we are going to hoover up everything we can get,
maybe unbeknownst to you; we will cover it in all these terms and
conditions, which you can’t find because they are somewhere else
and the screen keeps stalling, and there are speed bumps in the
way, and this is not an enlightened choice that you are making’.
That is just not going to happen. Remember: GDPR is about
transparency, accountability and control.”
Maybe, in the not too distant future, we will sell our data.
By Michael Baxter
Anatomy of a breach in trust,
the Facebook story
Facebook’s CEO conceded that the
Cambridge Analytica saga, which has
put data protection and privacy on
the front pages of newspapers for
weeks, was an example of a breach
of trust. But was it a data breach?
Regardless of the technicalities, the
whole episode cuts right to the core
of one of the most important issues
of our time. It’s important that we
take an objective perspective.
In April 2015, Mark Zuckerberg selected Orwell’s
Revenge, by Peter Huber as one of his books of the
year. “Many of us are familiar with George Orwell’s
book 1984. Its ideas of Big Brother, surveillance and
doublespeak have become pervasive fears in our
culture,” wrote Zuckerberg.
He continued: “Orwell’s Revenge is an alternate
version of 1984. After seeing how history has actually
played out, Huber’s fiction describes how tools like the
internet benefit people and change society for the better.”
As these words are being written, Zuckerberg is
being cast as a kind of privacy fiend. Not so long ago,
people in some quarters saw him as the most likely
next President of the United States, now, read some
media reports and you could be forgiven for thinking
he is like a Bond villain in the flesh.
Such descriptions are unfair. He has erred, of that
there is no doubt, maybe at 34 years of age, having
only ever known extraordinary success, he does not
have the wisdom that comes with the odd failure, but
before we go any further it is worth considering his
philosophy. And maybe we can tell much about this
by considering some of his other books of choice;
‘Why Nations Fail’ by Daron Acemoğlu, in which
‘extractive governments’ use the power of technology
and other powers to enforce the power of a select
few; ‘The Rational Optimist’ by Matt Ridley, a book
about the positive benefits of new technology; and
‘The End of Power’ by Moisés Naím, a book about a
shift in power from authoritative and military based
governments and corporations to individuals.
Of the last book on that list Zuckerberg said: “The
trend towards giving people more power is one I
believe in deeply.”
In fact, scan through all the books that Zuckerberg
rates and you get the impression of a young man who
sees technology as a way to end poverty, and give
more power to individuals, a force for freedom and
empowering ordinary people.
That is a quite different philosophy to the one many
associate with the corporate world - and at odds with
the critiques laid at Facebook - of a peddler of fake
news, echo chambers, and insidious manipulation.
At core, this may boil down to your view of humanity,
and the ideology behind Facebook. Zuckerberg once said:
“We’ve gone from a world of isolated communities to
one global community, and we are all better off for it…
[Our mission] is to connect the world… [The danger
is in] people and nations turning inwards against this
idea of a connected world and community.”
The advertising medium
But look beyond the ideology and you find a money-
making machine - money may not grow on trees, but
it does seem to emerge out of personal data.
In the final quarter of last year, Facebook made
$4.26 billion profit after tax, but even that understates
the underlying picture: the company incurred a
one-off $3 billion tax charge.
Maybe this statistic is the key one: Facebook
generated $6.18 per user during the quarter. From the
point of view of the user, that is not so great. Maybe
there would not be that much money in selling one’s
personal data even if one wanted to.
It also grabbed an 18.4 per cent share of the global
digital advertising market, behind Alphabet/Google
on 31.3 per cent share, according to eMarketer.
And it is this market share pertaining to Facebook
and Google that vexes many in the world of
publishing. Facebook doesn’t pay for the content that
appears on its platform, unlike most publishers who
have to fund content with pounds, dollars and euros
- bitcoins, or whatever currency they choose. They
say it isn’t fair. Without the quality journalism they
produce, Facebook and Google would not be nearly so
profitable, yet the lions’ share of the advertising bucks
does not go the companies that invest in journalists.
The reason is simple, it lies with data. An
advertisement on Facebook can be targeted with
extraordinary accuracy. And advertising budgets can
start at just a few dollars. In today’s digital business
world, a business model has evolved known as the
‘lean start-up’: it involves rapidly developing as simple
a version of a new product as possible and testing it.
Back in the media’s analogue days, such an approach
would have been much harder. Start-up production costs
were higher, and so was the minimum advertising spend.
Facebook type mediums represent a superb way to
test new ideas, for start-ups to test the market, and
that’s not a feature the traditional publishers are too
keen on.
Power and responsibility
But as French philosopher Voltaire once said: “With
great power comes great responsibility,” a saying recently
returned to the public consciousness by the Marvel
Comics character Spider-Man. Facebook has power, of
that there is no doubt.
GDPR is about transparency, accountability
and responsibility. Has Facebook been living up
to such responsibility?
One person who has doubts is Ardi Kolah, Director,
GDPR Transition Programme, Henley Business
School, and Editor in Chief, Journal of Data Protection
& Privacy. Facebook’s product may be a massive
generator of data, creating huge profits, and
“capitalism creates a right to make money from good
business,” he suggests. “Mark Zuckerberg is brilliant,
but with rights comes responsibility. He has used his
rights to accumulate a fabulous fortune, but where is
his responsibility?”
He continued: “The extent of the obfuscation by
the social media giants is breathtaking. Facebook’s
service agreement runs to some 3,700 words covering
10 sides of A4 and Twitter’s is 11,000 words, taking
up 32 pages.”
Ardi turns to Google, “where is its responsibility
for recent gang violence in Nottingham leading to a
tragic death?”
On this theme, The Times recently quoted a Home
Office spokesman saying: “Gangs often post videos
online that seek to incite violence or glamorise
criminality to influence young people. The instant
nature of social media also means that plans develop
rapidly and disputes can escalate very quickly.”
It’s a good point but recall too that social media
played a key role in supporting the Arab Spring.
Google might say “don’t be evil”, but does it practise
what it preaches?
Facebook breach
It was, of course, the use of data by Cambridge
Analytica, via the academic Aleksandr Kogan, that
put Facebook on the front pages and saw around $80
billion knocked off its market cap.
The privacy question related to an app created by Dr.
Aleksandr Kogan called ‘thisisyourdigitallife’ which
used the Facebook login feature. 270,000 people used
the app, sharing personal information. So far, it seems,
nothing untoward had happened. A more controversial
step followed: Kogan was able to make use of a Facebook
feature that existed at that time which provided
access to the data of friends of those initial users. Mark
Zuckerberg now concedes that this could have created
data on no fewer than 87 million individuals.
This data was then used by Cambridge Analytica,
passed to the company by Kogan. Facebook says that
this was against its terms and conditions.
This all leads us to the question, was there a data
breach? Zuckerberg is careful how he words his
response: “There was a breach of trust.”
Abigail Dubiniecki was not convinced: “Although
Facebook may want to split hairs and say this wasn’t
a breach, it absolutely was a breach. But it wasn’t
a breach where someone had to do some complex
hacking; it was Facebook not taking care, as they
should, with other people’s data.”
Ardi Kolah was equally unequivocal: if the use of
this data possibly influenced the US election and the
EU Referendum, if data on 87 million people was used
without their knowledge, how could it have not been
a breach?
Next
This all raises the question: has Facebook applied
sufficient measures as required by GDPR? Elizabeth
Denham, the Information Commissioner, recently
said: “Facebook has been cooperating with us and,
while I am pleased with the changes they are making,
it is too early to say whether they are sufficient under
the law.”
In short, there is a big question over whether, even
now, Facebook has taken sufficient steps to follow the
GDPR requirements.
Ardi Kolah wonders whether Facebook itself
has sufficient maturity - it may be a massive
company with huge influence, but has it enjoyed
sufficient time to gain the maturity required for
greater social responsibility?
Ardi questions whether data of the type controlled
by Facebook should be in the private sector. The
company enjoys a network effect, the more users it
has, the more attractive its product offering to users,
arguably Facebook is a natural monopoly - which,
given the social and moral questions associated with
so much data, may suggest it is a kind of public utility.
Yet the stated Zuckerberg philosophy, with its
emphasis on creating communities, seems to be
at odds with this perception. Zuckerberg clearly
has more money than he needs, even if he lived to
a thousand his riches would be at a level that even
highly successful business people could only dream
of. Perhaps the problems began when the company
was floated - the tyranny of quarterly accounting
created a pressure to grow and generate money.
It is worth recalling, however, that Facebook offers
consumers an extraordinary product ostensibly for
free. The data it generates can help grease the wheels
of the economy and, via the information it brings,
support the marketing of new products; it may be a
great supporter of wealth creation. And while the
dangers of filter bubbles are clear, it is also a remarkable
tool for promoting debate. Is Facebook helping to create
a new interest in politics, creating a more engaged
populace, supporting democracy?
Bear in mind too, that the greatest winners from its
demise are the very media who are so keen to write
its obituary.
Personal data is not evil, it’s a key driver of the
Fourth Industrial Revolution, but without trust from
the public, the Revolution may be more like a new
year’s resolution, forgotten soon after it was made.
GDPR is there to create public confidence. It is not
the enemy of Facebook, and never should it be. Instead
it hoists certain standards upon the company: greater
transparency, accountability and responsibility. The
company is perfectly entitled to process personal
data, providing it does so within the legal bases of
GDPR, meaning there are constraints: data can only
be used in specifically defined ways, with no
handing it out to third parties with nefarious intent.
And given how the Facebook share price has fallen,
and given the campaigns to get people to delete
Facebook, the company desperately needs to rebuild
trust with the public; GDPR could be its single most
important means to achieve this.
By Michael Baxter
Blockchain and GDPR
by Ardi Kolah
What struck me was that the digital landscape referred to in the
Recitals of GDPR isn’t the same as the technology landscape
these ‘digital disruptors’ are building today.
According to the Oxford English Dictionary, Blockchain is:
“A system in which a record of transactions made in bitcoin or
another cryptocurrency are maintained across several computers
that are linked in a peer-to-peer network.”
That sounds very geeky and techie, of little relevance to
the rest of us and of interest only to those with a strong
interest in advanced mathematical algorithms? Wrong!
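In fact, stripped of the jargon, the chaining mechanism at the heart of it is simple enough to sketch in a few lines of Python (a toy illustration, not a real cryptocurrency): each block records the hash of the previous block, so tampering with any historical record breaks every link after it.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's full contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    """Append a block that commits to the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain: list) -> bool:
    """True only if every block still points at the real hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
ok_before = verify(chain)                      # True
chain[0]["transactions"][0]["amount"] = 500    # rewrite history
ok_after = verify(chain)                       # False: tampering detected
```

Real blockchains add peer-to-peer replication and consensus on top of this, but the tamper-evidence shown here is the property everything else builds on.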
Blockchain isn’t just about cryptocurrencies. Clarity in its
potential use beyond cryptocurrencies is rapidly emerging
and promises to change the face of professional, business and
industry sectors such as healthcare, manufacturing, aerospace,
energy, financial services and even law.
In short, will Blockchain effectively kill the sacred cows of
our current understanding of data and consent – and, dare I
say it, GDPR itself – even before it becomes fully enforceable
across the European Union on 25 May this year?
The ‘digital disruptors’ I met on a cold and rainy night at
Silicon Roundabout in London seem to think so.
They were enthusiastic about the prospect of massive industry
disruption and the opportunities for creating new business
models as a result of the power of Blockchain.
But this isn’t some distant dream. Currently, 80 per cent of
banks are developing Blockchains, and industries from law to
aerospace are already exploring possibilities. The next five to
ten years will therefore see massive disruption from Blockchain
adoption as jobs are automated and new industry applications are created.
When the internet was born, people used it to email one
another. Businesses like Amazon and Uber were inconceivable.
According to the GBA, Blockchain promises to be a revolution of
similar proportions, with undetermined potential, ramifications
and opportunities.
“Blockchain has the potential not only to change how we transfer
value, but could shift our systems of trade, identity, efficiency and
governance across all sectors, radically transforming traditional
approaches to management,” explains Nir Vulkan, Associate
Professor of Business Economics and Co-convenor of the Oxford
Blockchain Strategy Programme at Oxford Saïd Business School.
Many are looking at what’s on the horizon rather than how
things are done today. And because of the pace of technological
change, the future is rapidly catching up with the present.
As a result, is ‘consent’ dead? Does ‘consent’, as we understand
the term, work any more in this Blockchain landscape? How
can a user give consent when they no longer understand what they
are consenting to? Even if they read a privacy notice or terms of
use, do you really think a consumer will understand how you’re
processing that personal data?
Is personal data the ‘new oil’, as The Economist proudly
proclaimed on its front cover not that long ago? If oil is a finite
resource and personal data is an infinite one, do we have to
question whether this analogy – although helpful to some extent
– is now out of date? Should we now be thinking that
intelligence – not data – is what is most useful, as AI and Blockchain
technologies become smarter at manipulating this stuff? And so
where does this leave GDPR?
According to Nick Oliver of People.io “GDPR is the future of the
past. Yes, it’s long overdue. Yes, it’s a good thing. Does it solve for the
future? No. Does it solve for the now? Probably not.”
Ardi Kolah was invited to speak to around 150 people at
a gathering of the Government Blockchain Association
(GBA). Here, he considers GDPR and blockchain.
SUBSCRIBE FOR FREE TODAY HERE
GDPR and the small business
Make sure you don’t miss issue 2 – subscribe here for free
All organisations have a responsibility to ensure data is as safe as it can be and that it is
being processed fairly. If you’re unsure whether something is okay to process, you should
ask yourself whether it feels right, because if it doesn’t, then it probably isn’t.
As many small businesses start to learn about GDPR,
they can initially feel daunted. This is likely to be
due to a lack of money and resources, making small
businesses feel they will struggle to comply.
However, what’s more concerning, is that
many small businesses still have not even
heard of the General Data Protection
Regulation (GDPR) with less than a month
to go until the deadline on May 25th.
Some rumours rattling around suggest
that small businesses don’t need to comply
with the new data protection regulations.
Yet the adverts launched in early March
by the ICO to raise GDPR awareness –
principally aimed at micro businesses,
those employing fewer than 10 people –
show that they will not have a ‘get out
of jail free’ card if something
were to go awry. Even a one-person
operation is required to comply with the new
data protection rules.
The ICO campaign launched with a
series of radio adverts, aimed at those
micro-businesses who haven’t heard
about the GDPR at all.
The Information Commissioner Elizabeth
Denham said: “All organisations have to be
ready for the new data protection rules, but
we recognise that micro-businesses in the
UK face particular challenges.
“It’s also worth noting that many sector
and industry groups and associations are
offering help to micro businesses about
the GDPR and can be a good starting
point for industry-specific advice.”
Denham also added that steps towards
GDPR compliance can be achievable
without too much cost or expensive
external support.
Are there exemptions?
Although small businesses must comply
with the regulation, they have some partial
exemptions. For example, Article 30
requires organisations to keep records of
their processing activities and categories,
and to make those records available to the
supervisory authority on request. Within the
article, it explains that this does not apply to
organisations employing fewer than 250 people.
However, this is only a partial exemption.
If you employ fewer than 250 people, you
will still need to document processing
activities that are conducted on a regular
basis or which involve the processing of
sensitive data such as criminal convictions,
sexual orientation, health etc. The ICO has
produced a template which you can use to
record this processing information.
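The ICO template itself is a spreadsheet, but the same idea can be sketched in code. The following is an illustration only – the field names and the carve-out check are a rough reading of Article 30, not the ICO's or the regulation's wording:

```python
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    """One entry in a record of processing activities (cf. Article 30 GDPR)."""
    purpose: str            # why the data is processed, e.g. "payroll"
    data_categories: list   # e.g. ["name", "email", "health"]
    recipients: list        # who the data is disclosed to
    retention_period: str   # how long the data is kept
    security_measures: str  # general description of safeguards

# Special categories that keep the documentation duty alive even for
# organisations employing fewer than 250 people (illustrative list).
SENSITIVE = {"health", "criminal convictions", "sexual orientation"}

def must_document(employees: int, record: ProcessingRecord,
                  regular_processing: bool) -> bool:
    """Rough reading of the Article 30(5) carve-out: under 250 staff are
    exempt unless processing is regular or involves sensitive data."""
    if employees >= 250:
        return True
    return regular_processing or bool(SENSITIVE & set(record.data_categories))
```

A ten-person firm processing health data in its payroll records, for example, would still need to document that processing under this reading, while a genuinely occasional, non-sensitive activity would not.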
Some small businesses may not need to
appoint a Data Protection Officer, but the
ICO should be contacted if there is any
doubt on this issue. If you are not required
to appoint a Data Protection Officer, you
should be aware that making a voluntary
appointment will bring additional statutory
responsibilities. The DPO role can be
fulfilled by an individual on a part-time basis
or outsourced, and will contribute to your
compliance with the GDPR.
A DPO may also be working for a data processor, and in
such cases will need to keep a record for their clients.
I would point them to the ICO guidance, which has
templates for this.
If you are a small business and a data
processor, you will be required to keep a
record for your clients. Although you may
have less information to document as a
processor, it is still important to document
activities. The ICO has specific templates for
data processors.
How do you comply?
As stated previously, the same rules
pretty much apply for both large and
small businesses. The regulation aims to give
individuals more control over their data.
Therefore, organisations must ensure
personal information is:
•	 fairly and lawfully processed, for specified purposes;
•	 adequate, not excessive, and kept up to date;
•	 kept for no longer than is necessary;
•	 processed in line with the rights of the individual;
•	 kept secure; and
•	 not transferred to countries outside the European
Economic Area unless it is adequately protected.
When handling the personal data
that a company holds for both clients
and its staff, fairness, transparency and
confidentiality should be upheld.
Data mapping
Small businesses should start by finding
out what personal data they are storing, and
where, by whom and why. You might be surprised by
how much data you have unnecessarily
or in the wrong place. Once the data
mapping has been completed and you have
identified each contact and noted why you
have obtained their personal data, you need
to check it is compliant with the above and
other key principles of GDPR.
For example, if you have personal data
that you don’t really need, it should be
deleted. The same applies if the data is not
recognised, is out of date, or is too
incomplete to be of any use.
A process should then be put in place to
reassess the data you retain to be completed
on an ongoing basis. This ensures only
relevant data is kept and processed.
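To make that ongoing reassessment concrete, here is a minimal sketch – the field names and the one-year review threshold are invented for illustration, not a prescribed format – of flagging inventory entries that have no documented purpose or have not been reviewed recently:

```python
from datetime import date, timedelta

# A toy data inventory: each entry records what is held, where,
# why, and when it was last reviewed. Field names are illustrative.
inventory = [
    {"contact": "j.smith@example.com", "location": "CRM",
     "purpose": "newsletter", "last_reviewed": date(2018, 1, 10)},
    {"contact": "old.lead@example.com", "location": "spreadsheet",
     "purpose": None, "last_reviewed": date(2016, 3, 2)},
]

def flag_for_deletion(inventory, today, max_age=timedelta(days=365)):
    """Return entries with no documented purpose or a stale review date:
    candidates for deletion under the data-minimisation principle."""
    return [e for e in inventory
            if e["purpose"] is None or today - e["last_reviewed"] > max_age]

stale = flag_for_deletion(inventory, today=date(2018, 4, 1))
# the second entry has no purpose and is over a year old, so it is flagged
```

Run on a schedule, a check like this ensures only relevant, justified data survives each review cycle.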
Privacy Policy
Your website is often seen as the equivalent
of your shop window. Therefore, it needs to
be perceived as trustworthy. Creating a
privacy policy that explicitly explains what
you do with data will certainly
help people trust what you are doing
with their data. It should be accessible
to all and should reference the GDPR
requirements.
Train Staff
All staff should be clued up on the new
legislation and should learn basic data
hygiene, for example workstation security.
Staff should also feel comfortable
about reporting anything that they feel
compromises data protection, privacy and
security of customers, clients, supporters
and employees. They should be able
to report anything without fear of any
personal repercussions.
Denham has talked about creating a
culture of data protection which “pervades
the whole organisation.” This is expected
under GDPR.
Subject Access Requests
Many are concerned that there will be a
huge influx of data subject access requests
(DSARs) when GDPR comes into force.
Small businesses will not be exempt from
these requests. This means that organisations
must be able to provide a description of
the data, why they are holding it, who it is
disclosed to, and a copy of the personal
data in an intelligible form.
As stated previously, small businesses
are not required to keep a record of
everything; however, if you don’t keep a
record of customer data, then you might
not be able to send this information in the
correct timeframe.
This means that small businesses should
look at all their data to understand what
data is held, where it is coming from and
where it is going. The best option is to
keep a record of processing activities so
you don’t have to search through heaps of
material at a later date.
Staff are also fundamental in ensuring
data subject access requests (DSARs) are
completed in time. They should be trained
on how to spot a DSAR as this could come
via an email, telephone conversation,
letter, the website etc. They should then
understand what to do with them, which
requires a process to be put in place.
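Part of that process is simply tracking the clock: GDPR gives one calendar month to respond to a DSAR, extendable by up to two further months for complex requests. A small sketch of the deadline arithmetic (the helper name is ours, and month-end dates are clamped, so a request received on 31 May falls due on 30 June):

```python
import calendar
from datetime import date

def dsar_due_date(received: date, months: int = 1) -> date:
    """One calendar month from receipt of a DSAR. GDPR allows an
    extension of up to two further months for complex requests,
    which can be modelled by passing months=3."""
    month = received.month - 1 + months
    year = received.year + month // 12
    month = month % 12 + 1
    # Clamp to the last day of the target month (e.g. 31 May -> 30 June).
    day = min(received.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)
```

Logging each request against a date like this makes it easy to see, at a glance, which DSARs are approaching their deadline.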
As mentioned, staff training is essential.
Human error is often considered the
highest risk of a data breach within an
organisation. By making sure staff know
its importance and their responsibilities, you
will lower the risk of a breach. Training
staff can also be used as evidence for
demonstrating compliance. If a breach
were ever to occur, you can show that you
have actively complied with the legislation.
Security
Security is another factor that small
businesses need to contend with. Cyber-
attacks are not just a large business issue. In
fact, according to the government’s Cyber
Security Breaches Survey 2017, 46 per cent
of UK businesses reported having a data
breach in 2016. Cyber-attacks on small
businesses have significantly increased,
which could be due to cyber criminals
taking advantage of SMEs’ lower resources,
allowing them to exploit weaknesses and
gain access to larger business partners.
The cost of securing your organisation
should not be too high. For example,
organisations can activate firewalls on
computers and access points to the internet.
Run a reputable anti-virus product and
ensure it automatically updates on a
daily basis. Make sure you maintain
good passwords by activating two-
factor authentication for hosted software
services. Finally, you can remove
unused user accounts and ensure only
administrators have full administrative
access to computers.
Data Breach Reporting
If a data breach occurs, small businesses will
have to report it, but only if there is a risk
to individuals. If you suffer a data breach as a data
controller, it will be your legal responsibility
to report it directly to the ICO.
Failure to report may lead to increased
fines – up to 2 per cent of global turnover.
Although this is aimed at larger firms,
small business owners should be aware
that the ICO may issue fines for careless
or deliberate data breaches.
Conclusion
Organisations that process data have a
responsibility to process it securely, no
matter their size.
In today’s climate, data privacy has
become a basic human right. Customers
rightly believe their data should be treated
with care, and they may be quick to shun
a company seen to break that trust.
Therefore, we should think about how
we would like our own data to be handled
before we process others’.
As data protection comes to the
forefront for organisations, the growing
concern about what is being done with
personal data, and how safe it is, means
that all organisations have a responsibility
to ensure data is as safe as it can be and
is being processed fairly.
In other words, if you’re unsure whether
something is okay to process, you should ask
yourself whether it feels right, because if
it doesn’t, then it probably isn’t.
by Laura Edwards, editor of gdpr.report
www.dataprotectionworldforum.com LEON@dataprotectionworldforum.com LEON BROWN : 0203 515 3015
FOR MORE INFORMATION:
Data Protection World Forum is not to be missed – pre-register now
In an increasingly digital economy the issue of data protection and privacy is
becoming ever more important.
On 25 May of this year the EU General Data Protection Regulation
(GDPR) becomes enforceable, and the accompanying ePrivacy
Regulation (ePR) is currently working its way through the EU
legislative process.
2018 will see the biggest change in data protection and privacy
for two decades, and will have a global impact on companies of
all sizes, as well as public sector and government organisations,
charities, professional bodies and associations.
The inaugural Data Protection World Forum (DPWF) will be
held on 20-21 November 2018 at ExCeL London. It will feature
carefully curated content streams to allow delegates to focus on the
issues that matter the most to them and allow them to deep-dive
into specialised topics.
While GDPR will be one of the central topics covered at this
landmark event, the international line-up of speakers will also cover
burning issues such as the Internet of Things (IoT), cyber threat
protection, ransomware, Blockchain, AI, disaster recovery, biometric
technologies, surveillance, ad-tech and much, much more.
2018 will be a landmark year
for data protection and privacy
Important information about the Data Protection World Forum Conference
& Expo 2018
•	 20-21 November 2018, ExCeL London
•	 Pre-register at www.dataprotectionworldforum.com
•	 While the overall impact of GDPR will take centre stage,
the Data Protection World Forum will also highlight
key issues affecting the following industries: GDPR &
marketing, GDPR & HR, GDPR & financial services,
GDPR & retail
the 2018 speakers
•	 Jamie Bartlett: Author, journalist and researcher – Internet
cultures, cyber security and online privacy
•	 Rohit Talwar : CEO at Fast Future – The future of personal
privacy and data protection over the next ten years
•	 Pernille Tranberg : Founder of Data Ethics Consulting - A
look at the organisations working with data ethics
LEON@DATAPROTECTIONWORLDFORUM.COM
0203 515 3015
SPEAKERS INCLUDE:
WWW.DATAPROTECTIONWORLDFORUM.COM
20th & 21st November 2018, ExCeL London
MARLOES POMP, NIRVANA FARHADI, GARY HIBBERD, STEVE WRIGHT
2018 WILL BE A LANDMARK YEAR FOR DATA PROTECTION AND PRIVACY
Pre-register for the data protection event of the year here
The EU General Data Protection Regulation (GDPR)
comes into force in May 2018, and will impact every
organisation that trades with any EU country.
The new regulations will disrupt how organisations
store, manage and process data. GDPR is the biggest
shake-up in data protection for a generation.
Our half-day and full-day courses provide a practical
and efficient staff training programme, presenting
staff with the knowledge, understanding and
practical tools to implement and maintain GDPR
compliant processes.
Contact Leon for more details:
leon@dataprotection.media
0203 515 3015
Half Day : £2,000
Full Day : £3,500
Breaches, incidents
and security
Apple may have set a high bar for cybersecurity,
but not even this company is invulnerable.
Recently, it issued a second customer warning
for owners of iPhones, iPads and the Mac
that they were affected by a processor flaw
that could leave them vulnerable to hackers.
Apple said the security issues are known as
Meltdown and Spectre. It urged customers to
only download software from trusted sources.
The good news is that, at the time of writing,
there is no evidence that the security flaws
have been exploited by hackers. Apple has also
released software updates to iOS.
It just goes to show, even the world’s biggest
company and a firm that sets great store by the
way it treats customers’ privacy and security
is under constant threat. And indeed, its
share price took a hit. When it last revealed
a security flaw, in November 2017, its share
price fell by over two per cent.
Even before GDPR, security was a fundamental
issue; now it has become even more
imperative. In this section, we look at some
of the key issues.
Some key Articles in GDPR
Some of the key articles in GDPR relating to cyber security,
breaches and incidents are:
Article 33, which states: “In the case of a personal data breach, the
controller shall without undue delay and, where feasible, not
later than 72 hours after having become aware of it, notify the
personal data breach to the supervisory authority…”
It also states: “Where the notification … is not made within 72
hours, it shall be accompanied by reasons for the delay.”
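The 72-hour window in Article 33 is easy to express in code. A minimal sketch of the notification clock (the function names are illustrative, not drawn from any official source):

```python
from datetime import datetime, timedelta

# Article 33: notify the supervisory authority within 72 hours of
# becoming aware of the breach, where feasible.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware: datetime) -> datetime:
    """The latest moment the supervisory authority should be notified."""
    return became_aware + NOTIFICATION_WINDOW

def is_late(became_aware: datetime, notified: datetime) -> bool:
    """Late notifications must be accompanied by reasons for the delay."""
    return notified > notification_deadline(became_aware)
```

The point the sketch makes is that the clock starts at awareness, not at the breach itself, which is why breach-detection time matters so much in practice.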
Other key Articles include 32, which relates to security of
processing, Article 28, which focuses on the processor of data,
and among other things stipulates clauses that must go in a
contract with third party data processors, Article 30 which
relates to keeping records, and Article 34, the communication
of a breach.
Here, we look at tips to avoid a breach, dealing with third
parties, especially using the cloud, plus a guide to cyber security
– Everything You Need to Know About Cybersecurity – and the
Cardless Society. But first, we look at tips to avoid a breach.
Guide to avoid a breach and what to do when
it happens
At the recent GDPR Summit London, a panel discussion focused
on tips to avoid a breach.
Present were Abigail Dubiniecki from Henley Business School,
Piers Wilson from Huntsman, Dave Horton from One Trust,
Amanda Finch from IISP, Anthony Lee from DMH Stallard and
chaired by Ardi Kolah from Henley Business School.
128 days versus 72 hours
There is a gulf between what GDPR expects and the current reality.
Article 33 stipulates that a breach usually has to be reported
within 72 hours, yet Ardi Kolah noted that research has found the
average length of time it takes an organisation to discover it has
suffered a data breach is 128 days. So that is quite a gap, and it needs
closing, fast. So what tips did the panellists have?
The clock is ticking
It is simply no good waiting for a breach, and then agreeing how
you will react. All panellists agreed: preparation is the key.
“Don’t write a plan in the middle of a crisis”, summed up
Anthony Lee.
They also agreed, it’s not a question of if, it’s a question of when
a breach occurs. GDPR says organisations must have technical
and organisational measures in place. So that means having the
right technology, but also having procedures and staff training.
Staff must understand the data, procedures must be defined,
practices agreed, and everyone must be familiar with them. Abigail
Dubiniecki said that “staff must speak the same language” – not
literally, but that there must be a common understanding of
the terms used.
There was also an emphasis on the need for good people
interpreting technology and doing risk assessments, ensuring
that you have the information to answer questions such as: what
data was lost, what data was exposed, were passwords decrypted?
This means having tools and technology, but also audit trails. You
need records; otherwise, if you say “I don’t know”, the regulator
is likely to look less kindly upon you.
Dave Horton suggested that you need to identify the roles
within a business that will be important, while Amanda Finch
put emphasis on testing and learning from previous experiences.
Difference between a breach and an incident
The panel turned to the question of incident versus breach.
Article 29 Working Party guidance makes a distinction between
an incident and a breach. There are three types of breach,
shortened to the initials CIA:
•	 Confidentiality breach: some data has got out into the world.
•	 Integrity breach: for example, the data has been tampered
with. Under GDPR having accurate data is critical. Examples
given related to changing the grades of an exam or
modifying the signature on a major financial transaction.
•	 Availability breach: such as a ransomware attack.
An incident, by contrast, is where a security event has happened
but is not necessarily a personal data breach – a ransomware attack,
for example, where you have a robust backup so that it does
not cause any serious harm. But even if there is a breach, it
may not be reportable. This depends on the risk assessment
and the likelihood of risk to individuals. Then again, if a breach
occurs but is not reportable, it is still a learning opportunity.
A key point relates to having a defensible position. If a breach
occurs and you are subject to an investigation, the regulator is going
to look far more favourably on you if you have good processes.
Processor and supervisor
A lot of smaller companies do not process their data themselves.
Indeed, while many outsource data to the cloud, smaller firms
often outsource the people who manage that, with an outsourced
IT manager.
But GDPR requires that the processor reports a breach to
the data controller, and this should be built into the contract with
any company that has been outsourced to. Article 28 means that
you cannot agree a contract with a third party unless they provide
sufficient guarantees of compliance, including security provisions
and an obligation to inform you, without undue delay, if there has been a breach.
It is also essential that all data that is processed is subject to
an appropriate lawful basis under GDPR, such as consent or
legitimate interests.
And panellists also said: choose partners wisely. “If they say
‘GDPR what?’, you should probably look for another provider,”
suggested Abigail.
But this all leads us to the question of what to do when your
data is held by a third party, for example in the cloud. And for
this, we paid a visit to Anthony Lee, who told us more.
By Michael Baxter
PRE-REGISTER FOR DATA PROTECTION WORLD FORUM HERE
Let’s put our heads in the clouds
Let’s say you are storing some of your data in the cloud, or are
using a cloud-based electronic point of sale system. “Take a
retailer,” suggests Anthony, “by definition, you will have a lot of
personal data, because most of the contacts you have will be people
rather than organisations. GDPR will apply to those data sets.
“If you are responsible, as an organisation, for technical and
organisational measures in place to keep that data safe and
to make sure it is being processed in a way that is compliant
with the regulations, then if you choose to put that data with a
third party, you, as the data controller, are still responsible for
that data, and you will be held accountable by the regulator if
anything untoward happens to it.”
He says there are three key things to think about in relation to
third-party ecosystems:
1: Does the company in question have the right to share the
data with the third party in question? Is there a lawful basis?
For example, are they doing so pursuant to their legitimate
interests, or one of the other lawful bases, such as consent?
Processing personal data is defined very broadly: holding data,
storing data, retrieving data, accessing data, transmitting data,
sharing data – it is all a form of processing. To put your data with a
third party, you need to be able to establish that you have a lawful
basis for doing so. And if you don’t, then the sharing of that data
would be in breach of the regulation.
2: As an organisation, you need technical and organisational
measures in place to ensure that the data is being processed in a
way that is compliant with the regulation and the principles in it.
So you need the right technology to guard against external threats
such as hack attacks, and the right people who can handle the
data in a way that is compliant with the regulation. That alludes
to awareness raising, training, privacy policies and procedures,
and making sure that, right from the top of the company down to
operational people, you are handling data in an appropriate way.
Then, assuming you have satisfied yourself that you have a lawful
basis for sharing the data, you need to satisfy yourself that the
third party has sufficient technical and organisational measures in
place to keep that data safe.
So if you give the data to a third party that has very loose
security in place, you will be in breach of the regulation.
3: Then you need to extract a number of contractual promises
from that provider, which go way beyond the promises that you
need to extract under the current rules.
For example:
•	 You need the cloud service provider, which would be
considered, under the rules, to be a data processor, to agree
that they will only process the data you hand to them in
accordance with your written instructions.
•	 That they will do no more or no less.
•	 There is no discretion in relation to the data.
•	 Also: and this resonates with your obligation to satisfy
yourself that the service provider has got technical and
Interview with Anthony Lee
[privacy specialist]
on GDPR and third parties
Article 28 of GDPR states:
“Where processing is to be carried
out on behalf of a controller, the
controller shall use only processors
providing sufficient guarantees to
implement appropriate technical and
organisational measures in such a
manner that processing will meet
the requirements of this Regulation
and ensure the protection of the
rights of the data subject.”
Under GDPR, you are responsible for the data you collect and
control even when it is being processed by third parties. But
suppose the data goes in the cloud – what are the considerations?
We spoke to Anthony Lee, a privacy specialist and partner at the
law firm DMH Stallard.
BY MICHAEL BAXTER
organisational measures in place, to process the data in a
compliant way, you need to get a contractual promise from
them to that effect.
•	 Then it gets more detailed. For example, you, the controller,
would have to get a promise from the third party in question
that they will give you access to their facilities so that you
can audit them to satisfy yourself.
•	 You need to extract from them a promise that they will give
you the information you need to be able to demonstrate to
the regulator that not just you, but the third-party system
you are tapping into, is compliant. (This point relates to
the accountability principle in the regulation, not only
do you need to comply with the regulation, you need to
be able to demonstrate to the satisfaction of the regulator
that you comply, you need to have evidence that you can
show to the regulator that you keep appropriate records of
processing activities, and that you have robust training and
programmes and procedures to make sure your work force
are thinking privacy and acting in an appropriate way. But
in addition you have a requirement to ensure your service
provider provides you with information that you need to be
able satisfy the regulator that they too have those similar
measures in place.)
•	 The contract must say that on ‘expiry or termination of
contract, for whatever reason, whether it be convenience or
just expiry, the data processor – the cloud service provider,
will either delete or return the data.’
•	 If you are dealing with what is known as a software-as-a-service
provider – for example a cloud-based electronic point of sale
system – there is every likelihood that that supplier will be
tapping into Amazon, or Microsoft Cloud (Azure), or Google
Cloud. The regulation says that you, as the data controller, are
responsible for ensuring this software-as-a-service provider
also ensures their third parties apply the above requirements
– so all those clauses above have to flow down the third-party chain.
But saying and doing are not the same thing. How do you ensure
that, if the cloud provider is a tech giant, it abides by the above
requirements? There are additional issues if the data is processed
at a location outside the European Economic Area, such as on
a server in the US.
“My view is that they are going to have to come up with some
kind of self-contained approach to be able to demonstrate to the
controller community and to the data processor community that
they do have clauses in their contracts that are aligned to Article
28,” says Anthony.
The export of data
“If the data ends up outside of the European Economic Area,
then that counts as a form of export,” Anthony explains. Under
GDPR, there is a general prohibition of exporting data, unless it
is going to a country that has been designated by the European
Commission as having suitably robust privacy laws in place.
Such a designation does not apply to the US.
The US consideration
In the US, they have the Privacy Shield, which is voluntary.
Companies can register themselves, and you can export data to
a registered company without falling foul of the adequacy requirements
or the export prohibition, providing of course that you satisfy
all the other conditions. “If they are not signed up to the Privacy
Shield, there are ways around it. For example, you can get them
signed up to the Model Clauses approved by the EU Commission,
but this may change post 25 May.” The issue of dealing with the
US is nuanced.
Importance
The regulator has already singled this area out as one that
may carry the highest fines. But fines are only part of the
story: “The box of tricks at the regulator’s disposal includes the
right to stop data from being exported – to injunct. For example,
if you are a retailer, and your lifeblood is your access to your
cloud-based electronic point of sale system, and the regulator stops you
using it, fines may be the least of your problems.”
There is also the possibility of individual compensation.
Anthony warns: “No one in the UK has yet managed to extract
individual compensation, but the door is open slightly, as courts
have acknowledged that someone can be compensated not just
for a pecuniary loss arising from use of data – someone syphoning
money from your bank account, for example – but also for distress
caused as a result of being on the receiving end of inappropriate emails.”
Reporting a breach
“It is worth bearing in mind that under the current rules, if
there is a breach, such as a hack attack or accidental loss, you have
discretion as to whether to notify the regulator. But under the
new rules, you have to notify the regulator within 72 hours –
although requirements are typically softer if the data is encrypted.”
Furthermore, “if the person whose data has been lost is at risk,
you have to warn them. Where third parties are involved, you
have to make sure the third party is aware of a breach and
notifies you, so that you can meet that 72-hour requirement.
You can’t just bury your head in the sand. There has to be a
component of proactive monitoring. If you don’t find out about
a breach because your service provider, who should have known
about it, was half asleep and did not tell you, you are going to
be accountable to the regulator.”
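The 72-hour clock described above can be made concrete with a small sketch. This is illustrative only – the function names are my own, and real incident-response tooling is far richer – but it shows the arithmetic:

```python
from datetime import datetime, timedelta

# GDPR Article 33: notify the supervisory authority within 72 hours of
# becoming aware of a personal data breach. Names below are illustrative.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware_at):
    """Latest moment by which the regulator must be notified."""
    return became_aware_at + NOTIFICATION_WINDOW

def hours_remaining(became_aware_at, now):
    """Hours left on the clock; negative means the deadline has passed."""
    return (notification_deadline(became_aware_at) - now).total_seconds() / 3600

aware = datetime(2018, 5, 28, 9, 0)  # breach discovered Monday 09:00
print(notification_deadline(aware))                          # 2018-05-31 09:00:00
print(hours_remaining(aware, datetime(2018, 5, 29, 9, 0)))   # 48.0
```

The point the interviewee makes about third parties follows directly: if a processor sits on a breach for two days before telling you, most of that window is already gone.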
The supply side
But what if you are a third party – what can you do? “Bigger
organisations are cooking up long and detailed data protection
addendums, which have all manner of detail, all from the
controller-side perspective. They may have a sting in the tail
saying ‘if you mess up on any of these things, you the processor
will indemnify us, and your liability is unlimited.’
“A supplier can, however, seize the agenda with their own
clauses dealing with the Article 28 provisions, demonstrating that
they take it seriously, putting in a clause saying they will not
export the data outside of Europe without your consent, and then
agreeing the bases for doing so.”
Everything you need to know
about cybersecurity
Cybercrime is unavoidable in this day and
age. There’s nearly a 100 per cent chance
you know someone who has been the target of
an attack. If you own a business, your odds
of being targeted are at least one in two.
Yes, fifty per cent of all businesses – and
probably more – have been attacked. Here
Nathan Sykes takes a deeper look.
Don’t write that off as conjecture just
because you don’t know you’ve been attacked,
either. As we’ve seen with numerous recent
cyber incidents, the consequences of these
attacks often don’t become apparent until
long after they’ve been carried out.
Enforcing data protection and privacy laws
is still not an exact science, and that means
to keep your business safe, you’ve got to
be educated. GDPR makes this a requirement,
of course.
Money and Power
Make no mistake that the reason cybercrime is so popular is that
it’s lucrative. In the example of ransomware, which demands that
a user send payment — often in the form of cryptocurrency — the
model is straightforward. Scamming attacks like emails claiming
to be a long-lost relative in need of some cash work the same way.
Less obvious hacks are part of larger schemes that lead to
someone getting paid, but many of the attacks you hear about on
the news don’t have any direct financial impact on those affected.
For example, when Yahoo leaked three billion user accounts, the
media made a huge deal of it — but not one of those people lost
money as part of the hack.
Your company houses a wealth of sensitive information about
your employees at the very least. If you interact with customers,
their information is also housed on your servers or with the
storage service you choose to use. To a cybercriminal, your company
is a veritable gold mine – so what can you do to protect it?
Corporate Cybersecurity
You wouldn’t leave the door to your warehouse unlocked at the
end of the day, would you? And yet many businesses leave the
door to their data wide open. Were it a brick-and-mortar building,
you would install cameras, motion sensors and elaborate locks.
In the world of cybersecurity, the best strategy to adopt is a
similarly layered approach to protecting data.
Threat vectors are everywhere in a corporate setting. You have
an array of workstations continually communicating with outside
locations, email that you expect to receive from completely
unknown sources, and a workforce that cybercriminals will
leverage against you. That’s right. Your people are one of the
biggest threats to cybersecurity.
More than 60 per cent of companies allow employees to do
business on their personal devices. All that it takes for malicious
software to access your network is a single file shared from your
phone. Does your company allow outside flash drives or hard
disks to be brought in? Those are dangerous, too.
Layered Security
Replacing the cameras, sensors and locks in the cyber world
means installing comprehensive threat detection and network
protection technologies. Not long ago, the term would have been
“antivirus,” but with threats evolving from viruses to network-based
attacks, “network security” is a more accurate phrase in
today’s world.
A well-outfitted network protection suite today will include the
ability to scan for “conventional” threats in your environment,
as well as intelligent detection technology at the application and
network level. In simple terms, it looks for suspicious activity on
your computer’s file system and also in communications over
the internet.
In addition to this type of protection, you’ll want to stand up
a firewall – the component that blocks communications detected
as malicious as they come in from the internet. Email gets its
own scanning component, with technology that can find threats
in email-specific formats and protocols.
On top of these first and second-layer defence systems, state-of-
the-art solutions include a layer of forensics software that lets you
identify where the servers sending attacks are located. This can
be critical if your business is high-profile and you suspect that a
single attacker is launching multiple attempts to compromise your
system; this type of defence can help the authorities resolve the threat.
Next, encrypt all your files. This will make it impossible for
bad guys to make sense of your data if they do get access to it,
and it is an essential component of good security, particularly if
you’re housing information from the public.
Finally, you need a disaster recovery solution. Even the best
defences can be breached with enough time and effort. Your
environment should be backed up regularly, at least once a day,
with the ability to restore your work from the backups in a matter
of hours. Without this in place, you’re begging for a crisis. And a
big fine under GDPR.
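As a toy illustration of the back-up-and-restore cycle described above, the following sketch archives a folder and restores it using Python’s standard library. The paths and file names are invented for the example; a real disaster-recovery setup would add scheduling, off-site copies and integrity checks:

```python
import pathlib
import shutil
import tempfile

# Minimal backup-and-restore sketch using standard-library archiving.
# Directory and file names here are purely illustrative.
def back_up(data_dir, archive_base):
    """Archive data_dir into <archive_base>.zip and return the archive path."""
    return shutil.make_archive(archive_base, "zip", root_dir=data_dir)

def restore(archive_path, target_dir):
    """Unpack a backup archive into target_dir."""
    shutil.unpack_archive(archive_path, target_dir)

# Demonstration with a throwaway directory:
work = pathlib.Path(tempfile.mkdtemp())
src = work / "data"
src.mkdir()
(src / "customers.txt").write_text("alice\nbob\n")

archive = back_up(str(src), str(work / "backup-2018-05-25"))
restore(archive, str(work / "restored"))
print((work / "restored" / "customers.txt").read_text())  # the original contents
```

The restore path matters as much as the backup: being able to rebuild a working environment from the archive within hours is what turns a breach or ransomware incident from a crisis into an inconvenience.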
Social Engineering
Malicious actors have learned that, in many cases, people are
easier to hack than computers. This practice, called ‘social
engineering’, involves tricking one or more people on your team
into exposing access to sensitive data or introducing malicious
code. Once you’re compromised, it’s very difficult to wind back
the proverbial watch.
Education is the best way to combat this type of attack. Social
engineering relies on the ignorance of your employees, but you
can fight back by arming them with the knowledge to recognise
and avoid cyber-attacks. Companies like Symantec, Webroot and
ESET, offer cybersecurity awareness training programs that can
help reduce the risk of exposure for your office.
Fighting Back
To stop cybercrime from throwing our modern world into anarchy,
groups like the FBI and CIA in America, and Interpol, are teaming up
with businesses from the private sector to track down and punish
the people who perpetrate these attacks. One such operation,
started in 2009, recently saw the capture of Russia’s most famous
hacker, after bringing down a botnet named “Gameover Zeus.”
By reverse-engineering these attacks and the reconnaissance
that goes into them, investigators can begin to form patterns and
identify the people responsible. The internet is the real world, and
the punishment for these crimes is comparable to any other form
of theft or incursion.
However, cybercriminals know that if they plan and execute
an attack successfully, the authorities are at best in a reactive
position, and that means the bad guys still win much of the time.
The turning point in the war on cybercrime will come when we
have a deterrent. A way to raise the stakes for cybercriminals, so
that they think twice about what they’re doing. We might not see
it in our lifetime, but if we want to go on with our lives free from
constant cyber-threats, that is what it will take.
Nathan Sykes is a business technology writer
from Pittsburgh, PA.
The Cardless Society
Once upon a time, a world where your eyeballs
are scanned to get into a building, an implant
inside your body opens doors and switches on
the lights, and your whole life is contained in
a mini-computer measuring 6 x 12cm, was the
realm of sci-fi movies. Jason Choy, of the security
specialists Welcome Gate, discusses how the
latest advances in technology are transforming
his industry.
It has become the norm for our faces to be
scanned at passport control, our iPhones
to be opened by a fingerprint, and for us
to use a phone as our calendar, our fitness
tracker, our bank, our research resource
and our document store.
It is hardly surprising, then, that security
systems are increasingly being managed
via a phone, and that biometrics is being
used more and more to control access to
buildings. Taking it to the next level,
implants are growing in popularity, thanks
to the body-hacking movement.
There are advantages to each of these
that entrepreneurs and business owners
should consider next time they move
offices or upgrade their security system,
especially as GDPR will place greater
demands on recording and tracking how
this data is stored.
Smartphone security
Nowadays, we can’t be without our
phones - they have become an extension
of our physical, mental and social beings,
with some people using them for a third of
their waking hours.
It seems inevitable that the traditional
security fob to get into office buildings will
disappear over the next few years, and
access control will move onto our phones.
There are advantages:
•	 if someone picks up your phone,
they can’t get into your office, as it
needs to be activated by your PIN, a
fingerprint or facial recognition
•	 if people lose their phones, they report
it quickly (often people are reluctant
to tell their employers they’ve lost
their security fob)
•	 out of hours contractors can download
an app to get into a building, meaning
the property team don’t need to be
on-site to programme a card and
physically hand it over
•	 those managing your office security can
operate the whole thing without the need
to be on site, adding new users, checking
access cards, sorting out security
issues and even checking individual
CCTV cameras, which is particularly
useful for businesses with multiple offices
•	 it can create a seamless welcome
for your visitors, who can be sent a
personalised visitor pass in advance,
so their phone will activate entrance
barriers or even lifts, and their hosts
will be notified they have arrived, and
they can even choose their preferred
drink in advance.
Biometrics
Biometrics is the use of technology to
scan a part of the body to determine
your identity. There are four key types:
fingerprint, face, eye, and hand geometry.
It’s now being used for access control
in many modern offices, particularly
industries such as tech, financial services
and retail. There are three main benefits of
using a biometric security system:
1. Security and accuracy
Your body parts are much more difficult to
fake, lose or share, so biometric access is
highly accurate. This gives you reliable
attendance records for staff and contractors.
This accuracy is particularly attractive
for industries such as tech, defence or
pharmaceuticals. For these, stringent
security procedures – which can be
guided by standards such as ISO 27001
(Information Security Management) - are
essential to restrict access to areas such
as server rooms, and to protect sensitive
information, particularly if there is a
potential threat of terrorism, espionage or
extreme activism.
2. Cost
As with every new technology, when
biometrics was first introduced 20 years
ago, the cost made it prohibitive for most
buildings. Now it can be more cost-effective
than using card security, as you no longer
have to pay to replace lost access cards. This
cost can run into thousands by the time
temporary cards are reprogrammed and
issued, old cards are taken off the system
and new ones are bought.
3. Sophistication
The latest biometric systems are lightning
fast, with a whole range of different
technologies available and a bewildering
array of suppliers in the market.
They can be used for more than just
security. Facial recognition software
has great potential for marketing in the
leisure and retail industries, providing
valuable analysis of buyer behaviour and
demographic information. It can track the
demographics of visitors including length
of stay, age profile and gender.
Implants
Pet owners don’t think twice about micro-
chipping their animals and implants work
exactly the same way.
They take convenience to the next
level and are part of a new movement
called body hacking, where people are
using technology to enhance their lives,
implanting mini computer chips into their
bodies to eliminate the need for credit
cards, security fobs and phone apps.
This tech is being adopted by a few
companies for security, cashless vending,
photocopying and every other function
that you might historically need a card for.
As well as being convenient, it is also
highly secure and, some might argue,
environmentally friendly, as implants
minimise the volume of plastic waste
created by having physical cards for
everything.
The miniaturisation of technology has
made this very viable, and I predict it will
become increasingly mainstream.
Conclusion
The cardless society is fast approaching.
The advances in biometrics, in smartphone
apps and in implants are revolutionising
the way security is managed.
This is where organisations’ strict
adherence to GDPR becomes even more
important, as we consolidate technology,
making it less tangible and in cases like
implants, wholly intrusive. Organisations
have a higher responsibility to ensure that
the data they hold, handle and store is
carefully managed.
Jason Choy is the founder of the security
specialists Welcome Gate, which provides tailored
security packages for clients including EDF, Ford,
Ikea, British Transport Police, H&M and Bentley.
Subject access requests
By now, most people will have
heard about GDPR and the impending
fines organisations may face if
they have a data breach.
Aside from data leaks and hacks, another
way an organisation can breach the new
regulation is by not adhering to the new
rights of individuals. Under the GDPR,
individuals will benefit from increased
rights such as:
•	 The Right to be Informed: If requested,
organisations must state the purpose
for processing personal data, the
retention period for the personal data,
and who it will be shared with.
•	 Right to Rectification: A process will
need to be put in place to handle
requests to correct historical data.
•	 Right to Erasure: Also known as the
right to be forgotten, this allows people
to remove their data if they no longer
want an organisation to have it, unless
it is mandatory to keep.
•	 Right to Portability: Under Article 20,
organisations should ask whether they
have a facility in place for data
portability.
•	 Automated decision making: Individuals
can request that a human make their
decision rather than it being automated.
The Right of Access
Under GDPR, the ‘right of access’ means
that individuals have the right to access
their personal data and supplementary
information. This right also allows them
to be aware of and verify the lawfulness
of the processing of their data. This is also
known as Subject Access Requests (SARs).
SARs aren’t new – this right also existed
under the Data Protection Act. However,
under the new legislation, there are
changes that organisations must pay
attention to. This article will take a
look at the requirements of SARs and the
pitfalls for organisations dealing with
them.
Changes to SARs
Currently, under the Data Protection Act,
organisations can charge £10 for a Subject
Access Request, but under GDPR, the £10
charge will be revoked. This may open
up the floodgates for people to request to
find out organisations are doing with their
data. Small businesses may suffer from
the new requirements, as they may have
never received a SAR before and will not
know the process or have the time and
resource to deal with them if they become
inundated after 25 May.
Another change to SARs is the amount
of time organisations have to respond.
Under GDPR, organisations must respond
within 30 days, as opposed to the 40 days
under the DPA.
Under GDPR, if data subject rights are
violated, organisations can be fined up
to 4% of their annual global turnover or €20
million, whichever is greater. Therefore, it is
essential that organisations know exactly
how to respond correctly and in time.
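As a back-of-the-envelope illustration of that cap (a sketch, not legal advice, and the function name is my own):

```python
# Sketch of GDPR's fine ceiling for violations of data subject rights:
# up to 4% of annual global turnover or EUR 20 million, whichever is greater.
def max_gdpr_fine(annual_turnover_eur):
    """Upper bound of the fine, in euros."""
    return max(0.04 * annual_turnover_eur, 20_000_000.0)

print(max_gdpr_fine(100_000_000))    # 20000000.0 (4% is only 4m, so the 20m floor applies)
print(max_gdpr_fine(1_000_000_000))  # 40000000.0 (4% of 1bn exceeds the floor)
```

The "whichever is greater" wording means even small organisations face the €20 million ceiling, not a proportionally smaller one.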
The ICO receives around 15,000 complaints
a year, a number that is increasing year on
year. Of those complaints, 40% are SAR-related.
This is a significant number considering
there are eight separate data protection
principles.
The ICO is not clear on the exact reasons
why, but one may be that people are
becoming more aware of their rights and
are more willing to act on them. Over the
last few years, with media coverage of
data breaches and misuse of data, the
public has lost trust in organisations. In
fact, an ICO survey revealed that only
one-fifth of the UK public (20%) have
trust and confidence in companies and
organisations storing their personal
information.
In addition, under Article 57, the ICO
will have an obligation to promote GDPR,
including the data subject rights, so there
will be even more awareness. The changes
to SARs mean there is more pressure on
organisations, and many are now looking at
implementing processes for dealing with them.
With public awareness of data rights
rightly increasing, organisations must
ensure they take it seriously. Organisations
may not know how to deal with SARs
correctly. The ICO receives many complaints
regarding several issues with SARs. Here
are some of the complaints the ICO receives
and what organisations can do to ensure
they adhere to the legislation.
The response has arrived late
This is self-explanatory; however, as
mentioned, organisations may face fines
if they don’t respond in time. If you aren’t
able to respond within the 30 days, this may
expose other issues within the organisation.
For example, if an organisation doesn’t store
data correctly, it may need to spend time
on a data flow audit and data mapping.
A data controller didn’t realise
they had received a SAR
If an organisation is involved in a customer
or employee dispute, they may have long
letters or phone calls about wider issues.
However, a SAR may be concealed within
the correspondence.
This kind of issue highlights the
importance of staff training. Organisations
will have a shorter time to respond to
a SAR, and therefore, if staff receive one
via email, phone call or online, they should
know how to spot it and pass it on to the
relevant staff member or team.
If someone is unsure whether a request
is a SAR, they can quite simply ask the
individual to confirm, or check with the ICO.
Not extensive enough
If an individual receives a SAR response
but believes it is incomplete or that the
search wasn’t extensive enough, the ICO
would require the individual to go back
to the organisation to raise their concern,
and for the organisation to rectify the issue.
If the concern is then raised with the ICO,
it will review all the correspondence
with the individual to get the full picture
and the organisation’s understanding of
the SAR.
SARs with third-party data
A response may include third-party data,
which is information relating to other
people. Documents containing the personal
information of more than one person
should not automatically be disclosed
on submission of a subject access
request.
This can be more complex: in some
instances it is reasonable and appropriate
to provide third-party data, but sometimes
it is not. For example, CCTV footage
may disclose personal information about
another person, which may be a breach
of their own data protection rights. A
balancing exercise must be done to
determine whether disclosure is appropriate
in the circumstances. Organisations can
contact the ICO anonymously if they
are unsure.
The individual is ignored
Organisations may rightly contact an
individual informing them that they no
longer wish to correspond on a wider
issue, such as a customer complaint or
dispute as they may feel that it is no longer
constructive. However, if a customer has
requested a SAR within their wider issue,
the organisation is still obliged to respond
to the SAR. Organisations can make it
clear that they are only reopening further
communications to respond to the request.
Although organisations may rightly
decide they no longer want to correspond
if they don’t believe it to be constructive,
if an individual requests a SAR, they still
have a statutory right and the organisation
must respond.
What can organisations require
before responding?
Organisations may need some information
to establish the identity of the individual.
An organisation can ask for proof of
identity. Individuals may not be happy
with this request, as the organisation may
have spoken with them very recently and
know who they are. However, in most cases
it is required to ensure that the information
held is disclosed securely: before any
disclosure is made, establish the requester’s
ID and make sure the organisation knows
who the individual is before handing over
their information. The level of ID required
will depend on the information you expect
to disclose.
For instance, when someone has made
a SAR, if the organisation doesn’t hold
any sensitive data, it might ask for a
recent utility bill with name and address.
Yet if sensitive data will be disclosed, a
higher level of ID, such as a passport, may
be requested.
Organisations can also ask the requester
to help locate the data. For example, if
you have not heard from them before
and they are making a general request
for all information held on them, you
may only be required to search your
customer or complaints database.
Conclusion
Overall, the fundamental structure for
SARs under the GDPR will remain the
same as under the DPA. Nonetheless, the
changes may have more of an impact
on organisations and their resources
than expected. Therefore, it is essential
that organisations plan ahead and know
exactly where their data is stored, have
their staff on board, and have a process
in place to respond in time.
By Laura Edwards, editor of gdpr.report
Subject access requests checklist
Watch out for: Delay in receiving the
request. The clock starts ticking the
moment the SAR is sent, not the moment
the individuals responsible for overseeing
it receive it.
What to do: It is imperative that all
staff are trained and know who to
contact if they receive a SAR.
Watch out for: Delay in replying.
What to do: Ensure well-rehearsed
procedures are in place.
Watch out for: Not knowing where all
the data is, for example on removable
storage devices and in the cloud.
What to do: Data audits.
Watch out for: Third parties.
What to do: Look at agreements; ensure
third parties will respond to a request,
furnishing you with the appropriate
data in an appropriate time-frame.
Watch out for: Not realising a SAR has
been made.
What to do: Training of staff is
essential to avoid this.
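A checklist like the one above can be supported by something as simple as a shared SAR register that records when each request arrived and when the response falls due. A minimal sketch, with invented field names, using the 30-day window the article cites:

```python
from datetime import date, timedelta

SAR_RESPONSE_DAYS = 30  # response window cited in the article

class SarLog:
    """Minimal register of subject access requests and their due dates."""

    def __init__(self):
        self.requests = []

    def record(self, requester, received):
        """Log a SAR on the day it arrives; returns the response due date."""
        due = received + timedelta(days=SAR_RESPONSE_DAYS)
        self.requests.append({"requester": requester, "received": received, "due": due})
        return due

    def overdue(self, today):
        """Requests whose response deadline has already passed."""
        return [r for r in self.requests if r["due"] < today]

log = SarLog()
print(log.record("a.smith@example.com", date(2018, 5, 25)))  # 2018-06-24
print(len(log.overdue(date(2018, 7, 1))))                    # 1
```

The key design point, echoing the checklist: the `received` date must be the day the request arrived anywhere in the organisation, not the day it reached the team handling it.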
The wallflower of compliance grows thorns
Compliance Specialist and GDPR Lead Business Analyst at Morrisons,
Janna Banks, with her blue-sky thinking and years of data management,
has her own perspective to offer.
1998: an era in time when Snake was the only game on any ‘cool’ phone, the
internet could cause excitement by snail mail when a CD from old favourites
like Freeserve or CompuServe dropped onto the doormat, and paper
documents dominated the filing system of any business.
1998 was also the year the Data Protection Act was passed, and
an appreciation for data was born. Over the next two decades,
technology developed quicker than you can say ‘Silicon Valley’.
We saw ‘mobile’ phones big enough to warrant their own case
morph into a handheld device that can connect you to your
friends and family, capture those unmissable moments, update
all of your friends instantly and, if needed, run your business,
connecting you globally.
So, if this article is about GDPR, why are phones important?
Comparing those paper-driven environments to a business in
your hand brings into perspective how differently data is processed
in today’s world. Enter GDPR.
“If you keep your cool while all of those around you are losing
their heads…” There is a school of thought that GDPR is but
a hop, skip and jump from the DPA, so why has the practical
application of it caused such a fuss? My thought (and mine alone,
I may add) is that the DPA is the wallflower of compliance. For
years I have seen adverts showing that crime and fraud are being
cracked down on, HMRC needs you to send in your tax return,
and the FCA, the ombudsman, OFCOM and OFWAT are there to
support you… yet until recent weeks the ICO has not been much
heard of; well, not outside of the multiple investigations that hit
the news. So if the only news is bad news, does that mean there
really is no bad publicity?
“Uber says 2.7 million in UK were affected by security breach”
- The Guardian
“Carphone Warehouse fined £400,000 for putting millions of
customers’ data at risk” - Independent
“Email gaffe by Coventry University staff exposed 1,930
students’ details” - www.databreaches.net
With such headlines storming the news, it is no wonder that
fines have been issued to business giants left, right and centre.
Yet the lack of data protection continues and the breaches just
keep coming. It is no wonder that data security has become the
focus of Europe.
I ask, has technology moved at a pace that security has been
left behind? I hear what you say: surely security has developed
technologically too. So what is the heart of the issue?
I suspect that the day-to-day use of data has changed
so dramatically that many companies are finding that they
are not only having to bridge the hop, skip and jump to GDPR
compliance, but also review and consider how the DPA regulations
are currently being adhered to. I don’t doubt that the annual training
is being executed year in, year out; however, the headless chicken
act seems to be dominating the internet and social media, and this
leads me to believe that not much of the DPA is being upheld. So I
wonder if the laid-back attitudes being adopted signify that, despite
best efforts, the DPA is the wallflower of compliance.
In reality, if a company is happy to take whatever fine will be
issued, should I care? Well, ‘we’ certainly did in the early days, as
the breaches dominated the news: shock and horror ripped through
the newspapers and social media as outrage boiled the blood. The
more recent breaches have received a much different response – the
stories still grab the headlines, but the response from many is almost
nonchalant. Perhaps the wallflower does not stand on its own?
It is amazing that the theft of a physical object (whatever its
worth) can draw strong emotion, due to the feeling of violation that
resonates with the victim, yet data being illegally obtained – where
the risk and potential impact can be overwhelming to any individual
– often doesn’t. It is for this reason alone that I am convinced the
changes GDPR brings are undoubtedly for the best.
As Data Protection Impact Assessments (DPIAs) transition from
recommended to mandatory, the data subject (you and I) becomes
more empowered as our rights are extended, and the responsibility
of maintaining a process inventory lies with the businesses that
are using your data. The inventory and DPIA together will bring
a greater understanding and awareness of the way data is used by
the company, prompt the right questions to be asked, and ensure
security controls are rigorous. Delivering these changes will alter
the way data is seen throughout the day-to-day working of the
business, for the better. And for those that don’t want to step up to
the plate? Well, we can all be comforted that the fine cap has truly
been blown, so where warranted a maximum of 4 per cent of global
turnover could be issued. It seems the wallflower of compliance
has grown its thorns.
PRE-REGISTER FOR DATA PROTECTION WORLD FORUM HERE
www.dataprotectionworldforum.com
@DataProtectWF

Data Protection Magazine

  • 1.
  • 2.
    Published by DataProtection World Forum Editor: Michael Baxter, michael@dataprotectionworldforum.com Contributions from: Laura Edwards, Ardi Kolah, Adil Akkus, Deborah Dillon, Jenna Banks, Nathan Sykes, Jason Choy Design : Hannah Richards, Amplified Business Content DATA PROTECTION MAGAZINE Lawfulness of processing Before you go any further, take a note of this, it seems to be the most common point of discussion whenever the topic of GDPR comes up. Article 6 GDPR, Lawfulness of processing • Processing shall be lawful only if and to the extent that at least one of the following applies: • The data subject has given consent to the processing of his or her personal data for one or more specific purposes; • Processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract; • Processing is necessary for compliance with a legal obligation to which the controller is subject; • Processing is necessary in order to protect the vital interests of the data subject or of another natural person; • Processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller; • Processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child. • Point (f) of the first subparagraph shall not apply to processing carried out by public authorities in the performance of their tasks. a note from the editor Welcome to Data Protection Magazine. We are all data protection people now. Privacy and data protection has moved from an obscure corporate cubbyhole to the front pages of newspapers and it feels as if everyone has an opinion. 
The role of the data protection officer, or DPO, has been transformed. In the post GDPR era, DPOs don’t simply take instructions. They report to the highest level of management authority and are experienced in business continuity, risk and technology. Read the media, and you may come away with the impression that the DPO has gone from the person of last redress to rock star. In fact, the change is more nuanced than that. The role of the DPO under the GDPR is radically and legally different. But caring about data protection is not just a job for the DPO. It needs to run deep within a company and become a key consideration for senior management. The c-suite does not merely need to become data protection aware, the ladies and gentlemen who make up the higher echelons of corporate hierarchy need to breathe it. It needs to be embedded into their corporate mindset. As the editor of this publication, I too have been on a data protection and privacy journey. My background is as an economic and disruptive technology writer, but until a year or so ago, thinking deeply, or indeed narrowly, about privacy was barely on my radar. I had this notion that we may be entering an era which will see the end of secrets - that this is the inevitable consequence of creating transparency. But the truth is there is a dividing line. Transparency is good, but we do not want a transparent wall in our bathroom. Teenagers crave privacy from their parents. The mums and dads want transparency from their kids. Creating privacy in the age of transparency is one of the most important challenges of the digital economy. Data Protection Magazine is aimed at the business generalist rather than the data protection or privacy professional and will be released on a quarterly basis. Enjoy!
spring 2018 issue 1

contents

02. You haven't seen anything yet!
05. GDPR is good for you
06. Goodbye darkness, hello friend
10. GDPR and the United States
12. GDPR and how the millennial generation manage privacy
14. GDPR – a quick summary
20. Privacy by design and default
22. From Grim Reaper to hero
26. The Canadian experience
28. Interview with Abigail Dubiniecki
30. GDPR and the Fourth Industrial Revolution
34. AI, GDPR and the anonymisation of data
38. Selling your data
40. Anatomy of a breach in trust, the Facebook story
44. Blockchain and GDPR
46. The GDPR and small businesses
50. The Data Protection World Forum
52. Breaches, incidences and security
56. Let's put our head in the clouds
58. Everything you need to know about cybersecurity
60. The cardless society
62. Subject access requests
65. The wallflower of compliance grows thorns
The General Data Protection Regulation is here. We sat down with Nicola McKilligan-Regan, one of the UK's leading experts on data privacy, a woman who literally wrote the book on the Data Protection Act of 1998, to give us the benefit of her wisdom.

Nicola herself has been working in privacy for 20 years, she admits reluctantly. She began working at the Information Commissioner's Office (ICO) before it was called that. Instead, it went by the name of the Data Protection Registrar. She worked as a Strategic Policy and International Officer. In 2000, after the Data Protection Act 1998 came into force, she left to focus on helping organisations with the new law. In 2004, her book, A Pocket Guide to the Data Protection Act, was published. The ICO once said of the book that it was the sort of guidance it wished it had written.

Today, she is the Senior Partner at the Privacy Partnership, as well as the founder and CEO of Smart Privacy, which provides privacy tools for businesses struggling to implement GDPR. Between then and now, she has been the Vice President Privacy at Thomson Reuters, and has enjoyed a glittering career in data privacy. The latest version of her book, The Pocket Guide to the Data Protection Act, GDPR and DPA 2018, is being published later this year.

We began by talking about whether GDPR is over-hyped, like the millennium bug 18 years earlier, soon to be forgotten, just as May 25th 2018 itself will be relegated to being just another ticked-off date on the calendar.

Is GDPR like the millennium bug, the so-called Y2K?

Nicola: "No. We didn't know what effect the millennium bug would have, but before GDPR is even implemented, regulators have begun flexing their muscles." She poses a question.
"If GDPR is just like the millennium bug, why has Facebook moved the accounts of 1.5 billion users from Ireland to the US?"

"The new regulations give regulators a lot more scope to take a hard line against businesses that don't treat our personal data and privacy with the respect it deserves."

And then she sums it all up in one phrase: "You haven't seen anything yet!"

Why does GDPR matter?

We can all get lost in detail: GDPR is such a vast topic that it can be devilishly difficult to sum it up in a way that captures the imagination. We asked Nicola to do just that.

•	Why it's not like the millennium bug
•	Why does GDPR matter?
•	Is it good or bad?
•	Message for the CEO
•	Message for the DPO
•	First steps
•	GDPR and the lean start-up
Nicola: "It's the biggest change in data legislation in 18 years, as it brings data protection legislation up to speed with the internet age.

"The scale of the penalties indicates that governments now see protecting privacy as an issue as important as being able to stop people from lending money to criminals or selling arms to rogue states.

"For the first time, privacy regulators have really strong powers. The law now applies more directly to the latest technology and is supported by a compliance regime which is the equivalent of major compliance regimes such as AML - anti-money laundering.

"If privacy was like a poor second cousin to AML compliance, it's not like that now.

"It's about public interest. What governments think is important has changed from 18 years ago."

And is GDPR good or bad?

So, given this, we asked whether the regulation is good or bad. Nicola: "It's both. It was needed and I am a supporter of the general provisions. I am not so keen on the bureaucracy. It means a lot of record keeping, a lot of work for lawyers and consultants. But the trouble is, there is a danger that you end up paying less attention to the real risks to individuals - bureaucracy may act as a distraction from real issues."

Nicola cites Facebook as an example: "It has a compliant privacy policy, but whether it is fair in the way it processes its users' data is another matter."

A message for CEOs

What advice does Nicola have for the CEO? "It is no longer possible, as Mark Zuckerberg found out, to hide behind saying 'I didn't know.' Make sure you understand how privacy affects the business. CEOs are always busy, but they must find the time. If their organisation is large, they may have a General Counsel and a data protection officer. Pay attention to them.
CEOs who are not aware of the key issues will be hung out to dry."

A message for the Data Protection Officer

And what about the data protection officer, or an external advisor - how do you convince the board to take GDPR seriously?

"Make sure your advice is very specific to the business model. Make sure your leadership fully understands how GDPR affects the business.

"Remember, if you can stand out as an organisation that can solve GDPR problems or provide data privacy safeguards for customers and clients, that can be a commercial advantage. CEOs like to hear about that.

"If instead, you focus on the fines, you risk the CEO saying 'but, that's what we pay you to avoid.'"

In short, Nicola says, "engage the board on their level by focusing on the business advantage".

The DPO

As you read this magazine you will find our article 'From Grim Reaper to hero'. The role of the data protection officer has changed. But Nicola sees a challenge ahead: there is a shortage of people with the necessary skills. "To be a DPO you need to have experience to deal with practical issues - just sending someone on a course won't cut it."

She also foresees a conflict of interest problem. "You can't be the problem solver and advisor like you used to be. You are more like a policeman now. That means you are going to have to say 'no' a lot more."

The first steps

It's a bit late to be worrying about the first steps now. But Nicola does have some specific advice on your general approach to GDPR. "If you do nothing else, train your staff. If you have an informed workforce it will reduce your risk."

The lean start-up

Nicola also has some advice for the start-up, maybe a company applying the ideas of a lean start-up. "Elizabeth Denham, the UK's Information Commissioner, has said that one of her big focuses is supporting small businesses.

"I would say to a small business, talk to the regulator, either on a no-name basis or get written advice.

"The ICO has specialist teams in different sectors.
And going to the ICO is cheaper than going to a lawyer.

"My advice to a lean start-up talking to the ICO is to provide it with as much information as possible. If you get written advice, keep it; if you get advice from an ICO help-line, take accurate notes - the name of the person you spoke to and the date and time of the call.

"But you need to know enough about GDPR to recognise you have a problem."

Three take-homes

Finally, we asked her to sum it all up and tell us the three most important things about GDPR.

First, she talked about consent. "I have always had a problem with consent being used as grounds for processing information, because people were not really given a choice. Now companies can no longer say 'yes we have consent' if they make it a condition of providing a service. And that is a big change. Now consent has to be totally free and informed."

Second, she turned to legitimate interests. Consent and legitimate interests are two of the legal bases for processing data under GDPR. "Organisations can no longer tick a box, they have to document their argument for why their legitimate interests are not a threat to individuals' privacy. So, they have to conduct a balancing test. You can't just process information because it is a good commercial idea, you have to make sure it is balanced against the rights of individuals. And you have to document it, prove it and have a response if you are challenged."

Thirdly, "what you now see is a confident regulator, who has the power to take on large companies and is not afraid to act. Regulators now feel that they have the tools to act against Big Tech.

"GDPR is like a gun aimed at the temple of the big technology companies. Other smaller businesses are caught in the crossfire."

The rest is history

The build-up to GDPR is over. That is now history. Now the hard work is ongoing. In producing this magazine, we have spoken to experts on GDPR and data privacy, many of whom have been speakers at our GDPR Summits.
We have more in the diary. And don't forget, the Data Protection World Forum this November, where many of the experts we have been talking to for this publication, including Nicola herself, will be speaking.

By MICHAEL BAXTER
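Nicola's second take-home - that organisations must document the legitimate-interests balancing test and "have a response if you are challenged" - lends itself to a simple record structure. The sketch below is purely illustrative: GDPR requires that the assessment be documented, but it does not prescribe these (or any) field names, which are our own invention.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LegitimateInterestsAssessment:
    """One documented balancing test. The field names are illustrative only."""
    purpose: str                    # the interest pursued by the controller
    necessity: str                  # why the processing is needed for that purpose
    risks_to_individuals: list      # impacts on data subjects that were considered
    safeguards: list                # mitigations that tip the balance
    conclusion: str                 # the documented outcome of the balancing test
    assessed_on: date = field(default_factory=date.today)

# A record like this is what gives you "a response if you are challenged".
lia = LegitimateInterestsAssessment(
    purpose="Postal marketing offers to existing customers",
    necessity="No less intrusive way to reach lapsed customers",
    risks_to_individuals=["unwanted contact", "profiling concerns"],
    safeguards=["opt-out honoured immediately", "no data about children used"],
    conclusion="Interests not overridden by individuals' rights; proceed",
)
print(lia.conclusion)
```

The point of the structure is not the code but the discipline: every processing decision based on legitimate interests gets a dated, retrievable record of the reasoning.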
GDPR IS GOOD FOR YOU

The world's largest company boasts a privacy policy that is a joy to read. What's good for the customer is good for the business that serves that customer, and GDPR is good for both.

Let's face it, privacy policies are not always fun to peruse. If someone wrote a book called "The world's riveting privacy policies", it might be thick enough to act as a door stop, but only in a dolls' house. Yet Apple, the company whose iTunes terms and conditions make GDPR read like romantic fiction, takes a pretty impressive stab at it.

Maybe it is a coincidence that the company with such a privacy policy is also valued at $845 billion. But then there is no doubt that many millions of users choose its phones for the perception of privacy and data security. Maybe it is unfair to describe the company thus; Samsung will undoubtedly claim its own policies are just as robust, and maybe it would be right to do so. But perception is everything, and Apple is perceived in a certain way - this is at least one of the reasons why it is so successful and valuable.

But to get to the nub of the matter, cast your mind back to 1949, when a book was published whose first sentence should be etched onto the minds of data professionals: "It was a bright cold day in April and the clocks were striking thirteen." So began George Orwell's 1984, a dystopian vision which gave the world such memes as Big Brother, Room 101 and Doublespeak. In the vision, privacy is no more. Not even our thoughts belong to us.
Thirty-nine years later, Apple announced its new computer to the world with a now famous ad featuring an Orwellian world: "On January 24th, Apple Computer will introduce Macintosh and you will see why 1984 won't be like '1984.'"

Fifteen years after that, Scott McNealy, the then chief executive of Sun Microsystems, said: "Privacy is dead, get over it", and famously in 2011, Mark Zuckerberg himself said: "People have gotten really comfortable, not only sharing more information and different kinds, but more openly and with more people." He suggested that such lack of privacy has become a "social norm". Zuckerberg is often quoted as saying privacy is dead. But there is no evidence he actually said that.

Soon after he uttered those words about lack of privacy becoming a social norm, a spokesperson for Facebook said his remarks were "mischaracterised", and added: "A core part of Facebook's mission has always been to deliver the tools that empower people with control over their information." She continued: "If the assertion is that anything Mark chooses to make private is inconsistent with his remarks last week, here are a few other hypocritical elements of his life: he hides his credit card numbers in his wallet, he does not post the passwords to his online accounts, and he closes the door behind him when he goes to the toilet."

Yet, look at how the Facebook share price slid after the revelations of its deal with Cambridge Analytica were exposed by The Guardian and The New York Times; look at the value of Apple. That is why there are big bucks in data protection, and it is why GDPR is not merely a good thing - it is good news for the world.

If, as The Economist suggested last year, big data is really the new oil, and not the new asbestos, as cynics might suggest, then that counts for nothing if people don't trust companies with their data.
A recent survey by ForgeRock found that 57 per cent of UK consumers are worried that they have shared too much personal data online. Fifty-three per cent said they would not be comfortable if their personal data was shared without their permission, and 58 per cent said they would stop using a company's services completely if it shared data without permission.

If data is the means by which the digital economy can create wealth and give the so-called Fourth Industrial Revolution the impetus it needs, then without trust - without the public's confidence that their privacy is respected - the data revolution will stutter and then collapse.

That is why we need GDPR. Not as a piece of regulation to scare companies, not through some sense of anti-Orwellian sentiment - nor do we want regulators strutting the privacy stage like thought police, punishing the slightest transgression with fines that bankrupt well-meaning companies. But without trust, the data revolution may not happen - and that would be a disaster.

PS: This article was written before Apple announced a new upgrade to its iOS operating system and OS for the Mac, designed to offer improved privacy. It seems that Apple sees data privacy as a core strength, a feature that can give its products an edge.
Goodbye darkness, hello friend

By Michael Baxter

GDPR is the general regulation, but contrary to many media reports, other rules and directives such as MiFID II, Open Banking, the ePrivacy Regulation and the Directive on Security of Network and Information Systems do not muddy the water; in fact, they fit together.

People too easily forget that the financial crisis of 2008 was so severe that at one point it seemed as if capitalism itself was tottering. The truth is that, in the build-up to that crisis, if you listened to the noise coming out of regulators and august bodies such as the IMF, then, to borrow words from Simon and Garfunkel, you heard the sound of silence.

The famous singing duo also said: "Hello darkness my old friend", but the new digital single market and other regulations slowly being unveiled by the EU do not create a new barrier to business, as some might say; rather, they are designed to bring "down barriers to online opportunity" - or so says the European Commission. New regulations are also about transparency and, especially in the case of GDPR, responsibility and accountability. Above all, they are also about creating trust.

Take MiFID II - Markets in Financial Instruments Directive version 2. The media, always keen to spot a scandal, say that the new financial instruments directive is not compatible with GDPR - that the transparency required by MiFID II is at odds with the privacy that GDPR is designed to support. But this is simply not true. The detail does not reveal the devil; it reveals light.

MiFID II is designed to force financial organisations to act in the best interests of their clients, and not, as happened prior to 2008, occasionally bet against clients, or recommend products simply because they pay a higher commission. So, for example, out goes the practice of giving away free research as an inducement to trade via an organisation - instead, research must either be charged for or given away to all who want it, regardless of whether they are customers.
High frequency trades must be time-stamped in an effort to put an end to the practice of ordering a trade in an attempt to manipulate the market, and then cancelling the order before it is executed.

But perhaps most significantly of all, MiFID II requires the holding of data concerning staff's trading activities. It does this by saying that for every trade that your bank or broker makes on your behalf, there must be a detailed record. It must be possible to pull up all the data and information that goes with the trade, regardless of the medium - for example, a recorded phone call with someone giving instructions. Or it could be a text, an email, anything that relates to that trade - creating a paper trail, so there is a clear record showing what went into the decision to make that trade and whether it was in the best interests of the customer. So MiFID II is all about empowering the consumer to make the right choices.

But some argue that they have spotted a contradiction with GDPR. MiFID II requires the processing of data related to individuals; GDPR imposes rules on the processing of personal data. But as Abigail Dubiniecki, Associate at Henley Business School's GDPR Transition Programme and specialist in data privacy at My Inhouse Lawyer, says: "There may seem to be a contradiction between MiFID II and GDPR but that's not the case. Indeed, the FCA (Financial Conduct Authority) and ICO (Information Commissioner's Office) have gone on record saying they are not incompatible, because under article six of GDPR,
one of the lawful bases for processing personal data is legal obligation - you have a legal obligation, under MiFID II, to suck up all that data related to that trade."

Of course, it is nuanced. As Ms Dubiniecki added: "Could you suck up all that data and keep it forever, or use it for something other than the purpose for which you gathered it? Absolutely not. Under GDPR you have limited retention periods; you are not allowed to hold onto it for longer than is necessary for the purpose for which you got it, under the legal basis you used."

Returning to the Digital Single Market, another new area is Open Banking, or PSD2. The idea here is simple enough, and at least in part the regulation has been created from one of the lessons of the 2008 crash: the risk of 'too big to fail' banks. It's about data portability for the financial services sector. It is saying: 'I should be able to take my personal data related to the banking that I have been doing with a particular bank, in a machine-readable format that is common to everyone else, and plug it in somewhere else to see if I can get better rates, or a better service.'

Some fintechs may provide more information to you. Under Open Banking, it will be possible to take your banking data and, via an app, be provided with a detailed online tool on your spending, updated in real time, projecting how much money you will have at the end of the month, and searching for better deals for you on financial products.

But will the consumer have sufficient trust in how her or his data is processed? If the consumer is reluctant to let a third party take control of their data, then PSD2 will not achieve its desired outcome.
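Abigail Dubiniecki's point about retention - data gathered under a legal obligation cannot be kept beyond the period that purpose justifies - can be sketched in a few lines. The periods below are invented for illustration; GDPR does not set fixed retention periods, it only requires that each one be justified by the purpose for which the data was collected.

```python
from datetime import datetime, timedelta

# Illustrative retention schedule. The five-year figure is a made-up
# example pegged to MiFID II record-keeping; the two-year marketing
# figure is equally invented - neither number comes from the regulation.
RETENTION_POLICY = {
    "mifid_trade_record": timedelta(days=5 * 365),
    "marketing_contact": timedelta(days=2 * 365),
}

def past_retention(purpose: str, collected_at: datetime, now: datetime) -> bool:
    """True once data held for this purpose has outlived its justified period."""
    return now - collected_at > RETENTION_POLICY[purpose]

# Marketing data collected in January 2015 has, by May 2018, outlived
# its two-year period and should be deleted or anonymised.
print(past_retention("marketing_contact", datetime(2015, 1, 1), datetime(2018, 5, 25)))
```

The design point is that retention is keyed to purpose, not to the data itself: the same record may be deletable for marketing while still lawfully retained under a MiFID II obligation.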
There are lots of safeguards built into PSD2 - rules about how data is used - but perception matters: the barriers to online opportunities will not be overcome unless this trust can be earned. That is why GDPR is so vital. The General Data Protection Regulation is what its name suggests: general. And as Abigail Dubiniecki said: "PSD2 does provide protections, but GDPR means you can only use the data for the purpose for which it was given. So if they share it with someone else, or try and sell another service, that is a no-no, unless they find a legitimate interest or they get my consent or make use of one of the other lawful bases."

Open Banking is a fundamental part of creating a digital market that serves the best interests of customers, but without trust it won't get the take-up. GDPR is vital for creating that trust.

Another area to watch is the new ePrivacy Regulation. At the moment, GDPR and the Privacy and Electronic Communications Regulations, or PECR, work hand in hand. A marketer, for example, seeking a legal basis for emailing customers may feel that the requirements for consent under GDPR are too onerous, and instead choose legitimate interest as their legal basis for processing personal data. GDPR is clear; Recital 47 states it in black and white: "The processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest." It is just that PECR imposes additional considerations. For one thing, it requires that in most cases, people have to give consent to receive emails, but, as John Mitchison at the Direct Marketing Association pointed out in a discussion with us: "There is a line that refers to soft opt-in. So if you have collected someone's email in the course of doing business, and there is an opt-out option, you can send them emails."

PECR is soon to be replaced by the new ePrivacy Regulation.
We are not sure what the final regulation will say yet, but there is one thing we can say for sure: it will dovetail with GDPR; the regulations will support each other and provide more light.

Finally, there is the Directive on Security of Network and
Information Systems (NIS Directive). EU countries have until 25 May 2018 to transpose this into national law. GDPR is unambiguous on the subject of security, and what to do in the event of a data breach. Article 33 states: "In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay." GDPR then goes into further detail on cybersecurity.

The NIS Directive's objective is to "increase cooperation between member states and lay down security obligations for operators of essential services and digital service providers." While GDPR focuses on cybersecurity in the context of data privacy, the NIS Directive focuses on the sharing of information on cybersecurity threats, and on improving security safeguards. The two measures are once again designed to dovetail - and if there is any superficial indication that the sharing of information required by the NIS Directive is at odds with GDPR's requirement for privacy, such a contradiction breaks down when you consider that one of the lawful bases for processing personal data under GDPR is legal obligation. Indeed, the NIS Directive specifically states that any sharing of information about incidents must be consistent with GDPR when personal data is involved.

The digital single market, big data, and the opportunity it all brings can create wealth and prosperity - but only if it can illuminate where currently there is darkness, bring transparency where there is a lack of visibility, and trust where there is suspicion.
GDPR, together with a raft of other measures, is designed to bring lightness and make digital technology and data the customers' friend.
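The 72-hour clock in Article 33 can be expressed in a few lines of code. This is a minimal sketch: the function name and example timestamps are ours, and real incident-response tooling involves far more than a subtraction - but the deadline logic itself really is this simple.

```python
from datetime import datetime, timedelta

# Article 33: notify the supervisory authority without undue delay and,
# where feasible, within 72 hours of becoming aware of the breach; a
# later notification must be accompanied by reasons for the delay.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_check(aware_at: datetime, notified_at: datetime) -> str:
    """Classify a breach notification against the 72-hour window."""
    if notified_at - aware_at <= NOTIFICATION_WINDOW:
        return "within 72 hours"
    return "late - must be accompanied by reasons for the delay"

# Aware Friday 09:00, notified the following Monday 10:00: 73 hours elapsed.
print(notification_check(datetime(2018, 5, 25, 9, 0), datetime(2018, 5, 28, 10, 0)))
```

Note that the clock runs from becoming aware of the breach, not from when it occurred - which is why detection capability matters as much as notification process.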
GDPR IS A JOURNEY, NOT A DESTINATION

THE UK'S LEADING GDPR EVENT SERIES - GDPRSUMMIT.LONDON
£50 OFF - BOOK NOW - PROMO CODE: DPMAG

Attendees will:
•	Understand what GDPR means for their business
•	Hear from the leading GDPR experts
•	Benefit from interactive in-depth sessions
•	Get answers to their burning questions
•	Gain an actionable roadmap to compliance and beyond
GDPR and the United States

By Michael Baxter

GDPR is an EU measure, but the principles that lie behind the regulation are gaining traction around the world.

George Soros, the billionaire investor and speculator, who famously bet against the Bank of England in 1992 and won, is an arch critic of the way big tech companies such as Alphabet and Facebook use data. Maybe it has something to do with Soros' roots - growing up in Hungary under Nazi occupation. "Social media companies are inducing people to give up their autonomy," he said. "The power to shape people's attention is increasingly concentrated in the hands of a few companies, threatening the concept of 'the freedom of mind.'"

For that reason, he is a big supporter of GDPR. "Europe has much stronger privacy and data-protection laws than America," he said when speaking at the World Economic Forum in Davos earlier this year. He continued: "Commissioner Vestager (European Commissioner for Competition) is the champion of the European approach. It took the EU seven years to build a case against Google, but as a result of her success, the process has been greatly accelerated. Due to her proselytizing, the European approach has begun to affect attitudes in the United States as well."

Yet there is a perception that the US is too far behind. Commenting on the ongoing issues with Facebook, Vera Jourova, EU Commissioner for Justice, Consumers and Gender Equality, recently told the BBC: "I can tell you that now we have much stronger arguments to say that we did the right thing to adopt the General Data
Protection Regulation... So there will be a much stricter regulatory framework in the EU... But I have just come back from Washington and when I look at the American legislation, I don't see such a robust legislative framework, so it will be interesting to look at the United States to see whether they will come up, in the future, with stricter rules."

Could this be changing? "The Americans have famously over time been the people who will give away data all day long," says Eve Maler, the VP of Innovation & Emerging Technology at ForgeRock. But, speaking to us from Seattle, she said that "Americans have become sensitised and more cynical."

A recent survey from The Economist Intelligence Unit (EIU), sponsored by ForgeRock, which describes itself as a digital identity management company, found that Americans are the most concerned about misuse of data; the survey included people from Europe and Asia Pacific as well as the US. Eve Maler speculates that, in what she calls 'the post, post-Snowden era', people are beginning to have strategies around their privacy rights. She suggests that following various breaches - United Healthcare and Equifax, for example - American attitudes have shifted.

Vera Jourova said: "I have a very strong feeling that the tiger got out of the cage." Referring to the recent furore over Facebook, she continued: "Something that is serious happened. That will have opened all of our eyes. I see that American people are more relaxed about their privacy and they are not looking to impose such strong pressure as the Europeans. This strong pressure from Europeans resulted in the stricter rules coming into force in May (GDPR).
And I expect something like that in the United States."

What happens in the US regarding data privacy is relevant to companies trying to comply with GDPR, not least because under GDPR, a data controller is responsible for the personal data it has collected, even when it is processed by third parties, including third parties not based in the EU. See this in the context of uploading data onto the cloud: as far as GDPR is concerned, when a company that falls under the jurisdiction of GDPR uploads data into the cloud, and that data is held on servers in the US - maybe owned by Amazon, Microsoft or Alphabet - then that data is regarded as exported. Or so Anthony Lee, a privacy lawyer and Partner at DMH Stallard, told us.

Mr Lee explained: "There is a general prohibition on personal data being exported out of the European Economic Area, unless it is going to a country which has been designated by the EU Commission as having suitably robust privacy laws in place. And, for example, the US is not one of those countries."

The data privacy story in the US weaves. There used to be Safe Harbour, until the Austrian lawyer and privacy activist Max Schrems, with half a mind on the Edward Snowden revelations, took on Facebook. Safe Harbour was a voluntary scheme that had been set up by the US government in conjunction with the EU Commission; it meant that if a company was a member, you could tick the adequacy box, as required under GDPR for the export of data.
Schrems objected to the way US state agencies could, as Anthony Lee put it, "essentially help themselves to data under the Patriot Act - creating the impression of mass surveillance." Mr Schrems took the matter up with the Irish Data Protection Commissioner, who dismissed his case as "frivolous and vexatious." Schrems was not finished, however, taking the case to the Irish High Court, which referred the matter to the European Court of Justice. The court sided with Schrems, ruling that the European Commission's Safe Harbour decision was invalid - henceforth, 'you can't use it'. The eventual result was the Privacy Shield, the voluntary system currently in place.

According to Anthony Lee, under Obama there was an understanding that there would be restraints on the extent to which state agencies could help themselves to data. The Trump administration, however, has made it clear that US rules are there to protect US residents and not European citizens.

Sylvia Kingsmill is a Partner at KPMG, heading up digital privacy and compliance nationally for the firm. Canada, and in particular Ontario, is the home of privacy by design and default, a fundamental part of GDPR, and Ms Kingsmill worked with the Ontario privacy commissioner in the early days of privacy by design. She says: "The problem with the US is that they do not have a national omnibus data protection law that regulates data privacy, but they do have a very heavy-handed regulator, the Federal Trade Commission, which exercises its powers under section five of the Federal Trade Commission Act, for any practices that are misleading and deceptive. Privacy falls under that umbrella.
That's where the FTC flexes its muscles, against tech giants like Facebook and Google, but they do need a national standard, so there is a uniform approach."

However, Ms Kingsmill says that sectors like healthcare and banking have been regulated for some time, and "I know that the US gets a lot of flak because the tech giants aren't regulated enough, but certain sectors take privacy very seriously."

Another important element in US data privacy relates to class action lawsuits, which are far more prevalent in the US. "Facebook could be the game changer, though," suggests Ms Kingsmill.

And that takes us to how the US, with its own particular priorities, and the EU, via GDPR, view data privacy. The differences are clear. Maybe in the EU, the experience of the World War II period strikes a stronger resonance. It may be no coincidence that within the EU, the most vociferous defender of data privacy, and how it relates to human rights, is Germany. But if attitudes in the US, as the ForgeRock-sponsored EIU survey suggests, are changing, then the mood may be ripe for a GDPR-type framework to be applied there too. Maybe the eruption of media attention relating to Facebook and data privacy will be the catalyst.
Expand your knowledge – pre-register for Data Protection World Forum today
Millennials are different, or so they say. The generation born in the decade or two before the end of the last century, or so we keep hearing, have a different way of thinking. Take their attitude towards personal data. A study by the USC Annenberg Center for the Digital Future and Bovitz Inc found that 56 per cent of the millennial generation would be willing to share their location with a nearby company in return for a relevant coupon or promotional deal. By contrast, only 42 per cent of users aged 35 and older agreed they would share their location. Jeffrey I. Cole, the director of the USC Annenberg Center for the Digital Future, said: “Millennials recognise that giving up some of their privacy online can provide benefits to them. This demonstrates a major shift in online behaviour.”

But is it really like that? The millennial generation and their juniors, Generation Z, are what they call digital natives - they are digitally savvy. But how many kids do you know who share only minimal information on Facebook with their parents? Don’t they know precisely how to change privacy settings, determining who can see certain posts?

Abigail Dubiniecki, a data privacy specialist lawyer, recalls “being at a conference where people were talking about 14-year-olds using Instagram, setting up multiple user profiles, deleting stuff when it no longer supports the narrative they put forward to friends. Whether it is cheetah photos or skateboarding, they are actively developing their brands on social media.” She says: “I think this is far more natural to them than to us; I did not grow up being seen by the whole world. If something embarrassing happened at high school, maybe most of the people at the school would know about it, but not someone a couple of blocks away, or half an hour away. But they have found very creative ways to control their own narrative.” It’s what Dr Ann Cavoukian calls informational self-determination.
It’s about ‘when I want to share, I do, when I don’t want to, I don’t. If I share it’s for this purpose not that’. Abigail says: “That’s what privacy is.”

Valerie Steeves, Professor of Criminology at the University of Ottawa, conducted a study into this very area. It seems that privacy considerations come almost as second nature to this generation. In the study, she found that participants engaged in a number of different strategies to manage their privacy. For example, a small number of photos were kept entirely private, while priority was given to efforts aimed at controlling who sees particular photos and preventing them from being spread to unintended audiences. Some topics were seen as private and not appropriate to share. Snapchat and Instagram were used to ensure audiences saw particular photos. Snapchat was seen as a platform for sharing with friends; Instagram was seen as a platform for building a persona and thus required more careful curation. Some created multiple accounts to limit which audiences saw which content, for example, ‘spamming friends using one Instagram with less formal images,’ but these accounts only have a very small number of followers, such as close friends. Snapchat has a feature that tells users if a screenshot is taken of one of their photos, and this was cited by some as an important feature. More than half expected friends to ask permission before posting an image with them in it. There was a lot of emphasis on retrospective consent, asking peers to delete photos they didn’t want shared. Data privacy seems to be something that is implicitly understood by this generation.

GDPR and how the millennial generation manage privacy
By Michael Baxter

Dr Ann Cavoukian refers to informational self-determination. It’s about ‘when I want to share, I do, when I don’t want to, I don’t. If I share it’s for this purpose not that’.
GDPR: A QUICK SUMMARY BY ARDI KOLAH, Executive Fellow and Director of the GDPR Transition Programme at Henley Business School
Introduction

GDPR requires a re-boot in our thinking about data protection, privacy and security for the digital age. It represents the biggest shake-up in European data protection and privacy laws for over two decades. The genesis of GDPR is the 2012 proposal by the European Commission for a modern legal framework for the world’s largest digital single market of 500 million consumers. This was more about lowering the barriers to market entry for new entrants, increasing competition and choice of products and services for consumers, and stimulating economic activity and employment across the European Economic Area (EEA).

Since that time, the pendulum has swung in the other direction and there’s now an obsessive focus on data protection, privacy and security in the wake of an exponential increase in cybercrime and the misuse of personal data on a global scale. Although this is extremely important, it has clouded judgment on seeing GDPR as a significant opportunity, not a regulatory threat, for companies and organisations. The open competition aspects of GDPR remain in place, and the European Commission is actively leading the effort to encourage companies and organisations to create a deeper level of digital trust in order to do more, not less, with personal data.

Fully enforceable across all 28 EU Member States from 25 May 2018, GDPR aims to deliver a high degree of consistency, certainty and harmonisation in the application of data protection, privacy and security laws across the EU, replacing the Data Protection Directive 95/46/EC and Member State legislation such as the Data Protection Act 1998 in the UK. Within this new landscape, there’s limited ‘wriggle room’ for Member States to pass laws that impact the processing of personal data seen only through the lens of national self-interest.
Many commentators have pointed to this as evidence of a lack of harmonisation of data protection, privacy and security laws applying across the EU, given the differences in the way some aspects of GDPR will work on a country-by-country basis. However, the reality is that such differences are largely confined to a relatively small number of operational areas for companies and organisations within the EU. Many countries, including the UK, have passed their own data protection laws in alignment with GDPR, and globally this trend is gaining momentum. In the case of the UK, certain standards under GDPR are being raised under Member State laws. From a commercial perspective, a company or organisation that conducts cross-border personal data processing will be primarily regulated by the local supervisory authority in the jurisdiction in which it has its main establishment.

Data protection principles

GDPR retains the core principles of the Data Protection Directive 95/46/EC, but has beefed them up. The core rules may look familiar to experienced privacy practitioners and senior managers, but this is a trap for the unwary, as there are many important new obligations as well as a tougher regime of sanctions and fines for getting this wrong. There are seven data protection principles, and the data controller and data processor must ensure that they comply with all of them:

1. Lawfulness, fairness and transparency
Personal data must be processed lawfully, fairly and in a transparent manner.

2. Purpose limitation
Personal data must be collected for specified, explicit and legitimate purposes and not further processed in a manner that’s incompatible with those purposes (with exceptions for public interest, scientific, historical or statistical purposes).

GDPR is complex, right? Even if you feel you know the basics, the Regulation can be confusing.
Here, GDPR is summarised by Ardi Kolah, Director of the GDPR Transition Programme at Henley Business School and Editor-in-Chief of the Journal of Data Protection and Privacy. This summary comes from Ardi’s book, The GDPR Handbook, which is available on Amazon from 3 June 2018. Make sure you don’t miss issue 2 – subscribe here for free
3. Data minimisation
Personal data must be adequate, relevant and limited to what’s necessary in relation to the purposes for which they are processed.

4. Accuracy
Personal data must be accurate and, where necessary, kept up to date. Inaccurate personal data should be corrected or deleted.

5. Retention
Personal data should be kept in an identifiable format for no longer than is necessary (with exceptions for public interest, scientific, historical or statistical purposes).

6. Integrity and confidentiality
Personal data should be kept secure.

7. Accountability
The data controller should be able to demonstrate, and in some cases verify, compliance with GDPR.

What to do now? Double check that all policies, processes and procedures are in place and that these deliver the seven data protection principles. Ensure that the board supports company and organisation-wide awareness and training programmes, which should be short and informative (not boring!), and that all of this is recordable and logged.

Security of processing

GDPR requires the data controller and the data processor to keep personal data secure. This obligation is expressed in general terms but does indicate that some enhanced measures, such as encryption and pseudonymisation, may be required. The data controller must report data breaches to their supervisory authority within 72 hours of discovering this has happened, unless the breach is unlikely to result in a risk to the rights and freedoms of individuals.

What to do now? Undertake a full review of technical security measures that are appropriate for the type of personal data processing being carried out at the data controller and data processor. Seek expert guidance and support.

Data Protection Officer (DPO)

A data controller, joint data controller and a data processor may be required to appoint a data protection officer (DPO). This depends on what processing of personal data is being carried out.
Certain private and most public-sector organisations will be required to appoint a DPO to oversee their data processing operations. A DPO will be required where: the processing is carried out by a public authority or body; the core activities of the data controller or data processor consist of processing which requires regular and systematic monitoring of data subjects on a large scale; the core activities consist of processing special categories of personal data on a large scale; or where this is required by Member State law.

The DPO must be involved in all data protection and security issues and can’t be dismissed or penalised for performing their duties and responsibilities. The DPO must report directly to the highest level of management within the company or organisation, but doesn’t have to physically report to the CEO. The report they write must be considered by the board.

What to do now? Ensure that a suitably senior manager within the company or organisation has been identified and can be trained independently to fulfil the duties and responsibilities of the DPO. They need to be adequately resourced, otherwise this in itself is a breach of GDPR. Other alternatives include using a consultant as a DPO or an outsourced DPO service.

Data Controller and pan-European data breach notification obligations

The data controller is responsible for deciding the purposes and means of the processing of personal data and can be a private company or organisation, a charitable or voluntary body, operate in the public sector or be a government department. The responsibility and liability of the data controller for any processing of personal data, or that done on its behalf by the data processor, needs to be established.
The data controller is obliged to implement appropriate and effective organisational and technical measures to mitigate very high risks of processing personal data that could cause harm or damage to data subjects. It’s important that the data controller is able to demonstrate compliance of personal data processing activities with GDPR. Those organisational and technical measures should take account of the nature, scope, context and purposes of the personal data processing and the risks to the rights and freedoms of natural persons.

GDPR imposes stricter obligations with respect to data security and a specific data breach notification process. In the event of a personal data breach that’s likely to result in a high risk to the rights and freedoms of data subjects, the data controller must notify the supervisory authority within 72 hours of becoming aware of the breach and, where appropriate, the affected data subjects. Notice needs to be given “without undue delay” and must contain prescribed information.

What to do now? Prepare for personal data breaches and practise the way in which the company and organisation will respond to the real thing.
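The 72-hour window described above is, mechanically, a simple deadline calculation from the moment of awareness. This is a minimal sketch - the function names and timestamps are illustrative, not from any GDPR tooling - of how a breach-response team might track it:

```python
from datetime import datetime, timedelta, timezone

# GDPR gives the controller 72 hours from becoming aware of a breach
# to notify the supervisory authority.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(aware_at: datetime) -> datetime:
    """Latest time by which the supervisory authority must be notified."""
    return aware_at + NOTIFICATION_WINDOW

def hours_remaining(aware_at: datetime, now: datetime) -> float:
    """Hours left before the deadline (negative if already overdue)."""
    return (notification_deadline(aware_at) - now).total_seconds() / 3600

# Example: breach discovered on GDPR day one, 09:00 UTC.
aware = datetime(2018, 5, 25, 9, 0, tzinfo=timezone.utc)
print(notification_deadline(aware))  # 2018-05-28 09:00:00+00:00
```

Using timezone-aware timestamps matters here: a deadline computed in local wall-clock time can drift across daylight-saving changes.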
Double check policies, processes and procedures are fit for purpose, and establish how the company and organisation will react and whom to notify, including the DPO. Implement staff training on what to do in such instances, as well as how to reduce the risk of incidents and personal data breaches occurring in the first place.

Extra-territorial reach

GDPR primarily applies to companies and organisations established in the EU. It will also apply to businesses based outside the EU that offer goods and services to, or monitor the behaviour of, individuals in the EU.

What to do now? Businesses based outside of the EU will need to appoint a representative in the EU, subject to certain limited exemptions. The representative may have to accept liability for breaches of GDPR and can only act under specific instructions from the data controller. The representative effectively has all of the downsides but no corresponding upsides, so it’s a difficult relationship from the start.

Cross-border data transfer rules

GDPR prohibits the transfer of personal data outside the EU unless certain conditions are met. Those conditions are broadly the same as those under the Data Protection Directive 95/46/EC, but GDPR adds new ones, such as certification mechanisms and codes of conduct, as well as a new, very limited derogation for occasional transfers based on legitimate interest. Country-specific authorisation processes will no longer be needed (with some exceptions) and Binding Corporate Rules (BCRs) are formally recognised in GDPR.

What to do now? Double check that there’s an up-to-date inventory of cross-border data flows; check the strategy in the light of any decisions from the European Commission, the Court of Justice of the European Union and the EU-US Privacy Shield (if appropriate).

Data mapping

The data controller and data processor must maintain an up-to-date record of processing activities. Under GDPR, detailed information must be kept and could be inspected.

What to do now?
The DPO must establish and document what personal data the data controller actually holds, any intentions to transfer this personal data, and how such data flows internally through the company and organisation.

Data Processor obligations

GDPR expands the list of provisions that data controllers must include in their contracts with data processors. Some aspects of GDPR are directly applicable to processors. This will be a major change for some suppliers who have avoided direct regulation under the Data Protection Directive 95/46/EC by setting themselves up as data processors. Processors will be jointly and severally liable with the relevant controller for compensation claims by individuals.

What to do now? Ensure that the new obligations under GDPR are understood and observed. GDPR imposes compliance obligations directly on the data processor, such as implementing security measures, notifying the data controller of personal data breaches, appointing a DPO (if applicable), maintaining records of processing activities, etc. The data processor will be directly liable in case of non-compliance and may be subject to direct enforcement action. The data controller and data processor will be required to enter into detailed data processing agreements or renegotiate existing ones.

Data Subject’s rights

GDPR largely preserves the existing rights of individuals to access their own personal data, rectify inaccurate data and challenge automated decisions about them. The Regulation also retains the right to object to direct marketing. There are also potentially significant new rights for individuals, including the “right to be forgotten” and the right to data portability. The new rights are complex and it isn’t clear how they will operate in practice. In addition, the data controller will also be required to provide significantly more information to data subjects about their personal data processing activities.

What to do now?
The data controller must implement appropriate processes and infrastructure to be able to address data subjects’ rights and requests. It must also ensure that an adequate Data Privacy Notice is provided prior to commencement or, where the personal information comes from a third party, within one month.

Quality of consent

Obtaining consent from an individual is just one way to justify processing their personal data; there are other justifications. It will be much harder for the data controller to obtain a valid consent under GDPR. Individuals can also withdraw their consent at any time. As under the Data Protection Directive 95/46/EC, consent to process special (sensitive) personal data must be explicitly given. Consent to transfer personal data outside the Union must now also be explicit.

What to do now? Consent is retained as a processing condition, but GDPR is more prescriptive than Directive 95/46/EC when it comes to the conditions for obtaining valid consent, so this needs to be very carefully understood. The key change is that consent will require a statement or clear affirmative action of the data subject. Silence, pre-ticked boxes
and inactivity will not be sufficient. GDPR clarifies cases where consent won’t be freely given (e.g. no genuine choice to refuse, or a clear imbalance between the data subject and the data controller, such as in the workplace). Data subjects must be informed of their right to withdraw consent.

One-Stop Shop

GDPR applies to the processing of personal data by data controllers and data processors established in the EU, as well as by data controllers and data processors outside the EU where their processing activities relate to the offering of goods or services (even for free) to data subjects within the EU, or to the monitoring of their behaviour. The supervisory authority in the jurisdiction of the main or single establishment of the data controller/data processor will be the lead authority for cross-border processing (subject to derogations).

What to do now? Assess whether – as a non-EU data controller or data processor – processing activities involving EU data subjects fall within the scope of GDPR. This means determining where the ‘main establishment’ is located, based on data processing activities.

Profiling and profiling-based decision making

GDPR includes a wide range of existing and new rights for data subjects. Amongst these are the right to data portability (the right to obtain a copy of one’s personal data from the controller and have it transferred to another controller), the right to erasure (or ‘right to be forgotten’), the right to restriction of processing, and the right to object to certain processing activities (profiling) and to automated processing decisions. The data controller will also be required to provide significantly more information to data subjects about its processing activities.

What to do now? Data subjects have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them.
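The consent requirements in the ‘Quality of consent’ section above - a clear affirmative action, tied to a specific purpose, withdrawable at any time - translate naturally into a record-keeping structure. This is a hypothetical sketch; the class and field names are invented for illustration and come from no GDPR library:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One affirmative consent act, for one subject and one specific purpose."""
    subject_id: str
    purpose: str                          # the specific purpose consented to
    given_at: datetime                    # when the affirmative action happened
    withdrawn_at: Optional[datetime] = None

    def is_valid(self) -> bool:
        # Consent counts only while it has not been withdrawn.
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Withdrawal is possible at any time; keep the timestamp as evidence.
        self.withdrawn_at = datetime.now(timezone.utc)

consent = ConsentRecord("subject-42", "email newsletter",
                        datetime.now(timezone.utc))
consent.withdraw()    # subject changes their mind; record it, don't delete it
```

Note that a record is created only when the subject acts - there is deliberately no way to construct a ‘valid by default’ consent, mirroring the ban on pre-ticked boxes.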
Individuals will also have an express right to ‘opt out’ of profiling and automated processing in a wide range of situations. If the data controller is engaging in profiling activities, then it must consider the best way to implement suitable safeguards.

Data Protection by Design and by Default

These concepts are codified in GDPR and require the data controller to ensure that individuals’ privacy is considered from the outset of each new processing operation, product, service or application, and that, by default, only the minimum amount of personal data necessary for specific purposes is collected and processed.

What to do now? Implement measures, such as pseudonymisation or data minimisation, designed to implement the data protection principles from the outset of any project.

Data Protection Impact Assessment ‘Lite’ and DPIA

The data controller will be required to perform a Data Protection Impact Assessment (DPIA) where the processing of personal data (particularly when using new technologies) is likely to result in a high risk to the rights and freedoms of individuals. At Henley Business School we conceived of DPIA ‘Lite’, which is a helicopter view of what is being done that’s compliant, what’s currently being done that isn’t compliant, and what the data controller needs to consider starting to do in order to be compliant with GDPR.

What to do now? DPIAs will particularly be required in cases of: an evaluation of personal aspects based on automated data processing, including profiling; processing on a large scale of special categories of personal data; and systematic monitoring of a publicly accessible area. Make DPIAs part of standard procedures for all personal data processing operations so that they are easier to implement as an everyday task. The DPO should ensure that staff have been trained to carry out a DPIA, and these need to be documented very carefully.
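To make the pseudonymisation measure mentioned above concrete, here is a hedged sketch using only Python’s standard library: a direct identifier is replaced with a keyed token. The key and function name are invented for this illustration; the essential point is that the key must be stored separately from the pseudonymised data, otherwise re-identification is trivial and the measure adds little.

```python
import hashlib
import hmac

# Illustrative only: in practice the key would live in a separate key store.
SECRET_KEY = b"keep-me-somewhere-else"

def pseudonymise(identifier: str) -> str:
    """Deterministic keyed token for an identifier (HMAC-SHA256).

    Deterministic so the same person maps to the same token across
    datasets, allowing analysis without exposing the raw identifier.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

token = pseudonymise("jane.doe@example.com")
print(token[:16])  # same input always yields the same token prefix
```

Pseudonymised data is still personal data under GDPR (the controller can reverse the mapping with the key), so this reduces risk rather than removing the data from scope.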
Sanctions and fines

There is a step change in sanctions and fines: supervisory authorities can impose fines of up to 4% of annual worldwide turnover or €20 million, whichever is greater. There are also other powers at their disposal, including audit rights, the power to issue warnings, and the power to impose a temporary or permanent ban on the processing of personal data (known as a ‘STOP order’).

What to do now? Understand, and keep under review, the type of personal data being processed, whether it is very high or high risk, and what mitigation measures are necessary to reduce this to a residual risk that doesn’t cause harm or damage to data subjects; and record the organisational and technical measures deployed within the company and organisation.
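The ‘whichever is greater’ rule above is simple arithmetic, but it is worth seeing how the two thresholds interact: the €20 million floor binds for smaller companies, while the 4% figure takes over once turnover exceeds €500 million. A sketch (function name illustrative):

```python
# Maximum fine under the higher GDPR tier: the greater of EUR 20m
# or 4% of annual worldwide turnover.
FLOOR_EUR = 20_000_000
TURNOVER_RATE = 0.04

def max_fine(annual_worldwide_turnover_eur: float) -> float:
    return max(FLOOR_EUR, TURNOVER_RATE * annual_worldwide_turnover_eur)

print(max_fine(100_000_000))    # turnover EUR 100m: the EUR 20m floor applies
print(max_fine(1_000_000_000))  # turnover EUR 1bn: 4% (EUR 40m) is greater
```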
The EU General Data Protection Regulation (GDPR) comes into force in May 2018, and will impact every organisation that trades with any EU country. The new regulations will disrupt how organisations store, manage and process data. GDPR is the biggest shake-up in data protection for a generation. Our half-day and full-day courses provide a practical and efficient staff training programme, presenting staff with the knowledge, understanding and practical tools to implement and maintain GDPR-compliant processes. Contact Leon for more details: leon@dataprotection.media 0203 515 3015 Half Day: £2,000 Full Day: £3,500
Privacy by design and default

Where it began: Dr Ann Cavoukian is passionate about data privacy. Appointed Information and Privacy Commissioner of Ontario, Canada in 1997, today she is the Distinguished Expert-in-Residence leading the Privacy by Design Centre of Excellence at Ryerson University, Canada. And when it comes to privacy by design and default, she is the boss, the governor - the person who brought the concept to the world. Privacy by design and default is a key part of GDPR, maybe the single most vital part - it means privacy is part of the foundation of a new product or service, an integral part, not a bolt-on extra.

And when Dr Cavoukian talks on the subject, her passion is palpable - and contagious. Back in 2016, she expanded on the idea at the SAI Computing Conference, London. You can see a video of her talk on the Ryerson website. It’s worth watching. And when you hear her story, you begin to get it. Privacy - and by extension, because it is so essential to it, privacy by design and default - is about freedom, a human right.

The good doctor herself has a personal tale which may point to why this concept is so important to her. She is from an Armenian family and was born in Egypt - her parents fled the country in the 1950s in the dead of night. They had lived charmed lives, but still they left. Why? She asked her parents that very question. “We moved for you, for your freedom,” they said.

But then maybe a similar story can be applied to the German people as a whole. Today they have a phrase to describe it: Germans call it informationelle Selbstbestimmung - informational self-determination. It was recognised in German constitutional law in 1983, and Dr Cavoukian says “Germany is now the leading privacy and data protection country in the world.” But there is a myth about privacy - it does not mean secrecy.

Not zero sum

Some people fear that, in a healthcare setting, data privacy could hold back progress, cost safety and lives.
Data is helping the fight against disease - it was once estimated that if the data gathered by the British NHS was captured and monetised, it would create an organisation worth over a trillion dollars. But privacy does not have to come at the expense of the benefits of data. It’s not a zero-sum game - privacy does not mean less of another benefit; rather, privacy can enhance, create a positive sum. Dr Cavoukian says: “Privacy can work with safety, with data analytics, marketing - it’s not an either/or, you can have both. Privacy does not equal secrecy; it’s about personal control, which belongs to you.” Privacy does not equal secrecy. It’s about freedom - a human right.

By Michael Baxter

PRE-REGISTER FOR DATA PROTECTION WORLD FORUM HERE
Adil Akkus has 24 years’ experience in IT and business change delivery. He holds a B.Sc. degree in Computer Science as well as PRINCE2, MSP and MoP Practitioner certifications. He has successfully implemented a wide variety of multi-site and multi-million-pound programmes for industry leaders such as Mercer, Sky, IBM, Virgin Media, BT, AT&T Wireless and T-Mobile. With his CIPP/E privacy and data protection certification, he has set up and implemented three multi-national GDPR programmes. Here he introduces privacy by design.

Equifax

In September 2017, Equifax hit the news headlines with reports suggesting that over 140 million people’s personal data had been breached. It transpired that Equifax had failed to install, in a timely manner, a security patch for a tool used to build web applications. Some proactive action, and securing data at the design stage, would have saved Equifax a lot of money and bad press.

Facebook, WhatsApp data sharing for ad targeting

After Facebook purchased the WhatsApp messaging platform, it set out to change WhatsApp’s terms and conditions to allow data sharing with Facebook for ad targeting purposes. The news raised a public uproar. Even though most of us use both apps separately, a significant percentage of users changed their settings to prevent this. In March 2018, the ICO convinced WhatsApp to sign a public commitment not to share personal data with Facebook until data protection concerns are addressed. Setting users’ privacy to the strictest mode (i.e. privacy by default) would surely have prevented all this hassle.

Conclusion

With the undeniable influence of GDPR, the public and the regulators expect “privacy by design” to prevent such high-profile cases and to increase public confidence in the organisations that hold their data.
Until recently, data protection teams had to fight hard to convince the product, service and technology teams to pay attention to privacy concerns. GDPR’s emphasis on “privacy by design” will transform our ways of working and thinking.

Background

Over recent years, we have witnessed some of the highest-profile data breaches and incidents, from Yahoo to Uber, from Equifax to TalkTalk. Some of these breaches remained undetected, unreported or unknown for various reasons. A good proportion of these incidents could have been prevented, or handled in a much less costly and much less distressing way, by using the privacy by design principles.

Even though privacy by design has become very popular over the last few years, thanks to GDPR, it has been around since the 90s. Privacy by design argues that the future of privacy cannot be assured solely by compliance with legislation and regulatory frameworks; privacy assurance must become an organisation’s default mode of operation. Dr Ann Cavoukian, who is hailed as the ‘Godmother of Privacy by Design,’ set out the foundations across seven principles.

The 7 Foundational Principles of Privacy by Design
By Adil Akkus

1. Proactive not Reactive; Preventative not Remedial
This principle encourages proactive rather than reactive measures. It requires organisations to anticipate risks and prevent privacy incidents before they occur.

2. Privacy as the Default Setting (referred to as Privacy by Default)
This principle promotes maximum privacy protection as a baseline. It requires any new service or product to use the strictest privacy settings by default, without any manual input from the end user. Some privacy experts argue that GDPR’s push for “explicit” consent can be traced back to this principle.

3. Privacy Embedded into Design
This principle demands that privacy measures not be bolted on as add-ons after the fact. It drives privacy to become an essential component of the core service, product or functionality that is being delivered. As a result, privacy becomes an integral part of the product or service, instead of a functionality trade-off.

4. Full Functionality – Positive-Sum, Not Zero-Sum
This principle encourages creating a privacy-by-design culture where privacy policies drive rather than hinder growth and revenue.

5. End-to-End Security – Full Lifecycle Protection
Having embedded privacy by design into the systems in the previous principles, this one requires the data to be protected cradle to grave; wherever it goes, from the time it is created and shared until it is finally archived.

6. Visibility and Transparency – Keep it Open
This principle is all about building consumer trust. Information about your privacy practices should be out in the open and written in plain language.

7. Respect for User Privacy – Keep it User-Centric
This principle puts consumers in charge of their data. It requires users to be given the ability to update and correct any data held about them. It also requires that only the users themselves can grant and revoke access to their data.
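The ‘Privacy as the Default Setting’ principle described above translates directly into code. This is a hypothetical sketch - the class and field names are invented for illustration - of a new account that starts in the strictest mode, with no manual input needed from the user; any sharing is a deliberate opt-in afterwards:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Settings for a new account: strictest values are the defaults."""
    profile_public: bool = False      # profile is private by default
    location_sharing: bool = False    # no location sharing by default
    ad_personalisation: bool = False  # no ad targeting by default

# A brand-new user gets maximum privacy without touching anything.
settings = PrivacySettings()

# Sharing only ever happens as an explicit, affirmative opt-in.
settings.location_sharing = True
```

Encoding the defaults in the type itself, rather than in scattered configuration, means every code path that creates an account inherits the privacy-protective baseline automatically.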
Don’t miss our next issue – subscribe for free here

“You can do more with the data because people already know; they are not going to be shocked or annoyed, they are going to share more, they are going to share good quality accurate data because they will see that there is a benefit to it. And they trust you.” And that is the essence of privacy by design and default.
Grim Reaper to hero

How the data protection officer has become an enabler - and is no longer seen as a block to new ideas.

Once, the data protection officer was the last port of call. So you had an idea for a product; you researched it, tested it, tweaked it, honed it some more; the PR people were ready, the advertising agency was briefed, the design was just so - and then it was compliance, the only barrier. To some it felt like the kiss of death to their brilliance. Deborah Dillon, a privacy protection officer of considerable experience, says that is how it used to feel to her. Today, she is the Data Privacy Lead at Atos. Before that, she worked for a ‘large bank.’ “Data privacy,” she recalls, “was like an afterthought. There had been no consideration of data development at all, until they came to me needing the information governance. And I often had to put the dampeners on projects.”

It is not like that now, of course, and privacy by design is one of the main reasons, at least for Deborah. There’s a growing realisation that privacy by design is a key part of a product or service. And the data protection officer, by being the person who is integral to that, has been transformed. “I used to feel like I was seen as a grim reaper; now I am seen as an enabler,” says Deborah. Maybe she is being too modest. In the era of digital transformation and big data, the data protection officer is a hero, the one whose insights can help create trust between data subject and controller or processor.

Abigail Dubiniecki, Associate at Henley Business School’s GDPR Transition Programme and specialist in data privacy at My Inhouse Lawyer, says that GDPR, and privacy by design in particular, also present an opportunity to do some housekeeping. It’s a kind of nudge. It is in our own interests to wear seat belts when driving, but until it became compulsory - the law - many of us ignored it. How many lives were saved when the law was changed?
In fact, the Royal Society for the Prevention of Accidents has estimated that 50,000 lives have been saved in the UK alone since the law making it compulsory to wear seat belts in the front of a car was introduced in the country some 30 years ago. Indeed, behavioural economists have a word for it: ‘nudge’, advanced by Cass Sunstein and the Nobel Prize-winning Richard Thaler - the idea that quite subtle influences can steer us to act in our own best interests. Maybe GDPR and privacy by design are like that. Companies are run by people - and some tasks are seen as more fun than others. Efficiently managing data, ensuring it’s not kept unnecessarily eating up valuable storage space, may not feel sexy, but it’s still important. Or for that matter, since data is valuable, knowing what you have is vital. As Ms Dubiniecki put it: “People are not leveraging their data because they don’t know what they have. So they are not unlocking the value from
it, they are just hoarding it, and putting it into lakes because they are hoping that some day they will find something in that closet or lake - so GDPR is forcing you to know exactly what you have, which gives you an opportunity to really harvest it.” But privacy by design and default is about creating trust. Ms Dubiniecki explained: “You can do more with the data because people already know, they are not going to be shocked or annoyed, they are going to share more, they are going to share good quality, accurate data, because they will see that there is a benefit to it. And they trust you.” And that is the essence of privacy by design and default. When she was working at the ‘large bank’, Deborah Dillon says, “I brought in privacy by design, in anticipation of GDPR, working with system developers and product teams, and building it in right at the start of a product. And it worked so well, and data was captured so well, that systems developers were talking to each other about the datasets they needed to collect to apply privacy principles.” She says it was so refreshing to hear, “really turning data privacy on its head. I am bringing it in with clients now; it’s great for compliance and transparency.” If you can get privacy right, rather than leaving it as an afterthought, a necessary hurdle to overcome, it can enhance a product, make it more appealing to the customer - a reason to buy your product or service. She recalls an occasion, at the bank, when she managed to build data protection into a Christmas competition in a way that made it more interesting. It involved a Christmas tree in every branch - the idea was for customers to nominate a neighbour for an award, as someone who had done a lot for the local community, and put their nomination on the tree. But the privacy implications were enormous - and potentially damaging to the bank’s reputation with customers. Deborah explained: “Privacy by design is not just the big systems.
In this particular case, by using personal privacy as an advantage, we made the identity of the person nominated anonymous: instead, they were described without revealing who they were, and the person who nominated them put their own email address into the competition. In the event their nominee won, the nominator was to be told, and would in turn inform the winner.” Digital transformation: the missing link. Digital transformation has become the big buzzword of the moment - a lot of companies are looking to transform their businesses - they may have a good budget, but, as Deborah says: “They have to get their house in order first; for me, GDPR comes before digital transformation.” She says that if they know data is accurate and relevant, in the right place at the right time, then when they are looking at digital transformation, the process can become a whole lot more effective. She gives as an example uploading data onto the cloud: if documents are classified, then highly sensitive information and data stays in the private cloud, while internal information might go into the public cloud. “So for me, GDPR is the missing link to get to companies before they do digital transformation.” Atos is the European leader in the payment industry. It focuses on big data services, cybersecurity, the digital workplace, cloud services, infrastructure & data management, business & platform solutions, as well as transactional services through Worldline. Given what it does, it may come as no surprise to learn that Deborah says: “At Atos, privacy by design is a must for our customers.” The benefit. The headlines focus on the fines associated with GDPR, the potentially heavy cost of not complying. But for many large companies a fine, even one that could be four per cent of turnover, is just another problem - and not necessarily even one of the more serious challenges.
Maybe that will change when the first fines are handed out, but at the moment there is a perception that regulators across the world are thinly stretched, have multiple responsibilities, and that in practice fines will be rare - a risk to a company for sure, but maybe not a great one. The latest revelations concerning Facebook add another dimension. The damage to the company’s reputation, and the fall in the share price that knocked over $70 billion off the company’s market capitalisation, show how data privacy and respecting customers’ data is not just about trying to avoid fines, but about protecting reputation too - something that can be worth far more than a few per cent of turnover. So, while the threat of a fine may be seen as a stick to ensure data protection is taken more seriously, the reputational damage that can arise from a data breach, or allegations that governance is not what it should be, is potentially a much more fearsome stick. But privacy by design and default can provide a carrot too: it can help a company manage costs by making more efficient use of data, but above all it can be a superb communication message to the customer. Privacy policies say: “we take your privacy seriously” - but do people believe it? Do they just assume the policy is there to provide legal protection? By contrast, if privacy can become a core part of a product, if it screams out at the customer ‘you can trust us, we are on your side, protecting your privacy is fundamental to who we are, a human right’, then all of a sudden your image is transformed - the customer will be more willing to agree to their data being processed, and your reputation is given the kind of boost that can make the difference between success and failure, that can give you a decisive edge over rivals.
That is why privacy by design has transformed the data protection and privacy officer - from grim reaper, seen as a hurdle that could prove impossible to pass, to hero - someone whose insights can transform the reputation of a company, or public service, creating a bold, empowering and successful relationship with the customer. by Michael Baxter
SPEAKERS INCLUDE: JAMIE BARTLETT, PERNILLE TRANBERG, ROHIT TALWAR, ARDI KOLAH. WWW.DATAPROTECTIONWORLDFORUM.COM 20TH & 21ST NOVEMBER 2018, ExCeL London. 2018 WILL BE A LANDMARK YEAR FOR DATA PROTECTION AND PRIVACY. LEON@DATAPROTECTIONWORLDFORUM.COM 0203 515 3015. Pre-register for the data protection event of the year here
The Canadian experience... By Michael Baxter
“We exported privacy by design to the rest of the world,” says Sylvia Kingsmill, a Partner at KPMG Canada and the Canadian Digital Privacy & Compliance Leader, “now we are importing it back.” But the Canadian experience is illuminating - unlike in the EU, where under GDPR privacy by design and default is a requirement, in Canada it has been voluntary. But companies and other organisations still apply it. The Canadian view - and the view of Dr Ann Cavoukian, who devised the concept - is that it is just common sense. “We in Canada,” she says, “took Ann’s seven foundational principles, which are the lynchpin. It is one thing to have it as a philosophy, another to actually do it and operationally track it.” Ms Kingsmill worked with Dr Ann Cavoukian and her team at Ryerson University in developing a privacy by design methodology and framework to assess new technology and services against the principles of privacy by design. What does this mean? “What do controls look like from the perspective of a company or entity? How do we bake in the right controls, so they can demonstrate, whether it is in the EU or Canada, that they are taking the right steps to directionally protect our consumer rights and freedoms?” In Canada, you “can get certified to say look, we are doing the right things with your data, we are listening, we are following best practice, we are following guidelines, we are listening to regulators, we don’t want to just appease them.” There is another way of putting it that really does sum it all up: “We satisfied a market need, and that need is to re-assure the public.” She cites as an example CANTATICS, a Canadian organisation that produces an anti-insurance-fraud solution. Privacy by design is a core part of the product. The data generated by insurance claims across companies can be used to spot fraud - serial claimants who, in order to avoid detection, spread their fraudulent claims across multiple companies.
But the result of this is higher insurance premiums, so it is very much in the interests of customers to come up with a fix. So CANTATICS applied privacy by design. They showed how processing customers’ data worked to their benefit, and promoted how the data would only be used for the specific purpose stated and kept no longer than necessary. In short, privacy by design became a feature, a key part of what CANTATICS is about. There was no need for an enforcement body - it was there because it was important, a key part of creating trust, building and maintaining a reputation. But rules are being introduced. “We had a Parliamentary report recommending 19 areas to reform to maintain our adequacy standards, and the main topic is privacy by design.” But then Sylvia Kingsmill reckons this is how things are going worldwide. “International regulators are collaborating, there is a global privacy enforcement network,” she says. GDPR then, or at least privacy by design and default, may have had its roots in Canada, but now it is being applied worldwide. GDPR is about “transparency, accountability and control.” “It is actually pretty brilliant the way they have put together GDPR, as awful as it looks, I just wish they had used clear, concise, intelligible language like they say we should. The architecture of that law is really quite impressive, and I say this as someone who has read a lot of laws.” Abigail Dubiniecki is an Associate at Henley Business School’s GDPR Transition Programme and Specialist in data privacy at My Inhouse Lawyer. To put it mildly, she knows her GDPR. And, like so many experts on GDPR and advocates of privacy by design and default, she is Canadian. When she talks about GDPR her passion shines through; she talks as if privacy is one of the most important issues of the day - of the digital world. But then, given how data, and personal data in particular, is shaping the economy and business, and impacting upon us all, it probably is.
“Before I came here, I was already doing privacy by design without calling it that,” she says, referring back to when she was the privacy lead on an in-house counsel team for a Canadian crown corporation.
The Canadian take. Cardinal rule: “Under GDPR, one of the cardinal rules is ‘know the data’.” She explains: • what data you have • why you have it • what you are doing with it • do you have a lawful basis for having it • how long you need to keep it • who you are sharing it with • and for what purpose. But then, look at the detail - the devil may not emerge, but you must be devilishly careful: • are there secondary purposes for which you are using it? • if so, are they sufficiently linked? • is this something that requires you to go back to the drawing board? “It’s not a legal obligation in Canada to apply privacy by design, but those basic principles are there,” says Dubiniecki. She adds: “What GDPR does, and what privacy by design asks you to do, is take data life cycle management practices, cyber security best practices and fair information principles and make that a legal obligation.” She turns to the public sector: “We had this constant interaction with our regulators, so when something comes up, we are supposed to report that, not just to the regulators and privacy commissioner’s office, but there is also the Treasury Board Secretariat, so accountability has been built in; it’s just a reflex. “And although no one would say it’s a legal obligation to apply privacy by design, much of that is already there, if not articulated.” To misquote Shakespeare, that which we call ‘privacy by design’ by any other phrase would smell as sweet. Not to mention help confer a human right. In Canada there is no formal right-to-be-forgotten law, but there is the right to rectify. “You can correct something that is wrong, and if something is unlawfully processed that’s a problem, but where we differ is that we don’t have the same teeth for enforcement.” Privacy by design may have been invented in Canada, but she says “we only started to take it seriously once the EU adopted it” - but that seems to be the Canadian way.
If you are a Canadian rock star, or Canadian hip hop artist, or painter, no one is going to care about you in Canada until you go somewhere else and become really popular; then they call you Canadian. “It is being looked at in many other countries, such as in Latin America and in countries that are eager to trade with the EU, such as Japan, where there is a lot of interest - and now finally Canadians are paying attention to it too, as they seek to reap the benefits of greater trade with Europe under the Canada-EU Comprehensive Economic and Trade Agreement (CETA).” Examples of privacy by design and default. Dubiniecki cites as an example SecureKey and its product, SecureKey Concierge, technology that enables you to ‘bring your own credentials’ to prove your identity. Abigail explained: “You can access 80 government services using banking credentials,” and it’s a service in which privacy by design is an integral part. “The idea is that you don’t have to have loads and loads of different data, in different treasure troves - my company and your company, and government - so that you can be attacked from any angle. Now we have this system to verify you when you need to be verified. This has been a big trend, putting control in the hands of the individual, giving them greater security because there are fewer places where your identity is spread about, creating a richer, more reliable way to confirm someone’s identity, because you have a much richer dataset than you would have if you just have passwords, and date of birth, and other things that someone could steal.” GDPR is about transparency, accountability and control: interview with Abigail Dubiniecki, Associate at Henley Business School’s GDPR Transition Programme and Specialist in data privacy at My Inhouse Lawyer
End-to-end and the seven foundational principles. With privacy by design, we talk about end-to-end protection, rights, and putting the user first; all those things are baked into GDPR. So as a starting point, you have to meet the seven personal data processing principles in every activity that you do. You have to make sure that your default choice is the most privacy-preserving choice. So if it is social media, the default is to share information with the people ‘I want to share with, not share with the public’; you can’t ‘default to share with the public’ and roll it back - you have to default to something more defensible, that makes sense in the light of our exchange and the reason why ‘I am coming to your platform.’ You have to meet those principles. It must be lawful. You have to choose which lawful basis you are relying on. If you are relying on consent, you have to be able to let someone withdraw consent as easily as they gave it, and ensure it is freely given. In an employment situation you cannot rely on consent, because it can’t be freely given. Therefore, you have to look at other lawful bases. Then you have to go through all the principles, which must be baked into everything you are doing. People have these rights under GDPR and you have to make good on them, which means however you design your systems, you have to be able to find that needle in a haystack. So if, say, Rob Lowe makes a subject access request, when you are designing your system, you have to make it so that you can find everything you hold about Rob Lowe in everything you have done or shared with his data. Be warned. “Lots of companies are doing back flips: ‘oh, this does not apply to me because of Brexit’, ‘it does not apply to me because I am just a processor for a telecoms company and I don’t actually see anything’, ‘I am not even interested in what’s coming in there’ - but it absolutely does. If there is personal data in there, and you don’t need it, then don’t have it.
Or anonymise it so you reduce your compliance footprint. “If they say, ‘I am not going to do anything with it, so I should be okay,’ they will be in trouble. If not under GDPR, then under the ePrivacy Regulation or the Cyber Security Act that is being proposed. Or under the Directive on the Security of Networks and Information Systems. There is something in that digital single market strategy that will hit you. So, pay attention.”
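Dubiniecki’s ‘know the data’ checklist lends itself to a simple data-inventory record. The sketch below is hypothetical - the field names and the review rule are illustrative, not anything prescribed by GDPR itself:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one data-inventory entry: what you hold, why,
# on what lawful basis, for how long, and who it is shared with.
@dataclass
class DataInventoryEntry:
    dataset: str
    purpose: str
    lawful_basis: str          # e.g. consent, contract, legitimate interests
    retention_months: int
    shared_with: list = field(default_factory=list)
    secondary_purposes: list = field(default_factory=list)

    def needs_review(self) -> bool:
        # Secondary purposes must be sufficiently linked to the original one;
        # flag the entry for review if any have been recorded.
        return bool(self.secondary_purposes)

entry = DataInventoryEntry(
    dataset="newsletter_subscribers",
    purpose="send monthly newsletter",
    lawful_basis="consent",
    retention_months=24,
    shared_with=["email provider"],
)
print(entry.needs_review())  # False - no secondary purposes recorded yet
```

Recording a secondary purpose - profiling subscribers for marketing, say - would flip `needs_review()` to `True`, prompting the ‘sufficiently linked, or back to the drawing board?’ question from the checklist.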
GDPR and the fourth industrial revolution by Michael Baxter
There is a nice analogy with a chess board that illustrates the point. Put a grain of rice on the top left-hand square. Moving to the right, put two on the square next to it, then four, doubling with each square, working your way across and down the board until you reach square 64, at the bottom right-hand corner. At the end of the first row, the last square would hold 128 grains of rice. At the end of row two, some 32,000 grains of rice; eight million by the end of row three; and on the last square, more than nine quintillion grains of rice. This, goes the idea, serves as a metaphor for the way technology is accelerating. Whether this is accurate or not, one can say, without doubt, that technology is developing at an extraordinary pace. Moore’s Law. Moore’s Law relates to an observation by Gordon Moore, co-founder of Intel, in 1965: that the number of transistors on an integrated circuit doubles roughly every two years. Today, Moore’s Law is said to refer to computers doubling in speed every 18 months to two years. It’s the classic example of accelerating technology; the chess board analogy seems to describe it perfectly. The economist Brad DeLong estimated that if someone had built a computer with the kind of processing power of the iPhone X in 1957, it would have cost 150 trillion of today’s dollars, the device would have taken up a hundred-storey building, 300 metres high and 3 kilometres long and wide, and would have drawn 150 terawatts of power - 30 times the world’s generating capacity at that time. There is a view that Moore’s Law is now slowing down, because of the physical limits in cramming more transistors on a chip. But newer technologies, such as quantum computers, applying the super material graphene instead of silicon, or photonic chips, could give Moore’s Law a new lease of life. Metaphorical Moore’s Law. But the idea of Moore’s Law can be applied to other technologies - not literally Moore’s Law, but following a similar trajectory.
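The chessboard arithmetic is easy to verify - a few lines of Python reproduce the doubling:

```python
# Square n of the chessboard holds 2**(n-1) grains of rice.
def grains(square: int) -> int:
    return 2 ** (square - 1)

# The last square of each of the eight rows: squares 8, 16, ..., 64.
for row in range(1, 9):
    print(f"end of row {row}: {grains(row * 8):,} grains")

# The final square alone holds 2**63 grains - over nine quintillion -
# and the whole board holds 2**64 - 1.
total = sum(grains(n) for n in range(1, 65))
print(f"whole board: {total:,} grains")
```

Running it confirms the figures in the text: 128 grains at the end of row one, 32,768 at the end of row two, about 8.4 million at the end of row three, and over nine quintillion on square 64.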
Examples include Butters’ Law, which predicts a doubling of data transmission over a fibre optic cable every nine months; or genome sequencing, which has seen the cost of sequencing the human genome fall from $2.7 billion when it was first sequenced as part of the Human Genome Project, at the start of the century, to less than $1,000 today. Renewable energy also seems to be falling in cost at an exponential pace - although the factor here seems to be the learning rate: a doubling in the user base leads to a proportional fall in cost. Lithium-ion batteries, similarly, have fallen in cost from $1,000 a kilowatt hour in 2008 to around $200 in 2016. Convergence. But it’s when we take into account convergence that things get exciting. Technologies to watch include: • The Internet of Things • Robotics • Artificial Intelligence (AI) • Augmented reality • Virtual reality • Super materials such as graphene • Quantum computing • 3D printing/additive manufacturing • 5G and faster internet speeds • Genome sequencing • CRISPR/Cas9 - DNA editing • Mobile computing/the cloud • Blockchain • Drones • Energy storage • Renewable energies. Convergence, combined with Moore’s Law and metaphorical Moore’s Law, is causing things to develop at an extraordinary pace. Faster computers are helping to support advances in AI; developments in screen technology, advances in processing power, and in the components that make up smartphones, such as accelerometers - which tell a phone which way it is being held - are supporting advances in virtual reality. The combination of AI with the sensors that make up the Internet of Things, inside wearable devices, and genome sequencing, is set to create a revolution in health tech. Data. But data is a key part of the revolution. Aggregation of data from genome sequencing and sensors monitoring the body, when analysed by AI, may create new insights on the treatment of diseases.
According to a study by IBM, 90 per cent of the information on the internet has been created since 2016. Eric Schmidt, chair at Alphabet, once said that every two days we create as much information as we did up to 2003. But it seems to be getting quicker. Another way to put it is to say technology has never changed so fast, but it will never again change as slowly as it is now. Fourth Industrial Revolution. The ‘Fourth Industrial Revolution’ was so named by Klaus Schwab, founder of the World Economic Forum, which famously meets in Davos every January. In 2016, it was the theme of the Davos conference. It follows the first industrial revolution of the 18th century, dominated by steam power and textiles; the second, from the mid-19th century to 1914, led by the use of electricity, the rise of the motor car, flight and mass production; and then the more recent IT revolution. Personal data. It is clear that personal data will be part of this revolution, and the processing of it may create extraordinary insights and incredible efficiencies. But there is risk. Will it lead to a surveillance state? The end of secrets does not sound so bad, but the end of privacy does. Maybe we are heading for a time when we sell our data? Can AI take our data, even anonymous data, and by comparison with our data streams de-anonymise it? These are among the biggest issues of the day. GDPR takes us to the heart of this matter. It has its critics, but the regulation may represent a key moment in the battle to ensure the results of the Fourth Industrial Revolution are benign. Technology, as they say, is accelerating. More to the point, it is also converging.
“Our research shows that around two thirds of people are comfortable with the data exchange part of a modern economy,” says Chris Combemale. The Direct Marketing Association (DMA) recently published research into what the consumer really thinks about data privacy. Data isn’t the only thing that is making the Fourth Industrial Revolution possible, but without it, the Revolution could fizzle out and die. And the DMA research finds that more needs to be done to create the trust that is required so that consumers are comfortable with their data being processed. The DMA survey also found that 54 per cent say “trust in an organisation is the single most important reason why they will share data,” but 78 per cent say that “the company gets the best value of the exchange,” 86 per cent say they want more control over what companies do with their data, and 88 per cent say they would like more transparency. “Unless there is trust - that data is held securely, and trust that companies will only do with it what they say they are going to do, and tell you openly and honestly - then we will have a real issue going forward,” suggests Chris. Chris began his working life in TV production before entering the world of advertising and marketing, joining Young and Rubicam in New York. He worked for the famous advertising and marketing agency for 12 years, with a stint in Asia, working in Hong Kong and Singapore. Today, he is CEO of the DMA Group, comprising the DMA, IDM (Institute of Direct and Digital Marketing) and TPS (Telephone Preference Service). He sees a parallel between GDPR and the Advertising Standards Authority. “The ASA ensures that, on a voluntary, self-regulated basis, ads are truthful and meet societal standards. If ads are found to be untruthful they have to be removed. This is really important, as it means that the consumer can trust that the ads they see are truthful, and doesn’t have to worry about deciding which ones are telling lies and which ones are not.
The same applies to data: companies have to be open, honest and transparent in their communications about what they are doing with their data. GDPR will mandate this.” Referring to the DMA survey, he said: “We have some way to go if we are going to create the ecosystem and environment that is right for the modern future. “GDPR can play an important role, as it mandates that companies, and government for that matter, give people more control and greater transparency. “If companies do a good job, they will articulate the benefits in a better way than they have done, so that people start to see the value exchange as a balance between the great benefit they get, and the benefit the company gets.” Yet, in some respects, he says that this perception that the company achieves the greater benefit from the value exchange is almost inexplicable. “When you think of some of the value we get every day in our lives - think of the number of people who use Google Maps for free, several times a day; it is not only free, it contains minimal advertising.” “Do we just take it for granted? Is that why we don’t see the value, why 78 per cent say the company benefits from the value exchange?” “GDPR is going to be helpful - not perfect, but it was drafted with the idea of increasing consumer confidence in the future technological and data-driven economy, in a similar way to how self-regulated watchdogs such as the ASA have created confidence in the traditional advertising market.” Advertising and the Fourth Industrial Revolution. The media is not like it used to be. There was a time when we instinctively knew the difference between the media and, say, a
retailer or taxi operator. “But,” suggests Chris, “technology has evolved such that the new products and services that have become available, that make it easier and more efficient for people to live their lives, are all essentially media channels that rely on the use of data to deliver them. That is true whether it be an Uber, which uses data to work out where to collect you from and drop you off, or websites that sell you products or access to services providing content. “And increasingly, as we move into a world of AI and big data analytics, more and more we will combine technology and data to create the new services of the future. But it is really important that people trust the technologies and how they use the data.” The opportunity and threat from data. So, the point is hammered home. Trust is vital, and GDPR can be a key part of creating that trust. “Big data could be the new oil or the new asbestos. There is a huge amount of opportunity for businesses, people and society in the era of big data, but there is an equal amount of risk. That is why it is important we have a set of rules on our playing field which help us mitigate the risk and empower the opportunity. “And that is what GDPR is trying to do. It is important we think about the risks as well as the opportunities. “At the DMA, we say we need to move from responsible marketing and codes of practice to ethical marketing, where people think about the impact on society of these technologies that are coming forward, and understand how your business can use them for positive purposes and mitigate the risk. “Anyone who is wholly worried or wholly optimistic is missing the point; optimism and worry must be in balance.
And the benefits cannot accrue if you don’t mitigate against the risks.” And that is why GDPR may prove to be a vital tool, or even weapon, in the battle to ensure that the final outcome of the Fourth Industrial Revolution is benign.
Alford was able to take anonymous data and reveal the identity of Dread Pirate Roberts, with Ulbricht eventually receiving a life sentence. “This anonymisation is so poor that anyone could, with less than two hours’ work, figure out which driver drove every single trip in this entire data set. It would even be easy to calculate drivers’ gross incomes.” “Data with the individual names and addresses removed will have a high risk of re-identification.” He said that pseudonymised data should be treated as personal information.
Elon Musk reckons AI poses an existential threat to society; the late Stephen Hawking was similarly pessimistic. This magazine is about privacy and data protection. If you fret that AI may create super-intelligent artificial life-forms that represent a threat to our existence, like a Terminator, then you may or may not be right, but that is not an issue for here. How AI may use data, perhaps to manipulate us, is an issue for here. Artificial intelligence is a vague concept. Many of us think we know what it means, but if you immediately get thoughts of The Terminator or HAL from Stanley Kubrick’s 2001: A Space Odyssey, then your view is quite different from that held by many who work in AI itself. Some experts define it as machine learning; others say machine learning is a subset of AI. But for all intents and purposes, when we talk about AI in the context of the Fourth Industrial Revolution, we are talking about machine learning and its close cousin, deep learning. Machine learning relates to the analysis of a data stream, often a stream containing huge volumes of data, to extract insights. It will often do this via neural networks, a computer configuration which is meant to mirror how the brain works, with neurons that form links, or synapses, with each other. Deep learning takes this concept a little further, maybe analysing data from multiple data streams. The applications could involve creating medical insights - spotting links between certain behaviours or symptoms, data from genome sequencing, and certain conditions. The result could be the more accurate and speedy diagnosis of disease and treatment. One of the paradoxes of big data is the personalisation of solutions - for example, analysis of data to pinpoint the ideal treatment for individuals based on their circumstances and genetic data.
But data can also be used to support marketing - targeting advertising, for example, with extraordinary accuracy - or it could be used by retail to help in merchandising, stock control and matching stock with customer demand. Or AI could, for example, create data to help with traffic monitoring, control and flow.

GDPR and AI
GDPR is quite specific. Whatever legal basis you use for processing personal data - whether, for example, it is consent, legitimate interest or contractual - the data used must be specific to its purpose and it cannot be held longer than is absolutely necessary.

Anonymisation
But while data can provide important insights, invariably it does not have to name an individual. Big data can be anonymous and only applied to an individual when the insights from that data can benefit that individual specifically.

De-identification of data
In fact, under GDPR, there is a spectrum of de-identification: there is pseudonymised personal data, and there is anonymised personal data. The challenge here relates to the potential for AI to take multiple data streams and, from them, identify who individuals are, even though someone has gone to the trouble of de-linking data from individuals. So, say a clinician doing research gets non-personal data; they shouldn't then be able to identify the individual. This is an example of pseudonymised data as, with information from other sources, there is a way to link back. With anonymised data it is not possible to link back. In practice, Abigail Dubiniecki tells us, it is extremely difficult, if not impossible, to stop the link back, "because there are a thousand points of light, so much personal data and so many different data elements pushing out constantly about us."

Dread Pirate Roberts and the tax investigator who brought him down by de-anonymising data
Gary Alford was a tax inspector.
He didn’t have access to any unique information, he just had the channels available to us all, even so, in June 2013 be brought down Ross William Ulbricht, the so called kingpin, Dread Pirate Roberts, the man who operated the Silk Road market place on the dark web. Selling drugs was just part of the story, Silk Road was an online black market. And most of the sales were indeed for drugs, including stimulants, psychedelics, prescription, precursors, opioids, ecstasy, and dissociatives, but there were other products too, fake driving licences, and supposedly, even the option to take out a contract on someone - hire a hit man. Libertarians point to the philosophy behind. Silk Road, the ideology of its founder, a fan of the free market economist Ludwig von Mises, terms of service prohibited the sale of certain items whose purpose was to ‘harm or defraud’. Child pornography, stolen credit cards, assassinations, and weapons of any type were banned, thus making a hero of Ulbricht in the eyes of many. The name Dread Pirate Roberts came from theUlbrichtlogindetails.TheFBIinvestigated, but his identity remained a secret. But Gary Alford was able to track him down, ascertain his identity, by spending time on his own computer, reading though chat rooms, and old blog posts. His main tool was Google, using advanced search to focus on references to Silk Road during a certain time period: the early days. He was able to track down early messages on forums that referred to Silk Road and found one particular communication on a forum asking for help in programming with the email address; rossulbricht@gmail.com. The original message had in fact been deleted, but Alford was able to find a record of the communication in someone else’s response. By doing this, Alford was able to take anonymous data and reveal the identification of Dread Pirate Roberts, with Ulbricht eventually receiving a life- time jail sentence. 
It just goes to show: if an individual making clever use of Google can do that, consider how AI, with access to a gigantic array of data, can de-anonymise data. In the study Unique in the Crowd: The privacy bounds of human mobility, by Yves-Alexandre de Montjoye, César A. Hidalgo and Vincent D. Blondel, it was found that "in a dataset where the location of an individual is specified hourly, and with a spatial resolution equal to that given
by the carrier's antennas, four spatio-temporal points are enough to uniquely identify 95 per cent of the individuals."

Vijay Pandurangan, a former Google engineer and founder and CEO of a company called Mitro, analysed 20 gigabytes' worth of trip and fare data involving New York cabs, comprising more than 173 million individual rides. The data was anonymous. His conclusion: "This anonymisation is so poor that anyone could, with less than two hours' work, figure which driver drove every single trip in this entire data set. It would even be easy to calculate drivers' gross income or infer where they live."

Khaled El Emam is the Canada Research Chair in Electronic Health Information at the University of Ottawa and an Associate Professor in the Department of Pediatrics. In an article for the BMJ he said: "Data with the individual names and addresses removed will have a high risk of re-identification." He said that pseudonymised data should be treated as personal information.

As Dubiniecki says: "It is very hard to be anonymous. You can anonymise the data as effectively as you can through techniques that are tried and true - as an individual layperson, it is not for you to think you have truly done it; you really do need the help of a professional or a tool. But in addition to technical measures you need organisational methods."

For every anonymisation or pseudonymisation that you do, you have to do a re-identification risk assessment to make sure you have identified the different ways the data can be re-identified, and then you put in measures to mitigate those risks and protect the data. And then, before you share data, you must put contractual and organisational protections around that sharing.
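A first pass at the re-identification risk assessment described above can be automated: count how many records share each combination of "quasi-identifiers" (fields that are not names but can still fingerprint someone, like the taxi pickup points above). This is a minimal sketch; the trip data and the choice of quasi-identifiers are invented for illustration, and a real assessment would weigh far more factors:

```python
from collections import Counter

def uniqueness(records, quasi_identifiers):
    # Count how many records share each combination of quasi-identifiers.
    # A combination seen only once is a unique fingerprint: high
    # re-identification risk even with names and addresses removed.
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    unique = sum(1 for count in combos.values() if count == 1)
    return unique / len(records)

trips = [  # de-identified trip data (invented)
    {"pickup": "E1", "dropoff": "N4", "hour": 8},
    {"pickup": "E1", "dropoff": "N4", "hour": 8},
    {"pickup": "W2", "dropoff": "SE5", "hour": 23},
    {"pickup": "NW3", "dropoff": "E1", "hour": 2},
]
print(uniqueness(trips, ["pickup", "dropoff", "hour"]))  # 0.5 - half the trips are unique fingerprints
```

The de Montjoye finding quoted above is essentially this measurement run on mobile-phone location traces: with four spatio-temporal points as the quasi-identifiers, 95 per cent of combinations were unique.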
"Never throw anonymised data into the real world for anyone to use, like Netflix did in 2006," says Ms Dubiniecki, "because then there is no control over the thousand points of light that can be built in."

One method being tried involves the process of adding noise, making it harder to de-anonymise data, using what is called differential privacy. Abigail says: "The process involves taking a data set, running a parallel one, and adding some noise so that you can run your analysis and get something that is 'pretty good', but the more noise you run the less accurate it is, so it is questionable how effective this can be." Apple says it is using differential privacy, but the community is sceptical. Facebook has talked about using it too.

GDPR puts emphasis on the anonymisation of data, and on pseudonymisation, but being 100 per cent confident it cannot be de-anonymised may not be possible. But GDPR is a risk-based regulation. The key lies with assessing the risks, addressing high risks and turning them into residual risks, documenting the measures you have taken, applying re-identification risk assessments if you are anonymising or pseudonymising, employing outside expertise, and ensuring the data you do process is relevant to the purpose and that you do not hold onto it longer than you need to.

By Michael Baxter
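As a footnote to the noise-adding technique Dubiniecki describes: differential privacy is usually implemented with the Laplace mechanism, which perturbs the answer to a query rather than the raw data. A minimal sketch, with an invented counting query and invented epsilon values:

```python
import random

random.seed(42)  # for a repeatable illustration

def dp_count(true_count: float, epsilon: float) -> float:
    # Laplace mechanism: the difference of two exponential draws is
    # Laplace-distributed. For a counting query (sensitivity 1) the
    # noise scale is 1/epsilon: a smaller epsilon means more noise,
    # stronger privacy and a less accurate answer - exactly the
    # accuracy trade-off described above.
    scale = 1.0 / epsilon
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

# "How many patients in the data set have condition X?" (invented figure)
print(dp_count(1200, epsilon=0.1))   # heavily noised: one person's presence is masked
print(dp_count(1200, epsilon=10.0))  # nearly exact: far weaker privacy
```

Repeated queries consume the privacy budget - the epsilons add up across answers - which is one reason the community is sceptical of loosely specified deployments.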
SELLING YOUR DATA...

"I give you permission, in exchange for a sum of money, say $100 a month, to process my data." Is that a possible future? And what might GDPR have to say about it?

In a feature by Stephan Talty in the Smithsonian magazine, a future was envisaged, circa 2065, in which computers and robots have taken our jobs, but most of us enjoy universal basic income - a set monthly amount, freeing people up to spend their time making movies, volunteering and travelling to the far corners of the earth.

"There will be Christian, Muslim and Orthodox Jewish districts in cities such as Lagos and Phoenix and Jerusalem, places where people live in a time before AI, where they drive their cars and allow for the occasional spurt of violence, things almost unknown in the full AI zones. The residents of these districts retain their faith and, they say, a richer sense of life's meaning. Life is hard, though. Since the residents don't contribute their data to the AI companies, their monthly UBI is a pittance."

Sylvie Kingsmill, a Partner at KPMG Canada and the Canadian Digital Privacy & Compliance Leader, blanched at the idea when we put it to her. "What kind of society do we want to have? If we look at privacy as a right, a freedom, as Europeans do, I would disagree. You only understand what the implications of a data breach are if you have been affected and your reputation has been affected by this." She does concede, however, that it "depends on your cultural background."

Forget about the year 2065 - who knows what the equivalent of GDPR will say then. What about selling your data today? Chris Combemale at the DMA said: "There are some models today in which people are selling their data in exchange for services. But there are many other models in which people's data is used to pay for services.
"Data will be at the heart of everything, and people will have to become more comfortable with a greater level of transparency within their own lives and accept that companies and governments providing services will know quite a lot about them. But it must always be clear how this operates and people must be given choice."

But the issue of paying for data matters now, and GDPR is most certainly relevant. It all boils down to the legal bases for processing data. Abigail Dubiniecki's view is that under consent there is no option to sell your own data: "If you are making the lawful basis consent, it has to be freely given, and the moment you make it conditional, it is not freely given."

But other legal bases may make it possible. "Contractual necessity is another legal basis. If the purpose of the contract is data exchange - so you can do X with the data and I can make money off that - then perhaps you could make an argument that you can do that on a contractual necessity basis. If that's the essence of the contract."

She pauses; Abigail clearly feels uneasy about the concept. "I am not selling this idea, I think privacy is more fundamental than that. The right to privacy and to protection of personal data isn't a mere property right to be bought and sold. It's a fundamental human right and it's tied to so many other rights, like the right to be free from discrimination. Discrimination is one of the top harms listed in the GDPR recitals and a consequence you must consider when applying the GDPR's risk-based approach. Under Quebec law (a civil law province in Canada), the right to privacy is an inalienable right. You can never fully and permanently sign it away. That said, there could be instances where you might transact with a business using your personal data as currency.
"So the contract might state that this is the personal data that is necessary for the purpose of the contract, and the contract involves crunching up the data and putting it with other people's data, in exchange for the consideration (as we say under common law) of, say, a sum of money or a token. And then, when it is no longer necessary for the performance of the contract, they stop using it. People.io uses this model, and to a certain extent the Hub of All Things does."

But it seems such an arrangement may only apply to a specific exchange, not a relationship where you are constantly feeding data. She draws an analogy with a contract to buy a coffee: "They might charge me £2.20, if I accept that offer; but if they supply a frog, that is a breach of the contract. If they say in addition to the money I have to give them everything in my purse, that is not contractually necessary to get this coffee.

"But in the digital Wild West, there is this surface relationship, the view that 'we are going to hoover up everything we can get, maybe unbeknownst to you. We will cover it in all these terms and conditions, which you can't find because they are somewhere else and the screen keeps stalling, and they put speed bumps in the way and we don't understand it - this is not an enlightened choice that you are making'. That is just not going to happen. Remember: GDPR is about transparency, accountability and control."

Maybe, in the not too distant future, we will sell our data.

By Michael Baxter
Anatomy of a breach in trust: the Facebook story
Facebook's CEO conceded that the Cambridge Analytica saga, which has put data protection and privacy on the front pages of newspapers for weeks, was an example of a breach of trust. But was it a data breach? Regardless of the technicalities, the whole episode cuts right to the core of one of the most important issues of our time. It's important that we take an objective perspective.

In April 2015, Mark Zuckerberg selected Orwell's Revenge, by Peter Huber, as one of his books of the year. "Many of us are familiar with George Orwell's book 1984. Its ideas of Big Brother, surveillance and doublespeak have become pervasive fears in our culture," wrote Zuckerberg. He continued: "Orwell's Revenge is an alternate version of 1984. After seeing how history has actually played out, Huber's fiction describes how tools like the internet benefit people and change society for the better."

As these words are being written, Zuckerberg is being cast as a kind of privacy fiend. Not so long ago, people in some quarters saw him as the most likely next President of the United States; now, read some media reports and you could be forgiven for thinking he is a Bond villain in the flesh. Such descriptions are unfair. He has erred, of that there is no doubt; maybe, at 34 years of age, having only ever known extraordinary success, he does not have the wisdom that comes with the odd failure. But before we go any further it is worth considering his philosophy. And maybe we can tell much about this by considering some of his other books of choice: Why Nations Fail by Daron Acemoğlu, in which 'extractive governments' use the power of technology and other powers to enforce the power of a select few; The Rational Optimist by Matt Ridley, a book about the positive benefits of new technology; and The End of Power by Moisés Naím, a book about a shift in power from authoritative and military-based governments and corporations to individuals.
Of the last book on that list Zuckerberg said: "The trend towards giving people more power is one I believe in deeply." In fact, scan through all the books that Zuckerberg rates and you get the impression of a young man who sees technology as a way to end poverty and give more power to individuals - a force for freedom, empowering ordinary people. That is a quite different philosophy to the one many associate with the corporate world - and at odds with the critiques laid at Facebook, of a peddler of fake news, echo chambers and insidious manipulation. At core, this may boil down to your view of humanity.

The ideology behind Facebook
Zuckerberg once said: "We've gone from a world of isolated communities to one global community, and we are all better off for it… [Our mission] is to connect the world… [The danger is in] people and nations turning inwards against this idea of a connected world and community."
The advertising medium
But look beyond the ideology and you find a money-making machine - money may not grow on trees, but it does seem to emerge out of personal data. In the final quarter of last year, Facebook made $4.26 billion profit after tax, and even that understates the underlying picture: the company incurred a one-off $3 billion tax charge. Maybe this statistic is the key one: Facebook generated $6.18 per user during the quarter. From the point of view of the user, that is not so great - maybe there would not be that much money in selling one's personal data even if one wanted to.

Facebook also grabbed an 18.4 per cent share of the global digital advertising market, behind Alphabet/Google on a 31.3 per cent share, according to eMarketer. And it is this market share, pertaining to Facebook and Google, that vexes many in the world of publishing. Facebook doesn't pay for the content that appears on its platform, unlike most publishers, who have to fund content with pounds, dollars, euros, bitcoins or whatever currency they choose. They say it isn't fair: without the quality journalism they produce, Facebook and Google would not be nearly so profitable, yet the lion's share of the advertising bucks does not go to the companies that invest in journalists.

The reason is simple: it lies with data. An advertisement on Facebook can be targeted with extraordinary accuracy, and advertising budgets can start at just a few dollars. In today's digital business world, a business model has evolved known as the 'lean start-up': it involves rapidly developing as simple a version of a new product as possible and testing it. Back in the media's analogue days, such an approach would have been much harder - start-up production costs were higher, and so was the minimum advertising spend. Facebook-type mediums represent a superb way for start-ups to test new ideas and test the market, and that's not a feature the traditional publishers are too keen on.
Power and responsibility
But as the French philosopher Voltaire once said: "With great power comes great responsibility" - a saying recently returned to the public consciousness by the Marvel Comics character Spider-Man. Facebook has power, of that there is no doubt. GDPR is about transparency, accountability and responsibility. Has Facebook been living up to such responsibility?

One person who has doubts is Ardi Kolah, Director, GDPR Transition Programme, Henley Business School, and Editor in Chief, Journal of Data Protection & Privacy. Facebook's product may be a massive generator of data, creating huge profits, and "capitalism creates a right to make money from good business," he suggests. "Mark Zuckerberg is brilliant, but with rights comes responsibility. He has used his rights to accumulate a fabulous fortune, but where is his responsibility?" He continued: "The extent of the obfuscation by the social media giants is breathtaking. Facebook's service agreement runs to some 3,700 words covering 10 sides of A4, and Twitter's is 11,000 words, taking up 32 pages."

Ardi turns to Google: "Where is its responsibility for recent gang violence in Nottingham leading to a tragic death?" On this theme, The Times recently quoted a Home Office spokesman saying: "Gangs often post videos online that seek to incite violence or glamorise criminality to influence young people. The instant nature of social media also means that plans develop rapidly and disputes can escalate very quickly." It's a good point, but recall too that social media played a key role in supporting the Arab Spring. Google might say "don't be evil", but does it practice what it preaches?

Facebook breach
It was, of course, the use of data by Cambridge Analytica, via the academic Aleksandr Kogan, that put Facebook on the front pages and saw around $80 billion knocked off its market cap. The privacy question related to an app created by Dr.
Aleksandr Kogan called 'thisisyourdigitallife', which used the Facebook login feature. 270,000 people used the app, sharing personal information. So far, it seems, nothing untoward happened. A more controversial step followed: Kogan was able to make use of a Facebook feature that existed at that time which provided access to the data of friends of those initial users. Mark Zuckerberg now concedes that this could have created data on no fewer than 87 million individuals. This data was then used by Cambridge Analytica, passed to the company by Kogan. Facebook says that this was against its terms and conditions.

This all leads us to the question: was there a data breach? Zuckerberg is careful how he words his response: "There was a breach of trust." Abigail Dubiniecki was not convinced: "Although Facebook may want to split hairs and say this wasn't a breach, it absolutely was a breach. But it wasn't a breach where someone had to do some complex hacking; it was Facebook not taking care, as they should, with other people's data." Ardi Kolah was equally unequivocal: if the use of this data possibly influenced the US election and the EU Referendum, if data on 87 million people was used without their knowledge, how could it have not been a breach?

Next
This all begs the question: has Facebook applied sufficient measures, as required by GDPR? Elizabeth
Denham, the Information Commissioner, recently said: "Facebook has been cooperating with us and, while I am pleased with the changes they are making, it is too early to say whether they are sufficient under the law." In short, there is a big question over whether, even now, Facebook has taken sufficient steps to follow the GDPR requirements.

Ardi Kolah wonders whether Facebook itself has sufficient maturity - it may be a massive company with huge influence, but has it enjoyed sufficient time to gain the maturity required for greater social responsibility? Ardi questions whether data of the type controlled by Facebook should be in the private sector. The company enjoys a network effect - the more users it has, the more attractive its product offering to users - and arguably Facebook is a natural monopoly, which, given the social and moral questions associated with so much data, may suggest it is a kind of public utility. Yet the stated Zuckerberg philosophy, with its emphasis on creating communities, seems to be at odds with this perception.

Zuckerberg clearly has more money than he needs; even if he lived to a thousand, his riches would be at a level that even highly successful business people could only dream of. Perhaps the problems began when the company was floated - the tyranny of quarterly accounting created a pressure to grow and generate money. It is worth recalling, however, that Facebook offers consumers an extraordinary product, ostensibly for free. The data it generates can help grease the wheels of the economy and, via the information it brings, support the marketing of new products; it may be a great supporter of wealth creation. And while the danger of filter bubbles is clear, it is also a remarkable tool for promoting debate - is Facebook helping to create a new interest in politics, creating a more engaged populace, supporting democracy? Bear in mind, too, that the greatest winners from its demise would be the very media who are so keen to write its obituary.
Personal data is not evil; it's a key driver of the Fourth Industrial Revolution. But without trust from the public, the Revolution may be more like a new year's resolution, forgotten soon after it was made. GDPR is there to create public confidence. It is not the enemy of Facebook, and never should it be. Instead it hoists certain standards upon the company: greater transparency, accountability and responsibility. The company is perfectly entitled to process personal data, providing it does so within the legal bases of GDPR, meaning there are constraints - data can only be used in specifically defined ways, meaning no handing it out to third parties with nefarious intent. And given how the Facebook share price has fallen, and given the campaigns to get people to delete Facebook, the company desperately needs to rebuild trust with the public - and GDPR could be its single most important means to achieve this.

By Michael Baxter
Blockchain and GDPR
By Ardi Kolah
What struck me was that the digital landscape referred to in the Recitals of GDPR isn't the same as the technology landscape these 'digital disruptors' are building today.

According to the Oxford English Dictionary, Blockchain is: "A system in which a record of transactions made in bitcoin or another cryptocurrency are maintained across several computers that are linked in a peer-to-peer network." That sounds very geeky and techie, of little relevance to the rest of us and of interest only to those with a strong appetite for advanced mathematical algorithms? Wrong! Blockchain isn't just about cryptocurrencies. Clarity about its potential use beyond cryptocurrencies is rapidly emerging, and it promises to change the face of professional, business and industry sectors such as healthcare, manufacturing, aerospace, energy, financial services and even law.

In short, will Blockchain effectively kill the sacred cows of our current understanding of data and consent - and, dare I say it, GDPR itself - even before it's fully enforceable across the European Union from 25 May this year? The 'digital disruptors' I met on a cold and rainy night at Silicon Roundabout in London seem to think so. They were enthusiastic about the prospect of massive industry disruption and the opportunities for creating new business models as a result of the power of Blockchain.

But this isn't some distant dream. Currently, 80% of banks are developing Blockchains, and industries from law to aerospace are already exploring possibilities. So the next 5-10 years will see massive disruption from Blockchain adoption, as jobs are automated and new industry applications are created. When the internet was born, people used it to email one another; businesses like Amazon and Uber were inconceivable. According to the GBA, Blockchain promises to be a revolution of similar proportions, with undetermined potential, ramifications and opportunities.
"Blockchain has the potential not only to change how we transfer value, but could shift our systems of trade, identity, efficiency and governance across all sectors, radically transforming traditional approaches to management," explains Nir Vulkan, Associate Professor of Business Economics and Co-convenor of the Oxford Blockchain Strategy Programme at Oxford Saïd Business School.

Many are looking at what's on the horizon rather than how things are done today. And because of the pace of technological change, the future is rapidly catching up with the present. As a result, is 'consent' dead? Does 'consent', as we understand it, work anymore in this Blockchain landscape? How can a user give consent when they no longer understand what they are consenting to? Even if they read a privacy notice or terms of use, do you really think a consumer will understand how you're processing that personal data?

Is personal data the 'new oil', as The Economist proudly proclaimed on its front cover not that long ago? If oil is a finite resource and personal data an infinite resource, do we have to question whether this analogy - although helpful to some extent - is now out of date? Should we now be thinking that intelligence - not data - is much more useful, as AI and Blockchain technologies become smarter at manipulating this stuff?

And so where does this leave GDPR? According to Nick Oliver of People.io: "GDPR is the future of the past. Yes, it's long overdue. Yes, it's a good thing. Does it solve for the future? No. Does it solve for the now? Probably not."

Ardi Kolah was invited to speak to around 150 people at a gathering of the Government Blockchain Association (GBA). Here, he considers GDPR and blockchain.
GDPR and the small business

All organisations have a responsibility to ensure data is as safe as it can be and that it is being processed fairly. If you're unsure whether something is okay to process, you should ask yourself whether it feels right, because if it doesn't, then it probably isn't.
As many small businesses start to learn about GDPR, they can initially feel daunted. This is likely to be due to a lack of money and resources, making small businesses feel like they will struggle to comply.

What's more concerning, however, is that many small businesses still have not even heard of the General Data Protection Regulation (GDPR), with less than a month to go until the deadline on May 25th. Some rumours rattling around suggest that small businesses don't need to comply with the new data protection regulations. Yet the adverts launched in early March by the ICO to raise GDPR awareness - principally aimed at micro businesses, those employing fewer than 10 people - show that they will not have a 'get out of jail free' card if something were to go awry. Even a one-person operation is required to comply with the new data protection rules. The ICO campaign launched with a series of radio adverts, aimed at those micro-businesses who haven't heard about the GDPR at all.

The Information Commissioner, Elizabeth Denham, said: "All organisations have to be ready for the new data protection rules, but we recognise that micro-businesses in the UK face particular challenges. It's also worth noting that many sector and industry groups and associations are offering help to micro businesses about the GDPR and can be a good starting point for industry-specific advice." Denham also added that steps towards GDPR compliance can be achievable without too much cost or expensive external support.

Are there exemptions?
Although small businesses must comply with the regulation, they have some partial exemptions. For example, Article 30 requires organisations to keep records of their processing activities and categories, and to make those records available to the supervisory authority on request. The article explains that this does not apply to organisations employing fewer than 250 people. However, this is only a partial exemption.
If you employ fewer than 250 people, you will still need to document processing activities that are conducted on a regular basis or which involve the processing of sensitive data such as criminal convictions, sexual orientation, health and so on. The ICO has produced a template which you can use to record this processing information.

Some small businesses may not need to appoint a Data Protection Officer, but the ICO should be contacted if there is any doubt on this issue. If you are not required to appoint a Data Protection Officer, you should be aware that making a voluntary appointment will bring additional statutory responsibilities. The DPO role can be fulfilled by an individual on a part-time basis or outsourced, and will contribute to your compliance with the GDPR. If you are a small business and a data processor, you will also be required to keep a record of processing for your clients. Although you may have less information to document as a processor, it is still important to document activities; the ICO has specific templates for data processors.

How do you comply?
As stated previously, much the same rules apply for both large and small businesses. The regulation aims to give individuals more control of their data. Therefore, organisations must ensure personal information is: fairly and lawfully processed; processed for specified purposes; adequate, not excessive, and kept up to date; not kept for longer than is necessary; processed in line with the rights of the individual; kept secure; and not transferred to countries outside the European Economic Area unless the information is adequately protected.
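A record of processing activities of the kind described above can start out as a simple structured file, one row per activity. A minimal sketch in Python - the column names loosely follow the Article 30(1) headings, and the example entry is invented; it is not the ICO's own template:

```python
import csv
import io

# Minimal Article 30-style record of processing activities.
# Column names loosely track Article 30(1); the example row is invented.
FIELDS = ["purpose", "categories_of_data", "categories_of_recipients",
          "retention_period", "security_measures"]

activities = [
    {"purpose": "payroll",
     "categories_of_data": "employee names, bank details",
     "categories_of_recipients": "payroll provider, tax authority",
     "retention_period": "6 years after employment ends",
     "security_measures": "encrypted at rest, access on a need-to-know basis"},
]

buf = io.StringIO()  # in a real record this would be a file on disk
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(activities)
print(buf.getvalue())
```

However simple the format, the point is the same: a record you can hand to the supervisory authority on request, and that saves you searching through heaps of material when a subject access request arrives.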
When handling the personal data that a company holds for both clients and its staff, fairness, transparency and confidentiality should be upheld.

Data mapping
Small businesses should start by finding out what personal data is being stored, where it is held, who it relates to and why it was collected. You might be surprised by how much data you have unnecessarily, or in the wrong place. Once the data mapping has been completed and you have identified each contact and noted why you have obtained their personal data, you need to check it is compliant with the above and the other key principles of GDPR. For example, if you have personal data that you don’t really need, it should be deleted. The same applies if data is unrecognised, out of date, or too incomplete to be of any use. A process should then be put in place to reassess the data you retain on an ongoing basis. This ensures only relevant data is kept and processed.

Privacy Policy
Your website is often seen as the equivalent of your shop window. Therefore it needs to be perceived as trustworthy. Creating a privacy policy that explicitly explains what you do with data will certainly help people trust what you are doing with their data. It should be accessible to all and should reference the GDPR requirements.

Train Staff
All staff should be clued up on the new legislation, and they should learn basic data hygiene, for example workstation security.
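At heart, the data-mapping review described above is a filter over an inventory: flag anything with no documented purpose or a stale last check. A minimal sketch in Python; the inventory entries and the two-year review threshold are purely illustrative:

```python
from datetime import date, timedelta

# Hypothetical inventory entries: what the data is, where it lives,
# why it is held, and when it was last confirmed as accurate.
inventory = [
    {"data": "customer emails", "where": "CRM",
     "why": "marketing consent", "last_verified": date(2018, 3, 1)},
    {"data": "old supplier list", "where": "shared drive",
     "why": None, "last_verified": date(2015, 6, 1)},
]

def review(inventory, today, max_age=timedelta(days=730)):
    """Flag entries with no documented purpose or stale verification --
    candidates for deletion under the data-minimisation principle."""
    return [e for e in inventory
            if e["why"] is None or today - e["last_verified"] > max_age]

to_delete = review(inventory, today=date(2018, 5, 1))
```

Run regularly, a check like this turns the one-off data-mapping exercise into the ongoing reassessment process the regulation expects.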
Staff should also feel comfortable about reporting anything that they feel compromises the data protection, privacy and security of customers, clients, supporters and employees. They should be able to report anything without fear of personal repercussions. Denham has talked about creating a culture of data protection which “pervades the whole organisation.” This is expected under GDPR.

Subject Access Requests
Many are concerned that there will be a huge influx of data subject access requests (DSARs) when GDPR comes into force. Small businesses will not be exempt from these requests. This means that organisations must be able to provide a description of the data, explain why they are holding it and who it is disclosed to, and supply a copy of the personal data in an intelligible form. As stated previously, small businesses are not required to keep a record of everything; however, if you don’t keep a record of customer data, then you might not be able to send this information within the correct timeframe. This means that small businesses should look at all their data to understand what data is held, where it is coming from and where it is going. The best option is to keep a record of processing activities so you don’t have to search through heaps of material at a later date.

Staff are also fundamental in ensuring DSARs are completed in time. They should be trained on how to spot a DSAR, as one could come via an email, a telephone conversation, a letter, the website and so on. They should then understand what to do with them, which requires a process to be put in place. As mentioned, staff training is essential. Human error is often considered the highest risk of a data breach within an organisation. By making sure staff know its importance and their responsibility, you will lower the risk of a breach. Training staff can also be used as evidence for demonstrating compliance.
If a breach were ever to occur, you can show that you have actively complied with the legislation.

Security
Security is another factor that small businesses need to contend with. Cyber-attacks are not just a large business issue. In fact, according to the government’s Cyber Security Breaches Survey 2017, 46 per cent of UK businesses reported having a data breach in 2016. Cyber-attacks on small businesses have significantly increased, which could be due to cyber criminals taking advantage of SMEs’ lower resources, allowing them to exploit weaknesses and gain access to larger business partners.

The cost of securing your organisation should not be too high. For example, organisations can activate firewalls on computers and access points to the internet. Run a reputable anti-virus product and ensure it automatically updates on a daily basis. Maintain good passwords and activate two-factor authentication for hosted software services. Finally, remove unused user accounts and ensure only administrators have full administrative access to computers.

Data Breach Reporting
If you suffer a data breach as a data controller, it will be your legal responsibility to report it directly to the ICO. Failure to report may lead to increased fines of up to 2% of global turnover. Although this is aimed at larger firms, small business owners should be aware that the ICO may issue fines for careless or deliberate data breaches.

Conclusion
Organisations that process data have a responsibility to process it securely, no matter their size. In today’s climate, data privacy has become a basic human right. Customers rightly believe their data should be treated with care, and they may be quick to shun a company seen to break that trust. Therefore, we should think about how we would like our own data to be handled before we process others’.
As data protection comes to the forefront for organisations, the growing concern about what is being done with personal data and how safe it is means that all organisations have a responsibility to ensure data is as safe as it can be and is being processed fairly. In other words, if you’re unsure whether something is okay to process, ask yourself whether it feels right, because if it doesn’t, then it probably isn’t.

by Laura Edwards, editor of gdpr.report
Data Protection World Forum is not to be missed – pre-register now. For more information: Leon Brown, LEON@dataprotectionworldforum.com, 0203 515 3015, www.dataprotectionworldforum.com
In an increasingly digital economy, the issue of data protection and privacy is becoming ever more important. On 25 May of this year the EU General Data Protection Regulation (GDPR) becomes enforceable, and the accompanying ePrivacy Regulation (ePR) is currently working its way through the EU legislative process. 2018 will see the biggest change in data protection and privacy for two decades, and will have a global impact on companies of all sizes, public sector and government organisations, charities, professional bodies and associations.

The inaugural Data Protection World Forum (DPWF) will be held on 20-21 November 2018 at ExCeL London. It will feature carefully curated content streams to allow delegates to focus on the issues that matter the most to them and to deep-dive into specialised topics. While GDPR will be one of the central topics covered at this landmark event, the international line-up of speakers will also cover burning issues such as the Internet of Things (IoT), cyber threat protection, ransomware, blockchain, AI, disaster recovery, biometric technologies, surveillance, ad-tech and much, much more.

2018 will be a landmark year for data protection and privacy

Important information about the Data Protection World Forum Conference & Expo 2018
• 20-21 November 2018, ExCeL London
• Pre-register at www.dataprotectionworldforum.com
• While the overall impact of GDPR will take centre stage, the Data Protection World Forum will also highlight key issues affecting the following areas: GDPR & marketing, GDPR & HR, GDPR & financial services, GDPR & retail

The 2018 speakers
• Jamie Bartlett: author, journalist and researcher – internet cultures, cyber security and online privacy
• Rohit Talwar: CEO at Fast Future – the future of personal privacy and data protection over the next ten years
• Pernille Tranberg: founder of Data Ethics Consulting – a look at the organisations working with data ethics
2018 WILL BE A LANDMARK YEAR FOR DATA PROTECTION AND PRIVACY
20th & 21st November 2018, ExCeL London
Speakers include: Marloes Pomp, Nirvana Farhadi, Gary Hibberd, Steve Wright
Pre-register for the data protection event of the year here
WWW.DATAPROTECTIONWORLDFORUM.COM
LEON@DATAPROTECTIONWORLDFORUM.COM 0203 515 3015
The EU General Data Protection Regulation (GDPR) comes into force in May 2018, and will impact every organisation that trades with any EU country. The new regulations will disrupt how organisations store, manage and process data. GDPR is the biggest shake-up in data protection for a generation.

Our half-day and full-day courses provide a practical and efficient staff training programme, presenting staff with the knowledge, understanding and practical tools to implement and maintain GDPR-compliant processes.

Contact Leon for more details: leon@dataprotection.media, 0203 515 3015. Half Day: £2,000. Full Day: £3,500
Breaches, incidents and security
Apple may have set a high bar for cybersecurity, but not even this company is invulnerable. Recently, it issued a second customer warning for owners of iPhones, iPads and the Mac that they were affected by a processor flaw that could leave them vulnerable to hackers. Apple said that the security issues are known as Meltdown and Spectre. It urged customers to only download software from trusted sources. The good news is that, at the time of writing, there is no evidence that the security flaws have been exploited by hackers. Apple has also released software updates to iOS. It just goes to show, even the world’s biggest company, and a firm that sets great store by the way it treats customers’ privacy and security, is under constant threat. And indeed, its share price took a hit. When it last revealed a security flaw, in November 2017, its share price fell by over two per cent.

Even before GDPR, security was a fundamental issue; now it has taken on an even greater imperative. In this section, we look at some of the key issues.
Some key Articles in GDPR
Some of the key articles in GDPR relating to cyber security, breaches and incidents are: Article 33, which states: “In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority…” It also states: “Where the notification… is not made within 72 hours, it shall be accompanied by reasons for the delay.” Other key Articles include Article 32, which relates to security of processing; Article 28, which focuses on the processor of data and, among other things, stipulates clauses that must go in a contract with third-party data processors; Article 30, which relates to keeping records; and Article 34, on the communication of a breach to the data subject.

Here, we look at tips to avoid a breach, dealing with third parties, especially using the cloud, plus a guide to cyber security – Everything You Need to Know About Cybersecurity – and the Cardless Society. But first we look at tips to avoid a breach.

Guide to avoiding a breach and what to do when it happens
At the recent GDPR Summit London, a panel discussion focused on tips to avoid a breach. Present were Abigail Dubiniecki from Henley Business School, Piers Wilson from Huntsman, Dave Horton from OneTrust, Amanda Finch from IISP and Anthony Lee from DMH Stallard, with Ardi Kolah from Henley Business School in the chair.

128 days versus 72 hours
There is a gulf between what GDPR expects and the current reality. Article 33 stipulates that a breach usually has to be reported within 72 hours, yet Ardi Kolah noted that research has found the average length of time it takes an organisation to discover it has had a data breach is 128 days. So that is quite a gap, and it needs closing, fast. So what tips did the panellists have?

The clock is ticking
It is simply no good waiting for a breach, and then agreeing how you will react. All panellists agreed: preparation is the key.
“Don’t write a plan in the middle of a crisis,” summed up Anthony Lee. They also agreed that it’s not a question of if, but when, a breach occurs. GDPR says organisations must have technical and organisational measures in place. That means having the right technology, but also having procedures and staff training. Staff must understand the data, procedures must be defined, practices agreed and everyone must be familiar with them. Abigail Dubiniecki said that staff must “speak the same language” – not literally, but that there must be a common understanding of the terms used.

There was also an emphasis on the need for good people interpreting technology and doing risk assessments, ensuring that you have the information to answer questions such as: what data was lost? What data was exposed? Were passwords decrypted? This means having tools and technology, but also audit trails. You need records; otherwise, if you say “I don’t know”, the regulator is likely to look less kindly upon you. Dave Horton suggested that you need to identify the roles within a business that will be important, while Amanda Finch put emphasis on testing and learning from previous experiences.

Difference between a breach and an incident
The panel turned to the question of incident versus breach. Article 29 Working Party guidance makes a distinction between an incident and a breach. There are three types of breach, often shortened to the initials CIA:
• Confidentiality breach: some data has got out into the world.
• Integrity breach: for example, has the data been tampered with? Under GDPR, having accurate data is critical. Examples given related to changing the grades of an exam or modifying the signature on a major financial transaction.
• Availability breach: such as a ransomware attack.
An incident, by contrast, is where a security event has happened but it is not necessarily a personal data breach: a ransomware attack, for example, where you have a robust backup so that it does not cause any serious harm. But even if there is a breach, it may not be reportable. This depends on the risk assessment and the likelihood of risk to individuals. Then again, if a breach occurs but is not reportable, it is still a learning opportunity. A key point relates to having a defensible position. If a breach occurs and you are subject to an investigation, the regulator is going to look far more favourably on you if you have good processes.

Processor and controller
A lot of smaller companies do not process their data themselves. Indeed, while many outsource data to the cloud, smaller firms often outsource the people who manage that too, with an outsourced IT manager. But GDPR requires that the processor reports a breach to the data controller, and this should be built into the contract with any company that has been outsourced to. Article 28 means that you cannot agree a contract with a third party unless they provide sufficient guarantees of compliance, including security provisions and an obligation to inform you, without undue delay, if there has been a breach. It is also essential that all data that is processed is subject to an appropriate lawful basis under GDPR, such as consent or legitimate interests. The panellists also said: choose partners wisely. “If they say ‘GDPR what?’, you should probably look for another provider,” suggested Abigail.

But this all leads us to the question of what to do when your data is held by a third party, for example in the cloud. And for this, we paid a visit to Anthony Lee, who told us more.

By Michael Baxter

PRE-REGISTER FOR DATA PROTECTION WORLD FORUM HERE
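As a small illustration of the 72-hour clock the panel kept returning to, the Article 33 notification deadline is measured from the moment the controller becomes aware of the breach. A sketch in Python; real incident-response tooling would track far more than this:

```python
from datetime import datetime, timedelta

# Article 33: notify the supervisory authority without undue delay and,
# where feasible, within 72 hours of becoming *aware* of the breach.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(aware_at: datetime) -> datetime:
    """Latest notification time before reasons for delay must accompany it."""
    return aware_at + NOTIFICATION_WINDOW

def is_late(aware_at: datetime, notified_at: datetime) -> bool:
    """True if the notification missed the 72-hour window."""
    return notified_at > notification_deadline(aware_at)

aware = datetime(2018, 5, 25, 9, 0)   # when the breach was discovered
deadline = notification_deadline(aware)
```

The point the panellists made is that the clock starts at awareness, not at the attack itself, which is why the 128-day average discovery time is so alarming.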
Let’s put our heads in the clouds
Let’s say you are storing some of your data in the cloud, or are using a cloud-based electronic point of sale system. “Take a retailer,” suggests Anthony. “By definition, you will have a lot of personal data, because most of the contacts you have will be people rather than organisations, and GDPR will apply to those data sets. If you are responsible, as an organisation, for having technical and organisational measures in place to keep that data safe and to make sure it is being processed in a way that is compliant with the regulations, then if you choose to put that data with a third party, you, as the data controller, are still responsible for that data, and you will be held accountable by the regulator if anything untoward happens to it.”

He says there are three key things to think about in relation to third-party ecosystems:

1: Does the company in question have the right to share the data with the third party in question? Is there a lawful basis? For example, are they doing so pursuant to their legitimate interests, or one of the other lawful bases, such as consent? Processing personal data is defined very broadly: holding data, storing data, retrieving data, accessing data, transmitting data, sharing data; it is all a form of processing. To put your data with a third party, you need to be able to establish that you have a lawful basis for doing so. And if you don’t, then the sharing of that data would be in breach of the regulation.

2: As an organisation, you need technical and organisational measures in place to ensure that the data is being processed in a way that is compliant with the regulation and the principles in it. So you need the right technology to guard against external threats such as hack attacks, and the right people, who can handle the data in a way that is compliant with the regulation.
That alludes to awareness raising, training, privacy policies and procedures, and just making sure that, right from the top of the company to operational people, you are handling data in an appropriate way. Then, assuming you have satisfied yourself that you have a lawful basis for sharing the data, you need to satisfy yourself that the third party has sufficient technical and organisational measures in place to keep that data safe. So if you give the data to a third party that has very loose security in place, you will be in breach of the regulation.

3: Then you need to extract a number of contractual promises from that provider, which go way beyond the promises that you need to extract under the current rules. For example:
• You need the cloud service provider, which would be considered, under the rules, to be a data processor, to agree that they will only process the data you hand to them in accordance with your written instructions. They will do no more and no less; there is no discretion in relation to the data.
• Also – and this resonates with your obligation to satisfy yourself that the service provider has technical and organisational measures in place to process the data in a compliant way – you need to get a contractual promise from them to that effect.
• Then it gets more detailed. For example, you, the controller, would have to get a promise from the third party in question that they will give you access to their facilities so that you can audit them to satisfy yourself.
• You need to extract from them a promise that they will give you the information you need to be able to demonstrate to the regulator that not just you, but the third-party system you are tapping into, is compliant. (This point relates to the accountability principle in the regulation: not only do you need to comply with the regulation, you need to be able to demonstrate, to the satisfaction of the regulator, that you comply. You need evidence that you can show to the regulator, that you keep appropriate records of processing activities, and that you have robust training programmes and procedures to make sure your workforce are thinking privacy and acting in an appropriate way. In addition, you have a requirement to ensure your service provider gives you the information you need to be able to satisfy the regulator that they too have similar measures in place.)
• The contract must say that on expiry or termination of the contract, for whatever reason, whether it be for convenience or just expiry, the data processor – the cloud service provider – will either delete or return the data.
• If you are dealing with what is known as a software-as-a-service provider, for example a cloud-based electronic point of sale system, there is every likelihood that that supplier will be tapping into Amazon, or Microsoft Cloud – Azure – or Google Cloud.

Interview with Anthony Lee [privacy specialist] on GDPR and third parties
Article 28 of GDPR states: “Where processing is to be carried out on behalf of a controller, the controller shall use only processors providing sufficient guarantees to implement appropriate technical and organisational measures in such a manner that processing will meet the requirements of this Regulation and ensure the protection of the rights of the data subject.”

Under GDPR, you are responsible for the data you collect and control even when it is being processed by third parties. But suppose the data goes in the cloud: what are the considerations? We spoke to Anthony Lee, a privacy specialist and partner at the law firm DMH Stallard. BY MICHAEL BAXTER
The regulation says that you, as the data controller, are responsible for ensuring this software-as-a-service provider also ensures their own third parties apply the above requirements – so all those clauses have to flow down the third-party chain. But saying and doing are not the same thing. How do you ensure that, if the cloud provider is a tech giant, it abides by the above requirements? There are additional issues if the data is processed at a location outside of the European Economic Area, such as on a server in the US. “My view is that they are going to have to come up with some kind of self-contained approach to be able to demonstrate to the controller community and to the data processor community that they do have clauses in their contracts that are aligned to Article 28,” says Anthony.

The export of data
“If the data ends up outside of the European Economic Area, then that counts as a form of export,” Anthony explains. Under GDPR, there is a general prohibition on exporting data, unless it is going to a country that has been designated by the European Commission as having suitably robust privacy laws in place. Such a designation does not apply to the US.

The US consideration
In the US, they have the Privacy Shield, which is voluntary. Companies can register themselves, and you can export data to a registered company without falling foul of the adequacy requirements or the export prohibition, providing, of course, that you satisfy all the other conditions. “If they are not signed up to the Privacy Shield there are ways around it. For example, you can get them signed up to the Model Clauses approved by the EU Commission, but this may change post 25 May.” The issue of dealing with the US is nuanced.

Importance
The regulator has already singled this area out as one of the areas that may carry the highest fines. But fines are only part of the story: “The box of tricks at the regulator’s disposal includes the right to stop data from being exported, to injunct.
For example, if you are a retailer, and your life-blood is your access to your cloud-based electronic point of sale system, and the regulator stops you using it, fines may be the least of your problems.” There is also the possibility of individual compensation. Anthony warns: “No one in the UK has yet managed to extract individual compensation, but the door is open slightly, as courts have acknowledged that someone can be compensated not just for a pecuniary loss arising from the use of data – for example, someone syphoning money from your bank account – but also for distress caused as a result of being on the receiving end of inappropriate emails.”

Reporting a breach
“It is worth bearing in mind that under the current rules, if there is a breach, such as a hack attack or accidental loss, you have discretion as to whether you notify the regulator; but under the new rules, you have to notify the regulator within 72 hours – although typically the requirements are softer if the data is encrypted.” Furthermore, “if the person whose data has been lost is at risk, you have to warn them. Within the third-party consideration, you have to make sure the third party is aware of a breach and that they notify you, so you can meet that 72-hour requirement. You can’t just bury your head in the sand. There has to be a component of proactive monitoring: if you don’t find out about a breach because your service provider, who should have known about it, was half asleep and did not tell you, you are going to be accountable to the regulator.”

The supply side
But what if you are a third party; what can you do? “Bigger organisations are cooking up long and detailed data protection addendums, which have all manner of detail, written from the controller-side perspective.
They may have a sting in the tail, saying ‘if you mess up on any of these things, you, the processor, will indemnify us, and your liability is unlimited.’ A supplier can, however, seize the agenda with their own clauses dealing with the Article 28 provisions, demonstrating that they take it seriously, putting in a clause saying they will not export the data outside of Europe without your consent, and then agreeing the basis for doing so.”

Don’t miss our next issue – subscribe for free here
Everything you need to know about cybersecurity
Cybercrime is unavoidable in this day and age. There’s nearly a 100 per cent chance you know someone who has been the target of an attack. If you own a business, your odds of being targeted are at least one in two. Yes, 50 per cent of all businesses – and probably more – have been attacked. Here Nathan Sykes takes a deeper look.

Don’t write that off as conjecture just because you don’t know you’ve been attacked, either. As we’ve seen with numerous recent cyber incidents, the consequences of these attacks often don’t become apparent until long after they’ve been carried out. Enforcing data protection and privacy laws is still not an exact science, and that means that to keep your business safe, you’ve got to be educated. GDPR makes this a requirement, of course.

Expand your knowledge – pre-register for Data Protection World Forum today
Money and Power
Make no mistake: the reason cybercrime is so popular is that it’s lucrative. In the example of ransomware, which demands that a user send payment – often in the form of cryptocurrency – the model is straightforward. Scamming attacks, like emails claiming to be from a long-lost relative in need of some cash, work the same way. Less obvious hacks are part of larger schemes that lead to someone getting paid, but many of the attacks you hear about on the news don’t have any direct financial impact on those affected. For example, when Yahoo leaked three billion user accounts, the media made a huge deal of it – but not one of those people lost money as part of the hack.

Your company houses a wealth of sensitive information about your employees at the very least. If you interact with customers, their information is also housed on your servers or the service you choose to use for storage. To a cybercriminal, your company is a veritable gold mine, so what can you do to protect it?

Corporate Cybersecurity
You wouldn’t leave the door to your warehouse unlocked at the end of the day, would you? And yet many businesses leave the door to their data wide open. Were it a brick-and-mortar building, you would install cameras, motion sensors and elaborate locks. In the world of cyber security, the best strategy is a similarly layered approach to protecting data.

Threat vectors are everywhere in a corporate setting. You have an array of workstations continually communicating with outside locations, email that you expect to receive from completely unknown sources, and a workforce that cybercriminals will leverage against you. That’s right: your people are one of the biggest threats to cybersecurity. More than 60 per cent of companies allow employees to do business on their personal devices. All that it takes for malicious software to access your network is a single file shared from your phone.
Does your company allow outside flash drives or hard disks to be brought in? Those are dangerous, too.

Layered Security
Replacing the cameras, sensors and locks in the cyber world means installing comprehensive threat detection and network protection technologies. Not long ago, the term would have been “antivirus”, but with threats evolving from viruses to network-based attacks, “network security” is a more accurate phrase in today’s world. A well-outfitted network protection suite today will include the ability to scan for “conventional” threats in your environment, as well as intelligent detection technology at the application and network level. In simple terms, it looks for suspicious activity on your computer’s file system, and also in communications over the internet.

In addition to this type of protection, you’ll want to stand up a firewall, which is the component that blocks communications detected as malicious as they come in from the internet. Emails get a dedicated scanning component, with technology that can find threats in their dedicated formats and protocols. On top of these first and second-layer defence systems, state-of-the-art solutions include a layer of forensics software that lets you identify where the servers sending attacks are located. If your business is high-profile and you suspect that a single attacker is launching multiple attempts to compromise your system, this type of defence can help the authorities resolve the threat.

Next, encrypt all your files. This will make it very hard for the bad guys to make sense of data if they do get access to it, and is an essential component of good security, particularly if you’re housing information from the public. Finally, you need a disaster recovery solution. Even the best defences can be breached with enough time and effort. Your environment should be backed up regularly, at least once a day, with the ability to restore your work from the backups in a matter of hours.
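The backup advice above is only as good as your ability to restore. A minimal sketch in Python using the standard library’s tarfile module: archive a directory, then prove the round trip works. The paths and file names here are purely illustrative:

```python
import os
import tarfile
import tempfile

# Minimal daily-backup sketch: archive a directory, then prove it can be
# restored -- an untested backup is not a backup. Paths are illustrative.
def back_up(src_dir: str, archive_path: str) -> None:
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(src_dir, arcname=os.path.basename(src_dir))

def restore(archive_path: str, dest_dir: str) -> None:
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(dest_dir)

# Round-trip demonstration in a throwaway temporary directory.
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "records")
    os.makedirs(src)
    with open(os.path.join(src, "customers.csv"), "w") as f:
        f.write("name,email\n")
    archive = os.path.join(tmp, "daily.tar.gz")
    back_up(src, archive)
    restore(archive, os.path.join(tmp, "restored"))
    restored_ok = os.path.exists(
        os.path.join(tmp, "restored", "records", "customers.csv"))
```

A real disaster recovery plan would add scheduling, off-site copies and encryption of the archive itself; the point of the sketch is that a restore test belongs in the routine, not just the backup.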
Without this in place, you’re begging for a crisis. And a big fine under GDPR.

Social Engineering
Malicious actors have learned that, in many cases, people are easier to hack than computers. This practice, called ‘social engineering’, involves tricking one or more people on your team into exposing access to sensitive data or introducing malicious code. Once you’re compromised, it’s very difficult to wind back the proverbial watch. Education is the best way to combat this type of attack. Social engineering relies on the ignorance of your employees, but you can fight back by arming them with the knowledge to recognise and avoid cyber-attacks. Companies like Symantec, Webroot and ESET offer cybersecurity awareness training programmes that can help reduce the risk of exposure for your office.

Fighting Back
To stop cybercrime from throwing our modern world into anarchy, groups like the FBI and CIA in America, and Interpol, are teaming up with businesses from the private sector to track down and punish the people who perpetrate these attacks. One such operation, started in 2009, recently saw the capture of Russia’s most famous hacker, after bringing down a botnet named “Gameover Zeus”. By reverse-engineering these attacks and the reconnaissance that goes into them, investigators can begin to form patterns and identify the people responsible. The internet is the real world, and the punishment for these crimes is comparable to that for any other form of theft or incursion.

However, cybercriminals know that if they plan and execute an attack successfully, the authorities are at best in a reactive position, and that means the bad guys still win much of the time. The turning point in the war on cybercrime will come when we have a deterrent: a way to raise the stakes for cybercriminals, so that they think twice about what they’re doing. We might not see it in our lifetime, but if we want to go on with our lives free from constant cyber-threats, that is what it will take.
“More than 60 per cent of companies allow employees to do business on their personal devices. All that it takes for malicious software to access your network is a single file shared from your phone.”

Nathan Sykes is a business technology writer from Pittsburgh, PA.
Once upon a time, a world where your eyeballs are scanned to get into a building, an implant inside your body opens doors and switches on the lights, and your whole life is contained in a mini-computer measuring 6 x 12cm was the realm of sci-fi movies. Jason Choy, of the security specialists Welcome Gate, discusses how the latest advances in technology are transforming his industry.

It has become the norm for our faces to be scanned at passport control, our iPhones to be opened by a fingerprint, and for us to use a phone as our calendar, our fitness tracker, our bank, our research resource and our document store. It is hardly surprising that security systems are increasingly being managed via a phone, and that biometrics is being used more and more to control access to buildings. And, taking it to the next level, implants are growing in popularity, thanks to the body hacking movement. There are advantages to each of these which are worth entrepreneurs and business owners considering next time they move offices or upgrade their security system, especially as GDPR will place greater demands on recording and tracking how this data is stored.

Smartphone security

Nowadays, we can’t be without our phones - they have become an extension of our physical, mental and social beings, with some people using them for a third of their waking hours. It seems inevitable that the traditional security fob to get into office buildings will disappear over the next few years, and access control will move onto our phones.
There are advantages:
• if someone picks up your phone, they can’t get into your office, as it needs to be activated by your PIN, a fingerprint or facial recognition
• if people lose their phones, they report it quickly (often people are reluctant to tell their employers they’ve lost their security fob)
• out-of-hours contractors can download an app to get into a building, meaning the property team don’t need to be on-site to programme a card and physically hand it over
• those managing your office security can operate the whole thing without the need to be on site, adding new users, checking access cards, sorting out security issues and even checking individual CCTV cameras, particularly useful for businesses with multiple offices
• it can create a seamless welcome for your visitors, who can be sent a personalised visitor pass in advance, so their phone will activate entrance barriers or even lifts; their hosts will be notified they have arrived, and they can even choose their preferred drink in advance.

Biometrics

Biometrics is the use of technology to scan a part of the body to determine your identity. There are four key types: fingerprint, face, eye, and hand geometry. It’s now being used for access control in many modern offices, particularly in industries such as tech, financial services and retail. There are three main benefits of using a biometric security system:

1. Security and accuracy
Your body parts are much more difficult to fake, lose or share, so biometric access is highly accurate. This gives you reliable attendance records for staff and contractors. This accuracy is particularly attractive for industries such as tech, defence or pharmaceuticals.
For these, stringent security procedures – which can be guided by standards such as ISO 27001 Information Security Management – are essential to restrict access to areas such as server rooms, and to protect sensitive information, particularly if there is a potential threat of terrorism, espionage or extreme activism.

2. Cost
As with every new technology, when biometrics was first introduced 20 years ago the cost made it prohibitive for most buildings. Now it can be more cost-effective than using card security, as you no longer have to pay to replace lost access cards. This cost can run into thousands by the time temporary cards are reprogrammed and issued, old cards taken off the system and new ones bought.

3. Sophistication
The latest biometric systems are lightning fast, with a whole range of different technologies available and a bewildering array of suppliers in the market. They can be used for more than just security. Facial recognition software has great potential for marketing in the leisure and retail industries, providing valuable analysis of buyer behaviour and demographic information. It can track the demographics of visitors, including length of stay, age profile and gender.

Implants

Pet owners don’t think twice about microchipping their animals, and implants work in exactly the same way. They take convenience to the next level and are part of a new movement called body hacking, where people are using technology to enhance their lives, implanting mini computer chips into their bodies to eliminate the need for credit cards, security fobs and phone apps. This tech is being adopted by a few companies for security, cashless vending, photocopying and every other function that you might historically need a card for. As well as being convenient, it is also highly secure and, some might argue, environmentally friendly, as implants minimise the volume of plastic waste created by having physical cards for everything.
The miniaturisation of technology has made this very viable, and I predict it will become increasingly mainstream.

Conclusion

The cardless society is fast approaching. The advances in biometrics, in smartphone apps and in implants are revolutionising the way security is managed. This is where organisations’ strict adherence to GDPR becomes even more important, as we consolidate technology, making it less tangible and, in cases like implants, wholly intrusive. Organisations have a higher responsibility to ensure that the data they hold, handle and store is carefully managed.

Jason Choy is the founder of the security specialists Welcome Gate, which provides tailored security packages for clients including EDF, Ford, Ikea, British Transport Police, H & M and Bentley.

The Cardless Society
Make sure you don’t miss issue 2 – subscribe here for free
By now most people will have heard about GDPR and the impending fines organisations may face if they have a data breach. Aside from data leaks and hacks, another way an organisation can breach the new regulation is by not adhering to the new rights of individuals. Under the GDPR, individuals will benefit from increased rights such as:
• The Right to be Informed: if requested, organisations must state the purpose for processing personal data, the retention period for the personal data, and who it will be shared with.
• Right to Rectification: a process will need to be put in place for handling requests to correct historical data.
• Right to Erasure: the right to be forgotten allows people to have their data removed if they no longer want an organisation to have it, unless it is mandatory to keep.
• Right to Portability: under Article 20, organisations must consider whether they have a facility in place for data portability.
• Automated decision making: individuals can request that a human make a decision about them rather than it being automated.

The Right of Access

Under GDPR, the ‘right of access’ means that individuals have the right to access their personal data and supplementary information. This right also allows them to be aware of and verify the lawfulness of the processing of their data. Such requests are known as Subject Access Requests (SARs). SARs aren’t new - the right also existed under the Data Protection Act - however, under the new legislation there are changes that organisations must pay attention to. This article will look at the requirements of SARs and the pitfalls organisations face in dealing with them.

Changes to SARs

Currently, under the Data Protection Act, organisations can charge £10 for a Subject Access Request, but under GDPR the £10 charge will be revoked. This may open up the floodgates for people to request to find out what organisations are doing with their data.
Small businesses may suffer from the new requirements, as they may never have received a SAR before and will not know the process, or have the time and resource to deal with them if they become inundated after 25 May. Another change to SARs is the amount of time organisations have to respond. Under GDPR, organisations must respond within 30 days, as opposed to the 40 days under the DPA. Under GDPR, if data subject rights are violated, organisations can be fined up to 4% of their annual turnover or €20 million, whichever is greater. Therefore, it is essential that organisations know exactly how to respond correctly and in time.

The ICO receives 15,000 complaints a year, a number that is increasing year on year. Of those complaints, 40% are SAR-related. This is a significant number considering there are eight separate data protection principles. The ICO is not clear on the exact reason why, but one factor may be that people are becoming more aware of their rights and are more willing to act on them. Over the last few years, with media coverage of data breaches and misuse of data, the public has lost trust in organisations. In fact, an ICO survey revealed that only one-fifth of the UK public (20%) have trust and confidence in companies and organisations storing their personal information. In addition, under Article 57, the ICO will have an obligation to promote GDPR, including the data subject rights, so awareness will only grow.

The changes to SARs mean there is more pressure on employers, and many are looking at implementing processes for dealing with SARs. With public awareness of data rights rightly increasing, organisations must ensure they take it seriously. Organisations may not know how to deal with a SAR correctly. Here are some of the complaints the ICO receives, and what organisations can do to ensure they adhere to the legislation.
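As a quick sanity check on the numbers above — the 30-day response clock and the 4%-of-turnover-or-€20m cap — here is a small, illustrative Python sketch. The function names are my own, and real response deadlines can be extended for complex requests, so treat this as arithmetic, not legal advice.

```python
from datetime import date, timedelta

SAR_RESPONSE_DAYS = 30  # GDPR: down from 40 days under the DPA

def sar_deadline(received: date) -> date:
    """Latest date by which a SAR response must be sent.
    The clock starts when the request is received, not when
    it reaches the team handling it."""
    return received + timedelta(days=SAR_RESPONSE_DAYS)

def max_fine(annual_turnover_eur: float) -> float:
    """Upper tier of GDPR fines: 4% of annual turnover
    or EUR 20 million, whichever is greater."""
    return max(annual_turnover_eur * 4 / 100, 20_000_000.0)
```

For example, a SAR received on 25 May 2018 must be answered by 24 June 2018, and a business with €100m turnover still faces the €20m figure, because 4% of its turnover is lower.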
The response has arrived late
This is self-explanatory; however, as mentioned, organisations may face fines if they don’t respond in time. If you aren’t able to respond within the 30 days, this may expose other issues within the organisation. For example, if an organisation doesn’t store data correctly, it may need to spend time on a data flow audit and data mapping.

A data controller didn’t realise they had received a SAR
If an organisation is involved in a customer or employee dispute, it may receive long letters or phone calls about wider issues, and a SAR may be concealed within the correspondence. This kind of issue highlights the importance of staff training. Organisations will have a shortened time to respond, and therefore, if staff receive a SAR via email, phone call or online, they should know how to spot it and pass it on to the relevant staff member or team. If someone is unsure whether a request is a SAR, you can quite simply ask the individual to confirm, or check with the ICO.

Not extensive enough
If an individual has received a SAR response but believes it is incomplete, or that the search wasn’t extensive enough, the ICO would require the individual to go back to the organisation to raise their concern, and for the organisation to rectify the issue. If the concern is then raised with the ICO, they will review all the correspondence with the individual to get the full picture and the organisation’s understanding of the SAR.

SARs with third-party data
A response may include third-party data - information that relates to other people. Documents containing the personal information of more than one person should not automatically be disclosed on submission of a subject access request. This can be more complex: in some instances it is reasonable and appropriate to provide third-party data, but sometimes it is not.
For example, CCTV footage may disclose personal information about another person, which may be a breach of their own personal data. A balancing exercise must be done to determine whether disclosure is appropriate in the circumstances. Organisations can contact the ICO anonymously if they are unsure.

The individual is ignored
Organisations may rightly contact an individual informing them that they no longer wish to correspond on a wider issue, such as a customer complaint or dispute, because they feel it is no longer constructive. However, if a customer has made a SAR within that wider issue, the organisation is still obliged to respond to it. Organisations can make it clear that they are only reopening communications to respond to the request. Although organisations may rightly decide they no longer want to correspond if they don’t believe it to be constructive, if an individual makes a SAR they are exercising a statutory right, and the organisation must respond.

What can organisations require before responding?
Organisations may need some information to establish the identity of the individual, and can ask for proof of identity. Individuals may not be happy with this request, as the organisation may have spoken with them very recently and know who they are. However, in most cases it is required to ensure that the information held is released securely: whenever a SAR is made, organisations should establish the requester’s identity and make sure they know who the individual is before handing over their information. The level of ID required will depend on the information you expect to disclose. For instance, if the organisation doesn’t hold any sensitive data, it might ask for a recent utility bill showing name and address. Yet, if sensitive data will be disclosed, a higher level of ID, such as a passport, may be requested. Organisations can also ask the requester to help locate data. For example, if you have not heard from them before and they are requesting a general search, they might ask for all information held on them; in this case, you may only be required to search your customer or complaints database.
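The sliding scale of proof-of-identity described above can be expressed as a tiny decision helper. This is purely illustrative - the GDPR prescribes no specific documents, and the thresholds here are the article's examples, not a legal rule:

```python
def required_id(discloses_sensitive_data: bool) -> str:
    """Suggest a proof-of-identity level before releasing SAR data.
    The document choices are illustrative, not mandated by GDPR."""
    if discloses_sensitive_data:
        # demand higher assurance before releasing sensitive data
        return "passport or other photographic ID"
    return "recent utility bill showing name and address"
```

A real intake process would layer this onto case-by-case judgement, but encoding even a simple rule like this helps front-line staff apply ID checks consistently.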
Conclusion
Overall, the fundamental structure for SARs under the GDPR will remain the same as under the DPA. Nonetheless, the changes may have more of an impact on organisations and their resources than expected. Therefore, it is essential that organisations plan ahead, know exactly where their data is stored, have their staff on board and ensure a process is in place to respond in time.

Subject access requests checklist

Watch out for: Delay in receiving a request. The clock starts ticking the moment the SAR is sent, not the moment the individuals responsible for overseeing it receive it.
What to do: It is imperative that all staff are trained and know who to contact if they receive a SAR.

Watch out for: Delay in replying.
What to do: Ensure well-rehearsed procedures are in place.

Watch out for: Not knowing where all the data is, for example on removable storage devices and in the cloud.
What to do: Data audits.

Watch out for: Third parties.
What to do: Look at agreements; ensure third parties will respond to a request, furnishing you with the appropriate data in an appropriate time-frame.

Watch out for: Not realising a SAR has been made.
What to do: Training of staff is essential to avoid this.

By Laura Edwards, editor of gdpr.report
the wallflower of compliance grows thorns

Compliance Specialist and GDPR Lead Business Analyst at Morrisons, Janna Banks, with her blue-sky thinking & years of data management, has her own perspective to offer.

1998: an era in time when Snake was the only game on any ‘cool’ phone, the internet could cause excitement by snail mail when a CD from old favourites like Freeserve or CompuServe dropped to the doormat, and paper documents dominated the filing system of any business. 1998 was also the year the Data Protection Act came into force and an appreciation for data was born. Over the next two decades technology developed quicker than you can say ‘Silicon Valley’. We saw ‘mobile’ phones big enough to warrant their own case morph into a handheld device that can connect you to your friends and family, capture those unmissable moments, update all of your friends instantly and, if needed, run your business, connecting you globally. So, if this article is about GDPR, why are phones important? Looking from those paper-driven environments to a business in your hand brings into perspective how differently data is processed in today’s world. Enter GDPR.

“If you keep your cool while all of those around you are losing their heads….”

There is a school of thought that GDPR is but a hop, skip and jump from the DPA, so why has the practical application of it caused such a fuss? My thought (and mine alone, I may add) is that the DPA is the wallflower of compliance. For years I have seen adverts showing that crime and fraud are being cracked down on, that HMRC needs you to send in your tax return, and that the FCA, the ombudsman, OFCOM and OFWAT are there to support you… yet until recent weeks the ICO has not been so much heard of; well, not outside of the multiple investigations that hit the news. So if the only news is bad news, does that mean there really is no bad publicity?

“Uber says 2.7 million in UK were affected by security breach” - The Guardian
“Carphone Warehouse fined £400,000 for putting millions of customers’ data at risk” - Independent
“Email gaffe by Coventry University staff exposed 1,930 students’ details” - www.databreaches.net

With such headlines storming the news, it is no wonder that fines have been issued to business giants left, right and centre. YET the lack of data protection continues, and the breaches just keep coming. It is no wonder that data security has become the focus of Europe. I ask: has technology moved at a pace that security has been left behind? I hear what you say - surely security has developed technologically too. So what is the heart of the issue? I suspect that the day-to-day use of data has changed so dramatically that many companies are finding they not only have to bridge the hop, skip and jump to GDPR compliance, but must also review how the DPA regulations are currently being adhered to. I don’t doubt that the annual training is being executed year in, year out; however, the headless chicken act seems to be dominating the internet and social media, and this leads me to believe that not much of the DPA is being upheld. So I wonder if the laid-back attitudes being adopted signify that, despite best efforts, the DPA is the wallflower of compliance.

In reality, if a company is happy to take whatever fine will be issued, should I care? Well, ‘we’ certainly did as the breaches ripped through the headlines: in the early days, shock and horror dominated the newspapers and social media as outrage boiled the blood. The more recent breaches have received a much different response - the headlines still grab the stories, but the response from the many is almost nonchalant. Perhaps the wallflower does not stand on its own? It is striking that more emotion can often be drawn from a person experiencing the theft of a physical object (whatever its worth), due to the feeling of violation that resonates with the victim, than from data being illegally obtained - even though the risk and potential impact of the latter can be overwhelming to any individual. It is for this reason alone that I am convinced the changes GDPR brings are undoubtedly for the best.

As Data Protection Impact Assessments (DPIAs) transition from recommended to mandatory, the data subject (you and I) becomes more empowered: our rights are extended, and the responsibility of maintaining a process inventory lies with the businesses that are using our data. The inventory and DPIA together will bring a greater understanding and awareness of the way data is used by the company, prompt the right questions to be asked, and ensure security controls are rigorous. Delivering these changes will alter, for the better, the way data is seen throughout the day-to-day working of the business. And for those that don’t want to step up to the plate? Well, we can all be comforted that the fine cap has truly been blown: where warranted, a maximum of 4 per cent of global turnover could be issued. It seems the wallflower of compliance has grown its thorns.

PRE-REGISTER FOR DATA PROTECTION WORLD FORUM HERE