Communications of the ACM, November 2018, Vol. 61, No. 11 | News

Society | DOI:10.1145/3276744

Weighing the Impact of GDPR

The EU data regulation will affect computer, Internet, and technology usage within and outside the EU; how it will play out remains to be seen.

By Samuel Greengard

When the European Union (EU) General Data Protection Regulation (GDPR) went into effect on May 25, 2018, it represented the most sweeping effort yet to oversee the way businesses collect and manage consumer data. The law, established to create consistent data standards and protect EU citizens from potential privacy abuses, sent ripples—if not tidal waves—across the world.

GDPR gives European citizens greater control of their data while establishing strong penalties for businesses that do not comply. What is more, any data that involves EU citizens or touches EU companies is covered by GDPR. The initiative replaces an older data privacy framework, the Data Protection Directive (95/46/EC), which was introduced in 1995.

The implications and ramifications are enormous, and the initiative's reach is global. GDPR will change everything from the way data collection takes place to the way corporate databases are designed and used. It also will potentially change the way research and development takes place, and will impact cybersecurity practices, as well as introducing a practical array of challenges revolving around sites and repositories where groups share comments, information, and other data.

"It's a groundbreaking initiative," says Brett M. Frischmann, Charles Widger Endowed University Professor in Law, Business, and Economics at Villanova University, and Affiliate Scholar of the Center for Internet and Society at Stanford Law School. "Europe has flipped a switch and prompted reconsideration of how data can be collected, managed, and used." The EU takes the position that a person owns his or her data, and that privacy is a fundamental right that is "basic to the integrity of a human being," Frischmann adds.

Data Wars

Digital technology has inexorably changed the face of privacy. Today, there is a perception—and plenty of evidence to support it—that personally identifiable information (PII) is under assault as never before. A Pew Research Center survey found that in the U.S., 93% of adults say being in control of who can get information about them is important; 90% say controlling what information is collected is important. The figures in Europe and other parts of the world are the same.

In a 2016 interview in Recode, Europe's Competition Commissioner Margrethe Vestager said, "There is no such thing as a free lunch. You pay with one currency or another—either cents, or you pay with your data, or you pay with the advertisements that you accept. And I think people are becoming more and more aware of the fact that their personal data do have a value."

Says Alison Cool, assistant professor of anthropology at the University of Colorado, Boulder, "There are a lot of questions and ambiguities that must be addressed, but it's clear that GDPR will significantly change the data landscape."

While the U.S. and a number of other countries have adopted an opt-out approach to data collection—essentially, a consumer must instruct a company if he or she doesn't want his or her data used or shared in certain ways—Europe has implemented a more restrictive opt-in approach. However, GDPR takes this concept to a new and previously untested level. Besides gaining near-total control of their data, consumers can have their data removed from a database or online source at any time and, for those who believe they have been wronged, seek an investigation and join a class-action lawsuit.
Strict rules about how organizations collect, manage, and process data anywhere in the world are only the starting point for GDPR. It allows consumers to file complaints with each nation's national data protection authority, which will investigate the claim. A company that violates GDPR could face a fine of up to 4% of its worldwide annual revenue from the previous fiscal year. The regulation also mandates that consumers can remove themselves from a database at any time and take their data elsewhere—to a new bank, a new mobile provider, or a new content service.
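The removal and portability rights just described can be sketched as a tiny service. This is an illustrative sketch only, not real compliance tooling; the class, method names, and sample data are all invented for the example.

```python
import json

class UserDataStore:
    """Toy store illustrating GDPR-style erasure and portability."""

    def __init__(self):
        self.records = {}  # user_id -> dict of personal data

    def add(self, user_id, data):
        self.records[user_id] = data

    def export(self, user_id):
        # Data portability: hand users their data in a machine-readable
        # format they can take to a new provider.
        return json.dumps(self.records[user_id], sort_keys=True)

    def erase(self, user_id):
        # Right to removal: delete the record entirely and report
        # whether anything was actually deleted.
        return self.records.pop(user_id, None) is not None

store = UserDataStore()
store.add("alice", {"email": "alice@example.com", "plan": "mobile"})
exported = store.export("alice")  # JSON the user can take elsewhere
erased = store.erase("alice")     # True: the record existed and is gone
```

The point of the sketch is that "remove themselves at any time" is an API obligation, not just a policy statement: erasure must actually delete the record, and export must produce something portable.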
Not surprisingly, data scientists, legal experts, and others have radically different perspectives on GDPR. Says Daniel Le Métayer, Senior Research Scientist at Inria (the French Institute for Research in Computer Science and Automation) and a leading authority on data protection and privacy, "GDPR could be a great achievement if properly implemented. It could establish a more concrete framework for data use and protection and help reduce the misuse of personal information."

Adds Cool: "It potentially changes the balance of power. GDPR takes aim at the widely used model of forced consent, which is built on the idea that in exchange for various services, there is an implicit agreement to give up your personal data."
However, there are also plenty of potential pitfalls likely to result from GDPR. Le Métayer says the complexity of GDPR, and the way regulators and courts interpret some of the intentionally vague wording, could create such rigid restrictions that the initiative becomes ineffective over time.

There is also strong opposition in the corporate arena, where the focus is on profiting from data rather than stemming the wave of abuses and breaches. Attorneys such as Tanya Forsheit, partner and chair of the Privacy & Data Practice Group at New York City-based law firm Frankfurt Kurnit Klein & Selz, demonstrate the level of frustration about changes resulting from GDPR. Forsheit describes many GDPR provisions as onerous, and suggests they could be more effectively addressed through self-regulation. "It is simply not possible to be 100% compliant. GDPR forces organizations to devote significant time and expense to comply with standards that are not consistent with the way business is done online," she argues.
Data Gets Personal

To be sure, the practical challenges of complying with GDPR are significant, especially as digital technology and artificial intelligence (AI) advance.

Personal assistants such as Siri, Alexa, and Cortana add layers of complexity to the issue of PII. Robo-advisors, chatbots, recommendation services, and other automated systems introduce additional compliance challenges. All these systems collect and store data about individuals. In the past, there was no need to determine where a person lived; under GDPR, that could amount to crucial information that would need to be added to each data point related to an individual. Even human resources systems, payroll systems, and similar repositories of personal data could be significantly impacted by the regulation; all may require algorithmic auditing processes that revolve around "data protection by design."
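One common "data protection by design" tactic is to pseudonymize direct identifiers before data reaches analytics systems. The sketch below is illustrative, not a statement of what any particular company does; the secret key, field names, and record are invented, and keying the hash with a secret is what keeps the pseudonym from being reversed by simply hashing guessed emails.

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-regularly"  # invented; in practice kept in a vault

def pseudonymize(value: str) -> str:
    # Keyed hash: deterministic (so joins across tables still work),
    # but not reversible without the key.
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def strip_direct_identifiers(record: dict) -> dict:
    # Keep the analytic fields, replace the direct identifier
    # with a pseudonym before the record leaves the ingest layer.
    out = dict(record)
    out["user_id"] = pseudonymize(out.pop("email"))
    return out

raw = {"email": "bob@example.com", "country": "DE", "clicks": 17}
safe = strip_direct_identifiers(raw)
# `safe` carries no email, yet the same email always maps to the same
# pseudonym, so per-user aggregation remains possible.
```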
Companies already are voicing concerns that GDPR could inhibit innovation by limiting how data is handled in apps, databases, and online services—and how data is used for advertising and other purposes. The issue could impact autonomous vehicles, robotics, and a variety of systems that rely on AI. Organizations may ultimately need to keep two separate databases—one for the EU and one for elsewhere—or find ways to differentiate records in databases.

Milestones: Håstad Receives Knuth Prize

The 2018 Donald E. Knuth Prize has been awarded to Johan Torkel Håstad of Sweden's KTH Royal Institute of Technology for his sustained record of milestone breakthroughs at the foundations of computer science, with major impact on areas including optimization, cryptography, parallel computing, and complexity theory.

The Knuth Prize is jointly bestowed by the ACM Special Interest Group on Algorithms and Computation Theory (SIGACT) and the IEEE Computer Society Technical Committee on the Mathematical Foundations of Computing (TCMF). The Prize is named for Donald Knuth of Stanford University, the "father of the analysis of algorithms," and is bestowed in recognition of outstanding contributions to the foundations of computer science by individuals for their overall impact in the field over an extended period.

A professor of computer science at the KTH Royal Institute of Technology in Stockholm, Håstad received his bachelor's degree in mathematics from Stockholm University, his master's degree in mathematics from Sweden's Uppsala University, and his doctorate in mathematics from the Massachusetts Institute of Technology.

Håstad's works resolved long-standing problems central to circuit lower bounds, pseudorandom generation, and approximability. He also introduced transformative techniques that have fundamentally influenced much of the subsequent work in these areas.

Previous honors bestowed on Håstad include the ACM Doctoral Dissertation Award (1986), the Gödel Prize for outstanding papers on theoretical computer science (1994 and 2011), and the Göran Gustafsson Prize for outstanding achievement in mathematics.
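The idea of differentiating records by jurisdiction can be sketched as record-level tagging and routing, so one logical dataset is partitioned into EU and non-EU stores (or filtered per regime). The country list, field names, and store layout below are invented for illustration.

```python
# Abbreviated, illustrative list of EU member-state country codes.
EU_COUNTRIES = {"DE", "FR", "IT", "ES", "NL", "SE", "PL", "IE"}

def route(record: dict) -> str:
    """Decide which regime's store a record belongs in."""
    return "eu" if record.get("country") in EU_COUNTRIES else "rest"

stores = {"eu": [], "rest": []}
for rec in [{"id": 1, "country": "DE"}, {"id": 2, "country": "US"}]:
    stores[route(rec)].append(rec)
# stores["eu"] now holds the German record; stores["rest"] the U.S. one.
```

Whether an organization physically separates databases or merely tags records, the routing decision itself looks like this single-field check, which is why the per-record jurisdiction information mentioned above becomes crucial.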
In addition, GDPR might add a layer of complexity atop an already complex European privacy framework. For example, more than 2.4 million individuals have already submitted "right to be forgotten" requests so they can be expunged from Google searches. Cool says some people believe the law will "hinder innovation by making organizations more risk averse."

Depending on who opts in, who opts out, and what data appears or disappears from a database or other source, the situation could become even more problematic. As Frischmann puts it, "What happens when one person at a group meeting or part of a community invokes a privacy clause but it affects everyone?"

The greatest challenge may be ensuring companies in the EU and beyond adhere to the spirit of GDPR. Many companies lack expertise in how they will need to implement and manage data under GDPR; they also do not know the levels of expertise or staffing required to conduct crucial data protection impact assessments.

"If businesses view GDPR as a checklist activity rather than an issue that requires ethical reflection—and if they look to exploit loopholes and skirt the intent of the law—the long-term outcome could be negative," Cool says. "When you look at groups like bioethicists and physicians, the starting point for discussion is how to do the right thing for society; it's not about avoiding getting sued or how to sidestep legal and ethical provisions."
Cracking the Code on Privacy

How GDPR will play out is anyone's guess. The initiative could revolutionize the data landscape—or it may fizzle into a footnote in digital history. It could also change the way the Internet works and how data and information flow across sites, clouds, and more.

One wild card is how consumers react to GDPR. If large numbers of people revoke access to PII or challenge the way companies use their data, businesses may reach an inflection point where they will have to rethink the fundamental way they approach and navigate data management, or reevaluate the fundamental value of data and how it is monetized. GDPR also might mandate new tracking and data management tools, such as blockchain.

Le Métayer argues that businesses need to address complex issues such as conducting data protection impact assessments and implementing data portability, which requires agreeing on standard data formats. Other sources of uncertainty include the compatibility of GDPR with big data, and the rules concerning automated decision-making. Article 22 of GDPR states individuals have the right to "not be subject to a decision based solely on automated processing, including profiling." GDPR also allows consumers to contest a decision, but it is not clear what type of explanations should be provided to make this right effective. "The issue is also technical, since providing useful explanations about certain types of algorithms is a challenge in itself," he says.
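For simple linear models, a per-feature contribution breakdown is one candidate form of "useful explanation" for an automated decision; for deep models, no such clean decomposition exists, which is the technical difficulty Le Métayer alludes to. The weights, features, and threshold below are invented purely to illustrate the idea.

```python
# Illustrative linear credit-scoring model (all numbers invented).
WEIGHTS = {"income": 0.5, "debt": -0.8, "years_employed": 0.3}
THRESHOLD = 1.0

def decide_with_explanation(applicant: dict):
    # Each feature's signed contribution is the explanation: the part
    # of the decision a consumer could inspect and contest.
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    score = sum(contributions.values())
    return score >= THRESHOLD, contributions

approved, why = decide_with_explanation(
    {"income": 4.0, "debt": 1.0, "years_employed": 2.0}
)
# Contributions: income 2.0, debt -0.8, years_employed 0.6;
# the total clears the threshold, so the application is approved.
```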
GDPR could also prompt companies to pay directly for PII, Frischmann says. "If the power balance shifts and consumers gain leverage over their personal data, companies may look to provide incentives, discounts, and direct compensation for the use of data. It could flip the current model and even lead to entirely different ways to approach data," he explains.

In fact, a recent study conducted by digital marketing agency Syzygy in Germany, which polled 1,000 respondents each from the U.S., U.K., and Germany, found citizens in all three countries would sell their data for between €130 (about US$150) and €140 (US$165) per month.
One thing is certain: amid a litany of security breaches and breakdowns, from Equifax to Cambridge Analytica, there is a growing focus on data privacy. What is more, other government entities are exploring ways to control how data is collected, managed, and used. In the U.S., the State of Vermont enacted a law in May 2018 that established standards for data. California is now eyeing an initiative—the California Consumer Privacy Act—that could extend many of the same GDPR protections to the state. Other countries, from Australia to Japan, have also revised data standards and privacy controls in recent years.

Frischmann says GDPR, above all else, represents the ongoing battle between unfettered capitalism and human dignity. The whole point, he notes, is that it is not designed to be an efficient regulation for businesses. "To some extent, it's about a person's ability to exercise their own free will about their life."

Cool says that, in the end, it is vital to strike a balance between privacy and laws. "We need more research that looks carefully at how personal data is collected and by whom, and how those people make decisions about data protection. Policymakers should use such research."
An Analysis of Economic Impact on IoT under GDPR

Junwoo Seo∗, Kyoungmin Kim∗, Mookyu Park†, Moosung Park‡, and Kyungho Lee†
∗Department of Cyber Defense (CYDF), Korea University, Seoul, Republic of Korea. Email: {junuseo, richard2104}@korea.ac.kr
†Center for Information Security Technologies (CIST), Korea University, Seoul, Republic of Korea. Email: {ctupmk, kevinlee}@korea.ac.kr
‡Agency for Defense Development, Seoul, Republic of Korea. Email: [email protected]
Abstract—The EU's GDPR is expected to come into force on May 25, 2018. Under this regulation, it will be possible to enforce even stronger legislation than the existing Directive. In particular, GDPR is expected to have a major impact on the IoT industry, which uses diverse and vast amounts of personal information. This paper first examines why the IoT industry is affected by GDPR. Then, the paper describes how the GDPR will affect an IoT firm's costs qualitatively, using the cost definitions of the Gordon and Loeb model, and quantitatively, by estimating costs using statistics and legal bases. From the qualitative view, the GDPR affects the preventative cost and legal cost of the Gordon and Loeb model; the quantitative view shows that after the GDPR, a firm's cost could increase by 3 to 4 times on average, and by about 18 times in one case.

Index Terms—IoT, GDPR, Economic Impact, Gordon & Loeb Model
I. INTRODUCTION

On 14 April 2016, the European Parliament passed the General Data Protection Regulation (GDPR). This regulation ensures the free movement of personal information between EU member states while strengthening the privacy rights of data subjects. It entered into force 20 days after publication and applies directly to the member countries two years later, from May 2018. This is expected to give European citizens control over their personal information and create a high level of privacy protection in the European Union. With the introduction of GDPR, companies that deal with personal information are expected to be greatly affected. Among the various industries, it is clear that IoT will be within the orbit of GDPR because it collects and analyzes a vast amount of information from users. According to Gartner, 8.4 billion IoT "things" were forecast to be in use by 2017 [1]. These things collectively store, reprocess, and distribute more than 50 billion items of personal information. However, Statista published statistical data showing that 39% of European consumers completely disagreed that IoT manufacturers give enough information about the data they collect [2]. Given this status quo, the GDPR, which significantly transfers control of information usage to individuals, is expected to have a major impact on the IoT industry.

This paper introduces the characteristics of GDPR that differ from the Directive in Section II. In Section III, the paper examines why the GDPR will affect the IoT industry based on the characteristics of IoT. In Section IV, we analyze a firm's cost under GDPR through the Gordon & Loeb model to determine the economic impact on the IoT industry. Finally, the conclusions are presented in Section V.
II. BACKGROUND

Basically, EU legislation is divided into Directives and Regulations. A Directive lays down certain results that must be achieved, but each Member State is free to decide how to transpose it into national law. Regulations, on the other hand, have binding legal force throughout every Member State and enter into force on a set date in all the Member States [3]. As a result of the GDPR, the Data Protection Directive of 1995 (95/46/EC), which had governed the protection of personal information since 1995, will be replaced. The following are key elements that distinguish GDPR from the Directive.

First, the definition of personal information has been further expanded. According to GDPR Article 4, personal information means any information relating to an identified or identifiable natural person. In addition, consent requests must be clearly provided, and simple and easy language should be used. Also, the data subject has the right to withdraw consent at any time.
III. HOW IOT HANDLES INFORMATION

This section analyzes the way the IoT industry handles personal information and shows that the industry falls under GDPR. The following subsections describe the characteristics of personal information usage by IoT devices that could be in conflict with the GDPR.

A. Information Usage and Exchange between IoT Devices

Each endpoint of the IoT environment, the "things," sends data automatically, communicates with other endpoints, and works in conjunction with them. In IoT, there are cases where things trade and act on behalf of users. For example, if a smart fridge detects that food is scarce, it can connect to the Internet and buy food on behalf of the user. In this case, information utilization is automated and the user's information is exchanged with various parties. GDPR's controls on personal information may place restrictions on these advantages of IoT.
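Under GDPR-style consent rules, an IoT agent acting for a user would need an affirmative, withdrawable consent record before exchanging the user's data. The consent registry, purpose strings, and fridge function below are invented to illustrate the restriction, not drawn from any real system.

```python
# Illustrative consent registry: (user, purpose) -> consented?
consents = {("alice", "automated_purchasing"): True}

def withdraw(user: str, purpose: str) -> None:
    # GDPR-style rule: withdrawal must be possible at any time.
    consents[(user, purpose)] = False

def fridge_order(user: str, item: str):
    # The automated agent checks for affirmative consent before acting
    # on the user's behalf; absence of a record means no consent.
    if not consents.get((user, "automated_purchasing"), False):
        return None  # no valid consent on record: the fridge must not act
    return {"user": user, "item": item}

order = fridge_order("alice", "milk")      # allowed: consent is recorded
withdraw("alice", "automated_purchasing")
blocked = fridge_order("alice", "milk")    # refused: consent was withdrawn
```

The sketch shows why automated trading between things is constrained: every autonomous exchange of personal data needs a live consent check rather than a one-time agreement.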
B. Analysis of Information Collected from IoT

Currently, IoT manufacturers collect huge amounts of information (big data) generated from the IoT environment and research how to analyze it to better understand the behavior of systems and users. They analyze data that seem unrelated, and if they find a relationship between a consumer's behavior and usage patterns, they can deliver more value to customers and make more profit. Conversely, data delivered from a single endpoint may not cause privacy issues, but data collected and analyzed across various endpoints can become sensitive information. Therefore, this collected information can fall within the extended definition of personal information covered by GDPR, and thus within the domain of the individual's information control.

For example, the electronics company Vizio was recently charged a $2.2 million fine after using content-aware software to track users without permission [4]. The company installed software on 11 million IoT TVs that it sold to track customers' detailed viewing habits. It linked the data to specific household statistics and then sold that information to third-party marketers. Vizio argued that it never paired viewing data with personally identifiable information such as names or contact information. However, the data amounted to an analysis of personal TV-viewing habits, so it was considered sensitive information and Vizio was fined. If the GDPR had applied, the penalty could have been $292 million, more than 100 times larger than the actual ruling.
IV. THE ECONOMIC IMPACT ON IOT FIRMS UNDER GDPR

In this section, the paper analyzes the economic impact on firms qualitatively, using the cost definitions of the Gordon & Loeb model, and quantitatively, by estimating costs using statistics and legal bases.

According to the Gordon & Loeb model, the amount of damage can be classified as follows: direct costs, indirect costs, explicit costs, and implicit costs [5]. First, direct cost refers to the amount of damage directly caused by a specific infringement incident, that is, the value of hardware or software lost due to an accident. On the other hand, indirect cost is the cost incurred to prevent information security breaches in advance. Next, explicit costs mean all costs that are explicitly visible due to a particular breach. This includes investment made in advance to prevent infringement, the amount of damage caused by infringement, and all costs to recover from that damage. Implicit cost is not the damage caused by the infringement incident itself, but the cost of circumstances that may arise thereafter. This includes, for example, the cost of legal liability for an infringement incident, as well as a decline in stock value or sales due to a reputation decline at the affected company. Using the model, this paper examines which of Gordon & Loeb's defined costs are expected to change due to GDPR regulation.

Fig. 1. Gordon & Loeb model costs that GDPR affects.

According to Article 82 (Right to Compensation and Liability), any person who has suffered material or non-material damage due to a violation of GDPR rules has the right to demand compensation for damages. In particular, the GDPR differs from the Directive, which mentions only damages, in that both pecuniary and non-pecuniary losses can be compensated.

In this regard, Article 83 sets out the general conditions for imposing administrative fines. According to Article 83, administrative fines are not applied automatically and will be charged in each individual case. Therefore, it is not possible to measure the fines exactly, due to the case-by-case imposition and the absence of verdicts, but we can track the increment of certain cost factors through the definitions of the Gordon & Loeb model. Here, Articles 82 and 83 cause an increment in the legal cost described in the Gordon & Loeb model.
Furthermore, Articles 37, 38, and 39 describe the designation, status, and duties of the DPO (Data Protection Officer), respectively. The controller and the processor should designate a data protection officer in the case of public authorities, large-scale regular and systematic monitoring of data subjects, or large-scale processing of sensitive information or criminal history. DPOs should also have an in-depth understanding of GDPR, expertise in national privacy laws, and an understanding of personal information processing tasks. Therefore, these Articles will affect the preventative cost, because the designation of DPOs is mandatory and their qualifications must be proven.

In conclusion, GDPR affects these two costs of the Gordon & Loeb model. In order to examine the economic impact of GDPR, the estimated cost of damage before GDPR and the estimated cost of damage after GDPR must be compared. According to the Ponemon Institute, in 2016 the world's average number of breached records per incident reached 24,089 [6]. Based on this research, we select four personal data breach cases of average size to analyze how GDPR affects the IoT industry. First, the paper estimates how much each of the four cases, regardless of GDPR, caused in losses to a firm. According to the Ponemon Institute, the average per capita cost of a data breach over the last four years is $150 [6]. Based on this research, an estimated value of the loss in each of the four cases can be derived. To see how GDPR affects these losses, the paper then analyzes them assuming that the cases fall under GDPR. As shown in Figure 1, the two component costs that GDPR affects were derived: the legal cost and the preventative cost. Because the cases are of average size, the paper assumed that each violation of GDPR would be considered a lesser incident and fined either €10 million or 2% of the firm's global turnover, whichever is greater. Considering each of these costs, Figure 2 shows how disastrous the estimated firm's cost is.

Fig. 2. Comparison of a firm's cost before and after GDPR.

Unlike the rest of the cases, where a firm's cost increases by 3 to 4 times, the cost in case B is expected to rise about 18 times. This is due to the characteristic of GDPR that fines are determined based on the company's annual turnover for the previous year. Thus, a company with a large annual turnover could be fined well in excess of €10 million.
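The comparison in this section can be reproduced as a back-of-the-envelope script. The $150 per-record cost, the roughly 24,000-record average, and the €10 million / 2%-of-turnover fine tier come from the text above; the sample turnover is an invented input, and currency differences between the dollar figures and the euro fine tier are ignored here, as a simplification.

```python
PER_RECORD_COST = 150  # USD, Ponemon per-capita breach cost cited above

def gdpr_fine(annual_turnover: float) -> float:
    # Lower fine tier used in the paper: 10 million or 2% of global
    # annual turnover, whichever is greater.
    return max(10_000_000, 0.02 * annual_turnover)

def cost_before(records_breached: int) -> int:
    # Pre-GDPR estimate: per-record cost times breached records.
    return records_breached * PER_RECORD_COST

def cost_after(records_breached: int, annual_turnover: float) -> float:
    # Post-GDPR estimate: same breach loss plus the administrative fine.
    return cost_before(records_breached) + gdpr_fine(annual_turnover)

# Invented example: ~24,000 breached records (near the 2016 average
# cited above) at a firm with 200 million in annual turnover.
before = cost_before(24_000)             # 24,000 * 150 = 3,600,000
after = cost_after(24_000, 200_000_000)  # 3,600,000 + 10,000,000
ratio = after / before                   # within the 3-4x range reported
```

A firm whose 2%-of-turnover figure exceeds 10 million would see the fine, and hence the ratio, grow with turnover, which is the mechanism behind the roughly 18x jump in case B.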
V. CONCLUSION

The regulatory scope of GDPR and its impact are increasing tension among firms: 52% of firms are concerned that the GDPR will result in fines for them, 65% think they will change their business strategy, and 30% think the GDPR will increase their annual budgets by more than 10% by the time it is implemented [7], [8]. Amid such tension and concern, this paper first showed that the IoT industry is under the influence of GDPR, in Section III. In Section IV, the paper described how the GDPR will affect an IoT firm's cost qualitatively, using the cost definitions of the Gordon & Loeb model, and quantitatively, by estimating costs using statistics and legal bases. Although the analysis is limited by the available data, it is meaningful that the economic impact of legislative change on an industry can be analyzed from two perspectives (qualitative and quantitative), making it possible to identify which industries are vulnerable to changes in legislation.

ACKNOWLEDGMENT

This work was supported by the Defense Acquisition Program Administration and the Agency for Defense Development under contract UD060048AD.
REFERENCES

[1] R. van der Meulen, "Gartner says 8.4 billion connected 'things' will be in use in 2017, up 31 percent from 2016." [Online]. Available: http://www.gartner.com/newsroom/id/3598917
[2] Statista, "Level of agreement regarding Internet of Things (IoT) manufacturers sufficiently informing consumers about information the devices can collect in Europe in 2016." [Online]. Available: https://www.statista.com/statistics/609021/trust-in-iot-device-manufacturers-eu/
[3] The European Parliament, "Regulation (EU) 2016/679 of the European Parliament and of the Council," Official Journal of the European Union, 2016.
[4] J.-M. Franco, "The Internet of Things and the threat it poses to GDPR compliance." [Online]. Available: https://www.talend.com/blog/2017/04/03/internet-things-threat-poses-gdpr-compliance/
[5] L. A. Gordon and M. P. Loeb, "The economics of information security investment," ACM Transactions on Information and System Security (TISSEC), vol. 5, no. 4, pp. 438–457, 2002.
[6] Ponemon Institute, "2017 cost of data breach study: United States," 2017.
[7] A. Rodger, "Analyst opinion: GDPR will force changes in strategy." [Online]. Available: https://www.ovum.com/analyst-opinion-gdpr-will-force-changes-in-strategy/
[8] D. T. Ford and S. Qamar, "Seeking opportunities in the Internet of Things (IoT): A study of IT values co-creation in the IoT ecosystem while considering the potential impacts of the EU General Data Protection Regulation."
GDPR Impact on Computational Intelligence Research

Keeley Crockett¹, Sean Goltz², Matt Garratt³
¹School of Computing, Mathematics and Digital Technology, Manchester Metropolitan University, Manchester, M1 5GD, UK, [email protected]
²Business & Law School, Edith Cowan University, Perth, Australia, [email protected]
³School of Engineering and IT, University of New South Wales, PO Box 7916, Canberra BC 2610, ACT 2902, Australia, [email protected]
Abstract— The General Data Protection Regulation (GDPR) becomes a legal requirement from 25th May 2018 for all organizations in Europe which collect and process data. One of the major changes, detailed in Article 22 of the GDPR, is the right of an individual not to be subject to automated decision-making, which includes profiling, unless explicit consent is given. Individuals who are subject to such decision-making have the right to ask for an explanation of how the decision was reached, and organizations must use appropriate mathematical and statistical procedures. All data collection, including that within research projects, requires a privacy-by-design approach, and the data controller must complete a Data Protection Impact Assessment in addition to gaining ethical approval. This paper discusses the impact of the GDPR on research projects which contain elements of computational intelligence undertaken within a University or with an Academic Partner.
Keywords— GDPR, profiling, automated decision-making, computational intelligence
I. INTRODUCTION
In a society governed by rules attempting to protect personal privacy, people are generally unaware of how much of their personal information is collected and of the ways in which this data will be manipulated and used. Historically, when consent was requested in order to collect a subject's data, the request was often lengthy and shrouded in legal terminology. This usually resulted in the consent form not being read in detail, questioned, or even understood by the data subject. For example, a person may give consent whilst not really knowing to what use their data will be put or how their data will be manipulated.
Concerns about privacy are of particular importance to those in the computational intelligence (CI) field. CI encapsulates “the theory, design, application, and development of biologically and linguistically motivated computational paradigms emphasizing neural networks, connectionist systems, genetic algorithms, evolutionary programming, fuzzy systems, and hybrid intelligent systems in which these paradigms are contained” [1]. In conducting research within the field, Dunis et al. [2] highlight some of the known scientific difficulties, such as overfitting, feature selection, interoperability and parameter tuning. More recently, the data-related challenges (velocity, variety, volume, veracity, and value) associated with applying CI algorithms to Big Data have added further complexity to the development of new and existing CI applications. Such challenges can make CI systems difficult to set up, risk skewing the results, or make the results that emerge difficult to interpret.
Sandvig et al. [3] ask, ‘Can an Algorithm be Unethical?’. Models generated by CI algorithms are derived from an underlying set of data upon which decisions are formulated.
The UK Information Commissioner’s Office (ICO) has stated that individuals are “very likely” to ask for explanations of decisions when applying for credit or insurance, and possibly in recruitment decisions [7]. Given that the definition of explainable decisions, and where they might apply, is still vague, Dinsmore [8] argues that the requirement may force data scientists to stop using techniques such as deep learning, where decisions are more difficult to explain and interpret. Organizations that violate the GDPR can be fined up to 4% of their annual global turnover or €20 million, whichever is greater [6]. In light of this new legislation, current and future CI-focused research conducted in universities will need close examination and scrutiny against the appropriate legislation.
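The fine cap under Article 83(5) is the greater of the two figures quoted above. A minimal sketch (the function name and turnover figures are ours, for illustration only):

```python
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound on a GDPR Article 83(5) fine: the greater of
    EUR 20 million or 4% of annual worldwide turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# A firm with EUR 1 billion turnover is capped at EUR 40 million;
# a firm with EUR 100 million turnover is still capped at EUR 20 million.
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
print(gdpr_max_fine(100_000_000))    # 20000000.0
```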
The concept of Responsible Research and Innovation (RRI) [9], a growing area particularly within the EU, offers potential solutions to workplace bias and is being adopted by several research funders, such as the UK's Engineering and Physical Sciences Research Council (EPSRC) [10], which includes RRI core principles in its mission statement. RRI is an umbrella concept that draws on classical ethics theory to provide tools for addressing ethical concerns from the outset of a project (the design stage onwards). Quoting Von Schomberg, “Responsible Research and Innovation is a transparent, interactive process by which societal actors and innovators become mutually responsive to each other with a view to the (ethical) acceptability, sustainability and societal desirability of the innovation process and its marketable products (in order to allow a proper embedding of scientific and technological advances in our society)” [11]. Through recent consultations, individual countries are putting the ethics of artificial intelligence at the core of technological developments; for example, at the World Economic Forum in Davos (January 2018), the British Prime Minister outlined plans for a national center for ethics in artificial intelligence [12].
The IEEE is leading the global initiative in establishing a set of societal and policy-driven guidelines for the use and impact of autonomous and intelligent systems. Version 2 of the IEEE Ethically Aligned Design document [13] was released in December 2017 and contains a chapter on personal data and individual access control. Working groups for four IEEE standards on data privacy have been set up. The initiative suggests using a “personalized privacy AI” agent to act as a broker between each individual and all other entities wishing to access their private data. The idea of a broker is to help deal with the fragmentation of data across the numerous organizations that might use it, and to help individuals make informed decisions on how to share their data whilst navigating the complex consequences of sharing with so many potential organizations. The agent could provide advice on which types of data can be shared, track what permissions have been granted, and check the compliance of the receiving organizations. Development of such an agent, and of policies on how it might be used, could be a very useful area of research. Principle 4 of the IEEE Ethically Aligned Design document [13] also highlights that transparency in decision making is important because it builds human trust in the system; it is designed to protect the vital interests of the data subject. Where the system makes a wrong decision, the internal processes that the autonomous system followed will need to be explainable. As the IEEE global initiative gains prominence, the Ethically Aligned Design document will become an essential resource for CI practitioners. Privacy by design is a legal requirement under the GDPR.
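The broker idea above can be pictured as a small permission ledger. The following is a hypothetical sketch, not an IEEE-specified design; the class, organization names and data categories are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyAgent:
    """Hypothetical 'personalized privacy AI' broker: it records which
    organization has been granted consent for which categories of
    personal data, so the individual can audit and revoke sharing."""
    grants: dict = field(default_factory=dict)  # org -> set of categories

    def grant(self, org: str, category: str) -> None:
        self.grants.setdefault(org, set()).add(category)

    def revoke(self, org: str, category: str) -> None:
        self.grants.get(org, set()).discard(category)

    def may_access(self, org: str, category: str) -> bool:
        return category in self.grants.get(org, set())

agent = PrivacyAgent()
agent.grant("travel-site.example", "location")
print(agent.may_access("travel-site.example", "location"))  # True
agent.revoke("travel-site.example", "location")
print(agent.may_access("travel-site.example", "location"))  # False
```

A real broker would additionally have to track purposes, retention periods and onward transfers, which is where the research questions raised above begin.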
For researchers conducting projects with a CI focus undertaken within or with a University or Academic Partner, the challenge is in understanding the impact of the GDPR on their research. Clearly, in an international context, it is not only those in the European Union, or in countries which adopt similar regulation, who will need to be GDPR ready. This paper discusses the impact of the GDPR on the CI elements of research projects undertaken within or with a University or Academic Partner. Whereas academics may be pioneers in their specific fields, they may have less knowledge of privacy-by-design approaches, data protection impact assessments, and what the GDPR actually means for their research. The paper ends with a set of brief recommendations that CI researchers should consider at the start of every research project.
This paper is organized as follows: Section II provides an overview of the legislation regarding the GDPR and profiling; Section III reviews some example profiling systems based on computational intelligence approaches; Section IV examines the role of privacy by design from the perspective of undertaking research projects within a university; and finally, Section V makes recommendations for researchers working in the field of computational intelligence.
II. GDPR: PROFILING AND AUTOMATED DECISION MAKING
Article 4(4) of the GDPR defines what forms of data processing may be considered “profiling”. This covers any form of automated processing of personal data in which the data are used to evaluate certain personal aspects relating to a natural person, for example analyzing or predicting “aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements” [6]. Recital 71 provides a lengthy definition of what is meant by the term profiling [6], especially in relation to any personal aspect “concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her”. Article 22 of the GDPR concerns the rights of an individual when interacting with systems that may automatically make a decision about, or profile, them in any way to which they have not given consent [6]. The main principle of Article 22 is: “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her” [6]. In any aspect of automated decision making, the individual has the right to ask for human intervention and for an explanation of how the machine-based decision has been reached, through disclosure of “the logic involved” (Article 13 [6]). Recital 71 states that the data controller should use appropriate mathematical and statistical procedures for profiling, and that data should be accurate in order to minimize the risk of errors. On the subject of automated profiling, the UK law firm Wright Hassall [14] recommends that companies avoid automated processing of any sensitive personal data unless the explicit informed consent of the individual is first obtained.
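The Article 22 logic above can be sketched as a guard around an automated decision. This is an illustrative reading only, not legal advice; the function names and the 0.5 threshold are invented, and the other Article 22(2) exceptions (contractual necessity, authorization by law) are omitted for brevity:

```python
def automated_decision_allowed(explicit_consent: bool,
                               significant_effect: bool) -> bool:
    """Hedged reading of Article 22: a decision based solely on
    automated processing that legally or similarly significantly
    affects the subject requires explicit consent (other Art. 22(2)
    exceptions omitted here)."""
    return explicit_consent or not significant_effect

def decide(score: float, explicit_consent: bool, human_review):
    """Route to a human reviewer whenever a solely automated,
    significant decision is not permitted."""
    if automated_decision_allowed(explicit_consent, significant_effect=True):
        return score >= 0.5, "automated"   # hypothetical threshold
    return human_review(score), "human"

# Without consent the decision is made by the (stub) human reviewer.
print(decide(0.8, explicit_consent=False, human_review=lambda s: s >= 0.9))
# (False, 'human')
```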
The GDPR also requires organizations to conduct a Data Protection Impact Assessment (DPIA) specifically where profiling and/or automated decision making is utilized: “A data protection impact assessment should also be made where personal data are processed for taking decisions regarding specific natural persons following any systematic and extensive evaluation of personal aspects relating to natural persons based on profiling those data or following the processing of special categories of personal data, biometric data, or data on criminal convictions and offences or related security measures” ([6], Recital 91). Conducting a DPIA in the context of CI research is discussed further in Section IV.
CI-based systems have historically been used for profiling and automated decision-making. Examples include the profiling of users within intelligent tutoring systems to recommend learning activities using fuzzy trees [15], and the personalization of learning using neural networks to automatically detect and monitor learners’ comprehension from their non-verbal behaviour [16]. Zheng et al. [17] utilized a fuzzy deep learning approach to profiling airline passengers for “classifying normal passengers and potential attackers…” [17]. Angelos et al. [18] used a fuzzy c-means clustering algorithm to detect abnormalities in customer energy consumption profiles within power distribution systems [18]. The new legislation within the GDPR suggests that CI researchers need to re-examine the legalities of profiling systems, especially a) the ability to communicate effectively that a person is to be profiled and what this profile will be used for, and b) the ability of the CI system to explain the logic behind a person being assigned a particular profile.
III. COMPUTATIONAL INTELLIGENCE DECISION-BASED SYSTEMS
A. Overview
It has been argued that as CI gets more complex, our ability to understand, and therefore guide, how it makes decisions decreases. Moreover, it may come to pass that these machines will need to be maintained by other specialist machines, further removing humanity from the equation [19]. The list of existing and potential CI applications is limitless, yet for each application that involves decision making, researchers must consider the ethical, moral, social and legal implications of the system making a decision and ask: Does a human have the final say? Can the system give a reason why the decision was made? Is the decision itself explainable to a member of the public who will use the system?
Let us consider self-driving cars. On the positive side, while self-driving cars will lack the innate morality of humans, they might also lack the distractions (unless these are programmed into the cars themselves). That is to say, an autonomous vehicle will not be thinking about what to cook for dinner or how an upcoming interview will go, which could lead to fewer accidents overall. Conversely, self-driving cars come with the risks inherent in a non-sentient being in control of a dangerous object. On the one hand, no one wants to be in a situation where the car will decide to avoid danger to others at all costs, but neither will society allow the manufacture of vehicles that seek only their own protection [20]. The classic tension in this space is exemplified by the oft-cited “trolley problem”, whereby a decision needs to be made whether to take a positive action and kill someone or do nothing and risk killing several people [21]. Some countries have already acted: the UK's Department for Transport has issued a set of “Key principles of vehicle cyber security for connected and automated vehicles”, to which manufacturers must adhere for such cars to drive on UK roads [22].
Another CI application area with huge ethical dilemmas is autonomous weapons systems (AWS). If (and when) drones are used in combat situations, how are we to be sure that they correctly distinguish the target from innocent parties? This is particularly relevant in the case of a learning or evolutionary intelligence, which may change the way things are viewed over time. The IEEE Global Initiative has identified eleven issues concerning AWS in a document currently open for public discussion [13], each having a set of detailed candidate recommendations covering areas such as the predictability of a weapon system, the use of learning and adaptive learning algorithms, and the ability of humans to exercise meaningful control. All require the AWS to be able to clearly explain their reasoning and decisions. Can AWS ever be made safe? The Future of Life Institute created a short film [23] which was shown at the United Nations Convention on Conventional Weapons in November 2017, hosted by the Campaign to Stop Killer Robots. This film clearly raises the issue of whether these systems cross the moral line of deciding who lives and who dies. Another risk lies not so much with the ethics of the machines as with nefarious individuals who seek to exploit them. Just as current machines can be hacked, one cannot rule out the possibility that future ones might be hacked as well, with potentially calamitous results.
B. Case Study: Automated Deception Detection
Automated deception detection systems (ADDS) detect whether an individual is lying by taking measurements of that individual’s behavior. Systems such as the polygraph, which intrusively detects lies by measuring physiological changes related to stress in the body during an interview, have been around since the 1920s [24]. Ultimately, a trained polygraph examiner decides whether the physiological changes indicate truthful or deceptive behavior; such examinations are widely accepted by the public and used in courts of law (although the parties’ consent is usually required). In contrast, Silent Talker [25, 26] is an ADDS which uses computational intelligence, specifically artificial neural networks (ANNs), to judge participants’ deception based on microgestures (small facial and other movements). Silent Talker utilizes up to 40 channels of facial nonverbal behavior, which it extracts from static or live-feed video; it processes and detects the behaviour of channel objects, e.g. the left eye, and feeds all information into a final classifier to give a probability of deception or truthfulness. This system has been shown in laboratory conditions to achieve an accuracy of up to 87% [25]. So how can Silent Talker’s decision be explained? Whether the final classifier is ANN based or decision-forest based determines whether some explanation is possible, but would it actually be meaningful to a human being, especially if it amounts to a numerical representation of the behaviour of a facial feature such as an eye? It is also common for the number of rules generated by decision trees in such a domain to run into the thousands, making human comprehension difficult. This raises the question of how such complex CI systems can provide an adequate explanation that satisfies the GDPR requirement. Moreover, for CI systems operating in the security field, disclosing how a decision is made could raise the risk of the system being spoofed.
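The comprehension problem can be made concrete with a back-of-the-envelope count: a complete binary decision tree of depth d encodes 2^d if-then rules, so depth alone quickly pushes rule counts past what a person can audit. A minimal sketch (the function is ours, for illustration):

```python
def rule_count(depth: int) -> int:
    """A complete binary decision tree of the given depth has
    2**depth leaves, i.e. 2**depth distinct if-then rules."""
    return 2 ** depth

# Ten binary splits already exceed a thousand rules; with up to 40
# behavioural channels, realistic trees can be far deeper.
print(rule_count(10))  # 1024
```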
iBorderCtrl [27, 28] is a system currently under development whose prime aim is to enable faster and more thorough border control for third-country nationals crossing the land borders of EU Member States, in line with the current status quo of Schengen border management. The system will feature an ADDS based on a near real-time Silent Talker for pre-arrival border-style crossing interviews of travelers. The system, funded by an EU H2020 grant (2016-2019) [28], involves a partner institution expert in legal informatics, data protection, data security and ethics, who will lead the EU-wide legal review and the tasks associated with legal and ethical compliance. However, institutions such as universities will also have their own standard operating procedures and processes for conducting ethical research and for data governance, which may differ legally depending on the country’s own legislation. In addition, some institutions may put their initial effort into ensuring compliance with GDPR principles that align more directly with the Data Protection Act 1998, rather than examine the newer provisions regarding profiling, the right to refuse an automated decision, and the right to an explanation, which may require changes to R&D protocols and physical systems in the ways that they conduct business.
IV. DATA PRIVACY IMPACT ASSESSMENTS AND ETHICAL CONSIDERATIONS
A) Data Privacy Impact Assessments
Kamarinou et al. [29] argue that if personal data were anonymized and then used to target individuals or infer some behavior about them for automated decision-making, this would be considered “incompatible with the purpose for which the data were originally collected”. Data Privacy Impact Assessments (DPIAs) are fundamental to a privacy-by-design approach, which is mandatory under the GDPR. The DPIA is designed as a tool that allows organizations to “comply with their data protection obligations and meet individuals’ expectations of privacy” [30, 31]. The DPIA contains a description of the data processing operations and their purpose, an internal assessment of the necessity and proportionality of the processing in relation to that purpose, and a risk assessment for individuals, along with measures to address the risks, such as consideration of all issues relating to data security.
Consider a research project which spans academics and industry across several countries. The project lead will adopt one DPIA format, whilst partner organizations will each have their own in-house format. In the UK, the ICO [32] provides suggestions on the content of a DPIA, but states that organizations can adopt their own content as long as it is compliant with the requirements. The first stage comprises a set of screening questions (Figure 1) which determine whether a DPIA is necessary [32]. If a DPIA is deemed necessary, a series of six steps is required to complete it (Figure 2), followed by a further section in which the DPIA is linked to the data protection principles.
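Since a 'yes' to any screening question indicates that a DPIA is necessary, the screening stage can be encoded directly. The question wording below is abbreviated from the ICO list, and the helper names are invented for illustration:

```python
# The eight ICO screening questions (Figure 1), abbreviated to keys;
# answering 'yes' to any indicates that a DPIA is necessary.
SCREENING_QUESTIONS = (
    "collects new information about individuals",
    "compels individuals to provide information",
    "discloses information to parties without prior routine access",
    "uses information for a new purpose or in a new way",
    "uses potentially privacy-intrusive technology (e.g. biometrics)",
    "makes decisions with significant impact on individuals",
    "handles information likely to raise privacy concerns",
    "contacts individuals in ways they may find intrusive",
)

def dpia_required(yes_answers: set) -> bool:
    """A DPIA is needed if any screening question is answered 'yes'."""
    return any(q in yes_answers for q in SCREENING_QUESTIONS)

# An ADDS-style project plainly answers 'yes' to the biometrics question.
project = {"uses potentially privacy-intrusive technology (e.g. biometrics)"}
print(dpia_required(project))  # True
print(dpia_required(set()))    # False
```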
The challenge for a CI researcher lies in having sufficient depth of knowledge of the GDPR articles to complete an effective DPIA. For example, consider the case study of the ADDS described in Section III. Under Step three (identify the privacy and related risks), the compliance risk associated with each privacy issue should be recorded in terms of non-compliance with the Data Protection Act and with specific articles of the GDPR. One might consider that, as ADDS provides a deception risk score determined by a hierarchy of neural networks, individuals cannot get an explained decision on how the ADDS artificial neural network classifiers obtained this score from their nonverbal behaviour during an automated pre-travel interview. This initially suggests non-compliance with GDPR Article 22, “the right not to be subject to a decision based solely on automated processing” [33]. But ADDS is part of a larger border control system in which the ultimate decision, whether to “go” straight through border control or “proceed to a second-line check”, is made by combining the results of multiple sub-systems [28]. Does ADDS alone, then, need to meet this GDPR requirement, given that it does not make the overall system’s final decision, which would be the one “based solely on automated processing”? However, as ADDS does make a decision on the deception risk element by effectively profiling an individual, that individual will have the right, at the informed consent stage, to choose whether or not to undertake a pre-arrival border-style crossing interview.
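The combination of sub-system results described above might be sketched as follows; the sub-system names, scores, equal-weight average and threshold are all invented for illustration and are not taken from the iBorderCtrl design:

```python
def border_decision(subsystem_risks: dict, threshold: float = 0.5) -> str:
    """Hypothetical sketch: each sub-system (e.g. the ADDS deception
    risk score) contributes one risk value; only the combined result
    decides between 'go' and a second-line check, where a human
    officer makes the final judgement."""
    combined = sum(subsystem_risks.values()) / len(subsystem_risks)
    return "second-line check" if combined >= threshold else "go"

print(border_decision({"adds": 0.8, "documents": 0.6, "watchlist": 0.4}))
# second-line check
print(border_decision({"adds": 0.3, "documents": 0.1, "watchlist": 0.0}))
# go
```

The point of the sketch is the Article 22 question in the text: no single sub-system output, ADDS included, determines the outcome on its own.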
Fig. 1. DPIA screening questions [32]
Fig. 2. Steps for completion of a DPIA [32]
B) Research Ethics
Organizations have long-established principles, policies and procedures which guide the researcher in conducting ethically aligned research in accordance with the law. Universities will have a clear set of guidelines that address how researchers and research organizations should conduct themselves when working with participants and their personal data (including personal artifacts) [34]. Recently, companies have begun creating specific ethics research units as part of their R&D in the field of computational intelligence; e.g. DeepMind, owned by Google, launched a research unit called DeepMind Ethics & Society in late 2017 [35]. Mustafa Suleyman, co-founder of DeepMind, reported: “A tech company that applies its technology without due consideration for ethical and social implications is destined to be a bad tech company” [35]. As other companies follow suit, this is a clear indicator that ethically aligned design will come to be the norm rather than the exception. Indeed, “Ethics is an essential element of good research governance” [34].
Traditionally, the process of making an ethics application for a research project has involved the research lead completing a number of checklists and forms, e.g. an ethics checklist, an ethics application, a research insurance form, a health and safety risk assessment form, and, in some projects, a security-sensitive information form. The application as a whole is then reviewed by an internal ethics panel, where it may be approved, or further meetings with the research team may be required for clarification. Some universities have begun adopting an electronic ethics process using systems such as the Ethics Online System (EthOS) to establish clear workflows and responsibilities. However, the questions asked on an ethics application can overlap with those on the DPIA. For example, in identifying any ethical issues, the researcher would have to detail how the data would be secured to ensure protection of a participant’s confidentiality. In addition, a clear description and justification must be given if any sensitive data are to be collected, such as age, colour, race/ethnicity, disablement, religion, etc.
The challenge for the CI researcher is having the knowledge, and the time, to carry out these processes effectively within the organization. For example, challenges can occur when a research project conducted within a university is part of a larger international consortium-based project, where effectively the DPIA and the ethics application must be undertaken twice to meet the legal requirements of different countries. Secondly, there is a terminology barrier for those who review the ethics application (often from a multidisciplinary area) and who may be unfamiliar with the terminology and methodologies used in CI research. In such situations, the ethical considerations can only be communicated to and understood by all parties through additional meetings. Hence, it is crucial that specific time be built into all projects to go through the ethical approval process.
V. RECOMMENDATIONS AND CONCLUSIONS
This paper has discussed the impact of the GDPR on research projects which contain elements of computational intelligence. It has attempted to raise awareness of what this legislation will mean for CI systems that profile and/or produce automated decisions, and of the expectation that decisions made by such systems should be explainable. Although the GDPR is a European regulation, research and collaboration are often undertaken within an international community, which implies that, for data sharing, there may well be a need for international compliance. On the basis of our limited experience so far, we present the following recommendations:
• CI researchers should gain familiarity with the principles of the IEEE Ethically Aligned Design document V2 [13].
• Ensure the local research team has had appropriate training in the GDPR [6, 7, 31, 33, 36, 37].
• Adopt a privacy-by-design approach at the start of the research project, and build privacy and data protection into the research proposal or any knowledge transfer partnerships with industry [32].
• Ensure the legal responsibility between project partners is clear and well defined.
• Conduct an initial ethical review at the project proposal stage. This will naturally lead to the identification of data privacy issues, risks, and security (including cybersecurity of data).
• Conduct a Data Protection Impact Assessment in phases, where appropriate at the same time as the initial ethical review [30].
• Build specific time (as a specific task) into any grant application for conducting a DPIA and an ethical review.

Fig. 1 content, the DPIA screening questions [32]:
1. Will the project involve the collection of new information about individuals?
2. Will the project compel individuals to provide information about themselves?
3. Will information about individuals be disclosed to organisations or people who have not previously had routine access to the information?
4. Are you using information about individuals for a purpose it is not currently used for, or in a way it is not currently used?
5. Does the project involve you using new technology that might be perceived as being privacy intrusive? For example, the use of biometrics or facial recognition.
6. Will the project result in you making decisions or taking action against individuals in ways that can have a significant impact on them?
7. Is the information about individuals of a kind particularly likely to raise privacy concerns or expectations? For example, health records, criminal records or other information that people would consider to be private.
8. Will the project require you to contact individuals in ways that they may find intrusive?

Fig. 2 content, the steps for completion of a DPIA [32]:
Step one: Identify the need for a PIA
Step two: Describe the information flows
Step three: Identify the privacy and related risks
Step four: Identify privacy solutions
Step five: Sign off and record the PIA outcomes
Step six: Integrate the PIA outcomes back into the project plan
Whilst all current and new CI research projects must seek to conform to the GDPR, draft guidance from European regulators suggests that this “data protection by design” approach should be extended to existing systems within three years [31]. The impact of this in terms of time will be significant, and it will require the establishment of specialized teams with extensive knowledge of both the GDPR and the field of computational intelligence.
The GDPR is a warranted step towards the much-needed protection humans require in the midst of the artificial-intelligence-driven fourth industrial revolution. In the context of computational intelligence and academic research, it remains to be seen whether implementation of, and compliance with, the relevant GDPR principles will provide a desirable outcome. The interface between regulation and complex systems, especially autonomous systems, is highly challenging and requires new and innovative approaches. Developing new research projects with the GDPR's main goal in mind (protecting human subjects) should be the overarching principle in any related future academic research.
REFERENCES
[1] Scope of Computational Intelligence, IEEE Computational
Intelligence society, [online], Available:
http://cis.ieee.org/field-of-
interest.html. [Accessed 23/12/2017].
[2] Dunis, C, Likothanassis, S, Karathanasopoulos, A,
Sermpinis, G, &
Theofilatos, K (eds) (2014), Computational Intelligence
Techniques
for Trading and Investment, Taylor and Francis, Florence.
Available
from: ProQuest Ebook Central. [Accessed 26/0/2017].
[3] Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C.
63. (2016).
When the Algorithm Itself Is a Racist: Diagnosing ethical harm
in the
basic components of software. International Journal of
Communication, 10, pp. 4972–4990.
[4] Larson, J. Angwin, J., & Parris Jr., T. (2016), [online], How
Machines
Learn to be Racist. ProPublica. Available at:
https://www.propublica.org/article/breaking-the-black-box-how-
machines-learn-to-be-racist?word=Trump [Accessed
19/10/2017]
[5] Lee, P. Learning from Tay’s Introduction (2016), [online],
available:
https://blogs.microsoft.com/blog/2016/03/25/learning-tays-
Introduction/, Date Accessed [2/1/2017]
[6] The GDPR Portal (2017), [online]. Available at:
https://www.eugdpr.org/ Accessed [21/12/2017].
[7] Information commissioner’s Office (2017), Guide to the
General Data
Protection Regulation (GDPR) [online]. Available at:
https://ico.org.uk/media/for-organisations/guide-to-the-general-
data-
protection-regulation-gdpr-1-0.pdf Accessed [21/12/2017]
[8] Dinsmore, T. How GDPR Affects Data Science (2017),
[online], ,
Available at: https://thomaswdinsmore.com/2017/07/17/how-
gdpr-
affects-data-science/ Accessed [20/12/2017]
[9] Burget, M., E. Bardone, and M. Pedaste. “Definitions and
Conceptual
64. Dimensions of Responsible Research and Innovation: A
Literature
Review.” Science and Engineering Ethics 23, no. 1 (2016):
pp.1–9.
[10] EPSRC, Engineering and Physical Sciences Research
Council.
[online], Available at https://www.epsrc.ac.uk/ [Accessed
22/01/2018].
[11] Von Schomberg, R. (2011), Prospects for Technology
Assessment in
a Framework of Responsible Research and Innovation, in
Technikfolgen Abschätzen Lehren: Bildungspotenziale
Transdisziplinärer Methode, Wiesbaden, Germany: Springer VS,
pp.
39–61.
[12] PM's Speech at Davos 2018: 25 January. [online] Available at: https://www.researchprofessional.com/0/rr/news/uk/politics/whitehall/2018/1/May-seeks-consensus-on-AI-s-benefits-and-limitations-.html?utm_medium=email&utm_source=rpMailing&utm_campaign=personalNewsDailyUpdate_2018-01-25#sthash.mvPeuPmd.dpuf [Accessed 28/1/2018].
[13] Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems, Version 2, The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems (2017). [online] Available at: https://ethicsinaction.ieee.org/ [Accessed 23/12/2017].
[14] GDPR – Individuals' Rights (2017), Wright Hassall. [online] Available at: https://www.wrighthassall.co.uk/knowledge/legal-articles/2017/11/21/gdpr-individuals-rights/ [Accessed 28/1/2018].
[15] Wu, D., Lu, J. and Zhang, G. (2015), A Fuzzy Tree Matching-Based Personalized E-Learning Recommender System. IEEE Transactions on Fuzzy Systems, 23(6), pp. 2412–2426.
[16] Holmes, M., Latham, A., Crockett, K. and O'Shea, J. (2017), Near Real-Time Comprehension Classification with Artificial Neural Networks: Decoding e-Learner Non-Verbal Behaviour. IEEE Transactions on Learning Technologies, vol. PP, no. 99, DOI: 10.1109/TLT.2017.2754497.
[17] Zheng, Y., Sheng, W., Sun, X. and Chen, S. (2017), Airline Passenger Profiling Based on Fuzzy Deep Machine Learning. IEEE Transactions on Neural Networks and Learning Systems, 28(12), pp. 2911–2923.
[18] Angelos, E., Saavedra, O., Cortés, O. and Nunes de Souza, A. (2011), Detection and Identification of Abnormalities in Customer Consumptions in Power Distribution Systems. IEEE Transactions on Power Delivery, 26(4), pp. 2436–2442.
[19] Heron, M. and Belford, P. (2015), Fuzzy Ethics: Or How I Learned to Stop Worrying and Love the Bot. ACM SIGCAS Computers and Society, 45(4), pp. 4–6.
[20] Baum, S., Social Choice Ethics in Artificial Intelligence. [online] Available at SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3046725 [Accessed 1/10/17].
[21] Santoni de Sio, F. (2017), Ethical Theory and Moral Practice, 20: 411. https://doi-org.ezproxy.waikato.ac.nz/10.1007/s10677-017-9780-7 [Accessed 1/10/17].
[22] The Key Principles of Vehicle Cyber Security for Connected and Automated Vehicles (2017). [online] Available at: https://www.gov.uk/government/publications/principles-of-cyber-security-for-connected-and-automated-vehicles/the-key-principles-of-vehicle-cyber-security-for-connected-and-automated-vehicles [Accessed 22/1/2018].
[23] Ban on Killer Robots Urgently Needed, Say Scientists (2017). [online] Available at: https://www.theguardian.com/science/2017/nov/13/ban-on-killer-robots-urgently-needed-say-scientists [Accessed 22/1/2018].
[24] International League of Polygraph Examiners (2018), Polygraph/Lie Detector FAQs. [online] Available at: http://www.theilpe.com/faq_eng.html [Accessed 5/1/18].
2018 International Joint Conference on Neural Networks
(IJCNN)
[25] Rothwell, J., Bandar, Z., O'Shea, J. and McLean, D. (2006), Silent Talker: A New Computer-Based System for the Analysis of Facial Cues to Deception. Applied Cognitive Psychology, 20(6), pp. 757–777.
[26] Silent Talker Ltd. [online] Available at: https://www.silent-talker.com/ [Accessed 5/1/18].
[27] Crockett, K.A., O'Shea, J., Szekely, Z., Malamou, A., Boultadakis, G. and Zoltan, S. (2017), Do Europe's Borders Need Multi-Faceted Biometric Protection? Biometric Technology Today, 2017(7), pp. 5–8. ISSN 0969-4765.
[28] iBorderCtrl Intelligent Portable Control System. [online] Available at: http://www.iborderctrl.eu/ [Accessed 12/1/2018].
[29] Kamarinou, D., Millard, C. and Singh, J. (2017), Machine Learning with Personal Data: Profiling, Decisions and the EU General Data Protection Regulation. To appear, Journal of Machine Learning Research. [online] Available at: http://www.mlandthelaw.org/papers/kamarinou.pdf [Accessed 12/1/2018].
[30] Information Commissioner's Office (2014), Conducting Privacy Impact Assessments: Code of Practice. [online] Available at: https://ico.org.uk/media/for-organisations/documents/1595/pia-code-of-practice.pdf [Accessed 21/12/2017].
[31] Cormack, A. (2017), Preparing for the GDPR – A Guide for Universities. [online] Available at: http://universitybusiness.co.uk/Article/preparing-for-the-gdpr [Accessed 21/12/2017].
[32] ICO – Privacy by Design (2017). [online] Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/privacy-by-design/ [Accessed 22/01/2018].
[33] Art. 22 GDPR: Automated Individual Decision-Making, Including Profiling (2017). [online] Available at: https://gdpr-info.eu/art-22-gdpr/ [Accessed 21/12/2017].
[34] Ethics and Governance, Manchester Metropolitan University (2018). [online] Available at: https://www2.mmu.ac.uk/research/staff/ethics-and-governance/ethics/ [Accessed 28/1/2018].
[35] DeepMind. [online] Available at: https://deepmind.com/ [Accessed 28/1/2018].
[36] ICO Highlights that GDPR Requires Businesses to Understand and Explain the Rationale of Decisions Taken by Machines (2017). [online] Available at: https://www.out-law.com/en/articles/2017/march/ico-highlights-that-gdpr-requires-businesses-to-understand-and-explain-the-rationale-of-decisions-taken-by-machines/
[37] Article 29 Working Party (2013), Advice Paper on Essential Elements of a Definition and a Provision on Profiling within the EU General Data Protection Regulation. [online] Available at: http://bit.ly/2fIei8K [Accessed 28/1/2018].