This is the slide deck of the workshop held on April 2, 2019 in Brussels, titled "Towards Value-Centric Big Data". It includes the presentations given by the speakers.
2. Workshop Agenda – Morning session
10.00 Opening and introduction
10.10 Privacy-enhancing technologies, data sharing and ethics - Saila Rinne, Programme Officer Data Policy and
Innovation Unit, European Commission DG Connect
10.20 “You’re monitoring my what…?!” - Balancing privacy against enhanced security outcomes - Duncan Brown,
EMEA Chief Security Strategist - Forcepoint
10.35 e-SIDES: The community paper on RRI guidelines for big data: why we need it and how we can work together
10.45 A “win-win” initiative for value-centric big data and safeguarding “privacy and ethical values” in the PSPS
domain: AEGIS project - Marina Da Bormida, R&I Lawyer and Ethics Expert
11.00 Safe and secure data marketplaces for innovation: SAFE-DEED - Alessandro Bruni, Legal Researcher KU Leuven
11.15 The dangers of tech-determinism: Demystifying AI and reclaiming the future – Barbara Giovanelli, Policy
Officer, Digital Ethics, EPDS European Data Protection Supervisor
11.30 Responsible Research: Analytics when dealing with personal and personalised mobility data: Track&Know -
Prof. Ansar Yasar, University of Hasselt
11.45 Privacy Preserving Technologies - An outlook – Tjerk Timan, Policy Analyst, TNO / BDVA
12.00 Networking Lunch
3. Workshop Agenda – Afternoon session
13.00 Knowledge café: towards responsible big data
15.00 Knowledge café reporting to the plenary and open discussion
16.00 End of workshop
11. TRUSTED USERS PUTTING ORGANISATIONS AT RISK
ACCIDENTAL INSIDER
- Inadvertent Behaviors: poorly communicated policies and user awareness
- Broken Business Process: data where it shouldn’t be, not where it should be
MALICIOUS INSIDER
- Rogue Employee: leaving the company, poor performance review
- Criminal Actor Employees: corporate espionage, national espionage, organized crime
COMPROMISED INSIDER
- Malware Infections: phishing targets, breaches, BYOD contamination
- Stolen Credentials: credential exfiltration, social engineering, device control hygiene
19. e-SIDES: Ethical and Societal Implications of Data Sciences
https://e-sides.eu/collaborative-platform/
RRI principles for privacy-enhancing technologies: advice and recommendations on the best principles for the development of privacy-preserving technologies.
Online collaborative space
December 2019
20. Research agenda
ICT 2018 – Vienna, 5 December 2018
Relevance of the community position paper for the future research agenda
Building the next Research Agenda:
- Connect to our e-SIDES platform
- Engage with the Big Data and Data Science Community
- Contribute to our RRI recommendations
- Shape the priorities for research
- Contribute to the development of the next Framework Programme
- Know the topics of the Work Programme in advance
41. This project has received funding from the European Union’s Horizon 2020
research and innovation programme under grant agreement No 825225
Safe and secure data marketplaces for innovation
02.04.2019
54. The dangers of tech-determinism:
Demystifying AI and reclaiming the future
Barbara Giovanelli
Policy Officer, Digital Ethics
European Data Protection Supervisor
56. AI & GDPR
Lawfulness, fairness and transparency of processing
Purpose limitation
Data minimisation
Storage limitation
Controllership and accountability
Right to be informed
Right not to be subject to a decision based solely on automated processing
...
personal data: “any information relating to an identified or identifiable natural person”
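The GDPR principles listed above (data minimisation, purpose limitation) can be made concrete in code. A minimal illustrative sketch follows, assuming a hypothetical analytics use case: all field names (`user_id`, `age_band`, `trip_count`) and the `ALLOWED_FIELDS` set are invented for the example and are not part of the GDPR, the e-SIDES materials, or any standard.

```python
# Illustrative only: applying data minimisation and pseudonymisation
# to a record of personal data before analytics. Field names are
# hypothetical examples.
import hashlib

# Fields a hypothetical, clearly stated analytics purpose actually needs
# (purpose limitation: collect nothing beyond this).
ALLOWED_FIELDS = {"age_band", "region", "trip_count"}

def minimise(record: dict, purpose_fields: set) -> dict:
    """Keep only fields needed for the stated purpose and
    pseudonymise the direct identifier."""
    out = {k: v for k, v in record.items() if k in purpose_fields}
    if "user_id" in record:
        # Pseudonymisation: replace the identifier with a one-way hash.
        # Note: under the GDPR, pseudonymised data is still personal data.
        digest = hashlib.sha256(str(record["user_id"]).encode()).hexdigest()
        out["pseudo_id"] = digest[:12]
    return out

raw = {"user_id": 42, "name": "Alice", "age_band": "30-39",
       "region": "BE", "trip_count": 17}
print(minimise(raw, ALLOWED_FIELDS))
```

The point of the sketch is that minimisation is a design decision taken before processing, not a clean-up step afterwards: the name and raw identifier never enter the analytics pipeline at all.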
57. “Not everything that is legally compliant and technically feasible is
morally sustainable.”
Giovanni Buttarelli, European Data Protection Supervisor,
at ICDPPC 2018
59. The myth of an alleged “artificial intelligence”
leads us to ignore the limitations of big data analytics.
60. “The result produced by the machine, using more and more
sophisticated software ... has an apparently objective and
incontrovertible character to which a human decision-maker may attach
too much weight, thus abdicating his own responsibilities.”
European Commission, Amended proposal for Data Protection Directive 1992
64. “COMPAS is an algorithm widely used in the US to guide sentencing by predicting the
likelihood of a criminal reoffending. In perhaps the most notorious case of AI prejudice,
in May 2016 the US news organisation ProPublica reported that COMPAS is racially
biased. According to the analysis, the system predicts that black defendants pose a
higher risk of recidivism than they do, and the reverse for white defendants. Equivant,
the company that developed the software, disputes that.
It is hard to discern the truth, or where any bias might come from, because the algorithm
is proprietary and so not open to scrutiny. But in any case, if a study published in
January this year is anything to go by, when it comes to accurately predicting who is
likely to reoffend, it is no better than random, untrained people on the internet.”
New Scientist, April 2018
67. “Let's face it, technological determinism is terrible politics.”
Matthew Taylor, December 2018, Wired
68. “When individuals are treated not as persons but as
mere temporary aggregates of data processed at an
industrial scale, they are, arguably, not fully respected,
neither in their dignity nor in their humanity.“
EDPS Ethics Advisory Group, 2018
69. “Ethics is not supposed to be easy. Ethics is not there to be convenient.
Ethics is there to challenge your views and notions and convictions on a
daily basis. Ethics is nothing static, it needs to be agile because as a
society we are constantly changing.”
Sandra Wachter, EDPS #DebatingEthics Podcast
73. Good AI to be competitive in the global marketplace,
or good AI for the sake of good AI?
74. “Privacy is the legal recognition of the resistance or reticence to
behaviour steered or induced by power. From this point of view, privacy
in a constitutional democratic state represents a legal weapon against the
development of absolute balances of power, again proving privacy's
essential role in such a state.”
Serge Gutwirth & Paul De Hert, 2006
75. Big Data Analytics & GDPR
Lawfulness, fairness and transparency of processing
Purpose limitation
Data minimisation
Storage limitation
Controllership and accountability
Right to be informed
Right not to be subject to a decision based solely on automated processing
...
personal data: “any information relating to an identified or identifiable natural person”
76. What is our vision and how can we use tech to get there, and not let tech drive it?
104. e-SIDES: Ethical and Societal Implications of Data Sciences
Towards Value-Centric Big Data
Connect People, Processes and Technology
Knowledge Café Session
105. Four stations – e-SIDES team
Station 1: Develop a common ethical and legal framework for responsible innovation by privacy-preserving technologies
Station 2: Define design requirements for big data solutions that lead to a more responsible use of big data
Station 3: How to embed accountability, transparency and responsibility in company processes?
Station 4: What about the business? How to balance business and ethical objectives?