On Thursday 10 November 2016, in The Hague, the Netherlands, Trilateral Research carried out an Ethics and Privacy Impact Assessment (EPIA) workshop as part of the iTRACK project. iTRACK will create an open-source real-time tracking and threat detection system providing intelligent decision support to civilian humanitarian missions, for better protection and more efficient and effective operations.
iTRACK Project
Agenda for the day
Time            Activity
9.30 – 9.45     Registration and Coffee
9.45 – 11.15    Introduction & Information Flows
11.15 – 11.30   Coffee
11.30 – 12.30   What are Privacy and Ethical Risks
12.30 – 13.30   Lunch
13.30 – 14.45   Identifying Risks: Group Work
14.45 – 15.00   Coffee
15.00 – 17.00   Risk Likelihood, Solutions, Conclusion
WHAT are we doing today?
Ethics and Privacy Impact Assessment Workshop
WHERE ARE WE NOW?
What? We are developing a human-centred technology that takes into account the actual real-world practices of humanitarian aid workers, and provides policies for better protection and a more effective and efficient response.
How? By building an integrated, intelligent real-time tracking and threat identification system.
Is there a potential problem? The system could be intrusive, and people could be suspicious of it.
What could it be? A system that improves protection and improves efficiency.
WHY?
iTRACK’s nature
H2020
User expectations
Marketable
WHO for?
End user
Technology developers
Research
WHAT is the output of today?
Deliverable 3.1
Future Technology
METHODOLOGY
1. Preparation
2. Interviews and mapping of information flows
3. Stakeholder workshop
4. Mapping out the risks and solutions
5. Review of ethical and privacy assessments by an independent third party
6. Publication
EXAMPLE QUESTIONS…
Please describe the system/technology that you are developing.
Are there alternatives to the technology that are less intrusive?
Could the technology affect vulnerable groups?
What personal information is collected?
Where will the information collected be stored?
Can users decline to use the technology?
TECHNOLOGY AND INFORMATION FLOWS
Understanding the information flows involved in a project is essential to a proper assessment.
System of systems
Information Flows – SOCIAL SENSE (K-NOW)
REMOVED: CONFIDENTIAL
Information Flows – STAFF SENSE (K-NOW)
REMOVED: CONFIDENTIAL
Information Flows – ON BOARD LOCALISATION AND TRACKING IN VEHICLES (TREELOGIC)
REMOVED: CONFIDENTIAL
Information Flows – RFID & GPS SENSORS (TREELOGIC)
REMOVED: CONFIDENTIAL
Information Flows – 360-DEGREE PANORAMA CAMERA (TEKNOVA)
REMOVED: CONFIDENTIAL
Information Flows – SECURE COMMUNICATION (Teleplan)
REMOVED: CONFIDENTIAL
Information Flows – REAL TIME THREAT DETECTION AND SUPPORT (UiA)
REMOVED: CONFIDENTIAL
Information Flows – BACKEND SERVER (TREELOGIC)
REMOVED: CONFIDENTIAL
Information Flows – iTRACK INTEGRATED SYSTEM (INTRASOFT)
REMOVED: CONFIDENTIAL
RISKS
We need to identify privacy risks
We need to identify ethical risks
Principles that will guide this analysis: ETHICS
Ethics, based on the ECHR, the UDHR and other legal instruments.
Look out for: harm, invasion of boundaries, invalidity, lack of trust, undermining of human dignity, breaching autonomy…
Principles that will guide this analysis: PRIVACY
There is no overarching definition of the term; however, in iTRACK we take a broad view of privacy that goes beyond traditional notions of personal information.
Principles that will guide this analysis: PRIVACY
Privacy of the person
Privacy of behaviour and action
Privacy of communication
Privacy of data and image
Privacy of thoughts and feelings
Privacy of location and space
Privacy of association
EXAMPLE OF A RISK
There is a risk of reputational damage to iTRACK due to loss of personal and potentially sensitive data, because the server is insecure and a hacker breaks into the iTRACK system.
EXAMPLE OF A RISK
There is a risk that a humanitarian worker may be discriminated against on the grounds of his religion, because he forgets to turn off his tracking device, is traced to a place of worship, and a nosy manager looks at where workers go after work.
EXAMPLE OF A RISK
There is a risk of loss of personal data, and thus a regulatory fine and loss of reputation, because employees do not know about privacy and data protection and leave phones with iTRACK apps lying around with no password.
• Group 1: Social Sense (social media collection) & Staff Sense (tracking, communication & health sensors)
• Group 2: On board localisation and tracking in vehicles
• Group 3: Tracking of assets: RFID
• Group 4: Real time threat detection and support
• Group 5: iTRACK Backend Server
• Group 6: 360-degree panorama camera (position and risk detection)
GROUP EXERCISE
Group Names
Group 1: Alexander Verbraeck, Hayley Watson, Paul Crompton, Katrina Petersen, Vita Lanfranchi
Group 2: Yan Wang, Moran Naor, Gerardo Glorios, Anne-Laure Duval
Group 3: Julia Muraszkiewicz, Christian Baumhauer, Yngvar Sørensen, Tunca Tabaklar, Mehdi Ben Lazreg
Group 4: Jacinto Esteban, Gyöngyi Kovacs, Tina Comes, Anders Rye
Group 5: Heide Lukosch, Christos Pateritsas, Lars Hamre, Hlekiwe Kachali
Group 6: Philipp Schwarz, Sofia Tsekeridou, Lisa Maria Svendsen, Morten Goodwin
DISCUSSION – WHAT RISKS DID YOU IDENTIFY?
Group 1
Group 2
Group 3
Group 4
Group 5
Group 6
• The aim of this task is to assess the value of each risk and whether it is acceptable. The decisions are based on the framework of:
– severity of the feared event: what would be the consequence of the feared event happening?
– likelihood of the feared event: how likely is it to occur?
RISK MAP
RISK MAP
Rating 1
  Severity – Minor: the risk poses little potential for loss.
  Likelihood – Minor: given the current state of technology, the risk is very unlikely to occur.
Rating 2
  Severity – Moderate: the risk requires significant resources to take place, with significant potential for loss; or the risk requires few resources to take place, with moderate potential for loss.
  Likelihood – Moderate: given the current state of technology, the risk may occur.
Rating 3
  Severity – High: the risk requires few resources to take place, with significant potential for loss.
  Likelihood – High: given the current state of technology, the risk is very likely to occur.
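The two 1–3 scales above can be combined into a single risk score (severity × likelihood) for ranking risks on the map. A minimal sketch in Python; the function name, rating labels and the idea of multiplying the ratings are illustrative assumptions, not part of the iTRACK methodology:

```python
# Illustrative risk scoring based on the 1-3 severity and likelihood
# ratings above; combining them by multiplication is an assumption
# made for this sketch, not the project's prescribed method.
SEVERITY = {"minor": 1, "moderate": 2, "high": 3}
LIKELIHOOD = {"minor": 1, "moderate": 2, "high": 3}

def risk_score(severity: str, likelihood: str) -> int:
    """Combine the two ratings into a single score from 1 to 9."""
    return SEVERITY[severity] * LIKELIHOOD[likelihood]

# Example: a high-severity but minor-likelihood risk
print(risk_score("high", "minor"))  # -> 3
```

Scoring like this only helps compare risks against each other; the acceptability decision still follows the severity/likelihood rules discussed in the workshop.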
EXAMPLE OF A RISK
There is a risk of reputational damage to iTRACK due to loss of personal and potentially sensitive data, because the server is insecure and a hacker breaks into the iTRACK system.
EXAMPLE OF A RISK: LOSS OF PERSONAL DATA
Security safeguards are in place, so the likelihood is low.
The severity is high (personal data loss = data breach = fines & reputation loss = significant impact for the individual and the organisation).
We need to think about solutions.
DISCUSSION
Please present two risks, with their severity and likelihood.
Group 1
Group 2
Group 3
Group 4
Group 5
Group 6
• Risks with a high severity and likelihood absolutely must be avoided or reduced by implementing measures that reduce both their severity and their likelihood.
• Risks with a high severity but a low likelihood must be avoided or reduced.
• Risks with a low severity but a high likelihood must be reduced by implementing security measures that reduce their likelihood.
• Risks with a low severity and likelihood may be accepted, especially since the treatment of other risks should also lead to their treatment.
SOLUTIONS
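The four treatment rules above amount to a small decision table over the severity and likelihood ratings. A hedged sketch in Python, assuming the 1–3 scales from the risk map; the function name and the threshold treating a rating of 2 or more as "high" are assumptions for this sketch:

```python
def treatment(severity: int, likelihood: int) -> str:
    """Map 1-3 severity/likelihood ratings to a treatment decision,
    following the four rules above. Treating a rating >= 2 as 'high'
    is an assumption made for this sketch."""
    high_sev = severity >= 2
    high_lik = likelihood >= 2
    if high_sev and high_lik:
        return "avoid or reduce (both severity and likelihood)"
    if high_sev:
        return "avoid or reduce"
    if high_lik:
        return "reduce likelihood"
    return "may accept"

print(treatment(3, 1))  # -> avoid or reduce
print(treatment(1, 1))  # -> may accept
```

In practice the decision for borderline ratings would be a group judgment in the workshop rather than a mechanical lookup.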
Risk | Solution(s) | Result (avoided, reduced, minimised or transferred)
Reputational damage due to loss of personal and potentially sensitive data | Add an extra protection layer to the server, e.g. SSL | Minimised
Workers tracked and hurt because the tracking app was left unattended | Ensure devices are password protected | Minimised
SOLUTIONS
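A risk–solution table like the one above is easy to keep as a small machine-readable risk register. A minimal sketch; the `RiskEntry` type and its field names are hypothetical, not part of any iTRACK deliverable:

```python
from dataclasses import dataclass

# Hypothetical risk-register entry mirroring the solutions table above;
# the class and field names are assumptions for this sketch.
@dataclass
class RiskEntry:
    risk: str
    solutions: list[str]
    result: str  # avoided, reduced, minimised or transferred

register = [
    RiskEntry(
        risk="Reputational damage due to loss of personal and potentially sensitive data",
        solutions=["Add an extra protection layer to the server, e.g. SSL"],
        result="minimised",
    ),
    RiskEntry(
        risk="Workers tracked and hurt because the tracking app was left unattended",
        solutions=["Ensure devices are password protected"],
        result="minimised",
    ),
]

# Flag any entries whose result is not one of the four recognised outcomes
outcomes = {"avoided", "reduced", "minimised", "transferred"}
open_items = [e for e in register if e.result not in outcomes]
print(len(open_items))  # -> 0
```

Keeping the register in a structured form makes it straightforward to re-check risk treatments as the EPIA is reviewed.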
GROUP EXERCISE
Risk | Solution(s) | Result (avoided, reduced, minimised or transferred)
DISCUSSION
What solutions did you identify?
Group 1
Group 2
Group 3
Group 4
Group 5
Group 6
NEXT STEPS
EPIA
TRI's next steps
For tech partners…
For end user…
Other tasks…
THANK YOU
JULIA MURASZKIEWICZ & INGA KROENER
JULIA.MURASZKIEWICZ@TRILATERALRESEARCH.COM