TILBURG UNIVERSITY
TILBURG LAW SCHOOL
TILBURG INSTITUTE FOR LAW, TECHNOLOGY AND SOCIETY (TILT)
STEFANO FANTIN
(ANR. 428819)
ENCRYPTION WITHIN LAW ENFORCEMENT INVESTIGATIONS
STATE OF PLAY OF THE CURRENT DEBATE
Master Thesis LL.M. Law & Technology
Supervisor: Lorenzo Dalla Corte
Table of Contents
Acknowledgments...................................................................................................................................3
1. INTRODUCTION ..............................................................................................................................4
1.1. Framing the research question .....................................................................................................4
1.2. Methodology and purpose............................................................................................................5
2. ENCRYPTION TODAY, THE GOLDEN AGE OF SURVEILLANCE...........................................8
2.1. Introduction......................................................................................................................................8
2.2 Importance of encryption for human rights and civil liberties......................................................8
2.3. State of play of the Going Dark debate......................................................................................13
3. TECHNICAL APPROACHES.........................................................................................................19
3.1 Introduction.................................................................................................................................19
3.2. Backdoors ..................................................................................................................................19
3.2.1. A semantic issue..................................................................................................................21
3.2.2. Backdoors vs front doors....................................................................................................23
3.3 Key-escrow solutions..................................................................................................................24
4. JUDICIAL AND INVESTIGATIVE APPROACHES.....................................................................32
4.1 Introduction.................................................................................................................................32
4.2 Mandatory key disclosure warrants ............................................................................................33
4.3 The UK 2016 Investigatory Powers Act and the decryption order.............................................39
4.4 Investigative approaches.............................................................................................................42
4.4.1 Social engineering................................................................................................................43
4.4.2 Live forensics.......................................................................................................................46
5. CONCLUSIONS...............................................................................................................................50
BIBLIOGRAPHY.................................................................................................................................54
Acknowledgments
First and foremost, I would like to express my sincere gratitude to my supervisor Lorenzo Dalla
Corte. The completion of this Thesis would not have been possible without his extensive
support and excellent guidance. I also sincerely thank the Data Protection Office, the European
Counter-Terrorism Centre, the European Cybercrime Centre of Europol, as well as the IT
Policy Sector of the European Data Protection Supervisor. My period of internship in such
organizations has highly enriched this Thesis with a profound insight of the analysed topic. I
am extremely grateful to Jan Ellermann, who has closely followed my path and helped me
understand the value of this research. A thank you to Lodewijk van Zwieten, Camille Antunes,
Desislava Borisova and Sabine Berghs, for having supported this work with interest, priceless
expertise and friendly advice. Last but not least, a big thank you to Emeline and to my friends for supporting me, always and unconditionally: guai se non ci foste stati (I would have been lost without you).
This Thesis is dedicated to my parents, Eleonora and Paolo.
1. INTRODUCTION
1.1. Framing the research question
Following the Snowden revelations, a renewed fear of being observed indiscriminately by Big Brother reignited the public debate, lowering the level of trust between citizens and law enforcement. The watchdogs, accountable for ensuring our safety and security, have been at the heart of a storm surrounding the mass collection of personal data in the years following 9/11. This fear played a role in the decision of many communication providers to step back from cooperating actively and closely with the intelligence community1. It would be redundant to go through all the recent episodes of a saga that repeats itself like a historical cycle: the so-called ‘crypto war’, the name given to the attempt by the U.S. Government (and others after it) to limit access to cryptographic systems2 in order to facilitate law enforcement or intelligence operations, is now taking place again, after more than twenty years. On top of the growing discussion on the collection of metadata, the second crypto war has created a situation in which public opinion places itself between the perpetrator and the investigator, in a debate that often invokes privacy as the ultimate safeguard against unauthorized intrusions. The social legitimation of conducting effective intelligence and law enforcement operations is being increasingly challenged, and the legal implications are numerous. This master’s Thesis builds upon the above-mentioned premise and discusses the most feasible and lawful policies that could be adopted to overcome encryption when intercepting a communication, or when retrieving stored data post-mortem. In doing so, the research question explores the most recent arguments put forward in support of each of the analysed solutions. The purpose of encryption can be seen as twofold: its protective use can turn it into a crime facilitator, but it can undoubtedly also be an instrument necessary to protect an idea or an opinion.
Approaching this from another perspective, encryption’s double connotation raises the
following inherent contradiction: despite being a useful tool to secure private data transfers, it
1 For instance, by implementing robust end-to-end encryption in their services or by raising privacy concerns when asked to provide technical assistance and backdoors in various court cases, often involving terrorism-related crimes.
2 By means of key-escrow systems or backdoors, extensively analysed further on.
creates a formidable obstacle to third-party access to plaintext, lawful or otherwise. This is extremely frustrating for the law enforcement and intelligence communities.
This paradox is highlighted by the growth of forms of crime that use the online world either as an instrument to commit an offence or as the venue of a networked crime. Perpetrators are very much aware of what encryption allows, as the success of their illegitimate activities confirms.
The following analysis explores the state of play of the various legal hypotheses set forth to overcome encrypted data in the course of an investigation. One of the purposes behind the research question is to assess the real need for law enforcement to solve this issue, and the admissibility and accountability of the evidence gathered within the overall forensic panorama, in light of policing that is compliant with human rights.
1.2. Methodology and purpose
The encryption debate is currently examined from various perspectives. This Thesis
focuses on the latest developments of some of the most discussed policies supported by law
enforcement and intelligence communities when overcoming encrypted data.
After a horizontal overview of the topic, three different paths to evaluate the research
question become apparent. The first deals with the concept of encryption, its functioning and
the solutions therein. Analysing this tool from a legal point of view can help us understand its
significance from a human rights perspective, as well as clarifying what the concept of ‘going
dark’ entails. A second way of tackling this topic would be with a strictly regulatory approach,
trying to define a comparative landscape, taking into consideration the common and
harmonized standards involving several legal systems. Lastly, an alternative approach could be pursued, paying attention to the best ways to collect evidence by implementing substitutive methodologies aimed at circumventing encryption rather than tackling it upfront (offering both technical and non-technical solutions).
This Thesis will primarily adopt the first approach. By initially evaluating encryption
conceptually, we will be able to acknowledge the two sides of the coin. Once these are assessed,
the reader will understand the main principles to take into account throughout the dissertation.
We will begin by appraising the importance of encryption for the protection of human rights.
Following this, we will explore the recent trends of the so-called going-dark debate, analysing
new developments of the widely-criticized method of weakening encryption.
The following chapters will consider some of the main solutions discussed to overcome
encryption, namely backdoors, key escrow, mandatory key disclosure orders, social engineering
and live forensic tools. The purpose is to demonstrate, by highlighting each of the latest
developments, whether the law enforcement community is currently discussing a balanced
approach to tackle this issue. The conclusion will give a consistent answer to this question,
taking into account the various hypotheses that have been examined. A look towards the future
will be fundamental, both from a legal and a practical point of view. For the former, it will be interesting to evaluate the dichotomy ‘privacy vs. security’: with this, we will be able to understand how this formula is out-of-date and misleading, both theoretically and concretely3. As for the technical development of encryption, some questions will have to be answered: what do we expect the landscape to look like? Will other mechanisms be in place?
The added value that this research offers builds upon the paradigm used for the analysis. Drawing on Lessig’s prominent theories4 on how the regulation of the internet is evolving, this thesis aims to assess how the different constraints of cyberspace interact with each other. Generally speaking, this starting point will be a valuable tool in providing a broader evaluation of how different actors are developing their own positions in the
encryption debate. Ultimately, the aim is to demonstrate that the law, among these regulatory modalities, has not yet reached a stable approach. The balance between the law and
other constraints is nevertheless fundamental in order to engage in a proper debate to overcome
this moment of regulatory impasse. The importance of this assessment is confirmed by the fact
that the statement ‘data equals power’ has never been so true.
In the course of this analysis, there will be recurring elements to take into account,
starting with the role of private parties. The relationship between the law enforcement
community and the hi-tech industry is key. Considerations around the reasons why this
relationship is currently being tested will give us a better idea of how these players are expected
to act and react in the future.
Secondly, the research intends to explore the core of the dispute as it stands. The
purpose is to determine whether the debate is keeping us away from what the discussion should
be focusing on, namely the real need for law enforcement to find a prompt solution for the
issues deriving from widespread encryption. Indeed, although there undoubtedly are
3 H. Nissenbaum, ‘Where Computer Security Meets National Security’, Ethics and Information Technology (2005) 7:61–73, <https://www.nyu.edu/projects/nissenbaum/papers/ETINsecurity.pdf> accessed 15.10.2016.
4 L. Lessig, ‘The Laws of Cyberspace’, Taiwan Net ’98 Conference, <https://cyber.harvard.edu/works/lessig/laws_cyberspace.pdf> accessed 22.09.2016.
difficulties, a global and detached approach should help us appraise the factual value of intercepting plaintext, and the real risk of jeopardizing a related investigation if such interception is made impossible by encryption technologies. As such, it will be relevant to highlight the grounds on which the problem is currently being addressed, in order to identify common ground (if any) and accepted parameters that should be considered when tackling it.
The research question will give us the possibility to conduct an evaluative exercise,
starting with a descriptive analysis of the latest developments on the subject matter. These
reflections will have to be understood as a common ground for broader assessments, which will
take into consideration a number of primary sources, alongside official reports by international
and European organisations or private stakeholders. Moreover, attention will be paid to the
evolution of the jurisprudence, in order to provide the research with a practical narrative on
how encryption has recently been dealt with by national and trans-national jurisdictions. Lastly,
this work intends to consider two other fundamental sources: legal doctrine and other academic
sources, which will help us support a conceptual analysis carried out by extrapolating some of
the main principles, and face-to-face interviews with experts, prosecutors and practitioners,
necessary to understand how the same concepts are interpreted by actors who are confronted
with them on a daily basis.
Whilst not losing sight of the legal features this work engages with, a multi-disciplinary approach is essential in order to reach a critical perspective on the issue at stake. Since this research takes into consideration some technical elements of encryption technologies, it will be fundamental to consider some of the para-legal aspects that are inherently related. A legal scholar is now increasingly asked to draw on a variety of parallel disciplines to reach a fuller understanding of the argument examined. This analysis nevertheless aims to reflect upon the impact of encryption on modern society, fleshing out the different views that the various actors involved hold on the same issue.
2. ENCRYPTION TODAY, THE GOLDEN AGE OF SURVEILLANCE
2.1. Introduction
This chapter aims to dive into the twofold character5 of encryption. The starting point of this research is to analyse further in depth the state of play across current interpretations of this concept. On the one hand, encryption is a technology of the utmost importance for businesses and citizens, for their interests but, most of all, for the protection of their rights. It is now recognized as a vehicle for securing our thoughts online, as confirmed by David Kaye6, and the upcoming obligations under EU law demonstrate the attention of policy makers to safeguarding such a practice. Moreover, the following section will underline that in certain circumstances encryption protects individuals from state overreach, where laws and their enforcement create a ‘chilling effect’ on subjects and their freedom of expression, straining constitutional concepts such as the rule of law and the democratic state.
On the other hand, this chapter explores how and to what extent law enforcement authorities are struggling against the wall that such a tool erects around people’s electronic communications. The mainstream terminology used (‘going dark’) will have to be analysed in light of an impartial assessment of the problem at stake. It is indeed acknowledged that end-to-end or built-in encryption represents an obstacle to accessing such data, but the dimension of this issue may have been overstated, and ordinary citizens’ perception of it may have been distorted.
2.2 Importance of encryption for human rights and civil liberties
Today, encryption7 represents an advantageous tool for our globalized and digitalized third-millennium society. Implications from economic and trade perspectives are largely recognized8. From a fundamental rights perspective, encryption is an essential way to protect
5 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 12.
6 UN Special Rapporteur for the right to freedom of expression and opinion.
7 For the purpose of this Thesis, encryption will mainly be considered as follows: end-to-end encryption (referring to encrypted information in transit) and built-in encryption (relating to data stored on a certain device or in the cloud).
8 See also R. Hagemann and J. Hampson, ‘Encryption, Trust, and the Online Economy: An Assessment of the Economic Benefits Associated With Encryption’, Niskanen Centre, 1-25, 9.11.2015, <https://niskanencenter.org/wp-content/uploads/2015/11/RESEARCH-PAPER_EncryptionEconomicBenefits.pdf>, accessed 30.11.2016, or also D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF - Information Technology and Innovation Foundation, March 2016, pp. 1-50, <http://www2.itif.org/2016-unlocking-encryption.pdf>, accessed 15.09.2016.
information when sent through the networked environment. Furthermore, the protection of human rights fully applies to the digital world9.
Our assessment should start by referring to the previous topical considerations: there is
much more of a connection between these statements than we might expect. This paragraph
tries to link them together with the aim of clarifying the question on the actual impact of
encryption on the protection of human rights and civil liberties.
The use of encryption is now widespread, and can be identified within the six clusters or areas into which ENISA’s taxonomy divides its deployment: e-mail, e-commerce, internet services, the darknet, new open-source tools (see Open Whisper Systems) and next-generation telephone networks (NGN)10. It is of primary importance to start from the view of many policy makers, legislators and IT experts11, who integrate encryption into the concept of privacy by design12; according to Bart Preneel and Seda Gürses, this instrument can be interpreted as aimed at ‘the protection of information and computation using mathematical techniques’13.
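Preneel and Gürses’ definition can be made tangible with a minimal sketch. The snippet below is an illustrative aside, not part of the legal analysis: it uses only the Python standard library, the function and variable names are this author’s own, and it shows a one-time-pad style cipher, the simplest mathematical technique that renders information unreadable to anyone who lacks the key.

```python
# Illustrative sketch: a one-time-pad style XOR cipher using only the
# Python standard library. Deployed systems rely on vetted algorithms
# (e.g. AES), but the principle is the same: without the key, the
# ciphertext carries no recoverable information.
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # One-time pad: the key must be as long as the message.
    assert len(key) == len(plaintext), "key must match message length"
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"a private opinion"
key = secrets.token_bytes(len(message))  # secret, shared only between the parties

ciphertext = encrypt(message, key)       # what an interceptor sees: random-looking bytes
assert decrypt(ciphertext, key) == message  # recovery is possible only with the key
```

The sketch merely illustrates why interception without the key yields only noise, which is the property at the centre of both the human rights argument and the going-dark complaint discussed in this chapter.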
Understanding technical security as an instrument to protect fundamental rights is
becoming crucial, and information security is undoubtedly an essential enabler of online privacy. Without it, a number of civil liberties would not be protected and trust could vanish.
This argument is confirmed by the UN Special Rapporteur on the right to privacy, Joseph
Cannataci, in the last section of his first report, titled Ten Points Action Plan:
The safeguards and remedies available to citizens cannot ever be purely
legal or operational. Law alone is not enough. The SRP will continue to
engage with the technical community in an effort to promote the development
of effective technical safeguards including encryption, overlay software and
9 Dutch Ministry of Security and Justice (Criminal Policy Dept.), Cabinet’s View on Encryption. Joint Opinion of the Dutch Minister of Security and Justice (G.A. Van der Steur) and the Dutch Minister of Economic Affairs (H.G.J. Kamp) to the President of the House of Representatives of the States General, The Hague, 4.1.2016, ref. 708641.
10 ENISA, Opinion Paper on Encryption, December 2016, p.10, <https://www.enisa.europa.eu/publications/enisa-position-papers-and-opinions/enisa2019s-opinion-paper-on-encryption>, accessed 13.12.2016.
11 J. Lang, A. Czeskis, D. Balfanz, M. Schilder, S. Srinivas, ‘Security Keys: Practical Cryptographic Second Factors for the Modern Web’, Google, Inc., Mountain View, CA, USA, pp. 1-18, 25.2.2016, <http://fc16.ifca.ai/preproceedings/25_Lang.pdf>, accessed 22.12.2016.
12 This term was first introduced by the Information and Privacy Commissioner of Ontario (Canada), referring to the practice of implementing Privacy Enhancing Technologies (PETs) and similar mechanisms within the design phase of information technology systems. The concept has since been broadened to other technologies and consolidated in several legal texts, notably the new General Data Protection Regulation (Article 25: Data protection by design and by default).
13 J. Van Hoboken, W. Schulz et al., Human Rights and Encryption, UNESCO report, December 2016, p.9, <http://ec.europa.eu/research/participants/portal/desktop/en/opportunities/h2020/topics/ds-08-2017.html> accessed 8.12.2016.
various other technical solutions where privacy-by-design is genuinely put into practice14.
A 2015 dossier of the European Union Agency for Network and Information Security (ENISA) on the use of privacy enhancing technologies15 (PETs) in the context of big data reports that encryption tools not only play a vital role from a technical standpoint, but are also used for the purpose of raising awareness and building trust amongst users16. Bearing that in mind, it will not be difficult to support the view that consolidates the value of such tools for the protection of human rights, with regard to the confidentiality17 and the integrity of the information therein18.
The increased interest of international institutions is remarkable, as this argument falls within the broader tendency of recent decades to translate the full set of fundamental rights into the online world19. Focusing on the rights to freedom of opinion and expression, the internet should be considered an extremely effective means to multiply information in a central public world forum, as sustained by David Kaye20. In his 2015 official report on the promotion and protection of the right to freedom of opinion and expression, the UN Special Rapporteur shows that encryption provides a fertile area to secure ideas of any kind. What is interesting, from a sociological point of view, is that encryption is not only considered a form of protection against random unauthorized access. Indeed, over the last few years, users have started to regard governments and law enforcement agencies as the main potential intruders: ‘where States impose unlawful censorship through filtering and other
14 UN – Human Rights Council, Report of the Special Rapporteur on the Right to Privacy, Joseph A. Cannataci, A/HRC/31/64, 8.3.2016, p.19.
15 ‘Privacy Enhancing Technologies are designed to provide functionality while minimizing the user data that becomes accessible to the service provider. The most popular examples can now be found in the market of private messaging.’ J. Van Hoboken, W. Schulz et al., Human Rights and Encryption, UNESCO report, December 2016, <http://ec.europa.eu/research/participants/portal/desktop/en/opportunities/h2020/topics/ds-08-2017.html> accessed 8.12.2016.
16 ENISA, Privacy by Design in Big Data: an Overview on Privacy Enhancing Technologies in the Era of Big Data Analytics, Final Version, 1.0/Public, Dec. 2015, p.6.
17 Article 29 Working Party, Opinion 3/2016 on the Evaluation and Review of the ePrivacy Directive (2002/58/EC), WP240, adopted on 19 July 2016.
18 EDPS, Preliminary EDPS Opinion on the Review of the ePrivacy Directive (2002/58/EC), Opinion 5/2016, 22.07.2016, p.19.
19 As an example, regarding the right of access to public information and the protection of whistle-blowers, see for instance UN – Human Rights Council, Report of the Special Rapporteur to the General Assembly on the Protection of Sources and Whistleblowers, David Kaye, 8.9.2015, para. 62, where states are recommended to implement encryption and promote it in order to ensure the protection of sources.
20 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 11.
technologies, the use of encryption (…) may empower individuals to circumvent barriers and access information and ideas without the intrusion of authorities’21.
The value of this statement is supported by a second argument, which confirms the urgency for these principles to be laid down in binding and enforceable legal texts. Along the lines of David Kaye’s report22, a parallel may be drawn, for instance, between art. 17(2) of the International Covenant on Civil and Political Rights23, which states a positive obligation for states to implement legislative measures to protect privacy against unlawful infringements, regardless of whether the wrongdoer is a private entity or a governmental institution, and the current situation in which these fundamental rights frequently need stricter protection in the networked environment.
Touching upon these examples should help us understand how the international community is trying to promote encryption and other Privacy Enhancing Technologies (PETs) as standardized and practised ways to protect and promote civil liberties. This idea is supported by the latest reports released on the exercise of civil rights within the digital world, mostly conducted by assessing the impact of protective instruments like encryption and by elaborating on that impact through specific case-studies, as this section will further specify.
As acknowledged by the European Data Protection Supervisor24 (EDPS), intrusive surveillance software makes it possible to bypass encryption when such technologies are used for investigative purposes25, thereby creating an interference with the right to privacy and the right to freedom of expression (often observed as complementary rights).
In March 2016, Amnesty International released a dossier on encryption, showing how the digital revolution generated ‘new opportunities for state surveillance’26: ‘were it not for encryption, states’ reach into the internet could be total. Even with encryption, states are still in a position to intercept communication en masse’27. This statement attests to the focus, as well
21 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 12.
22 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 18.
23 International Covenant on Civil and Political Rights, Article 17:
1. No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation.
2. Everyone has the right to the protection of the law against such interference or attacks.
24 EDPS, Opinion 8/2015 on the Dissemination and Use of Intrusive Surveillance Technologies, 15.12.2015.
25 V. Blue, ‘How Spyware Peddler Hacking Team Was Publicly Dismantled’ (Engadget, 07.09.15), <https://www.engadget.com/2015/07/09/how-spyware-peddler-hacking-team-was-publicly-dismantled> accessed 29.08.16.
26 Amnesty International, Encryption: a Matter of Human Rights, POL 40/3862/2016, March 2016, p.10.
27 Amnesty International, Encryption: a Matter of Human Rights, POL 40/3862/2016, March 2016, p.11.
as the immense strength of the online world, leading actors to acknowledge the importance of
having some levels of control over the flow of digital information.
Remarkably, in July 2016, Amnesty International (AI) published another report, this time on the use of surveillance techniques against activists and media in Belarus28. It puts forward evidence highlighting a serious level of intrusiveness by national authorities, who exploit such methods to threaten political adversaries, members of the press and opponents. AI’s analysis of the legal framework of Belarus serves as an example of what David Kaye voiced in his report29.
The UN Rapporteur tackles the issue of national laws on encryption from a dual perspective, stating that, more often than we think, domestic restrictions are not necessary to meet a legitimate interest, and are sometimes disproportionate when assessed in light of the right to freedom of expression30: ‘a proportionality analysis must take into account the strong possibility that encroachments on encryption and anonymity will be exploited by the same criminal and terrorist networks that the limitations aim to deter’31.
The two missing characteristics identified by the Rapporteur are the failure of the law to provide instruments of protection (a) and accountability systems (b). That said, evidence of such insufficiency is put forward by Kaye by examining cases in Latin America and Asia: in these circumstances, the easiest targets of this legal discrepancy are media communities, informants and whistle-blowers, often seen by ‘non-democratic’ governments as serious threats to their supremacy. This test applies squarely to the case of Belarus highlighted by Amnesty: first and foremost, from a general point of view, national law and its application through surveillance techniques create a chilling effect on individuals, leading entire groups of citizens to perceive a permanent sense of government control and, consequently, lowering their trust in public institutions. Moreover, focusing on the role of
28 Amnesty International, It’s Enough for People to Feel It Exists. Civil Society, Secrecy and Surveillance in Belarus, EUR 49/4306/2016, July 2016.
29 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 38.
30 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 39.
31 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 12.
encryption in this scenario, Amnesty deemed it ‘absolutely necessary’32, despite it sometimes being burdensome to the media’s work33.
A first conclusion can be drawn from the previous assessment on the balance between the law and other constraints. The Belarus case is a clear example of how regulating encryption (directly or indirectly) does not necessarily ensure effective safeguards, which often lack the essential standards foreseen by human rights principles. Indeed, on a careful reading of the scenario, the two legal failures (namely, lack of protection and of accountability) can easily be identified within this example, confirming the extreme value of an instrument like encryption as a vehicle for democracy and freedom of expression, first and foremost in those places and countries where certain civil liberties are not guaranteed and some minorities are deliberately targeted by governmental institutions.
However, it must be understood that a certain level of restriction of this right is collectively and legally accepted, under certain conditions and with certain safeguards. Checks and balances should be present in the legislation to ensure that no purpose other than (inter)national security is pursued. On this basis, law enforcement and intelligence services are legally entitled to interfere with a citizen’s right to privacy for national security reasons. Recently, the whole debate has focused on the use of such measures in the fight against terrorism. In this context, as we will see in the next section, encryption plays a fundamental role.
2.3. State of play of the Going Dark debate
In July 2016, the US-based threat intelligence company Flashpoint released a report called ‘Tech for Jihad’34, which explored the most recent state-of-the-art uses of communication technologies by terrorist groups, with a particular focus on the Islamic State35.
The report shows that these collectives have moved past the initial familiarization stage, and the use of certain tools is now widespread36. The Tor and Opera browsers now extensively
32
Amnesty International, It’s enough for People to Feel It Exists. Civil Society, Secrecy and Surveillance in
Belarus, EUR 49/4306/2016, July 2016, p.19.
33
Amnesty International, It’s enough for People to Feel It Exists. Civil Society, Secrecy and Surveillance in
Belarus, EUR 49/4306/2016, July 2016, p.20.
34
The Flashpoint Group, Tech For Jihad: Dissecting Jihadists’ Digital Toolbox, 22.7. 2016, <
https://www.flashpoint-intel.com/library/>, accessed 29.11.2016.
35
For a deeper view of the current terrorist threat, see Europol, European Union Terrorism Situation and Trend
Report ‘TE-SAT’ 2016, <https://www.europol.europa.eu/sites/default/files/documents/europol_tesat_2016.pdf>,
accessed 30.11.2016, or Europol, The Internet Organized Crime Threat Assessment ‘IOCTA’ 2016 <
https://www.europol.europa.eu/sites/default/files/.../europol_iocta_web_2016.pdf >, accessed 28.11.2016.
36
See also Europol, Changes in Modus Operandi of Islamic State (IS) - revisited, 02.12.2016.
accompany online interactions amongst these actors, as do end-to-end encrypted applications like Threema or Telegram. With regard to the latter, Flashpoint explains how the possibility for users to communicate within ‘secret chats’ offers a convenient channel for secure communications, as it uses client-to-client encryption (i.e., messages are readable only by the end-devices) and allows for their complete deletion after use. Furthermore, encrypted e-mail services like Hushmail or Tutanota are also widely used.
As the outcomes of this report show, terrorist groups have understood the importance of cyber security for protecting their data, developing it as an asset in their most recent modus operandi and including it as a pivotal tool within the recruitment cycle. On the other hand, ‘for intelligence agencies and eavesdroppers alike, (messaging and) e-mail surveillance is a primary way to monitor an actor’37.
The frustration of cyber investigators on this matter has strongly influenced a broader public debate on lawful access by intelligence and law enforcement to citizens’ devices and digital storage media. In spite of the increasingly superficial turn this discussion is taking at the public level, leading citizens to assess the issue from a less than fully impartial point of view, it remains relevant to understand that law enforcement agencies are objectively experiencing a challenging period when it comes to encryption within investigations related to terrorism and other forms of serious crime.
Fig. 1: Castro and McQuinn’s table on how encryption affects government access to data38

Law enforcement authorities (LEAs)
- Data at rest: LEAs cannot gain access to encrypted data stored on a user’s device or in the cloud, even with a search warrant.
- Data in motion: LEAs cannot use wiretaps to intercept communications.

Intelligence agencies
- Data at rest: the intelligence community cannot gain access to encrypted data stored on a user’s device or in the cloud, including bulk access to user data.
- Data in motion: the intelligence community cannot analyze communications for trigger terms.

37
The Flashpoint Group, Tech For Jihad: Dissecting Jihadists’ Digital Toolbox, 22.7.2016,
<https://www.flashpoint-intel.com/library/>, accessed 29.11.2016, p.10.
38
D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF – Information Technology and Innovation Foundation, March 2016, pp. 1-50, <http://www2.itif.org/2016-unlocking-encryption.pdf>, accessed 15.09.2016.
Specifically, the term “going dark” was used at a 2014 press conference by the FBI’s director James Comey39, who argued that a consistent part of the recruitment process of some terrorist groups is undertaken through encrypted communications, leading police to ‘go dark’ by losing their traces in the absence of a legal basis for intervening.
Comey repeated the point to the Senate Judiciary Committee in August 2015 (“we see them giving directions to move to a mobile messaging app that is encrypted. And they disappear”40) and testified again later: “there is no doubt that the use of encryption is part of terrorist tradecraft now because they understand the problems we have getting court orders to be effective when they’re using these mobile messaging apps, especially that are end-to-end encrypted”41
. The San Bernardino case42 is one of the best-known examples of a case which drew the attention of all the actors, from policy makers to other interest groups. It is relevant to note how the interaction between the industry representative and the law enforcement body proved to be imbalanced towards the former: Apple resisted the FBI’s requests, raising (costly) architectural problems as well as ethical and privacy implications.
It needs to be acknowledged that this is not the first attempt to regulate such matters, as confirmed by the report ‘Keys under Doormats’43. The article begins by noting that law enforcement has already tried (unsuccessfully) to regulate lawful access; after long debates the projects were abandoned, and today we are experiencing a repetition of history,
39
James B. Comey, ‘Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?’ Speaking
Note at Brookings Institution , Washington D.C, 16.10.2014, <https://www.fbi.gov/news/speeches/going-dark-
are-technology-privacy-and-public-safety-on-a-collision-course>accessed 10.10.2016.
40
U.S. Committee on Homeland Security, Going Dark, Going Forward, A Primer on the Encryption Debate,
House Committee on Homeland Security Majority Staff Report, June 2016, <https://homeland.house.gov/wp-
content/uploads/2016/07/Staff-Report-Going-Dark-Going-Forward.pdf>, accessed 28.11.2016.
41
CQ press, Senate Judiciary Committee Holds Hearing on FBI Oversight, CQ Congressional Transcripts, 9.12
2015 (Available at < http://www.cq.com/doc/congressionaltranscripts-4803506?2 >, accessed 26.11.2016.
42
E. Perez ‘Apple opposes judge's order to hack San Bernardino shooter's iPhone’ CNN, 17.02.2016
<www.cnn.com/2016/02/16/us/san-bernardino-shooter-phone-apple>, accessed 11.10.2016.
43
H. Abelson, R. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, M. Green, S. Landau, P. G. Neumann, R. L. Rivest, J. I. Schiller, B. Schneier, M. Specter, D. J. Weitzner, ‘Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications’, MIT Computer Science and Artificial Intelligence Laboratory Technical Report, 6.7.2015, p. 2, <https://dspace.mit.edu/handle/1721.1/97690>, accessed 12.11.2016.
which will be extensively discussed further on. The crypto debate is nothing new, and it would be a mistake to think that we are still in the initial phase of its regulatory approach.
Such a regulatory approach was initiated long ago. What has developed since reflects the globalized fear of mass surveillance after the Snowden revelations, the rise of new ready-to-use cutting-edge technologies, and the evolution of the hi-tech market. On this last point, an extremely interesting and exemplary dichotomy may again be drawn from Lawrence Lessig: the ‘West Coast Code’, the framework of rules arising from the architecture originating from hi-tech companies in Palo Alto, stands on the same regulatory level as the ‘East Coast Code’ (the laws issued in D.C.).
A positive point is that the dynamism of the going dark debate is evidenced by the interactions between actors representing different interests, or constraints, if we bear in mind the regulatory taxonomy of this research.
In early 2016, a group of experts from Harvard’s Berkman Klein Center published a paper with the provocative title ‘Don’t Panic. Making Progress on the ‘Going Dark’ Debate’. The findings of this report were foreseeable given the actors the authors were representing: the message conveyed therein tries to answer “the question whether the going dark metaphor accurately describes the state of the affairs. (We think not)”44.
A remarkable follow-up to this research is the written exchange in which Mr. Cyrus Vance Jr., District Attorney of New York County, responded to some of the points highlighted by the Berkman group in their paper. Specifically, two points are of particular relevance to this study, both focusing on the roles of the various stakeholders within different features of the debate.
First, contrary to what ‘Don’t Panic’ suggests, Cyrus Vance Jr. is of the view that within
the market, hi-tech companies do not frequently have ongoing relationships with their
customers, since these multi-nationals have told society that customers’ data would be
unavailable to law enforcement, even if asked for by ‘lawful means’45
. What Mr. Vance argues
instead is that ‘either the companies are misleading the public, because the material is
44
J. DeLong, U. Gasser, N. Gertner, J. Goldsmith, S. Landau, A. Neuberger, J. Nye, D. R. O’Brien, M. G. Olsen, D. Renan, J. Sanchez, B. Schneier, L. Schwartztol, J. Zittrain, ‘Don’t Panic. Making Progress on the ‘Going Dark’ Debate’, Berklett Project – Berkman Centre Report, 01.02.2016, p. 2, <https://cyber.harvard.edu/pubrelease/dontpanic/Dont_Panic_Making_Progress_on_Going_Dark_Debate.pdf>, accessed 5.9.2016.
45
Cyrus Vance Jr. (District Attorney of N.Y.C.) Letter to the Berkman Centre for Internet & Society,
‘RE: Don’t Panic. Making Progress on the ‘Going Dark’ Debate’, p.2.
accessible, or they are entirely correct and the material is inaccessible, notwithstanding the
cloud, back-ups, and data centers’46
.
Having elaborated on these positions, it is relevant to note the inherent behavior of the market and the players therein. What should be highlighted here is that hi-tech companies and their customers align in the cycle of supply and demand. For a citizen buying a hi-tech product, a privacy-friendly approach is now almost invaluable47: it satisfies the need to feel protected and secure, pushing away the fear of being under surveillance. Hi-tech companies, in turn, are highly committed to satisfying this need, as long as doing so is not economically overwhelming, for the reasonable purpose of designing a tailor-made and attractive product. But the question is: how much do hi-tech companies actually believe in privacy-friendly tools, and how much is this policy a business strategy to align themselves with the needs of their customers?
Against this background, while industries and citizens may have reached a compromise, namely around end-to-end encryption, the rationales behind it may be opposite. Notably, the consequence in cyberspace is that the law is hampered by this perverse market cycle.
Another remarkable aspect of this jostle between the law (enforcement) and the scientific community is represented by the future scenario of the Internet of Things (IoT)48. As Cyrus Vance argues, the IoT will not be ‘as all-encompassing as Don’t Panic suggests’; for law enforcement, the future of investigations will thus remain challenging, and the so-called state of near-constant surveillance is not likely to materialize in the short term. Again, it should be noted that the two positions are possibly too distant from one another. Whilst acknowledging the growth of IoT products, we should keep in mind that there is no intention from a law enforcement perspective to intercept, for example, every oven connected to the internet within the next five years. Rather, the need for eavesdropping is assessed on a case-by-case basis. To put it simply, using the previous example, one particular oven might be more relevant than a hundred others. This suggests that the regulatory intentions
46
Cyrus Vance Jr. (District Attorney of N.Y.C.) Letter to the Berkman Centre for Internet & Society,
‘RE: Don’t Panic. Making Progress on the ‘Going Dark’ Debate’, p.2.
47
see Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private
life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC
(Regulation on Privacy and Electronic Communications)
48
‘The Internet of Things refers to the development that not only computers, but more and more objects (and the
sensors installed in them, including microphones and cameras) become connected to the Internet’, [see J. Van
Hoboken, W. Shulz et al., Human Rights and Encryption, UNESCO report, December 2016,
<http://ec.europa.eu/research/participants/portal/desktop/en/opportunities/h2020/topics/ds-08-2017.html.>
accessed 8.12.2016]
from a law enforcement point of view are to allow targeted access, rather than to legislate in a general, collective and proactive manner.
After having briefly commented on the exchange between these stakeholders, a last consideration is that an active debate is truly taking place. This may not be enough. As this thesis will try to argue, the current state of play may actually lack a truly comprehensive discussion engaging a multi-stakeholder commitment on the topic. What has been assessed so far suggests that the terminology used (going dark) may have an emotional impact on society and lead to misconceptions in this debate49. Moreover, it should be noted that such a diversity of technologies, actors and side-factors may lead to different challenges related to encryption within the going dark trend. For these reasons, it is relevant to underline that a univocal solution may not be enough, either50.
With this in mind, this argument will be further developed using the same approach practiced so far, taking into consideration a multi-stakeholder view. We will be able to verify how the law currently does not stand at the level of the other constraints, leading to a partial debate that needs to be re-organized along fair and equal lines of discussion, in view of a regulatory (or legal) solution to this issue.
49
B. Schneier, ‘The Value of Encryption’, The Ripon Forum, April 2016,
<https://www.schneier.com/essays/archives/2016/04/the_value_of_encrypt.html>, accessed 11.12.2016.
50
House Judiciary Committee & House Energy and Commerce Committee, Encryption Working Group Year-
End Report, 20.12.2016
3. TECHNICAL APPROACHES
3.1 Introduction
The following chapter is intended to give an in-depth overview of the two most discussed solutions for bypassing encryption. The first relates to the creation of hardware or software backdoors, which would enable law enforcement agencies to access encrypted data, often without informing the user. However, the impact on security standards is massive, as such a backdoor may be exploited by malicious actors once disclosed.
In the second section, key escrow solutions will be assessed. An outline of this topic will benefit from looking back to the previous ‘crypto war’, to see whether the progress of modern technologies has in some way altered the earlier dynamics. We will see that no substantial changes can be observed in this parallel, but rather an evolution of pre-existing constraints born during the discussions around the so-called ‘Clipper Chip’.
Both solutions are largely controversial from the perspective of many of the actors involved. Before diving into these alternatives, it is appropriate to keep in mind what Castro and McQuinn pointed out in March 2016: “adding extraordinary access complicates encryption. Every line of new code is a place where a mistake can be made and a flaw produced51”.
3.2. Backdoors
Probably the most renowned solution for overcoming encryption relates to the word backdoor. Generally speaking, a backdoor consists of methods or techniques deployed inside computer programs in order to circumvent, break or bypass security mechanisms in a given piece of software or digital storage medium. This practice is often not communicated to the user and is usually meant to allow the maintenance or updating of the system. By default, it creates a weakness within the software or device in which the backdoor is implemented which, once disclosed, enables potential malicious actors to deliberately access it. Narrowing our focus to encryption52, the term typically refers to the act of intentionally weakening, beforehand, the encryption algorithms used by citizens and businesses. It means that such an intervention
51
D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF-
Information Technology and Innovation Foundation, March 2016, 1-50, p.19<http://www2.itif.org/2016-
unlocking-encryption.pdf>, accessed 15.09.2016.
52
For a detailed understanding, see also T. Wu, J. Chung, J. Yamat, J. Richman, ‘Encryption Backdoors, The
Ethics (or not) of Massive Government Surveillance’, CS201 Final Project, Stanford Edu,
<https://cs.stanford.edu/people/eroberts/cs201/projects/ethics-of-surveillance/index3.html> accessed 30.11.2016.
would be done by default in the design process, thus allowing a third party a sort of ‘window’ into users’ devices, which may be opened whenever there is a need or a purpose. Of course, this solution would be beneficial for combating serious crime and terrorism, as it would allow law enforcement authorities to break into a subject’s or suspect’s data stored on the respective medium.
As explained, a by-product of this solution is that it theoretically also allows any potential wrongdoer to access people’s data, with massive consequences for everyone’s privacy and security. This solution has been discarded a number of times in the past for various reasons, first of all because of the impact such a measure would have on all individuals. Indeed, it is technically overwhelming to create a backdoor in people’s storage media or in an encryption structure that would solely enable intelligence services or law enforcement agencies to make use of such a measure53.
Terrorist groups, ultimately the target of this technical solution against encryption, would consequently misuse it for their illicit purposes, posing a serious threat to personal, business and government data. Decreasing the security of our devices by means of backdoors would endanger both our personal data and our fundamental rights54.
Amongst legislators and policy makers, one of the biggest opponents of backdoors in
the European arena is Andrus Ansip, Vice President for the Digital Single Market in the
European Commission and former Prime Minister of Estonia (currently one of the most
digitally-oriented countries in the world). Inter alia, his position on backdoors is relevant to a better understanding of the fundamental role of encryption as a vehicle of trust for businesses and society. What Mr. Ansip told Politico in March 2016 is topical:
The creation of backdoors to access encrypted services and information will
be a boon for terrorism and threatens democracy. (...) We will be much more
affected by terrorism if we create those backdoors if good guys will get
possibilities to test your pin code many times as it will be needed, then bad
guys will do it definitely55
.
53
J.L.Hall, ‘Issue Brief: A ‘Backdoor’ to Encryption for Government Surveillance’, Centre for Democracy &
Technology (CDT)- Security &Surveillance, 3.3.2016, <https://cdt.org/insight/issue-brief-a-backdoor-to-
encryption-for-government-surveillance/>, accessed 1.12.2016.
54
D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF-
Information Technology and Innovation Foundation, March 2016, 1-50, p.18/19< http://www2.itif.org/2016-
unlocking-encryption.pdf>, accessed 15.09.2016.
55
C.Spillane, ‘Ansip: Encryption Backdoors Will Aid Terrorism, Harm Democracy’, Politico, 15.3.2016,
<http://www.politico.eu/pro/ansip-encryption-backdoors-will-aid-terrorism-harm-democracy/>, accessed
16.11.2016.
These words confirm the considerations made on backdoors in this Thesis, and outline the non-feasibility of the solution, not only technically but also (and mostly, for the purpose of this research) in a democratic context.
As regards the view of the European Union, and in order to understand the two dimensions of the same solution, the Europol/ENISA joint statement of May 2016 is paradigmatic in differentiating the levels of approach to the policy in question: ‘intercepting an encrypted
communication or breaking into a digital service might be considered as proportional with
respect to an individual suspect, but breaking the cryptographic mechanisms might cause
collateral damage. The focus should be on getting access to the communication or information;
not on breaking the protection mechanism’56
.
However, two new arguments are worth mentioning in this Thesis, either because of the notable actors that proposed them or because of the peculiarity of the theory itself.
3.2.1. A semantic issue
The following two sections tackle the backdoor solution from the point of view of its definition. Indeed, policy discrepancies sometimes arise from an unclear and non-harmonized set of definitions, leading to divergent understandings of the same subject matter. The first semantic consideration carries technical and legal consequences; the second is a recent landmark position among academic opinions on backdoors. Both arguments are brought to our attention with the aim of demonstrating how actors are currently building new ‘walls’ to protect their views on this issue, and this section will conclude with some interesting statements made by prominent governmental institutions in the recent past. The common ground between the two arguments is that both refer to the problem of providing a clear and specific definition of the word ‘backdoor’.
One of the main misconceptions around the use of backdoors to access stored data actually arises from the (mis)use of the terminology itself. Recent studies57 have revealed how this wording may refer to a number of technical implementations, often different from one another,
56
Europol/ENISA joint statement, On Lawful Criminal Investigation That Respects 21st Century Data Protection,
statement at margin of the International Conference ‘Privacy in the Digital Age of Encryption & Anonymity
Online’, The Hague, 20.05.2016, <https://www.enisa.europa.eu/publications/enisa-position-papers-and-
opinions/on-lawful-criminal-investigation-that-respects-21st-century-data-protection>, accessed 30.11.2016
57
‘While backdoors have become a significant concern in today’s computing infrastructure, the lack of a clear
definition of a backdoor has led the media (...) to misuse or abuse the word’. J. Zdziarski, ‘Backdoors: A Technical
Definition’, Zdziarski's Blog of Things, 2.5.2016, <https://www.Zdziarski.com/blog/wp-
content/uploads/2016/05/Backdoors-A-Technical-Definition.pdf> accessed 16.11.2016.
which in turn leads to misinterpretation58 by non-IT experts. From a policy point of view, this process is highly dangerous, as it may lead to wrong conclusions and erroneous directions. The following paragraph will underline this definitional issue, which calls, once again, for a common understanding between the actors.
This argument was brought forward by Jonathan Zdziarski, a leading expert in iOS digital forensics and a pioneer of modern digital investigative techniques. He argues59 that a clear definition of the term backdoor has never reached wide consensus in the IT forensic community; for instance, in one of his most recent papers Zdziarski assesses backdoors on the basis of a so-called ‘three-prong test’, in which such a method is evaluated against three different criteria: intent, consent and access.
Leaving aside the more technical details of his view, it is relevant to underline that this semantic issue may have consequences from both a societal and a regulatory perspective. Furthermore, as Zdziarski fairly argues, the misinterpretation of this concept may arise at stages where important and concrete decisions about individuals are made60:
Pending legal cases already exist within the court system in which a
technical definition of backdoor would be beneficial. Without a clear
definition, these proceedings pose the risk of misinformation in criminal
prosecution, search warrants, warrants of assistance, secret court
proceedings, and proposed legislation - all by preventing a duly appointed
legal body from adequately understanding the concept61
.
As we can see, the impact of this small detail may be enormous, and it is exemplary in showing how regulators and architects must agree on common, standard and accepted grounds and terminologies. Summing up Zdziarski’s reasoning, arriving at an accepted and fair definition of backdoors would actually help to distinguish them from other taxonomies of malware or from legitimate security mechanisms.
58
An example of this potential misconception argued by Zdziarski can be found in this article: L. Ryge, ‘Most
Software Already Has a ‘Golden Key’ Backdoor: the System Update’, ArsTechnica,
27.2.2016<http://arstechnica.com/security/2016/02/most-software-already-has-a-golden-key-backdoor-its-
called-auto-update/>, accessed 30.11.2016.
59
J. Zdziarski, ‘Backdoors: A Technical Definition’, Zdziarski's Blog of Things, 2.5.2016,
<https://www.Zdziarski.com/blog/wp-content/uploads/2016/05/Backdoors-A-Technical-Definition.pdf>
accessed 16.11.2016.
60
J. Zdziarski, ‘Identifying Backdoors, Attack Points, and Surveillance Mechanisms in iOS Devices’, International Journal of Digital Forensics and Incident Response, Elsevier, 26.01.2014, <https://www.Zdziarski.com/blog/wp-content/uploads/2014/08/Zdziarski-iOS-DI-2014.pdf>, accessed 16.11.2016.
61
J. Zdziarski, ‘Backdoors: A Technical Definition’, Zdziarski's Blog of Things, 2.5.2016,
<https://www.Zdziarski.com/blog/wp-content/uploads/2016/05/Backdoors-A-Technical-Definition.pdf>
accessed 16.11.2016.
3.2.2. Backdoors vs front doors
While the debate now leads to the questions above, it is nonetheless worth discussing in slightly more detail another contribution already mentioned in this Thesis. ‘Keys Under Doormats’ reports substantial and clear arguments against backdoors, drawing our attention to statements made by FBI director James Comey: ‘it makes more sense to address any security risks by developing intercept solutions during the design phase, rather than resorting to a patchwork solution when law enforcement comes knocking after the fact’62. What can fairly be inferred here is the proactive approach Mr. Comey would pursue: weakening encryption by placing such measures in the design process.
From a law enforcement perspective, tackling crime, especially serious and organized crime, can lead to better results when done proactively. However, it should be argued that this approach would result in an unfair imbalance in favour of law enforcement, and that its cost-benefit analysis would fail to take into account the impact of such vulnerabilities on IT ecosystems.
For these reasons, Mr. Comey added a point that specifies the intended approach within
his previous statement: ‘we aren't seeking a backdoor approach. We want to use the front door,
with clarity and transparency, and with clear guidance provided by law. We are completely
comfortable with court orders and legal process - front doors that provide the evidence and
information we need to investigate crime and prevent terrorist attacks’63
.
This solution sounds fascinating and fits perfectly within the limitations posed against the use of backdoors. A ‘front door’ giving only law enforcement access to people’s data would be the perfect compromise, as it would also be lawfully granted by a court order. Setting aside citizens’ lack of trust, which this solution may not resolve, the other issue here relates to the technical constraints briefly exposed in the previous pages. What exactly did the FBI director mean by the word ‘front door’? To solve this dilemma, it is worth quoting Vagle and Blaze: “Director Comey was trying to describe (t)he
62
James B. Comey, ‘Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?’ Speaking
Note at Brookings Institution, Washington D.C, 16.10.2014, <https://www.fbi.gov/news/speeches/going-dark-
are-technology-privacy-and-public-safety-on-a-collision-course>, accessed 10.10.2016.
63
James B. Comey, ‘Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?’ Speaking
Note at Brookings Institution, Washington D.C, 16.10.2014, <https://www.fbi.gov/news/speeches/going-dark-
are-technology-privacy-and-public-safety-on-a-collision-course>, accessed 10.10.2016.
concept of “security through obscurity”, a widely disavowed practice where the security of the
system relies on the secrecy of its design”64
.
The two experts add further background to this overview: “once that secret is discovered—and it’s always discovered, sooner or later—the system’s security is lost forever. One needs look no further than the many failed attempts at secure digital rights management (DRM) systems to see this in action65”. Indeed, for many IT legal experts66, the term DRM has unpleasant connotations, recalling the repeated past efforts to implement this technology.
Concluding again with Vagle and Blaze, the wording ‘front door’ draws a false distinction between these terms, with highly confusing results67. Arguments against backdoors were raised back in the 1990s and prevented governments from using such methods. An example is the attempt to implement the ‘Clipper Chip’ during the so-called ‘second crypto-war’, our starting point for the following section.
3.3 Key-escrow solutions
The concept behind this solution foresees individuals and private parties being forced to hand over their secret encryption keys to a pre-defined public institution. It is a system in which the assigned agency (a Trusted Third Party, TTP) plays a crucial role and is put in a delicate position, which is why one of the main principles claimed in the theorization of this solution is the independence of the governmental authority in charge of retaining such keys. As in discussions on the features data protection authorities should have by definition, one of the primary concerns fairly raised here relates to the correct implementation of full, impartial and complete statutory independence and autonomy.
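The role of the escrow agents can be sketched with a minimal key-splitting example. Historically, Clipper-style proposals split each escrowed key between two agencies; the XOR splitting below is a simplified stand-in for such schemes, with invented names, meant only to show that neither share alone reveals anything while both together reconstruct the key.

```python
import secrets

def split_key(key):
    """Split a key into two shares: share_a is uniformly random, and
    share_b = key XOR share_a, so each share alone is statistically
    independent of the key."""
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(k ^ a for k, a in zip(key, share_a))
    return share_a, share_b

def reconstruct(share_a, share_b):
    # XOR-ing the shares cancels the randomness and recovers the key.
    return bytes(a ^ b for a, b in zip(share_a, share_b))

session_key = secrets.token_bytes(16)
escrow_agent_1, escrow_agent_2 = split_key(session_key)  # deposited with two TTPs

recovered = reconstruct(escrow_agent_1, escrow_agent_2)
assert recovered == session_key
```

The sketch makes the independence concern concrete: whoever can compel both agents at once effectively holds every escrowed key, which is why the statutory independence of the retaining authority was so central to the debate.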
64
J. Vagle and M. Blaze, ‘Security ‘Front Doors’ vs. ‘Backdoors’: A Distinction Without a Difference’, Just Security, 17.10.2014, <https://www.justsecurity.org/16503/security-front-doors-vs-back-doors-distinction-difference>, accessed 21.10.2016.
65
J.Vagle and M. Blaze, ‘Security ‘Front Doors’ vs. ‘Backdoors’: A Distinction Without a Difference’, Just
Security, 17.10.2014, <https://www.justsecurity.org/16503/security-front-doors-vs-back-doors-distinction-
difference>, accessed. 21.10.2016.
66
Walker J., ‘The Digital Imprimatur: How Big Brother and Big Media Can Put The Internet Genie Back in the
Bottle’, 13.09.2003, http://www.fourmilab.ch/documents/digital-imprimatur/, accessed 10.01.2017
67
J.Vagle and M. Blaze, ‘Security ‘Front Doors’ vs. ‘Backdoors’: A Distinction Without a Difference’, Just
Security, 17.10.2014, <https://www.justsecurity.org/16503/security-front-doors-vs-back-doors-distinction-
difference>, accessed. 21.10.2016.
Furthermore, from an information security perspective, additional access mechanisms complicate encryption schemes by default: “every line of new code is a place where a mistake can be made and a flaw produced”68.
However, a better understanding of the current state of play around key escrow inevitably means looking back to what many experts have dubbed the ‘crypto war’. A clearer overview of the evolution of this debate can be obtained by focusing on what happened in the United States over the last twenty-five years, and by drawing on some notable documents that testify to the parallels between that time and now. Not surprisingly, we will see how, even in this case, history tends to repeat itself over the decades.
Through the analysis of key escrow solutions, this section tries to understand which dynamics of the crypto war have changed and which have not, with a particular focus on the (non-)collaboration between the parties involved, the legal constraints of this method, its technical limitations and its consequences on a social scale.
The groundwork for the Clipper Chip69 was laid by the Computer Security Act of 198770; in 1991 the proposal of the Digital Signature Standard revealed governmental pressure towards its implementation, and the White House finally announced the chip in 199371. The microchip was intended to be embedded in users’ hardware devices, allowing law enforcement to access unencrypted communications through that same chip. The key escrow system would therefore have allowed the U.S. Government to keep all decryption keys stored under its control.
The initiative raised multiple concerns, criticisms and debates. First and foremost, privacy and civil liberties advocates rightly pointed to the highly intrusive nature of this technology. Other interest groups also reacted, citing the particular vulnerability of the system from a security perspective and the huge economic impact it would have had if fully implemented in the American market.
68
D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF-
Information Technology and Innovation Foundation, March 2016, p. 13 < http://www2.itif.org/2016-unlocking-
encryption.pdf>, accessed 15.09.2016.
69
As described by M.Blaze, ‘The Escrowed Encryption Standard (EES) defines a US Government family of
cryptographic processors, popularly known as ‘Clipper’ chips, intended to protect unclassified government and
private-sector communications and data. A basic feature of key setup between pairs of EES processors involves
the exchange of a ‘Law Enforcement Access Field’ (LEAF) that contains an encrypted copy of the current session
key. The LEAF is intended to facilitate government access to the clear text of data encrypted under the system’.
M. Blaze, ‘Protocol Failure in the Escrowed Encryption Standard’ Proceedings of the 2nd ACM Conference on
Computer and Communications Security: 59–67, 20.08.1994, <http://www.crypto.com/papers/eesproto.pdf>,
accessed 29.11.2016.
70
Computer Security Law of 1987, Public Law No. 100-235 (H.R. 145), (Jan. 8, 1988)
71
The White House, Office of the Press Secretary, Statement by the Press Secretary- Factsheet on the Clipper
Chip, 16.4 1993
These information security weaknesses were demonstrated by several notable players, first by Matt Blaze in 1994. His research72 showed that the 16-bit checksum protecting the Law Enforcement Access Field was too short to secure such an elaborate Escrowed Encryption Standard (EES) system: it could be brute-forced, allowing a rogue user to attach a forged field that passes verification without containing the real session key.
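The weakness of a 16-bit check value can be illustrated with a toy sketch (truncated SHA-256 stands in for the classified EES checksum function; all names here are hypothetical, and the real protocol differs): since only 2**16 = 65,536 checksum values exist, a bogus field that verifies correctly can be found by random trial in seconds.

```python
import hashlib
import os

def checksum16(field: bytes) -> int:
    """Stand-in 16-bit checksum (the real EES function was classified)."""
    return int.from_bytes(hashlib.sha256(field).digest()[:2], "big")

def forge_field(target: int):
    """Try random bogus fields until one matches the 16-bit checksum."""
    attempts = 0
    while True:
        candidate = os.urandom(16)  # carries no genuine session key
        attempts += 1
        if checksum16(candidate) == target:
            return candidate, attempts

genuine_leaf = os.urandom(16)
target = checksum16(genuine_leaf)
forged, attempts = forge_field(target)
assert checksum16(forged) == target  # the verifier accepts the bogus field
print(f"matched after {attempts} tries (about 2**16 expected on average)")
```

The point of the exercise is that the escrow feature, not the encryption itself, is what fails: communications remain protected, but the government's guaranteed access can be stripped out by anyone willing to run a short search.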
Furthermore, a paper written in 1997 and titled ‘The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption’73 gave a final negative opinion on the Clipper Chip, de facto closing the heated debate around it, at least temporarily. The findings of this study revealed that key escrow solutions were inherently costly for end users as well as particularly dangerous for law enforcement agencies.
The first critical point is the high risk of exposure of an organization retaining escrowed keys: holding them inherently increases its security risks. Internally, authorized personnel may turn into “malicious insiders”74 who misuse their functions for illicit benefit; externally, a cyber-attack could at any time hamper the functioning of the agency’s core business. This point prompted reflection on the difficulty of putting an oversight mechanism in place, given the absolute independence such an organization should enjoy. From a legal perspective, it was indeed difficult to imagine supervision over an entity that should, by definition, stay outside governmental dynamics. Conversely, an internal solution would have undermined citizens’ trust in the democratic state, which is exactly what happened during the mid-1990s. As a matter of fact, the crypto war showed this principle to be unachievable in concrete terms.
Secondly, it was a matter of overall complexity. A global key escrow system on this scale was impossible to realize because of its overwhelming implementation requirements, which is another reason the project was abandoned. Indeed, ‘key recovery as envisioned by law enforcement will require the deployment of secure infrastructures involving thousands of companies, recovery agents, regulatory bodies, and law enforcement agencies worldwide
72
M. Blaze, ‘Protocol Failure in the Escrowed Encryption Standard’ Proceedings of the 2nd ACM Conference
on Computer and Communications Security: 59–67, 20.08.1994, <http://www.crypto.com/papers/eesproto.pdf>,
accessed 29.11.2016.
73
H. Abelson, R. N. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, P. G. Neumann, R. L.
Rivest, J. I. Schiller, and others, ‘The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption,’
1997. <http://academiccommons.columbia.edu/catalog/ac:127127>, accessed 26.9.2016.
74
H. Abelson, R. N. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, P. G. Neumann, R. L.
Rivest, J. I. Schiller, and others, ‘The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption,’
1997. <http://academiccommons.columbia.edu/catalog/ac:127127>, accessed 26.9.2016.
interacting and cooperating on an unprecedented scale’75. The paper advanced hypotheses and calculations counting, inter alia, thousands of products and agents, millions of users and public-private keys, and tens of thousands of law enforcement agencies around the world, all required to operate at the same levels and standards of security. These numbers would only have grown with the development of new communication technologies. The cost-benefit considerations presented in the paper, and widely analysed by many actors, led policy makers to provisionally discard the project.
It is now time to come back to the present. More or less surprisingly, the authors of that article reconvened almost twenty years later to see what had changed in light of the new cases that recently triggered a new crypto war. The result of this exercise is the paper previously mentioned, ‘Keys Under Doormats’. Comparing the two ‘wars’, the latter paper states that ‘today, the fundamental technical importance of strong cryptography and the difficulties inherent in limiting its use to meet law enforcement purposes remain the same’76.
This wording confirms the thesis that little has changed in favour of lawful access by the law enforcement community. Indeed, ‘what has changed is that the scale and scope of systems dependent on strong encryption are far greater, and our society is far more reliant on far-flung digital networks that are under daily attack’77. Along the same lines, another group of researchers traced similar developments from the Clipper Chip debate in the paper ‘Doomed to Repeat History’ (June 2015)78. Notably, the two articles reach similar findings. What these groups observed is that we are not witnessing a change so much as an intensification of certain crucial features, specifically:
75
H. Abelson, R. N. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, P. G. Neumann, R. L.
Rivest, J. I. Schiller, and others, ‘The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption,’
1997. <http://academiccommons.columbia.edu/catalog/ac:127127>, accessed 26.9.2016.
76
H. Abelson, R. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, M. Green, S. Landau,
P. G. Neumann, R. L. Rivest, J. I. Schiller, B. Schneier, M. Specter, D. J. Weitzner, ‘Keys Under Doormats:
Mandating Insecurity by Requiring Government Access to All Data and Communications’, MIT Computer
Science and Artificial Intelligence Laboratory Technical Report, 6/7/2016, p.2
<https://dspace.mit.edu/handle/1721.1/97690>, accessed 12.11.2016
77
H. Abelson, R. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, M. Green, S. Landau,
P. G. Neumann, R. L. Rivest, J. I. Schiller, B. Schneier, M. Specter, D. J. Weitzner, ‘Keys Under Doormats:
Mandating Insecurity by Requiring Government Access to All Data and Communications’, MIT Computer
Science and Artificial Intelligence Laboratory Technical Report, 6/7/2016, p.13
<https://dspace.mit.edu/handle/1721.1/97690>, accessed 12.11.2016
78
D.Kehl, A. Wilson, K. Bankston, ‘Doomed to Repeat History? Lessons from the Crypto Wars of the 1990s’,
New America, Open Technology Institute- Cybersecurity Initiative, p.18,
<https://static.newamerica.org/attachments/3407-doomed-to-repeat-history-lessons-from-the-crypto-wars-of-
the-1990s/OTI_Crypto_Wars_History.abe6caa19cbc40de842e01c28a028418.pdf>, accessed 11.12.2016.
 Standardization: U.S. Government cryptographic standards are now widespread both within the country and abroad;
 Security: the security challenge has shifted from the pure mathematics of cryptosystems to a higher level, namely the design and implementation of complex software systems;
 Cyber defence: attacks against governmental cyber infrastructures are massively increasing, and there is a long history of exponential growth in the cyber threats to which public institutions and similar bodies are exposed;
 Social and economic reliance on the internet: current times are unquestionably driven by internet-related mechanisms and digital infrastructures. For this reason, the need to secure the online environment has become of the utmost importance, far more so than during the 1990s crypto war. Indeed, since that time, ‘strong encryption has become a bedrock technology that protects the security of the internet, as well as the promotion of free expression online. Moreover, (...) information economy has grown exponentially’79.
In one of the closed sessions of the 38th Annual Conference of Data Protection and Privacy Commissioners, held in Marrakesh in October 2016, Christopher Kuner, a prominent practicing lawyer and academic, gave a speech on encryption from a historical perspective. Some excerpts are worth mentioning, given both the author’s standing and the analysis behind them. Two points of his summary require a closer look, as they highlight the action and reaction of two different groups of actors involved in the previous (and the current) crypto war.
Firstly, he argues that one of the biggest constraints was that many non-U.S. governments did not support the project80. It may sound obvious, but approached from a broader perspective that takes into account the development of international relations, no change on this front can now be observed. Rather, and similarly to the four points mentioned above, the assumption that data equals power has led to the
79
D.Kehl, A. Wilson, K. Bankston, ‘Doomed to Repeat History? Lessons from the Crypto Wars of the 1990s’,
New America, Open Technology Institute- Cybersecurity Initiative, p.18,
<https://static.newamerica.org/attachments/3407-doomed-to-repeat-history-lessons-from-the-crypto-wars-of-
the-1990s/OTI_Crypto_Wars_History.abe6caa19cbc40de842e01c28a028418.pdf>, accessed 11.12.2016
80
C.Kuner, Summary remarks/ closed session ‘Encryption and the Rule of Law: an International Conversation’,
38th Annual Conference of Data Protection and Privacy Commissioner, Marrakech (Morocco) - 17 October 2016,
<https://icdppc.org/wp-content/uploads/2015/03/Dr-Christopher-Kuner.pdf>p.1.
distancing of governments when it comes to the cyber world81: “cryptography rearranges power: it configures who can do what, from what”82, to quote Rogaway. To this end, cyberspace is nothing but an unexplored battlefield where governments jealously claim their sovereignty, and examples of unsuccessful legislative attempts83 support the theory that national entities are not yet ready to sacrifice a piece of their competence for a common purpose. More concretely, and coming back to legal ground, the problem of cross-border jurisdiction84 in the context of investigations in cyberspace can jeopardise any single common purpose, at least as regards the achievement of a fast, harmonized and readily enforceable set of trans-national rules, as Svantesson and Van Zwieten confirm85.
.
Another relevant point of discussion by Kuner concerns the presence and interaction of private industry86. Remarkably, another constraint that resulted in the rejection of the project was the threat by crypto industries and major hi-tech firms to relocate their production (and their headquarters) outside the United States. Bearing that in mind, it is not hard to believe that any government would reflect carefully before introducing rules that drive industries out of the country, especially Silicon Valley firms then at the start of their boom. Undoubtedly, such a policy would hamper innovation and push investment beyond the borders.
Additionally, it must be recognised that certain countries, mostly economic or political competitors of Europe and the U.S., are either drafting or implementing legislation that would force hi-tech industries to collaborate actively with governmental intelligence agencies in order to grant them access to customers’ data87. From the European point of view, ENISA’s position confirms this and widens the scope of the risks that such practices carry:
81
On shifting powers in the Internet society, see for instance B.-J. Koops, ‘Law, Technology, and Shifting Power
Relations’, Berkeley Technology Law Journal, Art.5 [vol. 25:973], March 2010, p.973-1034,
<http://scholarship.law.berkeley.edu/cgi/viewcontent.cgi?article=1848&context=btlj>, accessed 30.11.2016.
82
P. Rogaway, ‘The Moral Character of Cryptographic Work’. University of California, December
2015<http://web.cs.ucdavis.edu/~rogaway/papers/moral-fn.pdf>, accessed 8.12.2016.
83
ECJ, Case C-293/12 Digital Rights Ireland v Data Protection Commissioner
84
see also J. Ellermann, ‘Terror Won’t Kill the Privacy Star – Tackling Terrorism Propaganda Online in a Data
Protection Compliant Manner’, ERA Forum DOI 10.1007/s12027-016-0446-z, 17.11.2016,
<http://www.readcube.com/articles/10.1007/>, accessed 10.12.2016.
85
J. B. Svantesson and L. Van Zwieten, ‘Law Enforcement Access to Evidence Via Direct Contact with Cloud
Providers - Identifying the Contours of a Solution’, Computer Law & Security review 32, Science Direct, Elsevier
Ltd, p.671-682.
86
C.Kuner, Summary remarks/ closed session ‘Encryption and the Rule of Law: an International Conversation’,
38th Annual Conference of Data Protection and Privacy Commissioner, Marrakech (Morocco) - 17 October 2016
, https://icdppc.org/wp-content/uploads/2015/03/Dr-Christopher-Kuner.pdf p.2/3.
87
See for instance J.A. Lewis, ‘Posturing and Politics for Encryption’, Centre for Strategic and International
Studies, 17.2.2016, <https://www.csis.org/analysis/posturing-and-politics-encryption >, accessed 20.7.2016.
‘the perception that backdoors and key escrow exist can potentially affect and undermine the
aspirations for a full embraced Digital Society in Europe’88
.
Whereas tech industries once leveraged this argument in support of strong crypto advocacy, it now seems improbable, at the very least, that big firms would relocate their structures outside the U.S.: on the one hand, many countries where innovation seems fertile and well funded have intrusive surveillance programs in place; on the other, the European option may not be feasible given the privacy implications of the most recent case law89 involving big hi-tech firms, making for a less flexible scenario than the American one. From the perspective of hi-tech industries, a U.S.-based solution still seems the best compromise between trust and surveillance.
Interestingly, and unlike the arguments cited earlier in this chapter, this point does not support the idea advanced in the previous pages (and in the quoted documents) that nothing has changed in the past twenty years apart from the intensification of certain leitmotivs: what manufacturers threatened to do during the 1990s seems quite impractical today (at least under the same conditions). We should nevertheless acknowledge that the battles are being fought on the same fields as twenty-plus years ago, and European players are recognising ‘the experience in the US that limiting the strength of encryption tools inhibited innovation and left the competitive advantage in this area with other jurisdictions’90.
To conclude, it is acknowledged that key escrow would indeed enable governments to unlock encrypted data by forcing enterprises, an entrusted third party or public institutions themselves to store an extra key for all such data. However, mandating such methods would mean exposing businesses and citizens to overwhelming security risks, and would exclude the deployment of certain security features such as perfect forward secrecy91, considered by many a best practice92.
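Perfect forward secrecy, which an escrowed master key would rule out, can be sketched with a toy ephemeral Diffie-Hellman exchange. The parameters below are illustrative and deliberately undersized (a Mersenne prime as modulus; real systems use large standardized groups such as those of RFC 3526 or X25519), and all names are hypothetical:

```python
import hashlib
import secrets

# Toy modulus: the Mersenne prime 2**127 - 1 (far too small for real use).
P = (1 << 127) - 1
G = 3

def ephemeral_session_key() -> bytes:
    """One ephemeral Diffie-Hellman exchange. Both private exponents go
    out of scope afterwards, so the key cannot be re-derived later."""
    a = secrets.randbelow(P - 3) + 2   # Alice's one-time secret
    b = secrets.randbelow(P - 3) + 2   # Bob's one-time secret
    shared_a = pow(pow(G, b, P), a, P)  # Alice computes g^(ab) mod p
    shared_b = pow(pow(G, a, P), b, P)  # Bob computes the same value
    assert shared_a == shared_b
    return hashlib.sha256(shared_a.to_bytes(16, "big")).digest()

# Every session derives an independent key: compromising one key (or one
# device) later does not expose past traffic, since the one-time exponents
# no longer exist anywhere. A mandatory escrowed master key would negate
# exactly this property.
k1, k2 = ephemeral_session_key(), ephemeral_session_key()
assert k1 != k2
```

This is why the ‘Keys Under Doormats’ authors treat forward secrecy and key escrow as structurally incompatible: escrow requires a long-lived key that can decrypt everything, while forward secrecy requires that no such key exist.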
88
ENISA, Opinion Paper on Encryption, December 2016, p.5 <https://www.enisa.europa.eu/publications/enisa-
position-papers-and-opinions/enisa2019s-opinion-paper-on-encryption>, accessed 13.12.2016.
89
De Atley, Richard K.; Wesson, Gail, 21.12.2015 , "SAN BERNARDINO SHOOTING: Bail denied; Marquez
to remain in custody on terror-related charges (UPDATE 3)".<http://www.pe.com/articles/court-789907-
marquez-hearing.html>, accessed 28.12.2016.
90
ENISA, Opinion Paper on Encryption, December 2016, p.5 <https://www.enisa.europa.eu/publications/enisa-
position-papers-and-opinions/enisa2019s-opinion-paper-on-encryption>, accessed 13.12.2016.
91
D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF-
Information Technology and Innovation Foundation, March 2016, p.13 <http://www2.itif.org/2016-unlocking-
encryption.pdf >, accessed 15.09.2016.
According to TechoPedia, Perfect Forward Secrecy (PFS) is ‘a data encoding property that ensures the integrity
of a session key in the event that a long-term key is compromised. PFS accomplishes this by enforcing the
derivation of a new key for each and every session’.
92
J. Van Hoboken, W. Shulz et al., Human Rights and Encryption, UNESCO report, December 2016, p.16
<http://ec.europa.eu/research/participants/portal/desktop/en/opportunities/h2020/topics/ds-08-2017.html.>
accessed 8.12.2016.
This section has shown how, despite the battles of more than two decades ago, actors are still unable to find a common solution for implementing key escrow systems to access encrypted data. Moreover, what we have observed through the assessment of certain documents is that some stakeholders hold the same positions they held during the Clipper Chip phase. Almost nothing has changed; rather, certain constraints on the implementation of key escrow have grown, and these constraints now constitute even bigger limitations than in the past, keeping actors at a distance from one another with regard to this specific solution.
4. JUDICIAL AND INVESTIGATIVE APPROACHES
4.1 Introduction
The following pages will explore the state of play of alternative solutions to overcome encryption, namely mandatory key disclosure warrants, social engineering, and governmental hacking by means of remote live forensics.
The first approach is primarily a legal one, i.e. a solution that relies solely on legal provisions and their enforcement to overcome encryption.
The methodology will mainly focus on the European scenario, assessing the state of the art of such legislation as well as the related court cases. However, given its relevance in the debate, particular attention will also be paid to the United States, as its legal regime offers interesting points of reflection, bearing in mind the cross-border nature of this issue. As many have argued93 over the last years, an approach that also takes overseas dynamics into due consideration is of the utmost relevance, since one of the newest and sometimes most problematic weaknesses of a cyber-investigation is the need for European law enforcement authorities to liaise with actors that are often based outside European borders.
The discussion around disclosure orders offers a paradigmatic example of how the regulator can grant law enforcement authorities a series of powers and rights to force individuals to comply with such an order. As we will see, however, this approach can be difficult to pursue in certain circumstances (owing to professional secrecy obligations or to central principles such as the privilege against self-incrimination).
Attention will then be paid to the newly enacted Investigatory Powers Act 2016, which updates the legislation regulating surveillance in the United Kingdom. This focus aims to show how provisions regarding cyber investigations, and encryption in particular, still fail to achieve a balanced approach or full consensus among their addressees. The facts and documents analysed support the conclusion that, owing to their intrusiveness and potential infringement of human rights, mandatory warrants should be considered the last resort among the options available to law enforcement authorities to achieve an effective result.
93
Svantesson J. B. and Van Zwieten L., ‘Law Enforcement Access to Evidence Via Direct Contact With Cloud
Providers - Identifying the Contours of a Solution’, Computer Law & Security review 32, Science Direct,
Elsevier Ltd, 2015, p.671-682.
The second possibility is social engineering. Although a purely alternative approach, examples of effective social engineering will show how this practice still plays a crucial role in overcoming encrypted communications, especially in cases of serious crime and terrorism.
The chapter concludes with an assessment of so-called live forensics94 techniques. Although this practice has been criticised several times owing to a number of constraints, it is increasingly used in countries where it is allowed or where no strict regulation on the matter exists.
4.2 Mandatory key disclosure warrants
Globally, including in Europe, some countries have in their criminal or procedural legislation a provision enabling law enforcement agencies to force individuals to reveal their private keys for the purpose of investigating and prosecuting a crime95 (Belgium96 and France97, to name a few98).
Under those provisions, and by means of a court order, judges can require the disclosure of the encrypted content99 of a subject’s devices when deemed relevant to a specific case or investigation. If such a request is refused, detention or fines are foreseen for the subject, provided the court deems the undisclosed data potentially relevant (in light of the overall forensic collection of evidence).
This solution raises several points of reflection, mostly procedural, since much criticism has targeted the collision of such norms with essential principles of criminal and procedural law. Additionally, the practice can be relatively easy to circumvent technically100.
94
‘Live forensics considers the value of the data that may be lost by powering down a system and collect it while
the system is still running. The other objective of live forensics is to minimize impacts to the integrity of data while
collecting evidence from the suspect system’, MacForensicsLab, Definition of Live Forensics,
<http://www.macforensicslab.com/index.php?main_page=document_general_info&products_id=212>,
accessed 22.12.2016
95
S. Ranger,’The Undercover War on Your Internet Secrets: How Online Surveillance Cracked Our Trust in the
Web’. 24.3.2015, TechRepublic, <http://www.techrepublic.com/article/the-undercover-war-on-your-internet-
secrets-how-online-surveillance-cracked-our-trust-in-the-web/>, accessed 22.12.2016.
96
Loi du 28 novembre 2000 [2000-11-28/34] Relative à la Criminalité Informatique (Article 9).
97
Loi no 2001-1062 du 15 novembre 2001 Relative à la Sécurité Quotidienne (Article 30).
98
See next section for a focus on the United Kingdom.
99
For the purpose of this Thesis, the following analysis will mainly focus on full-disk-encryption.
100
As will be further explained later, by means of forward secrecy or deniability.
The preliminary consideration concerns a series of checks-and-balances requirements that such an order should satisfy, specifically with regard to compliance with the rule of law.
Firstly, such a warrant should be based on a publicly accessible law101 which clearly limits its scope. In Rotaru v Romania102, the European Court of Human Rights decided in favour of the applicant because domestic law did not specify which information, and pertaining to whom, could be processed for surveillance purposes within the scope of national security. By contrast, a key disclosure order should effectively define the terms under which such a warrant can be issued, avoiding a blanket approach103.
Moreover, legislation that provides investigative bodies with this tool should be “(...) implemented under independent and impartial judicial authority, in particular to preserve the due process rights of targets, and only adopted when necessary and when less intrusive means of investigation are not available”104.
Indeed, what Kaye recommends in summarizing the findings of his report relates to the principles of proportionality and necessity. A disclosure warrant based on unlimited (and untargeted) powers given to law enforcement authorities should not be lawfully pursuable, and this test can be assessed against two familiar concepts: independence and impartiality. Such measures find a lawful ground only when put in place for a limited and definite aim.
On the other hand, mandatory key disclosure warrants present some clear implementation difficulties, briefly explained below. It should first be noted that these critical issues relate to both legal and technical constraints.
As for the former, one of the most debated arguments posing a serious limitation to this solution relates to the right against self-incrimination, broadly known as the right to remain silent.
101
UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right
to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 45.
102
ECtHR, Rotaru v. Romania [GC], No. 28341/95, 4 May 2000, para. 57.
103
See also ECtHR, Taylor-Sabori v the United Kingdom, No 47114/99, 22 October 2002.
104
UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right
to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 45.
Recalling some old-fashioned TV shows, ‘pleading the Fifth’105 (Amendment106), i.e. exercising the right to remain silent, represents a crucial legal counter-argument against mandatory key disclosure orders. The right against self-incrimination, the principle under which no one may be compelled to incriminate himself for the commission of an offence by means of testimonial evidence, is one of the most debated topics in discussions of solutions to overcome encryption. Focusing on the European scenario, the principle is indirectly107 protected by Article 6 of the European Convention on Human Rights108 and serves as a vital defence for a suspect or a witness109. Specifically, it ‘protects a defendant from the ‘cruel trilemma’ of offering incriminating evidence against oneself and risking a criminal conviction; lying to government officials and risking perjury; or keeping silent and risking contempt of court’110. Expectedly, a number of national jurisdictions within Europe follow these traditional criminal law principles, applying them through different procedural means: France111 and Germany112, for example, both embed these approaches in their legislation.
Broadly speaking, this concept is not so distant from the principles accompanying the right to privacy: in fact, it leads to an enlargement of legislators’ view on what should be
105
Constitution of the United States of America, Amendment V: ‘No person shall be held to answer for a
capital, or otherwise infamous crime, unless on a presentment or indictment of a Grand Jury, except in cases
arising in the land or naval forces, or in the Militia, when in actual service in time of War or public danger; nor
shall any person be subject for the same offence to be twice put in jeopardy of life or limb; nor shall be
compelled in any criminal case to be a witness against himself, nor be deprived of life, liberty, or property,
without due process of law; nor shall private property be taken for public use, without just compensation’
106
One of the first cases in the United States that addressed a suspect giving up an encryption key is U. S. District
Court for the District of Vermont, United States v. Boucher, 2009
107
ECtHR, Murray v UK, [1996]
108
European Convention on Human Rights (ECHR), Article 6:
'1. In the determination of his civil rights and obligations or of any criminal charge against him, everyone is entitled to a fair and public hearing within a reasonable time by an independent and impartial tribunal established by law. Judgment shall be pronounced publicly but the press and public may be excluded from all or part of the trial in the interests of morals, public order or national security in a democratic society, where the interests of juveniles or the protection of the private life of the parties so require, or to the extent strictly necessary in the opinion of the court in special circumstances where publicity would prejudice the interests of justice.
2. Everyone charged with a criminal offence shall be presumed innocent until proved guilty according to law.
3. Everyone charged with a criminal offence has the following minimum rights: (a) to be informed promptly, in a language which he understands and in detail, of the nature and cause of the accusation against him; (b) to have adequate time and facilities for the preparation of his defence; (c) to defend himself in person or through legal assistance of his own choosing or, if he has not sufficient means to pay for legal assistance, to be given it free when the interests of justice so require; (d) to examine or have examined witnesses against him and to obtain the attendance and examination of witnesses on his behalf under the same conditions as witnesses against him; (e) to have the free assistance of an interpreter if he cannot understand or speak the language used in court.'
109
See for instance, Blau v. United States, 340 U.S. 159, 71 S. Ct. 223, 95 L. Ed. 170 (1950)
110
R.M. Thompson II and C. Jaikaran, 'Encryption: Selected Legal Issues', Congressional Research Service, 3.3.2016, <https://fas.org/sgp/crs/misc/R44407.pdf>, accessed 5.12.2016.
111
Code de Procédure Pénale, Art. L116 [FR]
112
Strafprozessordnung, § 136 [DE]
Overcoming Encryption in Investigations

Table of Contents

Acknowledgments ... 3
1. INTRODUCTION ... 4
1.1. Framing the research question ... 4
1.2. Methodology and purpose ... 5
2. ENCRYPTION TODAY, THE GOLDEN AGE OF SURVEILLANCE ... 8
2.1. Introduction ... 8
2.2 Importance of encryption for human rights and civil liberties ... 8
2.3. State of play of the Going Dark debate ... 13
3. TECHNICAL APPROACHES ... 19
3.1 Introduction ... 19
3.2. Backdoors ... 19
3.2.1. A semantic issue ... 21
3.2.2. Backdoors vs front doors ... 23
3.3 Key-escrow solutions ... 24
4. JUDICIAL AND INVESTIGATIVE APPROACHES ... 32
4.1 Introduction ... 32
4.2 Mandatory key disclosure warrants ... 33
4.3 The UK 2016 Investigatory Powers Act and the decryption order ... 39
4.4 Investigative approaches ... 42
4.4.1 Social engineering ... 43
4.4.2 Live forensics ... 46
5. CONCLUSIONS ... 50
BIBLIOGRAPHY ... 54
Acknowledgments

First and foremost, I would like to express my sincere gratitude to my supervisor, Lorenzo Dalla Corte. The completion of this Thesis would not have been possible without his extensive support and excellent guidance. I also sincerely thank the Data Protection Office, the European Counter-Terrorism Centre and the European Cybercrime Centre of Europol, as well as the IT Policy Sector of the European Data Protection Supervisor. My internship in these organizations greatly enriched this Thesis with a profound insight into the analysed topic. I am extremely grateful to Jan Ellermann, who closely followed my path and helped me understand the value of this research. Thank you to Lodewijk van Zwieten, Camille Antunes, Desislava Borisova and Sabine Berghs for having supported this work with interest, priceless expertise and friendly advice. Last but not least, a big thank you to Emeline and to my friends for supporting me, always and unconditionally; I would have been lost without you ['guai se non ci foste stati']. This Thesis is dedicated to my parents, Eleonora and Paolo.
1. INTRODUCTION

1.1. Framing the research question

Following the Snowden revelations, a renewed fear of being indiscriminately observed by Big Brother reignited the public debate, lowering the level of trust between citizens and law enforcement. The watchdogs, accountable for ensuring our safety and security, are at the heart of a storm surrounding the mass collection of personal data in the years following 9/11. This fear played a role in the decision of many communication providers to step away from cooperating actively and closely with the intelligence community1. It would be redundant to go through all the recent episodes of a saga that repeats itself like a historical cycle: the so-called 'crypto war', the name given to the attempt by the U.S. Government (and others after it) to limit access to cryptographic systems2 in order to facilitate law enforcement or intelligence operations, is now taking place again after more than twenty years. In addition to the current and growing discussion on the collection of metadata, the second crypto war has created a situation in which public opinion places itself between the perpetrator and the investigator, in a debate that often invokes privacy as the ultimate safeguard against unauthorized intrusions. The social legitimation of conducting effective intelligence and law enforcement operations is increasingly challenged, and the number of legal implications is enormous. This master's Thesis builds upon the above premise and discusses the most feasible and lawful policies to be adopted in order to overcome encryption when intercepting a communication, or when retrieving data post-mortem. In doing so, the research question explores the most recent arguments put forward in support of each of the analysed solutions.
The purpose of encryption can be seen as twofold: its protective use can turn it into a crime facilitator, but it can undoubtedly also be an instrument necessary to protect an idea or an opinion. Approached from another perspective, encryption's double connotation raises the following inherent contradiction: despite being a useful tool to secure private data transfers, it

1 For instance, by implementing robust end-to-end encryption in their services, or by raising privacy concerns when asked to provide technical assistance and backdoors in various court cases, often involving terrorism-related crimes.
2 By means of key escrow systems or backdoors, extensively analysed further on.
creates a challenging obstacle to unauthorized access to the plaintext. This is extremely frustrating for the law enforcement and intelligence community. The paradox is highlighted by the growth of forms of crime that use the online world either as an instrument to commit an offence or as the setting of a networked crime. Perpetrators are very much aware of what encryption allows, as confirmed by the success of their illegitimate activities. The following analysis explores the state of play of the various legal hypotheses set forth to overcome encrypted data in the course of an investigation. One of the purposes behind the research question is to assess the real need for law enforcement to solve this issue, and the levels of admissibility and accountability of evidence within the overall forensic panorama, in light of policing compliant with human rights.

1.2. Methodology and purpose

The encryption debate is currently examined from various perspectives. This Thesis focuses on the latest developments of some of the most discussed policies supported by the law enforcement and intelligence communities for overcoming encrypted data. After a horizontal overview of the topic, three different paths to evaluate the research question become apparent. The first deals with the concept of encryption, its functioning and the solutions therein. Analysing this tool from a legal point of view can help us understand its significance from a human rights perspective, as well as clarify what the concept of 'going dark' entails. A second way of tackling the topic would be a strictly regulatory approach, trying to define a comparative landscape that takes into consideration the common and harmonized standards of several legal systems. Lastly, an alternative approach could be pursued.
It would pay attention to the best ways to collect evidence by implementing alternative methodologies aimed at circumventing encryption rather than tackling it upfront (offering both technical and non-technical solutions). This Thesis will primarily adopt the first approach. By initially evaluating encryption conceptually, we will be able to acknowledge the two sides of the coin. Once these are assessed, the reader will understand the main principles to take into account throughout the dissertation. We will begin by appraising the importance of encryption for the protection of human rights. Following this, we will explore the recent trends of the so-called going-dark debate, analysing new developments of the widely criticized method of weakening encryption.
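The technical core of the obstacle described above can be made concrete with a short sketch. The snippet below is a toy illustration only (a hash-based XOR stream, not a vetted cipher; deployed systems rely on reviewed primitives such as AES), and all names in it are illustrative: it shows why ciphertext that is intercepted or seized is opaque to an investigator who lacks the key, while the key holder recovers the plaintext trivially.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic pseudo-random keystream from the key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream; applying it again undoes it."""
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are symmetric

message = b"draft of a privileged communication"
key = b"long-random-secret-key"
ciphertext = encrypt(key, message)

assert ciphertext != message                     # the stored/transmitted form is unreadable
assert decrypt(key, ciphertext) == message       # the key holder recovers the plaintext
assert decrypt(b"wrong-key", ciphertext) != message  # a wrong key yields garbage
```

The symmetry of the last two assertions is the whole point of the debate: without the key, 'tackling encryption upfront' is a mathematical problem (an infeasible brute-force search over the key space), not a procedural one, which is why circumvention strategies are discussed at all.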
The following chapters will consider some of the main solutions discussed to overcome encryption, namely backdoors, key escrow, mandatory key disclosure orders, social engineering and live forensic tools. The purpose is to assess, by highlighting the latest developments of each, whether the law enforcement community is currently discussing a balanced approach to the issue. The conclusion will give a consistent answer to this question, taking into account the various hypotheses that have been examined. A look towards the future will be fundamental, both from a legal and a practical point of view. For the former, it will be interesting to evaluate the dichotomy 'privacy vs. security': through it, we will come to understand how this formula is out-of-date and misleading, both theoretically and concretely3. As for the technical development of encryption, some questions will have to be answered: what do we expect the landscape to look like? Will other mechanisms be in place? The added value of this research builds upon the paradigm used for the analysis. Drawing from Lessig's prominent theories4 on how the regulation of the internet is evolving, this Thesis will aim to assess how the different constraints of cyberspace interact with each other. Generally speaking, this starting point will be a valuable tool for a broader evaluation of how different actors are developing their own positions in the encryption debate. Unsurprisingly, the attempt is to demonstrate that the law, among the modalities of regulation, has not yet settled on a stable approach. The balance between the law and the other constraints is nevertheless fundamental in order to engage in a proper debate and overcome this moment of regulatory impasse. The importance of this assessment is confirmed by the fact that the statement 'data equals power' has never been so true.
In the course of this analysis, there will be recurring elements to take into account, starting with the role of private parties. The relationship between the law enforcement community and the hi-tech industry is key. Considering the reasons why this relationship is currently being tested will give us a better idea of how these players are expected to act and react in the future. Secondly, the research intends to explore the core of the dispute as it stands. The purpose is to determine whether the debate is keeping us away from what the discussion should be focusing on, namely the real need for law enforcement to find a prompt solution to the issues deriving from widespread encryption. Indeed, although there undoubtedly are

3 H. Nissenbaum, 'Where Computer Security Meets National Security', Ethics and Information Technology (2005) 7:61–73, <https://www.nyu.edu/projects/nissenbaum/papers/ETINsecurity.pdf>, accessed 15.10.2016.
4 L. Lessig, 'The Laws of Cyberspace', Taiwan Net '98 Conference, <https://cyber.harvard.edu/works/lessig/laws_cyberspace.pdf>, accessed 22.09.2016.
difficulties, a global and detached approach should help us realize the factual value of intercepting a plaintext, and the real risk of jeopardizing a related investigation if such interception is made impossible by encryption technologies. As such, it will be relevant to highlight on which grounds the problem is currently being addressed, in order to identify common ground (if any) and accepted parameters that should be considered. The research question allows us to conduct an evaluative exercise, starting with a descriptive analysis of the latest developments on the subject matter. These reflections are to be understood as a common ground for broader assessments, which will take into consideration a number of primary sources, alongside official reports by international and European organisations and private stakeholders. Moreover, attention will be paid to the evolution of the jurisprudence, in order to provide the research with a practical narrative on how encryption has recently been dealt with by national and trans-national jurisdictions. Lastly, this work intends to draw on two other fundamental sources: legal doctrine and other academic literature, which will support a conceptual analysis carried out by extrapolating some of the main principles, and face-to-face interviews with experts, prosecutors and practitioners, necessary to understand how the same concepts are interpreted by actors who are confronted with them on a daily basis. Without losing sight of the legalistic features of this work, a multi-disciplinary approach is essential in order to reach a critical perspective on the issue at stake. Since this research takes into consideration some technical elements of encryption technologies, it will be fundamental to consider some of the para-legal aspects that are inherently related.
A legal scholar is now increasingly asked to engage with a variety of parallel disciplines in order to reach a fuller understanding of the argument examined. This analysis is nevertheless aimed at reflecting upon the impact of encryption on modern society, fleshing out the various views held on the same issue by the different actors involved.
2. ENCRYPTION TODAY, THE GOLDEN AGE OF SURVEILLANCE

2.1. Introduction

This chapter aims to dive into the twofold nature5 of encryption. The starting point of this research is an in-depth analysis of the state of play across current interpretations of the concept. On the one hand, encryption is a technology of utmost importance for businesses and citizens, for their interests but most of all for the protection of their rights. It is now recognized as a vehicle for securing our thoughts online, as confirmed by David Kaye6, and the upcoming obligations under EU law demonstrate the attention of policy makers to safeguarding the practice. Moreover, the following section will underline that in certain circumstances encryption protects individuals from state intrusion, where laws and their enforcement create a 'chilling effect' on subjects and their freedom of expression, straining constitutional concepts such as the rule of law and the democratic state. On the other hand, this chapter explores how, and to what extent, law enforcement authorities are struggling against the wall that such a tool erects around people's electronic communications. The mainstream terminology ('going dark') will have to be analysed in light of an impartial assessment of the problem at stake. It is indeed acknowledged that end-to-end or built-in encryption represents an obstacle to accessing such data, but the dimension of the issue may have been overstated, and ordinary citizens' perception of it may have been distorted.

2.2 Importance of encryption for human rights and civil liberties

Today, encryption7 represents an advantageous tool for our globalized and digitalized third-millennium society. Its implications from economic and trade perspectives are widely recognized8.
From a fundamental rights perspective, encryption is an essential way to protect

5 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 12.
6 UN Special Rapporteur on the right to freedom of expression and opinion.
7 For the purpose of this Thesis, encryption will mainly be considered as follows: end-to-end encryption (referring to encrypted information in transit) and built-in encryption (relating to data stored on a device or in the cloud).
8 See also R. Hagemann and J. Hampson, 'Encryption, Trust, and the Online Economy: An Assessment of the Economic Benefits Associated With Encryption', Niskanen Centre, 1-25, 9.11.2015, <https://niskanencenter.org/wp-content/uploads/2015/11/RESEARCH-PAPER_EncryptionEconomicBenefits.pdf>, accessed 30.11.2016; and D. Castro and A. McQuinn, 'Unlocking Encryption: Information Security and the Rule of Law', ITIF - Information Technology and Innovation Foundation, March 2016, pp. 1-50, <http://www2.itif.org/2016-unlocking-encryption.pdf>, accessed 15.09.2016.
information when sent through the networked environment. Furthermore, the protection of human rights fully applies to the digital world9. Our assessment should start from these topical considerations: there is much more of a connection between them than we might expect. This paragraph links them together with the aim of clarifying the actual impact of encryption on the protection of human rights and civil liberties. The use of encryption is now widespread, and can be identified within the six clusters or areas into which its taxonomy is divided, where such technologies are deployed: e-mail, e-commerce, internet services, the darknet, new open-source tools (see Open Whisper Systems) and next-generation telephone networks (NGN)10. It is of primary importance to start from the view of many policy makers, legislators and IT experts11 who integrate encryption as a privacy-by-design concept12; according to Bart Preneel and Seda Gurses, this instrument can be interpreted as aimed at 'the protection of information and computation using mathematical techniques'13. Understanding technical security as an instrument to protect fundamental rights is becoming crucial, and information security is undoubtedly an essential enabler of online privacy. Without it, a number of civil liberties would not be protected and trust could vanish. This argument is confirmed by the UN Special Rapporteur on the right to privacy, Joseph Cannataci, in the last section of his first report, titled Ten Points Action Plan:

The safeguards and remedies available to citizens cannot ever be purely legal or operational. Law alone is not enough. The SRP will continue to engage with the technical community in an effort to promote the development of effective technical safeguards including encryption, overlay software and

9 Dutch Ministry of Security and Justice (Criminal Policy Dept.), Cabinet's View on Encryption.
Joined Opinion of the Dutch Minister of Security and Justice (G.A. van der Steur) and the Dutch Minister of Economic Affairs (H.G.J. Kamp) to the President of the House of Representatives of the States General, The Hague, 4.1.2016, ref. 708641.
10 ENISA, Opinion Paper on Encryption, December 2016, p.10, <https://www.enisa.europa.eu/publications/enisa-position-papers-and-opinions/enisa2019s-opinion-paper-on-encryption>, accessed 13.12.2016.
11 J. Lang, A. Czeskis, D. Balfanz, M. Schilder, S. Srinivas, 'Security Keys: Practical Cryptographic Second Factors for the Modern Web', Google, Inc., Mountain View, CA, USA, pp. 1-18, 25.2.2016, <http://fc16.ifca.ai/preproceedings/25_Lang.pdf>, accessed 22.12.2016.
12 This term was first introduced by the Information and Privacy Commissioner of Ontario (Canada), referring to the practice of implementing Privacy Enhancing Technologies (PETs) and similar mechanisms within the design phase of information technology systems. The concept has since been broadened to other technologies and consolidated in several legal texts, notably the new General Data Protection Regulation (Article 25: Data protection by design and by default).
13 J. Van Hoboken, W. Schulz et al., Human Rights and Encryption, UNESCO report, December 2016, p.9, <http://ec.europa.eu/research/participants/portal/desktop/en/opportunities/h2020/topics/ds-08-2017.html>, accessed 8.12.2016.
  • 10. 10 various other technical solutions where privacy-by-design is genuinely put into practice14 . A 2016 dossier of the European Union Agency for Network and Information Security (ENISA) on the use of privacy enhancing technologies15 (PETs) in the context of big data reports that encrypted tools not only play a vital role from a technical standpoint, but are also used to raise awareness and build trust amongst users16 . Bearing that in mind, it is not difficult to support the view that such tools are valuable for the protection of human rights, with regard to both the confidentiality17 and the integrity of the information they protect18 . The growing interest of international institutions is notable, as this argument fits within the broader tendency of recent decades to translate the full set of fundamental rights into the online world19 . Focusing on the rights to freedom of opinion and expression, the internet should be considered an extremely effective means of multiplying information in a central public world forum, as David Kaye maintains20 . In his 2015 official report on the promotion and protection of the right to freedom of opinion and expression, the UN Special Rapporteur shows that encryption provides a fertile area for securing ideas of any kind. What is interesting, from a sociological point of view, is that encryption is not only considered a form of protection against random unauthorized access. Indeed, over the last few years, users have started to regard governments and law enforcement agencies as the main potential intruders: ‘where States impose unlawful censorship through filtering and other 14 UN – Human Rights Council, Report of the Special Rapporteur on the Right to Privacy, Joseph A. Cannataci, A/HRC/31/64, 8.3.2016, p.19. 
15 ‘Privacy Enhancing Technologies are designed to provide functionality while minimizing the user data that becomes accessible to the service provider. The most popular examples can now be found in the market of private messaging’ J. Van Hoboken, W. Shulz et al., Human Rights and Encryption, UNESCO report, December 2016, <http://ec.europa.eu/research/participants/portal/desktop/en/opportunities/h2020/topics/ds-08-2017.html.> accessed 8.12.2016. 16 ENISA, Privacy by Design in Big Data: an Overview on Privacy Enhancing Technologies in the Era of Big Data Analytics, Final Version, 1.0/Public, Dec. 2015, p.6. 17 Working Party Article 29, Opinion 3/2016, On the Evaluation and Review of the ePrivacy Directive (2002/58/EC), WP240 adopted on 19 July 2016. 18 EDPS, Preliminary EDPS Opinion On the Review of the ePrivacy Directive (2002/58/EC), Opinion 5/2016, 22.07.2016, p.19. 19 As an example, regarding the right to access to public information and the protection of whistle-blowers, see for instance UN – Human Rights Council, Report of the Special Rapporteur to the General Assembly on the Protection of Sources and Whistleblowers - David Kaye, 8.9.2015 (para. 62), where is recommended to the states to implement encryption and promote it in order to ensure the protection of sources. 20 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 11.
  • 11. 11 technologies, the use of encryption (…) may empower individuals to circumvent barriers and access information and ideas without the intrusion of authorities’21 . The value of this statement is supported by a second argument, which confirms the urgency for these principles to be laid down in binding and enforceable legal texts. Along the lines of David Kaye’s report22 , a parallel may be drawn, for instance, between art. 17(2) of the International Covenant on Civil and Political Rights23 , which states the positive obligation for states to implement legislative measures to protect privacy against unlawful infringements (regardless of whether the wrongdoer is a private entity or a governmental institution), and the current situation, in which these fundamental rights frequently need stricter protection in the networked environment. Touching upon these examples should help us understand how the international community is trying to promote encryption and other Privacy Enhancing Technologies (PETs) as standardized and practiced ways to protect and promote civil liberties. This idea is supported by the latest reports released on the exercise of civil rights within the digital world, mostly performed by assessing the impact of protective instruments like encryption and by elaborating on that impact using specific case-studies, as this section will further specify. As acknowledged by the European Data Protection Supervisor24 (EDPS), intrusive surveillance software would enable authorities to bypass encryption when using such technologies for investigative purposes25 , therefore creating an interference with the right to privacy and the right to freedom of expression (often observed as complementary rights). In March 2016, Amnesty International released a dossier on encryption, showing how the digital revolution generated ‘new opportunities for state surveillance’26 : ‘were it not for encryption, states’ reach into the internet could be total. 
Even with encryption, states are still in a position to intercept communication en masse’27 . This statement proves the focus, as well 21 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 12. 22 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 18 23 International Covenant on Civil and Political Rights, Article 17: 1. No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation. 2. Everyone has the right to the protection of the law against such interference or attacks. 24 EDPS, Opinion 8/2015 On the Dissemination and Use of Intrusive Surveillance Technologies, 15/12/2015. 25 V.Blue, ‘How Spyware Peddler Hacking Team Was Publicly Dismantled’, (Engagdet, 07.09.15), <https://www.engadget.com/2015/07/09/how-spyware-peddler-hacking-team-was-publicly-dismantled> accessed 29/08/16. 26 Amnesty International, Encryption: a Matter of Human Rights, POL 40/3862/2016, March 2016, p.10. 27 Amnesty International, Encryption: a Matter of Human Rights, POL 40/3862/2016, March 2016, p.11.
  • 12. 12 as the immense strength of the online world, leading actors to acknowledge the importance of having some level of control over the flow of digital information. Remarkably, in July 2016, Amnesty International (AI) published another report, this time on the use of surveillance techniques against activists and media in Belarus28 . It puts forward evidence highlighting a serious level of intrusiveness by national authorities, which exploit such methods to threaten political adversaries, members of the press community and opponents. AI’s analysis of the legal framework of Belarus serves as an example of what David Kaye voiced in his report29 . The UN Rapporteur tackles the issue of national laws on encryption from a dual approach, stating that, more often than we think, domestic restrictions are not necessary to meet a legitimate interest, and are sometimes disproportionate when assessed in light of the right to freedom of expression30 : ‘a proportionality analysis must take into account the strong possibility that encroachments on encryption and anonymity will be exploited by the same criminal and terrorist networks that the limitations aim to deter’31 . The two missing characteristics identified by the Rapporteur are the failure of the law to provide instruments of protection (a) and accountability systems (b). That said, evidence of such insufficiency is put forward by Kaye through cases in Latin America and Asia: in these circumstances, the easiest targets of this legal discrepancy are media communities, informants and whistle-blowers, often seen by ‘non-democratic’ governments as serious threats against their supremacy. 
This test applies squarely to the case of Belarus highlighted by Amnesty: first and foremost, from a general point of view, national law and its application through surveillance techniques create a chilling effect on individuals, leading entire groups of citizens to perceive a permanent sense of governmental control and, conversely, lowering their trust in public institutions. Moreover, focusing on the role of 28 Amnesty International, It’s enough for People to Feel It Exists. Civil Society, Secrecy and Surveillance in Belarus, EUR 49/4306/2016, July 2016. 29 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 38 30 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 39 31 ‘A proportionality analysis must take into account the strong possibility that encroachments on encryption and anonymity will be exploited by the same criminal and terrorist networks that the limitations aim to deter’. UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 12
  • 13. 13 encryption in this scenario, Amnesty noted it as ‘absolutely necessary’32 , despite sometimes being burdensome to media’s work33 . A first conclusion can be made, drawing from the previous assessment, on the balance between the law and other constraints. The Belarus case is a clear example of how regulating encryption (directly or indirectly) does not necessarily ensure effective safeguards, often lacking the essential standards foreseen by human rights principles. Indeed, by a careful understanding of the scenario, the two legal failures (namely, lack of protection and accountability) can be easily identified within this example, therefore confirming the extreme value of an instrument like encryption, vehicle for democracy and freedom of expression, first and foremost in those places and countries where certain civil liberties are not guaranteed and some minorities are deliberately targeted by governmental institutions. However, it must be understood that a certain level of restriction of this right is collectively and legally granted, under certain conditions and with certain safeguards. Checks and balances should be present in the legislation to ensure that no purpose other than (inter)national security is pursued. On this basis, law enforcement and intelligence services are therefore legally entitled to invade a citizen’s right to privacy for national security reasons. Recently, the whole debate focused on the use of such measures for the fight against terrorism. In this context, as we will be able to assess in the next section, encryption plays a fundamental role. 2.3. State of play of the Going Dark debate In July 2016, the American-based threat intelligence company Flashpoint released a report called ‘Tech for Jihad’34 , which explored the most recent state-of-the-art uses of communication technologies by terrorist groups, with a particular focus on the Islamic State35 . 
It has emerged that these collectives have overcome the initial familiarization stage, and the use of certain tools is now widespread36 . Tor and Opera browsers now extensively 32 Amnesty International, It’s enough for People to Feel It Exists. Civil Society, Secrecy and Surveillance in Belarus, EUR 49/4306/2016, July 2016, p.19. 33 Amnesty International, It’s enough for People to Feel It Exists. Civil Society, Secrecy and Surveillance in Belarus, EUR 49/4306/2016, July 2016, p.20. 34 The Flashpoint Group, Tech For Jihad: Dissecting Jihadists’ Digital Toolbox, 22.7. 2016, < https://www.flashpoint-intel.com/library/>, accessed 29.11.2016. 35 For a deeper view of the current terrorist threat, see Europol, European Union Terrorism Situation and Trend Report ‘TE-SAT’ 2016, <https://www.europol.europa.eu/sites/default/files/documents/europol_tesat_2016.pdf>, accessed 30.11.2016, or Europol, The Internet Organized Crime Threat Assessment ‘IOCTA’ 2016 < https://www.europol.europa.eu/sites/default/files/.../europol_iocta_web_2016.pdf >, accessed 28.11.2016. 36 See also Europol, Changes in Modus Operandi of Islamic State (IS) - revisited, 02.12.2016.
  • 14. 14 accompany online interactions amongst these actors, as well as end-to-end encrypted applications like Threema or Telegram. With regard to the latter, Flashpoint explains how the possibility for users to communicate within ‘secret chats’ has generated a fertile channel for secure communications, as the feature uses client-to-client encryption (i.e., messages are readable only by the end-devices) and allows for their complete deletion after use. Furthermore, encrypted e-mail services like Hushmail or Tutanota are also widely used. As the outcomes of this report show, terrorist groups have understood the importance of cyber security for protecting their data, developing it as an asset in their most recent modus operandi and including it as a pivotal tool within the recruitment cycle. On the other hand, ‘for intelligence agencies and eavesdroppers alike, (messaging and) e-mail surveillance is a primary way to monitor an actor’37 . The frustration of cyber investigators on this matter has strongly influenced a broader public debate on lawful access by intelligence and law enforcement to citizens’ devices and digital storage media. In spite of the increasingly superficial turn this discussion is taking at the public level, which leads citizens to assess the issue from a less than impartial point of view, it remains relevant to understand that law enforcement agencies are still objectively experiencing a challenging period when it comes to encryption within investigations related to terrorism and other forms of serious crime.
Fig. 1: Castro and McQuinn’s table on how encryption affects government access to data38
- Law enforcement authorities (LEAs), data at rest: LEAs cannot gain access to encrypted data stored on a user’s device or in the cloud, even with a search warrant.
- Law enforcement authorities (LEAs), data in motion: LEAs cannot use wiretaps to intercept communications.
- Intelligence agencies, data at rest: the intelligence community cannot gain access to encrypted data stored on a user’s device or in the cloud, including bulk access to user data.
- Intelligence agencies, data in motion: the intelligence community cannot analyze communications for trigger terms.
37 The Flashpoint Group, Tech For Jihad: Dissecting Jihadists’ Digital Toolbox, 22.7.2016, <https://www.flashpoint-intel.com/library/>, accessed 29.11.2016, p.10. 38 D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF - Information Technology and Innovation Foundation, March 2016, pp. 1-50, <http://www2.itif.org/2016-unlocking-encryption.pdf>, accessed 15.09.2016.
  • 15. 15 Specifically, the term “going dark” was used during a press conference in 2014 by the FBI’s director James Comey39 , who argued that, within the recruitment phase of some terrorist groups, a consistent part of the process is undertaken through encrypted communications, thus leading police to ‘go dark’ by losing suspects’ traces in the absence of a legal basis for intervening. Comey repeated it to the Senate Judiciary Committee in August 2015 (“we see them giving directions to move to a mobile messaging app that is encrypted. And they disappear”40 ), and later testified again: “there is no doubt that the use of encryption is part of terrorist tradecraft now because they understand the problems we have getting court orders to be effective when they’re using these mobile messaging apps, especially that are end-to-end encrypted”41 . The San Bernardino case42 is one of the most well-known examples of a case which triggered the attention of all the actors, from policy makers to other groups of interest. It is relevant to note how the interaction between the industry representative and the law enforcement body proved to be imbalanced towards the former. Apple resisted the FBI’s requests, raising (costly) architectural problems, as well as ethical and privacy implications. It needs to be acknowledged that this is not the first attempt to regulate similar matters, as confirmed by the report ‘Keys under Doormats’43 . The article begins by noting that law enforcement already tried (unsuccessfully) to regulate lawful access, but after long debates the projects were abandoned, and today we are experiencing a so-called repetition of history, 39 James B. 
Comey, ‘Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?’ Speaking Note at Brookings Institution , Washington D.C, 16.10.2014, <https://www.fbi.gov/news/speeches/going-dark- are-technology-privacy-and-public-safety-on-a-collision-course>accessed 10.10.2016. 40 U.S. Committee on Homeland Security, Going Dark, Going Forward, A Primer on the Encryption Debate, House Committee on Homeland Security Majority Staff Report, June 2016, <https://homeland.house.gov/wp- content/uploads/2016/07/Staff-Report-Going-Dark-Going-Forward.pdf>, accessed 28.11.2016. 41 CQ press, Senate Judiciary Committee Holds Hearing on FBI Oversight, CQ Congressional Transcripts, 9.12 2015 (Available at < http://www.cq.com/doc/congressionaltranscripts-4803506?2 >, accessed 26.11.2016. 42 E. Perez ‘Apple opposes judge's order to hack San Bernardino shooter's iPhone’ CNN, 17.02.2016 <www.cnn.com/2016/02/16/us/san-bernardino-shooter-phone-apple>, accessed 11.10.2016. 43 H. Abelson, R. Anderson, S. M.Bellovin, J. Benaloh, M.Blaze, W. Diffie, J. Gilmore, M. Green, S. Landau, P.G. Neumann, R. L. Rivest, J. I. Schiller, B. Schneier, M.l Specter, D. J. Weitzner, ‘Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications’, MIT Computer Science and Artificial Intelligence Laboratory Technical Report, 6/7/2016, p.2<https://dspace.mit.edu/handle/1721.1/97690>, accessed 12.11.2016.
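The ‘end-to-end’ property repeatedly invoked in this debate (content readable only by the end-devices) rests on the two endpoints agreeing a key that no intermediary, including the service provider, ever holds. Below is a toy sketch of the underlying Diffie-Hellman key agreement; the small prime and generator are illustrative stand-ins chosen for readability, orders of magnitude too small for real security, and no actual application’s parameters are implied.

```python
import hashlib
import secrets

# Illustrative public group parameters (deployed protocols use primes of
# 2048 bits or more; this prime is far too small for real use).
P = 4294967291          # assumed small prime standing in for the group modulus
G = 5                   # public generator

def keypair() -> tuple[int, int]:
    private = secrets.randbelow(P - 2) + 1   # never leaves the device
    public = pow(G, private, P)              # may transit the network openly
    return private, public

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each endpoint combines its own private value with the other's public one.
# An eavesdropper observes only P, G, alice_pub and bob_pub.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared            # identical key at both ends only

session_key = hashlib.sha256(str(alice_shared).encode()).digest()
```

The provider relays alice_pub and bob_pub but cannot derive session_key from them, which is why interception of data ‘in motion’ fails in Castro and McQuinn’s table: the wiretap sees ciphertext for which no key exists outside the two devices.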
  • 16. 16 which will be extensively discussed further on. The crypto debate is not new, and it would be a mistake to think that we are still in an initial phase of its regulatory approach; that approach was initiated a long time ago. What has developed since reflects the globalized fear of mass surveillance after the Snowden revelations, the rise of new ready-to-use cutting edge technologies, and the evolution of the hi-tech market. On this last point, an extremely interesting and exemplary dichotomy may again be drawn from Lawrence Lessig: the ‘West Coast Code’, the framework of rules arising from the architecture produced by hi-tech companies in Palo Alto, stands on the same regulatory level as the ‘East Coast Code’ (the laws issued in D.C.). A positive point is that the dynamism of the going dark debate is evidenced by the interactions between actors representing different interests, or constraints, if we bear in mind the regulatory taxonomy of this research. In early 2016, a group of experts from the Berkman Klein Center at Harvard published a paper with the provocative title ‘Don’t Panic. Making Progress on the ‘Going Dark’ Debate’. The findings of this report were foreseeable given the actors it represented: the message conveyed therein tries to answer “the question whether the going dark metaphor accurately describes the state of the affairs. (We think not)”44 . A remarkable follow-up to this research is the written exchange in which Mr. Cyrus Vance Jr., District Attorney of the County of New York, responded to some of the points highlighted by the Berkman group in their paper. Specifically, two points are of particular relevance to this study, both focusing on the roles of the various stakeholders within different features of the debate. First, contrary to what ‘Don’t Panic’ suggests, Cyrus Vance Jr. 
is of the view that, within the market, hi-tech companies do not frequently have ongoing relationships with their customers, since these multi-nationals have told society that customers’ data would be unavailable to law enforcement, even if asked for by ‘lawful means’45 . What Mr. Vance argues instead is that ‘either the companies are misleading the public, because the material is 44 J. DeLong, U. Gasser, N. Gertner, J. Goldsmith, S. Landau, A. Neuberger, J. Nye, D. R. O’Brien, M. G. Olsen, D. Renan, J. Sanchez, B. Schneier, L. Schwartztol, J. Zittrain, ‘Don’t Panic. Making Progress on the ‘Going Dark’ Debate’, Berklett Project – Berkman Centre Report, 01.02.2016, p.2 <https://cyber.harvard.edu/pubrelease/dontpanic/Dont_Panic_Making_Progress_on_Going_Dark_Debate.pdf>, accessed 5.9.2016. 45 Cyrus Vance Jr. (District Attorney of N.Y.C.) Letter to the Berkman Centre for Internet & Society, ‘RE: Don’t Panic. Making Progress on the ‘Going Dark’ Debate’, p.2.
  • 17. 17 accessible, or they are entirely correct and the material is inaccessible, notwithstanding the cloud, back-ups, and data centers’46 . Having elaborated on these positions, it is relevant to note the inherent behavior of the market and the players therein. What should be raised here is that hi-tech companies and their customers align in the demand and supply cycle. For a citizen who buys a hi-tech product, a privacy-friendly approach is now almost indispensable47 : it satisfies the need to feel protected and secure, pushing away the fear of being under surveillance. On the other hand, hi-tech companies are highly committed to satisfying this need, as long as the approach is not economically overwhelming, for the reasonable purpose of designing a tailor-made and attractive product. But the question is: how much do hi-tech companies actually believe in privacy-friendly tools, and how much is this policy a business strategy to align themselves with the needs of their customers? Against this background, while industry and citizens may have reached a compromise, namely around end-to-end encryption, the rationales behind it may be opposite. Notably, the consequence in cyberspace is that the law is hampered by this perverse market cycle. Another remarkable aspect of this jostle between the law (enforcement) and the scientific community is represented by the future scenario of the Internet of Things (IoT)48 . As Cyrus Vance argues, the IoT will not be ‘as all-encompassing as Don’t Panic suggests’; thus, for law enforcement, investigations will still face a challenging future, and the so-called state of near-constant surveillance is not likely to materialise in the short term. Again, it should be noted here that the two positions are possibly too distant from one another. 
Whilst acknowledging the increase of IoT products, we should nevertheless keep in mind that there is no intention, from a law enforcement perspective, to intercept, for example, every oven connected to the internet within the next five years. Rather, the need for eavesdropping is assessed on a case-by-case basis. To put it simply, using the previous example, one particular oven might be more relevant than a hundred. This suggests that the regulatory intentions 46 Cyrus Vance Jr. (District Attorney of N.Y.C.) Letter to the Berkman Centre for Internet & Society, ‘RE: Don’t Panic. Making Progress on the ‘Going Dark’ Debate’, p.2. 47 See Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications). 48 ‘The Internet of Things refers to the development that not only computers, but more and more objects (and the sensors installed in them, including microphones and cameras) become connected to the Internet’, [see J. Van Hoboken, W. Shulz et al., Human Rights and Encryption, UNESCO report, December 2016, <http://ec.europa.eu/research/participants/portal/desktop/en/opportunities/h2020/topics/ds-08-2017.html.> accessed 8.12.2016]
  • 18. 18 from a law enforcement point of view are to authorise access on a targeted basis, rather than to legislate generally, collectively and proactively. Having briefly commented on the exchange between these stakeholders, a last consideration is that an active debate is truly taking place. This may not be enough. As this thesis will try to argue, the current state of play may actually lack a truly comprehensive discussion that engages a multi-stakeholder commitment on the topic. What has been assessed so far suggests that the terminology used (going dark) may have an emotional impact on society and lead to misconceptions in this debate49 . Moreover, it should be noted that such a diversity of technologies, actors and side-factors may lead to different challenges related to encryption within the going dark trend. For these reasons, it is relevant to underline that a univocal solution may not be enough either50 . With this in mind, this argument will be further developed using the same approach that has been practiced so far, taking into consideration a multi-stakeholder view. We will be able to verify how the law is currently not at the level of the other constraints, leading to a partial debate that needs to be re-organized along a fair and equal line of discussion, in view of a regulatory (or legal) solution to this issue. 49 B. Schneier, ‘The Value of Encryption’, The Ripon Forum, April 2016, <https://www.schneier.com/essays/archives/2016/04/the_value_of_encrypt.html>, accessed 11.12.2016. 50 House Judiciary Committee & House Energy and Commerce Committee, Encryption Working Group Year-End Report, 20.12.2016
  • 19. 19 3. TECHNICAL APPROACHES 3.1 Introduction The following chapter is intended to give an in-depth overview of the two most discussed solutions for bypassing encryption. The first relates to the creation of hardware or software backdoors, which would enable law enforcement agencies to access encrypted data, often without informing the user. However, its impact on security standards is massive, as a backdoor may be exploited by malicious actors after its potential disclosure. In the second section, key escrow solutions will be assessed. An outline of this item will be beneficial for looking to the past, when the previous ‘crypto war’ was taking place, to see whether the progress of modern technologies has in some way altered the earlier dynamics. We will see that no substantial changes can be observed in this parallel, but rather that there was an evolution of pre-existing constraints born during the discussions around the so-called ‘clipper chip’. Both solutions are largely controversial from the perspective of many of the actors involved. In order to dive into these alternatives, it is appropriate to bear in mind what Castro and McQuinn pointed out in March 2016: “adding extraordinary access complicates encryption. Every line of new code is a place where a mistake can be made and produced51 ”. 3.2. Backdoors Probably the most renowned solution for overcoming encryption relates to the word backdoor. Generally speaking, a backdoor consists of the deployment of methods or techniques inside computer programs in order to circumvent, break or bypass security mechanisms in a given piece of software or digital storage medium. This practice is often not communicated to the user and is meant to allow the maintenance or update of such a system. By default, it creates a weakness within the software or the device in which such a backdoor is implemented which, once disclosed, enables potential malicious actors to deliberately access it. 
Narrowing our focus on encryption52 , the term typically refers to the act of intentionally weakening, in advance, the encryption algorithms used by citizens and businesses. It means that such an intervention 51 D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF - Information Technology and Innovation Foundation, March 2016, 1-50, p.19 <http://www2.itif.org/2016-unlocking-encryption.pdf>, accessed 15.09.2016. 52 For a detailed understanding, see also T. Wu, J. Chung, J. Yamat, J. Richman, ‘Encryption Backdoors, The Ethics (or not) of Massive Government Surveillance’, CS201 Final Project, Stanford Edu, <https://cs.stanford.edu/people/eroberts/cs201/projects/ethics-of-surveillance/index3.html> accessed 30.11.2016.
  • 20. 20 would be done by default in the design process, thus allowing the third party to have a sort of ‘window’ into users’ devices, which may be opened each time there is a need or a purpose. Of course, this solution would be beneficial for combating serious crime and terrorism, as it would allow law enforcement authorities to break into a subject’s or suspect’s data stored on the respective medium. As explained, a by-product of this solution is that it theoretically also allows any potential wrongdoer to access people’s data, with massive consequences for everyone’s privacy and security. This solution was discarded a number of times in the past for various reasons, firstly because of the impact that the measure would have on all individuals. Indeed, it is technically unfeasible to create a backdoor in people’s storage media, or in an encryption scheme, that would solely enable intelligence services or law enforcement agencies to make use of it53 . Terrorist groups, ultimately the target of this technical solution against encryption, would consequently misuse it for their illicit purposes, posing a serious threat to personal, business and government data; decreasing the security of our devices by means of backdoors would endanger both our personal data and our fundamental rights54 . Amongst legislators and policy makers, one of the biggest opponents of backdoors in the European arena is Andrus Ansip, Vice-President for the Digital Single Market in the European Commission and former Prime Minister of Estonia (currently one of the most digitally-oriented countries in the world). Inter alia, his position on backdoors is relevant to better understand the fundamental role of encryption as a vehicle of trust for businesses and society. What Mr. Ansip told Politico in March 2016 is topical: The creation of backdoors to access encrypted services and information will be a boon for terrorism and threatens democracy. 
(...)We will be much more affected by terrorism if we create those backdoors if good guys will get possibilities to test your pin code many times as it will be needed, then bad guys will do it definitely55 . 53 J.L.Hall, ‘Issue Brief: A ‘Backdoor’ to Encryption for Government Surveillance’, Centre for Democracy & Technology (CDT)- Security &Surveillance, 3.3.2016, <https://cdt.org/insight/issue-brief-a-backdoor-to- encryption-for-government-surveillance/>, accessed 1.12.2016. 54 D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF- Information Technology and Innovation Foundation, March 2016, 1-50, p.18/19< http://www2.itif.org/2016- unlocking-encryption.pdf>, accessed 15.09.2016. 55 C.Spillane, ‘Ansip: Encryption Backdoors Will Aid Terrorism, Harm Democracy’, Politico, 15.3.2016, <http://www.politico.eu/pro/ansip-encryption-backdoors-will-aid-terrorism-harm-democracy/>, accessed 16.11.2016.
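Ansip’s ‘good guys / bad guys’ point can be illustrated with a deliberately simplistic sketch: if a cipher’s effective key space is weakened in advance (here, a hypothetical 16-bit key, an assumption made purely for illustration), nothing restricts the resulting exhaustive search to the authority the weakening was meant to serve; any third party can run the same search.

```python
import itertools

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # A repeating-key XOR cipher; its only strength is its key length.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Hypothetical "backdoored" deployment: the key space is capped at 16 bits
# so that an authority holding no key can still recover plaintexts.
weak_key = bytes([0x3A, 0x7C])
ciphertext = xor_cipher(weak_key, b"meet at the usual place")

# Any eavesdropper can exhaust all 65,536 candidate keys; a short crib
# ("meet") is enough to recognise the right one.
recovered = None
for candidate in itertools.product(range(256), repeat=2):
    guess = xor_cipher(bytes(candidate), ciphertext)
    if guess.startswith(b"meet"):
        recovered = guess
        break

assert recovered == b"meet at the usual place"
```

The search is no harder for the ‘bad guys’ than for a lawful authority, which is the technical core of the objection that a backdoor weakens everyone’s security at once.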
  • 21. 21 These words confirm the considerations made on backdoors in this Thesis, and outline the non-feasibility of the solution, not only technically speaking, but also (and mostly, for the purpose of this research) in a democratic context. As for the view of the European Union, and in order to understand the two dimensions of the same solution, the joint Europol/ENISA statement of May 2016 is paradigmatic in differentiating levels of approach to the policy in question: ‘intercepting an encrypted communication or breaking into a digital service might be considered as proportional with respect to an individual suspect, but breaking the cryptographic mechanisms might cause collateral damage. The focus should be on getting access to the communication or information; not on breaking the protection mechanism’56 . However, there are two new arguments worth mentioning in this Thesis, either because of the notable actors that proposed them or due to the peculiarity of the theory itself. 3.2.1. A semantic issue Both of the following sections tackle the backdoor solution from the point of view of its definition. Indeed, policy discrepancies sometimes arise from an unclear and non-harmonized set of definitions, leading to divergent understandings of the same subject matter. The first semantic consideration brings technical and legal consequences; the second is a recent landmark position amongst academic opinions on backdoors. Both arguments are brought to our attention with the aim of demonstrating how actors are currently building new ‘walls’ to protect their views on this issue, and this section will conclude with some interesting statements made by prominent governmental institutions in the recent past. The common ground between these two arguments is that both refer to the problem of providing clear and specific definitions to the word ‘backdoor’. 
One of the main misconceptions around the use of backdoors to access stored data actually arises from the (mis)use of such terminology. Recent studies57 revealed how this wording may refer to a number of technical implementations, often different from one another,
56 Europol/ENISA joint statement, On Lawful Criminal Investigation That Respects 21st Century Data Protection, statement at the margin of the International Conference ‘Privacy in the Digital Age of Encryption & Anonymity Online’, The Hague, 20.05.2016, <https://www.enisa.europa.eu/publications/enisa-position-papers-and-opinions/on-lawful-criminal-investigation-that-respects-21st-century-data-protection>, accessed 30.11.2016.
57 ‘While backdoors have become a significant concern in today’s computing infrastructure, the lack of a clear definition of a backdoor has led the media (...) to misuse or abuse the word’. J. Zdziarski, ‘Backdoors: A Technical Definition’, Zdziarski's Blog of Things, 2.5.2016, <https://www.Zdziarski.com/blog/wp-content/uploads/2016/05/Backdoors-A-Technical-Definition.pdf>, accessed 16.11.2016.
which in turn leads to misinterpretation58 by non-IT experts. From a policy point of view, this process is highly dangerous, as it may lead to wrong conclusions and erroneous directions. The following paragraph underlines this definitional issue, which calls, once again, for a common understanding among the actors. The argument was brought forward by Jonathan Zdziarski, a leading expert in iOS digital forensics and a pioneer of modern digital investigative techniques. He argues59 that a clear definition of the term backdoor has never reached wide consensus in the IT forensic community; for instance, in one of his most recent papers Zdziarski assesses backdoors on the basis of a so-called ‘three-prong test’, in which such a method is evaluated against three different criteria: intent, consent and access. Leaving aside the more technical details of his view, it is relevant to underline that this semantic issue may have consequences from both a societal and a regulatory perspective. Furthermore, what Zdziarski fairly argues is that the misinterpretation of this concept may arise at stages where important and concrete decisions on individuals are made60: ‘Pending legal cases already exist within the court system in which a technical definition of backdoor would be beneficial. Without a clear definition, these proceedings pose the risk of misinformation in criminal prosecution, search warrants, warrants of assistance, secret court proceedings, and proposed legislation - all by preventing a duly appointed legal body from adequately understanding the concept’61. The impact of this seemingly small detail may thus be enormous, and it exemplifies why regulators and architects must agree on common, standard and accepted grounds and terminologies.
Summing up Zdziarski’s reasoning, arriving at an accepted and fair definition of backdoors would actually help to distinguish them from other taxonomies of malware and from legitimate security mechanisms.
58 An example of the potential misconception argued by Zdziarski can be found in this article: L. Ryge, ‘Most Software Already Has a ‘Golden Key’ Backdoor: the System Update’, ArsTechnica, 27.2.2016, <http://arstechnica.com/security/2016/02/most-software-already-has-a-golden-key-backdoor-its-called-auto-update/>, accessed 30.11.2016.
59 J. Zdziarski, ‘Backdoors: A Technical Definition’, Zdziarski's Blog of Things, 2.5.2016, <https://www.Zdziarski.com/blog/wp-content/uploads/2016/05/Backdoors-A-Technical-Definition.pdf>, accessed 16.11.2016.
60 J. Zdziarski, ‘Identifying Backdoors, Attack Points, and Surveillance Mechanisms in iOS Devices’, International Journal of Digital Forensics and Incident Response, Elsevier, 26.01.2014, <https://www.Zdziarski.com/blog/wp-content/uploads/2014/08/Zdziarski-iOS-DI-2014.pdf>, accessed 16.11.2016.
61 J. Zdziarski, ‘Backdoors: A Technical Definition’, Zdziarski's Blog of Things, 2.5.2016, <https://www.Zdziarski.com/blog/wp-content/uploads/2016/05/Backdoors-A-Technical-Definition.pdf>, accessed 16.11.2016.
3.2.2. Backdoors vs front doors

While the debate is now leading to the questions above, it is nonetheless worth discussing in slightly more detail another contribution that has already been mentioned in this Thesis. ‘Keys Under Doormats’ reports substantial and clear arguments against backdoors, bringing our attention to recent statements made by the FBI director, Mr. James Comey: ‘it makes more sense to address any security risks by developing intercept solutions during the design phase, rather than resorting to a patchwork solution when law enforcement comes knocking after the fact’62. What can fairly be noted here is the proactive approach Mr. Comey would pursue: weakening encryption by placing such measures in the design process. From a law enforcement perspective, tackling a crime, especially serious and organised crime, can lead to better results when done proactively. However, it should be argued that this approach would result in an unfair imbalance in favour of law enforcement, and that its cost-benefit analysis does not take into account the impact of such vulnerabilities on IT ecosystems. For these reasons, Mr. Comey added a point that specifies the intended approach behind his previous statement: ‘we aren't seeking a backdoor approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law. We are completely comfortable with court orders and legal process - front doors that provide the evidence and information we need to investigate crime and prevent terrorist attacks’63. This solution sounds fascinating and fits perfectly within the limitations posed against the use of backdoors. A ‘front door’ that grants access to people’s data to law enforcement only would be the perfect compromise, as it would also be lawfully authorised by a court order.
Leaving aside citizens’ lack of trust, which this solution may not resolve, the other issue here relates to the technical constraints briefly exposed in the previous pages. What exactly did the FBI director mean by the word ‘front door’? To solve this dilemma, it is worth quoting Vagle and Blaze: “Director Comey was trying to describe (t)he
62 James B. Comey, ‘Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?’, Speaking Note at Brookings Institution, Washington D.C., 16.10.2014, <https://www.fbi.gov/news/speeches/going-dark-are-technology-privacy-and-public-safety-on-a-collision-course>, accessed 10.10.2016.
63 James B. Comey, ‘Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?’, Speaking Note at Brookings Institution, Washington D.C., 16.10.2014, <https://www.fbi.gov/news/speeches/going-dark-are-technology-privacy-and-public-safety-on-a-collision-course>, accessed 10.10.2016.
concept of “security through obscurity”, a widely disavowed practice where the security of the system relies on the secrecy of its design”64. The two experts add some background to this overview: “once that secret is discovered—and it’s always discovered, sooner or later—the system’s security is lost forever. One needs look no further than the many failed attempts at secure digital rights management (DRM) systems to see this in action”65. Indeed, for many IT legal experts66, the term DRM has unpleasant connotations, reminding them of the repeated, failed efforts made in the past to implement this technology. Concluding again with Vagle and Blaze, the wording ‘front door’ draws a false distinction between these terms, with highly confusing results67. Arguments against backdoors were already raised back in the 90s and prevented governments from using such methods. An example is the attempt to implement the ‘Clipper Chip’ during the crypto wars of the 1990s, our starting point for the following section.

3.3 Key-escrow solutions

The concept behind this solution foresees individuals and private parties being forced to hand over their secret encryption keys to a pre-defined public institution. It is a system in which the assigned agency (a Trusted Third Party, TTP) plays a crucial role and is put in a delicate position; this is why one of the main principles claimed in the theorisation of this solution is the independence of the governmental authority in charge of retaining such keys. Similarly to discussions on the features that data protection authorities should have by definition, one of the primary concerns fairly raised here relates to the correct implementation of full, impartial and complete statutory independence and autonomy.
64 J. Vagle and M. Blaze, ‘Security ‘Front Doors’ vs.
‘Backdoors’: A Distinction Without a Difference’, Just Security, 17.10.2014, <https://www.justsecurity.org/16503/security-front-doors-vs-back-doors-distinction-difference>, accessed 21.10.2016.
65 J. Vagle and M. Blaze, ‘Security ‘Front Doors’ vs. ‘Backdoors’: A Distinction Without a Difference’, Just Security, 17.10.2014, <https://www.justsecurity.org/16503/security-front-doors-vs-back-doors-distinction-difference>, accessed 21.10.2016.
66 J. Walker, ‘The Digital Imprimatur: How Big Brother and Big Media Can Put The Internet Genie Back in the Bottle’, 13.09.2003, <http://www.fourmilab.ch/documents/digital-imprimatur/>, accessed 10.01.2017.
67 J. Vagle and M. Blaze, ‘Security ‘Front Doors’ vs. ‘Backdoors’: A Distinction Without a Difference’, Just Security, 17.10.2014, <https://www.justsecurity.org/16503/security-front-doors-vs-back-doors-distinction-difference>, accessed 21.10.2016.
Furthermore, from an information security perspective, additional access points imply, by default, a complication of encryption schemes: “every line of new code is a place where a mistake can be made and a flaw produced”68. However, a better understanding of the current state of play around key escrow inevitably means looking back to the past, to what many experts have named the ‘crypto war’. A first ‘war’ did take place, and we will obtain a clearer overview of the evolution of this debate by focusing on what happened in the United States over the last twenty-five years, as well as by drawing on some notable documents that testify to the parallels between that time and now. Not surprisingly, we will be able to observe how, even in this case, history tends to repeat itself over the decades. Through the analysis of key escrow solutions, this section tries to understand which dynamics of the crypto war have changed and which have not, with a particular focus on the (non-)collaboration between the parties involved, the legal constraints of these methods, their technical limitations and their consequences on a social scale. The legislative groundwork for the Clipper Chip69 was laid by the Computer Security Act of 198770; the 1991 proposal of the Digital Signature Standard then signalled the government's direction, and the White House finally announced the chip in 199371. The microchip was intended to be inserted into users’ hardware devices, allowing law enforcement to access unencrypted communications through that same chip. The key escrow system would therefore have allowed the U.S. Government to keep all the decryption keys stored under its control. Multiple concerns, criticisms and debates were raised over the initiative. First and foremost, privacy and civil liberties advocates fairly pointed to the highly intrusive nature of this technology.
Other interest groups also reacted, pointing to the particular vulnerability (from a security perspective) of the system and to the huge economic impact it would have had if fully implemented in the American environment.
68 D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF - Information Technology and Innovation Foundation, March 2016, p.13, <http://www2.itif.org/2016-unlocking-encryption.pdf>, accessed 15.09.2016.
69 As described by M. Blaze: ‘The Escrowed Encryption Standard (EES) defines a US Government family of cryptographic processors, popularly known as ‘Clipper’ chips, intended to protect unclassified government and private-sector communications and data. A basic feature of key setup between pairs of EES processors involves the exchange of a ‘Law Enforcement Access Field’ (LEAF) that contains an encrypted copy of the current session key. The LEAF is intended to facilitate government access to the clear text of data encrypted under the system’. M. Blaze, ‘Protocol Failure in the Escrowed Encryption Standard’, Proceedings of the 2nd ACM Conference on Computer and Communications Security: 59–67, 20.08.1994, <http://www.crypto.com/papers/eesproto.pdf>, accessed 29.11.2016.
70 Computer Security Act of 1987, Public Law No. 100-235 (H.R. 145), (Jan. 8, 1988)
71 The White House, Office of the Press Secretary, Statement by the Press Secretary - Factsheet on the Clipper Chip, 16.4.1993
These information security weaknesses were proven by several notable players, first of all by Matt Blaze in 1994. His research72 showed how the protective 16-bit hash installed in the system was too short to provide full security within such an articulated Escrowed Encryption Standard (EES) system. Furthermore, a paper written in 1997 and titled ‘The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption’73 gave a final negative opinion on the Clipper Chip, de facto closing the heated debate around it, at least temporarily. The findings of this study revealed that key escrow solutions were inherently costly for end users as well as particularly dangerous for law enforcement agencies. The first critical point is the high risk of exposure of an organisation retaining escrow keys: the very need for this feature within the authority’s DNA brings with it, by default, a growth of security risks. Internally, authorised personnel may turn into ‘malicious insiders’74 who misuse their functions for illicit benefit; externally, a cyber-attack could at any time hamper the correct functioning of the agency’s core business. This point led to a reflection on the difficulties of putting an oversight mechanism in place, bearing in mind the absolute independence that such an organisation should have. From a legal perspective, it was indeed difficult to imagine supervision over an entity that by definition should stay outside governmental dynamics; conversely, an internal solution would have undermined citizens’ trust in the democratic state, and this is exactly what happened during the mid-90s. As a matter of fact, the crypto war showed this principle to be unachievable in concrete terms. Secondly, it was a matter of overall complexity. The scale of such a global key escrow system made it impossible to realise, given the overwhelming implementation effort it inherently required, and this is another reason why the project was abandoned.
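To make the weakness identified by Blaze tangible: a 16-bit check value admits only 2^16 = 65,536 possibilities, so an attacker can forge a field that passes the check by simple exhaustive search. The sketch below is a hypothetical, simplified model (it does not reproduce the actual EES/LEAF format) illustrating why such a short checksum offers little protection:

```python
# Toy illustration of brute-forcing a 16-bit check value.
# Simplified model only -- the real EES/LEAF protocol is not reproduced here.
import hashlib

def check16(field: bytes) -> int:
    """A stand-in 16-bit checksum over an access field."""
    return int.from_bytes(hashlib.sha256(field).digest()[:2], "big")

# The verifying chip would accept any field whose checksum matches.
target = check16(b"legitimate access field")

# With only 65,536 possible check values, a matching bogus field turns up
# after roughly 2**16 attempts on average -- trivial even on 1990s hardware.
forged = next(
    n for n in range(2**20)
    if check16(b"bogus field %d" % n) == target
)
assert check16(b"bogus field %d" % forged) == target
print(f"forged field passes the 16-bit check (candidate {forged})")
```

Blaze's published attack exploited precisely this kind of small search space: a rogue implementation could generate candidate fields until one passed the chip's verification, defeating the escrow feature.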
Indeed, ‘key recovery as envisioned by law enforcement will require the deployment of secure infrastructures involving thousands of companies, recovery agents, regulatory bodies, and law enforcement agencies worldwide
72 M. Blaze, ‘Protocol Failure in the Escrowed Encryption Standard’, Proceedings of the 2nd ACM Conference on Computer and Communications Security: 59–67, 20.08.1994, <http://www.crypto.com/papers/eesproto.pdf>, accessed 29.11.2016.
73 H. Abelson, R. N. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, P. G. Neumann, R. L. Rivest, J. I. Schiller, and others, ‘The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption’, 1997, <http://academiccommons.columbia.edu/catalog/ac:127127>, accessed 26.9.2016.
74 H. Abelson, R. N. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, P. G. Neumann, R. L. Rivest, J. I. Schiller, and others, ‘The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption’, 1997, <http://academiccommons.columbia.edu/catalog/ac:127127>, accessed 26.9.2016.
interacting and cooperating on an unprecedented scale’75. The paper advanced hypotheses and calculations, counting, inter alia, thousands of products and agents, millions of users and public-private key pairs, and tens of thousands of law enforcement agencies around the world, all required to work to the same levels and standards of security. These numbers would only have grown with the development of new communication technologies. The cost-benefit reasons presented in the paper, and widely analysed by many actors, led policy makers to provisionally discard the project. It is now time to come back to the present: more or less surprisingly, the same authors gathered again almost twenty years later to see what had changed in light of the new cases that recently initiated a new crypto war. The result of this exercise is the paper previously mentioned, ‘Keys Under Doormats’. Starting with the comparison between the two ‘wars’, the latter paper states that ‘today, the fundamental technical importance of strong cryptography and the difficulties inherent in limiting its use to meet law enforcement purposes remain the same’76. This wording confirms the thesis that not much has changed in favour of lawful access by the law enforcement community. Indeed, ‘what has changed is that the scale and scope of systems dependent on strong encryption are far greater, and our society is far more reliant on far-flung digital networks that are under daily attack’77. Along the same lines, another group of researchers theorised similar developments of the crypto war in relation to the Clipper Chip debate in the paper ‘Doomed to Repeat History?’ (June 2015)78. Notably, the two articles reach similar findings. What these groups observed is that we are witnessing not a change, but rather an increase or growth of certain crucial features, specifically:
75 H. Abelson, R. N. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, P. G.
Neumann, R. L. Rivest, J. I. Schiller, and others, ‘The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption’, 1997, <http://academiccommons.columbia.edu/catalog/ac:127127>, accessed 26.9.2016.
76 H. Abelson, R. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, M. Green, S. Landau, P. G. Neumann, R. L. Rivest, J. I. Schiller, B. Schneier, M. Specter, D. J. Weitzner, ‘Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications’, MIT Computer Science and Artificial Intelligence Laboratory Technical Report, 6/7/2016, p.2, <https://dspace.mit.edu/handle/1721.1/97690>, accessed 12.11.2016.
77 H. Abelson, R. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, M. Green, S. Landau, P. G. Neumann, R. L. Rivest, J. I. Schiller, B. Schneier, M. Specter, D. J. Weitzner, ‘Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications’, MIT Computer Science and Artificial Intelligence Laboratory Technical Report, 6/7/2016, p.13, <https://dspace.mit.edu/handle/1721.1/97690>, accessed 12.11.2016.
78 D. Kehl, A. Wilson, K. Bankston, ‘Doomed to Repeat History? Lessons from the Crypto Wars of the 1990s’, New America, Open Technology Institute - Cybersecurity Initiative, p.18, <https://static.newamerica.org/attachments/3407-doomed-to-repeat-history-lessons-from-the-crypto-wars-of-the-1990s/OTI_Crypto_Wars_History.abe6caa19cbc40de842e01c28a028418.pdf>, accessed 11.12.2016.
 Standardisation: U.S. Government cryptographic standards are now widespread both within the country and abroad;
 Security: the security challenge has shifted from the pure mathematics of cryptosystems to a higher level, namely the design and implementation of complex software systems;
 Cyber defence: attacks against governmental cyber infrastructures are now massively increasing, and there is a long history of exponential growth in the cyber threats to which public institutions and similar bodies are exposed;
 Social and economic reliance on the internet: it is clear and beyond doubt that current times are driven by internet-related mechanisms and digital infrastructures. For this reason, the necessity of making the online environment secure at any cost has become of the utmost importance, much more so than during the crypto war. Indeed, since that time, ‘strong encryption has become a bedrock technology that protects the security of the internet, as well as the promotion of free expression online. Moreover, (...) information economy has grown exponentially’79.
In one of the closed sessions of the 38th Annual Conference of Data Protection and Privacy Commissioners, held in Marrakesh in October 2016, Christopher Kuner, a prominent practising lawyer and academic, gave a speech on encryption from a historical perspective. Some excerpts are worth mentioning, due both to the notoriety of the author and to the interesting analysis behind them. Two points of his summary in fact require a closer look, as they highlight the action-reaction dynamics between two different groups of actors involved in the previous (and in the current) crypto war. Firstly, he argues that one of the biggest constraints was that many non-U.S. governments did not support the project80.
It may sound obvious, but if we approach this consideration from a broader perspective, taking into account the development of international relations, what we should acknowledge is that nothing has changed in this respect. Rather, and similarly to the four points mentioned above, the assumption that data equals power has led to the
79 D. Kehl, A. Wilson, K. Bankston, ‘Doomed to Repeat History? Lessons from the Crypto Wars of the 1990s’, New America, Open Technology Institute - Cybersecurity Initiative, p.18, <https://static.newamerica.org/attachments/3407-doomed-to-repeat-history-lessons-from-the-crypto-wars-of-the-1990s/OTI_Crypto_Wars_History.abe6caa19cbc40de842e01c28a028418.pdf>, accessed 11.12.2016.
80 C. Kuner, Summary remarks / closed session ‘Encryption and the Rule of Law: an International Conversation’, 38th Annual Conference of Data Protection and Privacy Commissioners, Marrakech (Morocco), 17 October 2016, <https://icdppc.org/wp-content/uploads/2015/03/Dr-Christopher-Kuner.pdf>, p.1.
distancing of governments when it comes to the cyber world81: “cryptography rearranges power: it configures who can do what, from what”82, to quote Rogaway. To this end, cyberspace is nothing other than an unexplored battlefield where governments jealously claim their sovereignty, and examples of unsuccessful legislative attempts83 support the theory that national entities are not yet ready to sacrifice a piece of their competence for a common intent. To make this more concrete, and coming back to legal grounds, the problem of cross-border jurisdiction84 in the context of investigations in cyberspace can jeopardise a single common intent, at least with regard to the achievement of a fast, harmonised and readily enforceable set of transnational rules, as Svantesson and Van Zwieten confirm85. Another relevant point of discussion by Kuner concerns the presence and interaction of private industry86. Remarkably, another constraint that resulted in the rejection of the project was the claim of crypto industries and major hi-tech firms that they would relocate their production (and their headquarters) outside the United States. Bearing that in mind, it is not hard to believe that any government would reflect carefully before introducing rules that lead industries to leave the country, especially Silicon Valley firms at the beginning of their boom phase. Undoubtedly, such a policy would hamper innovation and keep investments outside the national borders. Additionally, it must be recognised that certain countries, mostly economic or political competitors of Europe and the U.S., are either drafting or implementing legislation that would force hi-tech industries to actively collaborate with governmental intelligence agencies for the purpose of granting them access to customers’ data87.
From the European point of view, the position of ENISA confirms this and widens the scope of the risks that such practices carry:
81 On shifting powers in the Internet society, see for instance B.-J. Koops, ‘Law, Technology, and Shifting Power Relations’, Berkeley Technology Law Journal, Art. 5 [vol. 25:973], March 2010, p.973-1034, <http://scholarship.law.berkeley.edu/cgi/viewcontent.cgi?article=1848&context=btlj>, accessed 30.11.2016.
82 P. Rogaway, ‘The Moral Character of Cryptographic Work’, University of California, December 2015, <http://web.cs.ucdavis.edu/~rogaway/papers/moral-fn.pdf>, accessed 8.12.2016.
83 ECJ, Case C-293/12 Digital Rights Ireland v Data Protection Commissioner
84 See also J. Ellermann, ‘Terror Won’t Kill the Privacy Star – Tackling Terrorism Propaganda Online in a Data Protection Compliant Manner’, ERA Forum DOI 10.1007/s12027-016-0446-z, 17.11.2016, <http://www.readcube.com/articles/10.1007/>, accessed 10.12.2016.
85 J. B. Svantesson and L. Van Zwieten, ‘Law Enforcement Access to Evidence Via Direct Contact with Cloud Providers - Identifying the Contours of a Solution’, Computer Law & Security Review 32, Science Direct, Elsevier Ltd, p.671-682.
86 C. Kuner, Summary remarks / closed session ‘Encryption and the Rule of Law: an International Conversation’, 38th Annual Conference of Data Protection and Privacy Commissioners, Marrakech (Morocco), 17 October 2016, <https://icdppc.org/wp-content/uploads/2015/03/Dr-Christopher-Kuner.pdf>, p.2-3.
87 See for instance J. A. Lewis, ‘Posturing and Politics for Encryption’, Centre for Strategic and International Studies, 17.2.2016, <https://www.csis.org/analysis/posturing-and-politics-encryption>, accessed 20.7.2016.
‘the perception that backdoors and key escrow exist can potentially affect and undermine the aspirations for a full embraced Digital Society in Europe’88. While the state of play saw tech industries leveraging this argument in support of strong crypto advocacy, it now seems at least improbable that big firms would relocate their structures out of the U.S.: on one hand, many countries where innovation seems fertile and well funded have intrusive surveillance programmes in place; on the other, the European option may not be feasible due to the privacy implications of the most recent case law89 involving big hi-tech firms, leading to a less flexible scenario than the American one. From the perspective of hi-tech industries, the U.S.-based solution still seems the best compromise between trust and surveillance. Interestingly (and differently from the arguments cited in this chapter), this point does not support the idea enshrined in the previous pages (and in the quoted documents) that no changes have occurred in the past twenty years beyond the intensification of certain leitmotivs: what manufacturers threatened to do during the 90s seems quite impractical to achieve in modern times (at least under the same conditions), although we should acknowledge that the battles are being fought on the same fields as twenty-plus years ago, and European players recognise ‘the experience in the US that limiting the strength of encryption tools inhibited innovation and left the competitive advantage in this area with other jurisdictions’90. To conclude, it is acknowledged that key escrow would actually enable governments to unlock encrypted data by forcing enterprises, entrusted third parties or public institutions themselves to store an extra key for all such data.
However, requiring such methods would mean exposing businesses and citizens to overwhelming security risks, and it would exclude the deployment of certain security features such as perfect forward secrecy91, considered by many a best practice92.
88 ENISA, Opinion Paper on Encryption, December 2016, p.5, <https://www.enisa.europa.eu/publications/enisa-position-papers-and-opinions/enisa2019s-opinion-paper-on-encryption>, accessed 13.12.2016.
89 R. K. De Atley and G. Wesson, ‘SAN BERNARDINO SHOOTING: Bail denied; Marquez to remain in custody on terror-related charges (UPDATE 3)’, 21.12.2015, <http://www.pe.com/articles/court-789907-marquez-hearing.html>, accessed 28.12.2016.
90 ENISA, Opinion Paper on Encryption, December 2016, p.5, <https://www.enisa.europa.eu/publications/enisa-position-papers-and-opinions/enisa2019s-opinion-paper-on-encryption>, accessed 13.12.2016.
91 D. Castro and A. McQuinn, ‘Unlocking Encryption: Information Security and the Rule of Law’, ITIF - Information Technology and Innovation Foundation, March 2016, p.13, <http://www2.itif.org/2016-unlocking-encryption.pdf>, accessed 15.09.2016. According to Techopedia, Perfect Forward Secrecy (PFS) is ‘a data encoding property that ensures the integrity of a session key in the event that a long-term key is compromised. PFS accomplishes this by enforcing the derivation of a new key for each and every session’.
92 J. Van Hoboken, W. Schulz et al., Human Rights and Encryption, UNESCO report, December 2016, p.16, <http://ec.europa.eu/research/participants/portal/desktop/en/opportunities/h2020/topics/ds-08-2017.html>, accessed 8.12.2016.
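Since perfect forward secrecy is the security feature that key escrow would preclude, a minimal sketch may help to show what is at stake. The toy exchange below is illustrative only (real systems use vetted constructions such as X25519, not a 127-bit prime): each session derives a fresh key from ephemeral Diffie-Hellman secrets that are discarded afterwards, so no long-term, escrowed key could ever re-derive past session keys.

```python
# Minimal illustration of per-session (ephemeral) key agreement, the mechanism
# behind perfect forward secrecy. Toy parameters for illustration only.
import hashlib
import secrets

P = 2**127 - 1   # a (Mersenne) prime; far too small for real-world security
G = 3

def run_session() -> str:
    """One session: both sides pick fresh ephemeral secrets, agree on a shared
    value, derive a symmetric key, then discard the secrets."""
    a = secrets.randbelow(P - 2) + 1      # Alice's ephemeral exponent
    b = secrets.randbelow(P - 2) + 1      # Bob's ephemeral exponent
    A, B = pow(G, a, P), pow(G, b, P)     # public values exchanged on the wire
    shared = pow(B, a, P)
    assert shared == pow(A, b, P)         # both sides compute the same value
    # The session key depends only on the ephemeral secrets; once a and b are
    # erased, no stored long-term (or escrowed) key can recompute it.
    return hashlib.sha256(shared.to_bytes(16, "big")).hexdigest()

k1, k2 = run_session(), run_session()
print(k1 != k2)   # an independent key for every session
```

Under a key escrow regime, by contrast, every session key (or the means to recover it) must be retained by the trusted third party, which is precisely the property the derivation above is designed to rule out.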
As we have seen, this section has shown how, despite the battles of more than two decades ago, actors are still unable to find a common solution for implementing key escrow systems to access encrypted data. Moreover, what we have observed through the assessment of certain documents is that some stakeholders hold the same positions they held during the Clipper Chip phase. What we have demonstrated is that almost nothing has changed; rather, certain constraints on the implementation of key escrow have grown, and these constraints now constitute even bigger limitations than in the past, keeping actors distant from each other with regard to this specific solution.
4. JUDICIAL AND INVESTIGATIVE APPROACHES

4.1 Introduction

The following pages will explore the state of play of alternative solutions for overcoming encryption, namely mandatory key disclosure warrants, social engineering and governmental hacking by means of remote live forensics. The first is mostly a legal approach, i.e. a solution that relies solely on legal provisions and their enforcement to overcome encryption. The intended methodology will mainly focus on the European scenario, assessing the state of the art of such legislation as well as correlated court cases. However, it should be kept in mind that, due to its relevance in the debate, particular attention will also be paid to the United States, as its legal regime offers interesting points of reflection, bearing in mind the cross-border nature of this issue. As many have argued93 over the last years, an approach that also takes overseas dynamics into due consideration is of the utmost relevance, since one of the newest and sometimes most problematic weaknesses of a cyber-investigation is the need for European law enforcement authorities to liaise with actors that are often based outside European borders. The discussion around disclosure orders offers a paradigmatic example of how the regulator can grant law enforcement authorities a series of powers and rights to force individuals to comply with such an order. As we will see, however, this approach can be difficult to apply in certain circumstances (due to professional secrecy obligations or to central principles such as the privilege against self-incrimination). Attention will then be paid to the newly released Investigatory Powers Act 2016 (successor to the RIPA regime), legislation that regulates surveillance in the United Kingdom.
Such a focus is intended to illustrate how provisions on cyber-investigations, and on encryption in particular, have yet to achieve a balanced approach or full consensus among their addressees. The facts and documents analysed support the conclusion that, owing to their intrusiveness and their potential to infringe human rights, mandatory warrants should be considered a last resort among the range of options available to law enforcement authorities for achieving an effective result.

93 Svantesson J. B. and Van Zwieten L., 'Law Enforcement Access to Evidence via Direct Contact with Cloud Providers - Identifying the Contours of a Solution', Computer Law & Security Review 32, Science Direct, Elsevier Ltd, 2015, pp. 671-682.
The second possibility refers to social engineering. Although a purely alternative approach, examples of effective social engineering will show how this practice still plays a crucial role in overcoming encrypted communications, especially in cases of serious crime and terrorism. The chapter concludes with an assessment of so-called live forensics94 techniques. Although this practice has repeatedly been criticised on account of a number of constraints, it is increasingly used in countries where it is allowed, or where no strict regulations on the matter exist.

4.2 Mandatory key disclosure warrants

Globally, including in Europe, some countries have in their criminal or procedural legislation a provision enabling law enforcement agencies to force individuals to reveal their private key for the purpose of investigating and prosecuting a crime95 (Belgium96 or France97, to name a few98). Under those provisions, and by means of a court order, judges can demand disclosure of the encrypted content99 of a subject's devices when deemed relevant to a specific case or investigation. Should the subject refuse, detention or fines are foreseen if the court deems the undisclosed data potentially relevant (compared with the overall forensic collection of evidence). This solution raises several points of reflection, mostly procedural in nature, since much criticism has been voiced over the conflict of such norms with essential principles of criminal and procedural law. Additionally, the practice can be relatively easy to circumvent technically100.

94 'Live forensics considers the value of the data that may be lost by powering down a system and collect it while the system is still running.
The other objective of live forensics is to minimize impacts to the integrity of data while collecting evidence from the suspect system', MacForensicsLab, Definition of Live Forensics, <http://www.macforensicslab.com/index.php?main_page=document_general_info&products_id=212>, accessed 22.12.2016.
95 S. Ranger, 'The Undercover War on Your Internet Secrets: How Online Surveillance Cracked Our Trust in the Web', 24.3.2015, TechRepublic, <http://www.techrepublic.com/article/the-undercover-war-on-your-internet-secrets-how-online-surveillance-cracked-our-trust-in-the-web/>, accessed 22.12.2016.
96 Loi du 28 novembre 2000 [2000-11-28/34] Relative à la Criminalité Informatique (Article 9).
97 Loi no 2001-1062 du 15 novembre 2001 Relative à la Sécurité Quotidienne (Article 30).
98 See the next section for a focus on the United Kingdom.
99 For the purpose of this Thesis, the following analysis will mainly focus on full-disk encryption.
100 As will be further explained later, by means of forward secrecy or deniability.
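The technical premise behind such warrants can be illustrated with a minimal sketch (all parameters and names below are hypothetical, not drawn from any specific product): in common full-disk-encryption schemes, the disk key is derived from the user's passphrase through a slow key-derivation function, so compelling disclosure of the passphrase is functionally equivalent to compelling disclosure of the key itself, and the key cannot feasibly be recovered without it.

```python
import hashlib
import os

# Illustrative sketch only: salt, passphrase and iteration count are
# hypothetical. Full-disk-encryption tools commonly derive the disk key
# from a user passphrase via a key-derivation function such as PBKDF2,
# so a disclosure order targeting the passphrase effectively targets
# the key.
salt = os.urandom(16)                      # stored on disk, not secret
passphrase = b"known-only-to-the-suspect"  # the object of the warrant

disk_key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)
print(len(disk_key))  # prints 32: a 256-bit key
```

Because the derivation is deliberately slow and the passphrase space is large, investigators who seize the device but not the passphrase face an infeasible brute-force problem, which is precisely what makes the legal compulsion route attractive to law enforcement and why the circumvention techniques noted above (deniability, forward secrecy) undermine it.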
The preliminary consideration concerns a series of requirements that an order should satisfy in terms of checks and balances, specifically its compliance with the rule of law. Firstly, such a warrant should be based on a publicly accessible law101 which clearly limits its scope. In Rotaru v Romania102, the European Court of Human Rights decided in favour of the applicant because domestic law did not specify which information, and pertaining to whom, could be processed for surveillance purposes under the banner of national security. By contrast, a key disclosure order should effectively define the terms under which such a warrant can be issued, avoiding a blanket approach to the matter103. Moreover, legislation that provides investigative bodies with this tool should be "(...) implemented under independent and impartial judicial authority, in particular to preserve the due process rights of targets, and only adopted when necessary and when less intrusive means of investigation are not available"104. Indeed, what Kaye recommends in summarising the findings of his report relates to the principles of proportionality and necessity. A disclosure warrant based on unlimited (and untargeted) powers granted to law enforcement authorities should not be lawfully pursuable; this test can be assessed through an evaluation of two familiar concepts: independence and impartiality. Such measures find a lawful ground only when in place for a limited and definite aim. On the other hand, mandatory key disclosure warrants present some clear implementation difficulties, which are briefly explained below. It should first be noted that these critical points relate to both legal and technical constraints. As for the former, one of the most debated arguments posing a serious limitation to this solution is the right against self-incrimination, commonly called the right to remain silent.
101 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 45.
102 ECtHR, Rotaru v. Romania [GC], No. 28341/95, 4 May 2000, para. 57.
103 See also ECtHR, Taylor-Sabori v the United Kingdom, No. 47114/99, 22 October 2002.
104 UN – Human Rights Council, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, A/HRC/29/32, 22.05.2015, para. 45.
Recalling some old-fashioned TV shows, 'pleading the Fifth'105 (Amendment106), i.e. exercising the right to remain silent, represents a crucial legal counter-argument against mandatory key disclosure orders. The right against self-incrimination, the principle under which no one may be compelled to incriminate himself for the commission of an offence by means of testimonial evidence, is one of the most debated topics in discussions of solutions to overcome encryption. Focusing on the European scenario, the principle is indirectly107 protected by Article 6 of the European Convention on Human Rights108 and serves as a vital defence for a suspect or a witness109. Specifically, it 'protects a defendant from the "cruel trilemma" of offering incriminating evidence against oneself and risking a criminal conviction; lying to government officials and risking perjury; or keeping silent and risking contempt of court'110. Unsurprisingly, a number of national jurisdictions within Europe follow these traditional criminal law principles, applying them through different procedural avenues: leading jurisdictions such as France111 or Germany112, for example, embed these approaches in their legislation.
Broadly speaking, this concept is not so distant from the principles underpinning the right to privacy: in fact, it leads to an enlargement of legislators' view on what should be
105 Constitution of the United States of America, Amendment V: 'No person shall be held to answer for a capital, or otherwise infamous crime, unless on a presentment or indictment of a Grand Jury, except in cases arising in the land or naval forces, or in the Militia, when in actual service in time of War or public danger; nor shall any person be subject for the same offence to be twice put in jeopardy of life or limb; nor shall be compelled in any criminal case to be a witness against himself, nor be deprived of life, liberty, or property, without due process of law; nor shall private property be taken for public use, without just compensation'.
106 One of the first cases in the United States to address a suspect giving up an encryption key is U.S. District Court for the District of Vermont, United States v. Boucher, 2009.
107 ECtHR, Murray v UK, [1996].
108 European Convention on Human Rights (ECHR), Article 6(1): In the determination of his civil rights and obligations or of any criminal charge against him, everyone is entitled to a fair and public hearing within a reasonable time by an independent and impartial tribunal established by law. Judgment shall be pronounced publicly but the press and public may be excluded from all or part of the trial in the interest of morals, public order or national security in a democratic society, where the interests of juveniles or the protection of the private life of the parties so require, or to the extent strictly necessary in the opinion of the court in special circumstances where publicity would prejudice the interests of justice. Everyone charged with a criminal offence shall be presumed innocent until proved guilty according to law.
Everyone charged with a criminal offence has the following minimum rights: (a) to be informed promptly, in a language which he understands and in detail, of the nature and cause of the accusation against him; (b) to have adequate time and facilities for the preparation of his defence; (c) to defend himself in person or through legal assistance of his own choosing or, if he has not sufficient means to pay for legal assistance, to be given it free when the interests of justice so require; (d) to examine or have examined witnesses against him and to obtain the attendance and examination of witnesses on his behalf under the same conditions as witnesses against him; (e) to have the free assistance of an interpreter if he cannot understand or speak the language used in court.
109 See, for instance, Blau v. United States, 340 U.S. 159, 71 S. Ct. 223, 95 L. Ed. 170 (1950).
110 R.M. Thompson II and C. Jaikaran, 'Encryption: Selected Legal Issues', Congressional Research Service, 3.3.2016, <https://fas.org/sgp/crs/misc/R44407.pdf>, accessed 5.12.2016.
111 Code de Procédure Pénale, Art. L116 [FR].
112 Strafprozessordnung, § 136 [DE].