NATO Science for Peace and Security Series

This Series presents the results of scientific meetings supported under the NATO Programme: Science for Peace and Security (SPS). The NATO SPS Programme supports meetings in the following Key Priority areas: (1) Defence Against Terrorism; (2) Countering other Threats to Security and (3) NATO, Partner and Mediterranean Dialogue Country Priorities. The types of meeting supported are generally "Advanced Study Institutes" and "Advanced Research Workshops". The NATO SPS Series collects together the results of these meetings. The meetings are co-organized by scientists from NATO countries and scientists from NATO's "Partner" or "Mediterranean Dialogue" countries. The observations and recommendations made at the meetings, as well as the contents of the volumes in the Series, reflect those of participants and contributors only; they should not necessarily be regarded as reflecting NATO views or policy.

Advanced Study Institutes (ASI) are high-level tutorial courses to convey the latest developments in a subject to an advanced-level audience.

Advanced Research Workshops (ARW) are expert meetings where an intense but informal exchange of views at the frontiers of a subject aims at identifying directions for future action.

Following a transformation of the programme in 2006 the Series has been re-named and re-organised. Recent volumes on topics not related to security, which result from meetings supported under the programme earlier, may be found in the NATO Science Series.

The Series is published by IOS Press, Amsterdam, and Springer Science and Business Media, Dordrecht, in conjunction with the NATO Public Diplomacy Division.

Sub-Series
A. Chemistry and Biology – Springer Science and Business Media
B. Physics and Biophysics – Springer Science and Business Media
C. Environmental Security – Springer Science and Business Media
D. Information and Communication Security – IOS Press
E. Human and Societal Dynamics – IOS Press

http://www.nato.int/science
http://www.springer.com
http://www.iospress.nl

Sub-Series E: Human and Societal Dynamics – Vol. 34
ISSN 1874-6276
Responses to Cyber Terrorism

Edited by
Centre of Excellence Defence Against Terrorism, Ankara, Turkey

Amsterdam • Berlin • Oxford • Tokyo • Washington, DC
Published in cooperation with NATO Public Diplomacy Division
Proceedings of the NATO Advanced Research Workshop on Responses to Cyber Terrorism
Ankara, Turkey
4–5 October 2007

© 2008 IOS Press. All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without prior written permission from the publisher.

ISBN 978-1-58603-836-6
Library of Congress Control Number: 2008920687

Publisher
IOS Press
Nieuwe Hemweg 6B
1013 BG Amsterdam
Netherlands
fax: +31 20 687 0019
e-mail: order@iospress.nl

Distributor in the UK and Ireland
Gazelle Books Services Ltd.
White Cross Mills
Hightown
Lancaster LA1 4XS
United Kingdom
fax: +44 1524 63232
e-mail: sales@gazellebooks.co.uk

Distributor in the USA and Canada
IOS Press, Inc.
4502 Rachael Manor Drive
Fairfax, VA 22032
USA
fax: +1 703 323 3668
e-mail: iosbooks@iospress.com

LEGAL NOTICE
The publisher is not responsible for the use which might be made of the following information.

PRINTED IN THE NETHERLANDS
Responses to Cyber Terrorism
Centre of Excellence Defence Against Terrorism, Ankara, Turkey (Ed.)
IOS Press, 2008
© 2008 IOS Press. All rights reserved.

Preface

On 4–5 October 2007 the Centre of Excellence – Defence Against Terrorism (COE–DAT) organized an Advanced Research Workshop (ARW) on the topic "Responses to Cyber Terrorism". The venue was the Merkez Ordu Evi (Central Officers' Club) in Ankara. This was one of the numerous workshops that COE–DAT has organized each year since the Centre opened in Ankara in 2005. It is the only Centre of Excellence dedicated to supporting NATO on defence issues related to terrorism. Turkey is the framework nation, although at present six other nations also contribute staff and funds. Through courses, workshops, and academic publications, the aim is to bring western academic rigour and Turkish experience and expertise in terrorism to NATO members, Partnership for Peace (PfP) and Mediterranean Dialogue countries, Non-Triple Nations, and others.

One issue touched on repeatedly by the participants at the "Responses to Cyber Terrorism" ARW was the difficulty of arriving at a definition of this kind of terrorism. A NATO Office of Security document cautiously defines it as "a cyber attack using or exploiting computer or communication networks to cause sufficient destruction or disruption to generate fear or to intimidate a society into an ideological goal." [1] But the cyber world seems remote from what we recognize as terrorism: the bloody attacks and ethnic conflicts, or, more precisely, the politically-motivated "intention to cause death or serious bodily harm to civilians or non-combatants with the purpose of intimidating a population or compelling a government …" (UN report, Freedom from Fear, 2005). It is hard to think of a single instance in which computer code has physically harmed anyone. Yet a number of our speakers, in particular Prof. Goodman and Lt.
Paul Everard, showed that we should be preparing for just such events, potentially on a huge scale. Here we are talking about attacks on critical infrastructure, in particular on SCADA (Supervisory Control and Data Acquisition) systems, which control physical processes in places like chemical factories, dams, and power stations.

Focus on Solutions

At the planning stage of the ARW it was agreed that the workshop would bring together people from a range of disciplines, from information technology researchers and lawyers to terrorism and security experts. The title "Responses to Cyber Terrorism" was chosen in order to put the onus on the discussion of practical solutions, and in some respects the meetings of the Working Groups were as important for achieving the goals of the ARW as the plenary sessions (see the last chapter on the "Account of the Working Group Discussions"). Accordingly, the speakers all gave time in their presentations to the issue of 'responding' to terrorism in cyberspace.

[1] Cited from Lt. Paul Everard's chapter on "NATO and Cyber Terrorism".
Overview of the Workshop Papers

In the introductory first chapter of the ARW (see the chapter on "The History of the Internet"), Clare Cridland notes that the Internet was originally developed in the U.S. for military purposes. With ARPANET, the Defense Advanced Research Projects Agency (DARPA) created a network for sending packets of information with no central hub, so that communications could be more resilient during a devastating war. The idea of security was, therefore, part of the original conception of the internet. However, an entirely different ethos took over after the US Department of Defense relinquished the project to the burgeoning computer and software companies in the 1990s. The architects of the worldwide network saw it, and wrote of it, in terms of the centuries-old struggle for freedom of thought and expression. Clare Cridland's description of the internet also evokes this theme:

"New media in the early 21st century is a participatory, user-driven information environment, far from the linear platform of the mass media that delivered information through a 'gatekeeper' to a passive mass audience. These outlets … were capital intensive and … somewhat privileged. In contrast, new media, driven by technological change in telecommunications, has undermined this sphere of knowledge ownership … However, we've been here before. 'Counter-culture' always used 'grassroots media' (folk songs, posters, leaflets, public meetings) rather than the more traditional mass media of radio and television to message audiences."

Contrast this triumph of the common people, then, with the altogether more pessimistic comments on the freedoms the internet offers by Prof. Seymour Goodman in the third paper of the ARW (see his chapter "Critical Information Infrastructure Protection"). Prof. Goodman is the chairperson of the Committee on Improving Cyber Security Research at the National Research Council, advising the U.S. Congress.
Much of what the professor had to say, and this was reflected also in the Working Groups of the ARW, had to do with the vulnerabilities of the globalized net to abuse by terrorists, and the need for CIIP (Critical Information Infrastructure Protection). It is clear that the "current technology asymmetrically favours the attacker, and provides them with great non-linear leverage. The attackers can put their innovations into practice more quickly and effectively than the defenders." However, when much of the network is outsourced, or owned by companies in a variety of countries, defence is left to the end user. As Seymour Goodman writes, "most of the 200-plus connected countries have little or no national cyber security capabilities." Users are often unaware of the seriousness of the risk. Frequently, networks controlling important infrastructure are not 'air-gapped', or separated carefully enough from the worldwide internet. If even one employee's computer is not air-gapped, perhaps due to negligence, this is enough to create a route for a determined and skilled attacker to gain entry to the whole system.

Professor Goodman's chapter in this book also contains a wide range of recommendations for national and international action. He begins with general measures, which would be equally relevant to protection against accidents, disasters, crime, or forms of conflict other than terrorism. Emergency response systems, including ones with an international dimension, must be in place; SCADA systems must be made more secure, with security as "a factor to be considered over the entire life cycle of any system that is part of the CII"; and countries "must build cadres of capable defenders", including national-level CSIRTs (Computer Security Incident Response Teams).
On the issue of legal measures against cyber terrorism, Seymour Goodman mentions the need for international conventions, as well as effective national laws. The conventions would relate to three areas: crime and punishment, infrastructure protection, and arms control. In each case he gives examples already in place which could guide developments in combating cyber terrorism. Among these, the agreements on civil aviation are the best model for developing a similar legal and institutional framework for CIIP. However, it will be difficult to gain acceptance for a CIIP convention, especially as every country would have to sign up; otherwise measures protecting the network could simply be by-passed. Such a convention could come under the umbrella of the UN, and it would involve the creation of an organization to build and certify national capabilities.

Phillip Brunst's paper (see the chapter "Use of the Internet by Terrorists") is a highly analytical overview of the subject, valuable for those considering an appropriate legislative approach to combating terrorists' use of cyberspace. The overview covers both of the distinct aspects which emerged at the ARW: cyber terrorism proper, and terrorist use of the internet for communication, propaganda, researching targets, etc. After discussing the advantages of cyber attacks for the terrorist (anonymity, low cost, etc.), types of cyber attack are analyzed. In general, attacks on IT systems may take the following three forms: (1) hacking attacks on individual systems; (2) Denial of Service (DoS) attacks, usually by bombarding a computer with messages so that it cannot process anything else; and (3) 'hybrid attacks', which combine one or both of the above with a conventional terrorist attack such as a bombing.

(1) Hacking can be further analyzed into three types.
The hacker can shut down a computer, although here the administrator can usually recognize the problem and restore the system rapidly. There are also so-called 'defacements', which alter the information on the victim computer. Typically these are easily recognized, especially if the hacker places a notice saying "you have been hacked by …". Potentially more disruptive are defacements which subtly change figures or other information. Thirdly, there is the possibility of introducing 'Trojan horse' programmes. These operate silently and aim to pass undetected by virus scanners. They gather data from the target computer (typically bank details, in cyber crime) and relay it to the hacker.

(2) Distributed Denial of Service (DDoS) attacks are an effective way of putting computers out of action for a period of time. DoS attacks bombard a computer with vast numbers of messages, occupying all its processing capability. 'Distributed' attacks make use of worldwide networks of computers (so-called 'bot-nets', from their use of 'robot' software) infected with a virus which allows them to act as 'zombies' controlled by a 'bot-master'. These viruses have become very common. Terrorists would not even have to control such systems themselves: the services of a bot-net, typically used for mass mailings, can be hired for prices ranging between 150 and 400 US dollars per day.

(3) Hybrid attacks combine one or both of the above with a conventional terrorist attack. For example, a terrorist group might combine a bombing with a DoS attack to hamper the work of the emergency services. Terrorists might also target the physical hardware of IT communications, like the 'bundles' of cables or the so-called 'peering points'.

All the above types of attack would harm IT data and lead to economic losses. A more deadly kind of cyber attack is now discussed in security circles, namely attacks on
SCADA systems, which usually run on well-known operating systems like Windows. Many companies now use SCADA systems to monitor and control production or supply processes. It is clear that, if such a system is hacked, there is a considerable danger of the kind of loss of life associated with 'conventional' forms of terrorism.

Phillip Brunst recommends measures to encourage companies to invest more in security. Secondly, referring to Article 35 of the CoE Convention on Cyber Crime, he sees a need for designated communication paths within and between countries to fight digital attacks. On the issue of the terrorist presence on the internet, he sees efforts to block terrorist communications as bound to fail; these communications should instead be monitored for intelligence (compare the chapters by Prof. Gabriel Weimann and Yael Shahar).

Lt. Paul Everard attended the workshop to represent the NATO Computer Incident Response Capability at the alliance's European headquarters in Belgium. His presentation (see the chapter "NATO and Cyber Terrorism") is an introduction to cyber terrorism and the defensive measures NATO is taking. Lt. Everard begins by giving numerous illustrations of cyber attacks to show what directions cyber terrorism might take. There was the dramatic hacking of a SCADA system controlling sewage in Queensland, Australia: "Symantec research highlighted an Australian case where a disgruntled ex-employee, Vitek Boden, hacked into a computerized waste management system in Maroochy Shire and caused millions of litres of raw sewage to spill into local parks, rivers, and even the grounds of a Hyatt Regency hotel in March 2000." If terrorists could replicate the destructive effects of the 'Slammer Worm' of January 2003, they would score a great success in their terms.
This computer worm spread across the world in a matter of minutes, and the resultant disruption of banking, airline, infrastructure and emergency services had a high economic cost. Lt. Everard notes that "the safety monitoring system at a nuclear power plant was disabled for a combined period of eleven hours."

Paul Everard then focuses on the attacks that have been directed at NATO itself, including attacks by Chinese hackers after NATO bombed the Chinese embassy in Belgrade (1999), and a distributed attack on the NATO mail server on 9–10 August 2006, when "the attack was stopped by re-configuring the mail server to respond correctly to the attempted e-mail relay traffic." The organization has therefore long been aware of its vulnerability to cyber attacks. It generally uses 'off the shelf' software, whose vulnerabilities are well known to potential hackers. Also, "although NATO's internal networks are supposedly separated from the internet, documents, messages and other data are being uploaded onto the internal network constantly."

With the approval of the North Atlantic Council, the NATO Computer Incident Response Capability was added to InfoSec after 9/11. At present there is an Intrusion Detection Systems project, which will reach full operating capacity in 2008. The Prague Summit of 21 November 2002 was attended by the leaders of NATO countries, who signed a commitment to "strengthen our capabilities to defend against cyber attacks". The paper concludes that providing security can be seen in terms of the following cycle: (1) Protect: this involves 'system hardening measures' and anti-malware support for NATO projects. (2) Prevent: this means assessing and notifying vulnerabilities, as well as conducting training and awareness-raising. (3) Detect: using intrusion detection systems twenty-four hours a day, and checking incoming mail. (4) Respond: the teams
must be ready to respond to incidents at any time of the day or night. (5) Recover: a recovery support service must be present, or available on-line, to ensure minimal disruption.

Both this NATO presentation and, in particular, that of Ms Reet Oorn of the Estonian Informatics Centre, Tallinn, referred to the massive DDoS attacks on the Estonian government and institutions in April–May 2007. Ms Oorn gives a fascinating eye-witness account of how the Estonian government fought back against the attacks, considerably increasing the bandwidth of its computers (see the chapter, jointly written by Ms Reet Oorn and Ms Eneken Tikk, on "Legal and Policy Evaluation: International Coordination of Prosecution and Prevention of Cyber Terrorism"). The Estonians showed a united front, as government equipment was supplemented by that of private sector companies. Ms Oorn illustrates with detailed graphs and discusses the results of the assessment conducted by her Informatics Centre. These showed that the attack came in two phases. The initial phase of attacks was on a small scale, and seemed designed to test the limits of the target computers. These attacks were associated with the 9 May WWII victory anniversary important to pro-Russian Estonians, who were already protesting violently about the prime minister's decision to remove a statue commemorating Russian heroes. The second phase was much more professionally organized, and hours of bombardment by bot-nets had clearly been purchased. In terms of the success of the attacks, it is generally agreed that Estonia, which has some of the highest figures for internet use in the world, survived well. Two of the biggest banks in Estonia came under heavy DDoS attacks, and on-line services were unavailable for several hours.
Attacks were also directed against critical routers at the Internet Service Provider level, which disrupted the government's internet-based communication for a short time. Some government websites experienced temporary loss of service.

Two speakers at the ARW addressed the issue of whether legal controls can be imposed on the internet. Ms Eneken Tikk (Faculty of Law, Tartu University, Estonia), unlike Seymour Goodman, does not expect much of the UN: "One could argue that the method of developing legal instruments that the United Nations has used fails because it is too focused on building a consensus about … existing methods used by terrorists. It cannot lead the fight against new methods (such as cyber terror). Thus, we might consider using the United Nations experience as an argument to avoid an overly reactive (rather than proactive) approach …" (see the chapter, jointly written by Ms Reet Oorn and Ms Eneken Tikk, on "Legal and Policy Evaluation: International Coordination of Prosecution and Prevention of Cyber Terrorism").

The Estonians' paper contains incisive comments on the main legal instruments concerning cyber attacks, relating these especially to terrorism. These address the Cyber Crime Convention (ETS No. 185), which, with the Convention on the Prevention of Terrorism (CETS No. 196), is "the most important international instrument for fighting cyber terrorism and other terrorist use of the Internet." However, not enough states are party to this agreement, weakening it considerably. Also, "serious threats to commit terrorist acts are not adequately covered either by this Convention … this Convention should be evaluated with regard to its ability to cover technological advances, particularly in the area of forensic investigative techniques (such as online searches or the use of key logger software). In the fast-paced technological environment of cyber crime, such evaluations, which frequently lead to revisions and
updates, are an absolutely normal process, especially when dealing with high risks such as those posed by terrorism."

In general, like the other lawyers at the workshop, Ms Tikk warned that attempts at legal control of the Internet might lead to infringements of civil liberties. However, perhaps with the attacks on Estonia in mind, which led to almost no prosecutions, she adds: "Should a decision to amend the Convention be taken, the possibility of excluding the political exception clause for some of the Convention's offences might also be considered, especially in serious cases of data and system interference." The paper also gives details of amendments to the Estonian Penal Code, designed to strengthen the hand of prosecutors if similar attacks come again. Estonian politicians have also launched an initiative at the EU level to amend the Framework Decision on Attacks against Information Systems (2005/222/JHA).

One other discussion of international law is offered by Police Superintendent Dr. Süleyman Özeren. His paper (see the chapter "Cyberterrorism and International Co-operation: General Overview of the Available Mechanisms to Facilitate an Overwhelming Task") discusses definitions and typologies of cyber terrorism, and considers which of the available international organizations might most effectively achieve "consensus-based, concrete, result-oriented co-operation".

The papers mentioned so far examine cyber terrorism in the proper sense of the term, and how to respond in terms of technology, awareness, and legal/political measures. However, there is also the related question of responding to the terrorist presence on the internet (so-called 'terrorist contents'). Here the internet is not a weapon but an important tool for terrorists' communications (co-ordination, training, recruiting) and for information gathering on the targets of planned attacks. The COE–DAT workshop included four fascinating papers on terrorist contents.
An undoubted expert on terrorist websites is Prof. Gabriel Weimann, a professor of communication who from an early stage has been archiving literally thousands of terrorist websites, from al-Qaida to FARC, and Hizbullah to the PKK (see the chapter "WWW.AL-QAEDA: The Reliance of al-Qaeda on the Internet"). This project, based at Haifa University, brings many different analytical approaches to bear on this material, including link analysis, participant observation, language analysis, and case studies. Prof. Weimann's paper reports on his project, with colourful illustrations from the world of terrorist websites. The professor shows how, since 9/11, al-Qaeda operatives have sharpened their internet skills and increased their web presence. When the Americans drove al-Qaida from its camps in Afghanistan, the organization was dispersed and forced to retreat into cyberspace. As Gabriel Weimann shows, they now make such extensive use of the internet that they can be said to rely upon it.

Also giving the ARW an account of a terrorist organization's use of the internet, Capt. Erdoğan Çelebi has built up a wealth of knowledge, and uses a high-tech approach, in his research on the terrorist Kurdistan Workers' Party (PKK) (see the chapter "A Case Study: the PKK and Cyberspace"). This is an exemplary study, showing the amount of information that can be gathered from the Internet concerning a single organization. It shows that the PKK has created, or is closely linked to, thirty-eight websites. In addition to data and analysis, the paper gives some indication of the style of the websites and the way the PKK seeks to present itself to its various audiences.
Of particular interest is the fact that Erdoğan Çelebi uses Ucinet software to conduct various kinds of link analysis of the PKK-related sites. This technology provides a method for demonstrating which sites were used by PKK leaders in the field, and which are the main sites propagating their message. This may have practical applications: "Taking out these hubs will make the rest of the network individual islands that have no connection to the others. The question in terms of counter terrorism agencies is how many of these hubs have to be taken down to crash the whole network."

Other papers based on the phenomenon of 'terrorist contents' offered, in my view, sharply contrasting practical responses. Yael Shahar, of the Institute for Counter Terrorism in Herzliya, Israel, spoke on "The Internet as a Tool for Intelligence and Counter-Terrorism". Yael Shahar notes that "The jihadi online presence is literally the physical brain of the global jihad movement. The very openness and accessibility of this medium provides the intelligence community with a wealth of material for foundation intelligence and analysis." Arguing that we should 'tune in' to these communications rather than try to shut them down, she pointed out that much can be learned from analysis of websites and chat-rooms about the enemy's situation, plans, and weaknesses. Shahar is also interested in exploiting these weaknesses for counter-terrorism purposes, using the legally shady method of 'hacking back', exploiting the same anonymity and access from which the terrorists benefit. She describes an armoury of techniques: sowing dissent, countering propaganda, and secretly altering instructions on websites.

By contrast, Dr. Katharina von Knop proposes an open source response. Instead of concentrating on breaking down the structures created by the enemy, here is a proposal to build a new counter-structure.
Her discussion paper (see the chapter on the "Institutionalization of a Web-focused, Multinational Counter-terrorism Campaign – Building a Collective Open Source Intelligent System") focuses on the organizational and management issues surrounding such a system. As she writes: "There is an intense need to work on new solutions to develop effective and efficient counterterrorism measures that follow the democratic process, values and freedoms. Knowledge discovery, data mining techniques and data fusion play a central role in improving the counter-terrorism capabilities of intelligence, security and law enforcement agencies. … Having all the challenges in mind, this article will focus on the most important and highly sensitive one, international cooperation. This contribution … highlights the most important factors towards the development and institutionalization of an international interagency collective open source intelligent system regarding the threat of Islamist terrorism."

Dr. von Knop points out that, if such a co-operative campaign is to succeed, it will need to be arranged in an innovative and flexible way: instead of a hierarchical organization there would be a network, and knowledge would be pooled. There would be committee management and a credit point system, and governments would be allowed to use the resource only to the extent that they contribute good quality information and analysis. The Collective Open Source idea is a well thought-out response to the challenge of organizing international cooperation regarding terrorist contents on the Internet.

It is a cause for optimism that the speakers, coming from a variety of backgrounds, presented so many practical ways to respond to the problem of cyber terrorism. A vital next step is for the experts, with the support of governments and international organizations, to agree on priorities and methods and to implement a common strategy.
Participants at the conference gained, perhaps, an impression of the form such discussions between experts might take from the Working Groups that met at the end of each day's presentations. The answers that emerged from the Groups are compiled in the last chapter of this book (see the "Summary of Working Group Discussions").

Osman Aytaç, Col.
ARW Director
Contents

Preface vii
    Osman Aytaç
The History of the Internet: The Interwoven Domain of Enabling Technologies and Cultural Interaction 1
    Clare Cridland
Institutionalization of a Web-Focused, Multinational Counter-Terrorism Campaign – Building a Collective Open Source Intelligent System. A Discussion Paper 8
    Katharina von Knop
Critical Information Infrastructure Protection 24
    Seymour E. Goodman
Use of the Internet by Terrorists – A Threat Analysis – 34
    Phillip W. Brunst
WWW.AL-QAEDA: The Reliance of al-Qaeda on the Internet 61
    Gabriel Weimann
Cyberterrorism and International Cooperation: General Overview of the Available Mechanisms to Facilitate an Overwhelming Task 70
    Süleyman Özeren
Legal and Policy Evaluation: International Coordination of Prosecution and Prevention of Cyber Terrorism 89
    Eneken Tikk and Reet Oorn
The Internet as a Tool for Intelligence and Counter-Terrorism 104
    Yael Shahar
NATO and Cyber Terrorism 118
    Paul Everard
Analysis of PKK/KONGRA-GEL Websites to Identify Points of Vulnerability 127
    Erdoğan Çelebi
Summary of the Working Group Discussions 142
    Osman Aytaç
Author Index 145
The History of the Internet: The Interwoven Domain of Enabling Technologies and Cultural Interaction [1]

Clare CRIDLAND
Ministry of Defence, Whitehall, UK

Abstract. The development of the internet is much more than a story of technological achievement: it is a story of social change. It is a history not only of increasingly accessible interconnected computers and user-friendly software, but of a technological revolution. The internet has enabled the democratisation of information sources, moving them away from the elites of the mass media and institutionalised politics and into the hands of active, assessing audiences. This paper addresses both the technological and the social change that the internet has brought about. It gives a brief history of the technological development of what is now readily known as the internet, including some of the more popular software applications associated with it. I then look briefly at the main issues of social interaction on internet platforms and at how information sources have changed. Finally, I make some observations on what the future of the internet may be.

Keywords. Internet, culture, information society, democratisation of information, mass media, telecommunications

Technological Development

The history of the internet is actually a component of the history of mass telecommunications. This started in 1837 with Samuel Morse's invention of a telegraph transmitter and receiver. After five attempts at a trans-Atlantic communications cable, [2] in 1866 the foundations were set in place for near-instant communication between nation states, which would essentially compress their geographical distances.
For computer hardware, it was over a hundred years after Morse's invention that the first semi-programmable computer, known as 'Colossus' (and the size of a small office), was built for code-breaking at Bletchley Park in the UK during the Second World War.

1 Note: The basis for this paper is a speech given by the author to the Information Operations Europe conference in London, June 2007, entitled 'New Media and Technology Analysis for IO: Moving with the Times'.
2 The first cable to give tangible results was laid in 1858. It was destroyed when a message was sent more quickly than the norm and the higher voltage burned out the wire.
What brought the two technologies together was a project run by the US Department of Defense's Advanced Research Projects Agency (DARPA), which focused research into computer connections and mass communications technologies. In 1968 DARPA called for tenders for a project called ARPANET, a system to connect computers and transfer data 'packets' between them. Until then, connections had relied on circuit switching rather than the transfer of data. The concept of interconnected computers, and some of the technologies supporting a network, had been devised a few years earlier. In 1964, the RAND Corporation looked into a communications network that could link cities, states and military establishments. The core feature was that the network had no central hub of authority and so could be more resilient during a third 'total war', a war which, in the 1960s, was expected to include the use of nuclear weapons. After ARPANET's first public demonstration in 1972, the service grew over the following 18 years to connect new institutions and to run email, newsgroups and limited international communication. However, ARPANET was still very much an institutionalised communication network. Running in parallel, similar academic networks elsewhere in the world generally remained independent, constrained by the inhibitive cost of international data connections. Meanwhile, in the public domain, computers were generally employed only on routine work within companies or as games machines in homes. It wasn't until the 1990s that the internet in its current form evolved. Organisation of such a vast amount of data was becoming a key issue. Technical solutions, such as the Domain Name System (DNS), transformed hard-to-remember internet protocol numbers into easy-to-remember names. The Defense Data Network – Network Information Center handled all registration services, including the top-level domains such as .mil, .org and .gov.
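The translation DNS performs can be illustrated with a few lines of present-day Python; this is a modern sketch, not part of the historical systems described above, and it uses the standard library's `socket` module simply as a convenient resolver:

```python
import socket

# DNS in miniature: turn an easy-to-remember name into the
# hard-to-remember protocol address it stands for.
address = socket.gethostbyname("localhost")
print(address)  # conventionally 127.0.0.1, the loopback address
```

The same call against a public domain name would consult the global DNS hierarchy that grew out of the registration services described above.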
Only in 1992, as non-defence and public access grew, did the US Department of Defense stand back from the internet and pass registration over to civilian contractors. In the 1990s, the use of, and the applications available on, the internet grew at an astonishing rate, assisted by improving telecommunications infrastructure. With this came the requirement to regulate protocols and domains in order to manage the enormous increase in information. Bulletin Board Systems (BBS) were popular: systems in which users' computers, attached to modems, left messages on servers for others to reply to. Hypertext, a concept dating from 1945, had been used in various networks as a method of organising information. But the real starting point for what became known as the World Wide Web (WWW) was in 1993, with the development of graphical browsing – the web browser. Over subsequent years, numerous web browsers appeared, and the directory system of information sources was generally replaced by search engines that browsed for the most relevant sites. Essentially, the internet was evolving into a popularity contest between sites rather than a library or directory. Innovation and market-led economics that drove a consumer appetite for new technology played important roles in the development of the internet. Devices became smaller and even portable, while growing in data storage and processing capability. Prices of equipment and telecommunications costs steadily fell, making devices more accessible to the public. Applications were developed in two ways: top down by large software development companies, but also bottom up by intensive personal users. It was this participatory culture of technological development of the World Wide Web that brought about the largest changes in how the world received its information.
Democratisation in the 'Information Society'

The revolution will not be televised – Gil Scott-Heron3

Pre-dating the world wide web, newsgroups and e-mail gave users the ability to exchange information and pass it around without the need for a filter or mediator. As the internet grew, so did the sources of information available to users. This 'source bombardment' has had a number of consequences for the traditional mass media of print and broadcasting, as well as for the actors who use it (such as politicians and advertisers). The vast number of information sources available to an internet-enabled society has seen a single mass audience diluted into multi-source, self-assessing segments. New media in the early 21st century is a participatory, user-driven information environment, far from the linear platform of the mass media, which delivered information through a 'gatekeeper' to a passive mass audience. Those outlets – radio, television stations and newspapers – were capital intensive and subject to varying levels of government or government-inspired intervention or regulation. Information ownership in the mass media was, therefore, somewhat privileged. In contrast, new media, driven by technological change in telecommunications, has undermined this sphere of knowledge ownership and the singular authority of mass media companies. As a result, content ownership is becoming more complex. Bloggers and websites repackage information from traditional mass media sources into new forms and link it to other websites. Widely available (and easy to use) editing software means images and sounds can be edited into something completely different and re-published in their new form. Downloading music and movies from site to site is an ongoing challenge to copyright holders, just as copying (or 'pirating') music to audio cassette once was, but the scale of the global proliferation of copying is unprecedented.
However, we have been here before. 'Counter-culture' always used 'grassroots media' (folk songs, posters, leaflets, public meetings) rather than the more traditional mass media of radio and television to message audiences.4 The alternative sources now residing on the internet are merely offering new platforms to the old grassroots, and potentially giving them a global audience. Arguably, the growth in the popularity of alternative sources is also driven in part by the demise of the large audience once afforded to the mass media. In the UK, circulation figures for the national newspapers have been in general decline since the 1950s, and television audiences are regularly no higher than eight million for the most popular programmes, a fall of nearly ten million in the past twenty years. Academics cite a number of reasons for this demise, from the availability of multi-platform satellite and digital television stations that fracture the audience into smaller viewing groups, to a perception that the content of the mass media serves the interests of advertisers and financial backers rather than audiences.

3 Scott-Heron's song lyric became the title of the memoirs of Joe Trippi, the campaign manager of 2004 US Presidential candidate Howard Dean. Trippi used a number of new media and grassroots platforms to promote and raise money for Dean's campaign.
4 From Jenkins, 2006.
On-line Strategic Communications – A Bowl of Noodle Soup

The internet has enabled self-publishing, which means governments, companies, special interest groups and individual members of the public have – in theory at least – an equal voice. It is a place where narratives and counter-narratives compete for attention, and a place where conspiracies unwind without media filtration. Equally, the power of message interpretation no longer lies with the mass media 'gatekeepers', but with individual members of an audience. Both have significant consequences for anyone engaged in public messaging. Messages are now available multi-platform, sent as digital audio, text or instant messaging to a number of static and portable devices. They can be sent simultaneously from a single source to different receivers, or as unconnected multiple sources. One no longer has to visit websites physically to gain information: once users have subscribed to a Really Simple Syndication (RSS) feed, content comes to them without their needing to visit the website at all. Memes (a term originally coined by the evolutionary biologist Richard Dawkins to describe how cultural information propagates between minds) are items that spread quickly across the internet via virtual word of mouth and self-publishing. Meme tracker sites, such as 'techmeme' and 'tailrank', track the most popular items on the internet. The information age is like a bowl of noodle soup – a mass of communication strands from sender to user floating in a soup of information. A user takes one strand at a time, maybe several, depending upon how big the user's fork is. The vast array of sources means the audience, unable to digest them all at once, is now self-selecting within its own agendas. Separating and recognising fact from an author's opinion is just one of the issues facing information consumers. Audiences are constructing individual hierarchies of sources.
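What an RSS reader does behind the scenes can be sketched in a few lines of Python. The feed below is an invented sample, and the parsing is a minimal illustration rather than a full RSS implementation:

```python
import xml.etree.ElementTree as ET

# A made-up RSS 2.0 snippet of the kind a subscriber's reader fetches,
# sparing the user a visit to the website itself.
feed = """<rss version="2.0"><channel>
  <title>Example News</title>
  <item><title>Story one</title><link>http://example.org/1</link></item>
  <item><title>Story two</title><link>http://example.org/2</link></item>
</channel></rss>"""

root = ET.fromstring(feed)
headlines = [(item.findtext("title"), item.findtext("link"))
             for item in root.iter("item")]
for title, link in headlines:
    print(title, "->", link)
```

A real reader would fetch the XML over HTTP at intervals and show the user only the items that have appeared since the last poll.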
Some place their trust in information received from mass media outlets over that from a blog or chatroom; for others, the hierarchy is different, putting information gained from virtual contact above that of the mass media. This is an evolving area of understanding, but it stems from which sources are perceived to be important, on which subjects, to which members of the audience. Local events in my street are important, but I wouldn't find them reported on a national radio bulletin (unless an event was particularly serious). Similarly, timely news about my extended family elsewhere in the world would come through telecommunications – hearing and seeing them via the internet, perhaps – not through the channels of the mass media. Arguably, communication has always been thus, but the connected environment has changed perceptions of what events are now important to us.

Time Space Compression

The speed of information is wondrous to behold. It is also true that speed can multiply the distribution of what we know to be untrue. – Edward R. Murrow

Advances in telecommunications mean that multiple channels of democratised information travel faster across greater distances: geographical place is no longer relevant. As audience members pick and choose their credible sources, nation-statehood may not enter the equation as a credible boundary to information or
communication control. The popularity of social networking lies in the fact that friends in a physical space (such as a school or workplace) can keep in touch virtually when they are geographically separated. The mass media news agenda, too, 'chases the sun' across the globe, with global outlets setting the day's agenda in Australasia and following the rising sun across the continents into the Americas. Indeed, the front page of Google News at 0900 GMT is more likely to consist of stories from Asia than it is at 1800 GMT, when North American stories dominate. As Will King of CNN once said, 'it's always prime time somewhere'.5 This is compounded by devices that converge new media platforms, such as a mobile telephone with an embedded camera and an e-mail capability. Information can be supplied instantly in a number of formats (or 'cross-platform'). For the mass media and those engaged in message campaigns, dominant narratives on traditional platforms are challenged by alternatives at the same speed – in many cases, faster. Uploading a message into the public sphere of the internet is now instant, and it will be only a matter of time before the quality of pictures improves to mass-media broadcast standard.6 The opposite of speed is also true. Information now has greater longevity in the new media sphere through archiving and smart searching (the so-called 'long tail'). In the media and public sphere, after an initial burst of activity a piece of information is replaced by new pieces of information and generally forgotten as the day progresses; on certain internet platforms, that first piece of information can live on, and even re-emerge into the public sphere.

Terrorist Use of the Internet

What of terrorists, extremists and criminals and their use of the internet? I argue that they use the technologies available to them in the same way as any other group on-line.
Recruitment, fundraising and the promulgation of ideologies and ideas, through to planning, co-ordination and publication, are as relevant to a charity or a workplace as to a group engaged in criminal activities. Fear of the opportunities the internet might afford a criminal mirrors the fears that greeted telephones, radio and television in their early days (Furedi, 2006). For instance, the development of television during the late 1930s in the UK was put on hold until the end of the Second World War. What the internet and mass communications have changed is the ease with which messages can be broadcast to disparate audiences.

The Future of the Internet

Globalisation, as defined by rich people like us, is a very nice thing… you are talking about the internet, you are talking about cell phones, you are talking about computers. This doesn't affect two-thirds of the people of the world. - Jimmy Carter

5 Cited in Campbell, 2004.
6 Arguably, some mass media outlets are willing to accept a loss of picture quality for the immediacy or alternative narrative of a story. This is illustrated by the UK broadcast media's almost routine use of mobile telephone pictures from witnesses and of CCTV pictures.
There are a number of scenarios that could affect the internet in the years to come, ranging from flourishing success to collapse. Technological change will continue to drive development, but so will government legislation and economics. There are parts of the world where the national infrastructure is undergoing change and development, from the availability of the one hundred-dollar laptop to digitisation and broadband. Worldwide, internet use is still a minority pursuit, at only 17.8% of the population in June 2007 (around 1.1 billion people), but that is an increase of some 225% since 2000.7 Regulation and censorship of the internet will shape its future architecture. The trans-national nature of global communications has proven difficult for nation states to govern, and even the supranational European Union's collective legislation of television in the new multi-platform environment has been problematic. National constraints in one country do not necessarily hold true in another, which may lead to not one internet but several running in parallel, operating with varying degrees of filtering and censorship. Economics is also a key influence. Disposable incomes in developed countries have generally driven the internet's development, and should financial buoyancy begin to slow or reverse, the buying power of consumers may also slow and the take-up of converged devices or higher-speed telecommunications may fall. The physical infrastructure of the internet could also be at risk. As users become ever more reliant on networked computers in everyday life, telecommunications, power supplies and hardware resilience become prime targets for attack. An earthquake knocked out an underwater telecommunications cable – and, therefore, the internet – to parts of Japan for nearly a week earlier in 2007.
Unreliable infrastructure has stifled the development of the internet in many parts of the world, particularly in rural regions, and such problems could be its undoing. Catastrophic failure of the internet could also come from within, as malicious code or a virus may be virulent enough to close down servers or cause large-scale communications damage.

Conclusion

After its beginnings as a tool to support the war fighter and assist civilian resilience, the commercial incarnation of the internet and its supporting technologies has been a significant driver of enormous technological change across the world. Many academics judge these changes to be the most extensive since the invention of the telegraph. There is now a generation of people who have never known a time without the internet as part of their information space. We live in a congested, self-selecting media environment. The public sphere has grown beyond recognition, giving individuals and groups greater opportunities to communicate directly with a target audience. But there is still a legitimate role for the multinational conglomerate mass media. Surveys in 2006 showed that 'traditional' media sources are considered more credible than new media sources, although some argue that this position is continually eroding. Additionally, such organisations are one step removed from the content that appears on-line; instead, the businesses own the platform, the host site and the telecommunications providers. Traditional mass media is not in a twilight age, but we

7 Internet World Stats from June 2007, accessed from http://www.internetworldstats.com. The highest penetration of the internet into a population is in North America, Australasia and Europe. The largest number of users is in Asia, Europe and North America.
live in a noisy message environment, and the internet is forcing all of us engaged in delivering and consuming messages to be somewhat more selective about the ones we wish to influence us.

References

Campbell, V. (2004), Information Age Journalism, London: Arnold.
Freedman, D. (2006), 'Internet Transformations: "old" media resilience in the "new media" revolution', in Curran, J. and Morley, D. (eds), Media and Cultural Theory, London: Routledge, pp. 275-290.
Furedi, F. (2006), Culture of Fear Revisited, London: Continuum.
Jenkins, H. (2006), Convergence Culture, New York: New York University Press.
Kember, S. (2006), 'Doing Technoscience as ("new") media', in Curran, J. and Morley, D. (eds), Media and Cultural Theory, London: Routledge, pp. 235-249.
Nacos, B. (2007), Mass Mediated Terrorism: The Central Role of the Media in Terrorism and Counterterrorism, 2nd edition, Lanham: Rowman & Littlefield.
Ryan, J. (2007), Countering Militant Islamist Radicalisation on the Internet: A User Driven Strategy to Recover the Web, Dublin: Institute of European Affairs.
Quotes from http://www.brainyquotes.com
Institutionalization of a Web-focused, Multinational Counter-terrorism Campaign – Building a Collective Open Source Intelligent System
A Discussion Paper
Dr. Katharina von Knop
George Marshall Center, Germany

Abstract. When we turn our attention to the fast-growing Internet activities of various radical and terrorist entities, there is an urgent need to work on new solutions and to develop effective and efficient counterterrorism measures that follow democratic processes, values and freedoms. Knowledge discovery, data mining techniques and data fusion play a central role in improving the counterterrorism capabilities of intelligence, security and law enforcement agencies. The broad diversity of potential sources of web-based and web-focused attacks, our reliance on information systems that are inherently insecure, and the international dimension of both cyber attacks and governmental responses raise a host of complicated policy questions and cultural challenges for governmental security institutions. These include how best to improve the state of cyber security, and what can be done to improve international interagency cooperation in stemming cyber crime and in preventing and responding to cyber terrorism and cyber warfare. With all these challenges in mind, this article focuses on the most important and most sensitive one: international cooperation. This contribution, written in the style of a discussion paper, highlights the most important factors in the development and institutionalization of an international interagency collective open source intelligent system to address the threat of Islamist terrorism.

Keywords. Counter-terrorism, cyber terrorism, cyber security, international co-operation

1.
Introduction

Continental western countries are very concerned about the threat of terrorism, not least because several attacks have occurred on our continent since 11 September 2001, and many more have been prevented. But we are also very aware that one aim of terrorists is to undermine our democracy, the rule of law and our human rights. Many continental western countries have only recently acquired democratic systems, and the democratic values and freedoms they have achieved are therefore highly vulnerable. On the other hand, the continental countries and their security institutions have noticed a growing
Islamist radicalization in recent years.1 Intense investments have been made to prevent classical terrorist violence, but the western countries remain highly vulnerable to cyber attacks against the computer networks that are critical to national and economic security. The growing complexity and interconnectedness of these infrastructure systems, and their reliance on computers, not only make them more vulnerable to attack but also increase the potential scope of an attack's effects. The fear that has prompted governments to pump significant resources into protecting the critical national infrastructure (CNI) is that al-Qaeda is determined to use cyber terror to cause damage leading to loss of life and economic catastrophe. One type of online operational activity is the use of hacking techniques to sabotage Internet sites – what the Islamists term "electronic jihad". As part of this activity, Islamist hackers attack websites of those whom they consider their enemies with the aim of damaging morale, and they attempt to hack into strategic economic and military networks with the aim of inflicting substantial damage on infrastructures in the West. Many Islamist websites and forums have special sections devoted to the topic of electronic jihad, such as the electronic jihad section in the Abu Al-Buhari forum.2 These developments require effective and efficient counter- and anti-terrorism measures. The dry textbook definition of cyber terrorism is terror directed at automated systems, or terror that uses automated systems to disrupt the critical infrastructure systems they support or control.
Cyber attacks generally consist of directed intrusions into computer networks to steal or alter information or damage the system; malicious code, known as viruses or worms, that propagates from computer to computer and disrupts their functionality; or denial of service attacks that bombard networks with bogus communications so that they cannot function properly. It has to be noted that the motivations for an attack can vary widely: attackers range from hackers bent on proving their skills to others in the hacking community, to criminals stealing credit card numbers, to extortion rings, to foreign intelligence services stealing military or economic secrets, to terrorists or foreign armies wanting to cause widespread damage to the western countries.3 The arsenal of modern weapons that terrorists and other unfriendly entities might someday use to disrupt power grids, gas lines and other parts of the nation’s critical infrastructure includes conventional weapons as well as bits and bytes – in other words cyberterror attacks. The global nature of the Internet and telecommunications networks means that cyber attacks can be launched from anywhere in the world, at low cost, and with incredible speed. With current technology, it is nearly impossible to predict in advance when an attack may begin. There is no longer the luxury of the 20-minute window from launch to landing of a nuclear-tipped intercontinental ballistic missile as there was in the Cold War. Cyber attacks therefore require swift responses and effective cooperation with international counterparts to detect and respond to an attack once it is underway. 1 Ministerie van Binnenlandse Zaken en Koninkrijksrelaties, National Coordinator for Counterterrorism (NCTb), Jihadis and The Internet, 2006. 2 Memri, The Enemy Within: Where Are the Islamist/Jihadist Websites Hosted, and What Can Be Done about It? The Middle East Media Research Institute, Inquiry and Analysis Series, No. 374, July 19, 2007, p. 2. 
3 Several nations like the US, Russia and China have already developed cyber warfare or “information warfare” doctrine, programs, and capabilities. Other often cited examples are France, Israel, India and Pakistan. The Defense Department’s Foreign Technology Assessment (FTA) for 2000 suggested that around 25 countries may now have the ability to carry out significant cyber attacks.
Potential adversaries, such as terrorist and organized crime organizations as well as state actors like China,4 are looking for weaknesses in the governmental information infrastructure of the continental countries and mapping out where and how they would mount a cyber attack, or how they could "just" use the Internet for purposes such as propaganda, recruiting, data mining and funding. In 2002 US officials discovered an al-Qaeda safe house in Pakistan devoted solely to training people for computer hacking and cyber warfare. "Calling it a 'cyber academy', intelligence officials said al-Qaeda operatives gathered information and expertise on the automated systems that control U.S. infrastructure, such as dams and power grids."5 In June 2006 a hacker penetrated an unclassified Pentagon email system, prompting authorities to take as many as 1,500 accounts offline, US defence officials said.6 "Confidential documents about supervisory control and data acquisition (SCADA) systems, for instance, have been found in al-Qaeda hiding places in Afghanistan, while the Irish Republican Army has said it plans cyber attacks on crucial supply systems."7 Scotland Yard has uncovered evidence that al-Qaeda has been plotting to bring down the Internet in Britain. In a series of raids, detectives recovered computer files revealing that terrorist suspects had targeted a high-security Internet "hub", the headquarters of Telehouse Europe in London.8 For almost two years, intelligence services around the world tried to uncover the identity of an Internet hacker who had become a key conduit for al-Qaeda. The Internet- and computer-savvy individual, presumably a young webmaster, taunted his pursuers, calling himself Irhabi – Terrorist – 007.
He hacked into American university computers, propagandized for the Iraqi insurgents led by Abu Musab al-Zarqawi and taught other online jihadists how to wield their computers for the cause. Suddenly, in fall 2005, Irhabi 007 disappeared from the message boards. The postings ended after Scotland Yard arrested a 22-year-old West Londoner, Younis Tsouli, suspected of participating in an alleged bomb plot. The terrorists who congregate in these cyber communities are rapidly becoming skilled in hacking, programming, executing online attacks and mastering digital and media design – and Irhabi was a master of all those arts. Even if terrorists have not yet demonstrated the capacity to carry out a large-scale web-based terrorist attack, that does not mean they have not achieved the necessary level of expertise to do so. This situation is alarming when one considers that we have many thousands of airports, chemical plants, federal reservoirs and, of course, power plants, most of whose integral systems are operated and controlled by sophisticated computer systems or other automated controllers. The broad diversity of potential sources of attack, our reliance on information systems that are inherently insecure, and the international dimension of both cyber attacks and governmental responses raise a host of complicated policy questions and cultural challenges for governmental security institutions. These include how best to improve the state of cyber security, and what can be done to improve international interagency cooperation in stemming cyber crime and in preventing and responding to
6 Correspondents in Washington, Pentagon Email hacked, in: Australian IT, 22.06.2007, http://www.australianit.news.com.au/story/0,24897,21948818-15306,00.html 7 Blau John, The Battle against Cyberterror, in: Network World, Vol. 21, Issue 48; pg. 49. 8 David Leppard, Al Qaeda plot to bring down UK Internet, The Sunday Times, 11.03.2007.
cyber terrorism and cyber warfare. With all these challenges in mind, this article will focus on the most important and most sensitive one: international cooperation.

2. The M.U.D. Approach

The different types of terrorist activity and the different levels of web-based radicalization on the Internet require appropriately differentiated responses. One such response is based on what Prof. Gabriel Weimann termed, at the NATO ARW "Hypermedia Seduction for Terrorist Recruiting" held in September 2006 in Eilat, Israel, the "M.U.D." approach: Monitoring, Using and Disrupting. First, terrorist websites need to be monitored to learn about their mindsets, motives, persuasive "buzzwords", audiences, operational plans and potential targets for attack. This form of knowledge discovery refers to the non-trivial extraction of implicit, previously unknown and potentially useful knowledge from data. Monitoring forums, blogs and other frequently updated sites is increasingly a focus of attention. New methods to monitor the so-called "hidden web" need to be developed: the hidden web is that part of the Internet which search engines cannot access, and some estimate that it actually accounts for 95% of all Internet content. Second, counterterrorism organizations need to "use" the terrorist websites to identify and locate their propagandists, chat room discussion moderators, Internet service provider (ISP) hosts, operatives and participating members. The retrieved data needs to be archived to enhance the learning process and to identify social networks. A social network consists of a web of connections between people, between people and events, and between people and organizations.
There are mathematical techniques which, among other things, can identify clusters of people within a network, display a network in the best and clearest way, identify key persons within a network, and measure the robustness of a network. Integrated early warning systems are an additional requirement. Third, terrorist websites need to be "disrupted" through negative and positive means. In a negative "influence" campaign, sites can be infected with viruses and worms to destroy them, or kept "alive" while being flooded with false technical information about weapons systems, circulating rumours to create doubt about the reputation and credibility of terrorist leaders, or inserting conflicting messages into discussion forums to confuse operatives and their supporters. In a more positive approach, alternative narratives can be inserted into these websites to demonstrate the negative results of terrorism or, aiming at potential suicide bombers, to suggest the benefits of the "value of life" over the self-destructiveness of the "culture of death and martyrdom". It has to be noted that "disruption" of relevant websites conflicts with "monitoring" and "using". For instance, country X may wish to monitor a specific chat room while country Y would prefer to disrupt the same website by negative means. Such conflicts could cause disagreements; to avoid them, to save resources, and to carry out an effective and efficient web-focused counterterrorism campaign, an international interagency decision-making and harmonizing committee should lead the approach. However, an effective "M.U.D." approach depends on several conditions. It must be interdisciplinary, involving experts in communications and rhetoric, psychologists
who understand the impact of influence campaigns on their targeted audiences’ cognitive and behavioral responses, graphic designers and Islam experts who understand the type of graphic interface and layout that would appeal to such potential audiences, and civil liberty attorneys to ensure that such influence campaigns do not infringe constitutional rights of free speech and expression. This is a dynamic arena of continuous feedback loops in which our actions must ceaselessly anticipate and respond to the reactions of the targeted terrorist websites. For instance, when a website is brought down, it usually re-emerges elsewhere with a different configuration. Moreover, we need to prioritize the audiences to be targeted by such influence campaigns. For example, devoted activists may be considered a lost cause, while potential recruits who have not yet been activated into terrorism represent new opportunities for influence operations. Such influence campaigns must be led by moderate political and religious leaders from Islamic communities who formulate alternative messages and narratives to the radical Islamist ideologies. Here, further differentiation is required because, for example, mainstream Islam in the Middle East differs from its counterparts in Southeast Asia or Europe. Above all, such a response requires new counterterrorism “armies” possessing new strategies, capabilities, tactics and cyber weapons to counteract the Jihadi websites. Intense intergovernmental, interagency and international communication and harmonizing processes, embedded in an institutional framework with clearly defined rules of the game, are required to make such a campaign effective and efficient.
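Section 2 mentions mathematical techniques that can identify clusters of people within a network, identify key persons, and measure a network's robustness. A minimal, self-contained sketch of these ideas is given below; the toy network, the node names and the use of cut-vertex detection as a robustness measure are illustrative assumptions, not taken from the source.

```python
from collections import deque

# Hypothetical web of connections between people (toy data):
# two tight clusters {A, B, C} and {D, E, F} joined by the bridge C-D.
edges = [("A", "B"), ("A", "C"), ("B", "C"),
         ("D", "E"), ("D", "F"), ("E", "F"),
         ("C", "D")]

# Build an undirected adjacency list
graph = {}
for u, v in edges:
    graph.setdefault(u, set()).add(v)
    graph.setdefault(v, set()).add(u)

def components(g):
    """Return the connected components of g as frozensets of nodes (BFS)."""
    seen, comps = set(), []
    for start in g:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:
            node = queue.popleft()
            if node in comp:
                continue
            comp.add(node)
            queue.extend(g[node] - comp)
        seen |= comp
        comps.append(frozenset(comp))
    return comps

def without(g, node):
    """Copy of g with one person (and all of their ties) removed."""
    return {n: nbrs - {node} for n, nbrs in g.items() if n != node}

# Key persons: cut vertices, i.e. people whose removal fragments the network
key_persons = [n for n in graph if len(components(without(graph, n))) > 1]

# Robustness check: removing the broker "C" splits the network into clusters
clusters = components(without(graph, "C"))
```

Running this identifies "C" and "D" as the key persons and shows that the network falls apart into its two underlying clusters once a bridge person is removed; real analyses would use richer measures (for example centrality scores) over much larger graphs.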
3. Responses of the European Union

The EU has implemented the first steps towards institutionalizing such an approach. It has recognized the threat posed by terrorist use of the Internet. In its strategy and action plan for combating radicalization and recruitment to terrorism (doc. 14781/1/05 and doc. 14782/05), the EU calls for measures to combat terrorist use of the Internet: “We need to spot such behaviour by, for example, community policing, and effective monitoring of the Internet and travel to conflict zones. (…) And we will examine ways to impede terrorist recruitment using the Internet.”9 The EU also emphasizes that the activities of the member states have to be accompanied by action at the EU level. In its conclusions of 15/16 June 2006 (doc. 10633/06 CONCL 2), the European Council expressly asks the Council and the Commission to develop measures to prevent the misuse of the Internet for terrorist purposes while at the same time observing fundamental rights and principles: “The European Council calls for the implementation of the action plans agreed under the EU Counter Terrorism Strategy, including the strategy against radicalization and recruitment, to be accelerated. Work must also be sped up on the protection of critical infrastructure. The European Council awaits the Commission’s first programme in this connection as well as concrete proposals on detection technologies. The Council and the Commission are also invited to develop measures to combat the misuse of the Internet for terrorist purposes while respecting fundamental rights and principles.”10 The EU member states and Europol are already actively monitoring and evaluating terrorist websites.
The Council supports the initiative “Check the Web”, which aims at strengthening cooperation and sharing the task of monitoring and evaluating open Internet sources on a voluntary basis: “(…) there is also scope to strengthen cooperation on an EU basis, specifically with regards to monitoring and evaluating Islamist terrorist websites. Many Internet pages in various languages have to be monitored and evaluated, which requires enormous technical and human resources. Due to the huge quantity of Internet pages in use, problems arise on a national and international level concerning the quantity and quality of resources, especially with a view to the language skills needed. It is hardly possible for one individual member state to cover all suspicious terrorism related activities on the Internet. Monitoring and evaluating the Internet should therefore be intensified by sharing this task on a voluntary basis among the member states, taking advantage of the special language and professional competence of the relevant authorities of the individual member states. In addition to sharing information via Europol, member states may also choose to divide labour amongst themselves on a voluntary basis to achieve the most efficient use of resources. However, irrespective of potential distribution of priorities the responsibility of deciding whether to monitor, interrupt or shut down specific websites remains with the member states. In all of this work the activities of the various actors (member states, the Commission, Europol, SitCen, et al.) have to be coordinated in a targeted way.”11 To reach this goal, Europol is building an information portal as a technical platform for information exchange among member states. “It will contain the following

9 Council of the European Union, doc. 14781/1/05, subject: The European Union Strategy for Combating Radicalisation and Recruitment to Terrorism, 24.11.2005, p. 3.
10 Council of the European Union, subject: Presidency Conclusions, doc. 10633/06, p. 5.
11 Council of the European Union, subject: Council Conclusions on cooperation to combat terrorist use of the Internet, doc. 8457/2/07, p. 3.
modules, for which the member states provide their data and to which all member states have access:
• Contact persons for strengthening the expert network;
• Lists of links to monitored websites for mutual information;
• Additional information (special language competence in the individual member states, technical expertise, possibilities of legal action against terrorist websites) that enables the sharing of resources;
• List of announcements by terrorist organizations, to aid in combining resources;
• Evaluation results to avoid duplication of work.”12

12 Ibid., pp. 4-5.

Europol will expeditiously extend the information portal on contact persons, link lists and lists of statements by terrorist organizations. The establishment of this information portal should facilitate a significantly increased quality of cooperation between the member states in monitoring and evaluating Islamist terrorist websites. It is planned to provide a platform where member states can make their information accessible to each other, thus compiling the knowledge available within the EU. Member states will have direct and fast access to information on the work performed by other member states and their results. In urgent cases direct contact can be established and cooperation can be coordinated through the list of national contact persons. In addition, initial steps were taken to strengthen cooperation, on a voluntary basis, under the principle of the division of labour amongst interested member states. The success of this information platform depends on the willingness of the EU member states to provide useful data, and it may be a disadvantage that, so far, only EU countries participate in this project.

4. Towards a Collaborative Terrorism Data Fusion Centre

The conventional wisdom is that the terrorist threat forces governments to expand their legal and law enforcement powers, and that many of these powers are implemented on an ad hoc basis and/or without any analysis of their effectiveness. The terrorist organization and the individual are in a position of power because they force governments to act. But new laws are useless when they are institutionalized without an expansion of well-educated human capabilities. New powers in terms of intelligence, surveillance, data collection, etc., only make sense if data interpretation and analysis capabilities are expanded at the same time. The most valuable resource, in addition to HUMINT, is the well-educated analyst. A good analyst not only has excellent knowledge of the topic and the target communities; he or she also has strong language abilities, knows how to think like the enemy in order to evaluate the data, has a tremendous knowledge of quantitative and qualitative analysis techniques and methods, and knows how and in which context the data has been collected. Technical solutions have their value, but their abilities are also limited. Bearing in mind that Islamist terrorism is a common threat to many countries, it would make sense, on the basis of a rational cost-benefit analysis, to pool resources. It can be assumed, for instance, that the same radical Islamic website, forum or chat room will be observed by several intelligence agencies at the same time. That means a waste of
the high-value human analysis resources. To make such a system work, specific factors have to be taken into consideration. I assume that no international security threat has facilitated governmental cooperation on the levels of politics, intelligence and law enforcement to the extent terrorism has. I propose the notion of “synergy”, that 2+2=5, implying that governmental institutions could attain a competitive advantage by joining forces. Bi- and multilateral agreements and strategic alliances were the first wave of networking in the name of internationalization and expansion of effective and efficient counterterrorism. The United Nations, the European Union and the OECD have proven to be successful platforms for harmonizing counterterrorism policies. Interpol, Europol, and shadowy organizations like the Club of Berne or the Security Alliance in Paris are examples which show that, in the face of the threat of terrorism, the institutionalization of functional cooperation and information-sharing on the level of law enforcement and intelligence is possible. This section discusses how such a virtual and physical network(ed) organization could be organized in theory, and how geographically dispersed knowledge analysts could collaborate virtually on a project in the absence of classic central planning. Even if, to many people, these thoughts seem close to dreams, sooner or later governmental institutions cannot avoid implementing such a system if they are to have a serious chance of combating terrorism in the long term. Coordination, management and the role of knowledge arise as the central areas of focus.
The planned study proceeds to the formulation of a framework that can be applied to a web-based counterterrorism method in the sense of virtual decentralized work, and concludes that value creation is maximized when there is intense interaction and uninhibited sharing of information between the organizations and the surrounding community. Therefore, the potential success or failure of this organizational paradigm depends on the degree of dedication and involvement of the surrounding community. Recent technological achievements have enabled governmental organizations to become more centralized, or decentralized, according to their strategic and cultural orientation, and they have further enhanced the efficiency of managing organizational goals. However, centralization is still the prevailing mode of management. To date, the existing organizational and management theory that examines the “virtual network(ed) organization” is not clear. It does not provide more than a basic explanation of how one could boost technological capacity so that emerging governmental opportunities are seized by flexible organizations which together face a global threat, namely terrorism. Similarly, no in-depth analysis has been carried out regarding the management of such a governmental “virtual organization” and the key factors that play a decisive role in the viability and potential success or failure of this fluid organizational structure. One of the reasons behind the lack of extensive research and literature on “virtual organizations” is simply that they are an emerging phenomenon and organizational structure.

5. Framework/System Analyses

Before our M.U.D. approach can be embedded in an international institution, a framework/system analysis should be conducted which takes all relevant paradigms
and factors into consideration. The framework analysis draws upon key features of major organizational paradigms (participation, levels of participation, continuous improvement, organizational learning, rules of the game, technological equipment, etc.) and how these are managed. The key factor may be the creation of access for consuming and providing open source material and analyses, enabled by a common computing and communications infrastructure. The heart of this system might be a technological platform with analytical tools and databases of open source data. The challenging factors are trust and symmetry. Potential parties to a shared infrastructure can rationally trust it more if they can see how it works all the way down, and will prefer an infrastructure in which all parties have symmetrical rights to one in which a single party is in a privileged position to extract fees or exert control. For this reason the institution should be virtually and physically led by a committee consisting of representatives of the participating institutions. To avoid a situation where a participating country merely consumes the data provided by other participants, a credit-point system should be established. This system should guarantee that parties are allowed to consume as much data as they have provided.

6. From Hierarchies to Joint Governmental Networks

There is an old saying in military planning: “Get the command and control relationships right, and everything else will take care of itself.” It is a common-sense acknowledgement that people provide solutions only if they are well led in a functional organization. The concept of the hierarchy of governmental security institutions is built on three assumptions: the environment is stable, the processes are bureaucratic, and the output is definable and more or less predictable.
Obviously, these assumptions no longer apply to cyber terrorism. Governmental organizations are controlled by hierarchies, but the counterterrorism departments should be linked according to a paradigm that relies on open and adaptive systems that promote learning, co-operation and flexibility, and that takes the form of networks of governmental analysts, artificial intelligence labs and research institutions instead of individuals. The system should be based on open source analyses and should focus on tactical and strategic issues using participation and empowerment, team accountability, matrix arrangements (flexible positions and responsibilities based on the abilities of the participating institutions) and information networking; initiatives for improvement should emanate from all directions on a regular basis. While military and governmental institutions do not like committees, a committee structure might be most effective for command in a web-based counter-terrorism campaign. There should be an executive committee for every major technical subdivision. Each committee must include all key personnel involved in counterterrorism: police, intelligence officers, economic developers (including NGOs), public services ministers, and the military. The committees must be in charge and have full authority. Committee members must not be controlled or evaluated by their parent agencies at the next level up; otherwise, the committee will fail to achieve unity of effort.
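Section 5 proposes a credit-point system in which each party may consume only as much data as it has provided, and Table 1 lists this as the networked organization's basis of compensation. The sketch below is a hypothetical minimal design of such a ledger; the class, its methods and the example parties are invented for illustration and are not part of the source.

```python
# Hypothetical sketch of the credit-point system: providing data earns
# credits, and consumption is refused once a party's credits run out.

class CreditLedger:
    """Tracks data provided and consumed per participating institution."""

    def __init__(self):
        self.balance = {}  # institution name -> available credits

    def provide(self, party, units):
        # Providing data to the platform earns an equal number of credits
        self.balance[party] = self.balance.get(party, 0) + units

    def consume(self, party, units):
        # A party may consume only as much data as it has provided
        if self.balance.get(party, 0) < units:
            return False  # request refused: insufficient credits
        self.balance[party] -= units
        return True

ledger = CreditLedger()
ledger.provide("Country X", 10)           # X shares 10 units of material
granted = ledger.consume("Country X", 4)  # allowed: X has earned credits
refused = ledger.consume("Country Y", 1)  # refused: Y has provided nothing
```

A real system would also have to weight credits by the quality of the contributed material, not just its amount, as the text suggests.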
Table 1. Hierarchies and joint governmental networks.

                        Hierarchy                 Networked organization
Structure               Hierarchical              Networking and pooling
Scope                   Internal, closed          External, open
Resource focus          Classified                Open source
State                   Stable                    Dynamic
Direction               Commands, bureaucracy     Committee management
Basis of action         Control                   Empowerment to collect and to provide
Basis of compensation   -                         Credit-point system: based on the amount and quality of the data provided, governmental institutions are allowed to use the material provided by the others.

7. System Theory

The underlying assumption that such a virtual and physical networked organization might work has its roots in System Theory. System Theory is the trans-disciplinary study of the abstract organization of phenomena, independent of their substance, type, or spatial or temporal scale of existence. It investigates both the principles common to all complex entities and the (usually mathematical) models which can be used to describe them. A system can be said to consist of four things. The first is objects: the parts, elements, or variables within the system. These may be physical or abstract, or both, depending on the nature of the system. Second, a system consists of attributes: the qualities or properties of the system and its objects. Third, a system has internal relationships between its objects. Fourth, systems exist in an environment. A system, then, is a set of things that affect one another within an environment and form a larger pattern that is different from any of the parts. The fundamental systems-interactive paradigm of organizational analysis features the continual stages of input, throughput (processing), and output, which manifest the concepts of openness/closedness. A closed system does not interact with its environment. It does not take in information and is therefore likely to atrophy, that is, to vanish.
An open system receives information which it uses to interact dynamically with its participating elements. Openness increases its likelihood of survival and prosperity. System characteristics include: wholeness13 and interdependence, correlations, perception of causes, chains of influence, self-regulation and control, goal-orientation, interchange with the environment, inputs/outputs, the need for balance/homeostasis, change and adaptability (morphogenesis), and equifinality (there are various ways to achieve goals). Communication from this perspective can be seen as an integrated process.

13 The whole is more than the sum of the parts.
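The systems-interactive paradigm of input, throughput and output, and the contrast between open and closed systems, can be illustrated with a small toy model; the classes and data below are invented for illustration and are not part of the source.

```python
# Toy model: an open system keeps absorbing input from its environment,
# while a closed system ignores it and its internal state atrophies.

class OpenSystem:
    """A system that interacts dynamically with its environment."""

    def __init__(self):
        self.state = set()  # internal state built up from the environment

    def take_input(self, observations):
        # Input stage: interchange with the environment
        self.state |= set(observations)

    def output(self):
        # Throughput/output stage: turn internal state into a product
        return sorted(self.state)

class ClosedSystem(OpenSystem):
    """A system that does not take in information from its environment."""

    def take_input(self, observations):
        pass  # no interchange: nothing new is ever absorbed

open_sys, closed_sys = OpenSystem(), ClosedSystem()
for batch in (["site-A", "forum-B"], ["chat-C"]):
    open_sys.take_input(batch)    # open system grows with each exchange
    closed_sys.take_input(batch)  # closed system stays empty, i.e. atrophies
```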
8. Creating an Intelligent System

Such open systems are called openflows. An openflow is a cluster of initiatives, people and computers that create platforms, projects and concepts for the development of Open Source Intelligence (OSINT). The technologies of the Internet allow us to develop new ways of collaboration, ways that are more open, more collaborative, less hierarchical and, also, more efficient. The Open Source Software movement has shown this. However, real change does not come easily, and in each context it raises different challenges. The openflow aims to help address these challenges, both in terms of its technological aspects and in terms of its organizational and conceptual dimensions. Beyond their particular features, these communities display overall organization patterns similar to those seen in other organization types, including both natural and artificial systems. Self-organizing processes are known in different kinds of communities: software developer communities, colonies, and open-source communities. The most well-known open-source community might be Wikipedia. The strength of Wikipedia is not the technology, but the massively collaborative effort of thousands of decentralized brains that the technology enables. Take, for example, the Wikipedia entry for Moqtada al-Sadr. Mr. Sadr’s entry in this free encyclopaedia that anyone can edit has been modified approximately 500 times by about 50 people in the past three years. These motivated authors have expanded the entry and corrected hundreds of one another’s errors or omissions. Readers can “vote” the most accurate and relevant information to the top, giving it enough credibility to be taken seriously. These communities practice an ongoing collective learning process and collective intelligence.
Based on the assumption that 80% of radical Islamist terrorism information is open source and available, a huge amount of this data could be collected on the Internet (websites, open communication platforms), and relevant analysis can be generated using this data alone. Collaborative analysis in a networked multinational interagency system would save resources and would increase the output.14 The new tools of US Intelligence include a federated search engine called Oogle15 and Intellipedia, a controversial intelligence data-sharing tool based on Wiki social software technology. Intellipedia runs on JWICS, SIPRNet, and Intelink-U, and the server cannot be reached over the Internet. Intellipedia uses MediaWiki, the same software used by the Wikipedia free-content encyclopaedia project.16 It might be worth considering and discussing how a similar, improved system could be developed at an international level.

9. Qualitative Research of Open Sources

I estimate that 80% of the information regarding radical Islamist terrorism is provided via open sources and that a large amount of this data is being communicated on the world-wide web. Open source research covers a much wider field than just news monitoring.

14 Dizard, Wilson P. Spy agencies adapt social software, federated search tools, in: GCN, http://www.gcn.com/print/25_29/42090-1.html
15 Google also provided its hardware and software system, which includes proprietary algorithms that intelligence IT managers praise highly, to the Army, the Energy Department and other agencies in the intelligence world.
16 Wikipedia for Intel Officers Proves Useful, National Defense Magazine, http://www.nationaldefensemagazine.org/issues/2006/November/SecurityBeat.htm#Wik
Investigations often need to locate and retrieve thousands of potential documents, pictures, videos, etc., from the Internet. The relevant data on radical Islamists are not inherently quantitative and can be bits and pieces of almost anything. They do not necessarily have to be expressed in numbers. Frequency distributions and probability tables can be useful, but much of the data comes in the form of words, images, impressions, gestures, or tones which represent real events, or reality as it is seen symbolically or sociologically. To develop such a collaborative integrated system, the first step would be to identify a joint definition of open sources and OSINT. OSINT is collected from information that is openly available to the public. An open source can be any person, group, or system that provides information without the expectation that the information, the relationship, or both are protected against public disclosure. Publicly available information includes data, facts, instructions, or other material published or broadcast for general public consumption; available on request to a member of the general public; lawfully seen or heard by any casual observer; or made available at a meeting open to the general public.17 What is understood by “openly available” might vary between governmental institutions in different countries. In general, “OSINT operations support other intelligence, surveillance, and reconnaissance (ISR) efforts by providing information that enhances collection and production. As part of a multidiscipline intelligence effort, the use and integration of OSINT ensures decision-makers have the benefit of all available information.”18 Data collected from these different sources are often in diverse formats, ranging from structured database records to unstructured text, image, audio, and video files.
As open source data volumes continue to grow, extracting valuable, credible intelligence and knowledge becomes increasingly problematic. Social science and other academic disciplines provide a tremendous number of useful analytical methods. The first question which arises is how to define qualitative research. The simplest definition is to say it involves methods of data collection and analysis that are non-quantitative.19 Historical-comparative researchers would say it always involves the historical context, and sometimes a critique of the “front” being put on to get at the “deep structure” of social relations. Qualitative research is most often developed bottom-up, not top-down. Qualitative research uses unreconstructed logic to get at what is really real: the quality, meaning, context, or images of reality in what people actually do, not what they say they do. The challenge at this point is that institutions in charge of the analysis of open source materials use very different analysis methods.

17 Taylor, Michael C. Doctrine Corner: Open Source Intelligence Doctrine. Military Intelligence Professional Bulletin. Ft. Huachuca: Oct-Dec 2005. Vol. 31, Iss. 4; p. 3.
18 Internet sites enable users to participate in a publicly accessible communications network that connects computers, computer networks, and organizational computer facilities around the world. The Internet is more than just a research tool. It is a reconnaissance and surveillance tool that enables intelligence personnel to locate and observe open sources of information. Through the Internet, trained collectors can detect and monitor Internet sites that may provide I&W of enemy intentions, capabilities, and activities. Collectors can monitor newspaper, radio, and television websites that support assessments of information operations. Collectors can conduct periodic searches of web pages and databases for content on military order of battle, personalities, and equipment.
Collecting web page content and links can provide useful information about relationships between individuals and organizations. Properly focused, collecting and processing publicly available information from Internet sites can help analysts and decision makers understand the operational environment. Taylor, Michael C. Doctrine Corner: Open Source Intelligence Doctrine. Military Intelligence Professional Bulletin. Ft. Huachuca: Oct-Dec 2005. Vol. 31, Iss. 4; p. 3.
19 Lofland, John and Lyn H. Lofland. 1984. Analyzing Social Settings: A Guide to Qualitative Observation and Analysis. Belmont, CA.