Information Warfare (IW): a "New" Hazard, Old Syndromes, and the Look of a Risk and Crisis Manager
by Cesar and Franco Oboni, Oboni Riskope Associates Inc., www.riskope.com

Due to the paradigm shift brought about by recent military and commercial conflicts, the term Information Warfare (IW) is, and will increasingly become, the focal point of the Risk and Crisis Management (RMCM) endeavors of any enterprise, from local companies up to country-sized and global organizations. This paper looks at IW from the Risk and Crisis Management point of view and develops a "how to" guide to help ensure that your investment delivers results.
INTRODUCTION
Information Warfare (IW) is increasingly discussed in the news during or in the
aftermath of conflicts (the 2008 Georgia invasion, for example: http://tinyurl.com/08Georgiainv ), and
references to IW abound in various civilian and military publications (GAO, 2001a; GAO, 2001b;
GAO, 2001c). In the meantime, voices rise from various countries (Birnbaum, 2005; Vernez, 2009)
claiming that "Information fighters become an increasingly common phenomenon" and that
new controls should therefore be embraced (Lesser et al., 1999; Thuraisingham, 2003).
This "new" awareness is further confirmed by NATO and the Pentagon, which state that there is an urgent need to
tackle cyber attacks from organized national armies, or from rogue hackers acting on their own initiative
or on behalf of (terror) partisan groups ( http://tinyurl.com/07natocyberAtt ;
http://tinyurl.com/pentagonecyberwarisnow ).
As a final confirmation, if any were needed, of the seriousness of these concerns, a new ISO standard was issued to
fill the void in this realm (ISO/IEC 27001, 2005).
The aim of this paper is to show how a transparent Risk and Crisis Management (RMCM)
approach can help organizations of any size and scope avoid wasting money on costly and
ineffective mitigations, by steering clear of well-known behavioral syndromes, which we summarize in
the next sections.
INFORMATION WARFARE IS A HAZARD LIKE ANY OTHER
IW is a hazard like earthquakes, fires, landmines, etc., no more, no less, and should be treated
accordingly (Oboni & Oboni, 2007). Even terrorism is a hazard like any other, and should be treated
in a logical, risk-based way (Oboni & Oboni, 2004a; Oboni & Oboni, 2004b; Oboni & Oboni,
2004c).
Unfortunately, a lot of "mystique writing" is instead produced in the IW risk management
field, probably because of its "new, thus mysterious" nature and the aura of secrecy surrounding it,
due to its "covert operations" flavour.
For example, the widespread idea that the IW threat is unquantifiable is absurd at best,
and is probably the root cause of the inertia displayed by many in endorsing this cause: why would
one invest IW mitigation funds if the hazard is, in the words of the specialists, unquantifiable?
Incidentally, Peter Drucker, in a different field of management, beautifully summarized this type of
reaction when he claimed, “If you can't measure it, you can't manage it”.
Hazards are generally declared unquantifiable by people who may know all the details of the
hazards but do not understand Risk and Crisis Management. This can be exemplified by the fierce
resistance encountered by the authors when, under a UNDP mandate to create a risk management tool
for unexploded ordnance in Laos and other mined countries (GICHD, 2005; GICHD, 2007,
http://tinyurl.com/07GICHD ), demining experts stated that "it was impossible" to encode "all the
experience and the flair resulting from a life spent in the field". To the great surprise of the same
● The "technology fix-it-all syndrome": This syndrome leads to the classic excesses driven by
hardware vendors and other biased parties who want to erase aspects of the hazards but miss the
true nature of the risks. History is full of unseizable castles that were seized in a day, starting
with Troy; unsinkable vessels that sank miserably; invincible armies that starved or froze
to death too far from a logistic base (the Russian campaign(s), etc.). Furthermore, examples
abound of laws and decrees aimed at solving one situation that then backfired on another: parking
planes close together at Pearl Harbor to avoid "local sabotage", only to offer easy prey to the
Imperial Air Force, etc.
A TRANSPARENT TREATMENT OF IW RISKS
In Information Security (IS), a risk "R" is typically written as the combination of an asset, the
threats to the asset, and the vulnerability that can be exploited by the threats to impact that asset.
This definition of R is consistent with the definition used for any other hazard, from landslides to
fires, chemical accidents, etc. An example would be: our desktop computers (Asset or Target) can be
compromised by malware (Threat or Hazard) entering the environment as an email attachment
(Vulnerability). Thus R is assessed as a function of three variables:
1. the probability that there is a threat;
2. the probability that there are any vulnerabilities;
3. the potential impact to the business (also called Cost of Consequences, "C").
The two probabilities are sometimes combined and are then known as the likelihood (probability) "p" of
a hit of the Hazard following a given scenario.
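As a minimal sketch, the three-variable formulation above can be reduced to a single expected-cost figure by combining the two probabilities into the likelihood "p" and multiplying by "C". All names and figures below are hypothetical illustrations, and treating the two probabilities as independent (so that they can simply be multiplied) is a simplifying assumption:

```python
# Minimal sketch of the three-variable risk formulation described above.
# Assumption: the threat and vulnerability probabilities are independent,
# so they combine into a single likelihood "p" by multiplication.

def risk(p_threat: float, p_vulnerability: float, cost_of_consequences: float) -> float:
    """Return R = p * C, with p the combined likelihood of a hit."""
    p = p_threat * p_vulnerability      # combined likelihood of the scenario
    return p * cost_of_consequences     # expected cost ("risk") of the scenario

# Hypothetical example: malware (Threat) arriving as an email attachment
# (Vulnerability) compromising desktop computers (Asset).
r = risk(p_threat=0.3, p_vulnerability=0.5, cost_of_consequences=100_000)
print(r)  # 15000.0
```

The point of the sketch is only that R is a product of likelihood and consequence; the real difficulty, as discussed below, lies in deriving the inputs.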
The exact same treatment can be developed for IW or, as a matter of fact, any other risk.
Scenario: a "group of interest" wants to destroy your business through a disinformation ("intoxication") campaign related
to your production in China.
Risk Evaluation:
1. there is a probability that such a group exists, based, among other factors, on your type of business,
your presence on the market(s), etc.;
2. there is a probability that they will select the China production as the most vulnerable part of
your business;
3. the potential impact of such a campaign could cost you x% market share in Europe for n
years, y% market share in the US for m years, etc.
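The three-step evaluation above could be sketched as follows. Every probability, cost and duration below is an invented placeholder (the paper deliberately leaves x, y, n and m unspecified), and summing the impacts across markets is an assumption:

```python
# Hypothetical sketch of the three-step scenario evaluation above.
# All probabilities, costs and durations are invented placeholders.

def campaign_risk(p_group_exists, p_china_selected, impacts):
    """impacts: list of (annual_cost_of_lost_share, years) tuples, one per market."""
    p = p_group_exists * p_china_selected            # steps 1 and 2: likelihood
    c = sum(annual_cost * years for annual_cost, years in impacts)  # step 3: "C"
    return p * c                                     # expected cost of the scenario

impacts = [
    (2_000_000, 3),   # x% market share in Europe for n = 3 years (placeholder)
    (1_500_000, 2),   # y% market share in the US for m = 2 years (placeholder)
]
r = campaign_risk(p_group_exists=0.2, p_china_selected=0.4, impacts=impacts)
print(round(r))  # 720000
```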
You will note that the "enemy" has not been explicitly introduced in this discussion. In IW the
"enemy" will most likely be "invisible" but will belong to a certain type (Parker et al., 2004).
Knowledge of the "enemy" type will help in determining the nature and probability of a hit and the
resulting consequences. In a study (Riskope International, 2005), the authors resolved to use the
"aim" of the IW campaign as a discriminant.
It is important to note that this apparently simple formulation of risk actually hides numerous
difficulties, insofar as neither the probability evaluation nor the cost of consequences is easy to
derive. Many attempts have been made in the literature to further detail the definition of risk, but
none has achieved a better definition, often ending up with mistakes and biases (for example,
resulting in double counting). As we will see below, it is well worth overcoming these difficulties
and entering the world of rational risk management.
In the example above one can say: the risk R of the scenario "Our business hit by a campaign
scenarios actually proved to be critical, as they had "intolerable parts" of the risk
one order of magnitude larger than the others (see Figure 2).
Figure 2: Bar graph of the intolerable part of the scenarios, showing the three critical ones.
CONCLUSIONS
● In a well managed organization, the IW specialist should define the hazard, its magnitude
and, to a certain extent, with facilitation, the likelihood of a strike.
● A Risk Manager will then determine the risk posed by IW, after helping to evaluate the
probabilities and potential consequences of the hazard hitting the system. Finally, a risk
estimate will be delivered in a clear and transparent way and compared to the organization's
tolerability threshold. At that point it will be possible to know which mitigations, if any, have to be
implemented, and a road map will be defined.
● In a well managed organization, a preliminary IW risk assessment (see the two points above)
will make it possible to evaluate which risks are relevant and should be tackled. This will result from a
comparison of the evaluated risks with the organization's tolerability curve, an exercise that
has to be carried out quantitatively and transparently in order to avoid biases of various
kinds.
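As a rough illustration of this screening step, the comparison can be sketched with a constant tolerability threshold standing in for a full tolerability curve (a real curve varies with the magnitude of the consequences); all scenario names and figures below are hypothetical:

```python
# Sketch of the screening step: compare each scenario's evaluated risk with a
# tolerability limit. Simplification: a constant threshold replaces the full
# tolerability curve. All scenario names and figures are hypothetical.

TOLERABILITY_THRESHOLD = 500_000   # maximum tolerable expected cost (placeholder)

scenarios = {
    "disinformation campaign (China production)": 720_000,
    "malware via email attachment": 15_000,
    "denial of service on web storefront": 650_000,
}

# Only scenarios whose risk exceeds the threshold call for mitigation.
critical = {name: r for name, r in scenarios.items() if r > TOLERABILITY_THRESHOLD}

for name, r in sorted(critical.items(), key=lambda kv: -kv[1]):
    print(f"{name}: intolerable part = {r - TOLERABILITY_THRESHOLD}")
```

The "intolerable part" printed here is the excess of each risk over the threshold, which is what a bar graph such as Figure 2 would display.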
● In a well managed organization, prestige, arrogance and self-praise will be kept at bay by
unbiased, transparent evaluations.
● A good RMCM approach will ensure the balance of the mitigative measures.
● No good mitigation can be implemented unless a serious RM approach weighs the residual
risks and secondary effects.
REFERENCES
Birnbaum J. H., The Washington Post, September 19, 2005
GAO, Critical Infrastructure Protection: Significant Challenges in Protecting Federal Systems and
Developing Analysis and Warning Capabilities, Statement of Joel C. Willemssen, Managing Director,
Information Technology Issues, p. 12, September 12, 2001a
GAO, Homeland Security: A Risk Management Approach Can Guide Preparedness Efforts,
GAO-02-208T, Washington, D.C., October 31, 2001b
GAO, United States General Accounting Office, Critical Infrastructure Protection: Significant
Challenges in Protecting Federal Systems and Developing Analysis and Warning Capabilities,
Statement of Joel C. Willemssen, Managing Director, Information Technology Issues, Testimony
Before the Committee on Governmental Affairs, U.S. Senate, 2001c
GICHD, Geneva International Centre for Humanitarian Demining, A Study of Manual Mine