Information Warfare (IW): a "New" Hazard, Old
Syndromes, and the Look of a Risk and Crisis Manager
by Cesar and Franco Oboni, Oboni, Riskope Associates Inc. www.riskope.com
Due to the paradigm shift in recent military and commercial conflicts, the term Information Warfare (IW) is, and will increasingly be, the focal point of the Risk and Crisis Management (RMCM) endeavors of any enterprise, from local companies up to country-sized and global organizations.
Information Warfare (IW) is more and more heavily discussed in the news during or in the
aftermath of conflicts (2008 Georgia invasion, for example http://tinyurl.com/08Georgiainv ) and
references to IW abound in various civilian and/or military publications (GAO, 2001a; GAO, 2001b;
GAO, 2001c). In the meantime, voices rise in various countries (Birnbaum, 2005; Vernez, 2009) claiming that "Information fighters become an increasingly common phenomenon" and that therefore new controls should be embraced (Lesser et al., 1999; Thuraisingham, 2003).
This "new" awareness is further confirmed by NATO & the Pentagon stating there is urgent need to
tackle cyber attacks from organized national armies or rogue hackers acting on their own initiative,
or on behalf of (terror) partisan groups ( http://tinyurl.com/07natocyberAtt ).
As a final confirmation, if at all needed, of the seriousness of these concerns, a new ISO standard was issued to fill the void in this realm (ISO/IEC 27001, 2005).
The aim of this paper is to show how a transparent Risk and Crisis Management (RMCM)
approach can help organizations of any size and scope to avoid wasting money in costly and
ineffective mitigations, by avoiding well-known behavioral syndromes which we will summarize in
the next sections.
INFORMATION WARFARE IS A HAZARD LIKE ANY OTHER
IW is a hazard like earthquakes, fires, landmines, etc., no more, no less, and should be treated accordingly (Oboni & Oboni, 2007). Even terrorism is a hazard like any other, and should be treated in a logical, risk-based way (Oboni & Oboni, 2004a; Oboni & Oboni, 2004b).
Instead, and very unfortunately, a lot of "mystique writing" is produced in the IW risk management field, probably because of its "new, thus mysterious" nature and the aura of secrecy surrounding it, due to its "covert operations" flavour.
For example, the widespread idea that the IW threat is unquantifiable is absurd at best, and is probably the root cause of the inertia displayed by many in endorsing this cause: why would one invest IW mitigation funds if the hazard is, in the words of the specialists, unquantifiable?
Incidentally, Peter Drucker, in a different field of management, beautifully summarized this type of
reaction when he claimed “If you can't measure it, you can't manage it”.
Hazards are generally declared unquantifiable by people who may know all the details of the hazards but do not understand Risk and Crisis Management. This can be exemplified by the fierce resistance encountered by the authors when, under a UNDP mandate to create a risk management tool for unexploded ordnance in Laos and other mined countries (GICHD, 2005; GICHD, 2007, http://tinyurl.com/07GICHD ), demining experts stated that "it was impossible" to encode "all the experience and the flair resulting from a life spent in the field". To the great surprise of those same experts, the RM model was proven by a field test to be far more accurate than their "experience-based" approach (Oboni & Oboni, 2009).
Another example comes from people stating that, as "statistics are missing", it is impossible to evaluate hazards and risks: well, if that were true, how could we be performing risk assessments (RA) on projects that are still on paper and, moreover, be quite successful at managing future issues? Techniques to cope with the lack of statistics do exist, but they have to be applied by specialists of Risk Management, not specialists of the hazards who, tragically, generally think only in terms of the past, not of the future, i.e. in terms of biased and censored statistics ("clean" statistics are very rare, in any field).
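One such technique can be sketched briefly. The example below is a minimal illustration with invented numbers (it is not the authors' specific method): a Bayesian update lets an expert's prior judgement stand in for missing statistics and be refined as sparse observations accrue.

```python
# A minimal sketch (with invented numbers) of one technique for estimating
# a hazard probability when historical statistics are sparse or absent:
# a Bayesian Beta-Binomial update.

def updated_probability(prior_alpha, prior_beta, hits, trials):
    """Posterior mean probability of a hit per observation period,
    combining an expert prior Beta(alpha, beta) with sparse data."""
    return (prior_alpha + hits) / (prior_alpha + prior_beta + trials)

# Expert judgement encoded as a prior: roughly 1 hit per 20 periods.
PRIOR_ALPHA, PRIOR_BETA = 1, 19

# Sparse record: 10 observed periods, no hit.  The estimate shifts down
# but does not collapse to zero, as a naive frequency count (0/10) would.
p_hit = updated_probability(PRIOR_ALPHA, PRIOR_BETA, hits=0, trials=10)
# p_hit = 1/30, i.e. about 0.033
```

The point of the sketch is that expert judgement and scarce field data are combined rather than opposed, which is precisely what the "missing statistics" objection overlooks.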
The conclusion is that a Risk Manager is required to help an organization understand what risks are generated by all the potential hazards surrounding it (natural, financial, man-made, information, environmental, etc.), including IW. Think about this:
● When you go for your yearly checkup you go to your General Practitioner, not to a Specialist. Precisely because the Specialist is very capable in his domain, he will most likely focus on what he knows best and may therefore miss some more obvious problems.
● When you need a vaccine you will neither talk to the biologist who did the research in the domain (he will most likely not know what the vaccine will do to your body) nor to a Pharmaceutical Manufacturer (who will have a biased point of view, as they just love to sell
their products), but rather to your Family Doctor, who knows you well, and should have a
holistic approach to your health.
Consider the Risk Manager as the Family Doctor, a highly qualified person who is neutral,
unbiased, has a holistic approach and cares about your health. After a thorough checkup, i.e. once
the various hazards are identified and characterized (magnitude, frequency, probability), the Risk
Manager will help evaluate the potential consequences of a hit on the system and will finally deliver a risk estimation in a clear and transparent way. Then the Risk Manager will help the organization define its tolerability threshold; techniques now exist to facilitate this endeavour and produce
meaningful curves (Oboni & Oboni, 2007).
The risks will then be compared to the organization's own tolerability threshold, and from there a
ranking based on the intolerable part of each scenario will be produced. Such a list will look very different from the "usual" ranked list of risks (a sort of top-ten of potential catastrophes), which can actually be shown to be dangerously misleading in terms of prioritization and decision making.
At this point the Risk Manager will have concluded his job by delivering a clear and sustainable roadmap for the risk and crisis mitigation of the organization, which in turn will contribute to enhancing the chances of success and survivability of said organization.
THE RISK MANAGEMENT RELATED SYNDROMES
IW being a hazard (like any other), it has to be treated transparently (which does not exclude secrecy/confidentiality), in order to avoid various well-known organizational risk-related money-wasting syndromes, which we will summarize in three points:
● The "specialist syndrome": this syndrome leads hazard specialists, i.e. in this case IW hazard specialists (military, IT, political scientists, etc.), to believe they understand how to evaluate its risks.
● The "denial syndrome": This syndrome is exemplified by the classic "it will not happen to
me: I am too large, too small, it can only happen to others etc."
● The "technology fix-it-all syndrome": This syndrome leads to the classic excesses driven by hardware vendors and other biased parties who want to erase aspects of the hazards but miss the true nature of the risks. History is full of impregnable castles that were seized in a day, starting with Troy; unsinkable vessels that sank miserably; invincible armies that starved or froze to death too far from a logistics base (the Russian campaign(s), etc.). Furthermore, examples abound of laws and decrees aimed at solving one situation, then backfiring on another; or of planes parked close together to avoid "local sabotage" at Pearl Harbor, only to offer easy prey to the Imperial Air Force.
A TRANSPARENT TREATMENT OF IW RISKS
In Information Security (IS), a risk "R" is typically written as the combination of an asset, the
threats to the asset and the vulnerability that can be exploited by the threats to impact that asset.
This is a definition of R which is compliant with the definition used for any other hazard, from landslides to fires, chemical accidents, etc. An example would be: Our desktop computers (Asset or Target) can be
compromised by malware (Threat or Hazard) entering the environment as an email attachment
(Vulnerability). Thus R is assessed as a function of three variables:
1. the probability that there is a threat
2. the probability that there are any vulnerabilities
3. the potential impact to the business (also called Cost of Consequences, "C").
The two probabilities are sometimes combined and are also known as likelihood (probability) "p" of
a hit of the Hazard following a given scenario.
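The composition above can be sketched in a few lines of code. The scenario figures below are hypothetical placeholders, and the threat and vulnerability probabilities are assumed independent for simplicity:

```python
# Sketch of the risk composition described above.  The scenario numbers
# are hypothetical placeholders, used only to illustrate the arithmetic.

def risk(p_threat, p_vulnerability, consequence):
    """R = p * C, where the likelihood p of a hit combines the probability
    that a threat exists with the probability that an exploitable
    vulnerability exists (assumed independent in this sketch)."""
    p = p_threat * p_vulnerability
    return p * consequence

# Desktop/malware example: threat very likely present (0.95), a 20%
# chance an infected attachment gets through, clean-up and downtime
# costed at 50,000 (any currency unit).
R = risk(p_threat=0.95, p_vulnerability=0.20, consequence=50_000)
# R = 9,500: the per-period risk carried by the scenario
```

The same three-variable skeleton carries over unchanged to the IW scenario discussed next; only the inputs differ.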
The same exact treatment can be developed for IW, or any other risk, as a matter of fact.
Scenario: a "group of interest" wants to destroy your business through a disinformation ("intoxication") campaign related to your production in China.
1. There is a probability that such a group exists, based, among other things, on your type of business and your presence on the market(s), etc.;
2. there is a probability they will select the China production as the most vulnerable part of your business;
3. the potential impact of such a campaign could cost you x% market share in Europe for n years, y% market share in the US for m years, etc.
You will note that the "enemy" has not been explicitly introduced in this discussion. In IW the
"enemy" will most likely be "invisible", but belong to a certain type (Parker et al., 2004). The
knowledge of the "enemy" type will help in determining the nature and probability of a hit and the
resulting consequences. In a study (Riskope International, 2005) the authors resolved to use the
"aim" of the IW campaign as a discriminant.
It is important to note that this apparently simple formulation of risk actually hides numerous difficulties, insofar as neither the probability evaluation nor the cost of consequences is easy to derive. Many attempts have been made in the literature to further detail the definition of risk, but none has achieved a better definition, often ending up with mistakes and biases (for example, resulting in double counting). As we will see below, it is well worth overcoming these difficulties and entering the world of rational risk management.
In the example above one can say: the risk R of the scenario "Our business hit by a campaign against our Chinese production (coming from a certain type of 'enemy')" is equal to the likelihood p combined (multiplied) with the consequence C.
Of course the risk has to be compared with the organization's tolerability, and this comparison will actually dictate whether that risk warrants a reduction/mitigation action. All the risk scenarios potentially afflicting the organization will be ranked on the basis of their "intolerable part", if any.
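One possible formalization of this comparison is sketched below. The scenario values are invented (the two codes are borrowed from Figure 1, with hypothetical figures), and the constant-risk tolerability curve is a deliberate simplification: real tolerability curves are organization-specific (Oboni & Oboni, 2007).

```python
# Sketch of ranking scenarios by the "intolerable part" of their risk.
# All figures are hypothetical; the constant-risk tolerability curve is a
# deliberate simplification of a real, client-specific curve.

def tolerable_likelihood(consequence, tolerable_risk=10_000):
    """Tolerability curve: the tolerable likelihood falls as 1/C so that
    p * C stays at or below a constant tolerable risk level."""
    return min(1.0, tolerable_risk / consequence)

def intolerable_part(p, consequence):
    """Portion of the risk p*C lying above the tolerability curve;
    zero when the scenario sits in the tolerable region."""
    return max(0.0, (p - tolerable_likelihood(consequence)) * consequence)

# Two scenarios (codes borrowed from Figure 1, values invented):
scenarios = [("WPI", 0.30, 100_000), ("SMMF", 0.05, 100_000)]

ranked = sorted(
    ((name, intolerable_part(p, c)) for name, p, c in scenarios),
    key=lambda item: item[1], reverse=True)
# WPI carries an intolerable part of about 20,000; SMMF carries none.
```

Ranking on the intolerable part, rather than on the raw risk p*C, is what separates this roadmap from the misleading "top-ten" list criticized earlier.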
Let's develop an example taken from our day-to-day practice. It concerns a rather large organization (confidential client) for which a preliminary risk assessment was developed. The list of scenarios includes industrial, logistic and IW hazards. Imagine fires, explosions, transportation mishaps, hackers penetrating the IT systems, etc., named WPI, SMMF, CfoTD, etc. in Figures 1a,b (14 scenarios in total).
Figure 1a displays probabilities p on the vertical axis and costs of consequences (losses) on the horizontal axis, as determined for the 14 scenarios during the risk assessment process.
Figure 1b displays the same information, but the tolerability curve has been added to the plot.
You will notice that for some of the scenarios ranges of probabilities are given. In the real case, each scenario and the tolerability curve were considered with their uncertainties, both in terms of probability and cost of consequences.
Fig. 1a: Probability-loss plot for 14 risks. Fig. 1b: Same as 1a, but with the tolerability curve.
From Figure 1b one can immediately notice that not all the risks lie in the intolerable area: actually, only half of them do. Without the comparison with the tolerability curve, the "top-ten" list would already have been misleading and would have generated potential waste in money allocation.
For each risk scenario above tolerability it is possible to calculate the "part" of that risk which is intolerable. At that point a clear roadmap becomes available, as the prioritization is based on an objective comparison with the client's own risk tolerability. Transparent decisions can be made, leading to a rational and sustainable RM. The benefits are obvious:
● Every time a risk scenario is identified, it can be integrated in the prioritization to allow
reasonable mitigation if necessary;
● Generally this type of analysis allows a significant reduction of the number of risks that actually command immediate attention and mitigation. In the case of this example, only ¼ of the scenarios actually proved to be critical, as they had "intolerable parts" of the risk one order of magnitude larger than the others (see Figure 2).
Figure 2: Bar graph of the intolerable part of the scenarios, showing 3 critical ones.
● In a well managed organization the IW specialist should define the hazard, its magnitude,
and to a certain extent, with facilitation, the likelihood of a strike.
● Then a Risk Manager will determine the risk posed by IW, after helping to evaluate the probabilities and potential consequences of the hazard hitting the system. Finally, a risk estimation will be delivered in a clear and transparent way and compared to the organization's tolerability threshold. At that point it will be possible to know which mitigations have to be implemented, if any, and a roadmap will be defined.
● In a well managed organization a preliminary IW risk assessment (see the two points above) will make it possible to evaluate which risks are relevant and should be tackled. This will result from a comparison of the evaluated risks with the organization's tolerability curve, an exercise that has to be developed quantitatively and transparently in order to avoid biases of various kinds.
● In a well managed organization prestige, arrogance and self-praise will be kept at bay by unbiased, transparent evaluations.
● A good RMCM approach will ensure the balance of the mitigative measures.
● No good mitigation can be implemented unless a serious RM approach weighs the residual risks and secondary effects.
REFERENCES
Birnbaum, J. H., The Washington Post, September 19, 2005
GAO, Critical Infrastructure Protection: Significant Challenges in Protecting Federal Systems and Developing Analysis and Warning Capabilities, Statement of Joel C. Willemssen, Managing Director, Information Technology Issues, p. 12, September 12, 2001a
GAO, Homeland Security: A Risk Management Approach Can Guide Preparedness Efforts, GAO-02-208T, Washington, D.C., October 31, 2001b
GAO, United States General Accounting Office, Critical Infrastructure Protection: Significant Challenges in Protecting Federal Systems and Developing Analysis and Warning Capabilities, Statement of Joel C. Willemssen, Managing Director, Information Technology Issues, Testimony Before the Committee on Governmental Affairs, U.S. Senate, 2001c
GICHD, Geneva International Centre of Humanitarian Demining, A Study of Manual Mine