Get The Cutter Edge free: www.cutter.com — Vol. 24, No. 5, CUTTER IT JOURNAL
private organizations and public agencies. It found that
24% of respondents had experienced computer crime,
with losses ranging from US $145 million to $730 mil-
lion, so the problem was already large.2
Since that time, government, industry, and law enforce-
ment have not been idle. In 1986, the US Congress
passed the Computer Fraud and Abuse Act. Beginning
in 2002, state breach notification laws came on the
scene, alongside Sarbanes-Oxley and HIPAA. In the
late 1990s under President Clinton, again in 2003 under
President Bush, and again in 2009 under President
Obama, the US government undertook strategic reviews
of the path toward securing cyberspace. Each of these
assessments made findings and outlined proposals that
were strikingly similar, and each resulted in efforts to
do things such as increase cooperation between govern-
mental agencies; promote partnerships between govern-
ment, industry, and academia; and elevate awareness
across the country. The US Computer Emergency
Readiness Team (US-CERT), National Security Agency
(NSA), and Department of Homeland Security (DHS)
Centers of Academic Excellence are examples of what
has been put in place. At the end of the first decade of
the 21st century, it seems that — despite governmental
rhetoric about the security of the national infrastructure
— the “adversary” is a cyber criminal who wants our
personally identifiable information (PII), our financial
information, and our medical information.
How have corporations responded to the cyber crime
threat? In the late 1990s they saw the “Melissa” and “I
Love You” viruses and established IT security groups
(yes, one person can be a “group”). The first order of
business was getting control of corporate endpoints by
using standard loads and antivirus software. Then they
tried to get things like incident response and forensics in
place. These were served up centrally and had an infor-
mation assurance focus. Next, they moved on to securing
applications and infrastructure. Some security personnel
used fear, uncertainty, and doubt (FUD) effectively to get
funding. Yet they struggled to make progress and many
times showed up just before launch or at a gate review
and rendered their “opinion” on the security of the appli-
cation or infrastructure. They took some abuse for this
kind of behavior and now attempt to “play nice.”
When Sarbanes-Oxley came around, security folks found
that the threat of noncompliance was a better tool than
the threat of a future breach from the hacker in the shad-
ows, and they used it. Some organizations have availed
themselves of opportunities to collaborate with federal
government working groups launched in the last five
years, such as the DHS’s Software Assurance Working
Groups.3
Some realized that a feedback loop from the
IT security group to the developers and architects is a
valuable thing, and they instituted software assurance
processes. Most did not. It seems that because of the way
compliance fear and pressures to deliver new content
have influenced organizations, the “adversary” is a
cyber criminal who wants our corporate intellectual
property, our financial information, our medical
information, and our customers’ PII.
For their part, law enforcement agencies have attempted
to get their arms around cyber crime and have had a
degree of success. At the local, county, and state levels,
there is some cooperation that has helped build regional
crime labs (mostly focused on forensics) across the US.
However, on average there are fewer than a handful per
state. Each jurisdiction seems to address only a piece
of the cyber crime pie. For example, state cyber crime
personnel might focus on fraud and identity theft, while
county cyber crime personnel might focus on online
predators, and local personnel focus on something else
altogether.
In addition, law enforcement personnel have faced
difficulties on several fronts. Resources (people and
equipment) are hard to fund given the tough economic
times. Even when funding is available, it is difficult to
find qualified officers. Training is scarce and expensive.
Education is even more expensive and takes several
years. When police forces are unionized, cyber security
staffing becomes tougher still. The officer with the high-
est seniority must get the cyber security job even when
more qualified officers are available. All that being said,
there are qualified law enforcement officers out there
trying to solve cyber crimes, and we need more, many
more. Here the “adversary” is a cyber criminal who is
defined by the particular law enforcement agency you
are talking with.
WHAT HAS ALL THIS EFFORT GOTTEN US?
We are now in a situation where government, corporations,
academia, and law enforcement have each developed an
understanding of cyber crime and the adversary (a cyber
criminal) based on their individual perspectives.
defeat at whatever cost. Second, cyber crime requires a
return on investment. Cyber criminals will not spend
endless amounts of time on malware or hacks if there is
no revenue stream. Therefore, the targets they choose
and the ways in which they choose to compromise them
will be determined by the monetary gain of the compro-
mise amortized over the time it takes to acquire the pot
of gold.
Cyber war, on the other hand, does not require a return
on investment. Cyber soldiers can spend as long as it
takes to develop malware or hacks for a target. The tar-
gets they choose will be determined by the damage that
will be inflicted on the country that owns the target. In
short, cyber soldiers will attack different targets in
different ways, taking as much time as needed. Cyber
criminals will attack targets that can be turned into money.
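This distinction can be made concrete with a small sketch. Assuming (our illustration, not the article's) that a cyber criminal ranks targets by expected monetary gain amortized over time-to-compromise, the selection logic looks like this; the target names and figures are made up:

```python
# Hypothetical sketch: a cyber criminal's target selection modeled as
# expected monetary gain amortized over time spent on the compromise.
# All names and dollar figures below are illustrative assumptions.

def criminal_roi(expected_gain_usd: float, effort_days: float) -> float:
    """Expected gain per day of attacker effort -- a crude ROI proxy."""
    return expected_gain_usd / effort_days

# A monetizable target vs. one with no revenue stream (which a cyber
# soldier, who ignores ROI, might still choose for its damage value).
targets = {
    "stolen card batch": criminal_roi(50_000, 10),
    "hardened industrial system": criminal_roi(0, 400),
}
best_for_criminal = max(targets, key=targets.get)
```

Under this framing, the criminal always picks the quick, monetizable target, while the soldier's choice is driven by a different objective entirely.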
SO WHAT?
Why does this differentiation between cyber crime
and cyber war matter? Isn’t a slight repurposing of
personnel or a slight redefinition of processes all that
is required to address cyber war? Haven’t we built
up competencies fighting cyber crime that can be
applied to fighting cyber war? These questions require
some discussion.
When organizations address the cyber security of
their assets, they should begin by identifying those
assets. This is not a trivial task, and it requires
agreement among all stakeholders. Then stakeholders must
come to an understanding of how those assets are
accessed, transformed, and transported inside and
between software-intensive systems. Cross-functional
teams of technical, business, and legal personnel
will then identify, analyze, and rank threats to those
assets. Finally, mitigations are proposed, analyzed,
and adopted. This systematic process is called “threat
modeling,” and while it has gained popularity over
the last several years, it is not yet widely performed. In
our experience, assets are defined as money, personal
information, and processes that manage money and
personal information. Threats are identified by asking
questions like “Who would want to know or alter this
information?” Mitigations are adjustments to processes
and systems that are identified by looking at legal and
regulatory guidance, industry best practices, and cost.
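The threat-modeling steps described above (identify assets, enumerate and rank threats, attach mitigations) can be sketched as a simple data structure. The schema, field names, and risk scale below are our illustrative assumptions, not a published standard:

```python
from dataclasses import dataclass, field

@dataclass
class Threat:
    description: str                  # e.g., "Who would want to alter this?"
    likelihood: int                   # 1 (rare) .. 5 (expected) -- assumed scale
    impact: int                       # 1 (minor) .. 5 (severe) -- assumed scale
    mitigations: list[str] = field(default_factory=list)

    @property
    def risk(self) -> int:
        """Simple likelihood-times-impact ranking score."""
        return self.likelihood * self.impact

@dataclass
class Asset:
    name: str
    threats: list[Threat]

# Assets as the article describes them in practice: personal
# information and the processes that manage it.
pii = Asset("customer PII", [
    Threat("insider copies records", 3, 4, ["access logging"]),
    Threat("injection attack exfiltrates data", 4, 5, ["input validation"]),
])

# Rank threats so mitigation effort can be prioritized.
ranked = sorted(pii.threats, key=lambda t: t.risk, reverse=True)
```

The ranking step is where the cross-functional team's judgment enters: the likelihood and impact values are negotiated, not computed.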
Assets, adversaries, and adjustments are currently
defined in the context of cyber crime. We would argue
that they must be defined in a larger context that
includes cyber war.
Assets are what we value. They speak to our lifestyle,
our culture, and our identity. They are not simply Social
Security numbers and dollar bills. Therefore, we need to
alter the definition of assets to include all of the things
we value.
Adversaries are all those who would seek to harm our
assets. They are not simply cyber criminals or cyber
thugs seeking financial gain. Sun Tzu sheds light on the
importance of understanding your adversary:
So it is said that if you know your enemies and know
yourself, you can win a hundred battles without a
single loss.
If you only know yourself, but not your opponent, you
may win or may lose.
If you know neither yourself nor your enemy, you will
always endanger yourself.
— The Art of War
Adjustments are things that we change in order to
make it impossible (or less likely) for a threat to be
realized. Once we truly understand our assets and our
adversaries, we can make decisions using equations
(metaphorically speaking) that weight variables
according to both cyber crime and cyber war.
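One way to read the authors' metaphorical "equations" is as a weighted score that blends exposure to both adversary types. The weights below are purely illustrative assumptions, not published guidance:

```python
# Sketch of a blended risk score: weight cyber-crime exposure and
# cyber-war exposure together. Weights are illustrative assumptions.

def weighted_risk(crime_risk: float, war_risk: float,
                  w_crime: float = 0.6, w_war: float = 0.4) -> float:
    """Weighted combination of cyber-crime and cyber-war exposure."""
    return w_crime * crime_risk + w_war * war_risk

# A target worthless to criminals can still rank high once the
# cyber-war dimension is weighted in.
score = weighted_risk(crime_risk=1.0, war_risk=9.0)
```

The point is not the arithmetic but the second term: a model calibrated only against cyber crime assigns such a target a score near zero.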
We have spent the better part of the last 35 years getting
to know an enemy we call a cyber criminal. We have
organized departments, enacted legislation, trained per-
sonnel, and launched initiatives with the intent of pro-
tecting things of value from the cyber criminal. We are
in an environment that increasingly favors the cyber
adversary (more devices on the Internet, more intercon-
nectedness between those devices, more business and
financial transactions online, more treasure in the cloud
and on personal devices, and more porous network
boundaries around everything). We can barely stave
off the cyber criminal; how will we deal with a cyber
soldier? Is all hope lost?
No, all hope is not lost. We have developed a lot of
skills over the last few decades that can be repurposed
for the task. To show how these capabilities can be
brought to bear, we will close with some concrete
recommendations.
Conduct Threat Modeling
If you haven’t already, begin threat modeling. If you
already conduct threat modeling, then adjust your per-
spective to think about assets, adversaries, and adjust-
ments in a new way. This will require a significant
amount of work, and getting buy-in at the executive
level will require some homework and some luck.
Jeffrey A. Ingalsbe is a Department Chair at the University of Detroit
Mercy in the Center for Cyber Security and Intelligence Studies,
where he teaches, among other things, ethical hacking and incident
response (master’s level). Mr. Ingalsbe directs a state-of-the-art cyber
security lab, where students gain real-world competencies through
exploration of cyber security problems. Until recently, he managed the
IT security consulting group at Ford Motor Company, where he was
involved in IS solutions for the enterprise, consumerization explo-
ration, threat modeling efforts, and strategic security research. He
holds a BSEE degree from Michigan Technological University and an
MSCIS degree from the University of Detroit Mercy. He is currently
working on a PhD in information systems engineering at the
University of Michigan–Dearborn. Mr. Ingalsbe can be reached at
ingalsja@udmercy.edu.
Dan Shoemaker is the Director of the Institute for Cyber Security
Studies, a National Security Agency (NSA) Center of Academic
Excellence, at the University of Detroit Mercy (UDM). Dr.
Shoemaker is a well-known speaker and writer in the area of cyber
security. He is a professor at UDM, where he has been the Chair of the
computer and information systems program since 1985. He is also a
visiting professor in cyber security at London Southbank University
in the UK. His Ph.D. is from the University of Michigan in Ann
Arbor. Dr. Shoemaker is Co-Chair of the Workforce Training and
Education working group for the secure software assurance initiative
within the US Department of Homeland Security’s National Cyber
Security Division (NCSD). He is one of the earliest academic partic-
ipants in the development of software engineering as a discipline,
starting at the SEI in fall 1987. He is the coauthor of McGraw-Hill’s
best-selling book on cyber security, Information Assurance for
the Enterprise. He is also a prolific writer and speaker on cyber
security topics across the nation. Dr. Shoemaker can be reached at
dshoemaker1@twmi.rr.com.
Nancy R. Mead is a senior member of the technical staff in the
Networked Systems Survivability (NSS) Program at the SEI. Dr.
Mead is also a faculty member in the master of software engineering
and master of information systems management programs at Carnegie
Mellon University. She is currently involved in the study of secure
systems engineering and the development of professional infrastruc-
ture for software engineers. Dr. Mead served as team lead for the ini-
tial Build Security In (BSI) website development and launch and later
served as technical lead on the project. She also served as Director of
Education for the SEI from 1991 to 1994. Mead has more than 100
publications and invited presentations and has a biographical citation
in Who’s Who in America. She is a Fellow of IEEE and the IEEE
Computer Society and a member of the ACM. Dr. Mead can be
reached at nrm@sei.cmu.edu.
Wesley J. Meier is a graduate business and computer and information
systems student at the University of Detroit Mercy (UDM). He serves
as a graduate assistant for the College of Business Administration
and Decision Sciences Department, is the President of the College of
Business Administration’s Graduate Student Advisory Board, and is
also a member of the President’s Council. Mr. Meier is a founding
student member of the Global Jesuit Business Student Association
Honor Society as well as Alpha Iota Delta, the International Honor
Society in the Decision Sciences and Information Systems. He
received his MBA from UDM in 2010. Mr. Meier can be reached
at wesley.meier@gmail.com.