January 2017
Volume 15 Issue 1
Machine Learning: A Primer for Security
Enterprise Security Architecture: Key for Aligning
Security Goals with Business Goals
The Role of the Adjunct in Educating the Security Practitioner
Fragmentation in Mobile Devices
Gaining Confidence in the Cloud
Crypto Wars II
The Best Articles
of 2016
Table of Contents
DEVELOPING AND CONNECTING CYBERSECURITY LEADERS GLOBALLY
Articles
22 Enterprise Security Architecture: Key for
Aligning Security Goals with Business Goals
By Seetharaman Jeganathan
In this article, the author shares his insights about why
security architecture is critical for organizations and
how it can be developed using a practical framework-
based approach.
30 The Role of the Adjunct in Educating the
Security Practitioner
By Karen Quagliata – ISSA member, St. Louis Chapter
The cybersecurity industry faces a shortage of qualified
professionals. Part of the solution is to better deliver
cybersecurity education in colleges and universities.
The purpose of this article is to equip cybersecurity
professionals working as adjunct instructors with
resources to deliver a more efficient and effective class.
Also in this Issue
3 From the President
4 editor@issa.org
5 Sabett’s Brief
(Not) The Best of Cybersecurity, 2016 Version
6 Herding Cats
Sweat the Small Stuff
7 Open Forum
Executive Juris Doctor: Rewarding and
Influential Career Path
8 Security in the News
9 Security Awareness
Security in the News in 2016
10 Crypto Corner
A Feeble Attempt at Humor
12 Association News
Feature
14 Machine Learning: A Primer for Security
By Stephan Jou – ISSA member, Toronto Chapter
The author examines how machine learning can be leveraged to address the practical challenges of delivering lower-cost
security by resolving more threats faster, with fewer resources. It will focus on machine learning security techniques that
work at typical levels of data volumes, from those operating with “small data” to those implementing data lakes.
©2017 Information Systems Security Association, Inc. (ISSA)
The ISSA Journal (1949-0550) is published monthly by
Information Systems Security Association
11130 Sunrise Valley Drive, Suite 350, Reston, Virginia 20191
703.234.4095 (Direct) • +1 703.437.4377 (National/International)
35 Fragmentation in Mobile Devices
By Ken Smith
The purpose of this article is to explore the threat to
consumers posed by mobile device fragmentation. The
author categorizes mobile device fragmentation by
operating systems, manufacturer, and carrier, exploring
the vulnerabilities at each level.
39 Gaining Confidence in the Cloud
By Phillip Griffin – ISSA Fellow, Raleigh Chapter and
Jeff Stapleton – ISSA member, Fort Worth Chapter
Can cloud-based technologies, such as the blockchain,
play a role in providing cloud subscribers assurance
their data is being properly managed and that their
cloud service provider is in compliance with established
security policies and practices?
44 Crypto Wars II
By Luther Martin – ISSA member, Silicon Valley Chapter
and Amy Vosters
The debate over whether or not to give US law
enforcement officials the ability to decrypt encrypted
messaging has recently been revisited after a twenty-
year break. The results may be surprising.
Article of the Year
2 – ISSA Journal | January 2017
From the President
International Board Officers
President
Andrea C. Hoy, CISM, CISSP, MBA,
Distinguished Fellow
Vice President
Justin White
Secretary/Director of Operations
Anne M. Rogers
CISSP, Fellow
Treasurer/Chief Financial Officer
Pamela Fusco
Distinguished Fellow
Board of Directors
Debbie Christofferson, CISM, CISSP,
CIPP/IT, Distinguished Fellow
Mary Ann Davidson
Distinguished Fellow
Rhonda Farrell, Fellow
Geoff Harris, CISSP, ITPC, BSc, DipEE,
CEng, CLAS, Fellow
DJ McArthur, CISSP, HiTrust CCSFP,
EnCE, GCIH, CEH, CPT
Shawn Murray, C|CISO, CISSP, CRISC,
FITSP-A, C|EI, Senior Member
Alex Wood, Senior Member
Keyaan Williams, Fellow
Stefano Zanero, PhD, Fellow
The Information Systems Security Asso-
ciation, Inc. (ISSA)® is a not-for-profit,
international organization of information
security professionals and practitioners. It
provides educational forums, publications
and peer interaction opportunities that en-
hance the knowledge, skill and professional
growth of its members.
With active participation from individuals
and chapters all over the world, the ISSA
is the largest international, not-for-profit
association specifically for security pro-
fessionals. Members include practitioners
at all levels of the security field in a broad
range of industries, such as communica-
tions, education, healthcare, manufactur-
ing, financial, and government.
The ISSA International Board consists of
some of the most influential people in the
security industry. With an internation-
al communications network developed
throughout the industry, the ISSA is fo-
cused on maintaining its position as the
preeminent trusted global information se-
curity community.
The primary goal of the ISSA is to promote
management practices that will ensure the
confidentiality, integrity and availability of
information resources. The ISSA facilitates
interaction and education to create a more
successful environment for global informa-
tion systems security and for the profes-
sionals involved.
From a cybersecurity professional's perspective, we can probably relate to the distinction between having a "good" year and a "happy" one. Many of us remember notable
events in 2016 that probably did not
make anyone “happy.” Those in our
Healthcare SIG might recall cancer-care
service provider 21st Century Oncology's announcement that 2.2 million
patients may have had their personal
information affected by a breach in Oc-
tober 2015: hackers had access to patient
names, Social Security numbers, doc-
tors, diagnosis and treatment informa-
tion, along with insurance information.
Even the loss of one password-protected
laptop led to 200,000 patients’ sensitive
information being exposed in the Pre-
miere Healthcare case. Maybe it was
the Yahoo breach announcements of
500 million accounts being stolen by a
state-sponsored actor, then later in De-
cember one billion accounts!
Meanwhile it was a “good year” from the
perspective of heightened awareness of
cybersecurity and privacy issues by the
average person on the street. As well,
leading companies—and more impor-
tantly their boards—have been address-
ing and providing better protection of
sensitive personal and company infor-
mation.
In 2016, with consumers embracing
the Internet of Things, hackers brought
us Mirai, causing possibly the largest
DDoS attack known to date, delivering
665 gigabits per second and 143 million
packets per second of unwanted traffic
via hijacked IoT devices to the Krebs on
Security blog.
The increase in regulations, as well as
privacy concerns, meant an increase in
regulatory compliance, leading many
companies to increase their information security budgets. In the first
six months of 2016,
even the US federal government had
hired 3,000+ new cybersecurity/IT
professionals as part of its first Federal
Cybersecurity Workforce Strategy. And
the president’s 2017 budget contains a
proposed $3.1 billion to overhaul diffi-
cult-to-secure systems.
So looking forward, ISSA aims to con-
tinue providing timely and thought-pro-
voking information and educational
resources. And more importantly, we
want to provide the peer/industry net-
working necessary to give you a global
helping hand.
Our global Special Interest Groups
(SIGs) are ready to ring in the new year
with exciting webinars and meetings.
We had two very successful joint events
in December, one with the IEEE Women in
Engineering Internet of Things World
Forum, the other with SANS Connect.
ISSA members can look forward to
more of these events throughout 2017.
For CISOs, our excellent CISO Execu-
tive Forum is set up by a committee of
your peers and overseen by CISO Exec-
utive Forum chair and International di-
rector Debbie Christofferson. This year’s
forums will be at RSA; in partnership with the
IAPP conference in Washington, DC;
at Black Hat in Las Vegas; and the ISSA
International Conference in San Diego.
And be sure to join us January 24 for
this year’s first ISSA web conference
where we discuss more of what to expect
in 2017!
To our ISSA members across the globe:
have a Happy and Good New Year!
Moving forward,
Happy New Year! Bonne année! Szczęśliwego Nowego Roku! ¡Feliz año nuevo! Manigong Bagong Taon! Felice Anno Nuovo or Buon anno! Mutlu Yıllar! Ein glückliches neues Jahr! Hauʻoli Makahiki Hou! And Shanah tovah u'metuka (שנה טובה ומתוקה) or hopes for a good and sweet year!
Andrea Hoy, International President
The information and articles in this mag-
azine have not been subjected to any
formal testing by Information Systems
Security Association, Inc. The implemen-
tation, use and/or selection of software,
hardware, or procedures presented
within this publication and the results
obtained from such selection or imple-
mentation, is the responsibility of the
reader.
Articles and information will be present-
ed as technically correct as possible, to
the best knowledge of the author and
editors. If the reader intends to make
use of any of the information presented
in this publication, please verify and test
any and all procedures selected. Techni-
cal inaccuracies may arise from printing
errors, new developments in the indus-
try, and/or changes/enhancements to
hardware or software components.
The opinions expressed by the authors
who contribute to the ISSA Journal are
their own and do not necessarily reflect
the official policy of ISSA. Articles may
be submitted by members of ISSA. The
articles should be within the scope of in-
formation systems security, and should
be a subject of interest to the members
and based on the author’s experience.
Please call or write for more information.
Upon publication, all letters, stories, and
articles become the property of ISSA
and may be distributed to, and used by,
all of its members.
ISSA is a not-for-profit, independent cor-
poration and is not owned in whole or in
part by any manufacturer of software or
hardware. All corporate information se-
curity professionals are welcome to join
ISSA. For information on joining ISSA
and for membership rates, see www.
issa.org.
All product names and visual represen-
tations published in this magazine are
the trademarks/registered trademarks
of their respective manufacturers.
editor@issa.org
The Best Articles of 2016
Thom Barrie – Editor, the ISSA Journal Editor: Thom Barrie
editor@issa.org
Advertising: vendor@issa.org
866 349 5818 +1 206 388 4584
Editorial Advisory Board
Phillip Griffin, Fellow
Michael Grimaila, Fellow
John Jordan, Senior Member
Mollie Krehnke, Fellow
Joe Malec, Fellow
Donn Parker, Distinguished Fellow
Kris Tanaka
Joel Weise – Chairman,
Distinguished Fellow
Branden Williams,
Distinguished Fellow
Services Directory
Website
webmaster@issa.org
866 349 5818 +1 206 388 4584
Chapter Relations
chapter@issa.org
866 349 5818 +1 206 388 4584
Member Relations
member@issa.org
866 349 5818 +1 206 388 4584
Executive Director
execdir@issa.org
866 349 5818 +1 206 388 4584
Advertising and Sponsorships
vendor@issa.org
866 349 5818 +1 206 388 4584
We'd like to acknowledge the passing of 2016, not by reminiscing about the breaches, malware, privacy invasions, and legislation—Andrea, Geordie, and Randy help us out with that—but by celebrating the articles the Editorial Advisory Board deemed the best of the year.
The 2016 Article of the Year
“Machine Learning: A Primer for Se-
curity” by Stephan Jou [Toronto Chap-
ter]. Stephan lays out the workings of
machine learning and artificial intel-
ligence, painting a clear picture of this
growing technology that some argue is
still not ready for prime time. But the
promise of combining big data and ma-
chine learning—whether for analyzing
unimaginably huge amounts of data for
business processes or picking up on the
bad actors knocking, poking, and prod-
ding our infrastructures—has me excit-
ed to see how 2017 plays out in this field.
The Best of 2016
“Enterprise Security Architecture: Key
for Aligning Security Goals with Busi-
ness Goals,” by Seetharaman Jegana-
than—Seetharaman deserves an hon-
orable mention as his article was a very
close runner-up.
“The Role of the Adjunct in Educating
the Security Practitioner,” by Karen
Quagliata [St. Louis Chapter].
“Fragmentation in Mobile Devices,” by
Ken Smith.
“Gaining Confidence in the Cloud,” by
Phillip Griffin [Raleigh Chapter] and
Jeff Stapleton [Fort Worth Chapter].
“Crypto Wars II,” by Luther Martin [Sil-
icon Valley Chapter] and Amy Vosters.
Congratulations to our best authors of
the year! A number are already plan-
ning to submit further works in the up-
coming year.
Readers’ Choice for 2016
So, these are the board’s choices. Do
you concur? Please take a look through
the year and let us know your top three
or four selections. We’d love to have a
Readers’ Choice. Some of my favorites
not mentioned are “Impact of Social
Media on Cybersecurity Employment
and How to Use It to Improve Your Ca-
reer,” Tim Howard [South Texas Chap-
ter]; “Stop Delivery of Phishing Emails,”
Gary Landau [Los Angeles Chapter];
“Beware the Blockchain,” Karen Mar-
tin; “The Race against Cyber Crime Is
Lost without Artificial Intelligence,”
Keith Moore [Capitol of Texas Chapter];
and “Why Information Security Teams
Fail,” Jason Lang.
Let me know at editor@issa.org.
It’s been a great year in the ISSA Journal.
Here’s looking forward to an even bet-
ter year. Do you have an article to share?
Bring it on.
—Thom
Sabett’s Brief
By Randy V. Sabett – ISSA Senior Member, Northern Virginia Chapter
(Not) The Best of Cybersecurity,
2016 Version
So how many cybersecurity "Best of 2016" lists have you seen over the past few weeks? Well, this won't be
one of those lists, because as I’ve done
in prior years, I’m going to cover events
that I think were notable but that weren’t
necessarily “best of.” And, as in past
years, my wife thinks that this is a silly
approach, but here goes anyway…
First off, the Internet has survived an-
other year. Despite all of the predictions
of gloom and doom that have been pos-
ited over the past decade or more, we’re
still plugging away with the same basic
infrastructure we’ve had for several de-
cades. To some extent, this survival is a
testament to its original design—adapt-
able to changing conditions and attacks.
Turning to a legislative event from very
early in the year, the passage of the Con-
solidated Appropriations Act of 2016
included the Cybersecurity Information
Sharing Act (CISA). CISA created a vol-
untary process for sharing cybersecu-
rity information without legal barriers
or threats of litigation. DHS and DOJ
released additional guidance on infor-
mation sharing under CISA in February
and June. Based on personal experience
in 2016, I find CISA has influenced a
number of decisions to share informa-
tion, including B2B, B2G, and G2B.
Continuing for a moment on the gov-
ernment side of things, in February the
Administration released the Cyberse-
curity National Action Plan (“CNAP”).
The CNAP provides a combination
of near-term tactical actions and lon-
ger-term strategy components intended
to “enhance cybersecurity awareness
and protections, protect privacy, main-
tain public safety as well as economic
and national security, and empower
Americans to take better control of their
digital security.”1
Good stuff, but proper
implementation will be critical.
On the commercial side, businesses
continued to be subjected to a variety
of ever-evolving threats, including the
incredible rise in both frequency and
insidiousness of ransomware. 2016
saw ransomware evolve from phish-
ing-based attacks on individual ma-
chines into an attack mechanism that
threatened entire networks. In particu-
lar, SamSam (which exploits unpatched
servers, moves laterally to any machine
it finds, and then encrypts the entire
network) proved to be particularly over-
whelming. Only robust patching and
diligent backups offer resiliency.
In 2016, we saw cybersecurity become
an integral part of the due diligence
process for most M&A transactions
(and personal experience bore this out).
In fact, according to a recent survey, 85
percent of public company directors and
officers say that an M&A transaction in
which they were involved would likely
or very likely be affected by “major se-
curity vulnerabilities.” In addition, 22
percent say that they wouldn’t acquire
a company that had a high-profile data
breach, while 52 percent said they would
still go through with the transaction but
only at a significantly reduced value.2
This interest in cybersecurity diligence
is not just theoretical: in the midst of
an October M&A transaction involv-
ing Verizon and Yahoo!, news broke of
a Yahoo! breach that had occurred ap-
proximately two years earlier. This event
raised speculation around what it might
do to the deal. To me, the bigger question
will be how the overall scope of the due
1 https://www.whitehouse.gov/the-press-office/2016/02/09/fact-sheet-cybersecurity-national-action-plan.
2 https://www.nyse.com/publicdocs/Cybersecurity_and_the_M_and_A_Due_Diligence_Process.pdf.
diligence process
will be influenced
by cybersecurity in
future deals.
To round out the year, I will end on a
hopefully positive note. In December,
the findings of the Commission on En-
hancing National Cybersecurity were
released.3
The Commission had been
tasked with developing recommenda-
tions for ways to strengthen cybersecu-
rity across both the federal government
and the private sector. In a statement,
President Obama stated that “[t]he
Commission’s recommendations...make
clear that there is much more to do and
the next administration, Congress, the
private sector, and the general public
need to build on this progress.”
Amen to that—all stakeholders must
meaningfully participate and address
cybersecurity so that everyone benefits.
Let’s hope that 2017 sees that partici-
pation increase. With that, I hope that
your holiday season has been enjoyable
and that your new year is off to a great
start. Now I’m headed off to the refrig-
erator to come up with a top 10 list of
leftovers for my wife. Looking forward
to hearing from you in 2017!
About the Author
Randy V. Sabett, J.D., CISSP, is Vice Chair
of the Privacy & Data Protection practice
group at Cooley LLP, and a member of
the Boards of Directors of ISSA NOVA,
MissionLink, and the Georgetown Cy-
bersecurity Law Institute. He was named
the ISSA Professional of the Year for
2013, and chosen as a Best Cybersecurity
Lawyer by Washingtonian Magazine for
2015-2016. He can be reached at rsabett@
cooley.com.
3 https://www.nist.gov/cybercommission.
If you are going to be at RSA Conference this year, or perhaps you
picked up a print
copy and are reading this in the shad-
ow of one of the expo halls, take a mo-
ment to think about all the vendors on
the floor who are selling amazing kit.
If you have not walked the floor yet, be
sure to allocate a few hours to do so. I
like to start at the edges because that’s
often where some of the best new stuff is.
But remember, buyer beware. Snake oil
salesmen work everywhere!
As you speak to these vendors and un-
derstand how their products work, you
might get caught up in the excitement of
new kit and new capabilities, so much
that you lose rational thought for a mo-
ment. I mean, how else do you end up
with three timeshares at the end of a lav-
ish Las Vegas weekend? Before you sign
on the dotted line, think about the prob-
lem that the kit is trying to solve and see
if you have already solved it elsewhere
(or should solve it elsewhere).
Sometimes we forget our roots, but
that’s understandable as our industry
has grown from nothing to what you see
around you in the expo halls over the
last twenty years. Those of us who have
been around that long certainly remem-
ber security as something one of the IT
guys did, that and building tools to help
us manage our growing infrastructure
on a small scale—oftentimes in the
same manner that the big vendors do to-
day. Before you run to your finance guy
for budget, let’s look at a couple basic
things we all need to master first.
How’s your logging?
PCI DSS may have been the first step in
forcing companies to capture good and
usable logging information, but DevOps
is the new darling on the block. Compa-
nies I work with tend to check the box
for PCI to close that nagging require-
ment but have expanded their informa-
tion generation capabilities dramatical-
ly to gain extremely important insight
into their infrastructure as it runs.
Getting rich logging information to do
both user behavior analysis and to gain
valuable insights into your infrastruc-
ture in real time will power your intel-
ligence-gathering capabilities. Many of
the products you will see on the fringe
this year are going to make the case to
shift from SIEM (security information
event management) to UEBA (user and
entity behavior analytics). If you don’t
have solid—and I mean really solid—
logging capabilities baked into every
layer of your infrastructure, these tools
won’t work as advertised. In fact, any
tool you see that promises to look for
trends, to do machine learning to alert
you on anomalies, or to just make you
more efficient will struggle to work if
you are terrible at logging.
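To make "solid logging" concrete: the analytics tools described here consume structured, machine-parseable events rather than free-text lines. Below is a minimal Python sketch of structured JSON logging. The field names (`user`, `host`) are hypothetical, but the idea is that behavior-analytics tools need consistent context attached to every event.

```python
# A minimal sketch of "rich" logging: structured JSON events instead of
# free-text lines. Field names are hypothetical illustrations.
import json
import logging

class JsonFormatter(logging.Formatter):
    def format(self, record):
        event = {
            "ts": self.formatTime(record),
            "level": record.levelname,
            "message": record.getMessage(),
        }
        # Merge any structured context passed via the `extra` keyword.
        event.update(getattr(record, "context", {}))
        return json.dumps(event)

logger = logging.getLogger("app")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Every event carries who and where, not just what happened.
logger.info("login", extra={"context": {"user": "alice", "host": "web-01"}})
```

A UEBA product can aggregate events like these per user and per host; a bare string such as "alice logged in" gives it nothing reliable to pivot on.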
Show me machine learning!
I was at an expo a few months ago and
had a string of vendors tell me about
their machine learning capabilities.
They showed a graph with fifty bars on it, all of which were under a value of, say, ten except for one that was at a thousand. Then they pointed to it and said MACHINE
LEARNING! For the record, that is
anomaly detection. My godson, who is almost three, can do the exact same
thing and make you laugh when he does
it. Machine learning would be pointing
to one of the small bars and telling an
analyst to look at that one. Challenge
your vendors to go beyond the glitz and
buzzwords. Vaporware is just as present
today as it has ever been. Machine learn-
ing is a fantastic tool, but be sure you are
covering your anomaly detection basics
first.
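The bar-graph demo reduces to a few lines of plain statistics, which is exactly the point: flagging the one thousand-value bar is anomaly detection, not machine learning. A minimal sketch, with hypothetical values standing in for the fifty-bar chart:

```python
# Fifty hypothetical "bars": forty-nine small values and one obvious spike,
# mimicking the vendor demo described above.
counts = [7, 4, 9, 3, 6] * 9 + [5, 2, 8, 1000, 6]

# Plain statistics: flag anything more than three standard deviations
# from the mean. No training, no model, no machine learning.
mean = sum(counts) / len(counts)
std = (sum((x - mean) ** 2 for x in counts) / len(counts)) ** 0.5
outliers = [i for i, x in enumerate(counts) if abs(x - mean) > 3 * std]

print(outliers)  # the index of the thousand-value bar
```

Anything a three-sigma rule can catch in one pass is anomaly detection; machine learning, as the column argues, would be a model learning which of the unremarkable bars deserves an analyst's attention.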
How’s your debt?
Technical debt exists everywhere. It’s
that patch you decided to leave off the
list, or that coding workaround you
built to solve a latency issue, or a default
password you left in an application to
make support easier. Good companies
know exactly how far in debt they are
and work to pay this debt back. No com-
pany will always be debt free, but man-
aging this debt will help you understand
how to deploy your limited resourc-
es. Sometimes it’s a system that has a
known flaw in it, but it takes an attacker
twenty minutes to compromise. Sounds
like that virtual resource will only exist
for ten to fifteen minutes at a time until
you can address the root cause!
This year’s RSA Conference is geared
up to be the biggest ever. Tweet me at
@BrandenWilliams with a comment
about the article before February 18,
2017, and you could be the lucky winner
of a $25 Amazon gift card! Look for me
around the expo, in a session, or decom-
pressing in the airport lounge on Friday
as I hurry home for the weekend!
About the Author
Branden R. Williams, DBA, CISSP,
CISM, is a seasoned infosec and pay-
ments executive, ISSA Distinguished
Fellow, and regularly assists top global
firms with their information security and
technology initiatives. Read his blog, buy
his books, or reach him directly at http://
www.brandenwilliams.com/.
Sweat the Small Stuff
By Branden R. Williams – ISSA Distinguished Fellow, North Texas Chapter
Herding Cats
Open Forum
Executive Juris Doctor: Rewarding
and Influential Career Path
I wanted to write in support of Randy
V. Sabett’s column, “Who’s Ready
for a JD?,” in the October issue of
the ISSA Journal. I agree we need more
people with legal education in the secu-
rity profession, although I will take the
position that one does not need to be a
full-blown, bar-certified Juris Doctor
(JD). I was told while in law school that
70 percent of JDs don’t practice law. So,
if you don’t have the desire to be bar-cer-
tified and practice law, a JD may not be
the best option for you.
In late 2005, I looked at the future of the
security industry and saw that every-
thing we do in security would have an
ever-increasing legal implication. Be-
cause of that, I decided I needed a better
legal education. I did not have any inter-
est in practicing law, so I did not want
to go the JD route. I was looking for a
Master’s in legal studies, but at that time
none existed (there are several Master’s
of legal studies degrees today). I came
across an Executive Juris Doctor (EJD)
degree distance learning program.
As I describe it, it’s a law degree for peo-
ple who want the same legal education
that lawyers get but who have no inter-
est in practicing law. You take the same substantive courses JD stu-
dents take (e.g., torts, contracts, crimi-
nal law, and civil procedure to name a
few), but because it is not bar eligible,
you don’t take the full course load a JD
student would take, such as wills and trusts or corporations, and there is flexibility to
specialize. In my case, I specialized in
law and technology and took courses in
cyberlaw and intellectual property.
I have reaped huge rewards for having
this legal education as a security profes-
sional. I have published and presented
on legal topics in security since 2009. I
was twice published in the ISSA Journal,
one article on e-discovery, the other on social
media policy. Being able to take a law
and translate it into business processes
or technical controls is very hard to do
if you do not understand how to read
law, how courts will interpret the law,
or how to understand rulings coming
down from the courts. And laws per-
meate our entire profession—CFAA,
ECPA, HIPAA (which are actually reg-
ulations effectuated by legislation), etc.
But, there are other advantages for hav-
ing a legal education. Much like we in
the security industry have our own vo-
cabulary, so too do lawyers; being able
to speak to lawyers in a language they
understand is very important today. For
example, I explain to people that the
word “risk” means nothing to a lawyer,
but when you use the term “liability,”
you can get a lawyer’s attention. As a
security professional, when I speak to
lawyers using their lexicon, most law-
yers light up and become very interested
in what I have to say and become very
willing to help me.
Getting a lawyer’s attention and support
has another advantage—that of stake-
holder in security. Rather than trying
to futilely drive security initiatives with
finance, marketing, or technology de-
partments or even executive manage-
ment, I use the legal department as my
driving stakeholder. Their job is to pro-
tect the organization from liability and
lawsuits, and they usually have the ear
of the CEO and the board. So, if they are
aware of security issues that are creating
liability for the organization, they can
be your biggest advocate for advancing
change.
But, I would warn you not to jump into
a legal education lightly. There is a tre-
mendous amount of reading and a good
bit of writing that goes with a legal edu-
cation. Also, I pursued my legal educa-
tion going to school full time and work-
ing full time, so I slept about four hours
a night for the first nine months I was
in school. Be prepared for the amount of
time that will be required from you.
That being said, I can say having a legal
education has been very advantageous
in my career as a security professional,
and it is something I am glad I pursued.
About the Author
Dr. Jon J. Banks, EJD, GPEN, CEH,
OSWP, CISSP is a Sr. Security Architect
at Link Technologies with 19 years of ex-
perience building information security
architectures and programs. Since 2009,
Dr. Banks has used his legal education to
give back to our profession by publishing,
presenting, and teaching on various top-
ics in law and information security. He
can be reached at jonb@linktechconsulting.com.
By Jon J. Banks – ISSA member, Denver Chapter
The Open Forum is a vehicle for individuals to provide opinions or commentaries on infosec ideas, technologies, strategies,
legislation, standards, and other topics of interest to the ISSA community. The views expressed in this column are the author’s
and do not reflect the position of the ISSA, the ISSA Journal, or the Editorial Advisory Board.
Security in the News
News That You Can Use…
Compiled by Joel Weise – ISSA Distinguished Fellow, Vancouver, BC, Chapter and
Kris Tanaka – ISSA member, Portland Chapter
It’s Time to Pull Out Your Crystal Ball
What do you think is going to happen with security and technology in 2017? Will things be better, worse, or will
they remain status quo? Here is an assortment of forecast articles for your consideration. To me, these predic-
tions are less about the future and more about a replay of 2016. What do we have to look forward to according to
security experts? More of the same: The Internet of Things, more viruses and APTs, cloud everything, DoS attacks,
ransomware, etc. My personal favorite? Dronejacking. I previously mentioned this to friends at an unnamed online
retailer, but in spite of demonstrated attack scenarios they thought it was not possible. As always, it might be
fun to hold on to these links and revisit them in December to see how accurate they really were. Here’s to the
future and keeping cybersafe in 2017! Cheers!
http://www.forbes.com/sites/gilpress/2016/12/12/2017-predictions-for-ai-big-data-iot-cybersecurity-and-jobs-from-senior-tech-executives/#5ff851ee62e9
http://www.usatoday.com/story/money/columnist/2016/12/17/think-cyberthreats-bad-now-theyll-get-worse-2017-spear-phishing-etc/95262574/
http://www.infosecisland.com/blogview/24860-Top-10-Cloud-and-Security-Predictions-for-2017.html
https://blog.radware.com/security/2016/12/cyber-security-predictions-2017/
http://www.mcafee.com/us/resources/reports/rp-threats-predictions-2017.pdf
http://www.csoonline.com/article/3150997/security/what-2017-has-in-store-for-cybersecurity.html
https://www.scmagazine.com/gazing-ahead-security-predictions-part-2/article/578976/
Biggest Data Breaches and Hacks of 2016: Yahoo Data Breach, DNC Hacking, and More
http://www.techtimes.com/articles/190021/20161225/biggest-data-breaches-and-hacks-of-2016-yahoo-data-breach-dnc-hacking-and-more.htm
In addition to looking forward, the new year is also a time of reflection and taking stock of what transpired
over the past year. Here’s a quick look at some of the biggest data breaches and hacks that took place in 2016.
And just in case you haven’t seen it before, check out this frequently updated, interactive infographic from In-
formation is Beautiful. http://www.informationisbeautiful.net/visualizations/worlds-biggest-data-breaches-hacks/
Major Cyberattacks on Health Care Grew 63 Percent in 2016
http://www.darkreading.com/attacks-breaches/major-cyberattacks-on-healthcare-grew-63--in-2016/d/d-id/1327779
The Internet of Things continues to open up new attack vectors, particularly in the healthcare industry as secu-
rity experts reported a surge in medical device hijacking in 2016. The industry will continue to face challenges
in 2017, thanks to predictions of unprecedented levels of ransomware and the increasing ability of hackers to
launch multiple attacks at once.
Cybersecurity Confidence Gets a C-. How to Improve Your Grade in 2017
http://www.csoonline.com/article/3151078/security/cybersecurity-confidence-gets-a-c-how-to-improve-your-grade-in-2017.html
How do you feel about detecting and mitigating cyber threats in your organization? If your answer is “not very
confident,” you are in good company. According to a new survey, global confidence in cybersecurity is dropping,
while challenges, such as the expanding threat environment, are increasing. Although it is easy to get discour-
aged, especially when we continue to see article after article revealing new breaches and cyberattacks, there
are ways we can improve.
Five Ways Cybersecurity Is Nothing Like the Way Hollywood Portrays It
http://www.networkworld.com/article/3151064/security/five-ways-cybersecurity-is-nothing-like-the-way-hollywood-portrays-it.html
Cybersecurity is cool. Just take a look at how many television shows and movies have woven it into their scripts
and storylines. But just how accurate is their portrayal of the industry? Yes, Hollywood usually tends to glam-
orize things. We all know our day-to-day work lives rarely involve fist-fights and elaborate stunts found in action
movies. But the increasing popularity around cybersecurity, even in fictional form, is a good thing. Awareness
is one of the best weapons in the fight against the “bad guys.”
Increasing the Cybersecurity Workforce Won’t Solve Everything
http://www.csoonline.com/article/3153079/security/increasing-the-cybersecurity-workforce-wont-solve-everything.html
The word is out—we all need to focus on cybersecurity, improving our security posture and infrastructure. Even
the US government is receiving recommendations and guidelines on how to make this goal a reality. Unfortunate-
ly, many of the proposed plans will take time and additional resources. What can you do while you are waiting
for these “new” solutions to make an impact? Increase security awareness at all levels in your organization.
As you have heard many times before, all it takes is just one click. Make sure the humans on your network are
prepared to make the right choices.
8 – ISSA Journal | January 2017
It's been a huge year for information security in the public eye. It seemed like security was constantly in the news for massive corporate security breaches, election email leaks, or draconian new cyber laws.
We had Apple vs. the FBI. Tempers
flared. People got hysterical. And that
was just the FBI’s legal team. Not all
the commentary was credible. The well-
known encryption experts the National
Sheriffs’ Association stated that Apple
was “putting profit over safety” and this
had “nothing to do with privacy.” Aww
bless.
Yahoo announced yet another huge
breach. It’s sad to see the once mighty
Internet giant slowly transitioning from
respected Internet pioneer to a honey-
pot experiment with live customer data.
The official line was that Yahoo had been
the victim of “state-sponsored” attacks.
That sounds a lot better than being re-
peatedly caught out with obsolete security controls like MD5 hashing to protect customer passwords. To be fair, MD5 can be considered very strong. But only if your threat model
is focused on Russian cryptographers
attacking through a star gate from the
1990s.
James Clapper announced his resigna-
tion. The man who with a straight face
denied to the US Congress that data was
being collected on millions of Amer-
icans is leaving the building. His exit
interview would have been a hoot. Have
you held anything
back? Is there any clas-
sified information that
you’ve failed to return?
Um, “Not wittingly.”
Under Clapper’s direc-
tion, national security
objectives have pros-
pered. However, tech-
nologies we all depend
on have been weak-
ened, exposing us to
risk from cyber crim-
inals and repressive
regimes. The profits of
US companies have suffered as they’ve
struggled to convince global customers
that their data is safe with a US com-
pany. If you’re a US citizen, you might
think the national security trade off was
worth it. However, if you live anywhere
else in the world, or you're a US company that has lost customers, then you might have a different view.
In November the most intrusive pow-
ers ever proposed for the UK intelli-
gence services were made law in the UK.
Critics protested that the new law gave
too many government agencies access
to people’s browsing history without
the need for a warrant. In fact, the list
of agencies that can access browsing
data without a warrant is so large that
it might have been quicker just to list
those that can’t. On the plus side we can
all sleep safely knowing that the Welsh
Ambulance Services National Health
Service Trust knows what we’re doing
online. Privacy activists took the UK
government to the European Court of
Justice, which ruled
in December that
government agen-
cies needed inde-
pendent judicial
oversight and that
access had to be in response to serious
crime. If you swap “web history” with
“that special bedroom drawer,” then the
judgment is entirely consistent with re-
al-world privacy.
There were persisting concerns about
the security weaknesses of voting ma-
chines in the US elections. We should
be grateful that the winner of this part-
ly automated vote count wasn’t Select *.
The FBI learned that Hillary Clinton’s
campaign chief John Podesta’s email
had been compromised. Unfortunately
all their agents were busy ogling An-
thony Weiner’s laptop, so they just left a
message with Podesta’s IT helpdesk. It’s
a mystery why Weiner’s laptop deserved
thousands of hours of agent time and
the compromise of Podesta’s email by
a foreign power didn’t merit an agency
visit.
2016 was also the year that the burgeon-
ing Internet of trash really started to
stink. Brian Krebs's website was hit with
the largest distributed denial of service
attack ever: a great amorphous pudding
of hijacked IP-enabled household appli-
ances. People started waking up to the
risks. Some even asked, what’s the point
of a rice cooker having an IP address?
Here’s to 2017.
About the Author
Geordie Stewart, MSc, CISSP, is the Principal Security Consultant at Risk Intelligence and is a regular speaker and
writer on the topic of security awareness.
His blog is available at www.risk-intelli-
gence.co.uk/blog, and he may be reached
at geordie@risk-intelligence.co.uk.
Security Awareness
Security in the News in 2016
By Geordie Stewart – ISSA member, UK Chapter
Crypto Corner
A Feeble Attempt at Humor
By Luther Martin – ISSA member, Silicon Valley Chapter
Information security professionals in general, and cryptographers in particular, are not known for their senses of humor. This could be because the most common personality type in information security is MBTI type INTJ. People of type INTJ tend to be very competent but coldly rational. The characters Greg House from the TV show House and Sherlock Holmes from the TV show Sherlock are examples of how INTJs may come across to most people.
But this does not mean that we do not appreciate humor when we see it. Every ten years or so, I come across examples of humor that seem to appeal to some security professionals and to almost all cryptographers. Here are three examples.
The word "rogue" is often misspelled as "rouge." I first noticed this back in the dot-com era when a discussion started on a mailing list about how to handle "rouge CAs." After other list members exchanged a few messages, I could not help asking what these "rouge CAs" were. I asked if they were described in some document that I had not "red," but suggested that they were probably real, rather than something that someone would just "makeup."
Only one other list member seemed to understand my attempt at humor, while many others tried to provide serious answers to my obviously (at least to me) flippant questions. This might have been when I first suspected that humor might be quite rare in some parts of the security industry. It also might not have been as funny as I thought it was at the time.
Several years later, I heard a joke in a rather unusual context. Apparently some hiring manager thought that being able to understand and laugh at this particular joke was a good criterion to use for selecting employees. Really.
Here is the joke, reproduced as well as my memory allows. This one requires more thought than the first one. You should not feel bad if you do not understand it right away. But even if you do understand it, you might want to feel lucky that you did not end up working for this particular company.
Three cryptographers walk into a bar. The bartender says, "Are you all having beer tonight?"
"Hmm," says the first cryptographer, "I don't know."
"Hmm," says the second cryptographer, "I don't know."
"Yes," says the third cryptographer.
I'm not sure where explaining this joke ranks compared to other pointless interview questions, like asking how many ping-pong balls it would take to fill a school bus or asking why manhole covers are round, but it seems to me like it is roughly just as useful.
This joke actually made me laugh. It also made me wonder exactly how the discussion went among the people doing interviews that led to this particular element being added to their interview process. I assume that nobody starts with the goal of making a bad decision, but using this as part of an interview seemed as good an example of something resulting from a bad decision as anything I have ever seen.
The third and final example of humor is another one that I had the dubious honor of creating. It is even harder to understand than the previous joke—unless you spent time in college studying the theory of computation, of course.
Several years ago I had to give a talk in
Pittsburgh one morning, and then drive
to Cincinnati that afternoon for a meet-
ing the next day. The roads through that
part of the US are notoriously bumpy
and busy, and when I finally made it
to Cincinnati that evening, I was very
tired. When I went to check in at my ho-
tel, I was greeted by an enthusiastic and
cheerful young woman.
“How are you today?” she asked.
“I’m tired,” I replied, perhaps a bit too
truthfully.
Not realizing that I was a cryptogra-
pher, she misattributed another pro-
fession to me.
“Being a traveling salesman can be
tough,” she said.
“Yes,” I said, “it can be. And the worst
part is how NP-hard the car seats can
get.”
“What?”
“Never mind.”
What have I learned from my many
years of experience in the security in-
dustry? Apparently not enough. I still
have a bad habit of starting talks with a
joke, no matter how many times it ends
up failing miserably. But isn’t that what
we should expect from an INTJ?
About the author
Luther Martin is a Distinguished Tech-
nologist at Hewlett Packard Enterprise
and the author of the first attempt at hu-
mor published in the ISSA Journal (“The
Information Security Life Cycle,” March
2008). You can reach him at luther.mar-
tin@hpe.com.
Association News
Through January 13, 2017 – For information:
www.issa.org/events/EventDetails.aspx?id=712365&group=
The second research report from the groundbreaking global study of cybersecurity professionals by ISSA and independent industry analyst firm Enterprise Strategy Group (ESG) has been released.
In aggregate, 54 percent of cybersecurity professionals surveyed admitted that their organizations experienced at least one type of security event over the past year. Yet, surprisingly, none of the top contributors to these cyber attacks and data breaches are related to cyber technology. Rather, they point to human issues such as a lack of cybersecurity staff as well as a lack of employee training and boardroom prioritization.
Further supporting this finding, 69 percent of cybersecurity professionals say the global cybersecurity skills shortage has had an impact on the organization they work for, leading to excessive workloads, inappropriate skill levels, high turnover, and an acute shortage especially in the areas of security analytics, application security, and cloud security.
In this time of fluid world events, such as the US presidential transition, cybersecurity professionals surveyed also send a strong message to national government: the vast majority
believe that their nation’s critical infrastructure is extreme-
ly vulnerable or vulnerable to some type of significant cyber
attack and want government more involved in cybersecurity
strategies and defenses. Going further they recommend spe-
cific actions government should take, leading with providing
better ways to share security information with the private
sector, incentives to organizations that improve cybersecu-
rity, and funding for cybersecurity training and education.
“There’s lots of research indicating a global cybersecurity
skills shortage, but there was almost nothing that looked at
the associated ramifications. Based upon the two ESG/ISSA
reports, we now know that beyond the personnel shortage
alone, cybersecu-
rity professionals
aren’t receiving
appropriate lev-
els of training,
face an increas-
ing workload,
and don’t always
receive adequate
support from the business,” said Jon Oltsik, ESG senior prin-
cipal analyst. “Simply stated, these findings represent an exis-
tential threat. How can we expect cybersecurity professionals
to mitigate risk and stay ahead of cyber threats when they are
understaffed, underskilled, and burned-out?”
Based upon the data collected from the first global survey to
capture the voice of cybersecurity professionals on the state
of their profession, this final report of the two-part series, ti-
tled “Through the Eyes of Cybersecurity Professionals: An-
nual Research Report (Part II),” concludes:
•	 The clear majority (92 percent) believe that an average or-
ganization is vulnerable to some type of cyber attack or
data breach
•	 People and organizational issues contribute to the on-
slaught of security incidents
•	 Most organizations are feeling the effect of the global cy-
bersecurity skills shortage
•	 Cybersecurity professionals have several suggestions to
help improve the current situation
•	 Sixty-two percent believe critical infrastructure is very
vulnerable to cyber attacks
•	 Sixty-six percent believe government cybersecurity strate-
gy tends to be incoherent and incomplete
•	 Eighty-nine percent of cybersecurity professionals want
more help from their governments
“The results gleaned from this research are both alarming
and enlightening. Alarming in the sense that if we don’t
collectively pay attention to the cries for help, we will put
businesses unnecessarily at risk. Enlightening in that orga-
nizations need to be willing to invest in their cybersecurity
professionals, with clearly defined career paths and skills de-
velopment in order to hire and retain qualified employees,”
said Candy Alexander, cybersecurity consultant and chair
of ISSA’s Cybersecurity Career Lifecycle. “This research data
will help ISSA and other professional groups to clearly define
career paths for our profession.”
The Voice of Cybersecurity Professionals (Part II)
Research Reveals “Human” Issues as Top
Cybersecurity and Business Risk
Figure 1 – Impact of cybersecurity skills shortage
Has the global cybersecurity skills shortage impacted
your organization over the past few years?
CSCL Pre-Professional Virtual Meet-Ups
ISSA.org => Learn => Web Events => CSCL Meet-Ups
So, you think you want to work in cybersecurity? Not sure which way to go? Not sure if you're doing all you need to do to be successful? Check out Pre-Professional Virtual Meet-Ups to help guide you through the maze of cybersecurity.
January 19, 2017: 2:00 p.m. – 3:30 p.m. EST. Future Challenges: Are You Ready?
This discussion will look at the history of security and tech-
nology in order to identify what has changed and what hasn’t
as well as lessons learned from our past to help prepare for
our future. We will review methodologies, technologies, and
business practices. Are the challenges really all that different?
2016 Security Review and Predictions
for 2017
2-Hour live event Tuesday, January 24, 2017
9 a.m. US-Pacific/ 12 p.m. US-Eastern/ 5 p.m. London
2016 was a monumental year in cybersecurity: from email hacking impacting the US political world to the October DNS attacks and the ongoing rise of ransomware and IoT concerns.
“Cyber” is huge right now. How will this growing spotlight on
security translate in terms of media and regulatory attention?
And what kinds of threats will dominate the 2017 landscape?
Join us, make notes, and then check back in a year to see how
we did!
Generously sponsored by
For more information on this or other webinars:
ISSA.org => Web Events => International Web Conferences
ISSA.org => Learn => CISO Executive Forum
The CISO Executive Forum is a peer-to-peer event. The unique strength of this event is that members can feel free to share concerns, successes, and feedback in a peer-only environment. Membership is by invitation only and subject to approval. Membership criteria will act as a guideline for approval.
The 2017 venues will be the following:
San Francisco, CA
Innovation and Technology
February 11-12, 2017
Washington DC
Information Security, Privacy, and Legal Collaboration
April 20-21, 2017
Las Vegas, NV
Security Awareness and Training—Enlisting Your Entire
Workforce into Your Security Team
July 23-24, 2017
San Diego, CA
Payment Strategies: The Game Has Changed
October 11-12, 2017
For information on sponsorship opportunities, contact Joe
Cavarretta, jcavarretta@issa.org.
ISSA CISO Virtual Mentoring Series
LEARN FROM THE EXPERTS! If you're seeking a career in cybersecurity and are on the path to becoming a CISO, check out the 19 webinars from April 2015 through December 2016!
ISSA.org => Learn => Web Events => CISO Mentoring We-
binar Series
ISSA.org => Career => Career Center
Looking to Begin or Advance Your
Career?
The ISSA Career Center offers a listing of current job openings in the infosec, assurance, privacy, and risk fields. Visit the Career Center to look for a new opportunity, post your resume, or post an opening.
Questions? Email Monique dela Cruz at mdelacruz@
issa.org.
The report also lays out the “Top 5 Research Implications” as
a guideline for cybersecurity professionals and the organiza-
tions they work for. “Assume your organization will experi-
ence one or several cyber attacks or data breaches and take
the cybersecurity skills shortage into account as part of every
initiative and decision. Push for more all-inclusive cybersecurity training and, as importantly, get involved in educating
and lobbying business executives and government legislators
alike,” recommended Oltsik.
Leslie Kesselring, ISSA Public Relations Consultant
—“Through the Eyes of Cybersecurity Professionals: Annual
Research Report (Part I)”: http://www.issa.org/esgsurvey/.
—“Through the Eyes of Cybersecurity Professionals: Annual
Research Report (Part II)”: https://www.issa.org/page/is-
saesg_survey_P2.
Machine Learning: A Primer for Security
By Stephan Jou – ISSA member, Toronto Chapter
“Machine learning is revolutionizing the security landscape.”
The author examines how machine learning can be leveraged to address the practical challenges
of delivering lower-cost security by resolving more threats faster, with fewer resources. It will
focus on machine learning security techniques that work at typical levels of data volumes, from
those operating with “small data” to those implementing data lakes.
Popular responses to that statement are all over the map. Some say machine learning is vastly overhyped in our market, while others contend it is the combination of machine learning with access to more data that is the main reason to be optimistic about security in the future.
In the day-to-day world of data security, analytics practi-
tioners who have embraced machine learning are regularly
catching bad actors, such as externally compromised ac-
counts or malicious insiders. We do this by using machine
learning and analytics to detect indicators of compromise
and predict which employees or associates are likely to leave
with stolen data. We succeed when we define what is normal,
then determine anomalies using machine learning. Machines
are simply faster at repetitive tasks like finding inconsisten-
cies in the patterns of data usage, and machines do not tire
from scouring through billions of data events per day.
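The "define what is normal, then determine anomalies" loop can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's implementation; the daily counts and the 3.5 threshold are invented, and robust statistics (median and MAD) are one simple way to keep the anomaly itself from inflating the baseline:

```python
from statistics import median

def robust_anomalies(values, threshold=3.5):
    """Score each value by its deviation from the median, measured in
    units of the median absolute deviation (MAD), and return the indices
    that deviate sharply from the learned "normal"."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:  # no variation at all: nothing stands out
        return []
    # 0.6745 rescales MAD so the score is comparable to a z-score.
    return [i for i, v in enumerate(values)
            if 0.6745 * abs(v - med) / mad > threshold]

# One user's daily file-access counts: a stable baseline, then a spike.
daily_counts = [102, 98, 110, 95, 105, 99, 2500]
print(robust_anomalies(daily_counts))  # → [6]: the day-6 spike stands out
```

The machine is doing nothing a human analyst couldn't do; it simply never tires of recomputing the baseline across billions of such events.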
At present, the cybersecurity industry is still behind the curve
in demonstrating the kind of success that machine learning
has achieved in some other industries. But with rapidly grow-
ing volumes of data and better behavioral monitoring aimed
at leveraging data, big data, and data lakes, machine learning
and security clearly will achieve more breakthroughs together.
There are two good reasons why machine learning is useful
to security. First, it can reduce the cost of standing up and
maintaining a security system. In this industry, we’ve spent
billions, yet we clearly need better tools to protect our data.
The bad guys still have better tools than the good guys, and
it still costs too much to investigate and respond to security
incidents. The nature of defense is that it simply takes time to
build up resistance, only to have a new attack render that de-
fense ineffective or obsolete. This leads to the second reason
that machine learning is important: it can reduce the time
required to detect and respond to a breach once the inevitable
occurs. Proper use of machine learning can have a measur-
able impact on deployment time and cost, as well as dwell
time from incident to response.
2016 Article of the Year

In this article, I will examine how we leverage machine learning to address the practical challenges of delivering lower-cost security by resolving more threats faster, with fewer resources. I will focus on machine learning security techniques that work at typical levels of data volumes, from those operating with "small data" to those of us implementing data lakes. My purpose is to empower security teams to make use of machine learning to automate what skilled experts can do: prioritize risks so that experts can focus attention on those high-threat anomalies that signify targeted attacks, compromised accounts, and insider threats.
Automate and learn: What machine learning does best
The concept of machine learning is based on the idea that
we can use software to automate the building of analytical
models and have them iteratively learn, without requiring
constant tuning and configuring. Machine learning, if im-
plemented properly, learns by observing your company’s par-
ticular data. It should not require rules, tool kits, or a team
of data scientists and integrators to endlessly examine the
datasets in order to become operational. Similarly, the soft-
ware should not require a team with system administration
or DevOps skills to architect a big data infrastructure. Many
companies’ experiences with analytics date back to when sci-
entists and integrators had to spend months, or even years, to
understand the business and how every aspect of the dataset
intersected with users and machines. This is no longer the
case. Modern machine learning works with the data in your
organization, observing it persistently through continuous
user, file, and machine monitoring.
Further, machine learning can react automatically to typical
business changes by detecting and reacting appropriately to
shifting behavior. This is often a surprise to companies ac-
customed to bringing in teams of consultants and having
to re-engage them when a new business unit is created or a
merger occurs. The expectation is that when new behaviors appear, the old software must be reconfigured, rules constantly rewritten, and new thresholds created. But if done correctly, machine
learning can learn—then automatically continue to learn—
based on updated data flowing through the system. Just as a
teacher doesn’t have to tell an equa-
tion how to compute the average
grade score for the population of a
class, the same equation for com-
puting averages will work in class-
rooms everywhere—or when class-
es are added or removed.
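The classroom-average analogy can be made concrete. One standard way a machine keeps learning an average with no reconfiguration is an online update rule such as Welford's algorithm (an illustrative choice, not something prescribed here): the same update works for any classroom, and keeps working as observations stream in.

```python
class RunningStats:
    """Online mean and variance (Welford's algorithm): the "equation"
    never needs to be told about new classrooms; it just keeps updating
    as each new observation arrives."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self._m2 = 0.0  # sum of squared deviations from the running mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self._m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for grade in [70, 80, 90]:
    stats.update(grade)
print(stats.mean, stats.variance)  # → 80.0 100.0
```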
Math is magical, but not magic.
The fact is, math cannot do any-
thing that a human can’t do, given
enough time and persistence. Math
simply expresses what is happen-
ing in an automated fashion using
equations. In machine learning, such equations are imple-
mented as software algorithms that can run continuously and
tirelessly. There is plenty of mystique around the seemingly
limitless capabilities of “magical” algorithms that are, in real-
ity, far less responsible for what machine learning can do for
security than the data itself. In fact, connecting the data to
the math (a process known as feature engineering) and then
implementing the math at scale (using appropriate big data
technologies) is where the real magic of machine learning for
security lies.
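To make the feature-engineering step tangible, here is a small hypothetical sketch of connecting data to math; the event fields and feature names are invented for illustration, not drawn from any particular product:

```python
from datetime import datetime

def featurize(event):
    """Feature engineering: map one raw log event into the numeric
    features an algorithm can consume (hypothetical field names)."""
    ts = datetime.fromisoformat(event["timestamp"])
    return {
        "bytes_out": event["bytes"],
        # Activity outside 07:00-19:00 is treated as after-hours.
        "after_hours": 1 if ts.hour < 7 or ts.hour >= 19 else 0,
        "is_weekend": 1 if ts.weekday() >= 5 else 0,
        # Destination not in this user's historical set of hosts.
        "rare_host": 1 if event["host"] not in event["usual_hosts"] else 0,
    }

event = {"timestamp": "2016-11-12T22:14:00", "bytes": 48_000_000,
         "host": "ftp.example.net", "usual_hosts": {"mail", "crm"}}
print(featurize(event))
```

A large after-hours, weekend transfer to an unfamiliar host now arrives at the math as four plain numbers, ready for scoring at scale.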
Cost and time essentials

One way to understand how machine learning can have an impact on cost is to look at the steps required to install and use an analytical product. We all know there is fixed time associated with installation and configuration, but it is the tuning and training of the analytics that has been historically costly.

There are many steps involved in the process between deciding to build a security analytics-enabled process and receiving valid analytics that can detect and respond to incidents. Choosing the right approach can significantly reduce the time and the cost between the project start and when value can be provided. Specifically, choosing a proper machine learning-based approach that does not require manual tuning, customization, building of rules, etc., can greatly accelerate the time to value (figure 1).

Figure 1 – Time to value: Security analytics using rules, versus security analytics using machine learning

Whether total deployment time is fast (a couple of hours or a few days) or painfully slow (as long as a year!) is largely dependent on the capabilities of the analytics. The real cost disparity emerges when we ask questions such as:
•	 Do I need to set thresholds?
•	 Will we have to write rules?
•	 Am I paying service fees for these capabilities?
•	 How easy is it?

To get value from the system, you obviously want to ask the essential question: How long before we can actually learn something about a breach? By asking and answering this, we can know time to value.

To obtain the answer, we need to focus on how machine learning extracts value. It's popular to focus attention on the algorithm, most likely because algorithms such as deep learning have recently been achieving exciting successes in the news. And it's naturally easy to get lost in that excitement! However, more important than the algorithm is a focus on the right data and the corresponding use case appropriate for your particular organization. Getting the right datasets for the job and applying the right principles will trump any given algorithm, every time. With this approach, we can allow machine learning to do what it does best: find evidence, and connect the dots between pieces of evidence, to create a true picture of what is happening.

This "connecting of dots" is important because it allows us to show corroboration across datasets. When security professionals talk about alert fatigue, they are really referring to the need for better corroboration so they can reduce the number of results the system fires. Simply put, when we have alert fatigue, the math is not helping us compress the results that the system is finding. But math can help compress billions of events per day into dozens of incidents by effectively scoring all events, and then corroborating multiple scored events together. A machine learning implementation further means that this approach to reducing false positives and alert fatigue can be applied automatically, giving us the reduced cost and faster time to value we're looking for. But how does that work?

The value of a score: Probabilistic methods vs. rules and thresholds

One important machine-learning technique is using probabilistic statistical methods¹ to score events for risky indicators, rather than relying on rules with thresholds that either fire or do not fire.

1 For a good overview of probabilistic and statistical methods as they apply to machine learning, see: Murphy, K. P. 2012. Machine Learning: A Probabilistic Perspective. Cambridge, Massachusetts: MIT Press.

When we talk about scoring an event, we are simply talking about computing a number, for example, between zero and 100. This contrasts with relying on rules that issue a Boolean alert. Boolean alerts either fire or do not fire, based on parameters and thresholds the operator has set. The problem with this approach is that since alerts either fire or do not fire, as the alerts accumulate (in your SIEM, for example), the best we can do is count them. Having 10 alerts, all with limited severity information and context, delivers little information that is helpful.

When we score events for risk, we can assign them meaning—for example, 0% is no risk, while 100% is the most extreme risk—and then more smartly aggregate risk values to get a combined picture of the risks associated. Risk scores can give additional context by being associated not only with a particular activity, but also with the assets, people, and machines involved. Mathematical weighting helps us tune and train our model for specific activities, people, assets, and endpoints on a per-behavior-pattern basis.

Aggregating scores, rather than simply counting alerts, is more effective because we can define a weighted representation of how risky behavior is. In contrast, if all you have is an alert, you can only say that "X" things happened. While it's true that we can label events, labeling things either good or bad does not help. In fact, it can be risky. It quickly becomes easy to ignore low-probability events or trick the system into ignoring them. You can see why it is possible to get 10,000 alerts when the threshold is set too low, for example. In a typical medium-size business environment, it is quite likely that the data will present us with billions of "events"—multiple bits of evidence of what is happening to the data. Machine learning can work quickly to distill these billions of events, tell the difference between low- and incredibly high-risk events, and then connect them together into a picture, or handful of pictures, that can tell us what is going on. Here, math helps us compress the results, so instead of alert fatigue or a group of patterns with arbitrary values, we have a clear statistical picture of what is anomalous.

In addition to using scoring, effective machine learning in data security lets us use probabilistic math rather than thresholds. Probabilistic methods are better than thresholds because they tell us not just about badness, but the probability or degree of badness. We can compute all of the events, not just those arbitrarily deemed likely to be interesting. We can much more accurately assess the overall risk posture of any entity and actually measure what security experts are trained to look for—bad or at least "weird" things happening to their data.

Finally, we can collect and score all of the events and compute their likelihood of causing us problems. In this way, we create a system that can learn automatically. This automatic learning is an important component of why the machine learning approach works. Automatic means no rules must be fine-tuned, no thresholds must be tweaked, and no maintenance must be performed when your business shifts. But how does machine learning pull off this trick?
How machines learn
Machines don’t learn in a vacuum; ma-
chines learn by continually observing
data. Given enough data, machines can
turn data into patterns. Observation of
patterns can lead to generalizations, a
process accomplished by taking exam-
January 2017 | ISSA Journal – 17
Machine Learning: A Primer for Security | Stephan Jou
As a human, when given a set of observations that look like
figure 2, you might eventually conclude (or learn) that cats
generally have longer tails and whiskers than dogs.
There are two broad classes of machine learning: supervised
learning and unsupervised learning.
In supervised learning, we are given the answers. In our cat
and dog example, suppose that whenever we are given a whis-
ker length and tail length, we are also told whether the animal
is a cat or a dog; this is an example of supervised learning.
Rather than simply asking us to “find me dogs and cats,” the
data told us what these animals are. Since we, in turn, advised
the algorithm about whisker and tail length, this class of al-
gorithm is known as supervised learning. It requires accurate
examples.
The model, represented visually by the dotted line (figure 3),
states that if the tail and whisker length is to the left of the
dotted line, declare the animal to be a dog. If it’s on the right,
call it a cat.
Using the learned model shown in figure 3, we can start to
make predictions. When we see animal X, and measure its
tail and whisker length, we would predict that it’s a cat, since
it is to the right of the dotted line (figure 4). X’s long whiskers
and long tail give it away!
In unsupervised learning, we hope that a grouping (or cluster-
ing) pattern emerges based solely on the input data, without
any output labels (figure 5). The data tells the story, self-or-
ganizing into clusters. In general, unsupervised learning is a
much harder problem than when output labels are available.
ples and creating general statements or truths. This learning
process is true not just of machines, but of humans. Machine
learning is nothing more than algorithms2
that automate this
same learning process that we as humans do naturally.
Consider that when we as humans see something, we know
what we probably saw because it is most similar to what we’ve
seen before. This is actually an example of a machine learning
algorithm known as “nearest neighbor” (or k-nearest neigh-
bors, for the picky).
Here is an example of applying machine learning to deter-
mine whether an animal is a cat or a dog. By fitting points to
a line we can observe that when we see an animal and it has
long whiskers (cats) and longer tails (also cats), it is more like-
ly to be a cat than a dog. The more examples we see, the more
generalizations prove the rule. While it’s true that sometimes
a cat has a short tail and occasionally a dog has really long
whiskers, it is mostly not the case. Clusters emerge showing
cats and dogs. Children quickly recognize by this method
what is a cat and what is a dog. Algorithms, when given ex-
amples, can be created to do the same thing, using math to
automate this process.
Suppose we go around our neighborhood and measure the
whisker lengths and tail lengths, in inches, for the first 14 pets
we see. We may end up with a set of data points like the fol-
lowing (table 1):
Whisker Length
(input)
Tail Length
(input)
Cat or Dog?
(output)
5 6 Cat
5.7 11 Cat
4.3 9.5 Cat
4.2 7 Cat
6.4 8 Cat
5.9 10 Cat
5.2 9 Cat
2.3 5 Dog
2.5 3 Dog
4 9.5 Cat
2.1 7 Dog
1.3 9 Dog
3.4 7.5 Dog
Table 1 – Whisker and tail lengths of sample pets
2 There are many good books that introduce the concepts of machine learning.
The following book is short and very readable, and does not require a deep math
background: Adriaans, P. and Zantinge D., 1996. Data Mining, England: Addison-
Wesley Longman. The following is a great reference for those more comfortable with
mathematical notation. Tan, P.-N.; Kumar, V. and Steinbach, M. 2006. Introduction
to Data Mining, Boston: Addison-Wesley Longman. For the coders, try: Conway, D.
and White, J. M. 2012. Machine Learning for Hackers, O’Reilly.
Figure 2 – A plot of neighborhood
dogs and cats, and their tail and
whisker lengths, in inches.
Figure 3 – A simple model that
distinguishes between dogs and cats,
based on tail and whisker length.
Figure 4 – Predicting with a model Figure 5 – Data points without labels
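To make the supervised cat-and-dog example concrete, here is a small sketch of the "nearest neighbor" idea described above, written in plain Python against the measurements from table 1. The query points and the choice of k = 3 are illustrative assumptions, not from the article.

```python
import math

# Whisker length, tail length (inches), and label, from table 1.
pets = [
    (5.0, 6.0, "Cat"), (5.7, 11.0, "Cat"), (4.3, 9.5, "Cat"),
    (4.2, 7.0, "Cat"), (6.4, 8.0, "Cat"), (5.9, 10.0, "Cat"),
    (5.2, 9.0, "Cat"), (2.3, 5.0, "Dog"), (2.5, 3.0, "Dog"),
    (4.0, 9.5, "Cat"), (2.1, 7.0, "Dog"), (1.3, 9.0, "Dog"),
    (3.4, 7.5, "Dog"),
]

def predict(whisker, tail, k=3):
    """Classify a new animal by majority vote of its k nearest neighbors."""
    by_distance = sorted(pets, key=lambda p: math.hypot(p[0] - whisker, p[1] - tail))
    votes = [label for _, _, label in by_distance[:k]]
    return max(set(votes), key=votes.count)

# Animal X: long whiskers and a long tail, like the article's example.
print(predict(5.5, 10.0))  # → Cat
```

The three labeled examples closest to animal X are all cats, so the vote is unanimous; this is exactly the "most similar to what we've seen before" reasoning the article describes.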
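The unsupervised case (figure 5) can also be sketched. Below, a minimal k-means clustering (a standard unsupervised algorithm, chosen here for illustration and not named in the article) groups the same whisker/tail measurements without ever seeing the cat/dog labels; the starting centers are an illustrative assumption. With only these two features the clusters recover the true grouping only roughly, which is one reason unsupervised learning is the harder problem.

```python
# The same measurements as table 1, but with the labels withheld.
points = [(5.0, 6.0), (5.7, 11.0), (4.3, 9.5), (4.2, 7.0), (6.4, 8.0),
          (5.9, 10.0), (5.2, 9.0), (2.3, 5.0), (2.5, 3.0), (4.0, 9.5),
          (2.1, 7.0), (1.3, 9.0), (3.4, 7.5)]

def kmeans(points, centers, rounds=10):
    """Group points into len(centers) clusters by alternating assignment
    and center-update steps."""
    clusters = [[] for _ in centers]
    for _ in range(rounds):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in centers]
        for x, y in points:
            nearest = min(range(len(centers)),
                          key=lambda i: (x - centers[i][0]) ** 2 + (y - centers[i][1]) ** 2)
            clusters[nearest].append((x, y))
        # Update step: move each center to the mean of its cluster.
        centers = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
                   for c in clusters if c]
    return centers, clusters

centers, clusters = kmeans(points, centers=[(2.0, 5.0), (5.0, 9.0)])
print([len(c) for c in clusters])
```

Two clusters emerge from the data alone, but a dog with a long tail can still land in the "cat" cluster: without labels, the algorithm only knows similarity, not meaning.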
Unsupervised learning means we do not have any "labels," so we are not told the "answers." In other words, we observe a set of whisker and tail lengths from 14 animals, but we do not know which are cats and which are dogs. Instead, all we might know (if we're lucky!) is that there are exactly two types of animals. We might still arrive at a good model to distinguish between dogs and cats (such as the one illustrated in figure 4), but this is clearly a harder problem!

In general, security use cases require a mix of supervised and unsupervised learning because datasets sometimes have labels and sometimes do not. An example of a dataset where we have a lot of labels is malware: we have many examples of malware in the wild, so for many malware use cases we can use supervised learning to learn by example. An example of datasets with few or no labels is anything related to insider threat or APT; there is generally not enough labeled data available to rely on supervised learning methods.

The importance of the input
The input that you give your machine learning model matters significantly. In trying to distinguish cats from dogs, knowing to focus on whisker and tail lengths allowed our machine learning to be successful. If we had chosen less meaningful inputs—such as trying to distinguish cats from dogs by the number of legs—we would have been less successful.

The process of picking and designing the right inputs for a model is critically important to succeeding with analytics. For security use cases, research and experience must guide the feature engineering process so that the right model inputs are chosen. For example, we know from CERT, Mandiant, and others that good indicators of insider threat and lateral movement are related to unusually high volumes of traffic. Our own research has discovered that the ratio of an individual's writes to and reads from an intellectual property repository—something we affectionately call the "mooch ratio"—is a valuable, predictive input as well. By observing such indicators, an effective machine-learning system can predict who might be getting ready to steal data.

As you can see, the most important part of data science is selecting the inputs to feed the algorithm. It's an important enough process to have its own special name: feature engineering. Feature engineering, not algorithm selection, is where data scientists spend most of their time and energy. This process involves taking data—for example, raw firewall, source code repository, or application logs—understanding the semantics of the dataset, and picking the right columns or calculated columns that will help surface interesting stories related to our use case. A feature is little more than a column that feeds the algorithm. Picking the right columns or features gets us 90 percent of the way to an effective model, while picking the algorithm only gets us the remaining 10 percent. Why? If we are trying to distinguish between cats and dogs, and all we have as input is the number of legs, the fanciest algorithm in the world is still going to fail.

But how do we determine the right features? Selecting features requires knowledge. For example, we might include our historical experience or studies from industry organizations such as CERT, academic research, or our own brainstorming. This type of knowledge is the reason we need experts who can take what is in their heads and ask machines to automate it. Creating good features is a far better use of people, skills, and money, anyone would agree, than hiring expensive hunters to sift through a sea of alerts. Machine learning simply allows us to automate typical patterns so that our highly qualified hunters can focus on the edge cases specific to the company and the business.

Online vs. offline learning
There are two modes of machine learning: online and offline. Offline learning is when models learn based on a static dataset that does not change. Once the models have completed their learning on the static dataset, we can then deploy those models to create scores on real-time data. Traditional credit-card fraud detection is an example of offline learning. Credit card companies can take a year of credit card transactions and have models learn what patterns of fraud look like. The learning can take many days or weeks to actually complete. Once completed, those models can be applied in real time as credit-card transactions occur, to flag potentially fraudulent transactions. But the learning part was done offline from a static dataset.

Online learning occurs when we take a live dataset and learn from it as the data comes in, while simultaneously deploying models to score activity in real time. This process is quite a bit harder, since we are using live data to get smarter and running models at the same time. This is the nature of modern, machine learning-based credit card fraud detection: it notices what you personally do or do not do, learning from individualized data while simultaneously scoring activity. We use machine learning online to learn and react at the same time.

This distinction is important because, for security, many of our use cases require learning new patterns as quickly as possible. We do not always have the luxury of using offline machine learning to collect months and years of data. Instead, it is often more desirable to have models that learn as quickly as possible, as data comes in, and also react as quickly as possible, as data changes.

Historically, much of the machine learning we have done is offline because it has been hard to move and analyze data fast enough to run at scale. But now, with big data technologies such as Hadoop,3 HBase,4 Kafka,5 Spark,6 and others, we are able to learn and score as data streams into our system. The speed and volume of our data feeds are so much greater than ever before. Online learning (building the models) and scoring (running the models) on terabytes of data a day is now technically possible, whereas it would have been impossible a decade ago.

3 Hadoop – http://hadoop.apache.org.
4 HBase – https://hbase.apache.org.
5 Kafka – http://kafka.apache.org.
6 Spark – http://spark.apache.org.

Leveraging the data lake
A final reason that machine learning is more important to security now than ever becomes clear when we consider its use with data lakes. Data lakes matter because they can be input sources for the storage of data logs, as well as a repository of an organization's intellectual property around which we build protection. Clearly, we need big data analytics and automated methods in order to see what threats are happening in this realm. Increasingly, big data lakes are giving us the opportunity to analyze, detect, and predict threats—going beyond seeing what has happened for compliance and forensics purposes.

This trend has occurred, in part, because data has gotten too big to store in a SIEM. As we know, most SIEMs can practically store only a few months of data; anything older is dropped or stored where it is not available for analysis. Increasingly, organizations have focused on Hadoop and related technologies as a more cost-effective system of record for log files. But how can we better detect threats once we are storing data (e.g., log files) in our Hadoop data lake?

Search, visualize, detect, predict—and repeat
As with any data, we want to be able to search, visualize, detect, and predict threats. With machine learning, we want to combine human expertise with automated analyses for faster, more accurate results. All of these tasks are harder on big data, which requires newer technologies capable of handling them at scale.

Data lakes let us search across and join all our datasets in a single query. We want to be able to search, for example, terabytes of data per day. And for this, we have widely available big data-suitable technologies like Solr7 and Elasticsearch.8 Such technology lets us scalably index across all analyses from all detected threats, from all datasets in the data lake. Technologies like Kibana are now readily available to give us a friendly UI and API to search and visualize our results.

However, visualizing big data is hard. You can imagine how a pie chart of a thousand users, in which each slice corresponds to one person, leads to a sea of color (figure 6).

Visualization in the data lake is obviously an enormous field for research, involving the challenge of how to take huge amounts of data and convey meaning. It requires understanding, aggregating, summarizing, and the ability to drill down into different levels of detail. Techniques from visualization research—like focus-and-context visualization or an understanding of visual cognition and biological precepts—all come into play here. In other words, visualization is more than just the drawing of the picture; the analytics underneath the picture is equally important.

In figure 7, we can see the result of processing more than 45 billion events, and that the most important events happened in February and March. Visualization of a large amount of data must tell us a story. The picture is the end of a pipeline of analytics: machine learning computes risk scores from the raw data, and the visualization uses those scores to show risk over time.

The "matrix" visualization at the top represents 45 billion events. However, the underlying machine learning analysis has distilled the events into 7,535 "stories," each with varying levels of risk, which appear in the visualization as areas occupied by squares. Notice how quickly you see that two of the highest-risk time periods occurred in mid-to-late February. Additional interactivity allows the user to zoom in and focus on that specific time region for more detail.

7 Solr – http://lucene.apache.org/solr/.
8 Elasticsearch – https://www.elastic.co/products/elasticsearch.

Figure 6 – A pie chart showing the top 100 most active tweeters. Source: http://chandoo.org/wp/2009/08/28/nightmarish-pie-charts/
Figure 7 – A big data interactive visualization from Interset
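The online-learning idea (update the model as each event arrives, while scoring activity in real time) can be sketched with a running mean and variance maintained by Welford's algorithm. The choice of algorithm, the traffic numbers, and the single-feature setup are illustrative assumptions, not the article's implementation.

```python
class OnlineAnomalyScorer:
    """Learn a running mean/variance from streaming values (Welford's
    algorithm) and score each new value by its distance from normal."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared differences from the mean

    def observe(self, x):
        # Update the model incrementally with one new observation.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def score(self, x):
        # Standard deviations from the mean seen so far (a z-score).
        if self.n < 2:
            return 0.0
        std = (self.m2 / (self.n - 1)) ** 0.5
        return abs(x - self.mean) / std if std else 0.0

scorer = OnlineAnomalyScorer()
for mb_transferred in [10, 12, 9, 11, 10, 13, 11]:  # a user's normal daily traffic
    scorer.observe(mb_transferred)

print(scorer.score(11))   # typical volume: low score
print(scorer.score(500))  # exfiltration-sized spike: very high score
```

Because the model updates in constant time per event, the same learner can run as data streams in, which is what lets online systems learn and react at the same time.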
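The earlier argument for aggregating risk scores rather than counting alerts can be sketched with a simple probabilistic combination (a "noisy-OR," one common choice; the event probabilities are invented for illustration):

```python
def aggregate_risk(event_risks):
    """Combine per-event risk probabilities (0.0-1.0) into one entity-level
    score: the probability that at least one event is truly bad, assuming
    the events are independent (a "noisy-OR"). Unlike an alert count, a
    flood of trivial events barely moves the score, while a few severe
    events dominate it."""
    no_risk = 1.0
    for p in event_risks:
        no_risk *= (1.0 - p)
    return 1.0 - no_risk

# 1,000 low-risk events vs. three genuinely suspicious ones.
noisy = aggregate_risk([0.0001] * 1000)
targeted = aggregate_risk([0.6, 0.7, 0.5])
print(round(noisy, 3))     # ≈ 0.095: many trivial events stay low risk
print(round(targeted, 3))  # 0.94: a few serious events score high
```

An alert counter would rank the first entity 1,000 to 3 over the second; the probabilistic aggregate ranks them the other way around, which is the behavior the article argues for.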
Here, every visualization supports large amounts of data, with machine learning and the analytics working behind the scenes to surface and compress billions of events into dozens of stories we can understand. Further, these visualizations can be interactive, provided you have the right technology to support that interactivity with filtering done using, for example, fast search.

Taming big data
Just as we need big data tools to search and visualize, we need tools to detect and predict that are suited to the data lake realm. It's still important to allow humans to inject business context and priorities, as well as human intuition, into the process. But clearly, standard rules engines may struggle to keep up with the data lake; they are simply not going to scale to the volume and velocity of a big data engine. Fortunately, just as with search and visualization, there are technologies to support rules engines at scale. Kafka, Spark, and Storm are good examples of technologies that understand how to move data at scale, process patterns at scale, and trigger rules.

We also use different math, because small-data math does not apply to big datasets. To illustrate, remember how in high school statistics we always had to make sure our sample size was large enough to be statistically significant? A typical rule was to make sure you had a sample size of at least 20! Back then it was hard to get data, but that is no longer true. Standard frequentist methods are sometimes not appropriate for large datasets, where a Bayesian approach may be better at dealing with large, messy data. We also had to invent ways of compressing large amounts of data into small, actionable results that we could visualize, investigate, and plug into workflow. This is best done using math and statistics, not counting, because as covered earlier, simply adding up alerts tells us little that is meaningful. We must use statistical ways of computing and comparing that rest on principled math and statistics. These are essential technology tools for the data lake. But what about our human experts? Where do we fit in?

Humans and machines: Better together
With big data and data lakes, machine learning can be far more automated than ever before and as unsupervised as we allow, while still accepting feedback, as in a semi-supervised system. Because data is simply becoming bigger, it is safe to argue that the data lake is inevitable. With machine learning to help us automate and learn—and with the right technologies to help us search, visualize, and detect threats—our human experts take on a new, more expert and guiding role.

Here is how I think the security professional is evolving. Advanced chess,9 sometimes called Centaur chess, is a form of chess where the players are actually teams of humans with computer programs. The human players are fully in control but use chess programs to analyze and explore possible moves. It turns out that the combination of humans and computers together produces stronger chess play than either humans alone or computers alone.

Why is the combination of humans with computers so powerful for playing chess? It turns out that computers are generally better at calculating lots of moves, being consistently tactical, and not making mistakes. Humans, however, tend to have a better holistic feel for the game. They see broad themes and are better able to identify an edge, excelling in strategic play.

What is perhaps best, of course, is humans and computers working together. Why spend time looking at log files and billions of events when computers are so good at these tasks? Why look to an algorithm for strategy on use cases? A skilled cyber hunter fed with amazing data sources and machine learning will save time, because the math never gets tired and rarely, if ever, makes a mistake. This leaves our experts far more free to focus on edge cases and provide feedback and guidance back to the system on new models and features.

Better together, the human expert with proper machine learning tools is the winning combination that makes the future of security analytics so optimistic, compelling, and powerful.

9 Centaur Chess – https://en.wikipedia.org/wiki/Advanced_Chess.

References
—Adriaans, P. and Zantinge, D., 1996. Data Mining, England: Addison-Wesley Longman.
—Conway, D. and White, J. M. 2012. Machine Learning for Hackers, Cambridge: O'Reilly.
—Guyon, I.; Gunn, S.; Nikravesh, M. and Zadeh, L. A. 2006. Feature Extraction: Foundations and Applications, Netherlands: Springer.
—Marz, N. and Warren, J. 2015. Big Data: Principles and Best Practices of Scalable Real-Time Data Systems, NY: Manning Publications.
—Murphy, K. P. 2012. Machine Learning: A Probabilistic Perspective, Cambridge, Massachusetts: MIT Press.
—O'Neil, C. and Schutt, R. 2013. Doing Data Science: Straight Talk from the Frontline, Cambridge: O'Reilly.
—Tan, P.-N.; Kumar, V. and Steinbach, M. 2006. Introduction to Data Mining, Boston: Addison-Wesley Longman.
—Tufte, E. R. 1983. The Visual Display of Quantitative Information, Connecticut: Graphics Press.
—Zumel, N. and Mount, J. 2014. Practical Data Science with R, NY: Manning Publications.

About the Author
Stephan Jou is CTO at Interset. He was previously with IBM and Cognos and holds an M.Sc. in Computational Neuroscience and Biomedical Engineering and a dual B.Sc. in Computer Science and Human Physiology from the University of Toronto. He may be reached at sjou@interset.com.
In this article, the author shares his insights about why security architecture is critical for
organizations and how it can be developed using a practical framework-based approach.
By Seetharaman Jeganathan
Enterprise Security Architecture: Key for Aligning Security Goals with Business Goals
Abstract
Enterprise security architecture is an essential process that
aims to integrate security as a part of business and technolo-
gy initiatives handled by any organization. When the security
goals and objectives are aligned with organizational business
goals and objectives, any organization can make informed
decisions about business ventures and protect organizational
assets from ever-emerging security threats and risks. In this
article, the author shares his insights about why security ar-
chitecture is critical for organizations and how it can be de-
veloped using a practical framework-based approach.
Introduction
Enterprise security architecture (ESA) is a design process where the current state of enterprise security is
analyzed, gaps are identified based on effective risk
management processes, and the identified gaps are fulfilled
by applying cost-effective security controls. It is a life-cycle
process that enables any organization to protect itself from
advanced security threats. Until recently, ESA was a major
technology effort wherein the IT technical team owned the
definition, implementation, and operation of security pro-
cesses and controls. However, this model has created a vac-
uum with respect to business involvement and has failed to
align the IT security functions with the organizational goals
and objectives [11].
Security goals and objectives
Traditionally, information security functions have been pro-
viding confidentiality, integrity, availability, and accountabil-
ity services to information systems and infrastructure. These
services are often referred to as primary goals for informa-
tion security functions. The primary objective is to secure the
overall IT system and business functions as well as support
growth of the underlying business. ESA is a key enabling
factor to ensure that the security goals and objectives are
achieved as per the expectations of the senior management
[11].
Why security architecture?
•	Security architecture is key in aligning security functions with the organization's business functions
•	Without a clearly defined architecture, security solutions cannot be balanced between overprotection and underprotection
•	Security architecture functions enable accountability and help obtain support and commitment from senior management
•	Security architecture functions support IT functions during changes in the business processes
•	Security architecture provides a snapshot of an organization's security posture at any point in time [9]

Enterprise security architecture framework
Figure 1 shows the proposed enterprise security architecture framework discussed throughout this paper.

The framework begins with defining the security strategy, based on the risk profile of the organization. An organization's security requirements are derived mainly from the security threats and risks faced by the organization [4]. These requirements are analyzed in the framework to clearly define a security strategy for the organization. The framework leverages three major factors (people, processes, and technology) to implement the defined strategy across the organization. It is supported by other essential elements such as organizational governance, risk management, and IT governance bodies to effectively achieve total security of the organization. The author has referenced "The Business Model for Information Security" (BMIS) model and designed this article with exclusive focus on the security architecture function. The BMIS model was originally created by Dr. Laree Kiely and Terry Benzel at the USC Marshall School of Business Institute for Critical Information Infrastructure Protection. Later, in 2008, ISACA adopted this model and has been promoting its concepts globally.

Figure 1 – Enterprise security architecture framework (the diagram shows total security supported by organizational governance: executives, board of directors, stakeholders; enterprise risk management: chief risk officer, risk management group; enterprise IT/security governance: CIO, CISO, CSO, etc.; and enterprise architecture: enterprise architects. The security strategy protects company assets, spanning information, corporate, and physical security, across organizational entities such as IT functions, business units, business partners, and customers.)

Even though the proposed security architecture framework is a part of the enterprise architecture, it can also be rolled out separately as a new initiative for organizations that are not yet mature with respect to enterprise architecture. In the sections below, the author shares his practical experiences in implementing the proposed framework with several of his industry customers. The primary goal of the framework is to provide an organization-wide security architecture review process to ensure that security is an integral part of all business-critical systems and processes [2][7].

Note: Since this article focuses on security architecture in general rather than information security architecture specifically, it is appropriate to include corporate security, personnel security, and physical security aspects in this exercise.

People factor
This area focuses on the several actors (people) who must operate together to effectively roll out the proposed framework. As an initial step, the enterprise security architecture group (ESAG) or enterprise security review board (ESRB) is a governance body that must be formed if not already available. The effectiveness of the framework will depend on the involvement and participation of the identified team members. They must fulfill their required roles and responsibilities as effectively as possible. Because human resources are expensive assets for organizations, it is indispensable to get adequate support and commitment from senior management to effectively utilize human resources to protect the interests of the stakeholders. Senior management support can be obtained by developing a charter for the proposed ESA group that identifies the key roles and responsibilities of the group members. It is important to map the goals and objectives of this group to the overall organizational business goals and objectives and portray how this group will enable or support the growth of the underlying business functions [1].

Figure 2 depicts the proposed people factor top-down approach model to form the ESA group.

Figure 2 – People factor (top-down approach) model (the diagram shows a top-down structure: senior management, comprising board members and stakeholders; an enterprise security governance board, comprising the chief risk officer, chief security officer, corporate security head, chief information security officer, and BU heads; and the enterprise security architecture group, comprising security architects, information risk managers, information security managers, and corporate security group members)
The ESA group must consist of people representing all busi-
ness units of the organization such as HR, finance, R&D, IT,
products, manufacturing, etc. It is important to note that the
focus of this group is not only securing the information sys-
tems but also securing the organization with a holistic ap-
proach. Business insights and guidance are essential to derive
a holistic “organization wide” security approach. A top-down
approach will provide necessary commitment and oversight
from senior management; also, when there is a disagreement
between business groups, senior management can liaise and
resolve critical issues. It is extremely important for this group
to cascade the architectural functions and decisions to the
entire organization below and/or above them. The head of this group or its representatives must conduct regular "connect meetings" with the business units to provide security architecture oversight and guidance for all their technology and business initiatives [1].
One of the primary expectations and outcomes of this work-
ing group should be developing security policies and stan-
dards for all organizational functions wherein security is a
key requirement. Security policies are directions by the se-
nior management to the organization on what is allowed and
what is not allowed from the security standpoint. Security
standards are guidelines developed to substantiate/support
each policy and set directions for business units on how to
adhere to the required policies [8].
Note: The author is highly inspired by the series of books, In-
formation Security Policies Made Simple, by Charles Cresson
Wood and recommends them as reference material for creating relevant security policies in any organization. However, the
samples provided in the book should be used as an inspira-
tion and must not be adopted directly without careful review.
The teams working on defining the policies must also take into
consideration industry regulations, country-specific laws, and
compliance requirements before defining the policies.
Process factor
This area focuses on how the security architecture review
process should work in real time at any given organization.
The need for an organization-wide risk management process is now greater than ever because information systems and
technology are widely used for business functions across the
world. Information systems are subject to serious security
threats. Threat agents exploit known and unknown vulnerabilities and cause damage to information systems. This
will impact the confidentiality, integrity, availability, and
accountability goals of security functions. Security breaches can even cause permanent damage to organizations and make them go out of business. Recent laws and compliance
requirements make senior management personally accountable for any negligence in securing their customers' personally identifiable information (PII), financial data, and, in the healthcare industry, protected health information (PHI). Therefore, it is critical and of utmost importance that the senior
management, mid-level, and lower-level employees of an
organization understand their roles and responsibilities in
protecting organization’s resources effectively from security
risks [1].
Enterprise risk management is focused on managing the risks faced by the organization. Security risks are one among several categories of risk an organization faces, and they are often among the most severe. Organizations generally follow widely known risk management frameworks (NIST, ISACA, etc.) or custom frameworks tailored to the organization's culture, laws, and compliance requirements. This article's discussion and illustrations are based on the NIST SP 800-39 risk management process, which suggests that risk management be carried out as a holistic, organization-wide activity that addresses risk from the strategic level to the tactical level. This enables organizations to make informed decisions about their security activities based on the outcome of the risk management process already in place [10].
Figure 3 depicts the NIST risk management process and
multi-tiered organization-wide risk management approach.
Note: As the scope of this paper is not to detail the NIST risk
management process, readers are encouraged to read the NIST
SP 800-39 document to understand the risk management
framework.
An important discussion in SP 800-39 is that information
security architecture is an integral part of an organization’s
enterprise architecture. However, the author from his experi-
ence suggests that organizations that do not have a matured
enterprise architecture yet must also roll out the security ar-
chitecture processes in their IT program initiatives. The pri-
mary purpose of the security architecture review process is to
ensure that specific security requirements are reviewed and
cost-effective security solutions (management, operational,
and technical) are suggested/designed for qualified risks that
must be mitigated as per the risk management strategy. Organizational security requirements can also arise from other factors such as policies, standards, laws, and compliance regulations. These requirements must also flow into the security architecture review process, as depicted in figure 4, and be addressed properly [10].
Figure 3 – NIST risk management process (frame, assess, respond, monitor) and multitiered organization-wide risk management (Tier 1: organization; Tier 2: mission/business processes; Tier 3: information systems)
24 – ISSA Journal | January 2017
Enterprise Security Architecture: Key for Aligning Security Goals with Business Goals | Seetharaman Jeganathan
After security design recommendations are submitted to the
implementation team, it is vital for security architects to pro-
vide required support throughout the implementation pro-
cess and ensure that the proposed design recommendations
are implemented as per the suggestions made in the design
review. This process is referred to as the security implementa-
tion review. After a security control is implemented, thorough quality assurance testing must verify that the control mitigates the underlying risk(s) as required. If any gap in risk mitigation remains, the whole process must be repeated until the underlying risk is mitigated to an acceptable level. In this way, the enterprise security review process can also be aligned with the SDLC process the organization follows for information systems projects [8]. Figure 5 describes the proposed review process flow from beginning to end.
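The iterative flow described above (requirements review, controls review, design review, implementation support, and post-implementation review, repeated until residual risk is acceptable) can be sketched in a few lines of Python. This is purely illustrative: the numeric risk scale, the risk-appetite threshold, and the function name are the author of this sketch's assumptions, not part of any framework cited in this article.

```python
RISK_APPETITE = 2  # acceptable residual risk level (illustrative 0-10 scale)

def review_requirement(initial_risk, mitigation_per_cycle):
    """Repeat the review cycle until residual risk is acceptable; return
    the residual risk and the number of cycles that were needed."""
    residual, cycles = initial_risk, 0
    while residual > RISK_APPETITE:
        # requirements review -> controls review -> design review ->
        # implementation support -> post-implementation QA review
        residual = max(0, residual - mitigation_per_cycle)
        cycles += 1
    return residual, cycles

print(review_requirement(9, 4))  # -> (1, 2): acceptable after two cycles
```

The loop mirrors the "Risk Mitigation?" decision in figure 5: only when the post-implementation review shows the risk within appetite does the process stop.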
Technology factor
This area focuses on the technology aspects of the information systems layer supporting the business functions.
Enterprise security architecture review process
An enterprise security architecture review process is primarily conducted to derive the most appropriate security solution(s) for the qualified requirements. Enterprise security architecture review board members should meet to review the security requirements and brainstorm possible cost-effective solutions, which may be management, operational, or technical controls, or a combination of these, to mitigate risks to an acceptable level. These steps are referred to as the security requirements review and the security controls review. Once a cost-effective security control is identified, a security design review is conducted for functional and non-functional requirements, depending on the type of security control(s) identified [8]. A high-level summary of the security design review process for each control type is presented in table 1; this is not an exhaustive list of design review steps but a set of suggestions to kick-start the process [8].
Management design review:
• Conduct current-state assessment and identify gaps
• Review the associated security policies and standards
• If required, create new security policies/standards or modify the existing policies to meet the security requirements
• Document and submit the recommendations
Operational design review:
• Conduct current-state assessment and identify gaps
• Suggest new operational models/processes, or amendments to existing ones, to meet the security requirements
• Document and submit the recommendations
Technical design review:
• Conduct current-state assessment and identify gaps
• Suggest new controls or changes to the existing technical controls to meet the security requirements
• Conduct cost-benefit and security return on investment (SROI) analyses to prove the cost effectiveness of the solutions
• Provide functional design inputs
• Provide non-functional design inputs
• Document and submit the recommendations
Table 1 – Security controls design review (high-level summary)
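The cost-benefit/SROI step in the technical design review can be illustrated with the commonly used return-on-security-investment formulation based on annualized loss expectancy (ALE). The dollar figures and function names below are hypothetical examples, not values prescribed by this article:

```python
def ale(sle, aro):
    """Annualized loss expectancy = single loss expectancy x annual rate of occurrence."""
    return sle * aro

def rosi(ale_before, ale_after, control_cost):
    """Return on security investment: (risk reduction - control cost) / control cost."""
    return (ale_before - ale_after - control_cost) / control_cost

# Example: a control costing $20,000/yr reduces expected breach losses
# from $100,000/yr to $30,000/yr.
before = ale(sle=500_000, aro=0.2)    # $100,000 expected annual loss
after = ale(sle=500_000, aro=0.06)    # $30,000 expected annual loss
print(f"ROSI: {rosi(before, after, 20_000):.0%}")  # -> ROSI: 250%
```

A positive result suggests the control is cost effective; a negative one flags a control whose cost exceeds the risk reduction it buys.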
Figure 4 – Enterprise security architecture process drivers (security requirements flowing from the risk management framework and other drivers into the enterprise security architecture review process)
Figure 5 – Enterprise security architecture process flow (start → security requirements review → security controls review → management/operational/technical controls decision → security design review → security implementation support → security post-implementation review → risk mitigated? if no, repeat; if yes, stop)
The architecture team should then create a detailed inventory of the current state of information security controls in place at the physical, network, infrastructure, and application levels [3]. There are several possible approaches to assessing the current-state security controls of the physical, network, infrastructure, and application layer components; one such approach is creating a 3×3 matrix of current security controls classified into preventive, detective, and corrective types (table 2) [8].
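One lightweight way to capture such a matrix is a nested mapping keyed by control layer and control type; empty cells then surface as candidate inputs to the subsequent gap analysis. The control names below echo table 2, but the cell assignments are illustrative assumptions:

```python
# Current-state controls matrix: layer x control type (cf. table 2).
controls = {
    "administrative": {
        "preventive": ["acceptable usage policy", "change management policy"],
        "detective": [],
        "corrective": ["effective change management process"],
    },
    "technical": {
        "preventive": ["data encryption", "SSL/TLS transport encryption",
                       "secure configuration", "access control",
                       "secure coding/review"],
        "detective": ["vulnerability scanning"],
        "corrective": ["effective incident response"],
    },
    "physical": {"preventive": [], "detective": [], "corrective": []},
}

def gaps(matrix):
    """List (layer, type) cells with no controls: candidates for the gap analysis."""
    return [(layer, ctype) for layer, row in matrix.items()
            for ctype, items in row.items() if not items]

print(gaps(controls))  # empty cells in the administrative and physical rows
```

The same structure can be built once per application or system and diffed against the desired ("to-be") state.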
After completing the above exercise at all levels, the organization's current ("as-is") security posture will have been documented. The next critical steps are:
a) Conduct a gap analysis of the current-state security
b) Conduct a risk analysis of critical information systems
An effective risk management process is key here to identifying the gaps in the current state and the residual risk on business-critical information systems. The architecture team must review the gap analysis and risk analysis reports to derive the requirements for the desired ("to-be") state architecture that will adequately protect the identified assets. The requirements must be presented to senior management, the governance board, etc., for approval, funding, and adequate implementation support. It is extremely important to map the goals of the security architecture to business goals and objectives in order to garner support from management and the board of directors [5].
The process begins with creating blueprints of the current state ("as-is") of the information systems layer (logical and technical) in the organization's primary and secondary data centers. Figure 6 provides a high-level logical view of the primary data center.
After a high-level blueprint is created, the team has to move on to creating more focused blueprints covering, but not limited to, the following areas:
1. Detailed blueprints of the network segments, such as the DMZ layer, internal network layers (intranet), and external networks such as partner networks, the Internet, VPN networks, etc.
2. Detailed blueprints of the infrastructure layer, such as servers, databases, mail servers, file servers, etc.
3. Requests to the individual application teams (mission critical and all others) to derive their application-specific architecture diagrams if not already available [4]
Figure 7 depicts a high-level logical and physical deployment
sample architecture blueprint of an e-commerce application
in a financial services company.
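A minimal sketch of how the resulting blueprint inventory might be tracked follows; only the area names come from the list above, while the segment and application names are hypothetical:

```python
# Illustrative blueprint inventory covering the areas listed above.
blueprints = {
    "network": ["DMZ", "intranet", "partner networks", "Internet", "VPN"],
    "infrastructure": ["servers", "databases", "mail servers", "file servers"],
    "applications": {
        "e-commerce": "logical/deployment diagram available",
        "payroll": None,  # None -> diagram still to be requested from the team
    },
}

# Applications whose architecture diagrams are not yet available (step 3 above):
missing = [app for app, diagram in blueprints["applications"].items()
           if diagram is None]
print(missing)  # -> ['payroll']
```

Even a simple tracking structure like this makes it easy to see which application teams still owe the architecture group a diagram before the current-state inventory can be considered complete.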
After completing the above exercise, the architecture team will have the relevant information for the current-state controls review.
Administrative controls – Preventive: acceptable usage policy, change management policy; Corrective: effective change management process
Technical controls – Preventive: data encryption, SSL/TLS transport encryption, secure configuration, access control, secure coding/review; Detective: vulnerability scanning; Corrective: effective incident response
Physical controls – Preventive/Detective/Corrective: application-specific control(s), if any
Table 2 – Sample controls review matrix (e-commerce application)
Figure 6 – Organization data center – logical view
Figure 7 – Sample e-commerce application – logical/deployment architecture
• US Department of Defense architecture framework, among others
Organizations can leverage these frameworks and customize them to meet their security architecture requirements. As reviewing these frameworks in detail is beyond the scope of this article, the author simply notes that organizations must choose an architecture framework that is flexible enough to support business growth and to adapt to changes in the business environment and processes [5].
Conclusion
The enterprise security architecture process streamlines an organization's security functions to achieve effective total security. In this article, the author proposed an enterprise security architecture framework and detailed its pillars (people, process, and technology). Whether to adopt this framework or an industry-recognized one is a decision for each organization, depending on its culture, business environment, and industry compliance requirements. Any framework will be effective only when it is adequately supported; to obtain the desired outcomes, it must be understood and backed by senior management. In today's business environment, a security architecture framework must be business oriented rather than technology centric in order to deliver the expected returns on security investments.
References
1. Architecture Compliance. (n.d.), The Open Group. Re-
trieved November 09, 2016, from http://pubs.opengroup.
org/architecture/togaf9-doc/arch/chap48.html.
2. Breithaupt, J., & Merkow, M. S. Principle 11: People,
Process, and Technology Are All Needed to Adequately
Secure a System or Facility, Information Security Prin-
ciples of Success, Pearson IT Certification (2014, July 4).
Retrieved from http://www.pearsonitcertification.com/
articles/article.aspx?p=2218577&seqNum=12.
3. Building an e-commerce Solution Architecture, Dami-
con. Retrieved November 9, 2016, from http://www.dam-
icon.com/resources/Architecture_Practices.pdf.
As the technology environment is ever changing due to factors such as changes in business processes, technology adoption, and automation, the security architecture group must remain active and vigilant in order to adapt to these changes and support the business without disruption. It is essential to remember that IT and security functions must be business enablers rather than roadblocks.
Note: Organizations that are mature in their information security processes have adopted one or more of the industry frameworks listed below to implement the minimum required security controls and the desired security architecture:
• ISO 27001:2013 information security standard
• NIST Cybersecurity Framework
• ISACA COBIT 5 for Information Security
• SANS Critical Security Controls, etc. [5]
Incorporating security architecture in enterprise
architecture
MIT Sloan School’s Center for Information Systems Research
(CISR) defines enterprise architecture as “the organizing log-
ic for business process and IT infrastructure reflecting the
integration and standardization requirements of the firm’s
operating model. The enterprise architecture provides a long-
term view of a company’s processes, systems, and technol-
ogies so that individual projects can build capabilities—not
just fulfill the immediate needs” [12].
In simple terms, enterprise architecture provides a logical view of the enterprise business layer (functions and processes), the IT infrastructure layer, the data layer, and the applications layer. As enterprise architecture is not the focus of this article, the author will simply emphasize that security architecture should be part of the enterprise architecture and provide adequate security for all the architecture functions. Figure 8 depicts a security-enabled enterprise architecture view of an organization [13].
Security architecture frameworks
There are several industry frameworks that provide architec-
tural approaches for meeting security requirements. Some of
them to be noted are:
•	 SABSA comprehensive framework for enterprise security
architecture and management
•	 Zachman framework of IBM
•	 The Open Group architecture framework
•	 The Open Security architecture group framework
•	 UK Ministry of Defense architecture framework
Figure 8 – Security-enabled enterprise architecture view (business, information, and technology architecture layers within the enterprise architecture, wrapped by security)
4. Gunnar Peterson, Security Architecture Blueprint,
Arctec Group LLC, 2007.
5. ISACA. CISM Review Manual 2015, ISACA (November
1, 2014).
6. ISACA. An Introduction to the Business Model for In-
formation Security. Rolling Meadows IL: ISACA, 2006
– http://www.isaca.org/knowledge-center/research/re-
searchdeliverables/pages/an-introduction-to-the-busi-
ness-model-for-information-security.aspx.
7. Kreizman, G. (2011, October 4). An Introduction to In-
formation Security Architecture. Retrieved from http://
gartnerinfo.com/futureofit2011/MEX38L_D2 mex38l_
d2.pdf.
8. Landoll, D. J. (2016). Information Security Policies, Pro-
cedures, and Standards: A Practitioner’s Reference. Boca
Raton: CRC Press, Taylor & Francis Group.
9. Olzak, T. (2010, September). Build an Enterprise Security Architecture-based Framework. Retrieved January 16, 2011, from CBS Interactive/TechRepublic: http://www.techrepublic.com/downloads/build-an-enterprise-architecture-base-framework/2212421.
10. Publication: SP 800-39. Managing Information Security
Risk: Organization, Mission, and Information System
View, National Institute of Standards & Technology –
http://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecial-
publication800-39.pdf.
11. Ritchot, B.  2013.  An Enterprise Security Program and
Architecture to Support Business Drivers.  Technology
Innovation Management Review, 3(8): 25-33. http://tim-
review.ca/article/713.
12. Ross, J. W., Weill, P., & Robertson, D. Enterprise Archi-
tecture as Strategy: Creating a Foundation for Business
Execution. Boston, MA: Harvard Business School Press
(2006).
13. Security Architecture: A New Hype for Specialists,
or a Useful Means of Communication? Retrieved
November 9, 2016, from https://www.pvib.nl/down-
load/?id=11542823.
About the Author
Seetharaman Jeganathan, CISSP, has more than 14 years of experience in IT security consulting and program management. He mainly focuses on information
systems risk assessments, identity and access
management (IAM) solution strategy defini-
tion, architecture definition, and design and implementation
of IAM security solutions. He also specializes in cloud-based
applications security consulting and implementation of IAM
solutions in cloud. He may be reached at seetharaman.jegana-
than@gmail.com.
Abstract
The cybersecurity industry faces a shortage of qualified
professionals. Part of the solution is to better deliver cyber-
security education in colleges and universities. While other
professionals have addressed the issue by proposing a for-
mal university curriculum, this article approaches the sub-
ject from the perspective of professional security personnel
teaching as adjunct instructors. The purpose of this article
is to equip cybersecurity professionals working as adjunct
instructors with resources to deliver a more efficient and ef-
fective class.
A key battle in the cybersecurity war today is not being fought with firewalls and encryption. It is not being fought between cybersecurity professionals and cybercriminals. And it is not being fought in corporate networks and on the Internet. No, this struggle is between academia and the information security profession. It is being fought in the classroom.
Challenges
The cybersecurity industry continues to face a shortage of qualified professionals, and prominent information security organizations have stated that the shortage is real. For example, (ISC)2 predicts that the cyber world will be short 1.5 million cybersecurity professionals by 2020.1 Furthermore, Enterprise Strategy Group (ESG)2 research shows that 28 percent of organizations have a "problematic shortage" of IT security skills.3 Meanwhile, as a society we are living in the world of the Internet of Things and ransomware. The threats are numerous, the criminals are resilient, and the rewards are rich.
1 Jon Oltsik, "Creating a Cybersecurity Center of Excellence," Network World (2015) – http://www.networkworld.com/article/3016593/security/creating-a-cybersecurity-center-of-excellence.html.
2 Enterprise Strategy Group – http://www.esg-global.com/.
The US Bureau of Labor Statistics projects strong growth in the computer and mathematical occupations, which include cybersecurity.4 In fact, the bureau projects that this group of occupations will produce more than 1.3 million job openings within the next six years. This equates to an 18 percent increase, which it labels as faster than average. For information security analysts the outlook is even better: the occupation is expected to grow at a rate of 36.5 percent. The bureau attributes this fast growth to the increased use of electronic medical records and mobile technology.
However, some of the greatest growth areas come with higher educational requirements, as illustrated in table 1. The bureau reports that the fastest growing occupations within the business and financial operations sector will require at least a bachelor's degree. For the computer and mathematical occupations the educational requirements are even higher: as seen in table 2, the bureau projects that jobs requiring a master's degree will grow at the fastest rate in this group.
Solutions
The shortage of qualified cybersecurity professionals is a
complex problem, which means there is no easy answer. One
of the issues that must be addressed is properly educating a
cybersecurity workforce. In their 2014 article, “Application of
Pedagogical Fundamentals for the Holistic Development of
3 Jon Oltsik, “Creating a Cybersecurity Center of Excellence,” Network World (2015)
– http://www.networkworld.com/article/3016593/security/creating-a-cybersecurity-
center-of-excellence.html.
4 Bureau of Labor Statistics – http://www.bls.gov/opub/mlr/2013/article/occupational-
employment-projections-to-2022.htm.
The Role of the Adjunct in
Educating the Security Practitioner
By Karen Quagliata – ISSA member, St. Louis Chapter
The cybersecurity industry faces a shortage of qualified professionals. Part of the solution is to
better deliver cybersecurity education in colleges and universities. The purpose of this article is to
equip cybersecurity professionals working as adjunct instructors with resources to deliver a more
efficient and effective class.
30 – ISSA Journal | January 2017
ISSA
DEVELOPING AND CONNECTING
CYBERSECURITY LEADERS GLOBALLY
As seen in figure 2, the KBP model is a multi-dis-
cipline approach to information assurance. Mul-
tiple schools within the university play a role in
educating the security professional. They include
the business school, the IT school, and the law
school.7
Education transformation
Such formal approaches to education delivery
as the KBP model are vital to the cybersecu-
rity workforce. However, they rely on a more
traditional education delivery model. What is
becoming more evident, though, is that higher
education is undergoing a transformation. That
transformation is a move from a staff of tenured
professors with degrees in education to the use
of adjunct instructors who are often profession-
als working in the industry. For a multitude of
reasons, colleges and universities are now rely-
ing more heavily on adjunct instructors. In fact,
in 2014 adjuncts made up more than 70 percent
of all college and university faculty.8
Adjunct in-
structors teach classes on a contract basis. Often
a master’s degree in the subject that they will
teach and professional experience are the main
requirements. They can be trained teachers, but many are in-
dividuals who work in a given profession full-time and teach
part-time. It is the latter category where the cybersecurity
world must focus.
It is the responsibility of cybersecurity professionals to teach
cybersecurity courses in an effective and efficient manner.
Part of the problem, though, is that often a cybersecurity ad-
junct is handed only a course name, brief course description,
and textbook name along with his/her contract. In some cas-
es another teacher who previously taught the class will share
7 Ibid.
8 J. Fruscione, “When a College Contracts ‘Adjunctivitis,’ It’s the Students Who Lose,”
PBS Newshour - http://www.pbs.org/newshour/making-sense/when-a-college-
contracts-adjunctivitis-its-the-students-who-lose/.
Cybersecurity Professionals,” authors Barbara Endicott-Pop-
ovsky and Viatcheslav Popovsky provide a wealth of informa-
tion for universities looking to develop a formal curriculum
for information assurance education.5
Figure 1 illustrates the
Kuzmina-Bespalko-Popovsky (KBP) Pedagogical Model de-
veloped at the Center for Information Assurance and Cyber-
security (CIAC) at the University of Washington.6
The model
takes into consideration such influences on education as the
current job market as well as technical, economic, cultural
and political trends.
5 Barbara Endicott-Popovsky and Viatcheslav Popovsky, “Application of Pedagogical
Fundamentals for the Holistic Development of Cybersecurity Professionals,” ACM
Inroads (2014 March) – https://niccs.us-cert.gov/sites/default/files/documents/files/
p57-endicott-popovsky.pdf?trackDocs=p57-endicott-popovsky.pdf.
6 Ibid.
Table 2 – Computer and mathematical occupations employment by educational requirement, 2012 and projected 2022 (employment in thousands; source: US Bureau of Labor Statistics)
Education level | 2012 | 2022 | Change (number) | Change (percent)
Bachelor's degree | 2,893.1 | 3,415.2 | 522.1 | 18.0
Some college, no degree | 547.7 | 658.5 | 110.8 | 20.2
Associate's degree | 316.1 | 356.6 | 40.6 | 12.8
Master's degree | 31.1 | 39.2 | 8.2 | 26.3
Doctoral or professional degree | 26.7 | 30.8 | 4.1 | 15.3
Table 1 – Business and financial operations occupations employment by educational requirement, 2012 and projected 2022 (employment in thousands; source: US Bureau of Labor Statistics)
Education level | 2012 | 2022 | Change (number) | Change (percent)
Bachelor's degree | 5,344.3 | 6,131.4 | 787.1 | 14.7
High school diploma or equivalent | 1,809.7 | 1,921.4 | 111.7 | 6.2
Postsecondary nondegree award | 13.5 | 12.8 | –0.7 | –5.3
Figure 1 – The KBP Pedagogical Model: CIAC as a pedagogical system (new students enter and IA professionals emerge; teachers, students, goals, content, and didactic processes operate within a dynamic social/professional context shaped by the job market and by technological, economic, cultural, and political trends)
Figure 2 – Multi-disciplinary approach to information assurance (policy, procedures and practices, mechanisms, and security awareness training contribute to a secure system, drawing on the business school, iSchool, Evans School, law school, computer science, electrical engineering, and technical communication programs, with IA audit feedback)
his or her syllabus and notes with the cybersecurity adjunct.
Predictably, the results can be varying degrees of class quality.
Recommendations
Cybersecurity professionals teaching as adjuncts need a
strong support system of resources in the form of profession-
al organizations, industry research and publications, and
assistance from experienced cybersecurity adjuncts. Figure
3 shows the suggested model needed to produce an effective
adjunct-led cybersecurity classroom.
Figure 3 – Adjunct-led cybersecurity classroom model (inputs include experience, case studies, assignments, speakers, professional organizations, offsite learning, field trips, formal tests, and books)
The following are some recommendations for cybersecurity
professionals who are entering the world of adjunct teaching.
Topics
Be sure to cover the key topics and pain points of the security industry today. The (ISC)2 Common Body of Knowledge (CBK)9 is a great foundation on which to build the class, but do not stop there. Go beyond the CBK basics: address the reality of implementing security in a corporate setting, the struggle of trying to build a security culture when senior leadership is lukewarm (at best) to the idea, and the funding and threat landscapes specific to particular industries.
•	 Be sure to address the sometimes forgotten risk areas such
as third-party vendor management. Data breaches that can
be attributed to third-party vendors are still occurring.
For example, a 2015 study by the Ponemon Institute shows
that 39 percent of respondents attributed their healthcare
organization’s data breach to a third-party issue.10
• Adjunct instructors should certainly discuss recent data breaches in their classes, but they should do so in an analytical manner. For example, compare three major data breaches and identify any similar aspects (compromised credentials, third-party backdoors, unpatched systems, etc.).
•	 Adjunct instructors should use industry resources to ad-
dress key aspects of cybersecurity, such as the Verizon
9 (ISC)2
Common Body of Knowledge –https://www.isc2.org/cbk/default.aspx.
10 Ponemon Institute, Fifth Annual Benchmark Study on Privacy & Security of
Healthcare Data Report – https://iapp.org/media/pdf/resource_center/Ponemon_
Privacy_Security_Healthcare_Data.pdf.
Data Breach Investigations Report11
and Ponemon Insti-
tute’s Cost of a Data Breach Report.12
These two free re-
sources will not only help the adjunct address causes of
breaches, but also how much they will cost an organiza-
tion.
•	 Adjunct instructors owe it to their students to address
certifications. Students of higher education are painfully
aware of the cost of a degree. Some may question whether
it is better to forgo the degree and just pursue the certifi-
cation route. Share a list of some of the more in-demand
security certifications and what is required to obtain them.
Go the extra mile and study the want ads to see what certi-
fications employers want their candidates to have.
Tests
Strongly consider using standardized tests for introductory security classes only. It can be assumed that at higher course levels the students already have a firm grasp of basic security topics such as the BIOS, authentication methods, the basics of encryption, etc. In higher-level courses, adjunct instructors should be looking for analytical skills in their students. Writing in a clear, concise, analytical manner is vital in any industry, including information security.
Assignments
Adjuncts should make their assignments as realistic as possi-
ble. The goal of education is to expand the mind, but it should
also prepare students for the real world. A good way to make
realistic assignments is to draw upon actual work experienc-
es, or use prewritten case studies. One approach is to pro-
pose a business problem that must be solved by IT. Ask the
students to research and propose a solution, complete with
a cost summary and risk assessment. Adjuncts can expand
upon that assignment by instructing the students to present
the solution both to a technical audience and a business au-
dience. Another possible assignment is to have students on
a weekly basis choose a topic in security news and write a
summary. While not an overly complex assignment, it forc-
es students to get into the habit of staying abreast of current
issues and topics in the industry. It also helps students hone
their writing skills.
Textbooks
If the university does not already have a textbook in mind
for the class, then the adjunct should consider using certifica-
tion study guides. These resources lend themselves to a good
foundation because of the breadth and depth of the material
they cover.
11 Verizon Data Breach Report – http://news.verizonenterprise.com/2016/04/2016-data-breach-report-info/.
12 Ponemon Institute’s Cost of a Data Breach Report – http://www-03.ibm.com/security/data-breach/.
Figure – The adjunct-led cybersecurity classroom draws on experience, case studies, assignments, speakers, professional organizations, offsite learning, field trips, formal tests, and books.
32 – ISSA Journal | January 2017
The Role of the Adjunct in Educating the Security Practitioner | Karen Quagliata
Professional organizations
The classroom is also a good platform for encouraging students to consider how professional organizations, such as ISSA, can help them at any stage of their careers. The adjunct
should provide a list of website links to pertinent organiza-
tions. The adjunct can also invite presidents of local chap-
ters to speak during a class, or ask permission for students to
come to a future event on a trial basis.
Innovation
Adjuncts should try to go beyond the typical lecture. Infor-
mation security is a vibrant, constantly changing industry.
Traditional lectures do not do it justice. While it is not always
possible to avoid a lecture while speaking about the vulner-
abilities of a kernel, there are other opportunities to educate
without producing glazed stares.
•	 Field trips – Adjuncts should take the education outside
of the classroom. Even professionals working in the indus-
try welcome a field trip. Reach out to local law enforce-
ment to see if the class can visit a forensics lab, or if a local
company will allow students to view its network monitor-
ing system.
•	 Guest speakers – Guest speakers can also provide an in-
novative way to impart knowledge. No matter how excit-
ing an adjunct may be as a speaker, students always seem
to pay more attention to a guest speaker. Adjuncts can use
a guest speaker to reinforce the topic of the day. For exam-
ple, if the class discussion is authentication, the adjunct
can bring in an expert on biometrics. Sources for speak-
ers are abundant. Adjuncts can use professional organi-
zations and/or colleagues as a source for speakers. Local
law enforcement can be another source. Finally, with web
conferencing there is no geographical boundary.
•	 Video – Adjuncts should not be afraid to incorporate vid-
eo into their lectures. YouTube provides a plethora of se-
curity-related videos and instructions that will go beyond
the typical lecture. There are also plenty of free webinars
available from vendors and professional organizations.
Along those lines, the adjunct can have students play free
interactive phishing games available online.
Implications for professional security organizations
The changing dynamics of the information security indus-
try and the higher education system are providing a perfect
storm for professional security organizations to play a greater
role in promoting and supporting the profession. Profession-
al security organizations, such as ISSA, can play a vital role
in helping to ensure the level of quality delivered by security
professionals teaching in an adjunct capacity. Professional se-
curity organizations can be a repository of resources to help
the new—or established—adjunct instructor. Some of the
possible ways that organizations can help include:
•	 Classroom material – Professional security organiza-
tions can solicit their members to submit case studies, test
questions, and project ideas to their local chapters to help
establish a repository of classroom material. Participation
can be counted toward CPEs.
•	 Speaker’s bureau – Professional security organizations
contain a wealth of experience. Organizations can estab-
lish a speaker’s bureau to help bring some of that expe-
rience to the classroom. The local chapter can establish a
repository of speakers based on their area of expertise and
availability. Participation can be counted toward CPEs.
•	 Mentoring program – Almost certainly there will be
members of a professional security organization who are
already working as adjunct instructors. Organizations
can establish a mentoring program where experienced ad-
juncts can help new adjuncts with lesson plans, tips, and
effective teaching strategies. Participation can be counted
toward CPEs.
•	 Textbook recommendations – Professional security orga-
nization members can also provide support to the adjunct
community by recommending potential textbooks or sup-
plemental reading material for cybersecurity classes.
Conclusion
The growing cybersecurity field is expecting its personnel to
have terminal degrees. At the same time colleges and univer-
sities are relying upon adjunct instructors to help supplement
their staff of tenured professors. The result is that cyberse-
curity programs at colleges and universities are turning to cybersecurity professionals to fill adjunct instructor roles.
Often cybersecurity adjunct instructors are thrown into the
classroom with little support. To help ensure a consistent lev-
el of quality of cybersecurity classes, cybersecurity adjunct
instructors should follow guidelines outlined in this article.
Furthermore, professional security organizations should act
as a support system by providing classroom material and ex-
perienced professionals.
About the Author
Karen Quagliata, PhD, PMP, CISA, CISSP,
is an information security analyst working
in risk management and governance. She is
also an adjunct instructor for multiple uni-
versities and colleges. Karen can be reached
at Karen.quagliata@gmail.com.
Fragmentation in Mobile Devices
By Ken Smith
The purpose of this article is to explore the threat to consumers posed by mobile device fragmentation. The author categorizes mobile device fragmentation by operating system, manufacturer, and carrier, exploring the vulnerabilities at each level.
Thirty seconds and a USB cable are all it takes to compromise an iPhone 4. Through the execution of an “SSH Ram Disk” attack, a malicious attacker can achieve root access on a victim device without jailbreaking or causing permanent changes to the hardware or operating system. The vulnerability responsible for this exposure can be found at the hardware level, and as a result of device fragmentation, it is unlikely that Apple will ever be able to directly address the problem for affected consumers.
Mobile device fragmentation is a phenomenon that occurs at a point in time when groups of mobile users are running various versions of an operating system across a variety of hardware platforms. As mobile devices age, software developers and hardware manufacturers cease to provide updates for older devices, leaving users, often unknowingly, with a wide range of potential problems ranging from minor bugs to compromise-level security flaws.
Device fragmentation happens naturally over time as new hardware is released that includes additional features or system requirements that cannot be supported by older models. Of the three major kings of the mobile market, Apple iOS devices fare far better than their Android and Windows phone counterparts when it comes to device fragmentation. As of August 2015, iOS 8 had been adopted by 85 percent of Apple device users; Android Lollipop, Google’s latest release at the time, had only an 18 percent adoption rate despite there being only a two-month difference in their release times.1
1 Whitney, Lance. “iOS 8 Hits 85% Adoption Rate; Android Lollipop Only at 18%.” 5 Aug. 2015. Web. 24 Feb. 2016 – http://www.cnet.com/news/ios-8-hits-85-adoption-rate-android-lollipop-only-at-18/.
2 OpenSignal. “Android Fragmentation Visualization.” Aug. 2015. Web. 19 Jan. 2016 – http://opensignal.com/reports/2015/08/android-fragmentation/.
Fragmentation by OS
Apple’s success in this area is primarily due to the company’s control over both the hardware and software components of the iOS line of devices. With all mobile hardware, there are three primary stakeholders, aside from the end user. The developers create the operating system. Device manufacturers, of course, are responsible for actually building or assembling phones, tablets, and additional hardware. And carriers supply the means through which the devices communicate with the world. Apple comprises two of these three stakeholders, a position the tech giant has been able to leverage to limit carrier incursions into its operating system and hardware. Device fragmentation among Android and Windows mobile devices, on the other hand, is significantly more complicated (figure 1).
Figure 1 – OS and Android Versions on the Market in 2015 (OpenSignal, 2015)
Due in large part to the open nature of the Android operating system, there are hundreds of Android-compatible device manufacturers, from the well-known Samsung to smaller companies like Blu Dash2 (figure 2). Each manufacturer is able to make alterations to the operating system itself, which can include useful features such as Samsung’s or HTC’s fingerprint scanners (which were not natively supported by Android until recently) or proprietary front ends like Samsung’s
TouchWiz. Additionally, carriers such as Verizon, AT&T, and
T-Mobile are able to leverage their control of the airwaves to
drop proprietary third-party software onto many Android
devices before they hit the market. The NFL Mobile applica-
tion and Verizon’s line of in-house applications are examples
of such installations. These applications cannot normally be
removed by the user without gaining root access to the phone
or tablet, which, of course, creates additional security prob-
lems and exposures.
Microsoft controls a much smaller portion of the mobile mar-
ket; according to the International Data Corporation (IDC),
in Q2 2015, a mere 2.6 percent of the world’s mobile users
were on Windows-powered phones.3
Nevertheless, Windows
mobile devices, specifically Windows phones, have historical-
ly been subjected to significant device fragmentation (though
it is not as big a problem as it is with Android) due in large
part to a lack of support from carrier brands. When Micro-
soft’s Denim update for Windows Phones was first released in
September 2014, it took months for T-Mobile and AT&T to
push the update to compatible phones. Verizon was quicker
to respond, but only after it had continuously refused to push
the previously available update, Cyan, to compatible devices.4
In response to these kinds of problems, in the spring of 2015
Microsoft announced that it would be taking over updates for
all Windows 8 devices, assuring customers with compatible
devices that they would receive an upgrade to Windows 10
in a timely manner.5
Unfortunately, at the time of the an-
nouncement there were still many users on Windows 7 devic-
es, many of which would not be able to make the upgrade. In
November 2014, nearly 17 percent of Windows phones on the
market were running Windows 7, and a large portion of those
3 IDC. “Smartphone OS Market Share.” International Data Corporation. 2015. Web. 10
Feb. 2016 – http://www.idc.com/prodserv/smartphone-os-market-share.jsp.
4 Murphy, David. “Microsoft Going after Smartphone Fragmentation in Windows 10
Mobile.” PC Magazine. 16 May 2015. Web. 10 Feb. 2016 – http://www.pcmag.com/
article2/0,2817,2484312,00.asp.
5 Bott, Ed. “Microsoft Says It’s Taking Over Updates for Windows 10 Mobile Devices.”
ZDNet, 25 May 2015. Web. 10 Feb 2016 – http://www.zdnet.com/article/microsoft-
says-its-taking-over-updates-for-windows-10-mobile-devices/.
devices had already been unable to make the
jump to Windows 8/8.16
(figure 3). This has
subsequently left a significant portion of the
Windows Phone customer base unsupported,
necessitating an upgrade to newer hardware.
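A fragmentation statistic like the 17 percent figure above is simply the share of the installed base running versions that no longer receive updates. A minimal Python sketch, using illustrative numbers in the spirit of the Q4 2014 Windows Phone distribution rather than exact values from the cited report:

```python
# Illustrative version shares (not exact figures from the cited report).
version_share = {
    "WP7": 0.17,     # no upgrade path to Windows Phone 8/8.1
    "WP8": 0.33,
    "WP8.1": 0.50,
}

# Versions still receiving vendor updates in this hypothetical snapshot.
supported = {"WP8", "WP8.1"}

def unsupported_share(shares: dict, supported: set) -> float:
    """Fraction of the installed base stranded on unsupported versions."""
    return round(sum(s for version, s in shares.items()
                     if version not in supported), 4)
```

With these inputs the stranded share is just the WP7 slice; the same computation applies to any OS distribution chart, Android or iOS included.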
Consumer vulnerabilities
It goes without saying that device fragmentation presents a
number of serious problems for consumers. First-party ap-
plications and changes to the underlying operating system
introduced by device manufacturers can put users at serious
risk. Ryan Welton, a security researcher with NowSecure, re-
vealed at BlackHat 2015 that Samsung’s update process for its
customized version of the SwiftKey keyboard application could
be exploited to create a man-in-the-middle scenario through
which remote code execution could be achieved. According
to a 2015 report by Dan Goodin for Ars Technica:7
Attackers in a man-in-the-middle position can imperson-
ate the [update] server and send a response that includes
a malicious payload that’s injected into a language pack
update. Because Samsung phones grant extraordinarily
elevated privileges to the updates, the malicious payload is
able to bypass protections built into Google’s Android op-
erating system that normally limit the access third-party
apps have over the device.
Welton successfully demonstrated the attack at the confer-
ence, compromising a variety of Samsung phones attached to
Verizon, Sprint, and AT&T carrier networks.8
The vulnera-
bility was quickly patched by Samsung, but carrier networks
were slow to push the update to affected devices.
Often times, very serious vulnerabilities are discovered and
problems arise when patches must be written, tested, mod-
ified, and verified across a variety of devices, operating sys-
tem versions, and carrier or manufacturer builds. On July
27, 2015, Joshua Drake of Zimperium announced that he had
6 Kumar, Ramesh. “Microsoft Hopes 0% Windows Phone 8.x OS fragmentation;
Windows 10 Is the Future.” Inferse, 15 Nov 2014. Web. 10 Feb 2016 – http://www.
inferse.com/19397/microsoft-hopes-0-windows-phone-8-x-os-fragmentation-
windows-10-future/.
7 Goodin, Dan. “New Exploit Turns Samsung Galaxy Phones into Remote Bugging
Devices.” Ars Technica. 16 Jun 2015. Web. 10 Feb. 2016 – http://arstechnica.com/
security/2015/06/new-exploit-turns-samsung-galaxy-phones-into-remote-bugging-
devices/.
8 Welton, Ryan. “Remote Code Execution as System User on Samsung Phones.”
NowSecure Blogs. NowSecure, 15 June 2015. Web. 19 Jan. 2016 – https://www.
nowsecure.com/blog/2015/06/16/remote-code-execution-as-system-user-on-
samsung-phones/.
Figure 2 – Android Market Shares by Manufacturer (OpenSignal, 2015)
Figure 3 – Q4 2014, Windows Phone OS Distribution
(Kumar, 2014)
discovered a variety of Android vulnerabilities that are now
collectively known as “Stagefright.” The Stagefright bug af-
fected Android version 2.2 (“Froyo”) and newer. When ex-
ploited successfully, the vulnerability allowed for an attacker
to perform remote code execution and privilege-escalation
attacks against affected devices. Zimperium estimated that,
at the time of the initial announcement, a successful exploit
could be triggered on roughly fifty percent of affected devic-
es without any user interaction.9
Aside from the seriousness
of the vulnerability itself, most concerns about Stagefright
stemmed from the hurdles to patching presented by device
fragmentation. In spite of the fact that Google fixed the prob-
lem in its code base days after being notified in April 2015, it
took months for major manufacturers and carrier networks
to push the patches out to affected devices. Even worse, cer-
tain older devices may never be patched for Stagefright.10
To
cap off the crisis, the slow release of patches and updates was
complicated when a second round of Stage-
fright vulnerabilities was discovered and
publicly released in early October of that
same year.11
Logo for the Stagefright vulnerability (Rundle, 2015)
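A rough triage of Stagefright exposure can be sketched from the figures above. The version floor comes from the article (Android 2.2 and newer were affected); the patch-level cutoff below is purely illustrative, since actual fix dates varied by manufacturer and carrier:

```python
def parse_version(v: str) -> tuple:
    """Turn '5.1.1' into (5, 1, 1) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

# The article reports Android 2.2 ("Froyo") and newer as affected; the
# patch cutoff is an illustrative placeholder, not an authoritative date.
FIRST_AFFECTED = parse_version("2.2")
ILLUSTRATIVE_PATCH_CUTOFF = "2015-10-01"

def potentially_exposed(android_version: str, security_patch_date: str) -> bool:
    """Exposed if the version is in the affected range and the device's
    security patch level predates the illustrative cutoff."""
    affected = parse_version(android_version) >= FIRST_AFFECTED
    unpatched = security_patch_date < ILLUSTRATIVE_PATCH_CUTOFF  # ISO dates sort lexically
    return affected and unpatched
```

The point of the sketch is the second condition: because carriers and manufacturers pushed fixes on different schedules, the version number alone says nothing about whether a given handset was actually patched.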
Often times, vulnerabilities are discovered in specific mod-
els of hardware, but manufacturers opt not to issue a fix in
favor of promoting new versions of the devices. In 2010, an
independent researcher discovered a technique for gaining
root access to iOS devices (iPhone 4 and older) without the
need for a jailbreak (the “SSH Ram Disk” attack described at
the beginning of the article).12
At the time of this writing, the
iPhone 4 is still vulnerable to this particular attack, and as
these devices are no longer officially supported by Apple (and
the fact that the exposure is hardware-based), there will likely
never be a security update that resolves the issue. Similarly, in
2013, independent security researchers discovered two lock-
screen bypasses for the Samsung Galaxy S3. Both require lit-
tle technical knowledge or time to execute successfully. At the
time of the discovery, Samsung neglected to patch the vul-
nerabilities in favor of releasing the Galaxy S4. According to
a study by OpenSignal, the Samsung Galaxy S III was still the
most widely owned Android phone in the United States as of
Q4 2014.13
Future of mobile technologies
Given the current state of the mobile landscape, device frag-
mentation is likely to remain a big part of the future of mo-
bile technologies. Fortunately, there are steps consumers can
take to protect themselves and their devices. For those users
9 Z Team. “How to Protect from Stagefright Vulnerability.” Zimperium. 30 Jul. 2015.
Web. 10 Feb. 2016 – http://blog.zimperium.com/how-to-protect-from-stagefright-
vulnerability/.
10 Rundle, Michael. “‘Stagefright’ Android Bug Is the ‘Worst Ever Discovered.’”
Wired Magazine. 27 Jul. 2015. Web. 10 Feb. 2016 – http://www.wired.co.uk/news/
archive/2015-07/27/stagefight-android-bug.
11 Mimoso, Michael. “Stagefright 2.0 Vulnerabilities Affect 1 Billion Android Devices.”
ThreatPost. 01 Oct. 2015. Web. 10 Feb. 2016 – https://threatpost.com/stagefright-2-0-
vulnerabilities-affect-1-billion-android-devices/114863/.
12 “Working iPhone Recovery Ramdisk with SSH;-).”16 May 2010. 19 Jan. 2016 – http://
msftguy.blogspot.com/2010/05/working-ramdisk-with-ssh.html.
13 OpenSignal (2015).
whose primary concern is device fragmentation, iOS is the best option of the “big three.” Consumers with a preference for Android or Windows devices should purchase the newest technologies whenever possible and stick to the more well-known manufacturers. These companies are the most likely to maintain the resources that will allow them to continue to support future versions of Android or Windows on legacy hardware. Additionally, the devices of those manufacturers with larger portions of the market (Google, Samsung, HTC, and LG) typically receive security updates faster and more frequently than those of lesser-known manufacturers.
Application developers can also help alleviate the sting of device fragmentation. While the latest hardware can be alluring, responsible developers should continue to support legacy hardware for as long as possible. A significant portion of the smartphone market is comprised of outdated hardware.14 Dropping support for those devices that still see regular use can place large portions of the market at risk. Whenever possible, developers should seek to support at least two generations of hardware. Businesses and organizations, including those with bring-your-own-device (BYOD) policies, should consider the implementation of a mobile-device-management (MDM) solution. Though not perfect, MDM solutions like AirWatch and MobileIron offer a variety of benefits in the mobile security space:
•	 Automatically update corporate-relevant applications and services
•	 Flag jailbroken or rooted devices and disconnect them from corporate resources
•	 Group devices together logically for ease of control and monitoring
•	 Prevent vulnerable or unsupported devices from being connected to corporate networks and resources
•	 Push corporate mobile policies to all employee devices
Finally, consumers also have a responsibility to protect themselves. Users should avoid jailbreaking or rooting their phones and tablets. Doing so can significantly amplify the fallout of a successful exploit, giving attackers complete control over the contents of compromised devices. There may also be emerging methods through which customers can force the hands of mobile device manufacturers. Earlier this year, a Dutch consumer group issued a lawsuit against Samsung over what it perceived to be the manufacturer’s slow security update policy. According to a January 2016 Neowin report by Andy Weir:
The [consumer group] cites a survey that it carried out, which it says shows that “82% of the Samsung phones examined had not been provided with the latest Android version in the two years after being introduced.” Indeed, it’s worth noting that Samsung isn’t exactly known for delivering updates with any great sense of urgency, as its record with the newest major version of Android shows.15
The group ultimately wants Samsung to provide a full two years of updates for its devices starting on the day the device is sold (as opposed to the date of its initial release or even manufacture). While these demands may be overzealous, and keeping in mind that the case has yet to be decided, it is the first time such action has taken place: consumers breaking new security ground in the mobile device market.
14 Ibid.
15 Weir, Andy. “Samsung Sued by Dutch Consumer Group over ‘Poor Software Update Policy for Android Phones.’” Neowin. 20 Jan. 2016. Web. 10 Feb. 2016 – http://www.neowin.net/news/samsung-sued-by-dutch-consumer-group-over-poor-software-update-policy-for-android-phones.
Conclusion
This means that users are likely, at least to a small extent, to have more positive control over the security of their devices of choice from the very beginning. And that is a crucial step in the growth and maturity of the mobile device market. As that market continues to expand, all stakeholders, including consumers, have a responsibility to ensure that issues like device fragmentation do not seriously impact the security posture of the market at large, thus ensuring that all mobile devices are as secure as possible from initial release to end-of-life.
About the Author
Ken Smith works for SecureState, a global management consulting firm specializing in information security. Ken works primarily on wireless and physical assessments as well as in mobile device and application security. He enjoys presenting and regularly speaks at industry conferences. Reach Ken via email at ksmith@securestate.com or call 216.927.8200.
Gaining Confidence in the Cloud
By Phillip Griffin – ISSA Fellow, Raleigh Chapter and Jeff Stapleton – ISSA member, Fort Worth Chapter
In cloud deployments organizations remain responsible for ensuring the security of their data. Can cloud-based technologies, such as the blockchain, play a role in providing cloud subscribers assurance their data is being properly managed and that their cloud service provider is in compliance with established security policies and practices?
Abstract
The Cloud offers organizations faster, cheaper, richer, and sometimes more secure application deployments than they themselves can orchestrate. However, organizations remain responsible for ensuring the security of their data, even when they transfer its physical control to a cloud service provider (CSP). What information does an organization require from a CSP to gain confidence it is meeting its data governance obligations? Can cloud-based technologies, such as the blockchain, play a role in providing cloud subscribers assurance their data is being properly managed and that their CSP is in compliance with established security policies and practices? For the financial services industry, the X9.125 standard is under development to define requirements and provide a compliance model using blockchain technology.
Introduction
As organizations embrace the Cloud and migrate or deploy applications, and invariably data, they transfer control from internal processes to a cloud service provider (CSP). However, organizations (subscribers) remain responsible for industry information security compliance despite the delegation to the CSP. Healthcare1 data and payment2 data notwithstanding, organizations must ensure they exert adequate governance over how their data is protected. Regardless of where their data is located and who actually has physical control over the data—whether within a virtual environment or the cloud—the organization remains responsible for ensuring it can meet legal and regulatory control requirements. For the financial services industry, the X9.125 standard for cloud compliance is being developed to address requirements and compliance between a cloud subscriber and its CSP.
1 http://www.hhs.gov/ocr/privacy/index.html.
2 https://www.pcisecuritystandards.org/.
Some background might help clarify how X9.125 fits into financial and cloud services. As shown in figure 1, the American National Standards Institute3 (ANSI) is the United States’ representative to the International Organization for Standardization4 (ISO), among others. However, ANSI does not develop standards; rather, it accredits other organizations as industry-specific standards developers and technical advisory groups (TAG) to ISO technical committees. The Accredited Standards Committee X95 (ASC X9, or just X9) is one such organization designated by ANSI to perform the following roles:
•	 Develop ANSI standards for the financial services industry
•	 Represent the United States as the TAG to ISO technical committee 68 Financial Services (TC68)
•	 Manage TC68 as the official secretariat
3 www.ansi.org.
4 www.iso.org.
5 www.x9.org.
Figure 1 – Standards overview
Figure 2 – Controls overview
Consequently, many X9 standards are submitted to ISO for international standardization. Further, X9 often initiates ISO work items, adopts ISO financial standards, and retires ANSI standards in favor of their ISO versions. Sometimes the US markets are so uniquely distinct that a domestic X9 standard is needed in the absence of, or in parallel with, an ISO standard. The cloud security work item is assigned to the X9F4 cryptographic protocols and application security work group; Jeff Stapleton is the X9F4 chair, and Phil Griffin is the X9.125 editor.
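The compliance model using blockchain technology mentioned above can be pictured, in miniature, as a hash-chained log of control events: each entry binds itself to its predecessor, so history a CSP has already shared with a subscriber cannot be silently rewritten. This is an illustrative sketch only, not the X9.125 design:

```python
import hashlib
import json

def entry_hash(prev_hash: str, event: dict) -> str:
    """Bind an event to the previous entry with SHA-256."""
    material = prev_hash + json.dumps(event, sort_keys=True)
    return hashlib.sha256(material.encode()).hexdigest()

def append(chain: list, event: dict) -> None:
    """Add an event, chaining it to the hash of the latest entry."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"event": event, "hash": entry_hash(prev, event)})

def verify(chain: list) -> bool:
    """Recompute every link; any tampered entry breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        if entry["hash"] != entry_hash(prev, entry["event"]):
            return False
        prev = entry["hash"]
    return True
```

A subscriber holding even the latest hash can detect after-the-fact edits to earlier compliance records, which is the assurance property at stake here.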
One of the first X9F4 actions was to review the existing body
of work including special publications from the National
Institute of Standards and Technology (NIST),6
Federal Fi-
nancial Institutions Examination Council (FFIEC)7
cloud
computing recommendations, Central Intelligence Agency
(CIA) views on cloud computing, and the Cloud Security Al-
liance (CSA) research on comparable audit programs. These
materials were digested to formulate a core set of security
requirements for managing and securing information in the
cloud, whether this information is located in a private cloud
completely under control of the organization, or managed in
a hybrid or public cloud environment. Regardless of the cloud
service type or environment, these basic questions were iden-
tified:
1. What security controls does the cloud subscriber (the
consumer of the cloud services) need to protect the confi-
dentiality and integrity of its data?
2. What security controls does the cloud service provider
offer to protect the confidentiality and integrity of its sub-
scriber’s data?
3. What security controls provided by the cloud service pro-
vider can be monitored by the cloud subscriber to verify
compliance?
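These three questions partition the control space by responsible party. One way to record the split, with hypothetical control names illustrating the solely-CSP, mutually managed, and solely-subscriber cases discussed below:

```python
# Who operates each control: "csp", "subscriber", or "shared".
# Control names here are hypothetical examples, not X9.125 terms.
CONTROLS = {
    "storage-encryption": "csp",         # CSP-managed keys for data at rest
    "transit-encryption": "shared",      # session keys negotiated by both parties
    "tokenization": "subscriber",        # data tokenized before it reaches the CSP
    "access-logging": "csp",
}

def controls_for(party: str) -> list:
    """Controls a party must operate (shared controls appear for both)."""
    return sorted(name for name, owner in CONTROLS.items()
                  if owner in (party, "shared"))

def monitorable_by_subscriber() -> list:
    """Question 3: CSP-side controls the subscriber should ask to monitor."""
    return sorted(name for name, owner in CONTROLS.items() if owner == "csp")
```

Even this toy matrix makes the third question concrete: the subscriber's monitoring burden is exactly the set of controls it neither operates nor shares.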
While the development of X9.125 is still work in progress
and has undergone several redesigns, cloud services and its
adoption in the financial industry have continued to evolve.
Thus the X9.125 standard is attempting to hit a moving tar-
get. X9 standards provide requirements (“shall”) and recom-
mendations (“should”) that are practical and verifiable. Thus,
as shown in figure 2, the standard needs to address security controls and interoperability between the cloud service provider and the cloud subscriber, in addition to transparency of any service sub-providers.
6 http://csrc.nist.gov/publications/PubsSPs.html.
7 http://ithandbook.ffiec.gov/media/153119/06-28-12_-_external_cloud_computing_-_public_statement.pdf.
Cloud service providers, like any organization relying on information technology (IT), need to have their security controls documented in policy (why), practices (what), and procedures (who). They also need to securely manage resources,
including people, places, and processes. IT controls include
network, systems, and applications addressing authentica-
tion, authorization, and accountability (AAA). Data must
also be managed across its life cycle including creation, dis-
tribution, storage, and termination. When cryptography is
used, the keys must be managed in a secure manner. How
the controls are deployed and managed depends on the relationship between the CSP and the subscriber, as depicted in figure 2:
•	 The topmost solid arrow shows the case when controls are
provided solely by the CSP to the subscriber. For example,
the CSP might encrypt the subscriber’s data in storage us-
ing cryptographic keys managed solely by the CSP.
•	 The middle arrows show cases when controls are mutually
managed by both the CSP and the subscriber. For exam-
ple, data in transit is encrypted using a session key that is
dynamically established, based on an exchange of public
key certificates between the CSP and the subscriber.
•	 The bottommost dotted arrow shows the case when con-
trols are provided solely by the subscriber. For example,
the subscriber might encrypt or tokenize data before it is
sent to the CSP for storage or processing.
•	 The dotted arrow between the CSP and its service sub-pro-
vider shows the case when controls are provided indirect-
ly to the subscriber by the sub-provider. For example, the
sub-provider might be a tokenization service used by the
CSP to protect the subscriber’s data in storage.
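The mutually managed case above, in which a session key is dynamically established between the CSP and the subscriber, can be sketched with a toy Diffie-Hellman exchange. This is only an illustration of the key-agreement idea, not the certificate-based protocol the article describes: the parameters below are deliberately tiny and insecure, and a real deployment would rely on TLS with public key certificates.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters (illustration only; far too small and
# NOT validated for production use -- real systems should use TLS).
P = 2**127 - 1   # a Mersenne prime, chosen here purely for demonstration
G = 3

def keypair():
    """Generate a private exponent and the public value G^priv mod P."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def session_key(priv: int, peer_pub: int) -> bytes:
    """Derive a shared session key by hashing the DH shared secret."""
    shared = pow(peer_pub, priv, P)
    return hashlib.sha256(str(shared).encode()).digest()
```

Because (G^a)^b mod P equals (G^b)^a mod P, the CSP and the subscriber each derive the same session key from their own private value and the peer's public value, without ever transmitting the key itself.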
While X9.125 is still a work in progress, another major aspect is to develop a reporting model such that a cloud subscriber can verify a CSP’s compliance. Compliance might be to the CSP policy and practices aligned with the subscriber, or preferably to the security requirements being defined in the X9.125 standard adopted by both parties. Regardless, this implies that the CSP provides compliance information that is reliable and verifiable. One method for such a digital ledger might be blockchain technology, best known today because of Bitcoin.
Blockchains
Blockchains have been around for decades: Merkle trees were addressed in a US patent [2] issued in 1982, so the technology is well vetted. While the Bitcoin blockchain
is used as a general ledger for Bitcoin transactions, any in-
formation can be encapsulated within a blockchain that can
provide data integrity. Incorporating timestamps within the
blockchain (as does Bitcoin) also provides a historical record of what happened when and by whom. Consider figure 3 as an example.
40 – ISSA Journal | January 2017
Gaining Confidence in the Cloud | Phillip Griffin and Jeff Stapleton
The blocks are numbered sequentially: Block 0, 1…to N. There is always an initial block, conventionally numbered “0” to indicate its special nature, and there is always a last block (N), which is the most current addition to the blockchain. Each block contains data, in this example a cloud service provider’s policy numbered according to its block number: P0, P1…PN. In this example, each block contains a hash (H) of its own policy data, essentially a link to itself, so Block 0 contains H(P0), Block 1 contains H(P1), and Block N contains H(PN). Additionally, each block contains a hash of its predecessor’s data, so Block 1 contains H(P0) as a link to Block 0, and Block N contains H(PN-1) as a link to Block N-1. Note that Block 0 does not contain a previous link since Block 0 is the blockchain origin. At this point one might think that the blockchain is completely reliable, but it turns out that simple links based on a hash of just the data in the previous block are unreliable.
Consider an attacker that takes some intermediary Block K
which links to Block K-1 and has Block K+1 linked to it. The
attacker makes a replica of Block K, which we will call Block J,
modifies the data so it contains PJ and no longer PK, and pub-
lishes it as the real Block K. The attacker then updates Block
K+1 to link to Block J instead of Block K. Thus, the blockchain
has been compromised yet still appears to be valid since
all of the links are valid. Without some method of either ver-
ifying the publisher or the whole blockchain, a simple substi-
tution attack is possible. Replacing the previous link as a sim-
ple hash of the previous data with a digital signature would
prevent the substitution attack; however, this would require
the support of a public key infrastructure (PKI) with certif-
icates, private key storage, certificate authorities, revocation
lists, and the like. Alternatively, replacing the previous link
with a hash chain achieves the same anti-substitution control
without the PKI overhead.
Referring back to figure 3, we have provided another chain field where each block contains a chain value numbered by its block number: C0, C1…CN. Each chain value is a link to all of the previous blocks: a hash of two elements, the previous chain value and a hash of the block’s own data. Thus, Block N contains a hash of CN-1 and a hash of its own data H(PN), that is H(CN-1, H(PN)). Likewise, Block 1 contains a hash of C0 and a hash of its own data H(P1), namely H(C0, H(P1)). Block 0 only contains a hash of a hash of its own data, H(H(P0)), because there is no previous chain value. In this manner an attacker cannot replace any of the published blocks without updating the whole chain, which is the basis of Bitcoin blockchain security. The presumption is that it is cheaper to be honest than dishonest.
Figure 3 – Simple blockchain
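The chained construction just described, C0 = H(H(P0)) and CN = H(CN-1, H(PN)), can be sketched in a few lines of Python. SHA-256 stands in for the hash function H, which the article leaves unspecified; the function and variable names are illustrative.

```python
import hashlib

def h(*parts: bytes) -> bytes:
    """Hash the concatenation of the given byte strings with SHA-256."""
    digest = hashlib.sha256()
    for p in parts:
        digest.update(p)
    return digest.digest()

def build_chain(policies):
    """Return a list of (policy, chain) pairs where each chain value
    links to all previous blocks: C0 = H(H(P0)), CN = H(CN-1, H(PN))."""
    blocks = []
    prev_chain = b""  # Block 0 has no previous chain value
    for policy in policies:
        chain = h(prev_chain, h(policy))
        blocks.append((policy, chain))
        prev_chain = chain
    return blocks

def verify(blocks):
    """Recompute every chain value; a substituted block breaks the check."""
    prev_chain = b""
    for policy, chain in blocks:
        if chain != h(prev_chain, h(policy)):
            return False
        prev_chain = chain
    return True
```

A substitution attack of the kind described above, replacing some Block K's policy PK with PJ while keeping the published chain values, makes `verify` fail, because the attacker cannot produce consistent chain values without rebuilding the whole chain.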
If a majority of CPU power is controlled by honest nodes,
the honest chain will grow the fastest and outpace any
competing chains. To modify a past block, an attacker
would have to redo the proof-of-work of the block and all
blocks after it and then catch up with and surpass the work
of the honest nodes. We will show later that the probability
of a slower attacker catching up diminishes exponentially
as subsequent blocks are added. [3]
Much of the media discussion around Bitcoin has focused
on its role as a crypto currency. Bitcoin provides a means
for achieving efficient, anonymous financial transactions. In
this context, Bitcoin is sometimes described as a disruptive
technology, one that facilitates the activities of drug deal-
ers and terrorists, one that threatens to disintermediate and
undermine the existing financial services industry, or one
that presents banks who serve Bitcoin industry players with
heightened “Bank Secrecy Act (BSA)/Anti-Money Launder-
ing (AML) Act compliance risks” [1].
On the other hand, Bitcoin has seen adoption by e-commerce
stalwarts such as PayPal, Overstock, Dish Network, and Dell
Computers, as well as “many community-driven organiza-
tions” that “allow anonymous donations using Bitcoin” [6].
Despite any negative aspects associated with Bitcoin, “there
remain many legitimate uses for Bitcoin and businesses that
facilitate these legitimate transactions” [7]. There is also
growing interest in leveraging the blockchain technology
that underpins Bitcoin to both reduce transaction costs and
strengthen financial services security. To this end, more gen-
eral purpose applications of the blockchain that are far re-
moved from the use of Bitcoin to facilitate financial services
transactions are being considered. For example, blockchains
might be used to evaluate, monitor, assess, or even audit a
cloud services provider:
•	 The CSP might publish its information security policy and
practices in a blockchain providing an historical record of
versions and changes. In this manner, new subscribers can
evaluate the CSP, existing subscribers can monitor chang-
es, internal audit can assess the CSP, and professionals can
perform independent audits of the CSP.
January 2017 | ISSA Journal – 41
Gaining Confidence in the Cloud | Phillip Griffin and Jeff Stapleton
•	 The CSP might distribute information security news in a
blockchain, providing notifications or alerts to its subscribers about incidents, events, or new vulnerabilities in a reliable manner. Today, this information is typically provided via emails or blogs.
•	 The CSP might issue information security details in a
blockchain providing real-time data about its controls.
In this manner, existing subscribers can monitor the CSP
for its dependability, consistency, and overall trustworthi-
ness. Another name for this would be compliance.
Hence, the concept of using blockchains to record and verify CSP compliance data is not as far-fetched as it might initially seem. For cloud subscribers to gain such
assurance, and to exercise due diligence in the conduct of
their governance and risk management responsibilities, they
need some insight into what goes on under the covers at their
CSP. Cloud subscribers need the same types of operational
evidence of compliance from their CSP that they would ex-
pect their internal IT departments to provide. Whether an
organization’s data is inside its firewall or floating around in
the cloud, informed information security management prac-
tices still depend on access to the basics: vulnerability scan
results, penetration test results, system logs, application logs,
analytical results, security alerts, and summarized informa-
tion. Compliance evidence must have origin authenticity,
data integrity, and often confidentiality safeguards that pro-
hibit access by attackers and other unauthorized individuals.
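As a minimal sketch of the origin-authenticity and integrity properties just described, a compliance record could carry a keyed hash (HMAC) computed under a key shared between the CSP and the subscriber. The key, record format, and function names here are illustrative assumptions, not part of X9.125, and HMAC alone does not provide the confidentiality safeguard, which would require encryption on top.

```python
import hashlib
import hmac

def protect_evidence(shared_key: bytes, record: bytes) -> bytes:
    """Tag a compliance record so the subscriber can verify its origin
    and integrity; the shared key is an illustrative assumption."""
    return hmac.new(shared_key, record, hashlib.sha256).digest()

def verify_evidence(shared_key: bytes, record: bytes, tag: bytes) -> bool:
    """Constant-time check that the record was tagged under shared_key."""
    expected = hmac.new(shared_key, record, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

A subscriber holding the shared key can then accept scan results or log summaries only when the tag verifies, rejecting records altered in transit or injected by an attacker who lacks the key.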
The attractiveness of the Bitcoin blockchain includes its de-
centralization. Bitcoin spenders submit their transactions
(signature, inputs, outputs) to multiple Bitcoin nodes such
that the transaction gets published in the next block that is
originated by the next miner to solve the hash puzzle. The
idea is that the amount of work to perpetrate fraud far ex-
ceeds the work factor for mining. Sometimes a race condition
creates a bifurcated blockchain generated by two different
Bitcoin nodes; however, consensus processing will eventually
prune the blockchain to only one authentic version. There is
no central authority that provides a processing choke point, a
single point of failure, or a single point of attack.
However, blockchain management is not without its prob-
lems. There are orphaned blocks, which are valid but did not
make it into the main Bitcoin chain. There are always uncon-
firmed transactions waiting for the next block, which might
get lost during the bifurcation and pruning process. There are
double spends, transactions where the same Bitcoin fractions
get spent by the same entity to two different receivers. There
are strange transactions, where the syntax or semantics are
invalid. And there are outright rejected transactions dropped
by Bitcoin nodes that never get included in the chain. Some
of these might be processing errors due to software bugs, Bit-
coin versions, or rules issues. Alternatively, some transactions
might be fraudulent in nature. Bitcoin fraud management is
relatively nascent, and without a central authority there are
no arbitration or adjudication programs available.
Bitcoin information is publicly accessible by definition. Hash
algorithms provide the links between blocks and transac-
tions, and digital signatures provide transaction integrity
and authentication. Non-repudiation is not feasible as Bitcoin
identifiers support anonymity, and the lack of arbitration does
not meet legal needs discussed in the Digital Signature Guide-
lines [4] and the PKI Assessment Guideline [5]. Further, the
Bitcoin blockchain does not offer data confidentiality. Some
of the cloud service provider’s information security management data is sensitive, such that it might need to be encrypted yet remain accessible only to authorized clients or regulatory bodies.
Thus, key management schemes need to be considered.
There is also growing interest in cloud data confidentiali-
ty and user anonymity. In a paper presented at the Security
Standardization Research (SSR) 20158 conference held recently in Tokyo, Japan, researchers McCorry, Shahandashti,
Clarke, and Hao proposed a new category of Authenticated
Key Exchange (AKE) protocols. These new protocols, which
“bootstrap trust entirely from the blockchain,” are identi-
fied by the authors as “Bitcoin-based AKE” [6]. The SSR 2015
paper describes two new protocols, one of which guarantees forward secrecy, and offers proof-of-concept prototypes with experimental results to demonstrate their practical feasibility. Both protocols provide greater anonymity than can be
achieved using digital certificate- or password-based AKE.
Following the guidance of international security standards
can help ensure that the same information security policies
used to manage risk when information systems reside in
traditional non-cloud environments are also applied in the
cloud. Recently, the big three international security standard-
ization bodies published Recommendation ITU-T X.1631 |
ISO/IEC 27017 Code of practice for information security con-
trols based on ISO/IEC 27002 for cloud services.9 This standard builds on selected parts of the familiar ISO/IEC 27002 Code of practice for information security management10 but adds
additional cloud-specific recommendations and guidance.
Although ITU-T X.1631 | ISO/IEC 27017 provides import-
ant recommendations and guidance, it contains no actual
requirements. Conversely, the draft X9.125 standard hardens
the ISO, IEC, and ITU-T recommendations and guidance
into a set of specific information security management re-
quirements. Where ITU-T X.1631 | ISO/IEC 27017 relies on
clauses 5 through 18 of the ISO/IEC 27002 Code of Practice,
X9.125 defines requirements based on comparable clauses in
the ISO/IEC 27001 Information security management systems
– Requirements.11
8 http://www.ssr2015.com/.
9 http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=43757.
10 http://www.iso.org/iso/catalogue_detail?csnumber=54533.
11 https://en.wikipedia.org/wiki/ISO/IEC_27001:2013.
Conclusions
Blockchains, a decades-old cryptographic technology, have become a creature of the cloud. Their adoption and use carry many of the same security concerns as other cloud-based applications and services. But for blockchains to be trusted in the current financial services regulatory environment, and to be widely adopted, blockchain-based systems must comply with an organization’s existing security policy and
practices. Many of the policies needed to manage blockchains
and other cloud-based deployments are the same as those
used to manage security risk within an organization. Organi-
zations must continue to manage risk and fully exercise their
information security governance responsibilities regardless
of where their data and applications roam. Cloud subscribers
need the ability to verify that their cloud service providers are
securing information in a manner compliant with established requirements.
ASC X9 is currently developing the X9.125 standard with the
option of the United States submitting the work to ISO for in-
ternational standardization. Once the cloud security require-
ments have been completed, the corresponding compliance
data might be encapsulated in a publicly available or privately
provided blockchain. Cloud subscribers, internal or external
auditors, regulators, or any independent third-party assessor
should be able to validate the CSP by verifying its informa-
tion security blockchain.
This article is also a call for participation. Cloud service pro-
viders, cloud subscribers, or organizations that are interested
in the development of the X9.125 standard are encouraged
to contact the ASC X9 or the X9F4 work group chair. Par-
ticipation by any X9 member is welcomed. Once the X9.125 standard is approved as a new ANSI standard, its submission to ISO as a US offering will be seriously considered, given the appropriate organizations’ support.
References
1. King, Douglas. (2015). Banking Bitcoin-Related Businesses:
A Primer for Managing BSA/AML Risks. Federal Reserve
Bank of Atlanta. Retrieved November 19, 2015, from
https://www.frbatlanta.org/-/media/Documents/rprf/rprf_pubs/2015/banking-Bitcoin-related-businesses.pdf.
2. Merkle, R. C. (1988). “A Digital Signature Based on a
Conventional Encryption Function.” Advances in Cryptol-
ogy — CRYPTO ‘87. Lecture Notes in Computer Science
293. p. 369. doi:10.1007/3-540-48184-2_32 ISBN 978-3-540-
18796-7. US Patent 4309569, Method of Providing Digital
Signatures, Ralph C. Merkle, January 5, 1982.
3. Satoshi Nakamoto, Bitcoin: A Peer-to-Peer Electronic Cash
System, Bitcoin.org, retrieved 31 October 2008, https://bitcoin.org/bitcoin.pdf.
4. ISC, Digital Signature Guidelines, Legal Infrastructure for
Certification Authorities and Secure Electronic Commerce,
Information Security Committee (ISC), Electronic Com-
merce and Information Technology Division, Section of
Science and Technology, American Bar Association (ABA),
ISBN 1-57073-250-7, August 1996.
5. ISC, PKI Assessment Guideline (PAG), Information Security
Committee (ISC), Electronic Commerce Division, Section
of Science & Technology Law, American Bar Association
(ABA), ISBN 1-57073-943-9, June 2001.
6. Patrick McCorry, Siamak F. Shahandashti, Dylan Clarke, and Feng Hao (School of Computing Science, Newcastle University), “Authenticated Key Exchange over Bitcoin,” Security Standardisation Research, Volume 9497, Lecture Notes in Computer Science, pp 3-20, December 9, 2015 – retrieved November 8, 2015, from http://eprint.iacr.org/2015/308.pdf.
7. Douglas King, Retail Payments Risk Forum Working Paper,
Federal Reserve Bank of Atlanta, October 2015.
About the Authors
Phillip H. Griffin, CISM, has over 20 years
experience in the development of commer-
cial, national, and international security
standards and cryptographic messaging pro-
tocols. Phil has a Master’s of Information
Technology, Information Assurance and Se-
curity degree, and he has been awarded nine
US patents at the intersection of biometrics, radio frequency
identification (RFID), and information security management.
He may be reached at phil@phillipgriffin.com.
Jeff Stapleton has been an ISSA member and
participated in X9 for over twenty years; he
has contributed to the development of over
three dozen X9 and ISO security standards,
and has been the chair of the X9F4 work group
for over 15 years. The X9F4 work group’s pro-
gram of work includes the five-year review of
two published standards (X9.73, X9.84) and
development of three new standards (X9.112, X9.122, X9.125)
in addition to supporting ISO standard efforts. He may be
reached at jjs78023@yahoo.com.
Crypto Wars II
By Luther Martin – ISSA member, Silicon Valley Chapter – and Amy Vosters
The debate over whether or not to give US law enforcement officials the ability to decrypt encrypted messaging has recently been revisited after a twenty-year break. This is an emotionally charged issue, but an important one, so it might be useful to attempt an unbiased and impartial look at the facts and see how well they correspond to the main talking points of the supporters of each side of the debate. The results may be surprising. To learn more, read on.
There is now a political battle going on in Washington, DC, the outcome of which could have a significant impact on the future of how encryption is used. In the US, law enforcement agencies currently have the ability to easily intercept phone calls, and they would like to extend this ability to give them easy access to forms of secure communication like encrypted email and text messaging. Privacy advocates seem to think that this is a very bad idea.
Law enforcement agencies argue that the world is “going dark” for them because the use of encryption is reducing their ability to investigate and prosecute criminals.
Privacy advocates argue that adding a backdoor to products that use encryption is inherently dangerous because it also creates a way for hackers to bypass the encryption. They also argue that enough information is contained in messaging metadata for law enforcement agencies to use to investigate and prosecute criminals, and because of this, law enforcement agencies do not need the content of encrypted messages to do their job.
Here we look at the facts that might or might not support each of these claims, but before doing that, it is probably helpful to put the current debate into some context. A debate that was very similar to the current one took place in the years just before the dot-com boom, right before the Internet transformed the way that we both conduct business and live our daily lives.
Who is driving the debate?
It seems clear that the goal of law enforcement officials in gaining the ability to decrypt encrypted messages is to investigate and prosecute criminals. The goals of the average citizen are probably more varied. Their opinions probably range from complete support for the goals of law enforcement officials to complete opposition to them, as well as to more moderate positions in between. This is what Alan Westin’s 1967 book Privacy and Freedom1 tells us to expect, because people tend to fall into three general categories when it comes to valuing their privacy: privacy fundamentalists, the privacy unconcerned, and privacy pragmatists.
About 25 percent of people can be classified as privacy fundamentalists. They place a very high value on their privacy and tend to favor strict government regulations supporting privacy. About 16 percent of people can be classified as the privacy unconcerned. They are generally not concerned much about privacy at all and tend not to support strict government regulations supporting privacy. The remaining 59 percent of people have only moderate concerns about privacy, believe that trade-offs between privacy and something else of value are acceptable, and tend to favor using the existing set of laws and regulations to ensure that fair practices are enforced when this happens.
Because each of these groups has a very different approach to privacy, it seems unlikely that any single way to address privacy concerns will appeal to them all. In particular, the view that the use of encryption should be a fundamental right that governments must not interfere with might be held by only a small fraction of the population, while a significant majority might find the trade-off between personal privacy and law enforcement officials gaining the ability to decrypt encrypted messages to be an acceptable one.
But because this majority also tends not to get overly involved in the political process, their opinions are typically not heard in debates like the current one. Yet in a democracy, where governing by consensus is usually a good idea, learning exactly what the consensus is, is probably a good first step towards finding a good solution.
1 Westin, Alan F. Privacy and Freedom. New York: Atheneum. 1967.
The Clinton years: Crypto Wars I
To give some historical perspective to the current debate, it is
probably useful to recall what happened during the Clinton
administration. During the early 1990s, the US government
enacted legislation that gave law enforcement officials un-
precedented access to telecommunications traffic in the US—
this was accomplished by the Communications Assistance to
Law Enforcement Act (CALEA) of 1994.2
CALEA required manufacturers of telecommunication
equipment to include features in their products that made it
very easy for law enforcement officials to intercept telecom-
munications traffic and required telecommunications carri-
ers to use equipment with these features. Although CALEA
carefully distinguished between telecommunications ser-
vices (which it applied to) and information services (which
it did not apply to), this distinction has become somewhat
blurred, particularly after recent FCC rule making3
extended CALEA to facilities-based broadband Internet ac-
cess providers and providers of interconnected voice-over-in-
ternet-protocol (VoIP) services.
But while the government was successful in implementing
laws that made it easy for them to intercept telecommunica-
tions traffic, they were much less successful in implementing
laws that would give them access to encrypted traffic. They
attempted to do this by restricting the export from the US of any strong encryption (providing greater than 40 bits of
2 Public Law No. 103-414, 108 Stat. 4279 (1994), 47 USC 1001-1010. Available at ftp://
ftp.fcc.gov/pub/Bureaus/OSEC/library/legislative_histories/1743.pdf.
3 “Communications Assistance for Law Enforcement Act and Broadband Access and
Services,” 47 CFR Part 1, ET Docket No. 04-295; FCC 06-56 (2006). Available at
https://apps.fcc.gov/edocs_public/attachmatch/FCC-06-56A1.pdf.
cryptographic strength) that did not implement a form of key
escrow, in which two different government agencies would
independently hold information that could be combined to
recover any encryption key that the escrowed encryption
scheme used.
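The split-escrow idea above, in which two agencies independently hold information that only together recovers a key, can be sketched with simple XOR secret splitting. This is a hypothetical illustration of the two-share concept, not the actual Clipper/Skipjack escrow mechanism; the function names are invented for the example.

```python
import secrets

def split_key(key: bytes):
    """Split a key into two shares via one-time-pad XOR. Each share
    alone reveals nothing about the key; XOR-ing both recovers it."""
    share1 = secrets.token_bytes(len(key))           # escrow agent A's share
    share2 = bytes(a ^ b for a, b in zip(key, share1))  # escrow agent B's share
    return share1, share2

def recover_key(share1: bytes, share2: bytes) -> bytes:
    """Combine the two escrow shares to reconstruct the original key."""
    return bytes(a ^ b for a, b in zip(share1, share2))
```

Under this scheme neither agency can read traffic on its own; only when both shares are released under an approved wiretap order can the session key be reconstructed.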
In April 1993, the Clinton White House announced the
“Clipper Chip,”4
technology that would be used to implement
the government’s plan for key escrow. In February 1994, the
Department of Commerce published the Escrowed Encryp-
tion Standard5
(FIPS 185) to support this initiative. Attorney
General Janet Reno then announced that NIST and the De-
partment of the Treasury would be the escrow agents for the
government’s escrowed encryption program and published
the procedures for releasing escrowed keys to law enforce-
ment officials for use under approved wiretap orders. Every-
thing seemed ready to go.
But a significant number of Americans were firmly against
the government’s plan for escrowed encryption and believed
that their need for privacy was more important than the gov-
ernment’s need to be able to read some encrypted messag-
es. In the face of strong public pressure, the US government
eventually abandoned their plans for escrowed encryption,
leaving FIPS 185 looking much like the Eighteenth Amend-
ment to the US Constitution that banned the production,
transport, and sale of alcoholic beverages starting in 1920—
something that might have made sense at the time but that
looks like a very poorly considered idea from a safe distance
into the future. (And if FIPS 185 is analogous to Prohibition,
its eventual withdrawal in 2015 is almost certainly analogous
to the passage of the Twenty-first Amendment in 1933.)
4 http://www.cryptomuseum.com/crypto/usa/clipper.htm.
5 http://csrc.nist.gov/publications/fips/fips185/fips185.pdf.
ment agencies, so they may be more likely to use encryption
than the average person is. But is this really happening?
Fortunately, there is lots of data available on the use of wire-
taps by US state and federal law enforcement agencies, and it
is collected and summarized in the US federal court’s annu-
al Wiretap Report. Since 2000, these reports have listed how
many court-authorized intercepts were unable to be read by
law enforcement officials because of the use of encryption.
This data does not include wiretaps authorized for some na-
tional security reasons, but it is the best data that is available
on the subject. The data from the 2000 through 2014 Wiretap
Reports is summarized in table 1.
Note that in the year 2013, in which the largest number of
wiretaps were thwarted by the use of encryption, the num-
ber of wiretaps that law enforcement were unable to read was
only about 0.4 percent of the total number of intercepts that
were installed that year.
In fact, in 2013, the worst year for law enforcement ever in
this particular area, they were still able to read over 99.5 per-
cent of intercepts. That is indeed less than the 100 percent
that they were able to read in most other years, but it is still
a significant fraction and it certainly looks like law enforce-
ment officials are having very little trouble in reading what
they get from their wiretaps. So using the best available data,
the claim that the world is “going dark” for law enforcement
because of the growing use of encryption seems to be false.
Backdoors mean weak security
Privacy advocates claim that if law enforcement had the abil-
ity to decrypt any encrypted messages, then hackers could
also take advantage of the same capability, and this would
cause the security provided by the encryption to be reduced
by an unacceptable level. This certainly sounds reasonable,
but is it true?
The use of encryption in some form is very common in the
business world. This can cause lots of regulatory and com-
pliance issues if it is done carelessly because if you lose the
key that was used to encrypt important business information,
then you also lose the information that was encrypted with
it. This is an unacceptable situation to almost all businesses.
To avoid this situation, most enterprise software that uses
encryption also has the capability to recover lost keys. It is
very common to control access and use of an encryption
key through some form of authentication, most commonly
a password. And when users forget their password, they will
need to get another copy of this key.
This is a necessary capability of enterprise software that uses
encryption, but it is also the very sort of backdoor that privacy
advocates claim will introduce serious vulnerabilities into the
software that hackers can exploit. But this seems not to have
occurred—enterprise software that implements key recovery
does not appear to suffer from serious vulnerabilities that
hackers are able to exploit. So this particular claim may not
be as accurate as it first seems.
So at the end of the Clinton administration it might have
looked like law enforcement officials had a significant victory
in CALEA, and privacy advocates had a significant victory in
the defeat of the government’s plans for escrowed encryption.
Over two decades later, law enforcement officials decided to
revisit the idea of extending the scope of CALEA to include
things that might be considered information services. This
was successful in 2006, when the scope of CALEA was ex-
panded to include some broadband Internet and VoIP ser-
vices.
Today, from the point of view of law enforcement officials,
extending the scope of existing laws to give them the ability
to read encrypted messaging probably seems like a reason-
able thing to do. From their point of view, this is probably
just an incremental change, particularly when compared to
the significant capabilities that they already have to intercept
communications.
Going dark
Law enforcement agencies now claim that they need the abili-
ty to decrypt encrypted communications because the world is
“going dark” for them. There is no precise definition for “go-
ing dark,” but the FBI has described this phenomenon as “a
growing gap between the existing statutory authority of law
enforcement to intercept communications pursuant to court
order and the practical ability to do so.”6
This certainly sounds plausible. After all, criminals probably
want to hide their intentions and actions from law enforce-
ment.
6 Congressman Peter T. King, “Remembering the Lessons of 9/11: Preserving Tools
and Authorities in the Fight Against Terrorism,” Journal of Legislation, Vol. 41, No. 2
(2015). Available at http://scholarship.law.nd.edu/jleg/vol41/iss2/1.
Year   Intercepts installed   Unreadable due to encryption
2000   1,139                  0
2001   1,405                  0
2002   1,273                  0
2003   1,367                  0
2004   1,633                  0
2005   1,694                  0
2006   1,714                  0
2007   2,119                  0
2008   1,809                  0
2009   1,764                  0
2010   2,311                  0
2011   2,189                  0
2012   2,501                  4
2013   2,331                  10
2014   2,433                  4
Table 1 – Number of intercepts installed per year and how many of these were
unreadable because of the use of encryption (Source: US Courts Wiretap Reports,
2000–2014) www.uscourts.gov/statistics-reports/analysis-reports/wiretap-reports
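The "over 99.5 percent" figure quoted earlier can be checked directly against the last three rows of Table 1. A quick sketch (the figures are taken from the table; everything else is illustrative):

```python
# Readable fraction of intercepts, from the 2012-2014 rows of Table 1.
# Even in 2013, the worst year on record, well over 99.5 percent of
# installed intercepts remained readable.
intercepts = {2012: (2501, 4), 2013: (2331, 10), 2014: (2433, 4)}

for year, (installed, unreadable) in sorted(intercepts.items()):
    readable = 1 - unreadable / installed
    print(f"{year}: {readable:.2%} of intercepts readable")
```

For 2013 this works out to roughly 99.57 percent, which is the basis for the claim that encryption is blocking only a sliver of court-ordered intercepts.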
46 – ISSA Journal | January 2017
Crypto Wars II | Luther Martin and Amy Vosters
It might also be useful to note that because of CALEA, all
telecommunications equipment used within the US contains a
backdoor, but the existence of these backdoors does not seem
to have compromised the security of the US telecommunica-
tions networks.
Carelessly implemented key recovery can almost certainly in-
troduce weaknesses that hackers can exploit, but there seems
to be little evidence that deployed enterprise software prod-
ucts that implement key recovery are actually vulnerable in
this way. In light of this, the general claim that backdoors are
inherently a dangerous security vulnerability seems to be
false.
Metadata is enough
Metadata, or data about data, is typically transmitted in an
unencrypted form, even for encrypted messages. Metadata
about an email message includes information like the “to”
and “from” addresses, and this information needs to be avail-
able to the applications that get an encrypted message from
sender to recipient, even though the encrypted message body
does not. And because metadata is not encrypted, it is readily
available to law enforcement officials.
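As a small illustration of the point above (the addresses and subject line are hypothetical), Python's standard email module shows why the envelope must stay readable even when the body is ciphertext:

```python
from email.message import EmailMessage

# Hypothetical message: the body is an opaque encrypted blob, but the
# envelope headers must remain readable so mail servers can route it.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "quarterly numbers"
msg.set_content(
    "-----BEGIN PGP MESSAGE-----\n"
    "(ciphertext)\n"
    "-----END PGP MESSAGE-----"
)

# Anyone who can observe the message in transit -- including law
# enforcement with a court order -- sees this metadata in the clear:
print(msg["From"], "->", msg["To"], "|", msg["Subject"])
```

The body here could be unbreakable ciphertext, yet the sender, recipient, and subject are all visible to any observer on the path.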
Some privacy advocates claim that the information in meta-
data is just as valuable to law enforcement officials as the
encrypted message is, so that by collecting and analyzing
metadata they should be able to investigate and prosecute
criminals without access to any encrypted messages at all.
The information in metadata can easily characterize you and
how you live your life. It can tell where you live and work, and
it can tell who you communicate with and about what. So the
claim that law enforcement officials can do their jobs just as
well with just messaging metadata instead of the content of
the corresponding messages sounds like it might be plausible.
There have been two recent careful studies of this issue, both
of which focused on the US government’s use of metadata to
identify terrorist activity. One was created in the recent case
of Klayman v. Obama (Klayman).7 In Judge Leon’s 2013 ruling
in Klayman, he summarized how effective the government’s
metadata collection programs had been, and noted that there
was no evidence at all that any government agency had been
able to identify and investigate imminent threats faster with
access to metadata than they were able to without it, despite
the claims by both the FBI8 and the NSA9 that metadata is an
important tool in their investigations.
In the second study, The President’s Review Group on Intel-
ligence and Communications Technologies made a similar
finding.10 In particular, when discussing the effectiveness of
the NSA’s metadata collection program, it noted that there
7 Klayman v. Obama, 957 F. Supp. 2d 1, 29 (D.D.C. 2013).
8 http://www.clearinghouse.net/chDocs/public/NS-DC-0007-0014.pdf.
9 http://www.clearinghouse.net/chDocs/public/NS-DC-0007-0016.pdf.
10 The President’s Review Group on Intelligence and Communications Technologies,
“Liberty and Security in a Changing World: Report and Recommendations of The
President’s Review Group on Intelligence and Communications Technologies,”
December 2013. Available at https://www.whitehouse.gov/sites/default/files/
docs/2013-12-12_rg_final_report.pdf
had been no instance in which the NSA could say with confi-
dence that the outcome of any investigation would have been
different without the program.
So from the point of view of learning about and preventing
terrorist attacks, metadata appears to be of limited value. And
while it is possible that metadata behaves differently when it
is used in other types of investigations, it seems reasonable
to expect that the same pattern would hold, and that metadata
would be of similarly limited value there. Because of this, the
claim that metadata is an adequate replacement for the con-
tents of messages in criminal investigations seems to be false.
Summary
It should not be too surprising that both sides in political de-
bates may make claims that are not strictly based on facts.
The current debate over giving US law enforcement officials
the ability to decrypt encrypted messages seems to be no ex-
ception to this rule, and all of the main arguments on both
sides of the issue seem to essentially be false. The claim that
the use of encryption is making the world “go dark” for law
enforcement seems to be false. The claim that backdoors are
inherently bad because they create a weakness that hackers
can exploit seems to be false. And the claim that messaging
metadata is an adequate replacement for the text of encrypted
messages seems to be false.
The debate over whether or not to give US law enforcement
officials the ability to decrypt encrypted communications is
very important because it will dramatically affect the privacy
of hundreds of millions of people. The decision on whether or
not to do this should be based on the best available facts, not
on arguments that are excellent examples of why English has
the word “specious,” indicating something that is superficial-
ly plausible but actually wrong.
On the other hand, psychologists tell us that we tend not to
base most decisions on facts; instead we tend to have emo-
tional reactions that we then try to rationalize. So a more im-
portant aspect of this debate may really be whether or not
we feel that the government can be trusted not to abuse the
ability to decrypt encrypted messages if it had that ability.
But that is something that is beyond the scope of what we can
easily discuss here.
About the authors
Luther Martin is a Distinguished Technologist at Hewlett
Packard Enterprise. You can reach him at luther.martin@hpe.com.
Amy Vosters is a marketing manager at SOASTA. You can
reach her at amy_vosters@yahoo.com.

The Best Articles of 2016 DEVELOPING AND CONNECTING CYBERSECURITY LEADERS GLOBALLY

  • 1.
    January 2017 Volume 15Issue 1 Machine Learning: A Primer for Security Enterprise Security Architecture: Key for Aligning Security Goals with Business Goals The Role of the Adjunct in Educating the Security Practitioner Fragmentation in Mobile Devices Gaining Confidence in the Cloud Crypto Wars II The Best Articles of 2016
  • 2.
    Table of Contents DEVELOPINGAND CONNECTING CYBERSECURITY LEADERS GLOBALLY Articles 22 Enterprise Security Architecture: Key for Aligning Security Goals with Business Goals By Seetharaman Jeganathan In this article, the author shares his insights about why security architecture is critical for organizations and how it can be developed using a practical framework- based approach. 30 The Role of the Adjunct in Educating the Security Practitioner By Karen Quagliata – ISSA member, St. Louis Chapter The cybersecurity industry faces a shortage of qualified professionals. Part of the solution is to better deliver cybersecurity education in colleges and universities. The purpose of this article is to equip cybersecurity professionals working as adjunct instructors with resources to deliver a more efficient and effective class. Also in this Issue 3 From the President 4 editor@issa.org 5 Sabett’s Brief (Not) The Best of Cybersecurity, 2016 Version 6 Herding Cats Sweat the Small Stuff 7 Open Forum Executive Juris Doctor: Rewarding and Influential Career Path 8 Security in the News 9 Security Awareness Security in the News in 2016 10 Crypto Corner A Feeble Attempt at Humor 12 Association News Feature 14 Machine Learning: A Primer for Security By Stephan Jou – ISSA member, Toronto Chapter The author examines how machine learning can be leveraged to address the practical challenges of delivering lower-cost security by resolving more threats faster, with fewer resources. It will focus on machine learning security techniques that work at typical levels of data volumes, from those operating with “small data” to those implementing data lakes. ©2017 Information Systems Security Association, Inc. 
(ISSA) The ISSA Journal (1949-0550) is published monthly by Information Systems Security Association 11130 Sunrise Valley Drive, Suite 350, Reston, Virginia 20191 703.234.4095 (Direct) • +1 703.437.4377 (National/International) 35 Fragmentation in Mobile Devices By Ken Smith The purpose of this article is to explore the threat to consumers posed by mobile device fragmentation. The author categorizes mobile device fragmentation by operating systems, manufacturer, and carrier, exploring the vulnerabilities at each level. 39 Gaining Confidence in the Cloud By Phillip Griffin – ISSA Fellow, Raleigh Chapter and Jeff Stapleton – ISSA member, Fort Worth Chapter Can cloud-based technologies, such as the blockchain, play a role in providing cloud subscribers assurance their data is being properly managed and that their cloud service provider is in compliance with established security policies and practices? 44 Crypto Wars II By Luther Martin – ISSA member, Silicon Valley Chapter and Amy Vosters The debate over whether or not to give US law enforcement officials the ability to decrypt encrypted messaging has recently been revisited after a twenty- year break. The results may be surprising. Article of the Year 2 – ISSA Journal | January 2017
  • 3.
    From the President January2017 | ISSA Journal – 3 International Board Officers President Andrea C. Hoy, CISM, CISSP, MBA, Distinguished Fellow Vice President Justin White Secretary/Director of Operations Anne M. Rogers CISSP, Fellow Treasurer/Chief Financial Officer Pamela Fusco Distinguished Fellow Board of Directors Debbie Christofferson, CISM, CISSP, CIPP/IT, Distinguished Fellow Mary Ann Davidson Distinguished Fellow Rhonda Farrell, Fellow Geoff Harris, CISSP, ITPC, BSc, DipEE, CEng, CLAS, Fellow DJ McArthur, CISSP, HiTrust CCSFP, EnCE, GCIH, CEH, CPT Shawn Murray, C|CISO, CISSP, CRISC, FITSP-A, C|EI, Senior Member Alex Wood, Senior Member Keyaan Williams, Fellow Stefano Zanero, PhD, Fellow The Information Systems Security Asso- ciation, Inc. (ISSA)® is a not-for-profit, international organization of information security professionals and practitioners. It provides educational forums, publications and peer interaction opportunities that en- hance the knowledge, skill and professional growth of its members. With active participation from individuals and chapters all over the world, the ISSA is the largest international, not-for-profit association specifically for security pro- fessionals. Members include practitioners at all levels of the security field in a broad range of industries, such as communica- tions, education, healthcare, manufactur- ing, financial, and government. The ISSA International Board consists of some of the most influential people in the security industry. With an internation- al communications network developed throughout the industry, the ISSA is fo- cused on maintaining its position as the preeminent trusted global information se- curity community. The primary goal of the ISSA is to promote management practices that will ensure the confidentiality, integrity and availability of information resources. 
The ISSA facilitates interaction and education to create a more successful environment for global informa- tion systems security and for the profes- sionals involved. F rom a cybersecurity profession- al’s perspective, we probably can relate to the differentiation of having a “good” year versus a “hap- py” one. Many of us remember notable events in 2016 that probably did not make anyone “happy.” Those in our Healthcare SIG might recall cancer-care service provider 21st Century Oncol- ogy’s announcement that 2.2 million patients may have had their personal information affected by a breach in Oc- tober 2015: hackers had access to patient names, Social Security numbers, doc- tors, diagnosis and treatment informa- tion, along with insurance information. Even the loss of one password-protected laptop led to 200,000 patients’ sensitive information being exposed in the Pre- miere Healthcare case. Maybe it was the Yahoo breach announcements of 500 million accounts being stolen by a state-sponsored actor, then later in De- cember one billion accounts! Meanwhile it was a “good year” from the perspective of heightened awareness of cybersecurity and privacy issues by the average person on the street. As well, leading companies—and more impor- tantly their boards—have been address- ing and providing better protection of sensitive personal and company infor- mation. In 2016, with consumers embracing the Internet of Things, hackers brought us Mirai, causing possibly the largest DDoS attack known to date, delivering 665 Gigabits per second and 143 million packets per second of unwanted traffic via hijacked IoT devices to the Krebs on Security blog. The increase in regulations, as well as privacy concerns, meant an increase in regulatory compliance, leading many companies to address information se- curity budget in- creases. 
In the first six months of 2016, even the US federal government had hired 3,000+ new cybersecurity/IT professionals as part of its first Federal Cybersecurity Workforce Strategy. And the president’s 2017 budget contains a proposed $3.1 billion to overhaul diffi- cult-to-secure systems. So looking forward, ISSA aims to con- tinue providing timely and thought-pro- voking information and educational resources. And more importantly, we want to provide the peer/industry net- working necessary to give you a global helping hand. Our global Special Interest Groups (SIGS) are ready to ring in the new year with exciting webinars and meetings. We had two very successful joint events in December, one the IEEE Women in Engineering Internet of Things World Forum, the other with SANS Connect. ISSA members can look forward to more of these events throughout 2017. For CISOs, our excellent CISO Execu- tive Forum is set up by a committee of your peers and overseen by CISO Exec- utive Forum chair and International di- rector Debbie Christofferson. This year’s with be at RSA; in partnership with the IAPP conference in Washington, DC; at Black Hat in Las Vegas; and the ISSA International Conference in San Diego. And be sure to join us January 24 for this year’s first ISSA web conference where we discuss more of what to expect in 2017! To our ISSA members across the globe: have a Happy and Good New Year! Moving forward, Happy New Year! Bonne annee’! Szczesliwego Nowego Roku! Feliz año nuevo! Manigong Bagong Taon! Felice Anno Nuovo or Buon anno! Mutlu Yillar! Ein glückliches neues Jahr! Hauoli Makahiki hou! And Shanah tovah u’metuka (‫הנש‬ ‫הבוט‬ ‫)הקותמו‬ or hopes for a good and sweet year! Andrea Hoy, International President
  • 4.
    The information andarticles in this mag- azine have not been subjected to any formal testing by Information Systems Security Association, Inc. The implemen- tation, use and/or selection of software, hardware, or procedures presented within this publication and the results obtained from such selection or imple- mentation, is the responsibility of the reader. Articles and information will be present- ed as technically correct as possible, to the best knowledge of the author and editors. If the reader intends to make use of any of the information presented in this publication, please verify and test any and all procedures selected. Techni- cal inaccuracies may arise from printing errors, new developments in the indus- try, and/or changes/enhancements to hardware or software components. The opinions expressed by the authors who contribute to the ISSA Journal are their own and do not necessarily reflect the official policy of ISSA. Articles may be submitted by members of ISSA. The articles should be within the scope of in- formation systems security, and should be a subject of interest to the members and based on the author’s experience. Please call or write for more information. Upon publication, all letters, stories, and articles become the property of ISSA and may be distributed to, and used by, all of its members. ISSA is a not-for-profit, independent cor- poration and is not owned in whole or in part by any manufacturer of software or hardware. All corporate information se- curity professionals are welcome to join ISSA. For information on joining ISSA and for membership rates, see www. issa.org. All product names and visual represen- tations published in this magazine are the trademarks/registered trademarks of their respective manufacturers. 
4 – ISSA Journal | January 2017 editor@issa.org The Best Articles of 2016 Thom Barrie – Editor, the ISSA Journal Editor: Thom Barrie editor@issa.org Advertising: vendor@issa.org 866 349 5818 +1 206 388 4584 Editorial Advisory Board Phillip Griffin, Fellow Michael Grimaila, Fellow John Jordan, Senior Member Mollie Krehnke, Fellow Joe Malec, Fellow Donn Parker, Distinguished Fellow Kris Tanaka Joel Weise – Chairman, Distinguished Fellow Branden Williams, Distinguished Fellow Services Directory Website webmaster@issa.org 866 349 5818 +1 206 388 4584 Chapter Relations chapter@issa.org 866 349 5818 +1 206 388 4584 Member Relations member@issa.org 866 349 5818 +1 206 388 4584 Executive Director execdir@issa.org 866 349 5818 +1 206 388 4584 Advertising and Sponsorships vendor@issa.org 866 349 5818 +1 206 388 4584 W e’d like to ac- knowl- edge the passing of 2016, not with reminiscing the breaches, malware, privacy invasions, legislations—Andrea, Geordie, and Randy help us out with that—but by cel- ebrating the articles the Editorial Advi- sory Board deemed the best of the year. The 2016 Article of the Year “Machine Learning: A Primer for Se- curity” by Stephan Jou [Toronto Chap- ter]. Stephan lays out the workings of machine learning and artificial intel- ligence, painting a clear picture of this growing technology that some argue is still not ready for prime time. But the promise of combining big data and ma- chine learning—whether for analyzing unimaginably huge amounts of data for business processes or picking up on the bad actors knocking, poking, and prod- ding our infrastructures—has me excit- ed to see how 2017 plays out in this field. The Best of 2016 “Enterprise Security Architecture: Key for Aligning Security Goals with Busi- ness Goals,” by Seetharaman Jegana- than—Seetharaman deserves an hon- orable mention as his article was a very close runner up. “The Role of the Adjunct in Educating the Security Practitioner,” by Karen Quagliata [St. Louis Chapter]. 
“Fragmentation in Mobile Devices,” by Ken Smith. “Gaining Confidence in the Cloud,” by Phillip Griffin [Raleigh Chapter] and Jeff Stapleton [Fort Worth Chapter]. “Crypto Wars II,” by Luther Martin [Sil- icon Valley Chapter] and Amy Vosters. Congratulations to our best authors of the year! A number are already plan- ning to submit further works in the up- coming year. Readers’ Choice for 2016 So, these are the board’s choices. Do you concur? Please take a look through the year and let us know your top three or four selections. We’d love to have a Readers’ Choice. Some of my favorites not mentioned are “Impact of Social Media on Cybersecurity Employment and How to Use It to Improve Your Ca- reer,” Tim Howard [South Texas Chap- ter]; “Stop Delivery of Phishing Emails,” Gary Landau [Los Angeles Chapter]; “Beware the Blockchain,” Karen Mar- tin; “The Race against Cyber Crime Is Lost without Artificial Intelligence,” Keith Moore [Capitol of Texas Chapter]; and “Why Information Security Teams Fail,” Jason Lang. Let me know at editor@issa.org. It’s been a great year in the ISSA Journal. Here’s looking forward to an even bet- ter year. Do you have an article to share. Bring it on. —Thom
  • 5.
    Sabett’s Brief By RandyV. Sabett – ISSA Senior Member, Northern Virginia Chapter (Not) The Best of Cybersecurity, 2016 Version S o how many cybersecurity “Best of 2016” lists have you seen over the past few weeks? Well, this won’t be one of those lists, because as I’ve done in prior years, I’m going to cover events that I think were notable but that weren’t necessarily “best of.” And, as in past years, my wife thinks that this is a silly approach, but here goes anyway… First off, the Internet has survived an- other year. Despite all of the predictions of gloom and doom that have been pos- ited over the past decade or more, we’re still plugging away with the same basic infrastructure we’ve had for several de- cades. To some extent, this survival is a testament to its original design—adapt- able to changing conditions and attacks. Turning to a legislative event from very early in the year, the passage of the Con- solidated Appropriations Act of 2016 included the Cybersecurity Information Sharing Act (CISA). CISA created a vol- untary process for sharing cybersecu- rity information without legal barriers or threats of litigation. DHS and DOJ released additional guidance on infor- mation sharing under CISA in February and June. Based on personal experience in 2016, I find CISA has influenced a number of decisions to share informa- tion, including B2B, B2G, and G2B. Continuing for a moment on the gov- ernment side of things, in February the Administration released the Cyberse- curity National Action Plan (“CNAP”). The CNAP provides a combination of near-term tactical actions and lon- ger-term strategy components intended to “enhance cybersecurity awareness and protections, protect privacy, main- tain public safety as well as economic and national security, and empower Americans to take better control of their digital security.”1 Good stuff, but proper implementation will be critical. 
On the commercial side, businesses continued to be subjected to a variety of ever-evolving threats, including the incredible rise in both frequency and insidiousness of ransomware. 2016 saw ransomware evolve from phish- ing-based attacks on individual ma- chines into an attack mechanism that threatened entire networks. In particu- lar, SamSam (which exploits unpatched servers, moves laterally to any machine it finds, and then encrypts the entire network) proved to be particularly over- whelming. Only robust patching and diligent backups offer resiliency. In 2016, we saw cybersecurity become an integral part of the due diligence process for most M&A transactions (and personal experience bore this out). In fact, according to a recent survey, 85 percent of public company directors and officers say that an M&A transaction in which they were involved would likely or very likely be affected by “major se- curity vulnerabilities.” In addition, 22 percent say that they wouldn’t acquire a company that had a high-profile data breach, while 52 percent said they would still go through with the transaction but only at a significantly reduced value.2 This interest in cybersecurity diligence is not just theoretical: in the midst of an October M&A transaction involv- ing Verizon and Yahoo!, news broke of a Yahoo! breach that had occurred ap- proximately two years earlier. This event raised speculation around what it might do to the deal. To me, the bigger question will be how the overall scope of the due 1 https://www.whitehouse.gov/the-press- office/2016/02/09/fact-sheet-cybersecurity-national- action-plan. 2 https://www.nyse.com/publicdocs/Cybersecurity_and_ the_M_and_A_Due_Diligence_Process.pdf. diligence process will be influenced by cybersecurity in future deals. To round out the year, I will end on a hopefully positive note. 
In December, the findings of the Commission on En- hancing National Cybersecurity were released.3 The Commission had been tasked with developing recommenda- tions for ways to strengthen cybersecu- rity across both the federal government and the private sector. In a statement, President Obama stated that “[t]he Commission’s recommendations...make clear that there is much more to do and the next administration, Congress, the private sector, and the general public need to build on this progress.” Amen to that—all stakeholders must meaningfully participate and address cybersecurity so that everyone benefits. Let’s hope that 2017 sees that partici- pation increase. With that, I hope that your holiday season has been enjoyable and that your new year is off to a great start. Now I’m headed off to the refrig- erator to come up with a top 10 list of leftovers for my wife. Looking forward to hearing from you in 2017! About the Author RandyV.Sabett,J.D.,CISSP,isViceChair of the Privacy & Data Protection practice group at Cooley LLP, and a member of the Boards of Directors of ISSA NOVA, MissionLink, and the Georgetown Cy- bersecurity Law Institute. He was named the ISSA Professional of the Year for 2013, and chosen as a Best Cybersecurity Lawyer by Washingtonian Magazine for 2015-2016. He can be reached at rsabett@ cooley.com. 3 https://www.nist.gov/cybercommission. January 2017 | ISSA Journal – 5
  • 6.
    I f you aregoing to be at RSA Conference this year, or perhaps you picked up a print copy and are reading this in the shad- ow of one of the expo halls, take a mo- ment to think about all the vendors on the floor who are selling amazing kit. If you have not walked the floor yet, be sure to allocate a few hours to do so. I like to start at the edges because that’s often where some of the best new stuff is. But remember, buyer beware. Snake oil salesmen work everywhere! As you speak to these vendors and un- derstand how their products work, you might get caught up in the excitement of new kit and new capabilities, so much that you lose rational thought for a mo- ment. I mean, how else do you end up with three timeshares at the end of a lav- ish Las Vegas weekend? Before you sign on the dotted line, think about the prob- lem that the kit is trying to solve and see if you have already solved it elsewhere (or should solve it elsewhere). Sometimes we forget our roots, but that’s understandable as our industry has grown from nothing to what you see around you in the expo halls over the last twenty years. Those of us who have been around that long certainly remem- ber security as something one of the IT guys did, that and building tools to help us manage our growing infrastructure on a small scale—often times in the same manner that the big vendors do to- day. Before you run to your finance guy for budget, let’s look at a couple basic things we all need to master first. How’s your logging? PCI DSS may have been the first step in forcing companies to capture good and usable logging information, but DevOps is the new darling on the block. Compa- nies I work with tend to check the box for PCI to close that nagging require- ment but have expanded their informa- tion generation capabilities dramatical- ly to gain extremely important insight into their infrastructure as it runs. 
Getting rich logging information to do both user behavior analysis and to gain valuable insights into your infrastruc- ture in real time will power your intel- ligence-gathering capabilities. Many of the products you will see on the fringe this year are going to make the case to shift from SIEM (security information event management) to UEBA (user and entity behavior analytics). If you don’t have solid—and I mean really solid— logging capabilities baked into every layer of your infrastructure, these tools won’t work as advertised. In fact, any tool you see that promises to look for trends, to do machine learning to alert you on anomalies, or to just make you more efficient will struggle to work if you are terrible at logging. Show me machine learning! I was at an expo a few months ago and had a string of vendors tell me about their machine learning capabilities. They show a graph with fifty bars on it, all of which are under a value of, say, ten except for one that is at a thousand. Then they point to it and say MACHINE LEARNING! For the record, that is anomaly detection. My godson who is almost three can do the exact same thing and make you laugh when he does it. Machine learning would be pointing to one of the small bars and telling an analyst to look at that one. Challenge your vendors to go beyond the glitz and buzzwords. Vaporware is just as present today as it has ever been. Machine learn- ing is a fantastic tool, but be sure you are covering your anomaly detection basics first. How’s your debt? Technical debt exists everywhere. It’s that patch you decided to leave off the list, or that coding workaround you built to solve a latency issue, or a default password you left in an application to make support easier. Good companies know exactly how far in debt they are and work to pay this debt back. No com- pany will always be debt free, but man- aging this debt will help you understand how to deploy your limited resourc- es. 
Sometimes it’s a system that has a known flaw in it, but it takes an attacker twenty minutes to compromise. Sounds like that virtual resource will only exist for ten to fifteen minutes at a time until you can address the root cause! This year’s RSA Conference is geared up to be the biggest ever. Tweet me at @BrandenWilliams with a comment about the article before February 18, 2017, and you could be the lucky winner of a $25 Amazon gift card! Look for me around the expo, in a session, or decom- pressing in the airport lounge on Friday as I hurry home for the weekend! About the Author Branden R. Williams, DBA, CISSP, CISM, is a seasoned infosec and pay- ments executive, ISSA Distinguished Fellow, and regularly assists top global firms with their information security and technology initiatives. Read his blog, buy his books, or reach him directly at http:// www.brandenwilliams.com/. Sweat the Small Stuff By Branden R. Williams – ISSA Distinguished Fellow, North Texas Chapter Herding Cats 6 – ISSA Journal | January 2017
Open Forum
Executive Juris Doctor: Rewarding and Influential Career Path

I wanted to write in support of Randy V. Sabett's column, "Who's Ready for a JD?," in the October issue of the ISSA Journal. I agree we need more people with legal education in the security profession, although I will take the position that one does not need to be a full-blown, bar-certified Juris Doctor (JD). I was told while in law school that 70 percent of JDs don't practice law. So, if you don't have the desire to be bar-certified and practice law, a JD may not be the best option for you.

In late 2005, I looked at the future of the security industry and saw that everything we do in security would have an ever-increasing legal implication. Because of that, I decided I needed a better legal education. I did not have any interest in practicing law, so I did not want to go the JD route. I was looking for a Master's in legal studies, but at that time none existed (there are several Master's of legal studies degrees today). I came across an Executive Juris Doctor (EJD) distance learning program. As I describe it, it's a law degree for people who want the same legal education that lawyers get but who have no interest in practicing law. You take the same substantive courses JD students take (e.g., torts, contracts, criminal law, and civil procedure, to name a few), but because the degree is not bar eligible, you don't take the full course load a JD student would take, like wills and trusts or corporations, and there is flexibility to specialize. In my case, I specialized in law and technology and took courses in cyberlaw and intellectual property.

I have reaped huge rewards for having this legal education as a security professional. I have published and presented on legal topics in security since 2009. I was twice published in the ISSA Journal, one article on e-discovery, the other on social media policy.
Being able to take a law and translate it into business processes or technical controls is very hard to do if you do not understand how to read law, how courts will interpret it, or how to parse the rulings coming down from the courts. And laws permeate our entire profession—CFAA, ECPA, HIPAA (which are actually regulations effectuated by legislation), etc.

But there are other advantages to having a legal education. Much like we in the security industry have our own vocabulary, so too do lawyers; being able to speak to lawyers in a language they understand is very important today. For example, I explain to people that the word "risk" means nothing to a lawyer, but when you use the term "liability," you can get a lawyer's attention. As a security professional, when I speak to lawyers using their lexicon, most lawyers light up, become very interested in what I have to say, and become very willing to help me.

Getting a lawyer's attention and support has another advantage—that of a stakeholder in security. Rather than trying, often futilely, to drive security initiatives through finance, marketing, or technology departments or even executive management, I use the legal department as my driving stakeholder. Their job is to protect the organization from liability and lawsuits, and they usually have the ear of the CEO and the board. So, if they are aware of security issues that are creating liability for the organization, they can be your biggest advocate for advancing change.

But I would warn you not to jump into a legal education lightly. There is a tremendous amount of reading and a good bit of writing that goes with a legal education. Also, I pursued my legal education going to school full time and working full time, so I slept about four hours a night for the first nine months I was in school. Be prepared for the amount of time that will be required of you.
That being said, I can say having a legal education has been very advantageous in my career as a security professional, and it is something I am glad I pursued.

About the Author
Dr. Jon J. Banks, EJD, GPEN, CEH, OSWP, CISSP, is a Sr. Security Architect at Link Technologies with 19 years of experience building information security architectures and programs. Since 2009, Dr. Banks has used his legal education to give back to our profession by publishing, presenting, and teaching on various topics in law and information security. He can be reached at jonb@linktechconsulting.com.

By Jon J. Banks – ISSA member, Denver Chapter

The Open Forum is a vehicle for individuals to provide opinions or commentaries on infosec ideas, technologies, strategies, legislation, standards, and other topics of interest to the ISSA community. The views expressed in this column are the author's and do not reflect the position of the ISSA, the ISSA Journal, or the Editorial Advisory Board.
Security in the News
News That You Can Use…
Compiled by Joel Weise – ISSA Distinguished Fellow, Vancouver, BC, Chapter and Kris Tanaka – ISSA member, Portland Chapter

It's Time to Pull Out Your Crystal Ball
What do you think is going to happen with security and technology in 2017? Will things be better, worse, or will they remain status quo? Here is an assortment of forecast articles for your consideration. To me, these predictions are less about the future and more about a replay of 2016. What do we have to look forward to according to security experts? More of the same: the Internet of Things, more viruses and APTs, cloud everything, DoS attacks, ransomware, etc. My personal favorite? Dronejacking. I previously mentioned this to friends at an unnamed online retailer, but in spite of demonstrated attack scenarios they thought it was not possible. As always, it might be fun to hold on to these links and revisit them in December to see how accurate they really were. Here's to the future and keeping cybersafe in 2017! Cheers!
http://www.forbes.com/sites/gilpress/2016/12/12/2017-predictions-for-ai-big-data-iot-cybersecurity-and-jobs-from-senior-tech-executives/#5ff851ee62e9
http://www.usatoday.com/story/money/columnist/2016/12/17/think-cyberthreats-bad-now-theyll-get-worse-2017-spearphishing-etc/95262574/
http://www.infosecisland.com/blogview/24860-Top-10-Cloud-and-Security-Predictions-for-2017.html
https://blog.radware.com/security/2016/12/cyber-security-predictions-2017/
http://www.mcafee.com/us/resources/reports/rp-threats-predictions-2017.pdf
http://www.csoonline.com/article/3150997/security/what-2017-has-in-store-for-cybersecurity.html
https://www.scmagazine.com/gazing-ahead-security-predictions-part-2/article/578976/

Biggest Data Breaches and Hacks of 2016: Yahoo Data Breach, DNC Hacking, and More
http://www.techtimes.com/articles/190021/20161225/biggest-data-breaches-and-hacks-of-2016-yahoo-data-breach-dnc-hacking-and-more.htm
In addition to looking forward, the new year is also a time of reflection and taking stock of what transpired over the past year. Here's a quick look at some of the biggest data breaches and hacks that took place in 2016. And just in case you haven't seen it before, check out this frequently updated, interactive infographic from Information is Beautiful.
http://www.informationisbeautiful.net/visualizations/worlds-biggest-data-breaches-hacks/

Major Cyberattacks on Health Care Grew 63 Percent in 2016
http://www.darkreading.com/attacks-breaches/major-cyberattacks-on-healthcare-grew-63--in-2016/d/d-id/1327779
The Internet of Things continues to open up new attack vectors, particularly in the healthcare industry, as security experts reported a surge in medical device hijacking in 2016. The industry will continue to face challenges in 2017, thanks to predictions of unprecedented levels of ransomware and the increasing ability of hackers to launch multiple attacks at once.

Cybersecurity Confidence Gets a C-.
How to Improve Your Grade in 2017
http://www.csoonline.com/article/3151078/security/cybersecurity-confidence-gets-a-c-how-to-improve-your-grade-in-2017.html
How do you feel about detecting and mitigating cyber threats in your organization? If your answer is "not very confident," you are in good company. According to a new survey, global confidence in cybersecurity is dropping, while challenges, such as the expanding threat environment, are increasing. Although it is easy to get discouraged, especially when we continue to see article after article revealing new breaches and cyberattacks, there are ways we can improve.

Five Ways Cybersecurity Is Nothing Like the Way Hollywood Portrays It
http://www.networkworld.com/article/3151064/security/five-ways-cybersecurity-is-nothing-like-the-way-hollywood-portrays-it.html
Cybersecurity is cool. Just take a look at how many television shows and movies have woven it into their scripts and storylines. But just how accurate is their portrayal of the industry? Yes, Hollywood usually tends to glamorize things. We all know our day-to-day work lives rarely involve the fist-fights and elaborate stunts found in action movies. But the increasing popularity around cybersecurity, even in fictional form, is a good thing. Awareness is one of the best weapons in the fight against the "bad guys."

Increasing the Cybersecurity Workforce Won't Solve Everything
http://www.csoonline.com/article/3153079/security/increasing-the-cybersecurity-workforce-wont-solve-everything.html
The word is out—we all need to focus on cybersecurity, improving our security posture and infrastructure. Even the US government is receiving recommendations and guidelines on how to make this goal a reality. Unfortunately, many of the proposed plans will take time and additional resources. What can you do while you are waiting for these "new" solutions to make an impact? Increase security awareness at all levels in your organization.
As you have heard many times before, all it takes is just one click. Make sure the humans on your network are prepared to make the right choices.
It's been a huge year for information security in the public eye. It seemed like security was constantly in the news for massive corporate security breaches, election email leaks, or draconian new cyber laws.

We had Apple vs. the FBI. Tempers flared. People got hysterical. And that was just the FBI's legal team. Not all the commentary was credible. Those well-known encryption experts, the National Sheriffs' Association, stated that Apple was "putting profit over safety" and that this had "nothing to do with privacy." Aww, bless.

Yahoo announced yet another huge breach. It's sad to see the once mighty Internet giant slowly transitioning from respected Internet pioneer to a honeypot experiment with live customer data. The official line was that Yahoo had been the victim of "state-sponsored" attacks. That sounds a lot better than being repeatedly caught out with obsolete security controls like MD5 hashing to protect customer passwords. To be fair, MD5 can be considered very strong. But only if your threat model is focused on Russian cryptographers attacking through a star gate from the 1990s.

James Clapper announced his resignation. The man who with a straight face denied to the US Congress that data was being collected on millions of Americans is leaving the building. His exit interview would have been a hoot. Have you held anything back? Is there any classified information that you've failed to return? Um, "Not wittingly." Under Clapper's direction, national security objectives have prospered. However, technologies we all depend on have been weakened, exposing us to risk from cybercriminals and repressive regimes. The profits of US companies have suffered as they've struggled to convince global customers that their data is safe with a US company. If you're a US citizen, you might think the national security trade-off was worth it.
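For readers who missed why the MD5 jab lands: MD5 is a fast, unsalted general-purpose hash, which is exactly what you do not want for password storage. A small illustrative Python sketch (the password is invented; the iteration count is just a plausible example) contrasts it with a salted, deliberately slow key-derivation function from the standard library:

```python
import hashlib
import os

password = b"hunter2"  # invented example password

# What Yahoo was mocked for: one fast, unsalted MD5 digest per password.
# Identical passwords produce identical digests, and attackers can test
# billions of candidate guesses per second against a leaked table of these.
md5_digest = hashlib.md5(password).hexdigest()
assert hashlib.md5(b"hunter2").hexdigest() == md5_digest  # no salt: same in, same out

# A more defensible design: a per-user random salt plus a slow KDF, so every
# guess costs the attacker (and your login path) 200,000 hash iterations.
salt = os.urandom(16)
kdf_digest = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

# The same password under a different salt yields a different digest,
# which also defeats precomputed lookup tables.
assert hashlib.pbkdf2_hmac("sha256", password, os.urandom(16), 200_000) != kdf_digest
```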
However, if you live anywhere else in the world, or you're a US company that has lost customers, then you might have a different view.

In November the most intrusive powers ever proposed for the UK intelligence services were made law in the UK. Critics protested that the new law gave too many government agencies access to people's browsing history without the need for a warrant. In fact, the list of agencies that can access browsing data without a warrant is so large that it might have been quicker just to list those that can't. On the plus side, we can all sleep safely knowing that the Welsh Ambulance Services National Health Service Trust knows what we're doing online. Privacy activists took the UK government to the European Court of Justice, which ruled in December that government agencies needed independent judicial oversight and that access had to be in response to serious crime. If you swap "web history" with "that special bedroom drawer," then the judgment is entirely consistent with real-world privacy.

There were persisting concerns about the security weaknesses of voting machines in the US elections. We should be grateful that the winner of this partly automated vote count wasn't Select *.

The FBI learned that Hillary Clinton's campaign chief John Podesta's email had been compromised. Unfortunately all their agents were busy ogling Anthony Weiner's laptop, so they just left a message with Podesta's IT help desk. It's a mystery why Weiner's laptop deserved thousands of hours of agent time while the compromise of Podesta's email by a foreign power didn't merit an agency visit.

2016 was also the year that the burgeoning Internet of trash really started to stink. Brian Krebs's website was hit with the largest distributed denial-of-service attack ever: a great amorphous pudding of hijacked IP-enabled household appliances. People started waking up to the risks. Some even asked, what's the point of a rice cooker having an IP address?
Here's to 2017.

About the Author
Geordie Stewart, MSc, CISSP, is the Principal Security Consultant at Risk Intelligence and is a regular speaker and writer on the topic of security awareness. His blog is available at www.risk-intelligence.co.uk/blog, and he may be reached at geordie@risk-intelligence.co.uk.

Security Awareness
Security in the News in 2016
By Geordie Stewart – ISSA member, UK Chapter
Image used with permission
Crypto Corner
A Feeble Attempt at Humor
By Luther Martin – ISSA member, Silicon Valley Chapter

Information security professionals in general, and cryptographers in particular, are not known for their senses of humor. This could be because the most common personality type in information security is MBTI type INTJ. People of type INTJ tend to be very competent but coldly rational. The characters Greg House from the TV show House and Sherlock Holmes from the TV show Sherlock are examples of how INTJs may come across to most people. But this does not mean that we do not appreciate humor when we see it. Every ten years or so, I come across examples of humor that seem to appeal to some security professionals and to almost all cryptographers. Here are three examples.

The word "rogue" is often misspelled as "rouge." I first noticed this back in the dot-com era when a discussion started on a mailing list about how to handle "rouge CAs." After other list members exchanged a few messages, I could not help asking what these "rouge CAs" were. I asked if they were described in some document that I had not "red," but suggested that they were probably real, rather than something that someone would just "makeup." Only one other list member seemed to understand my attempt at humor, while many others tried to provide serious answers to my obviously (at least to me) flippant questions. This might have been when I first suspected that humor might be quite rare in some parts of the security industry. It also might not have been as funny as I thought it was at the time.

Several years later, I heard a joke in a rather unusual context. Apparently some hiring manager thought that being able to understand and laugh at this particular joke was a good criterion to use for selecting employees. Really. Here is the joke, reproduced as well as my memory allows. This one requires more thought than the first one. You should not feel bad if you do not understand it right away. But even if you do understand it, you might want to feel lucky that you did not end up working for this particular company.

Three cryptographers walk into a bar. The bartender says, "Are you all having beer tonight?" "Hmm," says the first cryptographer, "I don't know." "Hmm," says the second cryptographer, "I don't know." "Yes," says the third cryptographer.

I'm not sure where explaining this joke ranks compared to other pointless interview questions, like asking how many ping-pong balls it would take to fill a school bus or asking why manhole covers are round, but it seems to me like it is roughly just as useful. This joke actually made me laugh. It also made me wonder exactly how the discussion went among the people doing interviews that led to this particular element being added to their interview process. I assume that nobody starts with the goal of making a bad decision, but using this as part of an interview seemed as good an example of something resulting from a bad decision as anything I have ever seen.

The third and final example of humor is another one that I had the dubious honor of creating. It is even harder to understand than the previous joke—unless you spent time in college studying the theory of computation, of course.

Several years ago I had to give a talk in Pittsburgh one morning, and then drive to Cincinnati that afternoon for a meeting the next day. The roads through that part of the US are notoriously bumpy and busy, and when I finally made it to Cincinnati that evening, I was very tired. When I went to check in at my hotel, I was greeted by an enthusiastic and cheerful young woman. "How are you today?" she asked. "I'm tired," I replied, perhaps a bit too truthfully. Not realizing that I was a cryptographer, she misattributed another profession to me. "Being a traveling salesman can be tough," she said. "Yes," I said, "it can be.
And the worst part is how NP-hard the car seats can get." "What?" "Never mind."

What have I learned from my many years of experience in the security industry? Apparently not enough. I still have a bad habit of starting talks with a joke, no matter how many times it ends up failing miserably. But isn't that what we should expect from an INTJ?

About the Author
Luther Martin is a Distinguished Technologist at Hewlett Packard Enterprise and the author of the first attempt at humor published in the ISSA Journal ("The Information Security Life Cycle," March 2008). You can reach him at luther.martin@hpe.com.
Association News
Through January 13, 2017 – For information: www.issa.org/events/EventDetails.aspx?id=712365&group=

The second research report from the groundbreaking global study of cybersecurity professionals by ISSA and independent industry analyst firm Enterprise Strategy Group (ESG) has been released. In aggregate, 54 percent of cybersecurity professionals surveyed admitted that their organizations experienced at least one type of security event over the past year. Yet, surprisingly, none of the top contributors to these cyber attacks and data breaches are related to cyber technology. Rather, they point to human issues such as a lack of enough cybersecurity staff members as well as a lack of employee training and boardroom prioritization. Further supporting this finding, 69 percent of cybersecurity professionals say the global cybersecurity skills shortage has had an impact on the organization they work for, leading to excessive workloads, inappropriate skill levels, high turnover, and an acute shortage especially in the areas of security analytics, application security, and cloud security.

In this time of fluid world events, such as the US presidential transition, cybersecurity professionals surveyed also send a strong message to national government: the vast majority believe that their nation's critical infrastructure is extremely vulnerable or vulnerable to some type of significant cyber attack, and they want government more involved in cybersecurity strategies and defenses. Going further, they recommend specific actions government should take, leading with providing better ways to share security information with the private sector, incentives to organizations that improve cybersecurity, and funding for cybersecurity training and education.

"There's lots of research indicating a global cybersecurity skills shortage, but there was almost nothing that looked at the associated ramifications.
Based upon the two ESG/ISSA reports, we now know that beyond the personnel shortage alone, cybersecurity professionals aren't receiving appropriate levels of training, face an increasing workload, and don't always receive adequate support from the business," said Jon Oltsik, ESG senior principal analyst. "Simply stated, these findings represent an existential threat. How can we expect cybersecurity professionals to mitigate risk and stay ahead of cyber threats when they are understaffed, underskilled, and burned out?"

Based upon the data collected from the first global survey to capture the voice of cybersecurity professionals on the state of their profession, this final report of the two-part series, titled "Through the Eyes of Cybersecurity Professionals: Annual Research Report (Part II)," concludes:
• The clear majority (92 percent) believe that an average organization is vulnerable to some type of cyber attack or data breach
• People and organizational issues contribute to the onslaught of security incidents
• Most organizations are feeling the effect of the global cybersecurity skills shortage
• Cybersecurity professionals have several suggestions to help improve the current situation
• Sixty-two percent believe critical infrastructure is very vulnerable to cyber attacks
• Sixty-six percent believe government cybersecurity strategy tends to be incoherent and incomplete
• Eighty-nine percent of cybersecurity professionals want more help from their governments

"The results gleaned from this research are both alarming and enlightening. Alarming in the sense that if we don't collectively pay attention to the cries for help, we will put businesses unnecessarily at risk.
Enlightening in that organizations need to be willing to invest in their cybersecurity professionals, with clearly defined career paths and skills development, in order to hire and retain qualified employees," said Candy Alexander, cybersecurity consultant and chair of ISSA's Cybersecurity Career Lifecycle. "This research data will help ISSA and other professional groups to clearly define career paths for our profession."

The Voice of Cybersecurity Professionals (Part II): Research Reveals "Human" Issues as Top Cybersecurity and Business Risk

Figure 1 – Impact of cybersecurity skills shortage: Has the global cybersecurity skills shortage impacted your organization over the past few years?
CSCL Pre-Professional Virtual Meet-Ups
ISSA.org => Learn => Web Events => CSCL Meet-Ups

So, you think you want to work in cybersecurity? Not sure which way to go? Not sure if you're doing all you need to do to be successful? Check out the Pre-Professional Virtual Meet-Ups to help guide you through the maze of cybersecurity.

January 19, 2017: 2:00 p.m. – 3:30 p.m. EDT. Future Challenges: Are You Ready?
This discussion will look at the history of security and technology in order to identify what has changed and what hasn't, as well as lessons learned from our past to help prepare for our future. We will review methodologies, technologies, and business practices. Are the challenges really all that different?

2016 Security Review and Predictions for 2017
2-hour live event: Tuesday, January 24, 2017
9 a.m. US-Pacific / 12 p.m. US-Eastern / 5 p.m. London
2016 was a monumental year in cybersecurity: from email hacking impacting the US political world to the October DNS attacks and the ongoing rise of ransomware and IoT concerns. "Cyber" is huge right now. How will this growing spotlight on security translate in terms of media and regulatory attention? And what kinds of threats will dominate the 2017 landscape? Join us, make notes, and then check back in a year to see how we did!
For more information on this or other webinars: ISSA.org => Web Events => International Web Conferences

ISSA.org => Learn => CISO Executive Forum
The CISO Executive Forum is a peer-to-peer event. The unique strength of this event is that members can feel free to share concerns, successes, and feedback in a peer-only environment. Membership is by invitation only and subject to approval. Membership criteria will act as a guideline for approval.
The 2017 venues will be the following:
San Francisco, CA – Innovation and Technology – February 11-12, 2017
Washington, DC – Information Security, Privacy, and Legal Collaboration – April 20-21, 2017
Las Vegas, NV – Security Awareness and Training: Enlisting Your Entire Workforce into Your Security Team – July 23-24, 2017
San Diego, CA – Payment Strategies: The Game Has Changed – October 11-12, 2017
For information on sponsorship opportunities, contact Joe Cavarretta, jcavarretta@issa.org.

ISSA CISO Virtual Mentoring Series
LEARN FROM THE EXPERTS! If you're seeking a career in cybersecurity and are on the path to becoming a CISO, check out the 19 webinars from April 2015 through December 2016!
ISSA.org => Learn => Web Events => CISO Mentoring Webinar Series

ISSA.org => Career => Career Center
Looking to Begin or Advance Your Career?
The ISSA Career Center offers a listing of current job openings in the infosec, assurance, privacy, and risk fields. Visit the Career Center to look for a new opportunity, post your resume, or post an opening. Questions? Email Monique dela Cruz at mdelacruz@issa.org.

The report also lays out the "Top 5 Research Implications" as a guideline for cybersecurity professionals and the organizations they work for. "Assume your organization will experience one or several cyber attacks or data breaches, and take the cybersecurity skills shortage into account as part of every initiative and decision. Push for more all-inclusive cybersecurity training and, as importantly, get involved in educating and lobbying business executives and government legislators alike," recommended Oltsik.

Leslie Kesselring, ISSA Public Relations Consultant

—"Through the Eyes of Cybersecurity Professionals: Annual Research Report (Part I)": http://www.issa.org/esgsurvey/
—"Through the Eyes of Cybersecurity Professionals: Annual Research Report (Part II)": https://www.issa.org/page/issaesg_survey_P2
ISSA: DEVELOPING AND CONNECTING CYBERSECURITY LEADERS GLOBALLY

Machine Learning: A Primer for Security
By Stephan Jou – ISSA member, Toronto Chapter

"Machine learning is revolutionizing the security landscape."

The author examines how machine learning can be leveraged to address the practical challenges of delivering lower-cost security by resolving more threats faster, with fewer resources. It will focus on machine learning security techniques that work at typical levels of data volumes, from those operating with "small data" to those implementing data lakes.

Popular responses to that statement are all over the map. Some say machine learning is vastly overhyped in our market, while others contend it is the combination of machine learning with access to more data that is the main reason to be optimistic about security in the future.

In the day-to-day world of data security, analytics practitioners who have embraced machine learning are regularly catching bad actors, such as externally compromised accounts or malicious insiders. We do this by using machine learning and analytics to detect indicators of compromise and predict which employees or associates are likely to leave with stolen data. We succeed when we define what is normal, then determine anomalies using machine learning. Machines are simply faster at repetitive tasks like finding inconsistencies in the patterns of data usage, and machines do not tire from scouring through billions of data events per day.

At present, the cybersecurity industry is still behind the curve in demonstrating the kind of success that machine learning has achieved in some other industries. But with rapidly growing volumes of data and better behavioral monitoring aimed at leveraging data, big data, and data lakes, machine learning and security clearly will achieve more breakthroughs together.

There are two good reasons why machine learning is useful to security.
First, it can reduce the cost of standing up and maintaining a security system. In this industry, we've spent billions, yet we clearly need better tools to protect our data. The bad guys still have better tools than the good guys, and it still costs too much to investigate and respond to security incidents. The nature of defense is that it simply takes time to build up resistance, only to have a new attack render that defense ineffective or obsolete. This leads to the second reason that machine learning is important: it can reduce the time required to detect and respond to a breach once the inevitable occurs. Proper use of machine learning can have a measurable impact on deployment time and cost, as well as dwell time from incident to response.

In this article, I will examine how we leverage machine learning to address the practical challenges of delivering lower-cost security by resolving more threats faster, with fewer resources. I will focus on machine learning security techniques that work at typical levels of data volumes, from those operating with "small data" to those of us implementing data lakes. My purpose is to empower security teams to make use of machine learning to automate what skilled experts can do: prioritize risks so that experts can focus attention on those high-threat anomalies that signify targeted attacks, compromised accounts, and insider threats.

2016 Article of the Year

Automate and learn: What machine learning does best

The concept of machine learning is based on the idea that we can use software to automate the building of analytical models and have them iteratively learn, without requiring constant tuning and configuring. Machine learning, if implemented properly, learns by observing your company's particular data. It should not require rules, tool kits, or a team of data scientists and integrators to endlessly examine the datasets in order to become operational. Similarly, the software should not require a team with system administration or DevOps skills to architect a big data infrastructure. Many companies' experiences with analytics date back to when scientists and integrators had to spend months, or even years, to understand the business and how every aspect of the dataset intersected with users and machines. This is no longer the case. Modern machine learning works with the data in your organization, observing it persistently through continuous user, file, and machine monitoring.

Further, machine learning can react automatically to typical business changes by detecting and reacting appropriately to shifting behavior. This is often a surprise to companies accustomed to bringing in teams of consultants and having to re-engage them when a new business unit is created or a merger occurs. The expectation is that if there are new behaviors, the old software must be reconfigured, rules constantly rewritten, and new thresholds created.
But if done correctly, machine learning can learn—then automatically continue to learn—based on updated data flowing through the system. Just as a teacher doesn’t have to tell an equation how to compute the average grade score for the population of a class, the same equation for computing averages will work in classrooms everywhere—or when classes are added or removed.

Math is magical, but not magic

The fact is, math cannot do anything that a human can’t do, given enough time and persistence. Math simply expresses what is happening in an automated fashion using equations. In machine learning, such equations are implemented as software algorithms that can run continuously and tirelessly. There is plenty of mystique around the seemingly limitless capabilities of “magical” algorithms that are, in reality, far less responsible for what machine learning can do for security than the data itself. In fact, connecting the data to the math (a process known as feature engineering) and then implementing the math at scale (using appropriate big data technologies) is where the real magic of machine learning for security lies.

Cost and time essentials

One way to understand how machine learning can have an impact on cost is to look at the steps required to install and use an analytical product. We all know there is fixed time associated with installation and configuration, but it is the tuning and training of the analytics that has historically been costly.

There are many steps involved in the process between deciding to build a security analytics-enabled process and receiving valid analytics that can detect and respond to incidents. Choosing the right approach can significantly reduce the time and the cost between the project start and when value can be provided. Specifically, choosing a proper machine learning-based approach that does not require manual tuning, customization, building of rules, etc., can greatly accelerate the time to value (figure 1).

Figure 1 – Time to value: Security analytics using rules, versus security analytics using machine learning

Whether total deployment time is fast (a couple of hours or a few days) or painfully slow (as long as a year!) is largely dependent on the capabilities of the analytics. The real cost disparity emerges when we ask questions such as:
• Do I need to set thresholds?
• Will we have to write rules?
• Am I paying service fees for these capabilities?
• How easy is it?

To get value from the system, you obviously want to ask the essential question: How long before we can actually learn something about a breach? By asking and answering this, we can know time to value. To obtain the answer, we need to focus on how machine learning extracts value.

It’s popular to focus attention on the algorithm, most likely because algorithms such as deep learning have recently been achieving exciting successes in the news. And it’s naturally easy to get lost in that excitement! However, more important than the algorithm is a focus on the right data and the corresponding use case appropriate for your particular organization. Getting the right datasets for the job and applying the right principles will trump any given algorithm, every time. With this approach, we can allow machine learning to do what it does best: find evidence, and connect the dots between pieces of evidence, to create a true picture of what is happening.

This “connecting of dots” is important because it allows us to show corroboration across datasets. When security professionals talk about alert fatigue, they are really referring to the need for better corroboration so they can reduce the number of results the system fires. Simply put, when we have alert fatigue, the math is not helping us compress the results that the system is finding. But math can help compress billions of events per day into dozens of incidents by effectively scoring all events, and then corroborating multiple scored events together. A machine learning implementation further means that this approach to reducing false positives and alert fatigue can be done automatically, to give us the reduced cost and faster time to value we’re looking for. But how does that work?

The value of a score: Probabilistic methods vs. rules and thresholds

One important machine-learning technique is using probabilistic statistical methods1 to score events for risky indicators, rather than relying on rules with thresholds that either fire or do not fire. When we talk about scoring an event, we are simply talking about computing a number, for example, between zero and 100. This contrasts with relying on rules that issue a Boolean alert. Boolean alerts either fire or do not fire, based on parameters and thresholds the operator has set. The problem with this approach is that since alerts either fire or do not fire, as the alerts accumulate (in your SIEM, for example), the best we can do is count them. Having 10 alerts, all with limited severity information and context, delivers little information that is helpful.

1 For a good overview of probabilistic and statistical methods as they apply to machine learning, see: Murphy, K. P. 2012. Machine Learning: A Probabilistic Perspective, Cambridge, Massachusetts: MIT Press.

When we score events for risk, we can assign them meaning—for example, 0% is no risk, while 100% is the most extreme risk—and then more smartly aggregate risk values to get a combined picture of the risks involved. Risk scores can give additional context by being associated with not only a particular activity, but also with the assets, people, and machines involved. Mathematical weighting helps us tune and train our model for specific activities, people, assets, and end points on a per-behavior-pattern basis.

Aggregating scores, rather than simply counting alerts, is more effective because we can define a weighted representation of how risky behavior is. In contrast, if all you have is an alert, you can only say that “X” things happened. While it’s true that we can label events, labeling things either good or bad does not help. In fact, it can be risky. It quickly becomes easy to ignore low-probability events or trick the system into ignoring them. You can see why it is possible to get 10,000 alerts when the threshold is set too low, for example. In a typical medium-size business environment, it is quite likely to have the data present us with billions of “events”—multiple bits of evidence of what is happening to the data.

Machine learning can work quickly to distill these billions of events to tell the difference between low- and incredibly high-risk events, and then connect them together for a picture, or handful of pictures, that can tell us what is going on. Here, math helps us compress the results, so instead of having alert fatigue or a group of patterns with arbitrary values, we have a clear picture using statistics of what is anomalous.

In addition to using scoring, effective machine learning in data security lets us use probabilistic math rather than thresholds. Probabilistic methods are better than thresholds because they tell us not just about badness, but the probability or degree of badness. We can compute all of the events, not just those arbitrarily deemed likely to be interesting. We can much more accurately assess the overall risk posture of any entity and actually measure what security experts are trained to look for—bad or at least “weird” things happening to their data. Finally, we can collect and score all of the events and compute their likelihood of causing us problems. In this way, we create a system that can learn automatically.

This automatic learning is an important component of why the machine learning approach works. Automatic means no rules must be fine-tuned, no thresholds must be tweaked, no maintenance must be performed when your business shifts. But how does machine learning pull off this trick?

How machines learn

Machines don’t learn in a vacuum; machines learn by continually observing data. Given enough data, machines can turn data into patterns. Observation of patterns can lead to generalizations, a process accomplished by taking examples and creating general statements or truths.
This learning process is true not just of machines, but of humans. Machine learning is nothing more than algorithms2 that automate this same learning process that we as humans do naturally. Consider that when we as humans see something, we know what we probably saw because it is most similar to what we’ve seen before. This is actually an example of a machine learning algorithm known as “nearest neighbor” (or k-nearest neighbors, for the picky).

2 There are many good books that introduce the concepts of machine learning. The following book is short and very readable, and does not require a deep math background: Adriaans, P. and Zantinge, D., 1996. Data Mining, England: Addison-Wesley Longman. The following is a great reference for those more comfortable with mathematical notation: Tan, P.-N.; Kumar, V. and Steinbach, M. 2006. Introduction to Data Mining, Boston: Addison-Wesley Longman. For the coders, try: Conway, D. and White, J. M. 2012. Machine Learning for Hackers, O’Reilly.

Here is an example of applying machine learning to determine whether an animal is a cat or a dog. By fitting points to a line, we can observe that when we see an animal and it has long whiskers (cats) and longer tails (also cats), it is more likely to be a cat than a dog. The more examples we see, the more generalizations prove the rule. While it’s true that sometimes a cat has a short tail and occasionally a dog has really long whiskers, it is mostly not the case. Clusters emerge showing cats and dogs. Children quickly recognize by this method what is a cat and what is a dog. Algorithms, when given examples, can be created to do the same thing, using math to automate this process.

Suppose we go around our neighborhood and measure the whisker lengths and tail lengths, in inches, for the first 14 pets we see. We may end up with a set of data points like the following (table 1):

Whisker Length (input)   Tail Length (input)   Cat or Dog? (output)
5                        6                     Cat
5.7                      11                    Cat
4.3                      9.5                   Cat
4.2                      7                     Cat
6.4                      8                     Cat
5.9                      10                    Cat
5.2                      9                     Cat
2.3                      5                     Dog
2.5                      3                     Dog
4                        9.5                   Cat
2.1                      7                     Dog
1.3                      9                     Dog
3.4                      7.5                   Dog

Table 1 – Whisker and tail lengths of sample pets

As a human, when given a set of observations that look like figure 2, you might eventually conclude (or learn) that cats generally have longer tails and whiskers than dogs.

Figure 2 – A plot of neighborhood dogs and cats, and their tail and whisker lengths, in inches

There are two broad classes of machine learning: supervised learning and unsupervised learning. In supervised learning, we are given the answers. In our cat and dog example, suppose that whenever we are given a whisker length and tail length, we are also told whether the animal is a cat or a dog; this is an example of supervised learning. Rather than simply asking us to “find me dogs and cats,” the data told us what these animals are. Since we, in turn, advised the algorithm about whisker and tail length, this class of algorithm is known as supervised learning. It requires accurate examples.

The model, represented visually by the dotted line (figure 3), states that if the tail and whisker length is to the left of the dotted line, declare the animal to be a dog. If it’s on the right, call it a cat.

Figure 3 – A simple model that distinguishes between dogs and cats, based on tail and whisker length

Using the learned model shown in figure 3, we can start to make predictions. When we see animal X, and measure its tail and whisker length, we would predict that it’s a cat, since it is to the right of the dotted line (figure 4). X’s long whiskers and long tail give it away!

Figure 4 – Predicting with a model

In unsupervised learning, we hope that a grouping (or clustering) pattern emerges based solely on the input data, without any output labels (figure 5). The data tells the story, self-organizing into clusters. In general, unsupervised learning is a much harder problem than when output labels are available.

Figure 5 – Data points without labels
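The nearest-neighbor idea can be made concrete in a few lines of Python. The sketch below implements a simple k-nearest-neighbors vote over the measurements from table 1; it is an illustration of the concept, not code from the article, and the query points stand in for new animals like X in figure 4.

```python
# k-nearest-neighbors sketch using the whisker/tail measurements from table 1.
from math import dist  # Euclidean distance (Python 3.8+)

# ((whisker length, tail length), label) pairs, as listed in table 1
pets = [
    ((5.0, 6.0), "Cat"), ((5.7, 11.0), "Cat"), ((4.3, 9.5), "Cat"),
    ((4.2, 7.0), "Cat"), ((6.4, 8.0), "Cat"), ((5.9, 10.0), "Cat"),
    ((5.2, 9.0), "Cat"), ((2.3, 5.0), "Dog"), ((2.5, 3.0), "Dog"),
    ((4.0, 9.5), "Cat"), ((2.1, 7.0), "Dog"), ((1.3, 9.0), "Dog"),
    ((3.4, 7.5), "Dog"),
]

def predict(whisker, tail, k=3):
    """Label a new animal by majority vote of its k nearest neighbors."""
    neighbors = sorted(pets, key=lambda p: dist(p[0], (whisker, tail)))[:k]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)

print(predict(6.0, 9.0))   # long whiskers and a long tail
print(predict(2.0, 4.0))   # short whiskers and a short tail
```

An animal with long whiskers and a long tail lands among the cats in the plot, so its nearest neighbors are cats and the vote returns “Cat”; the short-whiskered, short-tailed query resolves to “Dog” the same way.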
Unsupervised learning means we do not have any “labels,” so we are not told the “answers.” In other words, we observe a set of whisker and tail lengths from 14 animals, but we do not know which are cats and which are dogs. Instead, all we might know (if we’re lucky!) is that there are exactly two types of animals. We might still arrive at a good model to distinguish between dogs and cats (such as the one illustrated in figure 4), but this is clearly a harder problem!

In general, security use cases require a mix of supervised and unsupervised learning because datasets sometimes have labels and sometimes do not. An example of a dataset where we have a lot of labels is malware: we have many examples of malware in the wild, so for many malware use cases, we can use supervised learning to learn by example. An example of a dataset where we have little to no labels is anything related to insider threat or APT; there is generally not enough data available to rely on supervised learning methods.

The importance of the input

The input that you give your machine learning model matters significantly. In trying to distinguish cats from dogs, knowing to focus on whisker and tail lengths allowed our machine learning to be successful. If we had chosen less meaningful inputs—such as trying to distinguish cats from dogs by the number of legs—we would have been less successful. The process of picking and designing the right inputs for a model is critically important to succeeding with analytics.

For security use cases, research and experience must guide the feature engineering process so that the right model inputs are chosen. For example, we know from CERT, Mandiant, and others that good indicators of insider threat and lateral movement are related to unusually high volumes of traffic. Our own research has discovered that the ratio of an individual’s writes to and reads from an intellectual property repository—something we affectionately call the “mooch ratio”—is a valuable, predictive input as well. By observing such indicators, an effective machine-learning system can predict who might be getting ready to steal data.

As you can see, the most important part of data science is selecting the inputs to feed the algorithm. It’s an important enough process to have its own special name: feature engineering. Feature engineering, not algorithm selection, is where data scientists spend most of their time and energy. This process involves taking data—for example, raw firewall, source code, or application logs—understanding the semantics of the dataset, and picking the right columns or calculated columns that will help surface interesting stories related to our use case. A feature is little more than a column that feeds the algorithm. Picking the right columns or features gets us 90 percent of the way to an effective model, while picking the algorithm only gets us the remaining 10 percent. Why? If we are trying to distinguish between cats and dogs, and all we have as inputs are the number of legs, the fanciest algorithm in the world is still going to fail.

But how do we determine the right features? Selecting features requires knowledge. For example, we might include our historical experience or studies from industry organizations such as CERT, academic research, or our own brainstorming. This type of knowledge is the reason we need experts who can take what is in their heads and ask machines to automate it. Creating good features is a far better use of people skills and money, anyone would agree, than hiring expensive hunters to sift through a sea of alerts. Machine learning simply allows us to automate typical patterns so that our highly qualified hunters can focus on the edge cases specific to the company and the business.

Online vs. offline learning

There are two modes of machine learning: online and offline. Offline learning is when models learn based on a static dataset that does not change. Once the models have completed their learning on the static dataset, we can then deploy those models to create scores on real-time data. Traditional credit-card fraud detection is an example of offline learning. Credit card companies can take a year of credit card transactions and have models learn what patterns of fraud look like. The learning can take many days or weeks to actually complete. Once completed, those models can be applied in real time as credit-card transactions occur, to flag potentially fraudulent transactions. But the learning part was done offline from a static dataset.

Online learning occurs when we take a live dataset and learn from it as the data comes in, while simultaneously deploying models to score activity in real time. This process is quite a bit harder, since we are taking data as it comes in, using live data to get smarter and run models at the same time. This is the nature of modern, machine learning-based credit card fraud detection. It notices what you personally do or do not do. It involves individualized data, simultaneously scoring activity. We use machine learning online to learn and react at the same time.

This distinction is important because, for security, many of our use cases require learning new patterns as quickly as possible. We do not always have the luxury of using offline machine learning to collect months and years of data. Instead, it is often more desirable to have models that learn as quickly as possible, as data comes in, and also react as quickly as possible, as data changes.

Historically, much of the machine learning we have done is offline because it has been hard to move and analyze data fast enough to run at scale. But now, with big data technologies such as Hadoop,3 HBase,4 Kafka,5 Spark,6 and others, we are able to learn and score as data streams into our system. The speed and volume of our data feeds are so much greater than ever before. Online learning (building the models) and scoring (running the models) on terabytes of data a day is now technically possible, whereas it would have been impossible a decade ago.

3 Hadoop – http://hadoop.apache.org.
4 HBase – https://hbase.apache.org.
5 Kafka – http://kafka.apache.org.
6 Spark – http://spark.apache.org.
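As a concrete illustration of the feature engineering described above, here is a sketch of deriving the “mooch ratio” from raw repository access logs. The article does not give the exact formula, so reads divided by writes is an assumption here, and the log records, field layout, and user names are all invented for illustration.

```python
# Feature-engineering sketch: deriving a "mooch ratio" style feature
# (how much a user reads from an IP repository relative to what they
# write back) from raw access logs. All data here is hypothetical, and
# reads-divided-by-writes is an assumed formulation of the ratio.

from collections import Counter

access_log = [  # (user, action) pairs parsed from repository audit logs
    ("alice", "read"), ("alice", "read"), ("alice", "write"),
    ("bob", "read"), ("bob", "write"), ("bob", "write"),
    ("mallory", "read"), ("mallory", "read"), ("mallory", "read"),
    ("mallory", "read"), ("mallory", "read"), ("mallory", "read"),
]

def mooch_ratio(user):
    """Reads divided by writes: a high value may indicate someone who
    consumes far more intellectual property than they contribute."""
    counts = Counter(action for u, action in access_log if u == user)
    return counts["read"] / max(counts["write"], 1)  # avoid divide-by-zero

for user in ("alice", "bob", "mallory"):
    print(user, mooch_ratio(user))
```

The point is not the arithmetic, which is trivial, but the selection: a raw audit log has many columns, and it takes domain knowledge to decide that this particular calculated column is worth feeding to the model.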
Leveraging the data lake

A final reason that machine learning is more important to security now than ever becomes clear when we consider its use with data lakes. Data lakes matter because they can be input sources for the storage of data logs, as well as a repository of an organization’s intellectual property around which we build protection. Clearly, we need big data analytics and automated methods in order to see what threats are happening in this realm. Increasingly, big data lakes are giving us the opportunity to analyze, detect, and predict threats—beyond seeing what has happened—for compliance and forensics purposes.

This trend has occurred, in part, because data has gotten too big to store in a SIEM. As we know, most SIEMs can practically store only a few months of data; anything older is dropped or stored where it is not available for analysis. Increasingly, organizations have focused on Hadoop and related technologies as a more cost-effective way to act as the system of record for log files. But how can we better detect threats once we are storing data (e.g., log files) in our Hadoop data lake?

Search, visualize, detect, predict—and repeat

As with any data, we want to be able to search, visualize, detect, and predict threats. With machine learning, we want to combine human expertise with automated analyses for faster, more accurate results. All of these tasks are harder on big data, which requires newer technologies capable of handling them at scale.

Data lakes let us search across and join all our datasets into a single query. We want to be able to search, for example, on terabytes of data per day. And for this, we have widely available big data-suitable technologies like Solr7 and Elasticsearch.8 Such technology lets us scalably index across all analyses from all detected threats, from all datasets in the data lake. Technologies like Kibana are now readily available to give us a friendly UI and API to search and visualize our results.

7 Solr – http://lucene.apache.org/solr/.
8 Elasticsearch – https://www.elastic.co/products/elasticsearch.

However, visualizing big data is hard. You can imagine how a pie chart of a thousand users, in which each slice corresponds to one person, leads to a sea of color (figure 6).

Figure 6 – A pie chart showing the top 100 most active tweeters. Source: http://chandoo.org/wp/2009/08/28/nightmarish-pie-charts/

Visualization in the data lake is obviously an enormous field for research involving the challenge of how to take huge amounts of data and convey meaning. It requires understanding, aggregating, summarizing, and the ability to drill down into different levels of detail. Techniques from visualization research—like focus-and-context visualization or an understanding of visual cognition and biological percepts—all come into play here. In other words, visualization is more than just the drawing of the picture; the analytics underneath the picture is equally important.

In figure 7, we can see the result of processing more than 45 billion events. We can see that the most important events happened in February and March. Visualization of a large amount of data must tell us a story. By using machine learning and visualization tools, we see the end of a pipeline of analytics using computed risk scores to generate this picture from the raw data. As we learned, the math of machine learning is behind this picture that shows risk over time. The “matrix” visualization at the top represents 45 billion events. However, the underlying machine learning analysis has processed the events into 7,535 “stories,” each with varying levels of risk, which appear in the visualization as areas occupied by squares. Notice how quickly you see that two of the highest risk time periods occurred in mid-to-late February. Additional interactivity allows the user to zoom in and focus on that specific time region for more detail.

Figure 7 – A big data interactive visualization from Interset
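To illustrate how many scored events can be compressed into a handful of per-entity “stories,” here is a sketch that combines per-event risk probabilities with a noisy-OR rule. The combination rule, the entity names, and the sample scores are illustrative assumptions, not Interset’s actual math.

```python
# Corroboration sketch: combining many per-event risk scores into one
# per-entity score, instead of counting alerts. Noisy-OR is one common
# choice of combination rule; a real product may use something else.

from collections import defaultdict

scored_events = [  # (entity, probability that the event is truly risky)
    ("workstation-17", 0.30), ("workstation-17", 0.40), ("workstation-17", 0.55),
    ("server-02", 0.05), ("server-02", 0.08),
]

def entity_risk(events):
    """Noisy-OR per entity: probability that at least one event is risky."""
    benign = defaultdict(lambda: 1.0)
    for entity, p in events:
        benign[entity] *= (1.0 - p)   # chance that every event so far is benign
    return {e: round(1.0 - b, 3) for e, b in benign.items()}

stories = entity_risk(scored_events)
print(sorted(stories.items(), key=lambda kv: -kv[1]))
```

Three individually unremarkable events on the same workstation corroborate each other into a high combined score, while two genuinely low-risk events on the server stay low; aggregation, unlike counting, preserves degree of badness.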
Here, every visualization supports large amounts of data, with machine learning and the analytics working behind the scenes to surface and compress billions of events into dozens of stories we can understand. Further, these visualizations can be interactive, provided you have the right technology to support that interactivity with filtering done using, for example, fast search.

Taming big data

Just as we need big data tools to search and visualize, we need tools to detect and predict that are suited to the data lake realm. It’s still important to allow humans to inject business context and priorities, as well as human intuition, into the process. But clearly, standard rules engines may struggle to keep up with the volumes and velocities of the data lake. They are simply not going to scale to the volume and velocity of a big data engine. Fortunately, just as with search and visualization, there are technologies to support rules engines at scale. Kafka, Spark, and Storm are good examples of technologies that understand how to move data at scale, process patterns at scale, and trigger rules.

We also use different math because small-data math does not apply to big datasets. To illustrate, remember how in high school statistics we would always have to make sure our sample size was large enough to be statistically significant? A typical rule was to make sure you had at least a sample size of 20! Back then, it was hard to get data, but that is no longer true. Standard frequentist methods are sometimes not appropriate for large datasets, where a Bayesian approach may be better at dealing with large, messy data.

We also had to invent ways of compressing large amounts of data into small, actionable results that we could visualize, investigate, and plug into workflow. This is best done using math and statistics, and not counting, because as covered earlier, simply adding up scores tells us little that is meaningful. We must compute and compare using principled math and statistics. These are essential technology tools for the data lake. But what about our human experts? Where do we fit in?

Humans and machines: Better together

With big data and data lakes, machine learning can be far more automated than ever before and as unsupervised as we allow, while still accepting feedback, as in a semi-supervised system. Because data is simply becoming bigger, it is safe to argue that the data lake is inevitable. With machine learning to help us automate and learn—and with the right technologies to help us search, visualize, and detect threats—our human experts take on a new, more expert and guiding role.

Here is how I think the security professional is evolving. Advanced chess,9 sometimes called Centaur chess, is a form of chess where the players are actually teams of humans with computer programs. The human players are fully in control but use chess programs to analyze and explore possible moves. It turns out that the combination of humans and computers together produces stronger chess play than either humans alone or computers alone.

9 Centaur Chess – https://en.wikipedia.org/wiki/Advanced_Chess.

Why is the combination of humans with computers so powerful for playing chess? It turns out that computers are generally better at calculating lots of moves, at being consistently tactical, and at not making mistakes. Humans, however, tend to have a better holistic feel for the game. They see broad themes and are better able to identify an edge, excelling in strategic play.

What is perhaps best, of course, is humans and computers working together. Why spend time looking at log files and billions of events when computers are so good at these tasks? Why look to an algorithm for a strategy on use cases? A skilled cyber hunter fed with amazing data sources and machine learning will save time, because the math never gets tired and rarely, if ever, makes a mistake. This leaves our experts far more free to focus on edge cases and provide feedback and guidance back to the system on new models and features. Better together, the human expert with proper machine learning tools is the winning combination that makes the future of security analytics so optimistic, compelling, and powerful.

References
—Adriaans, P. and Zantinge, D., 1996. Data Mining, England: Addison-Wesley Longman.
—Conway, D. and White, J. M. 2012. Machine Learning for Hackers, O’Reilly.
—Guyon, I.; Gunn, S.; Nikravesh, M. and Zadeh, L. A. 2006. Feature Extraction: Foundations and Applications, Netherlands: Springer.
—Marz, N. and Warren, J. 2015. Big Data: Principles and Best Practices of Scalable Real-Time Data Systems, NY: Manning Publications.
—Murphy, K. P. 2012. Machine Learning: A Probabilistic Perspective, Cambridge, Massachusetts: MIT Press.
—O’Neil, C. and Schutt, R. 2013. Doing Data Science: Straight Talk from the Frontline, O’Reilly.
—Tan, P.-N.; Kumar, V. and Steinbach, M. 2006. Introduction to Data Mining, Boston: Addison-Wesley Longman.
—Tufte, E. R. 1983. The Visual Display of Quantitative Information, Connecticut: Graphics Press.
—Zumel, N. and Mount, J. 2014. Practical Data Science with R, NY: Manning Publications.

About the Author
Stephan Jou is CTO at Interset. He was previously with IBM and Cognos and holds an M.Sc. in Computational Neuroscience and Biomedical Engineering and a dual B.Sc. in Computer Science and Human Physiology from the University of Toronto. He may be reached at sjou@interset.com.
Enterprise Security Architecture: Key for Aligning Security Goals with Business Goals
By Seetharaman Jeganathan

In this article, the author shares his insights about why security architecture is critical for organizations and how it can be developed using a practical framework-based approach.

Abstract
Enterprise security architecture is an essential process that aims to integrate security as a part of business and technology initiatives handled by any organization. When the security goals and objectives are aligned with organizational business goals and objectives, any organization can make informed decisions about business ventures and protect organizational assets from ever-emerging security threats and risks. In this article, the author shares his insights about why security architecture is critical for organizations and how it can be developed using a practical framework-based approach.

Introduction
Enterprise security architecture (ESA) is a design process where the current state of enterprise security is analyzed, gaps are identified based on effective risk management processes, and the identified gaps are filled by applying cost-effective security controls. It is a life-cycle process that enables any organization to protect itself from advanced security threats. Until recently, ESA was a major technology effort wherein the IT technical team owned the definition, implementation, and operation of security processes and controls. However, this model has created a vacuum with respect to business involvement and has failed to align the IT security functions with the organizational goals and objectives [11].

Security goals and objectives
Traditionally, information security functions have been providing confidentiality, integrity, availability, and accountability services to information systems and infrastructure. These services are often referred to as primary goals for information security functions. The primary objective is to secure the overall IT system and business functions as well as support growth of the underlying business. ESA is a key enabling factor to ensure that the security goals and objectives are achieved as per the expectations of senior management [11].

Why security architecture?
• Security architecture is key to aligning security functions with the organization’s business functions
• Without a clearly defined architecture, security solutions cannot be balanced between overprotection and underprotection
• Security architecture functions enable accountability and help obtain support and commitment from senior management
• Security architecture functions support IT functions during changes in business processes
• Security architecture provides a snapshot of an organization's security posture at any point in time [9]

Enterprise security architecture framework
Figure 1 shows the proposed enterprise security architecture framework discussed throughout this paper. The framework begins with defining the security strategy based on the risk profile of the organization. An organization's security requirements are derived mainly from the security threats and risks the organization faces [4]. These requirements are analyzed in the framework to clearly define a security strategy for the organization. The framework leverages three major factors (people, processes, and technology) to implement the defined strategy across the organization. It is supported by other essential elements such as organizational governance, risk management, and IT governance bodies to effectively achieve total security of the organization. The author has referenced "The Business Model for Information Security" (BMIS) and designed this article with an exclusive focus on the security architecture function. The BMIS model was originally created by Dr. Laree Kiely and Terry Benzel at the USC Marshall School of Business Institute for Critical Information Infrastructure Protection. ISACA adopted this model in 2008 and has been promoting its concepts globally.

[Figure 1 – Enterprise security architecture framework: a security strategy implemented through people, process, and technology, framed by organizational governance (executives, board of directors, stakeholders), enterprise risk management (chief risk officer, risk management group), enterprise IT/security governance (CIO, CISO, CSO, etc.), and enterprise architecture (enterprise architects); it covers company assets (information security, corporate security, physical security) and organizational entities (IT functions, business units, business partners, customers)]

Even though the proposed security architecture framework is part of the enterprise architecture, it can also be rolled out separately as a new initiative for organizations that are not yet mature with respect to enterprise architecture. In the sections below, the author shares his practical experiences implementing the proposed framework with several of his industry customers. The primary goal of the framework is to provide an organization-wide security architecture review process that ensures security is an integral part of all business-critical systems and processes [2][7].

Note: Since this article focuses on security architecture in general rather than information security architecture specifically, it is appropriate to include corporate security, personnel security, and physical security aspects in this exercise.

People factor
This area focuses on the several actors (people) who must operate together to effectively roll out the proposed framework. The enterprise security architecture group (ESAG) or enterprise security review board (ESRB) is a governance body that must be formed as an initial step if it does not already exist. The effectiveness of the framework will depend on the involvement and participation of the identified team members. They must fulfill their required roles and responsibilities as effectively as possible. Human resources being expensive assets for organizations, it is indispensable to get adequate support and commitment from senior management to effectively utilize human resources to protect the interests of the stakeholders. Senior management support can be obtained by developing a charter for the proposed ESA group that identifies the key roles and responsibilities of the group members. It is important to map the goals and objectives of this group to the overall organizational business goals and objectives and to portray how this group will enable or support the growth of the underlying business functions [1].

Figure 2 depicts the proposed people-factor top-down approach model for forming the ESA group.

[Figure 2 – People factor (top-down approach) model: senior management (board members, stakeholders) above an enterprise security governance board (chief risk officer, chief security officer, corporate security head, chief information security officer, BU heads) above the enterprise security architecture group (security architects, information risk managers, information security managers, corporate security group members)]
The ESA group must consist of people representing all business units of the organization, such as HR, finance, R&D, IT, products, manufacturing, etc. It is important to note that the focus of this group is not only securing the information systems but also securing the organization with a holistic approach. Business insights and guidance are essential to derive a holistic, organization-wide security approach. A top-down approach will provide the necessary commitment and oversight from senior management; also, when there is a disagreement between business groups, senior management can liaise and resolve critical issues. It is extremely important for this group to cascade architectural functions and decisions to the entire organization below and/or above them. The head of this group or its representatives must conduct regular "connect meetings" with the business units to provide security architecture oversight and guidance for all their technology and business initiatives [1].

One of the primary expectations and outcomes of this working group should be developing security policies and standards for all organizational functions where security is a key requirement. Security policies are directions from senior management to the organization on what is and is not allowed from a security standpoint. Security standards are guidelines developed to substantiate and support each policy and set directions for business units on how to adhere to the required policies [8].

Note: The author is highly inspired by the series of books Information Security Policies Made Simple by Charles Cresson Wood and recommends them as reference material for creating relevant security policies in any organization. However, the samples provided in the books should be used as inspiration and must not be adopted directly without careful review.
The teams defining the policies must also take into consideration industry regulations, country-specific laws, and compliance requirements before defining the policies.

Process factor
This area focuses on how the security architecture review process should work in practice at any given organization. The need for an organization-wide risk management process is greater than ever because information systems and technology are widely used for business functions across the world. Information systems are subject to serious security threats. Threat agents exploit known and unknown vulnerabilities and cause damage to information systems, impacting the confidentiality, integrity, availability, and accountability goals of security functions. Security breaches can even cause permanent damage to organizations and put them out of business. Recent laws and compliance requirements make senior management personally accountable for any negligence in securing their customers' personally identifiable information (PII), financial data, and, in the healthcare industry, personal health information (PHI). Therefore, it is of utmost importance that senior management and mid- and lower-level employees of an organization understand their roles and responsibilities in protecting the organization's resources effectively from security risks [1].

Enterprise risk management is focused on managing the risks faced by the organization. Security risks are one among several such risks, but they are often more severe than the others. Organizations generally follow widely known risk management frameworks (NIST, ISACA, etc.) or custom frameworks specific to the organization based on its culture, laws, and compliance requirements.
The author discusses and illustrates this article based on the NIST (SP 800-39) risk management process, which suggests that risk management be carried out as a holistic, organization-wide activity that addresses risk from the strategic level to the tactical level. This enables organizations to make informed decisions about their security activities based on the outcome of the risk management process already in place [10]. Figure 3 depicts the NIST risk management process and the multitiered, organization-wide risk management approach.

[Figure 3 – NIST risk management process: a frame–assess–respond–monitor cycle applied as multitiered, organization-wide risk management, from strategic risk to tactical risk across Tier 1 (organization), Tier 2 (mission/business processes), and Tier 3 (information systems)]

Note: As the scope of this paper is not to detail the NIST risk management process, readers are encouraged to read the NIST SP 800-39 document to understand the risk management framework.

An important point in SP 800-39 is that information security architecture is an integral part of an organization's enterprise architecture. However, the author suggests from his experience that organizations that do not yet have a mature enterprise architecture should still roll out security architecture processes in their IT program initiatives. The primary purpose of the security architecture review process is to ensure that specific security requirements are reviewed and cost-effective security solutions (management, operational, and technical) are suggested and designed for qualified risks that must be mitigated as per the risk management strategy. Organizational security requirements can also arise from other factors such as policies, standards, laws, and compliance regulations, among others. These requirements must also flow into the security architecture review process, as depicted in figure 4, and be addressed properly [10].
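The tiered frame–assess–respond–monitor idea above can be sketched as a tiny data model. This is an illustrative sketch only, not part of NIST SP 800-39: the tier names follow the standard, but the `Risk` fields, the 1–5 scales, and the qualification threshold are assumptions made for the example.

```python
from dataclasses import dataclass

# NIST SP 800-39 tiers, from strategic (Tier 1) to tactical (Tier 3).
TIERS = {
    1: "Organization",
    2: "Mission / Business Processes",
    3: "Information Systems",
}

@dataclass
class Risk:
    name: str
    tier: int        # 1, 2, or 3
    likelihood: int  # 1 (rare) .. 5 (frequent) -- illustrative scale
    impact: int      # 1 (minor) .. 5 (severe)  -- illustrative scale

    def score(self) -> int:
        return self.likelihood * self.impact

def assess(risks, threshold=9):
    """Frame/assess step: qualify risks whose score meets or exceeds
    the organization's risk appetite (the threshold is an assumption)."""
    return [r for r in risks if r.score() >= threshold]

risks = [
    Risk("Vendor contract exposure", tier=1, likelihood=2, impact=3),
    Risk("Payment process fraud", tier=2, likelihood=3, impact=4),
    Risk("Unpatched web server", tier=3, likelihood=4, impact=4),
]

qualified = assess(risks)
for r in qualified:
    # Respond step: qualified risks feed the architecture review process.
    print(f"Tier {r.tier} ({TIERS[r.tier]}): {r.name}, score {r.score()}")
```

In practice the qualified risks at each tier become inputs to the security architecture review described next; the scoring scheme itself would come from the organization's own risk management framework.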
[Figure 4 – Enterprise security architecture process drivers: security requirements from the risk management framework and from other drivers feed into the enterprise security architecture review process]

Enterprise security architecture review process
An enterprise security architecture review process is primarily conducted to derive the most appropriate security solution(s) for the qualified requirements. Enterprise security architecture review board members should meet, review the security requirements, and brainstorm possible cost-effective solutions, which could be management, operational, or technical controls, or a combination of these, to mitigate risks to an acceptable level. These steps are referred to as the security requirements review and the security controls review. Once a cost-effective security control is identified, a security design review is conducted for functional and non-functional requirements, depending on the type of security control(s) identified [8]. A high-level summary of the security design review process for each control type is given in table 1; this is not an exhaustive list of design review steps but a set of suggestions to kick-start the process [8].

Table 1 – Security controls design review (high-level summary)

Management controls
• Conduct a current-state assessment and identify gaps
• Review the associated security policies and standards
• If required, create new security policies/standards or modify existing policies to meet the security requirements
• Document and submit the recommendations

Operational controls
• Conduct a current-state assessment and identify gaps
• Suggest a new operational model/process, or amendments to the existing one, to meet the security requirements
• Document and submit the recommendations

Technical controls
• Conduct a current-state assessment and identify gaps
• Suggest new controls or changes to existing technical controls to meet the security requirements
• Conduct cost-benefit and security return on investment (SROI) analyses to prove the cost effectiveness of the solutions
• Provide functional design inputs
• Provide non-functional design inputs
• Document and submit the recommendations

After security design recommendations are submitted to the implementation team, it is vital for security architects to provide the required support throughout the implementation process and ensure that the proposed design recommendations are implemented as suggested in the design review. This process is referred to as the security implementation review. After a security control is implemented, quality assurance testing must be done thoroughly to ensure that the control mitigates the underlying risk(s) as per the requirements. If any gap in risk mitigation remains, the whole process must be repeated until the underlying risk is mitigated to an acceptable level. In this way, the enterprise security review process can also be aligned with the SDLC process the organization follows for information systems project implementation [8]. Figure 5 describes the proposed review process flow from beginning to end.

[Figure 5 – Enterprise security architecture process flow: security requirements review → security controls review → decision (management/operational/technical) → security design review → security implementation support → security post-implementation review → risk mitigated? If no, repeat; if yes, stop]

Technology factor
This area focuses on the technology aspects of the information systems layer supporting the business functions.
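The loop in the review process flow of figure 5 can be sketched in Python. This is purely an illustrative sketch of the iterate-until-mitigated flow; the function names, the numeric residual-risk score, and the acceptance threshold are assumptions for the example, not part of any published framework.

```python
# Illustrative sketch of the enterprise security architecture review loop
# (figure 5). The stages are hypothetical stand-ins: each full pass here
# simply reduces a numeric residual-risk score.

ACCEPTABLE_RISK = 3  # assumed organizational risk-acceptance threshold

def design_and_implement(residual_risk: int) -> int:
    """One pass: requirements review, controls review, design review,
    implementation support, and post-implementation review."""
    # Each pass is assumed to mitigate part of the residual risk.
    return max(residual_risk - 4, 0)

def architecture_review(initial_risk: int) -> tuple:
    """Repeat the review cycle until residual risk is acceptable.
    Returns (final residual risk, number of passes)."""
    risk, passes = initial_risk, 0
    while risk > ACCEPTABLE_RISK:      # the "risk mitigated?" decision point
        risk = design_and_implement(risk)
        passes += 1
    return risk, passes

final_risk, cycles = architecture_review(10)
print(f"Residual risk {final_risk} after {cycles} review cycle(s)")
```

The key design point mirrored from figure 5 is the feedback edge: the post-implementation review is not the end of the process but the condition that decides whether another pass is needed.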
The process begins with creating blueprint(s) of the current state ("as-is") of the information systems layer (logical and technical) in the primary and secondary data center(s) of the organization. Figure 6 provides a high-level logical view of the primary data center.

[Figure 6 – Organization data center – logical view]

After a high-level blueprint is created, the team has to move on to creating more focused blueprints about, but not limited to, the following areas:
1. Detailed blueprints of the network segments, such as the DMZ layer, internal network layers (intranet), and external networks such as partner networks, the Internet, VPN networks, etc.
2. Detailed blueprints of the infrastructure layer, such as servers, databases, mail servers, file servers, etc.
3. Application-specific architecture diagrams, requested from the individual application teams (mission-critical and all others) if not already available [4]

Figure 7 depicts a high-level logical and physical deployment sample architecture blueprint of an e-commerce application in a financial services company.

[Figure 7 – Sample e-commerce application – logical/deployment architecture]

After completing the above-mentioned exercise, the architecture team would have the relevant information to review and create a detailed inventory of the current state of information security controls in place at the physical, network, infrastructure, and application levels [3].

There are several possible approaches to assessing the current-state security controls of the physical, network, infrastructure, and application layer components; one such approach is creating a 3x3 matrix of current security controls classified into preventive, detective, and corrective types (table 2) [8].

Table 2 – Sample controls review matrix (e-commerce application)
• Administrative – preventive: acceptable usage policy, change management policy; corrective: effective change management process
• Technical – preventive: data encryption, SSL/TLS transport encryption, secure configuration, access control; detective: vulnerability scanning, secure coding/review; corrective: effective incident response
• Physical – preventive/detective/corrective: application-specific control(s), if any

After completing the above exercise at all levels, the current security posture ("as-is") of the organization will have been captured. The next critical steps are:
a) Conduct a gap analysis of current-state security
b) Conduct a risk analysis of critical information systems

An effective risk management process is key here to finding the gaps in the current state and the residual risk in business-critical information systems. The architecture team must review the gap analysis and risk analysis reports to derive the requirements for the desired ("to-be") state architecture that adequately protects the identified assets. The requirements must be presented to senior management, the governance board, etc., for approval, funding, and adequate support for implementation. It is extremely important to map the goals of the security architecture with business goals and objectives to garner support from management and the board of directors [5].
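The 3x3 current-controls matrix can be represented as a small data structure with a gap check feeding the gap-analysis step. This is an illustrative sketch only: the entries loosely follow the sample controls of table 2, while the dictionary layout and the `find_gaps` helper are assumptions made for the example, not a formal method.

```python
# Illustrative 3x3 controls review matrix: control classes (rows) by
# control types (columns), each cell listing the current controls.
CLASSES = ["administrative", "technical", "physical"]
TYPES = ["preventive", "detective", "corrective"]

matrix = {
    ("administrative", "preventive"): ["acceptable usage policy",
                                       "change management policy"],
    ("administrative", "corrective"): ["effective change management process"],
    ("technical", "preventive"): ["data encryption", "SSL/TLS",
                                  "secure configuration", "access control"],
    ("technical", "detective"): ["vulnerability scanning",
                                 "secure coding/review"],
    ("technical", "corrective"): ["effective incident response"],
}

def find_gaps(matrix):
    """Cells with no current controls are candidate gaps for the
    gap-analysis step (a hypothetical helper for illustration)."""
    return [(c, t) for c in CLASSES for t in TYPES if not matrix.get((c, t))]

for cell in find_gaps(matrix):
    print("Gap:", cell)
```

An empty cell is only a candidate gap, not automatically a finding: whether it matters depends on the risk analysis, since some cells (for example, physical controls for a hosted application) may legitimately stay empty.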
As the technology environment is ever changing due to several factors, such as changes in business processes, the environment, technology adoption, and automation, the security architecture group must always be live and vigilant in any organization in order to adapt to the changes and support the business to function without disruption. It is essential to remember that IT and security functions must be business enablers rather than creating roadblocks for business functions.

Note: Organizations that are mature in information security processes have adopted one or more of the industry frameworks listed below to implement the minimum required security controls and the desired security architecture for effective information security:
• ISO 27001:2013 information security standard
• NIST Cybersecurity Framework
• ISACA COBIT 5 for Information Security
• SANS Critical Security Controls, etc. [5]

Incorporating security architecture in enterprise architecture
MIT Sloan School's Center for Information Systems Research (CISR) defines enterprise architecture as "the organizing logic for business process and IT infrastructure reflecting the integration and standardization requirements of the firm's operating model. The enterprise architecture provides a long-term view of a company's processes, systems, and technologies so that individual projects can build capabilities—not just fulfill the immediate needs" [12].

In simple terms, enterprise architecture provides a logical view of the enterprise business layer (functions and processes), the IT infrastructure layer, the data layer, and the applications layer. As enterprise architecture is not the focus of this article, the author would only emphasize that security architecture can be part of the enterprise architecture and provide adequate security for all the architecture functions. Figure 8 depicts a security-enabled enterprise architecture view of an organization [13].

[Figure 8 – Security-enabled enterprise architecture view: business architecture, information architecture, and technology architecture, each wrapped in security]

Security architecture frameworks
There are several industry frameworks that provide architectural approaches for meeting security requirements. Some of note are:
• The SABSA comprehensive framework for enterprise security architecture and management
• The Zachman framework
• The Open Group architecture framework
• The Open Security Architecture group framework
• The UK Ministry of Defence architecture framework
• The US Department of Defense architecture framework, and more

Organizations can leverage these frameworks and customize them to meet security architecture requirements. As the scope of this paper is not to review these frameworks in detail, the author notes only that organizations must choose an architecture framework that is flexible enough to support business growth and adapt to changes in the business environment and processes [5].

Conclusion
The enterprise security architecture process streamlines the security functions of an organization to achieve effective total security. In this article, the author proposed an enterprise security architecture framework and detailed its pillars (people, process, and technology). Whether to adopt this framework or an industry-recognized framework is a decision to be made by the organization, depending on its culture, business environment, and industry compliance requirements. Any framework will be effective only when adequate support is given: to obtain the desired outcomes of an architecture framework, it must be understood and supported by senior management. In today's economy-centric business environments, a security architecture framework must be increasingly business oriented rather than technology centric to obtain the expected returns on security investments.

References
1. Architecture Compliance. (n.d.), The Open Group. Retrieved November 9, 2016, from http://pubs.opengroup.org/architecture/togaf9-doc/arch/chap48.html.
2. Breithaupt, J., & Merkow, M. S. Principle 11: People, Process, and Technology Are All Needed to Adequately Secure a System or Facility, Information Security Principles of Success, Pearson IT Certification (2014, July 4). Retrieved from http://www.pearsonitcertification.com/articles/article.aspx?p=2218577&seqNum=12.
3. Building an e-commerce Solution Architecture, Damicon. Retrieved November 9, 2016, from http://www.damicon.com/resources/Architecture_Practices.pdf.
4. Gunnar Peterson, Security Architecture Blueprint, Arctec Group LLC, 2007.
5. ISACA. CISM Review Manual 2015, ISACA (November 1, 2014).
6. ISACA. An Introduction to the Business Model for Information Security. Rolling Meadows, IL: ISACA, 2009 – http://www.isaca.org/knowledge-center/research/researchdeliverables/pages/an-introduction-to-the-business-model-for-information-security.aspx.
7. Kreizman, G. (2011, October 4). An Introduction to Information Security Architecture. Retrieved from http://gartnerinfo.com/futureofit2011/MEX38L_D2 mex38l_d2.pdf.
8. Landoll, D. J. (2016). Information Security Policies, Procedures, and Standards: A Practitioner's Reference. Boca Raton: CRC Press, Taylor & Francis Group.
9. Olzak, T. (2010, September). Build an Enterprise Security Architecture-Based Framework. Retrieved January 16, 2011, from CBS Interactive/TechRepublic: http://www.techrepublic.com/downloads/build-an-enterprise-architecture-base-framework/2212421.
10. Publication: SP 800-39. Managing Information Security Risk: Organization, Mission, and Information System View, National Institute of Standards & Technology – http://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-39.pdf.
11. Ritchot, B. 2013. An Enterprise Security Program and Architecture to Support Business Drivers. Technology Innovation Management Review, 3(8): 25-33. http://timreview.ca/article/713.
12. Ross, J. W., Weill, P., & Robertson, D. Enterprise Architecture as Strategy: Creating a Foundation for Business Execution. Boston, MA: Harvard Business School Press (2006).
13. Security Architecture: A New Hype for Specialists, or a Useful Means of Communication? Retrieved November 9, 2016, from https://www.pvib.nl/download/?id=11542823.

About the Author
Seetharaman Jeganathan, CISSP, has more than 14 years of experience in IT technology security consulting and program management. He mainly focuses on information systems risk assessments, identity and access management (IAM) solution strategy definition, architecture definition, and the design and implementation of IAM security solutions. He also specializes in cloud-based application security consulting and the implementation of IAM solutions in the cloud. He may be reached at seetharaman.jeganathan@gmail.com.
The Role of the Adjunct in Educating the Security Practitioner
By Karen Quagliata – ISSA member, St. Louis Chapter

Abstract
The cybersecurity industry faces a shortage of qualified professionals. Part of the solution is to better deliver cybersecurity education in colleges and universities. While others have addressed the issue by proposing formal university curricula, this article approaches the subject from the perspective of professional security personnel teaching as adjunct instructors. The purpose of this article is to equip cybersecurity professionals working as adjunct instructors with resources to deliver a more efficient and effective class.

A key battle in the cybersecurity war today is not being fought with firewalls and encryption. It is not being fought between cybersecurity professionals and cybercriminals. And it is not being fought in corporate networks and the Internet. No, this struggle is between academia and the information security profession. It is being fought in the classroom.

Challenges
The cybersecurity industry continues to fall short of qualified professionals. Prominent information security organizations have stated that the shortage is real. For example, (ISC)2 predicts that the cyber world will be short 1.5 million cybersecurity professionals by 2020.[1] Furthermore, Enterprise Strategy Group (ESG)[2] research shows that 28 percent of organizations have a "problematic shortage" of IT security skills.[3] Meanwhile, as a society we are living in a world of the Internet of Things and ransomware. The threats are numerous, the criminals are resilient, and the rewards are rich.

The US Bureau of Labor Statistics projects great growth in the computer and mathematical occupations, which include cybersecurity.[4] In fact, it projects that this group of industries will produce more than 1.3 million job openings within the next six years. This equates to an 18 percent increase, which the bureau labels as faster than average. For information security analysts, the outlook is even better: this occupation is expected to grow at a rate of 36.5 percent. The bureau attributes this fast growth to the increased use of electronic medical records and mobile technology.

However, some of the greatest growth areas come with higher educational requirements, as illustrated in table 1. The bureau reports that the fastest-growing occupations within the business and financial operations sectors will require at least a bachelor's degree. For the computer and mathematical occupations the educational requirements are even higher: as seen in table 2, most new jobs in this group will require at least a bachelor's degree, and master's-level roles show the fastest percentage growth.

Solutions
The shortage of qualified cybersecurity professionals is a complex problem, which means there is no easy answer. One of the issues that must be addressed is properly educating a cybersecurity workforce. In their 2014 article, "Application of Pedagogical Fundamentals for the Holistic Development of

[1] Jon Oltsik, "Creating a Cybersecurity Center of Excellence," Network World (2015) – http://www.networkworld.com/article/3016593/security/creating-a-cybersecurity-center-of-excellence.html.
[2] Enterprise Strategy Group – http://www.esg-global.com/.
[3] Jon Oltsik, "Creating a Cybersecurity Center of Excellence," Network World (2015) – http://www.networkworld.com/article/3016593/security/creating-a-cybersecurity-center-of-excellence.html.
[4] Bureau of Labor Statistics – http://www.bls.gov/opub/mlr/2013/article/occupational-employment-projections-to-2022.htm.
Cybersecurity Professionals," authors Barbara Endicott-Popovsky and Viatcheslav Popovsky provide a wealth of information for universities looking to develop a formal curriculum for information assurance education.5 Figure 1 illustrates the Kuzmina-Bespalko-Popovsky (KBP) Pedagogical Model developed at the Center for Information Assurance and Cybersecurity (CIAC) at the University of Washington.6 The model takes into consideration such influences on education as the current job market as well as technical, economic, cultural, and political trends.

[Figure 1 – The KBP Pedagogical Model: CIAC as a pedagogical system]

As seen in figure 2, the KBP model is a multi-discipline approach to information assurance. Multiple schools within the university play a role in educating the security professional. They include the business school, the IT school, and the law school.7

[Figure 2 – Multi-disciplinary approach to information assurance]

Education transformation

Formal approaches to education delivery such as the KBP model are vital to the cybersecurity workforce. However, they rely on a more traditional education delivery model. What is becoming more evident, though, is that higher education is undergoing a transformation: a move from a staff of tenured professors with degrees in education to the use of adjunct instructors who are often professionals working in the industry. For a multitude of reasons, colleges and universities are now relying more heavily on adjunct instructors. In fact, in 2014 adjuncts made up more than 70 percent of all college and university faculty.8 Adjunct instructors teach classes on a contract basis. Often a master's degree in the subject that they will teach and professional experience are the main requirements. They can be trained teachers, but many are individuals who work in a given profession full time and teach part time. It is the latter category where the cybersecurity world must focus.

Table 1 – Business and financial operations occupations employment by educational requirement, 2012 and projected 2022 (employment in thousands)

Education level                      2012      2022      Change (number)  Change (percent)
Bachelor's degree                    5,344.3   6,131.4   787.1            14.7
High school diploma or equivalent    1,809.7   1,921.4   111.7            6.2
Postsecondary nondegree award        13.5      12.8      –0.7             –5.3
Source: US Bureau of Labor Statistics

Table 2 – Computer and mathematical occupations employment by educational requirement, 2012 and projected 2022 (employment in thousands)

Education level                      2012      2022      Change (number)  Change (percent)
Bachelor's degree                    2,893.1   3,415.2   522.1            18.0
Some college, no degree              547.7     658.5     110.8            20.2
Associate's degree                   316.1     356.6     40.6             12.8
Master's degree                      31.1      39.2      8.2              26.3
Doctoral or professional degree      26.7      30.8      4.1              15.3
Source: US Bureau of Labor Statistics

5 Barbara Endicott-Popovsky and Viatcheslav Popovsky, "Application of Pedagogical Fundamentals for the Holistic Development of Cybersecurity Professionals," ACM Inroads (March 2014) – https://niccs.us-cert.gov/sites/default/files/documents/files/p57-endicott-popovsky.pdf?trackDocs=p57-endicott-popovsky.pdf.
6 Ibid.
7 Ibid.
8 J. Fruscione, "When a College Contracts 'Adjunctivitis,' It's the Students Who Lose," PBS Newshour – http://www.pbs.org/newshour/making-sense/when-a-college-contracts-adjunctivitis-its-the-students-who-lose/.

It is the responsibility of cybersecurity professionals to teach cybersecurity courses in an effective and efficient manner. Part of the problem, though, is that a cybersecurity adjunct is often handed only a course name, a brief course description, and a textbook name along with his or her contract. In some cases another teacher who previously taught the class will share
his or her syllabus and notes with the cybersecurity adjunct. Predictably, the results can be varying degrees of class quality.

Recommendations

Cybersecurity professionals teaching as adjuncts need a strong support system of resources in the form of professional organizations, industry research and publications, and assistance from experienced cybersecurity adjuncts. Figure 3 shows the suggested model needed to produce an effective adjunct-led cybersecurity classroom.

[Figure 3 – Adjunct-led cybersecurity classroom model]

The following are some recommendations for cybersecurity professionals who are entering the world of adjunct teaching.

Topics

Be sure to cover the key topics and pain points of the security industry today. Sure, the (ISC)2 Common Body of Knowledge (CBK)9 is a great foundation on which to build the class, but do not stop there. Go beyond the CBK basics: address the reality of implementing security in a corporate setting, the struggle of trying to implement a security culture when senior leadership is lukewarm (at best) to the idea, and funding and threat landscapes specific to industries.

• Be sure to address sometimes forgotten risk areas such as third-party vendor management. Data breaches that can be attributed to third-party vendors are still occurring. For example, a 2015 study by the Ponemon Institute shows that 39 percent of respondents attributed their healthcare organization's data breach to a third-party issue.10

• Adjunct instructors should certainly discuss recent data breaches in their classes, but they should do so in an analytical manner. For example, compare three major data breaches and identify any similar aspects (compromised credentials, third-party backdoors, unpatched systems, etc.).

• Adjunct instructors should use industry resources to address key aspects of cybersecurity, such as the Verizon

9 (ISC)2 Common Body of Knowledge – https://www.isc2.org/cbk/default.aspx.
Data Breach Investigations Report11 and Ponemon Institute's Cost of a Data Breach Report.12 These two free resources will not only help the adjunct address the causes of breaches, but also how much they will cost an organization.

• Adjunct instructors owe it to their students to address certifications. Students of higher education are painfully aware of the cost of a degree. Some may question whether it is better to forgo the degree and just pursue the certification route. Share a list of some of the more in-demand security certifications and what is required to obtain them. Go the extra mile and study the want ads to see what certifications employers want their candidates to have.

Tests

Strongly consider using standardized tests for introductory security classes only. It can be assumed that at higher course levels the students already have a firm grasp of basic security topics, such as BIOS, authentication methods, the basics of encryption, etc. At the higher course levels adjunct instructors should be looking for analytical skills in their students. Writing in a clear, concise, analytical manner is vital to any industry, including information security.

Assignments

Adjuncts should make their assignments as realistic as possible. The goal of education is to expand the mind, but it should also prepare students for the real world. A good way to make realistic assignments is to draw upon actual work experiences or use prewritten case studies. One approach is to propose a business problem that must be solved by IT. Ask the students to research and propose a solution, complete with a cost summary and risk assessment. Adjuncts can expand upon that assignment by instructing the students to present the solution both to a technical audience and a business audience.

10 Ponemon Institute, Fifth Annual Benchmark Study on Privacy & Security of Healthcare Data Report – https://iapp.org/media/pdf/resource_center/Ponemon_Privacy_Security_Healthcare_Data.pdf.
Another possible assignment is to have students choose a topic in security news each week and write a summary. While not an overly complex assignment, it forces students to get into the habit of staying abreast of current issues and topics in the industry. It also helps students hone their writing skills.

Textbooks

If the university does not already have a textbook in mind for the class, then the adjunct should consider using certification study guides. These resources lend themselves to a good foundation because of the breadth and depth of the material they cover.

Professional organizations

The classroom is also a good platform for encouraging students to consider how professional organizations, such as ISSA, can help them at any stage of their careers. The adjunct

11 Verizon Data Breach Report – http://news.verizonenterprise.com/2016/04/2016-data-breach-report-info/.
12 Ponemon Institute's Cost of a Data Breach Report – http://www-03.ibm.com/security/data-breach/.
should provide a list of website links to pertinent organizations. The adjunct can also invite presidents of local chapters to speak during a class, or ask permission for students to come to a future event on a trial basis.

Innovation

Adjuncts should try to go beyond the typical lecture. Information security is a vibrant, constantly changing industry. Traditional lectures do not do it justice. While it is not always possible to avoid a lecture while speaking about the vulnerabilities of a kernel, there are other opportunities to educate without producing glazed stares.

• Field trips – Adjuncts should take the education outside of the classroom. Even professionals working in the industry welcome a field trip. Reach out to local law enforcement to see if the class can visit a forensics lab, or ask if a local company will allow students to view its network monitoring system.

• Guest speakers – Guest speakers can also provide an innovative way to impart knowledge. No matter how exciting an adjunct may be as a speaker, students always seem to pay more attention to a guest speaker. Adjuncts can use a guest speaker to reinforce the topic of the day. For example, if the class discussion is authentication, the adjunct can bring in an expert on biometrics. Sources for speakers are abundant. Adjuncts can use professional organizations and/or colleagues as a source for speakers. Local law enforcement can be another source. Finally, with web conferencing there is no geographical boundary.

• Video – Adjuncts should not be afraid to incorporate video into their lectures. YouTube provides a plethora of security-related videos and demonstrations that go beyond the typical lecture. There are also plenty of free webinars available from vendors and professional organizations. Along those lines, the adjunct can have students play free interactive phishing games available online.
Implications for professional security organizations

The changing dynamics of the information security industry and the higher education system are creating a perfect storm for professional security organizations to play a greater role in promoting and supporting the profession. Professional security organizations, such as ISSA, can play a vital role in helping to ensure the level of quality delivered by security professionals teaching in an adjunct capacity. They can be a repository of resources to help the new—or established—adjunct instructor. Some of the possible ways that organizations can help include:

• Classroom material – Professional security organizations can solicit their members to submit case studies, test questions, and project ideas to their local chapters to help establish a repository of classroom material. Participation can be counted toward CPEs.

• Speaker's bureau – Professional security organizations contain a wealth of experience. Organizations can establish a speaker's bureau to help bring some of that experience to the classroom. The local chapter can maintain a repository of speakers based on their areas of expertise and availability. Participation can be counted toward CPEs.

• Mentoring program – Almost certainly there will be members of a professional security organization who are already working as adjunct instructors. Organizations can establish a mentoring program where experienced adjuncts help new adjuncts with lesson plans, tips, and effective teaching strategies. Participation can be counted toward CPEs.

• Textbook recommendations – Professional security organization members can also support the adjunct community by recommending potential textbooks or supplemental reading material for cybersecurity classes.

Conclusion

The growing cybersecurity field is expecting its personnel to have terminal degrees.
At the same time, colleges and universities are relying upon adjunct instructors to help supplement their staffs of tenured professors. The result is that cybersecurity programs at colleges and universities are turning to cybersecurity professionals to fill adjunct instructor roles. Often cybersecurity adjunct instructors are thrown into the classroom with little support. To help ensure a consistent level of quality in cybersecurity classes, cybersecurity adjunct instructors should follow the guidelines outlined in this article. Furthermore, professional security organizations should act as a support system by providing classroom material and experienced professionals.

About the Author

Karen Quagliata, PhD, PMP, CISA, CISSP, is an information security analyst working in risk management and governance. She is also an adjunct instructor for multiple universities and colleges. Karen can be reached at Karen.quagliata@gmail.com.
Fragmentation in Mobile Devices
By Ken Smith

The purpose of this article is to explore the threat to consumers posed by mobile device fragmentation. The author categorizes mobile device fragmentation by operating system, manufacturer, and carrier, exploring the vulnerabilities at each level.

Thirty seconds and a USB cable are all it takes to compromise an iPhone 4. Through the execution of an "SSH Ram Disk" attack, a malicious attacker can achieve root access on a victim device without jailbreaking or causing permanent changes to the hardware or operating system. The vulnerability responsible for this exposure can be found at the hardware level, and as a result of device fragmentation, it is unlikely that Apple will ever be able to directly address the problem for affected consumers.

Mobile device fragmentation is a phenomenon that occurs when groups of mobile users are running various versions of an operating system across a variety of hardware platforms. As mobile devices age, software developers and hardware manufacturers cease to provide updates for older devices, leaving users, oftentimes unknowingly, with a wide range of potential problems ranging from minor bugs to compromise-level security flaws.

Device fragmentation happens naturally over time as new hardware is released that includes additional features or system requirements that cannot be supported by older models. Of the three major kings of the mobile market, Apple iOS devices fare far better than their Android and Windows phone counterparts when it comes to device fragmentation. As of August 2015, iOS 8 had been adopted by 85 percent of Apple device users; Android Lollipop, Google's latest release at the time, had only an 18 percent adoption rate despite there being only a two-month difference in their release times.1

Fragmentation by OS

Apple's success in this area is primarily due to the company's control over both the hardware and software components of the iOS line of devices. With all mobile hardware, there are three primary stakeholders, aside from the end user. The developers create the operating system. Device manufacturers, of course, are responsible for actually building or assembling phones, tablets, and additional hardware. And carriers supply the means through which the devices communicate with the world. Apple comprises two of these three stakeholders, a position the tech giant has been able to leverage to limit carrier incursions into its operating system and hardware.

Device fragmentation among Android and Windows mobile devices, on the other hand, is significantly more complicated (figure 1). Due in large part to the open nature of the Android operating system, there are hundreds of Android-compatible device manufacturers, from the well-known Samsung to smaller companies like Blu Dash2 (figure 2). Each manufacturer is able to make alterations to the operating system itself, which can include useful features such as Samsung's or HTC's fingerprint scanners (which were not natively supported by Android until recently) or proprietary front ends like Samsung's

[Figure 1 – OS and Android versions on the market in 2015 (OpenSignal, 2015)]

1 Whitney, Lance. "iOS 8 Hits 85% Adoption Rate; Android Lollipop Only at 18%." 5 Aug. 2015. Web. 24 Feb. 2016 – http://www.cnet.com/news/ios-8-hits-85-adoption-rate-android-lollipop-only-at-18/.
2 OpenSignal. "Android Fragmentation Visualization." Aug. 2015. Web. 19 Jan. 2016 – http://opensignal.com/reports/2015/08/android-fragmentation/.
TouchWiz. Additionally, carriers such as Verizon, AT&T, and T-Mobile are able to leverage their control of the airwaves to drop proprietary third-party software onto many Android devices before they hit the market. The NFL Mobile application and Verizon's line of in-house applications are examples of such installations. These applications cannot normally be removed by the user without gaining root access to the phone or tablet, which, of course, creates additional security problems and exposures.

Microsoft controls a much smaller portion of the mobile market; according to the International Data Corporation (IDC), in Q2 2015 a mere 2.6 percent of the world's mobile users were on Windows-powered phones.3 Nevertheless, Windows mobile devices, specifically Windows phones, have historically been subjected to significant device fragmentation (though it is not as big a problem as it is with Android), due in large part to a lack of support from carrier brands. When Microsoft's Denim update for Windows Phone was first released in September 2014, it took months for T-Mobile and AT&T to push the update to compatible phones. Verizon was quicker to respond, but only after it had continuously refused to push the previously available update, Cyan, to compatible devices.4 In response to these kinds of problems, in the spring of 2015 Microsoft announced that it would be taking over updates for all Windows Phone 8 devices, assuring customers with compatible devices that they would receive an upgrade to Windows 10 in a timely manner.5 Unfortunately, at the time of the announcement there were still many users on Windows Phone 7 devices, many of which would not be able to make the upgrade. In November 2014, nearly 17 percent of Windows phones on the market were running Windows Phone 7, and a large portion of those

3 IDC. "Smartphone OS Market Share." International Data Corporation. 2015. Web. 10 Feb. 2016 – http://www.idc.com/prodserv/smartphone-os-market-share.jsp.
4 Murphy, David.
"Microsoft Going after Smartphone Fragmentation in Windows 10 Mobile." PC Magazine. 16 May 2015. Web. 10 Feb. 2016 – http://www.pcmag.com/article2/0,2817,2484312,00.asp.
5 Bott, Ed. "Microsoft Says It's Taking Over Updates for Windows 10 Mobile Devices." ZDNet, 25 May 2015. Web. 10 Feb. 2016 – http://www.zdnet.com/article/microsoft-says-its-taking-over-updates-for-windows-10-mobile-devices/.

devices had already been unable to make the jump to Windows Phone 8/8.1 (figure 3).6 This has subsequently left a significant portion of the Windows Phone customer base unsupported, necessitating an upgrade to newer hardware.

Consumer vulnerabilities

It goes without saying that device fragmentation presents a number of serious problems for consumers. First-party applications and changes to the underlying operating system introduced by device manufacturers can put users at serious risk. Ryan Welton, a security researcher with NowSecure, revealed at Black Hat 2015 that Samsung's update process for its customized version of the SwiftKey keyboard application could be exploited to create a man-in-the-middle scenario through which remote code execution could be achieved. According to a 2015 report by Dan Goodin for Ars Technica:7

Attackers in a man-in-the-middle position can impersonate the [update] server and send a response that includes a malicious payload that's injected into a language pack update. Because Samsung phones grant extraordinarily elevated privileges to the updates, the malicious payload is able to bypass protections built into Google's Android operating system that normally limit the access third-party apps have over the device.

Welton successfully demonstrated the attack at the conference, compromising a variety of Samsung phones attached to Verizon, Sprint, and AT&T carrier networks.8 The vulnerability was quickly patched by Samsung, but carrier networks were slow to push the update to affected devices.
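The attack worked because the update response was trusted without any integrity check. A minimal sketch of the missing control, pinning an expected SHA-256 digest for an update payload before applying it, is shown below. This is illustrative only; the function, payload, and digest here are hypothetical and are not Samsung's actual update code.

```python
import hashlib
import hmac

def is_update_authentic(payload: bytes, expected_sha256: str) -> bool:
    """Accept an update payload only if its SHA-256 digest matches a
    value pinned out of band over an authenticated channel."""
    actual = hashlib.sha256(payload).hexdigest()
    # compare_digest performs a constant-time comparison of the digests
    return hmac.compare_digest(actual, expected_sha256)

# A man-in-the-middle that swaps the payload also changes its digest,
# so the tampered update is rejected.
good = b"language-pack v1.0"
pinned = hashlib.sha256(good).hexdigest()
print(is_update_authentic(good, pinned))                 # True
print(is_update_authentic(b"tampered payload", pinned))  # False
```

In practice a signed manifest (asymmetric signatures, not a shared digest) is the stronger control, but the principle is the same: never apply an update whose integrity and origin cannot be verified.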
Oftentimes very serious vulnerabilities are discovered, and problems arise when patches must be written, tested, modified, and verified across a variety of devices, operating system versions, and carrier or manufacturer builds. On July 27, 2015, Joshua Drake of Zimperium announced that he had

6 Kumar, Ramesh. "Microsoft Hopes 0% Windows Phone 8.x OS Fragmentation; Windows 10 Is the Future." Inferse, 15 Nov. 2014. Web. 10 Feb. 2016 – http://www.inferse.com/19397/microsoft-hopes-0-windows-phone-8-x-os-fragmentation-windows-10-future/.
7 Goodin, Dan. "New Exploit Turns Samsung Galaxy Phones into Remote Bugging Devices." Ars Technica. 16 Jun. 2015. Web. 10 Feb. 2016 – http://arstechnica.com/security/2015/06/new-exploit-turns-samsung-galaxy-phones-into-remote-bugging-devices/.
8 Welton, Ryan. "Remote Code Execution as System User on Samsung Phones." NowSecure Blogs. NowSecure, 15 June 2015. Web. 19 Jan. 2016 – https://www.nowsecure.com/blog/2015/06/16/remote-code-execution-as-system-user-on-samsung-phones/.

[Figure 2 – Android market shares by manufacturer (OpenSignal, 2015)]
[Figure 3 – Q4 2014 Windows Phone OS distribution (Kumar, 2014)]
discovered a variety of Android vulnerabilities that are now collectively known as "Stagefright." The Stagefright bug affected Android version 2.2 ("Froyo") and newer. When exploited successfully, the vulnerability allowed an attacker to perform remote code execution and privilege-escalation attacks against affected devices. Zimperium estimated that, at the time of the initial announcement, a successful exploit could be triggered on roughly fifty percent of affected devices without any user interaction.9 Aside from the seriousness of the vulnerability itself, most concerns about Stagefright stemmed from the hurdles to patching presented by device fragmentation. In spite of the fact that Google fixed the problem in its code base days after being notified in April 2015, it took months for major manufacturers and carrier networks to push the patches out to affected devices. Even worse, certain older devices may never be patched for Stagefright.10 To cap off the crisis, the slow release of patches and updates was complicated when a second round of Stagefright vulnerabilities was discovered and publicly released in early October of that same year.11

[Logo for the Stagefright vulnerability (Rundle, 2015)]

Oftentimes vulnerabilities are discovered in specific models of hardware, but manufacturers opt not to issue a fix in favor of promoting new versions of the devices. In 2010, an independent researcher discovered a technique for gaining root access to iOS devices (iPhone 4 and older) without the need for a jailbreak (the "SSH Ram Disk" attack described at the beginning of the article).12 At the time of this writing, the iPhone 4 is still vulnerable to this particular attack, and as these devices are no longer officially supported by Apple (and the exposure is hardware based), there will likely never be a security update that resolves the issue.
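Whether the gap comes from slow carrier rollouts (Stagefright) or abandoned hardware (the iPhone 4), fragmentation turns one fix into a long tail of exposure. A rough way to reason about that exposure is to combine a version-share table with the set of versions that actually received the fix. The shares below are invented for illustration and are not OpenSignal's figures.

```python
# Hypothetical shares of active devices by OS version (fractions summing to 1.0).
version_share = {
    "2.3": 0.05,
    "4.1": 0.15,
    "4.4": 0.40,
    "5.0": 0.25,
    "5.1": 0.15,
}

# Versions for which manufacturers and carriers actually shipped the patch.
patched_versions = {"5.0", "5.1"}

# Everything outside the patched set remains exposed.
exposed = sum(share for version, share in version_share.items()
              if version not in patched_versions)
print(f"Estimated still-vulnerable share: {exposed:.0%}")  # 60%
```

Even with the newest two releases fully patched, the majority of this hypothetical population stays vulnerable, which mirrors the months-long Stagefright patch gap described above.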
Similarly, in 2013, independent security researchers discovered two lockscreen bypasses for the Samsung Galaxy S3. Both require little technical knowledge or time to execute successfully. At the time of the discovery, Samsung neglected to patch the vulnerabilities in favor of releasing the Galaxy S4. According to a study by OpenSignal, the Samsung Galaxy S III was still the most widely owned Android phone in the United States as of Q4 2014.13

Future of mobile technologies

Given the current state of the mobile landscape, device fragmentation is likely to remain a big part of the future of mobile technologies. Fortunately, there are steps consumers can take to protect themselves and their devices. For those users

9 Z Team. "How to Protect from Stagefright Vulnerability." Zimperium. 30 Jul. 2015. Web. 10 Feb. 2016 – http://blog.zimperium.com/how-to-protect-from-stagefright-vulnerability/.
10 Rundle, Michael. "'Stagefright' Android Bug Is the 'Worst Ever Discovered.'" Wired Magazine. 27 Jul. 2015. Web. 10 Feb. 2016 – http://www.wired.co.uk/news/archive/2015-07/27/stagefight-android-bug.
11 Mimoso, Michael. "Stagefright 2.0 Vulnerabilities Affect 1 Billion Android Devices." ThreatPost. 01 Oct. 2015. Web. 10 Feb. 2016 – https://threatpost.com/stagefright-2-0-vulnerabilities-affect-1-billion-android-devices/114863/.
12 "Working iPhone Recovery Ramdisk with SSH ;-)." 16 May 2010. Web. 19 Jan. 2016 – http://msftguy.blogspot.com/2010/05/working-ramdisk-with-ssh.html.
13 OpenSignal (2015).
whose primary concern is device fragmentation, iOS is the best option of the "big three." Consumers with a preference for Android or Windows devices should purchase the newest technologies whenever possible and stick to the more well-known manufacturers. These companies are the most likely to maintain the resources that will allow them to continue to support future versions of Android or Windows on legacy hardware. Additionally, the devices of those manufacturers with larger portions of the market (Google, Samsung, HTC, and LG) typically receive security updates faster and more frequently than those of lesser-known manufacturers.

Application developers can also help alleviate the sting of device fragmentation. While the latest hardware can be alluring, responsible developers should continue to support legacy hardware for as long as possible. A significant portion of the smartphone market is comprised of outdated hardware.14 Dropping support for those devices that still see regular use can place large portions of the market at risk. Whenever possible, developers should seek to support at least two generations of hardware.

Businesses and organizations, including those with bring-your-own-device (BYOD) policies, should consider the implementation of a mobile-device-management (MDM) solution. Though not perfect, MDM solutions like AirWatch and MobileIron offer a variety of benefits in the mobile security space:

• Automatically update corporate-relevant applications and services
• Flag jailbroken or rooted devices and disconnect them from corporate resources
• Group devices together logically for ease of control and monitoring
• Prevent vulnerable or unsupported devices from being connected to corporate networks and resources
• Push corporate mobile policies to all employee devices

Finally, consumers also have a responsibility to protect themselves. Users should avoid jailbreaking or rooting their phones and tablets. Doing so can significantly amplify the fallout of a successful exploit, giving attackers complete control over the contents of compromised devices. There may also be emerging methods through which customers can force the hands of mobile device manufacturers. Earlier this year, a Dutch consumer group filed a lawsuit against Samsung over what it perceived to be the manufacturer's slow security update policy. According to a January 2016 Neowin report by Andy Weir:15

The [consumer group] cites a survey that it carried out, which it says shows that "82% of the Samsung phones examined had not been provided with the latest Android version in the two years after being introduced." Indeed, it's worth noting that Samsung isn't exactly known for delivering updates with any great sense of urgency, as its record with the newest major version of Android shows.

The group ultimately wants Samsung to provide a full two years of updates for its devices starting on the day that the device is sold (as opposed to the date of its initial release or even manufacture). While these demands may be overzealous, and keeping in mind that the case has yet to be decided, it is the first time such action has taken place: consumers breaking new security ground in the mobile device market.

Conclusion

This means that users are likely, at least to a small extent, to have more positive control over the security of their devices of choice from the very beginning. And that's a crucial step in the growth and maturity of the mobile device market. As that market continues to expand, all stakeholders, including consumers, have a responsibility to ensure that issues like device fragmentation do not seriously impact the security posture of the market at large, thus ensuring that all mobile devices are as secure as possible from initial release to end of life.

About the Author

Ken Smith works for SecureState, a global management consulting firm specializing in information security. Ken works primarily on wireless and physical assessments as well as in mobile device and application security. He enjoys presenting and regularly speaks at industry conferences. Reach Ken via email at ksmith@securestate.com or call 216.927.8200.

14 Ibid.
15 Weir, Andy. "Samsung Sued by Dutch Consumer Group over 'Poor Software Update Policy for Android Phones.'" Neowin. 20 Jan. 2016. Web. 10 Feb. 2016 – http://www.neowin.net/news/samsung-sued-by-dutch-consumer-group-over-poor-software-update-policy-for-android-phones.
Gaining Confidence in the Cloud

By Phillip Griffin – ISSA Fellow, Raleigh Chapter and Jeff Stapleton – ISSA member, Fort Worth Chapter

In cloud deployments organizations remain responsible for ensuring the security of their data. Can cloud-based technologies, such as the blockchain, play a role in providing cloud subscribers assurance their data is being properly managed and that their cloud service provider is in compliance with established security policies and practices?

Abstract
The Cloud offers organizations faster, cheaper, richer, and sometimes more secure application deployments than they themselves can orchestrate. However, organizations remain responsible for ensuring the security of their data, even when they transfer its physical control to a cloud service provider (CSP). What information does an organization require from a CSP to gain confidence they are meeting their data governance obligations? Can cloud-based technologies, such as the blockchain, play a role in providing cloud subscribers assurance their data is being properly managed and that their CSP is in compliance with established security policies and practices? For the financial services industry the X9.125 standard is under development to define requirements and provide a compliance model using blockchain technology.

Introduction
As organizations embrace the Cloud and migrate or deploy applications, and invariably data, they transfer control from internal processes to a cloud service provider (CSP). However, organizations (subscribers) remain responsible for industry information security compliance despite the delegation to the CSP. Healthcare1 data and payment2 data notwithstanding, organizations must ensure they exert adequate governance over how their data is protected. Regardless of where their data is located and who actually has physical control over the data—whether within a virtual environment or the cloud—the organization remains responsible for ensuring it can meet legal and regulatory control requirements. For the financial services industry, the X9.125 standard for cloud compliance is being developed to address requirements and compliance between a cloud subscriber and its CSP.

Some background might help clarify how X9.125 fits into financial and cloud services. As shown in figure 1, the American National Standards Institute3 (ANSI) is the United States' representative to the International Organization for Standardization4 (ISO), among others. However, ANSI does not develop standards; rather, it accredits other organizations as industry-specific standards developers and technical advisory groups (TAGs) to ISO technical committees. The Accredited Standards Committee X95 (ASC X9, or just X9) is one such organization designated by ANSI to perform the following roles:
• Develop ANSI standards for the financial services industry

Figure 1 – Standards overview

1 http://www.hhs.gov/ocr/privacy/index.html.
2 https://www.pcisecuritystandards.org/.
3 www.ansi.org.
4 www.iso.org.
5 www.x9.org.
Figure 2 – Controls overview

• Represent the United States as the TAG to ISO technical committee 68, Financial Services (TC68)
• Manage TC68 as the official secretariat

Consequently, many X9 standards are submitted to ISO for international standardization. Further, X9 often initiates ISO work items, adopts ISO financial standards, and retires ANSI standards in favor of their ISO versions. Sometimes the US markets are sufficiently distinct that a domestic X9 standard is needed in the absence of, or in parallel with, an ISO standard. The cloud security work item is assigned to the X9F4 cryptographic protocols and application security work group; Jeff Stapleton is the X9F4 chair, and Phil Griffin is the X9.125 editor.

One of the first X9F4 actions was to review the existing body of work, including special publications from the National Institute of Standards and Technology (NIST),6 Federal Financial Institutions Examination Council (FFIEC)7 cloud computing recommendations, Central Intelligence Agency (CIA) views on cloud computing, and the Cloud Security Alliance (CSA) research on comparable audit programs. These materials were digested to formulate a core set of security requirements for managing and securing information in the cloud, whether this information is located in a private cloud completely under control of the organization, or managed in a hybrid or public cloud environment. Regardless of the cloud service type or environment, these basic questions were identified:
1. What security controls does the cloud subscriber (the consumer of the cloud services) need to protect the confidentiality and integrity of its data?
2. What security controls does the cloud service provider offer to protect the confidentiality and integrity of its subscriber's data?
3. What security controls provided by the cloud service provider can be monitored by the cloud subscriber to verify compliance?
While the development of X9.125 is still a work in progress and has undergone several redesigns, cloud services and their adoption in the financial industry have continued to evolve. Thus the X9.125 standard is attempting to hit a moving target. X9 standards provide requirements ("shall") and recommendations ("should") that are practical and verifiable. Thus, as shown in figure 2, the standard needs to address security controls and interoperability between the cloud service provider and the cloud subscriber, in addition to transparency of any service sub-providers.

Cloud service providers, like any organization relying on information technology (IT), need to have their security controls documented in policy (why), practices (what), and procedures (who). They also need to securely manage resources, including people, places, and processes. IT controls include network, systems, and applications addressing authentication, authorization, and accountability (AAA). Data must also be managed across its life cycle, including creation, distribution, storage, and termination. When cryptography is used, the keys must be managed in a secure manner.

6 http://csrc.nist.gov/publications/PubsSPs.html.
7 http://ithandbook.ffiec.gov/media/153119/06-28-12_-_external_cloud_computing_-_public_statement.pdf.

How the controls are deployed and managed depends on the relationship between the CSP and the subscriber, as depicted in figure 2:
• The topmost solid arrow shows the case when controls are provided solely by the CSP to the subscriber. For example, the CSP might encrypt the subscriber's data in storage using cryptographic keys managed solely by the CSP.
• The middle arrows show cases when controls are mutually managed by both the CSP and the subscriber. For example, data in transit is encrypted using a session key that is dynamically established, based on an exchange of public key certificates between the CSP and the subscriber.
• The bottommost dotted arrow shows the case when controls are provided solely by the subscriber. For example, the subscriber might encrypt or tokenize data before it is sent to the CSP for storage or processing.
• The dotted arrow between the CSP and its service sub-provider shows the case when controls are provided indirectly to the subscriber by the sub-provider. For example, the sub-provider might be a tokenization service used by the CSP to protect the subscriber's data in storage.

While X9.125 is still a work in progress, another major aspect is to develop a reporting model such that a cloud subscriber can verify a CSP's compliance. Compliance might be to the CSP policy and practices aligned with the subscriber, or preferably to the security requirements being defined in the X9.125 standard adopted by both parties. Regardless, this implies that the CSP provides compliance information that is reliable and verifiable. One method for such a digital ledger might be blockchain technology, known today largely because of Bitcoin.

Blockchains
Blockchains have been around for decades. Notably, Merkle trees were addressed in a US patent [2] issued in 1982, so the technology is well vetted. While the Bitcoin blockchain is used as a general ledger for Bitcoin transactions, any information can be encapsulated within a blockchain, which can provide data integrity. Incorporating timestamps within the blockchain (as does Bitcoin) also provides a historical record
of what happened when and by whom.

Consider figure 3 as an example. The blocks are numbered sequentially: Block 0, 1…to N. There is always an initial block, conventionally numbered "0" to indicate its special nature. There is always a last block (N), which is the most current addition to the blockchain. Each block contains data, in this example a cloud service provider's policy numbered according to its block number: P0, P1…PN. In this example, each block contains a hash (H) of its own policy data, essentially a link to itself, so Block 0 contains H(P0), Block 1 contains H(P1), and Block N contains H(PN). Additionally, each block contains a hash of its predecessor, so Block 1 contains H(P0) as a link to Block 0, and Block N contains H(PN-1) as a link to Block N-1. Note that Block 0 does not contain a previous link, since Block 0 is the blockchain origin.

At this point one might think that the blockchain is completely reliable, but it turns out that simple links based on a hash of just the data in the previous block are unreliable. Consider an attacker that takes some intermediary Block K, which links to Block K-1 and has Block K+1 linked to it. The attacker makes a replica of Block K, which we will call Block J, modifies the data so it contains PJ and no longer PK, and publishes it as the real Block K. The attacker then updates Block K+1 to link to Block J instead of Block K. Thus, the blockchain has been compromised yet still appears to be valid, since all of the links are valid. Without some method of either verifying the publisher or the whole blockchain, a simple substitution attack is possible. Replacing the previous link (a simple hash of the previous block's data) with a digital signature would prevent the substitution attack; however, this would require the support of a public key infrastructure (PKI) with certificates, private key storage, certificate authorities, revocation lists, and the like.
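The substitution attack described above can be sketched in a few lines of Python. This is only an illustration, not any standard's construction: SHA-256 stands in for H, and the block and field names are ours.

```python
import hashlib

def h(*parts: bytes) -> bytes:
    """SHA-256 over the concatenation of the inputs."""
    return hashlib.sha256(b"".join(parts)).digest()

# Naive chain: each block stores a hash of its own data and a link that
# is simply a hash of the *previous block's data*.
policies = [b"P0", b"P1", b"P2", b"P3"]
blocks = []
for i, p in enumerate(policies):
    prev_link = h(policies[i - 1]) if i > 0 else None
    blocks.append({"data": p, "self": h(p), "prev": prev_link})

def links_valid(chain) -> bool:
    """Check that every block's previous link matches its predecessor's data."""
    return all(
        chain[i]["prev"] == h(chain[i - 1]["data"])
        for i in range(1, len(chain))
    )

assert links_valid(blocks)

# Substitution attack: replace Block 1's data (Block K in the article)
# with forged data PJ and repair only the single link that points at it.
blocks[1]["data"] = b"PJ"
blocks[1]["self"] = h(b"PJ")
blocks[2]["prev"] = h(b"PJ")

# Every link still checks out, yet the published history was rewritten.
assert links_valid(blocks)
```

The final assertion passing is exactly the problem: local link checks cannot detect that Block K was swapped for Block J.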
Alternatively, replacing the previous link with a hash chain achieves the same anti-substitution control without the PKI overhead. Referring back to figure 3, we have provided another chain field where each block contains a chain value numbered by its block number: C0, C1…CN. Each chain value is a link to all of the previous blocks: it is a hash of two elements, the previous chain value and a hash of the block's own data. Thus, Block N contains a hash of CN-1 and a hash of its own data H(PN), that is H(CN-1, H(PN)). Likewise, Block 1 contains a hash of C0 and a hash of its own data H(P1), namely H(C0, H(P1)). Block 0 only contains a hash of a hash of its own data, H(H(P0)), because there is no previous chain. In this manner an attacker cannot replace any of the published blocks without updating the whole chain, which is the basis of the Bitcoin blockchain security.

Figure 3 – Simple blockchain

The presumption is that it is cheaper to be honest than dishonest:

If a majority of CPU power is controlled by honest nodes, the honest chain will grow the fastest and outpace any competing chains. To modify a past block, an attacker would have to redo the proof-of-work of the block and all blocks after it and then catch up with and surpass the work of the honest nodes. We will show later that the probability of a slower attacker catching up diminishes exponentially as subsequent blocks are added. [3]

Much of the media discussion around Bitcoin has focused on its role as a cryptocurrency. Bitcoin provides a means for achieving efficient, anonymous financial transactions. In this context, Bitcoin is sometimes described as a disruptive technology, one that facilitates the activities of drug dealers and terrorists, one that threatens to disintermediate and undermine the existing financial services industry, or one that presents banks who serve Bitcoin industry players with heightened "Bank Secrecy Act (BSA)/Anti-Money Laundering (AML) Act compliance risks" [1].
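The hash-chain construction, Cn = H(Cn-1, H(Pn)) with C0 = H(H(P0)), can be sketched in Python (again an illustration using SHA-256, with names of our own choosing):

```python
import hashlib

def h(*parts: bytes) -> bytes:
    """SHA-256 over the concatenation of the inputs."""
    return hashlib.sha256(b"".join(parts)).digest()

def build_chain(policies):
    """Build blocks whose chain value commits to ALL previous blocks:
       C0 = H(H(P0)); Cn = H(Cn-1, H(Pn)) for n >= 1."""
    chain = []
    for i, p in enumerate(policies):
        c = h(h(p)) if i == 0 else h(chain[i - 1]["chain"], h(p))
        chain.append({"data": p, "chain": c})
    return chain

def verify(chain) -> bool:
    """Recompute every chain value from the data and compare."""
    rebuilt = build_chain([b["data"] for b in chain])
    return all(a["chain"] == b["chain"] for a, b in zip(chain, rebuilt))

blocks = build_chain([b"P0", b"P1", b"P2", b"P3"])
assert verify(blocks)

# A substitution now fails: changing Block 1's data invalidates C1 and,
# transitively, every later chain value the attacker has not recomputed.
blocks[1]["data"] = b"PJ"
assert not verify(blocks)
```

An attacker can only make the forged chain verify by recomputing every chain value from the altered block forward, which is the whole-chain update the article describes and, in Bitcoin's case, what proof-of-work makes prohibitively expensive.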
On the other hand, Bitcoin has seen adoption by e-commerce stalwarts such as PayPal, Overstock, Dish Network, and Dell Computers, as well as "many community-driven organizations" that "allow anonymous donations using Bitcoin" [6]. Despite any negative aspects associated with Bitcoin, "there remain many legitimate uses for Bitcoin and businesses that facilitate these legitimate transactions" [7]. There is also growing interest in leveraging the blockchain technology that underpins Bitcoin to both reduce transaction costs and strengthen financial services security. To this end, more general-purpose applications of the blockchain that are far removed from the use of Bitcoin to facilitate financial services transactions are being considered. For example, blockchains might be used to evaluate, monitor, assess, or even audit a cloud services provider:
• The CSP might publish its information security policy and practices in a blockchain, providing an historical record of versions and changes. In this manner, new subscribers can evaluate the CSP, existing subscribers can monitor changes, internal audit can assess the CSP, and professionals can perform independent audits of the CSP.
• The CSP might distribute information security news in a blockchain, providing notifications or alerts to its subscribers about incidents, events, or new vulnerabilities in a reliable manner. Today, this information is typically provided via emails or blogs.
• The CSP might issue information security details in a blockchain, providing real-time data about its controls. In this manner, existing subscribers can monitor the CSP for its dependability, consistency, and overall trustworthiness. Another name for this would be compliance.

Hence, the concept of using blockchains to record and verify CSP compliance data is not as farfetched as might have been initially considered. For cloud subscribers to gain such assurance, and to exercise due diligence in the conduct of their governance and risk management responsibilities, they need some insight into what goes on under the covers at their CSP. Cloud subscribers need the same types of operational evidence of compliance from their CSP that they would expect their internal IT departments to provide. Whether an organization's data is inside its firewall or floating around in the cloud, informed information security management practices still depend on access to the basics: vulnerability scan results, penetration test results, system logs, application logs, analytical results, security alerts, and summarized information. Compliance evidence must have origin authenticity, data integrity, and often confidentiality safeguards that prohibit access by attackers and other unauthorized individuals.

The attractiveness of the Bitcoin blockchain includes its decentralization. Bitcoin spenders submit their transactions (signature, inputs, outputs) to multiple Bitcoin nodes such that the transaction gets published in the next block, which is originated by the next miner to solve the hash solution. The idea is that the amount of work to perpetrate fraud far exceeds the work factor for mining.
Sometimes a race condition creates a bifurcated blockchain generated by two different Bitcoin nodes; however, consensus processing will eventually prune the blockchain to only one authentic version. There is no central authority that provides a processing choke point, a single point of failure, or a single point of attack.

However, blockchain management is not without its problems. There are orphaned blocks, which are valid but did not make it into the main Bitcoin chain. There are always unconfirmed transactions waiting for the next block, which might get lost during the bifurcation and pruning process. There are double spends, transactions where the same Bitcoin fractions get spent by the same entity to two different receivers. There are strange transactions, where the syntax or semantics are invalid. And there are outright rejected transactions dropped by Bitcoin nodes that never get included in the chain. Some of these might be processing errors due to software bugs, Bitcoin versions, or rules issues. Alternatively, some transactions might be fraudulent in nature. Bitcoin fraud management is relatively nascent, and without a central authority there are no arbitration or adjudication programs available.

Bitcoin information is publicly accessible by definition. Hash algorithms provide the links between blocks and transactions, and digital signatures provide transaction integrity and authentication. Non-repudiation is not feasible, as Bitcoin identifiers support anonymity, and the lack of arbitration does not meet the legal needs discussed in the Digital Signature Guidelines [4] and the PKI Assessment Guideline [5]. Further, the Bitcoin blockchain does not offer data confidentiality. Some of the cloud service provider's information security management data is sensitive such that it might need to be encrypted, yet accessible by authorized clients or regulatory bodies. Thus, key management schemes need to be considered.
There is also growing interest in cloud data confidentiality and user anonymity. In a paper presented at the Security Standardization Research (SSR) 20158 conference held recently in Tokyo, Japan, researchers McCorry, Shahandashti, Clarke, and Hao proposed a new category of Authenticated Key Exchange (AKE) protocols. These new protocols, which "bootstrap trust entirely from the blockchain," are identified by the authors as "Bitcoin-based AKE" [6]. The SSR 2015 paper describes two new protocols, one with a guarantee of forward secrecy, and offers proof-of-concept prototypes with experimental results to demonstrate their practical feasibility. Both protocols provide greater anonymity than can be achieved using digital certificate- or password-based AKE.

Following the guidance of international security standards can help ensure that the same information security policies used to manage risk when information systems reside in traditional non-cloud environments are also applied in the cloud. Recently, the big three international security standardization bodies published Recommendation ITU-T X.1631 | ISO/IEC 27017, Code of practice for information security controls based on ISO/IEC 27002 for cloud services.9 This standard builds on selected parts of the familiar ISO/IEC 27002 Code of practice for information security management10 but adds additional cloud-specific recommendations and guidance. Although ITU-T X.1631 | ISO/IEC 27017 provides important recommendations and guidance, it contains no actual requirements. Conversely, the draft X9.125 standard hardens the ISO, IEC, and ITU-T recommendations and guidance into a set of specific information security management requirements.
Where ITU-T X.1631 | ISO/IEC 27017 relies on clauses 5 through 18 of the ISO/IEC 27002 Code of Practice, X9.125 defines requirements based on comparable clauses in ISO/IEC 27001, Information security management systems – Requirements.11

8 http://www.ssr2015.com/.
9 http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=43757.
10 http://www.iso.org/iso/catalogue_detail?csnumber=54533.
11 https://en.wikipedia.org/wiki/ISO/IEC_27001:2013.

Conclusions
Blockchains, a decades-old cryptographic technology, have become a creature of the Cloud. Their adoption and use carry many of the same security concerns as other cloud-based applications and services. But for blockchains to be trusted in the current financial services regulatory environment, and to be widely adopted, blockchain-based systems must comply with an organization's existing security policy and practices. Many of the policies needed to manage blockchains and other cloud-based deployments are the same as those used to manage security risk within an organization. Organizations must continue to manage risk and fully exercise their information security governance responsibilities regardless of where their data and applications roam. Cloud subscribers need the ability to verify that their cloud service providers are securing information in compliance with established requirements.

ASC X9 is currently developing the X9.125 standard with the option of the United States submitting the work to ISO for international standardization. Once the cloud security requirements have been completed, the corresponding compliance data might be encapsulated in a publicly available or privately provided blockchain. Cloud subscribers, internal or external auditors, regulators, or any independent third-party assessor should be able to validate the CSP by verifying its information security blockchain.

This article is also a call for participation. Cloud service providers, cloud subscribers, or organizations that are interested in the development of the X9.125 standard are encouraged to contact ASC X9 or the X9F4 work group chair. Participation by any X9 member is welcomed. Once the X9.125 standard is approved as a new ANSI standard, the possibility of its being submitted to ISO as a USA offering is something that will be seriously considered with the appropriate organizations' support.

References
1. King, Douglas. (2015). Banking Bitcoin-Related Businesses: A Primer for Managing BSA/AML Risks. Federal Reserve Bank of Atlanta.
Retrieved November 19, 2015, from https://www.frbatlanta.org/-/media/Documents/rprf/rprf_pubs/2015/banking-Bitcoin-related-businesses.pdf.
2. Merkle, R. C. (1988). "A Digital Signature Based on a Conventional Encryption Function." Advances in Cryptology — CRYPTO '87. Lecture Notes in Computer Science 293, p. 369. doi:10.1007/3-540-48184-2_32. ISBN 978-3-540-18796-7. US Patent 4309569, Method of Providing Digital Signatures, Ralph C. Merkle, January 5, 1982.
3. Nakamoto, Satoshi. Bitcoin: A Peer-to-Peer Electronic Cash System. Bitcoin.org, 31 October 2008 – https://bitcoin.org/bitcoin.pdf.
4. ISC, Digital Signature Guidelines: Legal Infrastructure for Certification Authorities and Secure Electronic Commerce, Information Security Committee (ISC), Electronic Commerce and Information Technology Division, Section of Science and Technology, American Bar Association (ABA), ISBN 1-57073-250-7, August 1996.
5. ISC, PKI Assessment Guideline (PAG), Information Security Committee (ISC), Electronic Commerce Division, Section of Science & Technology Law, American Bar Association (ABA), ISBN 1-57073-943-9, June 2001.
6. McCorry, Patrick; Shahandashti, Siamak F.; Clarke, Dylan; Hao, Feng (School of Computing Science, Newcastle University). "Authenticated Key Exchange over Bitcoin." Security Standardisation Research, Lecture Notes in Computer Science, Volume 9497, pp. 3-20, December 9, 2015 – retrieved from http://eprint.iacr.org/2015/308.pdf.
7. King, Douglas. Retail Payments Risk Forum Working Paper, Federal Reserve Bank of Atlanta, October 2015.

About the Authors
Phillip H. Griffin, CISM, has over 20 years of experience in the development of commercial, national, and international security standards and cryptographic messaging protocols.
Phil has a Master of Information Technology, Information Assurance and Security degree, and he has been awarded nine US patents at the intersection of biometrics, radio frequency identification (RFID), and information security management. He may be reached at phil@phillipgriffin.com.

Jeff Stapleton has been an ISSA member and participated in X9 for over twenty years; he has contributed to the development of over three dozen X9 and ISO security standards and has been the chair of the X9F4 work group for over 15 years. The X9F4 work group's program of work includes the five-year review of two published standards (X9.73, X9.84) and development of three new standards (X9.112, X9.122, X9.125) in addition to supporting ISO standard efforts. He may be reached at jjs78023@yahoo.com.
There is now a political battle going on in Washington, DC, the outcome of which could have a significant impact on the future of how encryption is used. In the US, law enforcement agencies currently have the ability to easily intercept phone calls, and they would like to extend this ability to give them easy access to forms of secure communication like encrypted email and text messaging. Privacy advocates seem to think that this is a very bad idea.

Law enforcement agencies argue that the world is "going dark" for them because the use of encryption is reducing their ability to investigate and prosecute criminals. Privacy advocates argue that adding a backdoor to products that use encryption is inherently dangerous because it also creates a way for hackers to bypass the encryption. They also argue that enough information is contained in messaging metadata for law enforcement agencies to investigate and prosecute criminals, and because of this, law enforcement agencies do not need the content of encrypted messages to do their job.

Here we look at the facts that might or might not support each of these claims, but before doing that, it is probably helpful to put the current debate into some context. A debate very similar to the current one took place in the years just before the dot-com boom, right before the Internet transformed the way that we both conduct business and live our daily lives.

Who is driving the debate?
It seems clear that the goal of law enforcement officials in gaining the ability to decrypt encrypted messages is to investigate and prosecute criminals. The goals of the average citizen are probably more varied. Their opinions probably range from complete support for the goals of law enforcement officials to complete opposition to them, as well as to more moderate positions in between.
This is what Alan Westin's 1967 book Privacy and Freedom1 tells us to expect, because people tend to fall into three general categories when it comes to valuing their privacy: privacy fundamentalists, the privacy unconcerned, and privacy pragmatists.

About 25 percent of people can be classified as privacy fundamentalists. They place a very high value on their privacy and tend to favor strict government regulations supporting privacy. About 16 percent of people can be classified as the privacy unconcerned. They are generally not much concerned about privacy at all and tend not to support strict government regulations supporting privacy. The remaining 59 percent of people have only moderate concerns about privacy and believe that trade-offs between privacy and something else of value are acceptable, and they tend to favor using the existing set of laws and regulations to ensure that fair practices are enforced when this happens.

Because each of these groups has a very different approach to privacy, it seems unlikely that any single way to address privacy concerns will appeal to them all. In particular, it might be the case that the point of view that the use of encryption should be a fundamental right that governments should not interfere with is a belief held only by a small fraction of the population, while a significant majority might find the trade-off between personal privacy and law enforcement officials getting the ability to decrypt encrypted messages to be an acceptable one.

But because this majority also tends not to get overly involved in the political process, their opinions are typically not heard in debates like the current one. Yet in a democracy, where governing by consensus is usually a good idea, learning exactly what the consensus is, is probably a good first step towards finding a good solution.

1 Westin, Alan F. Privacy and Freedom. New York: Atheneum, 1967.

Crypto Wars II
By Luther Martin – ISSA member, Silicon Valley Chapter and Amy Vosters

The debate over whether or not to give US law enforcement officials the ability to decrypt encrypted messaging has recently been revisited after a twenty-year break. This is an emotionally charged issue, but an important one, so it might be useful to attempt an unbiased and impartial look at the facts and see how well they correspond to the main talking points of the supporters of each side of the debate. The results may be surprising. To learn more, read on.

The Clinton years: Crypto Wars I
To give some historical perspective to the current debate, it is probably useful to recall what happened during the Clinton administration. During the early 1990s, the US government enacted legislation that gave law enforcement officials unprecedented access to telecommunications traffic in the US—this was accomplished by the Communications Assistance for Law Enforcement Act (CALEA) of 1994.2

CALEA required manufacturers of telecommunications equipment to include features in their products that made it very easy for law enforcement officials to intercept telecommunications traffic and required telecommunications carriers to use equipment with these features. Although CALEA carefully distinguished between telecommunications services (which it applied to) and information services (which it did not apply to), this distinction has become somewhat blurred recently, particularly after recent FCC rule making3 extended CALEA to facilities-based broadband Internet access providers and providers of interconnected voice-over-internet-protocol (VoIP) services.

But while the government was successful in implementing laws that made it easy to intercept telecommunications traffic, it was much less successful in implementing laws that would give it access to encrypted traffic. It attempted to do this by restricting the export from the US of any strong encryption (providing greater than 40 bits of

2 Public Law No. 103-414, 108 Stat. 4279 (1994), 47 USC 1001-1010. Available at ftp://ftp.fcc.gov/pub/Bureaus/OSEC/library/legislative_histories/1743.pdf.
3 "Communications Assistance for Law Enforcement Act and Broadband Access and Services," 47 CFR Part 1, ET Docket No. 04-295; FCC 06-56 (2006). Available at https://apps.fcc.gov/edocs_public/attachmatch/FCC-06-56A1.pdf.
cryptographic strength) that did not implement a form of key escrow, in which two different government agencies would independently hold information that could be combined to recover any encryption key that the escrowed encryption scheme used.

In April 1993, the Clinton White House announced the "Clipper Chip,"4 technology that would be used to implement the government's plan for key escrow. In February 1994, the Department of Commerce published the Escrowed Encryption Standard5 (FIPS 185) to support this initiative. Attorney General Janet Reno then announced that NIST and the Department of the Treasury would be the escrow agents for the government's escrowed encryption program and published the procedures for releasing escrowed keys to law enforcement officials for use under approved wiretap orders. Everything seemed ready to go.

But a significant number of Americans were firmly against the government's plan for escrowed encryption and believed that their need for privacy was more important than the government's need to be able to read some encrypted messages. In the face of strong public pressure, the US government eventually abandoned its plans for escrowed encryption, leaving FIPS 185 looking much like the Eighteenth Amendment to the US Constitution that banned the production, transport, and sale of alcoholic beverages starting in 1920—something that might have made sense at the time but that looks like a very poorly considered idea from a safe distance into the future. (And if FIPS 185 is analogous to Prohibition, its eventual withdrawal in 2015 is almost certainly analogous to the passage of the Twenty-first Amendment in 1933.)

4 http://www.cryptomuseum.com/crypto/usa/clipper.htm.
5 http://csrc.nist.gov/publications/fips/fips185/fips185.pdf.
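The core idea of two-agent key escrow, each agency independently holding information that only reveals the key when combined, can be illustrated with simple XOR secret splitting. The actual Clipper/EES mechanism was more elaborate; this Python sketch shows only the two-share concept, with function names of our own choosing:

```python
import secrets

def split_key(key: bytes) -> tuple:
    """Split a key into two shares; each share alone reveals nothing,
    since share_a is uniformly random and share_b = share_a XOR key."""
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(a ^ k for a, k in zip(share_a, key))
    return share_a, share_b

def recover_key(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two shares back together to recover the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = secrets.token_bytes(16)  # e.g., a 128-bit session key
escrow_agent_1, escrow_agent_2 = split_key(key)

# Only the combination of both agencies' shares recovers the key.
assert recover_key(escrow_agent_1, escrow_agent_2) == key
```

This is also why critics focused on the escrow agents themselves: compromise both shares and every escrowed key is exposed.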
January 2017 | ISSA Journal | Crypto Wars II | Luther Martin and Amy Vosters
So at the end of the Clinton administration it might have looked like law enforcement officials had a significant victory in CALEA, and privacy advocates had a significant victory in the defeat of the government's plans for escrowed encryption. Over two decades later, law enforcement officials decided to revisit the idea of extending the scope of CALEA to include things that might be considered information services. This was successful in 2006, when the scope of CALEA was expanded to include some broadband Internet and VoIP services.

Today, from the point of view of law enforcement officials, extending the scope of existing laws to give them the ability to read encrypted messaging probably seems like a reasonable thing to do. From their point of view, this is probably just an incremental change, particularly when compared to the significant capabilities that they already have to intercept communications.

Going dark

Law enforcement agencies now claim that they need the ability to decrypt encrypted communications because the world is "going dark" for them. There is no precise definition for "going dark," but the FBI has described this phenomenon as "a growing gap between the existing statutory authority of law enforcement to intercept communications pursuant to court order and the practical ability to do so."6

This certainly sounds plausible. After all, criminals probably want to hide their intentions and actions from law enforcement agencies, so they may be more likely to use encryption than the average person is. But is this really happening?

Fortunately, there is lots of data available on the use of wiretaps by US state and federal law enforcement agencies, and it is collected and summarized in the US federal courts' annual Wiretap Report. Since 2000, these reports have listed how many court-authorized intercepts were unable to be read by law enforcement officials because of the use of encryption. This data does not include wiretaps authorized for some national security reasons, but it is the best data that is available on the subject. The data from the 2000 through 2014 Wiretap Reports is summarized in table 1.

Table 1 – Number of intercepts installed per year and how many of these were unreadable because of the use of encryption (Source: US courts Wiretap Reports, 2000-2014, www.uscourts.gov/statistics-reports/analysis-reports/wiretap-reports)

Year   Intercepts installed   Intercepts unread due to encryption
2000   1,139                  0
2001   1,405                  0
2002   1,273                  0
2003   1,367                  0
2004   1,633                  0
2005   1,694                  0
2006   1,714                  0
2007   2,119                  0
2008   1,809                  0
2009   1,764                  0
2010   2,311                  0
2011   2,189                  0
2012   2,501                  4
2013   2,331                  10
2014   2,433                  4

Note that in 2013, the year in which the largest number of wiretaps were thwarted by the use of encryption, the number of wiretaps that law enforcement officials were unable to read was only about 0.4 percent of the total number of intercepts installed that year. In fact, in 2013, the worst year ever for law enforcement in this particular area, they were still able to read over 99.5 percent of intercepts. That is indeed less than the 100 percent that they were able to read in most other years, but it is still a significant fraction, and it certainly looks like law enforcement officials are having very little trouble reading what they get from their wiretaps. So using the best available data, the claim that the world is "going dark" for law enforcement because of the growing use of encryption seems to be false.

Backdoors mean weak security

Privacy advocates claim that if law enforcement had the ability to decrypt any encrypted messages, then hackers could also take advantage of the same capability, and this would reduce the security provided by the encryption to an unacceptable level. This certainly sounds reasonable, but is it true?

The use of encryption in some form is very common in the business world. This can cause lots of regulatory and compliance issues if it is done carelessly, because if you lose the key that was used to encrypt important business information, then you also lose the information that was encrypted with it. This is an unacceptable situation for almost all businesses. To avoid it, most enterprise software that uses encryption also has the capability to recover lost keys. It is very common to control access to and use of an encryption key through some form of authentication, most commonly a password. And when users forget their password, they will need to get another copy of this key.

This is a necessary capability of enterprise software that uses encryption, but it is also the very sort of backdoor that privacy advocates claim will introduce serious vulnerabilities into the software that hackers can exploit. But this seems to have not occurred: enterprise software that implements key recovery does not seem to have serious vulnerabilities that hackers are able to exploit. So it seems that this particular claim may not be as accurate as we may first think it is.

6 Congressman Peter T. King, "Remembering the Lessons of 9/11: Preserving Tools and Authorities in the Fight Against Terrorism," Journal of Legislation, Vol. 41, No. 2 (2015). Available at http://scholarship.law.nd.edu/jleg/vol41/iss2/1.
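The enterprise key-recovery pattern described above can be sketched in a few lines: the data key is wrapped once under a key derived from the user's password (the normal access path) and once under an organizational recovery key (the "backdoor" that rescues forgotten passwords). This is a toy sketch under stated assumptions, not any product's actual design; real products wrap keys with an authenticated cipher such as AES Key Wrap rather than the XOR-with-a-hash used here for brevity, and all names and parameters below are illustrative.

```python
import hashlib
import secrets

def _wrap(kek: bytes, data_key: bytes) -> bytes:
    # Toy wrapping: XOR the data key with a stream derived from the wrapping key.
    # Real enterprise products use an authenticated cipher (e.g., AES Key Wrap).
    stream = hashlib.sha256(kek).digest()[: len(data_key)]
    return bytes(a ^ b for a, b in zip(data_key, stream))

_unwrap = _wrap  # XOR wrapping is its own inverse

def protect(data_key: bytes, password: str, recovery_kek: bytes) -> dict:
    """Store the data key under both the user's password and a recovery key."""
    user_kek = hashlib.pbkdf2_hmac("sha256", password.encode(), b"demo-salt", 100_000)
    return {
        "by_password": _wrap(user_kek, data_key),      # normal access path
        "by_recovery": _wrap(recovery_kek, data_key),  # escrowed recovery copy
    }

data_key = secrets.token_bytes(16)
recovery_kek = secrets.token_bytes(32)
record = protect(data_key, "correct horse battery", recovery_kek)

# The user forgets the password: the recovery copy still yields the data key.
assert _unwrap(recovery_kek, record["by_recovery"]) == data_key
```

The design point the article is making falls out of the sketch: the recovery path is functionally a second door to the same key, so its security rests entirely on how well the recovery key is generated, stored, and access-controlled.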
It might also be useful to note that because of CALEA, all telecommunications equipment used within the US contains a backdoor, but the existence of these backdoors does not seem to have compromised the security of the US telecommunications networks. Carelessly implemented key recovery can almost certainly introduce weaknesses that hackers can exploit, but there seems to be little evidence that deployed enterprise software products that include the ability to implement key recovery are actually vulnerable in this way. In light of this, the general claim that backdoors are inherently a dangerous security vulnerability seems to be false.

Metadata is enough

Metadata, or data about data, is typically transmitted in an unencrypted form, even for encrypted messages. Metadata about an email message includes information like the "to" and "from" addresses, and this information needs to be available to the applications that get an encrypted message from sender to recipient, even though the encrypted message body does not. And because metadata is not encrypted, it is readily available to law enforcement officials.

Some privacy advocates claim that the information in metadata is just as valuable to law enforcement officials as the encrypted message is, so that by collecting and analyzing metadata they should be able to investigate and prosecute criminals without access to any encrypted messages at all. The information in metadata can easily characterize you and how you live your life. It can tell where you live and work, and it can tell who you communicate with and about what. So the claim that law enforcement officials can do their jobs just as well with messaging metadata instead of the content of the corresponding messages sounds like it might be plausible.

There have been two recent careful studies of this issue, both of which focused on the US government's use of metadata to identify terrorist activity. One was created in the recent case of Klayman v. Obama7 (Klayman). In Judge Leon's 2013 ruling in Klayman, he summarized how effective the government's metadata collection programs had been and noted that there was no evidence at all that any government agency had been able to identify and investigate imminent threats faster with access to metadata than they were able to without it, despite the claims by both the FBI8 and the NSA9 that metadata is an important tool in their investigations.

In the second study, The President's Review Group on Intelligence and Communications Technologies made a similar finding.10 In particular, when discussing the effectiveness of the NSA's metadata collection program, it noted that there had been no instance in which the NSA could say with confidence that the outcome of any investigation would have been different without the program.

So from the point of view of learning about and preventing terrorist attacks, metadata appears to be of limited value. And while it is possible that a different pattern might be present in metadata when it is used in other types of investigations, it seems more reasonable to expect that the same pattern would hold, and that metadata would essentially be of limited value to other investigations as well. Because of this, the claim that metadata is an adequate replacement for the contents of messages in criminal investigations seems to be false.

7 Klayman v. Obama, 957 F. Supp. 2d 1, 29 (D.D.C. 2013).
8 http://www.clearinghouse.net/chDocs/public/NS-DC-0007-0014.pdf.
9 http://www.clearinghouse.net/chDocs/public/NS-DC-0007-0016.pdf.
10 The President's Review Group on Intelligence and Communications Technologies, "Liberty and Security in a Changing World: Report and Recommendations of The President's Review Group on Intelligence and Communications Technologies," December 2013. Available at https://www.whitehouse.gov/sites/default/files/docs/2013-12-12_rg_final_report.pdf.
Summary

It should not be too surprising that both sides in political debates may make claims that are not strictly based on facts. The current debate over giving US law enforcement officials the ability to decrypt encrypted messages seems to be no exception to this rule, and all of the main arguments on both sides of the issue seem to essentially be false.

The claim that the use of encryption is making the world "go dark" for law enforcement seems to be false. The claim that backdoors are inherently bad because they create a weakness that hackers can exploit seems to be false. And the claim that messaging metadata is an adequate replacement for the text of encrypted messages seems to be false.

The debate over whether or not to give US law enforcement officials the ability to decrypt encrypted communications is very important because it will dramatically affect the privacy of hundreds of millions of people. The decision on whether or not to do this should be based on the best available facts, not on arguments that are excellent examples of why English has the word "specious," indicating something that is superficially plausible but actually wrong.

On the other hand, psychologists tell us that we tend not to base most decisions on facts; instead, we tend to have emotional reactions that we then try to rationalize. So a more important aspect of this debate may really be whether or not we feel that the government could be trusted not to abuse the ability to decrypt encrypted messages if it had that ability. But that is something that is beyond the scope of what we can easily discuss here.

About the authors

Luther Martin is a Distinguished Technologist at Hewlett Packard Enterprise. You can reach him at luther.martin@hpe.com.

Amy Vosters is a marketing manager at SOASTA. You can reach her at amy_vosters@yahoo.com.