Heuristics and biases in cyber security dilemmas
Heather Rosoff • Jinshu Cui • Richard S. John
Published online: 28 September 2013
© Springer Science+Business Media New York 2013
Abstract Cyber security often depends on decisions
made by human operators, who are commonly considered a
major cause of security failures. We conducted 2 behav-
ioral experiments to explore whether and how cyber
security decision-making responses depend on gain–loss
framing and salience of a primed recall prior experience. In
Experiment I, we employed a 2 × 2 factorial design,
manipulating the frame (gain vs. loss) and the presence
versus absence of a prior near-miss experience. Results
suggest that the experience of a near-miss significantly
increased respondents’ endorsement of safer response
options under a gain frame. Overall, female respondents
were more likely to select a risk averse (safe) response
compared with males. Experiment II followed the same
general paradigm, framing all consequences in a loss frame
and manipulating recall to include one of three possible
prior experiences: false alarm, near-miss, or a hit involving
a loss of data. Results indicate that the manipulated prior
hit experience significantly increased the likelihood of
respondents’ endorsement of a safer response relative to
the manipulated prior near-miss experience. Conversely,
the manipulated prior false-alarm experience significantly
decreased respondents’ likelihood of endorsing a safer
response relative to the manipulated prior near-miss
experience. These results also showed a main effect for age
and were moderated by respondent’s income level.
Keywords Cyber security · Framing effect · Near-miss · Decision making
1 Introduction
Individual users regularly make decisions that affect the
security of their personal devices connected to the internet
and, in turn, to the security of the cybersphere. For
example, they must decide whether to install software to
protect from viruses and hackers, download files from
unknown sources, or submit personal identification infor-
mation for web site access or online purchases. Such
decisions involve actions that could result in various neg-
ative consequences (loss of data, reduced computer per-
formance or destruction of a computer’s hard drive).
Conversely, other alternative actions are available that
could protect individuals from negative outcomes, but also
could limit the efficiency and ease of use of the personal
device.
Aytes and Connolly (2004) propose a decision model of
computer-related behavior that suggests individuals make a
rational choice to either engage in safe or unsafe cyber
behavior. In their model, individual behavior is driven by
perceptions of the usefulness of safe and unsafe behaviors
and the consequences of each. More specifically, the model
captures how information sources, the user’s base knowl-
edge of cyber security, the user’s relevant perceptions (e.g.,
interpretations of the applicability of the knowledge), and
the user’s risk attitude influence individual cyber decision
making.
H. Rosoff (✉)
Sol Price School of Public Policy, University of Southern
California, Los Angeles, CA, USA
e-mail: [email protected]
H. Rosoff · J. Cui · R. S. John
Center for Risk and Economic Analysis of Terrorism Events
(CREATE), University of Southern California, Los Angeles,
CA, USA
J. Cui · R. S. John
Department of Psychology, University of Southern California,
Los Angeles, CA, USA
Environ Syst Decis (2013) 33:517–529
DOI 10.1007/s10669-013-9473-2
This paper reports on two behavioral experiments, using
over 500 respondents, designed to explore whether and
how recommended cyber security decision-making
responses depend on gain–loss framing and salience of
prior cyber dilemma experiences. More specifically, we
explored whether priming individuals to recall a prior
cyber-related experience influenced their decision to select
either a safe versus risky option in responding to a hypo-
thetical cyber dilemma. We hypothesized that recall of a hit
experience involving negative consequences would
increase feelings of vulnerability, even more so than a
near-miss, and lead to the endorsement of a risk averse
option. This result has been reported in the disaster liter-
ature, which has shown that individual decision making
depends on prior experiences, including hits, near-misses
(events where a hazardous or fatal outcome could have
occurred, but did not), and false alarms (Barnes et al. 2007;
Dillon et al. 2011; Siegrist and Gutscher 2008). Further-
more, damage from past disasters has been shown to sig-
nificantly influence individual perceptions of future risk
and to motivate more protective and mitigation-related
behavior (Kunreuther and Pauly 2004; Siegrist and Gut-
scher 2008; Slovic et al. 2005).
We anticipated that the effect of prior near-miss expe-
riences would depend on the interpretation of the prior
near-miss event by the respondent. This expectation was
based on near-miss research that has shown that future-
intended mitigation behavior depends greatly on the per-
ception of the near-miss event outcome. Tinsley et al.
(2012) describe two near-miss types—a resilient and vul-
nerable near-miss. A resilient near-miss is perceived as an
event in which disaster simply did not occur. In these
situations, individuals were found to
underestimate the danger of subsequent events and were
more likely to engage in risky behavior by choosing not to
take protective action. A vulnerable near-miss occurs when
a disaster almost happened. New information is incorpo-
rated into the assessment that counters the basic ‘‘near-
miss’’ definition and results in the individual being more
inclined to engage in risk averse behavior (the opposite of
the behavior associated with a resilient near-miss
interpretation). In
the cyber context, we expected that respondents who fail to
recognize a prior near-miss as a cyber threat would be more
likely to recommend the risky course of action. However, if
respondents view a recalled near-miss as evidence of vul-
nerability, then they would be more inclined to endorse the
safer option.
In the case of a recalled prior false-alarm experience,
one hypothesis known as the ‘‘cry-wolf effect’’ (Breznitz
2013) suggests that predictions of disasters that do not
materialize affect beliefs about the uncertainty associated
with future events. In this context, false alarms are believed
to create complacency and reduce willingness to respond to
future warnings, resulting in a greater likelihood of
engaging in risky behavior (Barnes et al. 2007; Donner
et al. 2012; Dow and Cutter 1998; Simmons and Sutter
2009). In contrast, there is research showing that the public
may have a higher tolerance for false alarms than antici-
pated. This is because of the increased credibility given to
the event due to the frequency with which it is discussed,
both through media sources and informal discussion, thus,
suggesting that false alarms might increase individuals’
willingness to be risk averse (Dow and Cutter 1998). We
anticipated that recall of prior false alarms would likely
make respondents feel less vulnerable and more willing to
prefer the risky option, compared with the near-miss and
hit conditions.
In our research, we also anticipated that there would be
some influence of framing on individual cyber decision
making under risk. Prospect theory and related empirical
research suggest that decision making under risk depends
on whether potential outcomes are perceived as a gain or as
a loss in relation to a reference point (Kahneman and
Tversky 1979; Tversky and Kahneman 1986). A common
finding in the literature on individual preferences in deci-
sion making shows that people tend to avoid risk under
gain frames, but seek risk when outcomes are framed as a
loss.
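For concreteness, this gain–loss asymmetry is usually formalized with a value function that is concave for gains, convex for losses, and steeper for losses. A sketch in standard notation (the functional form follows Kahneman and Tversky; the numerical estimates come from later empirical work and are not given in this paper):

```latex
% Prospect theory value function over gains/losses x relative to the
% reference point (parameter values are later empirical estimates,
% quoted here only for illustration).
v(x) =
  \begin{cases}
    x^{\alpha}             & x \ge 0 \\
    -\lambda\,(-x)^{\beta} & x < 0
  \end{cases}
\qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25
```

With λ > 1, losses loom larger than equivalent gains, which is what drives risk aversion under gain frames and risk seeking under loss frames.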
Prospect theory is discussed in the security literature,
but empirical studies in cyber security contexts are limited
(Acquisti and Grossklags 2007; Garg and Camp 2013;
Helander and Khalid 2000; Shankar et al. 2002; Verendel
2008). Among the security studies that have been con-
ducted, the results are mixed. The work by Schroeder and
colleagues on computer information security presented at
the 2006 Information Resources Management Association
International Conference found that decision makers were
risk averse in the gain frame, yet they showed no risk
preference in the loss frame. Similarly, in a 1999 presen-
tation about online shopping behavior by Helander and Du
at the International Conference on TQM and Human Fac-
tors, perceived risk of credit card fraud and the potential for
price inflation did not negatively affect purchase intention
(loss frame), while perceived value of a product was found
to positively affect purchase intention. We anticipated that
gain-framed messages in cyber dilemmas would increase
endorsement of protective responses and loss-framed
messages would have no effect on the endorsement of
protective options.
We also explored how subject variables affect the
strength and/or the direction of the relationship between the
manipulated variables, prior experience and gain–loss
framing, and the dependent variable, endorsement of safe
or unsafe options in response to cyber dilemmas. For
example, one possibility is that the relationship between
prior experience and risk averse behavior is greater for
individuals with higher self-reported victimization given
their increased exposure to cyber dilemma consequences.
Another possibility is that the relationship between the gain
frame and protective behavior would be less for younger
individuals because they are more familiar and comfortable
with the nuances of internet security. We anticipated that
there would be some difference in the patterns of response
as a function of sex, age, income, education, job domain,
and self-reported victimization.
The next section of this article describes the methods,
results, and a brief discussion for Experiment I, and Sect. 3
describes the methods, results, and a brief discussion for
Experiment II. The paper closes with a discussion of
findings across both experiments and how these results
suggest approaches to enhance and improve cyber security
by taking into account user decision making.
2 Experiment I
We conducted an experiment of risky cyber dilemmas with
two manipulated variables, gain–loss framing and primed
recall of a prior personal near-miss experience, to evaluate
individual cyber user decision making. The cyber dilem-
mas were developed to capture commonly confronted risky
cyber choices faced by individual users. In addition, in
Experiment I, the dependent variable focused on the advice
the respondent would provide to their best friend so as to
encourage more normative thinking about what might be
the correct response to the cyber dilemma. As such, each
cyber scenario described a risky choice dilemma faced by
the respondent’s ‘‘best friend,’’ and the respondent was
asked to recommend either a safe but inconvenient course
of action (e.g., recommend not downloading the music file
from an unknown source), or a risky but more convenient
option (e.g., recommend downloading the music file from
an unknown source).
2.1 Method
2.1.1 Design overview
In Experiment I, four cyber dilemmas were developed to
evaluate respondents’ risky choice behavior using a 2
(recalled personal near-miss experience or no recall control
condition) by 2 (gain versus loss-framed message) mixed
model factorial design with two dichotomous subject
variables: sex and self-reported victimization. Each par-
ticipant received all four dilemmas in a constant order.
Within this order, each of the four treatment conditions was
paired with each of the four dilemmas and counterbalanced
such that each of the dilemmas was randomly assigned to
each of the four treatment conditions.
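To make the counterbalancing concrete, the sketch below shows one way such an assignment could be generated from a cyclic Latin square; the condition and dilemma labels are illustrative placeholders, not the authors' materials or code.

```python
# Minimal sketch: pair four fixed-order dilemmas with four treatment
# conditions so that, across participants, every dilemma appears in
# every condition equally often (a cyclic Latin square).
CONDITIONS = ["gain/control", "gain/near-miss", "loss/control", "loss/near-miss"]
DILEMMAS = ["music file", "USB drive", "Facebook app", "online purchase"]

def latin_square(items):
    """Cyclic Latin square: row r is the item list rotated by r positions."""
    n = len(items)
    return [[items[(r + c) % n] for c in range(n)] for r in range(n)]

def assignment(participant_id):
    """Map a participant to one Latin-square row; dilemma order stays constant."""
    row = latin_square(CONDITIONS)[participant_id % len(CONDITIONS)]
    return list(zip(DILEMMAS, row))

if __name__ == "__main__":
    for pid in range(4):
        print(pid, assignment(pid))
```

Rotating rows rather than fully randomizing guarantees the equal-frequency property the design requires; actual participants could then be assigned to rows at random.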
After each cyber dilemma, respondents were asked to
respond on a 6-point scale (1 = strongly disagree to
6 = strongly agree) whether they would advise their ‘‘best
friend’’ to proceed in taking a risky course of action.
Responses of 1–3 indicated endorsement of the safe but
inconvenient option, while responses of 4–6 indicated
endorsement of the risky but expedient option. Following
the four cyber dilemmas, respondents were given four
attention check questions to determine whether they were
reading the cyber scenarios carefully. In addition, basic
demographic information was collected as well as infor-
mation on each respondent’s personal experience and self-
reported victimization, if any, with the topics of the cyber
dilemmas.
2.1.2 Scenarios and manipulations
The four cyber dilemma scenarios involved the threat of a
computer virus resulting from the download of a music file,
the use of an unknown USB drive device, the download of
a Facebook application, and the risk of financial fraud from
an online purchase. Gain–loss framing and primed recall of
a prior personal experience were manipulated independent
variables. The framing messages were used to describe the
potential outcome of the risky cyber choice. The gain-
framed messages endorsed the safe, more protective rec-
ommendation. For example, for the download of a music
file scenario, the gain frame was worded as ‘‘If she presses
‘do not proceed,’ she may avoid the risk of acquiring a
virus that will cause serious damage to her computer.’’
Conversely, the loss-framed messages endorsed the risky
option/choice. For the download of a music file scenario,
the loss frame was worded as ‘‘If she presses ‘proceed,’ she
may risk acquiring a virus that will cause serious damage to
her computer.’’ The experimental design also included a
manipulation of primed recall of a prior personal experi-
ence. Respondents either recalled a near-miss experience of
their own before advising their friend, or did not (a control
condition). In each near-miss experience, the respondent’s
dilemma was similar to the situation faced by their best
friend and the consequences of the threat were benign. A
complete description of the four scenarios, including the
near-miss and gain–loss framing manipulations, is pro-
vided in Table 1.
2.1.3 Subjects
The experiment was conducted using the University of
Southern California’s Psychology Subject Pool. Students
participated for course credit. Of the 365 students who
participated in the experiment, 99 were omitted for not
answering all 4 of the attention check questions correctly,
resulting in a sample of 266 respondents.
Table 1 Summary of four scenarios and manipulations (Experiment I)

Scenario 1: Music File
Scenario: Your best friend has contacted you for advice. She wants to open a music file linking to an early release of her favorite band's new album. When she clicks on the link, a window pops up indicating that she needs to turn off her firewall program in order to access the file.
Gain framing: If she presses "do not proceed," she may avoid the risk of acquiring a virus that will cause serious damage to her computer.
Loss framing: If she presses "proceed," she may risk acquiring a virus that will cause serious damage to her computer.
Near-miss experience: As you consider how to advise your friend, you recall that you were confronted by a similar situation in the past. You attempted to open a link to a music file and a window popped up saying that you need to turn off your firewall program in order to access the file. You pressed "proceed" and your computer immediately crashed. Fortunately, after restarting your computer everything was functioning normally again.
Question: Below please indicate your level of agreement with the statement "You will advise your best friend to press 'proceed' and risk acquiring a virus that will cause serious damage to her computer."

Scenario 2: USB
Scenario: Your best friend has contacted you for advice. Her computer keeps crashing because it is overloaded with programs, documents and media files. She consults a computer technician who advises her to purchase a 1 terabyte USB drive (data storage device) to free up space on her computer. She does her research and narrows down the selection to two choices.
Gain framing: The first USB drive when used on a computer other than your own has a 10 % chance of becoming infected with a virus that will delete all the files and programs on the drive. The second drive is double the price, but has less than a 5 % chance of becoming infected with a virus when used on a computer other than your own.
Loss framing: The first USB drive when used on a computer other than her own has a 5 % chance of becoming infected with a virus that will delete all the files and programs on the drive. The second drive is half the price but has more than a 5 % chance of becoming infected with a virus when used on a computer other than her own.
Near-miss experience: As you consider how to advise your friend, you recall that your USB drive recently was infected with a virus after being plugged into a computer at work. You contacted a computer technician to see if there was any way to repair the drive. The technician was able to recover all the files and told you that you were really lucky because normally such drives cannot be restored.
Question: Below please indicate your level of agreement with the statement "You will advise your best friend to buy the first USB drive that has a 10 % chance of becoming infected with a virus." / "You will advise your best friend to buy the second USB drive that has a greater than 5 % chance of becoming infected with a virus."

Scenario 3: Facebook
Scenario: Your best friend has contacted you for advice. She has opened her Facebook page to find an app request for a game that her friends have been really excited about. In order to download the app, access to some of her personal information is required including her User ID and other information from her profile.
Gain framing: If she chooses not to agree to the terms of the app, she is protecting her private information from being made available to the developer of the app.
Loss framing: If she chooses to agree to the terms of the app, she risks the chance of her private information being made available to the developer of the app.
Near-miss experience: As you consider how to advise your friend, you recall that you once agreed to share some of your personal information in order to download an app on Facebook. The developers of the app made your User ID publicly available and because of this you started to receive messages from strangers on your profile page. You were very upset about the invasion of your privacy. Fortunately, you discovered that you could change the privacy settings of your profile so that only your friends could access your page.
Question: Below please indicate your level of agreement with the statement "You will advise your best friend to download the app and risk having her private information made available to the app developer."

Scenario 4: Rare Book
Scenario: Your best friend has contacted you for advice. She is going to buy a rare book from an unknown online store. The book is highly desirable, expensive and only available from this online store's website. By deciding to purchase the book online with her credit card, there is a risk that her personal information will be exploited which can generate unauthorized credit card charges. Her credit card charges $50 for the investigation and retrieval of funds expended when resolving fraudulent credit card issues.
Gain framing: If she decides not to buy the book, she may save up to $50 and the time spent talking with the credit card company.
Loss framing: If she decides to buy the book, she may lose up to $50 and the time spent talking with the credit card company.
Near-miss experience: As you consider how to advise your friend, you recall that you once purchased a rare book from an unknown online store. You were expecting the book to arrive 1 week later. About 2 weeks later, you had yet to receive the book. You were very concerned that you had done business with a fake online store. You contacted the store's customer service who fortunately tracked down the book's location and had it shipped with overnight delivery.
Question: Below please indicate your level of agreement with the statement: "You will advise your best friend to purchase the book online and risk having her personal information exploited."
Most, 203 (76 %), of the respondents were female.
Respondents ranged in age from 18 to 41 years (the 95th
percentile was 22 years).
Table 2 shows a summary of personal experience and
self-reported victimization associated with each of the four
cyber dilemmas. Nearly all respondents reported prior
personal experience with at least one of the four cyber
dilemmas, and 24 % further reported being a victim of
one or more of them. We coded whether
the respondent had ever been victimized by one of the four
scenarios as a variable of self-reported victimization.
2.2 Results
Raw responses (1–6) were centered around the midpoint
(3.5) such that negative responses indicate endorsement of
the safe option, and positive responses indicate endorse-
ment of the risky option. Mean endorsement responses for
each of the four treatment conditions are displayed in
Fig. 1. The negative means in all four conditions indicate
that subjects were more likely to endorse risk averse
actions compared with the risky alternative.¹
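As a minimal sketch of this recoding, assuming a hypothetical long-format table with one raw 1–6 rating per respondent per dilemma (column names are illustrative, not the authors' variables):

```python
import pandas as pd

# One row per respondent x dilemma, raw agreement rating on the 1-6 scale.
df = pd.DataFrame({
    "respondent": [1, 1, 2, 2],
    "condition": ["gain/control", "loss/near-miss", "gain/control", "loss/near-miss"],
    "response": [2, 3, 5, 1],
})

# Center around the scale midpoint: negative values indicate endorsement
# of the safe option, positive values endorsement of the risky option.
df["endorsement"] = df["response"] - 3.5

# Condition means of the centered scores (the quantities plotted in Fig. 1).
print(df.groupby("condition")["endorsement"].mean())
```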
In addition, a 2 (recalled personal near-miss experience
or no recall control condition) by 2 (gain vs. loss-framed
message) by 2 (sex) by 2 (self-reported victimization)
4-way factorial ANOVA was used to evaluate respondents'
endorsement of risky versus safe options in cyber dilem-
mas. Analyses were specified to only include main effects
and 2-way interactions with the manipulated variables.
Preliminary data screening was conducted, and q–q plots
indicated that the dependent variable is approximately
normally distributed.
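A model of this shape could be fit roughly as follows; this is an illustrative sketch under assumed file and column names, not the authors' actual analysis code:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: centered endorsement plus the two
# manipulated factors and the two dichotomous subject variables.
df = pd.read_csv("experiment1_long.csv")  # endorsement, nearmiss, frame, sex, victim

# Main effects plus only the 2-way interactions involving a manipulated
# variable, mirroring the specification described in the text.
model = smf.ols(
    "endorsement ~ nearmiss + frame + sex + victim"
    " + nearmiss:frame + nearmiss:sex + nearmiss:victim"
    " + frame:sex + frame:victim",
    data=df,
).fit()
print(sm.stats.anova_lm(model, typ=2))

# Preliminary screening: q-q plot for approximate normality (here of residuals).
sm.qqplot(model.resid, line="s")
```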
Results indicated that the near-miss manipulation was
significant, F(1, 260) = 7.42, p = .01, η² = .03.
Respondents who received a description of a recalled near-
miss experience preferred the safe but inconvenient option
to the risky, more expedient option. No main effect was
found for the gain–loss framing manipulation, suggesting
that respondents were indifferent between safe versus risky
decision options when the outcomes were described as
gains or losses from a reference point. There also was a
significant interaction between the framing and near-miss
manipulations: F(1, 260) = 4.01, p = .05, η² = .02. As
seen in Fig. 1, the effect of the near-miss manipulation was
much larger under the gain frame compared with the loss frame.
Basic demographic data also were collected to assess
whether individual differences moderated the effect of the
two manipulations. A significant main effect was found for
sex: F(1, 260) = 3.81, p = .05, η² = .01; Cohen's d for the
sex difference was 0.33 for gain framing without near-miss,
0.09 for gain framing with near-miss, 0.18 for loss framing
without near-miss, and 0.19 for loss framing with near-miss.
Female respondents were more likely to avoid risks and
choose the safe option. No significant main effect was found
for self-reported victimization. Also, none of the interactions
were significant: sex and framing, sex and near-miss
experience, victimization and framing, and victimization
and near-miss.
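For reference, Cohen's d for a two-group contrast such as the sex difference is the mean difference scaled by the pooled standard deviation; a small sketch with made-up numbers:

```python
import numpy as np

def cohens_d(x, y):
    """Cohen's d using the pooled standard deviation (one common variant)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    pooled = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled)

# Illustrative centered endorsement scores for one cell of the design
# (invented numbers, not the study data).
males = [-1.5, -0.5, 0.5, -0.5, -1.5]
females = [-2.5, -1.5, -0.5, -1.5, -2.5]
print(round(cohens_d(males, females), 2))
```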
Table 2 Summary of experience and victimization

Scenario (N = 266)         Personal experience    Previous victimization
Music file download        205 (77 %)             40 (15 %)
USB drive                  110 (41 %)             12 (4.5 %)
Facebook App download      253^a (95 %)           3 (1 %)
Online purchase            259 (97 %)             18 (7 %)
Overall (at least once)    265^b (100 %)          64 (24 %)

^a An app is downloaded from Facebook at least once a week
^b There is one missing value
¹ Since the four scenarios are in a constant order, a second analysis was run that ignored the manipulated factors and included scenario/order as a repeated factor. A one-way repeated measures ANOVA found a significant scenario/order effect: F(3, 265) = 30.42, p < .001, η² = .10. Over time, respondents were more likely to endorse the risky option. Because the nature of the dilemma scenario and order are confounded, it is impossible to determine whether the significant main effect indicates an order effect or a scenario effect or a combination of both. The counterbalanced design distributed all 4 combinations of framing and prior experience recall evenly across the four scenario dilemmas. Order and/or scenario effects are independent of the manipulated factors, and thus are included in the error term in the ANOVA.
[Figure: "Advice by Framing and Near-miss": mean endorsement of safe vs. risky advice (y-axis) for the control and near-miss conditions (x-axis), under gain and loss frames]
Fig. 1 Mean endorsement of risky versus safe responses to cyber threats by gain–loss frame and prior near-miss
2.3 Discussion
The results of Experiment I suggest that respondents’ cyber
security recommendations to their best friend were signif-
icantly influenced by the personal experience recall
manipulation. More specifically, respondents who recalled
a near-miss experience were more likely to advise their
best friend to avoid making the risky cyber choice com-
pared with their no recall counterpart. This finding is
consistent with Tinsley et al. (2012) definition of a vul-
nerable near-miss—an ‘‘almost happened’’ event that
makes individuals feel vulnerable and, in turn, leads to a
greater likelihood of endorsing the safer option.
Respondents who recalled a near-miss experience were
even more likely to advise their best friend to take the safer
course of action if they also received the gain message.
Comparatively, the loss frame had a negligible effect on
the primed recall prior experience manipulation. That is,
respondents who received the loss frame were as likely to
recommend the risk averse course of action to their best
friend regardless of whether their prior experience was a
near-miss or not. This finding suggests that people will be
more risk averse when they are exposed to a recall of a
prior near-miss, a loss frame, or both. The combination of
no prior recall of a near-miss and a gain frame did produce
less risk averse responses. This suggests a highly interac-
tive, synergistic effect, in which the frame and the near-
miss recall substitute for each other.
In addition, sex and prior victimization were found to
have no moderating effect on the relationship between cyber
dilemma responses and the two manipulated variables.
Cyber dilemma decision making was found to significantly
vary by respondents’ sex, but not by self-reported victim-
ization. The results suggest that females make more pro-
tective decisions when faced with risky cyber dilemmas
compared with males. This pattern has been replicated in
cyber research in an experiment of online shopping services
where males demonstrated a greater tendency to engage in
risky behavior online (Milne et al. 2009). Disaster risk per-
ception studies also have shown that risks tend to be judged
higher by females (Flynn et al. 1994; Bateman and Edwards
2002; Kung and Chen 2012; Bourque et al. 2012) and that
females tend to have a stronger desire to take preventative
and preparedness measures compared with males (Ho et al.
2008; Cameron and Shah 2012).
3 Experiment II
The primary purpose of Experiment II was to expand the
primed recall prior experience manipulation to compare
three prior cyber experiences: a near-miss, a false alarm,
and a hit involving a loss of data. The prior cyber experi-
ence recall prime for Experiment II involved experiences
of a good friend, rather than the respondents’ past experi-
ences (used in Experiment I). We also posed all questions
using a loss frame to enhance the ecological validity of the
cyber dilemmas posed, the consequences of which are
naturally perceived as losses from a status quo. The
dependent variable was also changed for Experiment II.
Each respondent was asked to report whether they would
select the safe or risky option in response to their own
cyber dilemma, as opposed to providing advice to their best
friend involved in a risky cyber dilemma as in Experiment
I. One interpretation of the finding from Experiment I that
respondents generally favored the safe option was that they
were possibly more risk averse in advising a friend com-
pared to how they would respond to their own cyber
dilemma. By posing the dilemma in the first person, we
sought to characterize how respondents would be likely to
respond when facing a cyber dilemma. The cyber dilemmas
were also described in a more concrete fashion for
Experiment II, including a ‘‘screenshot’’ of the dilemma
facing the respondent.
3.1 Method
3.1.1 Design overview
In Experiment II, three cyber dilemmas were constructed to
evaluate respondents’ risky choice behavior using one
manipulated variable, recall of a friend’s false alarm, near-
miss or hit experience. In addition, six individual differ-
ence variables were included in the design: sex, age,
income, education, job domain, and self-reported victim-
ization. Each participant received all three dilemmas in a
constant order. Each of the three primed recall prior cyber
experiences was paired with one of the three scenarios in a
counterbalanced design such that each of the cyber
dilemmas appeared in each of the three treatment condi-
tions with equal frequency.
After each cyber dilemma, respondents were asked to
respond on a 6-point scale (1 = strongly disagree to
6 = strongly agree) regarding their intention to ignore the
warning and proceed with the riskier course of action.
Following all three cyber dilemmas, respondents were
given three attention check questions related to the nature
of each dilemma. Respondents also were asked to provide
basic demographic information and answer a series of
questions about their experience with computers and cyber
dilemmas, such as their experience with purchasing from a
fraudulent online store, being locked out from an online
account, or having unauthorized withdrawals made from
their online banking account.
3.1.2 Scenarios and manipulations
The three cyber dilemma scenarios involved the threat of
causing serious damage to the respondents’ computer as a
result of downloading a music file, installing a plug-in for
an online game, and downloading a media player to legally
stream videos. The scenarios were written to share the
same possible negative outcome—the computer’s operat-
ing system crashes, resulting in an unusable computer until
repaired. Establishing uniformity of consequences across
the three scenarios reduced potential unexplained variance
across the three levels of the manipulated variable.
Experiment II also included screenshots of ‘‘pop-up’’
window images similar to those that would appear on the
computer display when the cyber dilemma is presented.
These images were intended to make the scenarios more
concrete and enhance the realism of the cyber dilemma
scenarios.
Primed recall of a friend’s prior cyber experience was
the only manipulated variable in this experiment.
Respondents either recalled their friend’s near-miss, false
alarm or hit experience before deciding whether to select
the safe or risky option in response to the described cyber
dilemma. All potential outcomes were presented in a loss
frame, with wording held constant except for details spe-
cific to the scenario under consideration. For example, the
wording of the loss frame for the hit outcome of the
download a music file scenario was ‘‘She pressed ‘allow
access’ and her computer immediately crashed. She ended
up having to wipe the computer’s hard drive clean and to
reinstall the operating system.’’ The only modification
made for the installation of the plug-in scenario was
switching the words ‘‘allow access’’ to ‘‘run.’’ A complete
description of the scenarios, including the primed recall of
the friend’s prior experiences, is provided in Table 3.
3.1.3 Subjects
Three hundred and seventy-six US residents were recruited
through Amazon Mechanical Turk (AMT) to participate in
the experiment. Researchers have assessed the representa-
tiveness of AMT samples compared with convenience
samples found locally and found AMT samples to be
representative (Buhrmester et al. 2011; Mason and Suri
2012; Paolacci et al. 2010) and ‘‘significantly more diverse
than typical American college samples’’ (Buhrmester et al.
2011). Each respondent earned $1 for completion of the
experiment. After removing respondents who did not
answer all three of the attention check questions correctly
or completed the experiment in less than 7 min, the sample
consisted of 247 respondents. Five additional respondents
skipped questions, resulting in a final sample size of
N = 242. Table 4 includes a summary of sample
characteristics, including sex, age, income, education, job
domain, and self-reported victimization. Self-reported
victimization is defined in terms of experiences with four
types of negative cyber events: (1) getting a virus on an
electronic device, (2) purchasing from a fraudulent online
store, (3) being locked out from an online account, or (4)
having unauthorized withdrawals made from their online
banking account. Respondents also responded to a number
of experience questions that are summarized in Table 5 as
additional detail about the study sample.
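The exclusion rules described above reduce to a simple filter; a hypothetical pandas sketch (the file and column names are assumptions, not the authors' data):

```python
import pandas as pd

# Hypothetical export of the AMT session data.
raw = pd.read_csv("amt_responses.csv")  # attention_correct, minutes, item responses

# Keep respondents who answered all three attention checks correctly
# and took at least 7 minutes, then drop anyone who skipped a question.
kept = raw[(raw["attention_correct"] == 3) & (raw["minutes"] >= 7)]
final = kept.dropna()
print(len(raw), "->", len(kept), "->", len(final))  # e.g., 376 -> 247 -> 242
```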
3.2 Results
A mixed model ANOVA with one within-subject factor
(primed recall of a prior experience) and six individual
difference variables as between-subject factors was used.
This model included only the seven main effects and the
six 2-way interactions involving the manipulated within-
subject variable and each of the six between-subject
variables. Preliminary data screening was done; q–q plots
showed the scores on the repeated measures variable, prior
salient experience, to have an approximately normal
distribution.²
Results show that the primed recall prior experience
manipulation had a significant effect on how respondents
intended to respond to the cyber dilemmas, F(1, 231) = 31.60,
p < .001, η² = .12. Moreover, post hoc comparisons using
the least significant difference (LSD) test indicate that the
mean score for the false-alarm condition (M = 3.65,
SD = 0.11) was significantly different from the near-miss
condition (M = 2.97, SD = 0.11) with p < .01, and the hit
condition (M = 2.34, SD = 0.11) significantly differed
from the near-miss and false-alarm conditions with
p < .01. This suggests that respondents
who received a description of a friend’s near-miss experi-
ence recall preferred the safer, risk averse option compared
with respondents who were primed to recall a friend’s prior
false-alarm experience. Respondents were found to be even
more likely to select the safe option when they were primed
to recall a friend’s prior hit experience. As displayed in
Fig. 2, the positive means for the false-alarm condition
indicate that respondents were more likely to engage in
risky behavior compared with the negative means for the
near-miss and hit conditions.
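The omnibus within-subject test and the LSD follow-ups could be approximated as below. This sketch covers only the within-subject factor (the reported model also included the six between-subject variables) and assumes hypothetical file and column names; Fisher's LSD after a significant omnibus test amounts to unadjusted pairwise comparisons, here paired t-tests.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: one rating per respondent per condition.
df = pd.read_csv("experiment2_long.csv")  # respondent, prior, rating

# Repeated-measures ANOVA on the primed-recall factor.
print(AnovaRM(df, depvar="rating", subject="respondent", within=["prior"]).fit())

# Unadjusted pairwise comparisons (LSD-style) between the three conditions.
wide = df.pivot(index="respondent", columns="prior", values="rating")
for a, b in [("false_alarm", "near_miss"), ("near_miss", "hit"), ("false_alarm", "hit")]:
    t, p = stats.ttest_rel(wide[a], wide[b])
    print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f}")
```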
² As in Experiment I, a one-way repeated measures ANOVA shows there is a significant scenario/order effect: F(2, 265) = 4.47, p = .035, η² = .02. Over time and/or scenario, respondents were more likely to endorse the risky option. However, as in Experiment I, it is difficult to determine whether the main effect is for the scenarios or the order effect. The study design we used overcame this limitation by using a counterbalanced design.
Table 3 Summary of three scenarios and manipulations (Experiment II)

Scenario 1: Music File
Scenario: You want to download a music file linking to an early release of your favorite band's new album. When you click on the link, the following window pops up: If you press "allow access," you may risk causing serious damage to your computer.
Your experience: You recall that your friend told you she was confronted by a similar situation in the past. She was attempting to open a music file and a window popped up saying the program was blocked by a firewall.
False alarm: She pressed "allow access" and successfully downloaded the music file without any damage occurring to her computer.
Near-miss: She pressed "allow access" and her computer immediately flashed a blue screen and automatically rebooted before she had time to read anything. Fortunately, following the reboot her computer was operating normally.
Hit: She pressed "allow access" and her computer immediately crashed. She ended up having to wipe the computer's hard drive clean and to reinstall the operating system.
Question: Below please indicate your level of agreement with the statement "You will press 'allow access' and risk installing a file that could seriously damage your computer."

Scenario 2: Plug-in Install
Scenario: You are interested in playing an online game that requires a plug-in to run. Before installing the plug-in, the following window pops up: If you click "run," you may risk installing a plug-in that could seriously damage your computer.
Your experience: You recall that your friend told you she once downloaded a plug-in to play an online game and was warned prior to installation that the publisher could not be verified.
False alarm: She clicked "run," and successfully played the game without causing any damage to her computer.
Near-miss: She clicked "run" and her computer immediately flashed a blue screen and automatically rebooted before she had time to read anything. Fortunately, following the reboot her computer was operating normally.
Hit: She clicked "run" and her computer immediately crashed. She ended up having to wipe the computer's hard drive clean and to reinstall the operating system.
Question: Below please indicate your level of agreement with the statement "You will click 'run' and risk installing a plug-in that could seriously damage your computer."

Scenario 3: Unknown Network
Scenario: You have downloaded a media player to legally stream videos from your computer. When you open the player, the following window pops up: If you press "yes," you may risk using a media player that could seriously damage your computer.
Your experience: You recall that your friend told you she once installed a media player and received a warning about allowing an unknown publisher to make changes to her computer.
False alarm: She pressed "allow" and successfully used the player to watch videos without any damage occurring to her computer.
Near-miss: She pressed "allow" and her computer immediately flashed a blue screen and automatically rebooted before she had time to read anything. Fortunately, following the reboot her computer was operating normally.
Hit: She pressed "allow" and her computer immediately crashed. She ended up having to wipe the computer clean and to reinstall the operating system.
Question: Below please indicate your level of agreement with the statement: "You will press 'allow' and risk using a media player that could seriously damage your computer."

The analysis also included both main effects and interaction
terms for six different subject variables, including
sex, age, level of education, income level, job domain, and
self-reported victimization. For the purpose of analysis, age
was collapsed into three levels: 18–29, 30–39, and 40 years
and older; education level was collapsed into three cate-
gories: high school and 2-year college, 4-year college, and
master’s degree or higher; and annual income level was
collapsed into three categories: below $30,000/year,
$30,000–$59,999/year, and $60,000/year and more.
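This collapsing is a simple binning step; an illustrative pandas sketch (the cut points follow the text, variable names are assumptions):

```python
import pandas as pd

# Hypothetical raw demographic values.
df = pd.DataFrame({"age": [22, 34, 58], "income": [25_000, 45_000, 90_000]})

# Three age levels: 18-29, 30-39, 40 and older.
df["age_group"] = pd.cut(df["age"], bins=[17, 29, 39, 200],
                         labels=["18-29", "30-39", "40+"])

# Three annual income levels: below $30K, $30K-$59,999, $60K and more.
df["income_group"] = pd.cut(df["income"], bins=[-1, 29_999, 59_999, 10**10],
                            labels=["<$30K", "$30K-$59,999", "$60K+"])
print(df)
```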
The results of the ANOVA indicated there was a sig-
nificant main effect for age: F(2, 231) = 4.9, p = .01,
η² = .04, and no significant main effects for sex,
education, income, job domain, and self-reported victim-
ization. Figure 2 suggests that younger respondents com-
pared with older respondents were more likely to choose
the riskier option in cyber dilemmas across all 3 levels of
the primed prior recall experience manipulation.
Results also showed a significant interaction effect
between income and the primed prior recall experience
manipulation: F(2, 231) = 3.40, p = .01, η² = .03. Figure 3
indicates that respondents with higher income levels
(greater than $60 K per year) were less sensitive to the
primed recall of a friend's experience.
Table 4 Demographic information for AMT respondents (N = 242)

Sex: Male 108 (44.6 %); Female 134 (55.4 %)
Highest level of education: High school 65 (26.9 %); 2-year college 38 (15.7 %); 4-year college 102 (42.1 %); Master's degree 30 (12.4 %); Professional (e.g., M.D., Ph.D., J.D.) degree 7 (2.9 %)
Personal gross annual income range: Below $20,000/year 66 (27.3 %); $20,000–$29,999/year 31 (12.8 %); $30,000–$39,999/year 35 (14.5 %); $40,000–$49,999/year 28 (11.6 %); $50,000–$59,999/year 15 (6.2 %); $60,000–$69,999/year 23 (9.5 %); $70,000–$79,999/year 13 (5.4 %); $80,000–$89,999/year 10 (4.1 %); $90,000/year or more 21 (8.7 %)
Does your work relate to technology?: I use computers normally but my work has nothing to do with technology 172 (71.1 %); My work is about technology 70 (28.9 %)
Victim of getting a virus on an electronic device: Yes 165 (68.2 %); No 77 (31.8 %)
Victim of purchasing from a fake online store: Yes 15 (6.2 %); No 221 (91.3 %); I don't shop online 6 (2.5 %)
Victim of failure to log into an online account: Yes 85 (35.1 %); No 157 (64.9 %)
Victim of unauthorized withdrawals from an online banking account: Yes 44 (18.2 %); No 198 (81.8 %)
Overall self-reported victimization: None 46 (19.0 %); One type 104 (43.0 %); Two or more types 92 (38.0 %)
Age (years): Range 18–75; percentiles: 25th = 27, 50th = 33, 75th = 44
There was no significant interaction effect between the
manipulation and the other five individual difference
variables: sex, F(1, 231) < 1; age, F(2, 231) = 1.84,
p = .12, η² = .02; education, F(2, 231) < 1; job domain,
F(1, 231) = 2.01, p = .14, η² = .01; and self-reported
victimization, F(2, 231) = 2.03, p = .09, η² = .02.
3.3 Discussion
Responses to risky cyber dilemmas in Experiment II were
significantly predicted by the primed recall of a friend’s
prior cyber experience. Consistent with our hypotheses, the
more negative the consequence associated with the prior
cyber experience, the more likely the respondents were to
choose the safer course of action. In particular, respondents
who were primed to recall a prior near-miss or hit event
interpreted the experience as a sign of vulnerability com-
pared with the recall of a prior false alarm and, in turn,
were more likely to promote more conservative (safe)
endorsements of actions. In the case of false alarms, our
findings suggest that respondents were more likely to
endorse the risky alternative.
[Figure: "Endorsement by Age": mean endorsement of safe vs. risky option (y-axis) by salient prior experience (false-alarm, near-miss, hit; x-axis) for age groups 18–29, 30–39, and 40 years and older]
Fig. 2 Mean endorsement of risky versus safe responses to cyber threats by primed recall of friend's prior experience and age
[Figure: "Endorsement by Income Level": mean endorsement of safe vs. risky option (y-axis) by salient prior experience (x-axis) for three income levels]
Fig. 3 Mean endorsement of risky versus safe responses to cyber threats by primed recall of friend's prior experience and income level
Table 5 Cyber-related responses for AMT respondents (N = 242)

Personal computer: PC 213 (88.0 %); Mac 28 (11.6 %); Do not have a personal computer 1 (0.4 %)
Smartphone: iOS 67 (27.6 %); Android 95 (39.3 %); Do not have a smartphone 80 (33.1 %)
Protection software: Yes 211 (87.2 %); No 31 (12.8 %)
Have you ever downloaded free music, an e-book, a movie, or a television show from an unfamiliar website found through a Google search?: Yes 135 (55.8 %); No 107 (44.2 %)
How often do you access your social networking accounts (Facebook, Twitter, Myspace, MSN, Match.com, etc.)?: Every day 150 (62.0 %); Once a week 35 (14.5 %); Once a month 8 (3.3 %); 2–3 times a month 10 (4.1 %); Every couple months 14 (5.8 %); Once a year 4 (1.7 %); Never 21 (8.7 %)
Have you ever clicked on an advertisement and a window popped up saying something along the lines of "Congratulations, you are eligible to win an iPad!"?: Yes 122 (50.4 %); No 120 (49.6 %)
Have you ever clicked on a link in a suspicious email (e.g., an email in a different language, with an absurd subject)?: Yes 32 (13.2 %); No 210 (86.8 %)
In addition, endorsement of safe versus risky resolutions
to the cyber dilemmas varied by respondents’ age,
regardless of the primed recall of a friend’s prior experi-
ence. Middle-aged and older respondents were more likely
to endorse the safe choice option compared with younger
respondents. Research on age differences is inconsistent in
the domain of cyber security related to privacy (Hoofnagle
et al. 2010), risk of data loss from a cyber threat (Howe
et al. 2012—‘‘The psychology of security for the home
computer user’’ in Proceedings of 2012 IEEE Symposium
on the Security and Privacy) or fear of a cyber threat
(Alshalan 2006). Our findings suggest that younger indi-
viduals’ extensive use and dependence on computers for
daily activities may result in the association of a greater
cost with being risk averse in response to cyber dilemmas.
Younger individuals’ familiarity with computers likely
makes it easier for them to determine whether a cyber
dilemma is a real threat or a computer’s standard warning
message. In the same vein, their familiarity with computers
may also lead to a greater awareness of a major cyber
dilemma being a small probability event, the consequences
of which are likely to be repairable. Ultimately, younger
individuals do not perceive the unsafe option as overly
risky compared with the safe option.
Respondents’ income was also found to moderate the
effect of the primed recall of a friend’s prior experience
on respondents’ endorsement of safe versus risky
options. Of the three income levels, the wealthiest
respondents were the least sensitive to variations in the
primed recall of a friend’s prior cyber experience. In
the literature on cyber security, only a significant main
effect for income has been reported. In a 2001 presentation by
Tyler, Zhang, Southern and Joiner at the IACIS Con-
ference, the research team reported findings suggesting
that higher income individuals have a lower probability
of considering e-commerce to be safe and therefore
avoid e-commerce transactions. Similarly, in a study by
Downs et al. (2008), respondents from more affluent
areas were reported to update their anti-virus program
more frequently than respondents from poorer areas,
further validating the tendency toward risk averse cyber
behavior for higher income individuals. Our finding
suggests that wealthier respondents were less affected by
the primed prior recall experience manipulation than low-
and medium-income respondents because they can afford
to take more risk.
Their wealth allows them to have access to enhanced
baseline security measures. This creates a sense that
they are exempt from risks that apply to others and for
this reason, do not need to pay much attention to the
primed prior recall experiences and consequences.
Interestingly, there were no significant main effects or
interactions for the remaining four individual difference
variables, including sex, education, work domain, or
previous cyber victimization. The absence of main effects
for five of the six individual difference variables suggests
that respondents’ cyber dilemma decisions are determined
more by recall of prior cyber-related experiences, and not
by background of the decision maker, with the sole
exception of respondent age. The absence of interaction
effects for five of the six individual difference variables
suggests that the effect of primed recall of a prior expe-
rience is robust; respondent income was the sole moder-
ator identified.
4 Conclusion
Experiments I and II were designed to explore how com-
puter users’ responses to common cyber dilemmas are
influenced by framing and salience of prior cyber experi-
ences. Despite using two different dependent variables, the
advice the respondent would give to a friend (Experiment
I), and how the respondents themselves would respond to
cyber dilemmas (Experiment II), the extent to which the
two different questions elicit more or less risk averse
responses was found to be similar. The results indicate that
for prior near-miss experiences (the one manipulation
condition included in both experiments), the mean
responses were 2.39 and 2.97 for Experiments I and II,
respectively. This finding suggests that whether the
respondent was making a personal recommendation or
providing advice to a friend, the recalled experience
manipulation significantly influenced the respondent's
endorsement of the safer cyber option. Simi-
larly, in prior cyber research, Aytes and Connolly (2004)
found that students were more attuned to cyber risks and
likely to take action against them when the primary source
of information was their own or friends’ experiences with
security problems.
The one inconsistent finding between the two experi-
ments is the effect of respondent sex on risky cyber choice
behavior. In Experiment I, females were found to be more
risk averse than males, while in Experiment II, sex was
found to be unrelated to whether respondents endorsed a
risky or safe option. Previous studies are also inconsistent
with respect to the role of sex in predicting cyber-related
behavior and decision making. At the 2012 Annual Con-
ference of the Society for Industrial and Organizational
Psychology, Byrne et al. reported that women provided
slightly higher scores of behavioral intentions to click on a
risky cyber link, while Milne et al. (2009) found that males
had a greater tendency to engage in risky behaviors online.
In the context of security compliance, Downs et al. (2008)
report that males were more involved in computer security
management, such as updating their anti-virus software and
using pop-up blockers, while Herath and Rao (2009) found
women to have higher security procedure compliance
intentions, but were less likely to act on them.
One explanation for our inconsistent results related to
sex may be differences in the two populations sampled:
college students in Experiment I and a more diverse, AMT
sample in Experiment II. College samples tend to be more
sex stereotyped, such that risk tends to be judged lower by
men than by women, and females tend to have a stronger
desire to take preventative and preparedness measures
(Harris et al. 2006). This tends to be attributed to their lack
of real-world experiences; as evidenced by only a small
percentage of the sample, 24 %, have previously experi-
enced a cyber dilemma. By these assumptions, males
would be expected to be more risk seeking than females in
Experiment I. Conversely, the AMT sample consists of
older adults with more diverse backgrounds, as evidenced
in Table 4, which tends to blur the line between traditional
male and female stereotypes. In addition, 80 % of the AMT
sample had previously experienced a cyber dilemma, fur-
ther suggesting that shared experiences of males and
females could lead to the lack of sex differences found in
Experiment II.
Overall, these two experiments indicate that recall of prior
cyber experiences and framing strongly influence individual
decision making in response to cyber dilemmas. It is useful
to know how prior experience and framing jointly influence
responses to cyber dilemmas. Our findings imply that the
salience of prior negative experiences attenuates risky
cyber behavior. We found that this
attenuation is greater for gain-framed decisions, and for low-
and middle-income respondents. Responses to cyber
dilemmas were determined more by proximal variables, such
as recall of prior experiences and framing, and were largely
robust to individual difference variables, with only a couple
of exceptions.
Given that safety in the cyber context is an abstract
concept, it would be worthwhile to further explore how
framing influences cyber dilemma decision making.
Additionally, this research design could be used to evaluate
differences across cyber dilemma contexts to examine the
robustness of the relationships identified in our research.
Such further research is warranted to better understand how
individual users respond to cyber dilemmas. This infor-
mation would be useful to cyber security policymakers
faced with the task of designing better security systems,
including computer displays and warning messages rele-
vant to cyber dilemmas.
Acknowledgments This research was supported by the U.S. Department of Homeland Security (DHS) through the National Center for Risk and Economic Analysis of Terrorism Events (CREATE). However, any opinions, findings, conclusions, and recommendations in this article are those of the authors and do not necessarily reflect the views of DHS. We would like to thank Society for Risk Analysis (SRA) conference attendees for their feedback on this work at a session at the 2012 SRA Annual Meeting in San Francisco. We would also like to thank the blind reviewers for their time and comments, as they were extremely valuable in developing this paper.
References

Acquisti A, Grossklags J (2007) What can behavioral economics teach us about privacy. In: Acquisti A, Gritzalis S, Lambrinoudakis C, Vimercati S (eds) Digital privacy: theory, technologies and practices. Auerbach Publications, Florida, pp 363–377
Alshalan A (2006) Cyber-crime fear and victimization: an analysis of a national survey. Dissertation, Mississippi State University
Aytes K, Connolly T (2004) Computer security and risky computing practices: a rational choice perspective. J Organ End User Comput 16:22–40
Barnes LR, Gruntfest EC, Hayden MH, Schultz DM, Benight C (2007) False alarms and close calls: a conceptual model of warning accuracy. Weather Forecast 22:1140–1147
Bateman JM, Edwards B (2002) Gender and evacuation: a closer look at why women are more likely to evacuate for hurricanes. Nat Hazard Rev 3:107–117
Bourque LB, Regan R, Kelley MM, Wood MM, Kano M, Mileti DS (2012) An examination of the effect of perceived risk on preparedness behavior. Environ Behav 45:615–649
Breznitz S (2013) Cry wolf: the psychology of false alarms. Psychology Press, Florida
Buhrmester M, Kwang T, Gosling SD (2011) Amazon's Mechanical Turk: a new source of inexpensive, yet high-quality, data? Perspect Psychol Sci 6:3–5
Cameron L, Shah M (2012) Risk-taking behavior in the wake of natural disasters. IZA Discussion Paper No. 6756. http://ssrn.com/abstract=2157898
Dillon RL, Tinsley CH, Cronin M (2011) Why near-miss events can decrease an individual's protective response to hurricanes. Risk Anal 31:440–449
Donner WR, Rodriguez H, Diaz W (2012) Tornado warnings in three southern states: a qualitative analysis of public response patterns. J Homel Secur Emerg Manage 9:1547–7355
Dow K, Cutter SL (1998) Crying wolf: repeat responses to hurricane evacuation orders. Coast Manage 26:237–252
Downs DM, Ademaj I, Schuck AM (2008) Internet security: who is leaving the 'virtual door' open and why? First Monday 14. doi:10.5210/fm.v14i1.2251
Flynn J, Slovic P, Mertz CK (1994) Gender, race, and perception of environmental health risks. Risk Anal 14:1101–1108
Garg V, Camp J (2013) Heuristics and biases: implications for security design. IEEE Technol Soc Mag 32:73–79
Harris C, Jenkins M, Glaser D (2006) Gender differences in risk assessment: why do women take fewer risks than men? Judgm Decis Mak 1:48–63
Helander MG, Khalid HM (2000) Modeling the customer in electronic commerce. Appl Ergon 31:609–619
Herath T, Rao HR (2009) Encouraging information security behaviors in organizations: role of penalties, pressures and perceived effectiveness. Decis Support Syst 47:154–165
Ho MC, Shaw D, Lin S, Chiu YC (2008) How do disaster characteristics influence risk perception? Risk Anal 28:635–643
Hoofnagle C, King J, Li S, Turow J (2010) How different are young adults from older adults when it comes to information privacy attitudes and policies? April 14, 2010. http://ssrn.com/abstract=1589864
Kahneman D, Tversky A (1979) Prospect theory: an analysis of decision under risk. Econometrica 47:263–291
Kung YW, Chen SH (2012) Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience. Risk Anal 32:1535–1546
Kunreuther H, Pauly M (2004) Neglecting disaster: why don't people insure against large losses? J Risk Uncertain 28:5–21
Mason W, Suri S (2012) Conducting behavioral research on Amazon's Mechanical Turk. Behav Res Methods 44:1–23
Milne GR, Labrecque LI, Cromer C (2009) Toward an understanding of the online consumer's risky behavior and protection practices. J Consum Aff 43:449–473
Paolacci G, Chandler J, Ipeirotis P (2010) Running experiments on Amazon Mechanical Turk. Judgm Decis Mak 5:411–419
Shankar V, Urban GL, Sultan F (2002) Online trust: a stakeholder perspective, concepts, implications, and future directions. J Strateg Inf Syst 11:325–344
Siegrist M, Gutscher H (2008) Natural hazards and motivation for mitigation behavior: people cannot predict the affect evoked by a severe flood. Risk Anal 28:771–778
Simmons KM, Sutter D (2009) False alarms, tornado warnings, and tornado casualties. Weather Clim Soc 1:38–53
Slovic P, Peters E, Finucane ML, MacGregor DG (2005) Affect, risk, and decision making. Health Psychol 24:S35–S40
Tinsley CH, Dillon RL, Cronin MA (2012) How near-miss events amplify or attenuate risky decision making. Manage Sci 58:1596–1613
Tversky A, Kahneman D (1986) Rational choice and the framing of decisions. J Bus 59:S251–S278
Verendel V (2008) A prospect theory approach to security. Technical Report No. 08-20, Department of Computer Science and Engineering, Chalmers University of Technology/Goteborg University, Sweden. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.154.9098&rep=rep1&type=pdf
Computers & Security 31 (2012) 597–611
Available online at www.sciencedirect.com
Journal homepage: www.elsevier.com/locate/cose

Leveraging behavioral science to mitigate cyber security risk

Shari Lawrence Pfleeger a, Deanna D. Caputo b
a Institute for Information Infrastructure Protection, Dartmouth College, 4519 Davenport St., NW, Washington, DC 20016, USA
b MITRE Corporation, 7515 Colshire Drive, McLean, VA 22102-7539, USA

Article history: Received 16 August 2011; Received in revised form 21 November 2011; Accepted 22 December 2011
Keywords: Cyber security; Cognitive load; Bias; Heuristics; Risk communication; Health models

Abstract
Most efforts to improve cyber security focus primarily on incorporating new technological approaches in products and processes. However, a key element of improvement involves acknowledging the importance of human behavior when designing, building and using cyber security technology. In this survey paper, we describe why incorporating an understanding of human behavior into cyber security products and processes can lead to more effective technology. We present two examples: the first demonstrates how leveraging behavioral science leads to clear improvements, and the other illustrates how behavioral science offers the potential for significant increases in the effectiveness of cyber security. Based on feedback collected from practitioners in preliminary interviews, we narrow our focus to two important behavioral aspects: cognitive load and bias. Next, we identify proven and potential behavioral science findings that have cyber security relevance, not only related to cognitive load and bias but also to heuristics and behavioral science models. We conclude by suggesting several next steps for incorporating behavioral science findings in our technological design, development and use.
© 2012 Published by Elsevier Ltd.
1. Introduction

“Only amateurs attack machines; professionals target people.” (Schneier, 2000)

What is the best way to deal with cyber attacks? Cyber
security promises protection and prevention, using both
innovative technology and an understanding of the human
user. Which aspects of human behavior offer the most
promise in making cyber security processes and products
more effective? What role should education and training play?
How can we encourage good security practices without
unnecessarily interrupting or annoying users? How can we
create a cyber environment that provides users with all of the
functionality they need without compromising enterprise or
national security? We investigate the answers to these ques-
tions by examining the behavioral science literature to iden-
tify behavioral science theories and research findings that
have the potential to improve cyber security and reduce risk.
In this paper, we report on our initial findings, describe several
behavioral science areas that offer particularly useful appli-
cations to security, and describe how to use them in a general
risk-reduction process.
The remainder of this paper is organized in five sections.
Section 2 describes some of the problems that a technology-
alone solution cannot address. Section 3 explains how we
used a set of scenarios to elicit suggestions about the behav-
iors of most concern to technology designers and users.
Sections 4 and 5 highlight several areas of behavioral science
with demonstrated and potential relevance to security tech-
nology. Finally, Section 6 suggests possible next steps toward
inclusion of behavioral science in security technology’s
design, construction and use.
3 See the First Interdisciplinary Workshop on Security and Human Behavior, described at http://www.schneier.com/blog/archives/2008/06/security_and_ and at http://www.cl.cam.ac.uk/~rja14/shb08.html.
4 See workshop papers at http://www.informatik.uni-trier.de/~ley/db/conf/itrust/itrust2006.html.
5 The National Science Foundation is interested in the connections between social science and cyber security. It has announced a new program that encourages computer scientists and social scientists to work together (Secure and Trustworthy Cyberspace, described at http://www.nsf.gov/pubs/2012/nsf12503/nsf12503.htm?WT.mc_id=USNSF_25&WT.mc_ev=click).
2. Why technology alone is not enough
The media frequently express the private sector’s concern
about liability for cyber attacks and its eagerness to minimize
risk. The public sector has similar concerns, because aspects
of everyday life (such as operation and defense of critical
infrastructure, protection of national security information,
and operation of financial markets) involve both government
regulation and private sector administration.2 The govern-
ment’s concern is warranted: the Consumer’s Union found
that government was the source of one-fifth of the publicly-
reported data breaches between 2005 and mid-2008
(Consumer’s Union, 2008). The changing nature of both tech-
nology and the threat environment makes the risks to infor-
mation and infrastructure difficult to anticipate and quantify.
2 See, for example, http://www.cbsnews.com/video/watch/?id=5578986n&tag=related;photovideo.
Problems of appropriate response to cyber incidents are
exacerbated when security technology is perceived as an
obstacle to the user. The user may be overwhelmed by diffi-
culties in security implementation, or may mistrust, misinter-
pret or override the security. A recent study of users at Virginia
Tech illustrates the problem (Virginia Tech, 2011). Bellanger
et al. examined user attitudes and the “resistance behavior” of
individuals faced with a mandatory password change. The
researchers found that, even when passwords were changed as
required, the changes were intentionally delayed and the
request perceived as being an unnecessary interruption.
“People are conscious that a password breach can have severe
consequences, but it does not affect their attitude toward the
security policy implementation.” Moreover, “the more tech-
nical competence respondents have, the less they favor the
policy enhancement. … In a voluntary implementation, that
competence may be a vector of pride and accomplishment. In
a mandatory context, the individual may feel her competence
challenged, triggering a negative attitude toward the process.”
In the past, solutions to these problems have ranged from
strict, technology-based control of computer-based human
behavior (often with inconsistent or sometimes rigid
enforcement) to comprehensive education and training of
system developers and users. Neither extreme has been
particularly successful, but recent studies suggest that
a blending of the two can lead to effective results. For
example, the U.K. Office for Standards in Education, Chil-
dren’s Services and Skills (Ofsted) evaluated the safety of
online behavior at 35 representative schools across the U.K.
“Where the provision for e-safety was outstanding, the
schools had managed rather than locked down systems. In the
best practice seen, pupils were helped, from a very early age,
to assess the risk of accessing sites and therefore gradually to
acquire skills which would help them adopt safe practices
even when they were not supervised.” (Ofsted, 2010) In other
words, the most successful security behaviors were exhibited
in schools where students were taught appropriate behaviors
and then trusted to behave responsibly. The Ofsted report
likens the approach to teaching children how to cross the road
safely, rather than relying on adults to accompany the chil-
dren across the road each time.
This approach is at the core of our research. Our over-
arching hypothesis is that, if humans using computer systems
are given the tools and information they need, taught the
meaning of responsible use, and then trusted to behave
appropriately with respect to cyber security, desired outcomes
may be obtained without security’s being perceived as onerous
or burdensome. By both understanding the role of human
behavior and leveraging behavioral science findings, the
designers, developers and maintainers of information infra-
structure can address real and perceived obstacles to produc-
tivity and provide more effective security. These behavioral
changes take time, so plans for initiating change should
include sufficient time to propose the change, implement it,
and have it become part of the culture or common practice.
Other evidence (Predd et al., 2008; Pfleeger et al., 2010) is
beginning to emerge that points to the importance of under-
standing human behaviors when developing and providing
cyber security.3 There is particular interest in using trust to
mitigate risk, especially online. For example, the European
Union funded a several-year, multi-disciplinary project on
online trust (iTrust),4 documenting the many ways that trust
can be created and broken. Now, frameworks are being
developed for analyzing the degree to which trust is built and
maintained in computer applications (Riegelsberger et al.,
2005). More broadly, a rich and relevant behavioral science
literature addresses critical security problems, such as
employee deviance, employee compliance, effective decision-
making, and the degree to which emotions (Lerner and
Tiedens, 2006) or stressful conditions (Klein and Salas, 2001)
can lead to riskier choices by decision-makers.5 At the same
time, there is much evidence that technological advances can
have unintended consequences that reduce trust or increase
risk (Tenner, 1991). For these reasons, we conclude that it is
important to include the human element when designing,
building and using critical systems.
To understand how to design and build systems that
encourage users to act responsibly when using them, we iden-
tified two types of behavioral science findings: those that have
already been shown to demonstrate a welcome effect on cyber
security implementation and use, and those with potential to
have such an effect. In the first case, we documented the rele-
vant findings, so that practitioners and researchers can
determine which approaches are most applicable to their
environment. In the second case, we are designing a series of
studies to test promising behavioral science results in a cyber
security setting, with the goal of determining which
results (with associated strategies for reducing or mitigating the
behavioral problems they reflect) are the most effective.
However, applying behavioral science findings to cyber
security problems is an enormous undertaking. To maximize
the likely effectiveness of outcomes, we used a set of inter-
views to elicit practitioners’ opinions about behaviors of
concern, so that we could focus on those perceived as most
significant. We describe the interviews and results in Section
3. These findings suggest hypotheses about the role of
behavior in addressing cyber security issues.
3. Identifying behavioral aspects of security
Designers and developers of security technology can leverage
what is known about people and their perceptions to provide
more effective security. A former Israeli airport security chief
said,
“I say technology should support people. And it should be
skilled people at the center of our security concept rather
than the other way around” (Amos, 2010).
To implement this kind of human-centered security,
technologists must understand the behavioral sciences as
they design, develop and use technology. However, trans-
lating behavioral results to a technological environment can
be a difficult process. For example, system designers must
address the human elements obscured by computer media-
tion. A consumer making a purchase online trusts that the
merchant represented by the website is not simply taking
their money, but also is fulfilling its obligation to provide
goods in return. The consumer infers the human involvement
of the online merchant behind the scenes. Thus, at some level,
the buyer and seller are humans enacting a transaction
enabled by a system designed, developed and maintained by
humans. There may be neither actual human contact nor
direct knowledge of the other human actors involved, but the
transaction process reflects its human counterpart.
Preventing or mitigating adverse cyber security incidents
requires action at many stages: designing the technology
being incorporated in the infrastructure; implementing,
testing and maintaining the technology; and using the tech-
nology to provide essential products and services. Behavioral
science has addressed notions of cyber security in these
activities for many years. Indeed, Sasse and Flechais (2005)
note that secure systems are socio-technical systems in
which we should use an understanding of behavioral science
to “prevent users from being the ‘weakest link.’” For example,
some behavioral scientists have investigated how trust
mechanisms affect cyber security. Others have reported
findings related to the design and use of cyber systems, but
the relevance and degree of effect have not yet been tested.
Some of the linkage between behavioral science and security is specific to certain kinds of systems. For example, Castelfranchi and Falcone (1998, 2002) analyze trust in multi-agent systems from a behavioral perspective. They view trust as having several components, including beliefs that must be held to develop trust (the social context, as described by Riegelsberger et al. (2003)) and relationships to previous experience (the temporal context of the Riegelsberger–Sasse–McCarthy framework). They use psychological factors to model trust in multi-agent systems. In addition to social and temporal concerns, we add expectations of fulfillment, where someone trusting someone or something else expects something in return (Baier, 1986). This behavioral research sheds light on the nature of a user's expectation and on perceived trustworthiness of technology-mediated interactions and has important implications related to the design of protective systems and processes.
Sasse and Flechais (2005) view security from three distinct
perspectives: product, process and panorama:
• Product. This perspective includes the effect of the security
controls, such as the policies and mechanisms on stake-
holders (e.g., designers, developers, users). The controls
involve requirements affecting physical and mental work-
load, behavior, and cost (human and financial). Users trust
the product to maintain security while getting the primary
task done.
• Process. This aspect addresses how security decisions are
made, especially in early stages of requirements-gathering
and design. The process should allow the security mecha-
nisms to be “an integral part of the design and development
of the system, rather than being ‘added on’” (Sasse and
Flechais, 2005). Because “mechanisms that are not
employed in practice, or that are used incorrectly, provide
little or no protection,” designers must consider the impli-
cations of each mechanism on workload, behavior and
workflow (Sasse and Flechais, 2005). From this perspective,
the stakeholders must trust the process to enable them to
make appropriate and effective decisions, particularly about
their primary tasks.
• Panorama. This aspect describes the context in which the
security operates. Because security is usually not the
primary task, users are likely to “look for shortcuts and
workarounds, especially when users do not understand why
their behavior compromises security. … A positive security
culture, based on a shared understanding of the importance
of security … is the key to achieving desired behavior” (Sasse
and Flechais, 2005). From this perspective, the user views
security mechanisms as essential even when they seem
intrusive, limiting, or counterproductive.
3.1. Scenario creation
Because the infrastructure types and threats are vast, we used
interview results to narrow our investigation to those behav-
ioral science areas with demonstrated or likely potential to
enhance an actor’s confidence in using any information
infrastructure. To guide our interviews, we worked with two
dozen U.S. government and industry employees familiar with
information infrastructure protection issues to define three
threat scenarios relevant to protecting the information infra-
structure. The methodology and resulting analyses were
conducted by the paper’s first author and involved five steps:
• Choosing topics. We chose three security topics to discuss, based on recent events. The combination of the three was intended to represent an (admittedly incomplete but) significant number of typical concerns, the discussion of which would reveal underlying areas ripe for improvement.
• Creating a representative, realistic scenario for each topic. Using our knowledge of recent cyber incidents and attacks, we created a plausible attack scenario for each topic, portraying a cyber security problem for which a solution would be welcomed by industry and government.
• Identifying people with decision-making authority about cyber security products and usage to interview about the scenarios. We identified people from industry and government who were willing to participate in interviews.
• Conducting interviews. Our discussions focused on two questions: Are these scenarios realistic, and how could the cyber security in each situation be improved?
• Analyzing the results and their implications. We analyzed the results of these interviews and their implications for our research.
3.1.1. Scenario 1: improving security awareness among
builders of information infrastructure
Security is rarely the primary task of those who use the
information infrastructure. Typically, users seek information,
analyze relationships, produce documents, and perform tasks
that help them understand situations and take action. Simi-
larly, system developers often focus on these primary tasks
before incorporating security into an architecture or design.
Moreover, system developers often implement security
requirements by choosing security mechanisms that are easy
to build and test or that meet some other technical system
objective (e.g., reliability). Developers rarely take into account
the usability of the mechanism or the additional cognitive
load it places on the user. Scenario 1 describes ways to
improve security awareness among system builders so that
security is more likely to be useful and effective.
Suppose software engineers are designing and building
a system to support the creation and transmission of sensitive
documents among members of an organization. Many aspects
of document creation and transmission are well known, but
security mechanisms for evaluating sensitivity, labeling
documents appropriately and transmitting documents
securely have presented difficulties for many years. In our
scenario, software engineers are tasked to design a system
that solicits information from document creators, modifiers
and readers, so that a trust designation can be assigned to
each document. Security issues include understanding the
types of trust-related information needed, determining the
role of a changing threat environment, and defining the
frequency at which the trust information should be refreshed
and re-evaluated (particularly in light of cyber security inci-
dents that may occur during the life of the document). In
addition, the software engineers must implement some type
of summary trust designation that will have meaning to
document creators, modifiers and readers alike.
This trust designation, different from the classification of
document sensitivity, represents the degree to which both the
content and provider (or modifier) can be trusted and for how
long. For example, a document about a nation’s emerging
military capability may be highly classified (that is, highly
sensitive), regardless of whether the information provider is
highly trusted (because, for example, he has repeatedly
provided highly useful information in the past) or not (because,
for example, he frequently provides incorrect or misleading
information).
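As a concrete, and purely hypothetical, illustration of what such a summary trust designation might look like as a data structure, the sketch below assumes a simple ordinal scale, a provider-history score, and a fixed refresh interval; none of these specifics come from the scenario itself.

```python
# A minimal sketch of a per-document "summary trust designation".
# The levels, fields, and 30-day refresh rule are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

TRUST_LEVELS = ("untrusted", "provisional", "trusted")  # hypothetical scale

@dataclass
class TrustDesignation:
    level: str                     # summary shown to creators, modifiers, readers
    provider_history_score: float  # e.g., fraction of past items judged useful
    assessed_at: datetime
    refresh_every: timedelta = timedelta(days=30)

    def is_stale(self, now: datetime) -> bool:
        # Trust information should be refreshed periodically, and re-evaluated
        # immediately after a relevant cyber security incident (not modeled here).
        return now - self.assessed_at > self.refresh_every

d = TrustDesignation("provisional", 0.8, assessed_at=datetime(2012, 1, 1))
print(d.is_stale(datetime(2012, 3, 1)))  # True: past the refresh window
```

Note that this designation is deliberately separate from the document's sensitivity classification, matching the distinction drawn above.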
There are two important aspects of the software engineers’
security awareness. First, they must be able to select security
mechanisms for implementing the trust designation that
allow them to balance security with performance and
usability requirements. This balancing entails appreciating
and accommodating the role of security in the larger context
of the system’s intended purpose and multiple uses. Second,
the users must be able to trust that the appropriate security
mechanism is chosen. Trust means that the mechanism itself
must be appropriate to the task. For example, the Biba Integ-
rity Model (Biba, 1977), a system of computer security policies
expressed as access control rules, is designed to ensure data
integrity. The model defines a hierarchy of integrity levels,
and then prevents participants from corrupting data of an
integrity level higher than the subject, or from being corrupted
by data from a level lower than the subject. The Biba model
was developed to extend the Bell and La Padula (1973) model,
which addresses only data confidentiality. Thus, under-
standing and choice of policies and mechanisms are impor-
tant aspects in which we trust software engineers to exercise
discretion. In addition, software engineers must be able to
trust the provenance, correctness and conformance to
expectations of the security mechanisms. Here, “provenance”
means not only the applicability of the mechanisms and
algorithms but also the source of architectural or imple-
mentation modules. With the availability of open source
modules and product line architectures (see, for example,
Clements and Northrup, 2001), it is likely that some parts of
some security mechanisms will have been built for a different
purpose, often by a different team of engineers. Builders and
modifiers of the current system must know to what degree to
trust someone else’s modules.
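For readers unfamiliar with the Biba model, the two rules described above reduce to a pair of comparisons on ordered integrity levels. The sketch below is a bare-bones rendering of those rules, not a complete access-control implementation:

```python
# A minimal sketch of the Biba integrity rules: a subject may not read data of
# lower integrity ("no read down") and may not write data of higher integrity
# ("no write up"). Integrity levels are represented here as ordered integers.

def biba_can_read(subject_level: int, object_level: int) -> bool:
    # Simple-integrity rule: reading below your level could corrupt you.
    return object_level >= subject_level

def biba_can_write(subject_level: int, object_level: int) -> bool:
    # *-integrity rule: writing above your level could corrupt the object.
    return object_level <= subject_level

# Example: a subject at level 2 may read level-3 data but not write to it.
assert biba_can_read(2, 3) and not biba_can_write(2, 3)
```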
3.1.2. Scenario 2: enhancing situational awareness during
a “cyber event”
Situational awareness is the degree to which a person or
system knows about a threat in the environment. When an
emergency is unfolding, the people and systems involved in
watching it unfold must determine what has already
happened, what is currently happening, and what is likely to
happen in the future; then, they make recommendations for
reaction based on their situational awareness. The people or
systems perceiving the situation have varying degrees of trust
in the information they gather and in the providers of that
information. When a cyber event is unfolding, information can
come from primary sources (such as sensors in process control
systems or measurements of network activity) and secondary
sources (such as human or automated interpreters of trends).
Consider analysts using a computer system that monitors
the network of power systems around the United States. The
system itself interacts with a network of systems, each of
which collects and analyzes data about power generation and
distribution stations and their access points. The analysts
notice a series of network failures around the country: first,
a power station in California fails, then one in Missouri, and so
on during the first few hours of the event.6 The analysts must
determine not only what is really unfolding but also how to
respond appropriately. Security and human behavior are
involved in many ways. First, the analyst must know whether
to trust the information being reported to her monitoring
system. For example, is the analyst viewing a failure in the
access point or in the monitoring system? Next, the analyst
must be able to know when and whether she has enough
information to make a decision about which reactions are
appropriate. This decision must be made in the context of an
evolving situation, where some evidence at first considered
trustworthy is eventually determined not to be (and vice versa).
Finally, the analyst must analyze the data being reported, form
hypotheses about possible causes, and then determine which
interpretation of the data to use. For instance, is the sequence
of failures the result of incorrect data transmission, a cyber
attack, random system failures, or simply the various power
companies’ having purchased some of their software from the
same vendor (whose system is now failing)? Choosing the
wrong interpretation can have serious consequences.
6 Indeed, at this stage it may not be clear that the event is actually a cyber event. A similar event with similar characteristics occurred on August 14, 2003, in the United States. See http://www.cnn.com/2003/US/08/14/power.outage/index.html.
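One crude way to picture the analyst's problem is as trust-weighted scoring over the competing interpretations named above. The sketch below is entirely illustrative: the hypotheses, sources, and trust weights are invented, and a real system would use a principled inference method rather than simple summation.

```python
# Illustrative only: ranking competing interpretations of the failure
# sequence, weighting each piece of evidence by the trust currently
# assigned to its source. All values below are invented.

evidence = [
    # (hypothesis supported, trust in the source on a 0-1 scale)
    ("cyber_attack",   0.9),  # primary sensor flagged anomalous traffic
    ("vendor_failure", 0.6),  # secondary report: shared vendor software
    ("cyber_attack",   0.3),  # unverified human report
    ("random_failure", 0.5),  # historical base-rate analysis
]

scores: dict[str, float] = {}
for hypothesis, trust in evidence:
    scores[hypothesis] = scores.get(hypothesis, 0.0) + trust

# Trust can be revised as the situation evolves, which reorders the ranking.
for hypothesis, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{hypothesis}: {score:.1f}")
```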
3.1.3. Scenario 3: supporting decisions about trustworthiness
of network transactions
On Christmas Day, 2009, a Nigerian student flying from
Amsterdam to Detroit attempted to detonate a bomb to
destroy the plane. Fortunately, the bomb did little damage,
and passengers prevented the student from completing his
intended task. However, in analyzing why the student was not
detected by a variety of airport security screens, it was
determined that important information was never presented
to the appropriate decision-makers (Baker and Hulse, 2009).
This situation forms the core of Scenario 3, where a system
queries an interconnected set of databases to find information
about a person or situation.
In this scenario, an analyst uses an interface to a collection
of data repositories, each of which contains information about
crime and terrorism. When the analyst receives a warning
about a particular person of interest, she must query the
repositories to determine what is known about that person.
There are many security issues related to this scenario. First,
the analyst must determine the degree to which she can trust
that all of the relevant information resides in at least one of
the connected repositories. After the Christmas bombing
attempt, it was revealed that the U.K. had denied a visa
request by the student, but information about the denial was
not available to the Transportation Security Administration
when decisions were made about whether to subject the
student to extra security screening. Spira (2010) points out
that the problem is not the number of databases; it is the lack
of ability to search the entire “federation” of databases.
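A minimal sketch of the federated-search idea follows, under the assumption (not from the source) of a simple repository interface. The point it illustrates is that a query must span every repository in the federation and must surface which sources could not be reached, since an invisible gap is exactly what failed in the Christmas Day case.

```python
# Illustrative federated search across a set of data repositories.
# The Repository interface and field names are hypothetical.
from typing import Iterable, Protocol

class Repository(Protocol):
    name: str
    def search(self, person_of_interest: str) -> Iterable[dict]: ...

def federated_search(repos: list[Repository], person: str) -> dict:
    results, unreachable = [], []
    for repo in repos:
        try:
            # Tag each record with its source, so trust in the source can be
            # weighed alongside trust in the data item itself.
            results += [{"source": repo.name, **rec} for rec in repo.search(person)]
        except Exception:
            unreachable.append(repo.name)  # a coverage gap the analyst must see
    return {"results": results, "unreachable_sources": unreachable}
```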
Next, even if the relevant items are found, the most
important ones must be visible at the appropriate time. Libicki
and Pfleeger (2004) have documented the difficulties in
“collecting the dots” before an analyst can take the next step
to connect them. If a “dot” is not as visible as it should be, it
can be overlooked or given insufficient attention during
subsequent analysis. Moreover, Spira (2010) highlights the
need for viewing the information in its appropriate context.
Third, the analyst must also determine the degree to which
each piece of relevant information can be trusted. That is, not
only must she know the accuracy and timeliness of each data
item, but she also must determine whether the data source
itself can be trusted. There are several aspects to this latter
degree of trust, such as knowing how frequently the data
source provides the information (that is, whether it is old
news), knowing whether the data source is trustworthy
enough, and whether circumstances may change the source’s
trustworthiness. For example, Predd et al. (2008) and Pfleeger
et al. (2010) point out the varying types of people with legiti-
mate access to systems taking unwelcome action. A trust-
worthy insider may become a threat because of a pending
layoff or personal problem, inattention or confusion, or her
attempt to overcome a system weakness. So the trustworthi-
ness of information and sources must be re-evaluated
repeatedly and perhaps even forecast based on predictions
about a changing environment.
Finally, the analyst must also determine the degree to
which the analysis is correct. Any analysis involves assump-
tions about variables and their importance, as well as the
relationships among dependent and independent variables.
Many times, it is a faulty assumption that leads to failure,
rather than faulty data.
3.2. Analysis of results
The three scenarios were intriguing to our interviewees, and all
agreed that they were realistic, relevant and important.
However, having the interviewees scrutinize the scenarios
revealed fewer behavioral insights than we had hoped. In each
case, the interviewee viewed each scenario from his or her
particular perspective, highlighting only a small portion of the
scenario to confirm an opinion he or she held. For example, one
of the interviewees used Scenario 3 to emphasize the need for
information sharing; another interviewee said that privacy is
a key concern, especially in situations like Scenario 2 where
significant monitoring must be balanced with protecting privacy.
Nevertheless, many of the interviewees had good sugges-
tions for shaping the way forward. For instance, one said that
there is much to be learned from command and control
algorithms, where military actors have learned to deal with
risk perception, uncertainty, incomplete information, and the
need to make an important decision under extreme pressures.
There is rich literature addressing decision-making under
pressure, from Ellsberg (1964) through Klein (Klein, 1998,
2009). In particular, Klein’s models of adaptive decision-
making may be applicable (Klein and Calderwood, 1991;
Klein and Salas, 2001). While the scenario methodology was
not a structured idea generation approach, to the extent
possible, we endeavored to be unbiased in our interpretation
of interviewee responses. We were not trying to gather
support for preconceived ideas and were genuinely trying to
explore new ideas where behavioral science could be lever-
aged to address security issues.
There were several messages that emerged from the
interviews:
• Security is intertwined with the way humans behave when
trying to meet a goal or perform a task. The separation of
primary task from secondary, as well as its impact on user
behavior, was first clearly expressed in Smith et al. (1997)
and elaborated in the security realm by Sasse et al. (2002).
Our interviews reconfirmed that, in most instances, security
is secondary to a user’s primary task (e.g., finding a piece of
information, processing a transaction, making a decision).
 
Most patients with mental health disorders are not aggressive. Howev.docx
Most patients with mental health disorders are not aggressive. Howev.docxMost patients with mental health disorders are not aggressive. Howev.docx
Most patients with mental health disorders are not aggressive. Howev.docx
 
Most of our class readings and discussions to date have dealt wi.docx
Most of our class readings and discussions to date have dealt wi.docxMost of our class readings and discussions to date have dealt wi.docx
Most of our class readings and discussions to date have dealt wi.docx
 
Most people agree we live in stressful times. Does stress and re.docx
Most people agree we live in stressful times. Does stress and re.docxMost people agree we live in stressful times. Does stress and re.docx
Most people agree we live in stressful times. Does stress and re.docx
 
Most of the ethical prescriptions of normative moral philosophy .docx
Most of the ethical prescriptions of normative moral philosophy .docxMost of the ethical prescriptions of normative moral philosophy .docx
Most of the ethical prescriptions of normative moral philosophy .docx
 
Most healthcare organizations in the country are implementing qualit.docx
Most healthcare organizations in the country are implementing qualit.docxMost healthcare organizations in the country are implementing qualit.docx
Most healthcare organizations in the country are implementing qualit.docx
 
More work is necessary on how to efficiently model uncertainty in ML.docx
More work is necessary on how to efficiently model uncertainty in ML.docxMore work is necessary on how to efficiently model uncertainty in ML.docx
More work is necessary on how to efficiently model uncertainty in ML.docx
 
Mortgage-Backed Securities and the Financial CrisisKelly Finn.docx
Mortgage-Backed Securities and the Financial CrisisKelly Finn.docxMortgage-Backed Securities and the Financial CrisisKelly Finn.docx
Mortgage-Backed Securities and the Financial CrisisKelly Finn.docx
 
Moral Development  Lawrence Kohlberg developed six stages to mora.docx
Moral Development  Lawrence Kohlberg developed six stages to mora.docxMoral Development  Lawrence Kohlberg developed six stages to mora.docx
Moral Development  Lawrence Kohlberg developed six stages to mora.docx
 

Recently uploaded

ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...jaredbarbolino94
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersSabitha Banu
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxsocialsciencegdgrohi
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxRaymartEstabillo3
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfMahmoud M. Sallam
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationnomboosow
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfUjwalaBharambe
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
MARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupMARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupJonathanParaisoCruz
 

Recently uploaded (20)

ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginners
 
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdf
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communication
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
MARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupMARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized Group
 

RATIO ANALYSIS RATIO ANALYSIS Note Please change the column names.docx

device.

Aytes and Connolly (2004) propose a decision model of computer-related behavior that suggests individuals make a rational choice to engage in either safe or unsafe cyber behavior. In their model, individual behavior is driven by perceptions of the usefulness of safe and unsafe behaviors and of the consequences of each. More specifically, the model captures how information sources, the user's base knowledge of cyber security, the user's relevant perceptions (e.g., interpretations of the applicability of that knowledge), and the user's risk attitude influence individual cyber decision making.
This paper reports on two behavioral experiments, using over 500 respondents, designed to explore whether and how recommended cyber security decision-making responses depend on gain–loss framing and on the salience of prior cyber dilemma experiences. More specifically, we explored whether priming individuals to recall a prior cyber-related experience influenced their decision to select either a safe or a risky option in responding to a hypothetical
cyber dilemma. We hypothesized that recall of a hit experience involving negative consequences would increase feelings of vulnerability, even more so than a near-miss, and lead to the endorsement of a risk averse option. This result has been reported in the disaster literature, which has shown that individual decision making depends on prior experiences, including hits, near-misses (events where a hazardous or fatal outcome could have occurred, but did not), and false alarms (Barnes et al. 2007; Dillon et al. 2011; Siegrist and Gutscher 2008). Furthermore, damage from past disasters has been shown to significantly influence individual perceptions of future risk and to motivate more protective and mitigation-related behavior (Kunreuther and Pauly 2004; Siegrist and Gutscher 2008; Slovic et al. 2005).

We anticipated that the effect of prior near-miss experiences would depend on the respondent's interpretation of the prior near-miss event. This expectation was
based on near-miss research showing that future-intended mitigation behavior depends greatly on the perception of the near-miss event outcome. Tinsley et al. (2012) describe two near-miss types: the resilient near-miss and the vulnerable near-miss. A resilient near-miss is interpreted simply as an event that did not occur. In these situations, individuals were found to underestimate the danger of subsequent events and were more likely to engage in risky behavior by choosing not to take protective action. A vulnerable near-miss occurs when a disaster almost happened. New information is incorporated into the assessment that counters the basic "near-miss" definition and results in the individual being more inclined to engage in risk averse behavior (the opposite of the behavior associated with a resilient near-miss interpretation). In the cyber context, we expected that respondents who fail to recognize a prior near-miss as a cyber threat would be more likely to recommend the risky course of action. However, if respondents view a recalled near-miss as evidence of vulnerability,
then they would be more inclined to endorse the safer option.

In the case of a recalled prior false-alarm experience, one hypothesis, known as the "cry-wolf effect" (Breznitz 2013), suggests that predictions of disasters that do not materialize affect beliefs about the uncertainty associated with future events. In this context, false alarms are believed to create complacency and reduce willingness to respond to future warnings, resulting in a greater likelihood of engaging in risky behavior (Barnes et al. 2007; Donner et al. 2012; Dow and Cutter 1998; Simmons and Sutter 2009). In contrast, there is research showing that the public may have a higher tolerance for false alarms than anticipated, because of the increased credibility given to an event that is discussed frequently, both in media sources and in informal discussion; this suggests that false alarms might even increase individuals' willingness to be risk averse (Dow and Cutter 1998). We
anticipated that recall of prior false alarms would likely make respondents feel less vulnerable and more willing to prefer the risky option, compared with the near-miss and hit conditions.

In our research, we also anticipated some influence of framing on individual cyber decision making under risk. Prospect theory and related empirical research suggest that decision making under risk depends on whether potential outcomes are perceived as a gain or as a loss relative to a reference point (Kahneman and Tversky 1979; Tversky and Kahneman 1986). A common finding in the literature on individual preferences in decision making is that people tend to avoid risk under gain frames, but seek risk when outcomes are framed as a loss. Prospect theory is discussed in the security literature, but empirical studies in cyber security contexts are limited (Acquisti and Grossklags 2007; Garg and Camp 2013;
Helander and Khalid 2000; Shankar et al. 2002; Verendel 2008). Among the security studies that have been conducted, the results are mixed. Work by Schroeder and colleagues on computer information security, presented at the 2006 Information Resources Management Association International Conference, found that decision makers were risk averse in the gain frame, yet showed no risk preference in the loss frame. Similarly, in a 1999 presentation about online shopping behavior by Helander and Du at the International Conference on TQM and Human Factors, perceived risk of credit card fraud and the potential for price inflation did not negatively affect purchase intention (loss frame), while perceived value of a product was found to positively affect purchase intention. We anticipated that gain-framed messages in cyber dilemmas would increase endorsement of protective responses and that loss-framed messages would have no effect on the endorsement of protective options.
We also explored how subject variables affect the strength and/or direction of the relationship between the manipulated variables (prior experience and gain–loss framing) and the dependent variable (endorsement of safe or unsafe options in response to cyber dilemmas). For example, one possibility is that the relationship between prior experience and risk averse behavior is stronger for individuals with higher self-reported victimization, given their increased exposure to cyber dilemma consequences. Another possibility is that the relationship between the gain frame and protective behavior is weaker for younger individuals because they are more familiar and comfortable with the nuances of internet security. We anticipated that there would be some difference in the patterns of response as a function of sex, age, income, education, job domain,
and self-reported victimization.

The next section of this article describes the methods, results, and a brief discussion for Experiment I, and Sect. 3 does the same for Experiment II. The paper closes with a discussion of findings across both experiments and of how these results suggest approaches to enhance and improve cyber security by taking user decision making into account.

2 Experiment I

We conducted an experiment of risky cyber dilemmas with two manipulated variables, gain–loss framing and primed recall of a prior personal near-miss experience, to evaluate individual cyber user decision making. The cyber dilemmas were developed to capture commonly confronted risky cyber choices faced by individual users. In addition, in Experiment I the dependent variable focused on the advice the respondent would provide to their best friend, so as to encourage more normative thinking about what might be
the correct response to the cyber dilemma. As such, each cyber scenario described a risky choice dilemma faced by the respondent's "best friend," and the respondent was asked to recommend either a safe but inconvenient course of action (e.g., recommend not downloading the music file from an unknown source) or a risky but more convenient option (e.g., recommend downloading the music file from an unknown source).

2.1 Method

2.1.1 Design overview

In Experiment I, four cyber dilemmas were developed to evaluate respondents' risky choice behavior using a 2 (recalled personal near-miss experience vs. no recall control condition) by 2 (gain vs. loss-framed message) mixed model factorial design with two dichotomous subject variables: sex and self-reported victimization. Each participant received all four dilemmas in a constant order. Within this order, each of the four treatment conditions was
paired with each of the four dilemmas and counterbalanced such that each dilemma was randomly assigned to each of the four treatment conditions. After each cyber dilemma, respondents were asked to respond on a 6-point scale (1 = strongly disagree to 6 = strongly agree) whether they would advise their "best friend" to proceed with the risky course of action. Responses of 1–3 indicated endorsement of the safe but inconvenient option, while responses of 4–6 indicated endorsement of the risky but expedient option. Following the four cyber dilemmas, respondents were given four attention check questions to determine whether they had read the cyber scenarios carefully. In addition, basic demographic information was collected, as well as information on each respondent's personal experience and self-reported victimization, if any, with the topics of the cyber dilemmas.
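The paper does not publish its assignment procedure; as one concrete illustration of this kind of counterbalancing, the sketch below pairs each dilemma with each of the four treatment cells equally often across respondents via a Latin-square rotation. All names and the rotation rule here are our own assumptions, not the authors' materials.

```python
# A minimal sketch of the counterbalancing described above, assuming a
# Latin-square rotation; the original assignment code is not published,
# so the names and the rotation rule are illustrative.
from itertools import product

DILEMMAS = ["music_file", "usb_drive", "facebook_app", "online_purchase"]
# 2 (frame) x 2 (prior near-miss recall) = 4 treatment cells
CONDITIONS = list(product(["gain", "loss"], ["near_miss", "no_recall"]))

def assign_conditions(participant_index):
    """Pair the four dilemmas (fixed presentation order) with the four
    treatment cells, rotating the pairing across participants."""
    offset = participant_index % len(CONDITIONS)
    return [(dilemma, CONDITIONS[(i + offset) % len(CONDITIONS)])
            for i, dilemma in enumerate(DILEMMAS)]

# Across any block of four participants, every dilemma appears in every cell.
for p in range(4):
    print(p, assign_conditions(p))
```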
2.1.2 Scenarios and manipulations

The four cyber dilemma scenarios involved the threat of a computer virus resulting from the download of a music file, the use of an unknown USB drive, the download of a Facebook application, and the risk of financial fraud from an online purchase. Gain–loss framing and primed recall of a prior personal experience were the manipulated independent variables. The framing messages were used to describe the potential outcome of the risky cyber choice. The gain-framed messages endorsed the safe, more protective recommendation. For example, for the music file download scenario, the gain frame was worded as "If she presses 'do not proceed,' she may avoid the risk of acquiring a virus that will cause serious damage to her computer." Conversely, the loss-framed messages endorsed the risky option. For the same scenario, the loss frame was worded as "If she presses 'proceed,' she may risk acquiring a virus that will cause serious damage to her computer." The experimental design also included a
manipulation of primed recall of a prior personal experience. Respondents either recalled a near-miss experience of their own before advising their friend, or did not (a control condition). In each near-miss experience, the respondent's dilemma was similar to the situation faced by their best friend, and the consequences of the threat were benign. A complete description of the four scenarios, including the near-miss and gain–loss framing manipulations, is provided in Table 1.

2.1.3 Subjects

The experiment was conducted using the University of Southern California's Psychology Subject Pool. Students participated for course credit. Of the 365 students who participated in the experiment, 99 were omitted for not answering all 4 attention check questions correctly, resulting in a sample of 266 respondents.
Table 1 Summary of four scenarios and manipulations (Experiment I)

Scenario 1: Music File
Scenario: Your best friend has contacted you for advice. She wants to open a music file linking to an early release of her favorite band's new album. When she clicks on the link, a window pops up indicating that she needs to turn off her firewall program in order to access the file.
Gain framing: If she presses "do not proceed," she may avoid the risk of acquiring a virus that will cause serious damage to her computer.
Loss framing: If she presses "proceed," she may risk acquiring a virus that will cause serious damage to her computer.
Near-miss experience: As you consider how to advise your friend, you recall that you were confronted by a similar situation in the past. You attempted to open a link to a music file and a window popped up saying that you need to turn off your firewall program in order to access the file. You pressed "proceed" and your computer immediately crashed. Fortunately, after restarting your computer everything was functioning normally again.
Question: Below please indicate your level of agreement with the statement "You will advise your best friend to press 'proceed' and risk acquiring a virus that will cause serious damage to her computer."

Scenario 2: USB
Scenario: Your best friend has contacted you for advice. Her computer keeps crashing because it is overloaded with programs, documents and media files. She consults a computer technician who advises her to purchase a 1 terabyte USB drive (data storage device) to free up space on her computer. She does her research and narrows down the selection to two choices.
Gain framing: The first USB drive, when used on a computer other than your own, has a 10 % chance of becoming infected with a virus that will delete all the files and programs on the drive. The second drive is double the price, but has less than a 5 % chance of becoming infected with a virus when used on a computer other than your own.
Loss framing: The first USB drive, when used on a computer other than her own, has a 5 % chance of becoming infected with a virus that will delete all the files and programs on the drive. The second drive is half the price but has more than a 5 % chance of becoming infected with a virus when used on a computer other than her own.
Near-miss experience: As you consider how to advise your friend, you recall that your USB drive recently was infected with a virus after being plugged into a computer at work. You contacted a computer technician to see if there was any way to repair the drive. The technician was able to recover all the files and told you that you were really lucky because normally such drives cannot be restored.
Question: Below please indicate your level of agreement with the statement "You will advise your best friend to buy the first USB drive that has a 10 % chance of becoming infected with a virus." / "You will advise your best friend to buy the second USB drive that has a greater than 5 % chance of becoming infected with a virus."

Scenario 3: Facebook
Scenario: Your best friend has contacted you for advice. She has opened her Facebook page to find an app request for a game that her friends have been really excited about. In order to download the app, access to some of her personal information is required, including her User ID and other information from her profile.
Gain framing: If she chooses not to agree to the terms of the app, she is protecting her private information from being made available to the developer of the app.
Loss framing: If she chooses to agree to the terms of the app, she risks the chance of her private information being made available to the developer of the app.
Near-miss experience: As you consider how to advise your friend, you recall that you once agreed to share some of your personal information in order to download an app on Facebook. The developers of the app made your User ID publicly available, and because of this you started to receive messages from strangers on your profile page. You were very upset about the invasion of your privacy. Fortunately, you discovered that you could change the privacy settings of your profile so that only your friends could access your page.
Question: Below please indicate your level of agreement with the statement "You will advise your best friend to download the app and risk having her private information made available to the app developer."

Scenario 4: Rare Book
Scenario: Your best friend has contacted you for advice. She is going to buy a rare book from an unknown online store. The book is highly desirable, expensive, and only available from this online store's website. By deciding to purchase the book online with her credit card, there is a risk that her personal information will be exploited, which can generate unauthorized credit card charges. Her credit card charges $50 for the investigation and retrieval of funds expended when resolving fraudulent credit card issues.
Gain framing: If she decides not to buy the book, she may save up to $50 and the time spent talking with the credit card company.
Loss framing: If she decides to buy the book, she may lose up to $50 and the time spent talking with the credit card company.
Near-miss experience: As you consider how to advise your friend, you recall that you once purchased a rare book from an unknown online store. You were expecting the book to arrive 1 week later. About 2 weeks later, you had yet to receive the book. You were very concerned that you had done business with a fake online store. You contacted the store's customer service, who fortunately tracked down the book's location and had it shipped with overnight delivery.
Question: Below please indicate your level of agreement with the statement "You will advise your best friend to purchase the book online and risk having her personal information exploited."
Most of the respondents, 203 (76 %), were female. Respondents ranged in age from 18 to 41 years (95th percentile: 22 years). Table 2 shows a summary of personal experience and self-reported victimization associated with each of the four cyber dilemmas. All respondents reported personal experience with at least one of the four cyber dilemma situations, and 24 % of respondents further reported being a victim of one or more of the four cyber dilemmas. Whether the respondent had ever been victimized in one of the four scenarios was coded as the self-reported victimization variable.

2.2 Results

Raw responses (1–6) were centered around the midpoint (3.5) such that negative responses indicate endorsement of the safe option and positive responses indicate endorsement of the risky option. Mean endorsement responses for each of the four treatment conditions are displayed in Fig. 1. The negative means in all four conditions indicate
that subjects were more likely to endorse risk averse actions than the risky alternative (see Footnote 1).

In addition, a 2 (recalled personal near-miss experience vs. no recall control condition) by 2 (gain vs. loss-framed message) by 2 (sex) by 2 (self-reported victimization) 4-way factorial ANOVA was used to evaluate respondents' endorsement of risky versus safe options in cyber dilemmas. Analyses were specified to include only main effects and 2-way interactions with the manipulated variables. Preliminary data screening was conducted, and q–q plots indicated that the dependent variable is approximately normally distributed.

Results indicated that the near-miss manipulation was significant, F(1, 260) = 7.42, p = .01, η² = .03. Respondents who received a description of a recalled near-miss experience preferred the safe but inconvenient option to the risky, more expedient option. No main effect was found for the gain–loss framing manipulation, suggesting that respondents were indifferent between safe and risky decision options whether the outcomes were described as gains or as losses from a reference point. There was also a significant interaction between the framing and near-miss manipulations, F(1, 260) = 4.01, p = .05, η² = .02. As seen in Fig. 1, the near-miss effect was much larger under the gain frame than under the loss frame.

Basic demographic data were also collected to assess whether individual differences moderated the effect of the two manipulations. A significant main effect was found for sex, F(1, 260) = 3.81, p = .05, η² = .01; Cohen's d for the sex difference was 0.33 for gain framing without near-miss, 0.09 for gain framing with near-miss, 0.18 for loss framing without near-miss, and 0.19 for loss framing with near-miss. Female respondents were more likely to avoid risks and choose the safe option. No significant main effect was found for self-reported victimization. None of the interactions with subject variables were significant: sex by framing, sex by near-miss experience, victimization by framing, and victimization by near-miss.
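Cohen's d, as reported per cell above, is the standardized mean difference between the male and female endorsement scores. A minimal sketch of the computation follows; the data columns in the commented usage lines are hypothetical.

```python
# Cohen's d: mean difference divided by the pooled standard deviation.
import numpy as np

def cohens_d(x, y):
    """Standardized mean difference between two independent groups."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

# Hypothetical usage, one cell of the design at a time:
# males = df.loc[df.sex == "M", "endorsement"].to_numpy()
# females = df.loc[df.sex == "F", "endorsement"].to_numpy()
# print(cohens_d(males, females))
```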
Table 2 Summary of experience and victimization (N = 266)

Scenario | Personal experience | Previous victimization
Music file download | 205 (77 %) | 40 (15 %)
USB drive | 110 (41 %) | 12 (4.5 %)
Facebook app download | 253 (95 %) (a) | 3 (1 %)
Online purchase | 259 (97 %) | 18 (7 %)
Overall (at least once) | 265 (100 %) (b) | 64 (24 %)

(a) An app is downloaded from Facebook at least once a week.
(b) There is one missing value.
Footnote 1: Since the four scenarios were presented in a constant order, a second analysis was run that ignored the manipulated factors and included scenario/order as a repeated factor. A one-way repeated measures ANOVA found a significant scenario/order effect, F(3, 265) = 30.42, p < .001, η² = .10. Over time, respondents were more likely to endorse the risky option. Because the nature of the dilemma scenario and the order are confounded, it is impossible to determine whether this main effect reflects an order effect, a scenario effect, or a combination of both. The counterbalanced design distributed all 4 combinations of framing and prior experience recall evenly across the four scenario dilemmas. Order and/or scenario effects are independent of the manipulated factors and are thus included in the error term in the analyses reported above.
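A reader wishing to reproduce this style of analysis on their own data could fit the model roughly as below. This is a sketch only: the flat data file, every column name, and the simplification of ignoring the repeated-measures structure are our own assumptions, not the authors' code.

```python
# Sketch of a factorial ANOVA restricted to main effects plus the 2-way
# interactions involving the two manipulated variables, as described above.
# The CSV file and column names are hypothetical, and for simplicity the
# repeated-measures structure (four rows per respondent) is ignored here.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("experiment1.csv")            # one row per response
df["endorsement"] = df["raw_response"] - 3.5   # center the 1-6 scale on 3.5

model = ols(
    "endorsement ~ C(near_miss) + C(frame) + C(sex) + C(victim)"
    " + C(near_miss):C(frame) + C(near_miss):C(sex) + C(near_miss):C(victim)"
    " + C(frame):C(sex) + C(frame):C(victim)",
    data=df,
).fit()
print(sm.stats.anova_lm(model, typ=2))         # F test for each term
```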
[Fig. 1 Mean endorsement of risky versus safe responses to cyber threats by gain–loss frame and prior near-miss]

2.3 Discussion

The results of Experiment I suggest that respondents' cyber security recommendations to their best friend were significantly influenced by the personal experience recall manipulation. More specifically, respondents who recalled a near-miss experience were more likely than their no-recall counterparts to advise their best friend to avoid the risky cyber choice. This finding is consistent with Tinsley et al.'s (2012) definition of a vulnerable near-miss: an "almost happened" event that
makes individuals feel vulnerable and, in turn, leads to a greater likelihood of endorsing the safer option.

Respondents who recalled a near-miss experience were even more likely to advise their best friend to take the safer course of action if they also received the gain-framed message. By comparison, the loss frame had a negligible effect on the primed recall manipulation. That is, respondents who received the loss frame were as likely to recommend the risk averse course of action regardless of whether they had recalled a prior near-miss. This pattern suggests that people are more risk averse when they are exposed either to a recall of a prior near-miss and/or to a loss frame; only the combination of no prior near-miss recall and a gain frame produced less risk averse responses. This suggests a highly interactive, synergistic effect, in which the frame and the near-miss recall substitute for each other.

In addition, sex and prior victimization were found to
have no moderating effect on the relationship between cyber dilemma responses and the two manipulated variables. Cyber dilemma decision making did, however, vary significantly by respondents' sex, though not by self-reported victimization. The results suggest that females make more protective decisions than males when faced with risky cyber dilemmas. This pattern has been replicated in cyber research in an experiment on online shopping services, where males demonstrated a greater tendency to engage in risky behavior online (Milne et al. 2009). Disaster risk perception studies have likewise shown that risks tend to be judged higher by females (Flynn et al. 1994; Bateman and Edwards 2002; Kung and Chen 2012; Bourque et al. 2012) and that females tend to have a stronger desire to take preventative and preparedness measures than males (Ho et al. 2008; Cameron and Shah 2012).

3 Experiment II

The primary purpose of Experiment II was to expand the
primed recall prior experience manipulation to compare three prior cyber experiences: a near-miss, a false alarm, and a hit involving a loss of data. The prior cyber experience recall prime for Experiment II involved the experiences of a good friend, rather than the respondent's own past experiences (as in Experiment I). We also posed all questions in a loss frame to enhance the ecological validity of the cyber dilemmas, the consequences of which are naturally perceived as losses from a status quo.

The dependent variable was also changed for Experiment II. Each respondent was asked to report whether they would select the safe or the risky option in response to their own cyber dilemma, as opposed to advising a best friend involved in a risky cyber dilemma as in Experiment I. One interpretation of the finding from Experiment I that respondents generally favored the safe option is that they may have been more risk averse in advising a friend than they would be in responding to their own cyber
dilemma. By posing the dilemma in the first person, we sought to characterize how respondents would be likely to respond when facing a cyber dilemma themselves. The cyber dilemmas were also described more concretely in Experiment II, including a "screenshot" of the dilemma facing the respondent.

3.1 Method

3.1.1 Design overview

In Experiment II, three cyber dilemmas were constructed to evaluate respondents' risky choice behavior using one manipulated variable: recall of a friend's false alarm, near-miss, or hit experience. In addition, six individual difference variables were included in the design: sex, age, income, education, job domain, and self-reported victimization. Each participant received all three dilemmas in a constant order. Each of the three primed recall prior cyber experiences was paired with one of the three scenarios in a counterbalanced design, such that each of the cyber
dilemmas appeared in each of the three treatment conditions with equal frequency. After each cyber dilemma, respondents were asked to respond on a 6-point scale (1 = strongly disagree to 6 = strongly agree) regarding their intention to ignore the warning and proceed with the riskier course of action. Following all three cyber dilemmas, respondents were given three attention check questions related to the nature of each dilemma. Respondents were also asked to provide basic demographic information and to answer a series of questions about their experience with computers and cyber dilemmas, such as their experience with purchasing from a fraudulent online store, being locked out of an online account, or having unauthorized withdrawals made from their online banking account.
3.1.2 Scenarios and manipulations

The three cyber dilemma scenarios involved the threat of causing serious damage to the respondent's computer as a result of downloading a music file, installing a plug-in for an online game, or downloading a media player to legally stream videos. The scenarios were written to share the same possible negative outcome: the computer's operating system crashes, resulting in an unusable computer until repaired. Establishing uniformity of consequences across the three scenarios reduced potential unexplained variance across the three levels of the manipulated variable. Experiment II also included screenshots of "pop-up" window images similar to those that would appear on the computer display when the cyber dilemma is presented. These images were intended to make the scenarios more concrete and enhance the realism of the cyber dilemma scenarios.

Primed recall of a friend's prior cyber experience was
the only manipulated variable in this experiment. Respondents recalled their friend's near-miss, false alarm, or hit experience before deciding whether to select the safe or risky option in response to the described cyber dilemma. All potential outcomes were presented in a loss frame, with wording held constant except for details specific to the scenario under consideration. For example, the wording of the loss frame for the hit outcome of the music file download scenario was "She pressed 'allow access' and her computer immediately crashed. She ended up having to wipe the computer's hard drive clean and to reinstall the operating system." The only modification made for the plug-in installation scenario was switching the words "allow access" to "run." A complete description of the scenarios, including the primed recall of the friend's prior experiences, is provided in Table 3.

3.1.3 Subjects

Three hundred and seventy-six US residents were recruited
through Amazon Mechanical Turk (AMT) to participate in the experiment. Researchers have assessed the representativeness of AMT samples compared with locally recruited convenience samples and found AMT samples to be representative (Buhrmester et al. 2011; Mason and Suri 2012; Paolacci et al. 2010) and "significantly more diverse than typical American college samples" (Buhrmester et al. 2011). Each respondent earned $1 for completing the experiment. After removing respondents who did not answer all three attention check questions correctly or who completed the experiment in less than 7 min, the sample consisted of 247 respondents. Five additional respondents skipped questions, resulting in a final sample size of N = 242. Table 4 includes a summary of sample characteristics, including sex, age, income, education, job domain, and self-reported victimization. Self-reported victimization is defined in terms of experience with four types of negative cyber events: (1) getting a virus on an
electronic device, (2) purchasing from a fraudulent online store, (3) being locked out of an online account, or (4) having unauthorized withdrawals made from an online banking account. Respondents also answered a number of experience questions that are summarized in Table 5 as additional detail about the study sample.

3.2 Results

A mixed model ANOVA with one within-subject factor (primed recall of a prior experience) and six individual difference variables as between-subject factors was used. This model included only the seven main effects and the six 2-way interactions involving the manipulated within-subject variable and each of the six between-subject variables. Preliminary data screening was done; q–q plots showed the scores on the repeated measures variable, prior salient experience, to have an approximately normal distribution (see Footnote 2).
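As an illustration of the within-subject part of this analysis, a simplified sketch follows. It fits only the repeated-measures factor and LSD-style (uncorrected) pairwise comparisons, not the full mixed model with the six between-subject variables; the file and column names are hypothetical.

```python
# Simplified sketch: repeated-measures ANOVA on the primed-recall factor,
# followed by uncorrected paired t tests (the spirit of the LSD test)
# between the three recall conditions. File and column names are
# hypothetical; the paper's full model also includes six between-subject
# variables, omitted here for brevity.
from itertools import combinations

import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

df = pd.read_csv("experiment2_long.csv")  # long format: one row per response
print(AnovaRM(df, depvar="endorsement", subject="subject",
              within=["prior_experience"]).fit())

# Pairwise comparisons among false_alarm / near_miss / hit.
wide = df.pivot(index="subject", columns="prior_experience", values="endorsement")
for a, b in combinations(wide.columns, 2):
    t, p = stats.ttest_rel(wide[a], wide[b])
    print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f}")
```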
Results show that the primed recall prior experience manipulation had a significant effect on how respondents intended to respond to the cyber dilemmas, F(1, 231) = 31.60, p < .001, η² = .12. Moreover, post hoc comparisons using the least significant difference (LSD) test indicate that the mean score for the false-alarm condition (M = 3.65, SD = 0.11) was significantly different from the near-miss condition (M = 2.97, SD = 0.11), p < .01, and that the hit condition (M = 2.34, SD = 0.11) differed significantly from both the near-miss and false-alarm conditions, p < .01. This suggests that respondents who received a description of a friend's near-miss experience preferred the safer, risk averse option compared with respondents who were primed to recall a friend's prior false-alarm experience. Respondents were even more likely to select the safe option when they were primed to recall a friend's prior hit experience. As displayed in Fig. 2, the positive mean for the false-alarm condition indicates that respondents were more likely to engage in risky behavior, in contrast to the negative means for the
near-miss and hit conditions.

Footnote 2: As in Experiment I, a one-way repeated measures ANOVA shows a significant scenario/order effect, F(2, 265) = 4.47, p = .035, η² = .02. Over time and/or scenario, respondents were more likely to endorse the risky option. However, as in Experiment I, it is difficult to determine whether this main effect reflects the scenarios or the order. The counterbalanced design used in this study mitigates this limitation.

[Table 3, giving the complete description of the three Experiment II scenarios and recall manipulations, could not be recovered from the page extraction.]

The analysis also included both main effects and interaction terms for six different subject variables:
sex, age, level of education, income level, job domain, and self-reported victimization. For the purpose of analysis, age was collapsed into three levels: 18–29, 30–39, and 40 years and older; education was collapsed into three categories: high school and 2-year college, 4-year college, and master's degree or higher; and annual income was collapsed into three categories: below $30,000/year, $30,000–$59,999/year, and $60,000/year and more.
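Recoding of this kind takes only a few lines; the following is a sketch under the assumption of a raw data file with a numeric age column and coded income brackets, all of which are hypothetical names.

```python
# Sketch of the demographic recoding described above; the raw column names
# and bracket codes are hypothetical.
import pandas as pd

df = pd.read_csv("experiment2_demographics.csv")

# Age: 18-29, 30-39, 40 and older
df["age_group"] = pd.cut(df["age"], bins=[17, 29, 39, 200],
                         labels=["18-29", "30-39", "40+"])

# Income: map the survey's $10K brackets onto the three analysis levels.
low = {"<20k", "20-30k"}
mid = {"30-40k", "40-50k", "50-60k"}
df["income_group"] = df["income_bracket"].apply(
    lambda b: "<30k" if b in low else ("30-60k" if b in mid else "60k+"))

print(df["age_group"].value_counts(), df["income_group"].value_counts(), sep="\n")
```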
The results of the ANOVA indicated a significant main effect for age, F(2, 231) = 4.9, p = .01, η² = .04, and no significant main effects for sex, education, income, job domain, or self-reported victimization. Figure 2 suggests that younger respondents were more likely than older respondents to choose the riskier option in cyber dilemmas across all three levels of the primed prior recall experience manipulation.

Results also showed a significant interaction effect between income and the primed prior recall experience manipulation, F(2, 231) = 3.40, p = .01, η² = .03. Figure 3 indicates that respondents with higher income levels (greater than $60K per year) were less sensitive to the primed recall of a friend's experience. There was no significant interaction effect between the manipulation and the other five individual difference variables: sex, F(1, 231) < 1; age, F(2, 231) = 1.84, p = .12, η² = .02; education, F(2, 231) < 1; job domain, F(1, 231) = 2.01, p = .14, η² = .01; and self-reported victimization, F(2, 231) = 2.03, p = .09, η² = .02.

Table 4 Demographic information for AMT respondents (N = 242)

Sex: Male 108 (44.6 %); Female 134 (55.4 %)
Highest level of education: High school 65 (26.9 %); 2-year college 38 (15.7 %); 4-year college 102 (42.1 %); Master's degree 30 (12.4 %); Professional (e.g., M.D., Ph.D., J.D.) degree 7 (2.9 %)
Personal gross annual income range: Below $20,000/year 66 (27.3 %); $20,000–$29,999/year 31 (12.8 %); $30,000–$39,999/year 35 (14.5 %); $40,000–$49,999/year 28 (11.6 %); $50,000–$59,999/year 15 (6.2 %); $60,000–$69,999/year 23 (9.5 %); $70,000–$79,999/year 13 (5.4 %); $80,000–$89,999/year 10 (4.1 %); $90,000/year or more 21 (8.7 %)
Does your work relate to technology?: "I use computers normally but my work has nothing to do with technology" 172 (71.1 %); "My work is about technology" 70 (28.9 %)
Victim of getting a virus on an electronic device: Yes 165 (68.2 %); No 77 (31.8 %)
Victim of purchasing from a fake online store: Yes 15 (6.2 %); No 221 (91.3 %); "I don't shop online" 6 (2.5 %)
Victim of failure to log into an online account: Yes 85 (35.1 %); No 157 (64.9 %)
Victim of unauthorized withdrawals from an online banking account: Yes 44 (18.2 %); No 198 (81.8 %)
Overall self-reported victimization: None 46 (19.0 %); One type 104 (43.0 %); Two or more types 92 (38.0 %)
Age (years): Range 18–75; Percentiles: 25th = 27, 50th = 33, 75th = 44

3.3 Discussion

Responses to risky cyber dilemmas in Experiment II were significantly predicted by the primed recall of a friend's prior cyber experience. Consistent with our hypotheses, the more negative the consequence associated with the prior cyber experience, the more likely the respondents were to
choose the safer course of action. In particular, respondents who were primed to recall a prior near-miss or hit event interpreted the experience as a sign of vulnerability, compared with the recall of a prior false alarm, and in turn were more likely to endorse the more conservative (safe) courses of action. In the case of false alarms, our findings suggest that respondents were more likely to endorse the risky alternative.
[Fig. 2 Mean endorsement of risky versus safe responses to cyber threats by primed recall of friend's prior experience and age]
[Fig. 3 Mean endorsement of risky versus safe responses to cyber threats by primed recall of friend's prior experience and income level]

Table 5 Cyber-related responses for AMT respondents

Personal computer: PC 213 (88.0 %); Mac 28 (11.6 %); Do not have a personal computer 1 (0.4 %)
Smartphone: iOS 67 (27.6 %); Android 95 (39.3 %); Do not have a smartphone 80 (33.1 %)
Protection software: Yes 211 (87.2 %); No 31 (12.8 %)
Have you ever downloaded free music, an e-book, a movie, or a television show from an unfamiliar website found through a Google search?: Yes 135 (55.8 %); No 107 (44.2 %)
How often do you access your social networking accounts (Facebook, Twitter, Myspace, MSN, Match.com, etc.)?: Every day 150 (62.0 %); Once a week 35 (14.5 %); Once a month 8 (3.3 %); 2–3 times a month 10 (4.1 %); Every couple months 14 (5.8 %); Once a year 4 (1.7 %); Never 21 (8.7 %)
Have you ever clicked on an advertisement and a window popped up saying something along the lines of "Congratulations, you are eligible to win an iPad!"?: Yes 122 (50.4 %); No 120 (49.6 %)
Have you ever clicked on a link in a suspicious email (e.g., an email in a different language, with an absurd subject)?: Yes 32 (13.2 %); No 210 (86.8 %)

In addition, endorsement of safe versus risky resolutions to the cyber dilemmas varied by respondents' age, regardless of the primed recall of a friend's prior experience. Middle-aged and older respondents were more likely than younger respondents to endorse the safe option. Research on age differences in the cyber security domain is inconsistent with respect to privacy (Hoofnagle et al. 2010), risk of data loss from a cyber threat (Howe et al. 2012, "The psychology of security for the home computer user," in Proceedings of the 2012 IEEE Symposium on Security and Privacy), and fear of a cyber threat
(Alshalan 2006). Our findings suggest that younger individuals' extensive use of and dependence on computers for daily activities may lead them to associate a greater cost with being risk averse in response to cyber dilemmas. Younger individuals' familiarity with computers likely makes it easier for them to determine whether a cyber dilemma is a real threat or a computer's standard warning message. In the same vein, their familiarity with computers may also make them more aware that a major cyber dilemma is a small-probability event whose consequences are likely to be repairable. Ultimately, younger individuals do not perceive the unsafe option as overly risky compared with the safe option. Respondents' income was also found to moderate the effect of the primed recall of a friend's prior experience on respondents' endorsement of safe versus risky options. Of the three income levels, the wealthiest respondents were the least sensitive to variations in the
primed recall of a friend's prior cyber experience. In the cyber security literature, only a significant main effect for income has been reported. In a 2001 presentation at the IACIS Conference, Tyler, Zhang, Southern and Joiner reported findings suggesting that higher-income individuals have a lower probability of considering e-commerce safe and therefore avoid e-commerce transactions. Similarly, in a study by Downs et al. (2008), respondents from more affluent areas were reported to update their anti-virus programs more frequently than respondents from poorer areas, further supporting the tendency toward risk-averse cyber behavior among higher-income individuals. Our finding suggests that wealthier respondents were less affected by the primed prior recall manipulation than low- and medium-income respondents because they can afford to take more risk. Their wealth allows them access to enhanced
baseline security measures. This creates a sense that they are exempt from risks that apply to others and, for this reason, need not pay much attention to the primed prior recall experiences and their consequences. Interestingly, there were no significant main effects or interactions for the remaining four individual difference variables: sex, education, work domain, and previous cyber victimization. The absence of main effects for five of the six individual difference variables suggests that respondents' cyber dilemma decisions are determined more by recall of prior cyber-related experiences than by the background of the decision maker, with the sole exception of respondent age. The absence of interaction effects for five of the six individual difference variables suggests that the effect of primed recall of a prior experience is robust; respondent income was the sole moderator identified.

4 Conclusion
Experiments I and II were designed to explore how computer users' responses to common cyber dilemmas are influenced by framing and by the salience of prior cyber experiences. Despite using two different dependent variables, the advice the respondent would give to a friend (Experiment I) and how respondents themselves would respond to cyber dilemmas (Experiment II), the two questions elicited similar levels of risk aversion. For prior near-miss experiences (the one manipulation condition included in both experiments), the mean responses were 2.39 and 2.97 for Experiments I and II, respectively. This finding suggests that whether the respondent was providing advice to a friend or making a personal choice, the recalled experience manipulation significantly influenced endorsement of the safer cyber option. Similarly, in prior cyber research, Aytes and Connolly (2004)
found that students were more attuned to cyber risks and more likely to take action against them when the primary source of information was their own or friends' experiences with security problems. The one inconsistent finding between the two experiments is the effect of respondent sex on risky cyber choice behavior. In Experiment I, females were found to be more risk averse than males, while in Experiment II, sex was unrelated to whether respondents endorsed a risky or safe option. Previous studies are also inconsistent with respect to the role of sex in predicting cyber-related behavior and decision making. At the 2012 Annual Conference of the Society for Industrial and Organizational Psychology, Byrne et al. reported that women gave slightly higher ratings of behavioral intention to click on a risky cyber link, while Milne et al. (2009) found that males had a greater tendency to engage in risky behaviors online. In the context of security compliance, Downs et al. (2008)
reported that males were more involved in computer security management, such as updating their anti-virus software and using pop-up blockers, while Herath and Rao (2009) found that women had higher security procedure compliance intentions but were less likely to act on them. One explanation for our inconsistent results related to sex may be differences in the two populations sampled: college students in Experiment I and a more diverse AMT sample in Experiment II. College samples tend to be more sex stereotyped, such that risk tends to be judged lower by men than by women, and females tend to have a stronger desire to take preventative and preparedness measures (Harris et al. 2006).
This tends to be attributed to their lack of real-world experience; as evidence, only a small percentage of the college sample (24 %) had previously experienced a cyber dilemma. Under these assumptions, males would be expected to be more risk seeking than females in Experiment I. Conversely, the AMT sample consists of older adults with more diverse backgrounds, as shown in Table 5, which tends to blur the line between traditional male and female stereotypes. In addition, 80 % of the AMT sample had previously experienced a cyber dilemma, further suggesting that the shared experiences of males and females could account for the lack of sex differences found in Experiment II. Overall, these two experiments indicate that recall of prior cyber experiences and framing jointly and strongly influence individual decision making in response to cyber dilemmas. The implication of our findings is that salience of prior negative experiences attenuates risky cyber behavior; this attenuation is greater for gain-framed decisions and for low-
and middle-income respondents. Responses to cyber dilemmas were determined more by proximal variables, such as recall of prior experiences and framing, and were largely robust to individual difference variables, with only a couple of exceptions. Given that safety in the cyber context is an abstract concept, it would be worthwhile to explore further how framing influences cyber dilemma decision making. Additionally, this research design could be used to evaluate differences across cyber dilemma contexts to examine the robustness of the relationships identified in our research. Such further research is warranted to better understand how individual users respond to cyber dilemmas. This information would be useful to cyber security policymakers faced with the task of designing better security systems, including computer displays and warning messages relevant to cyber dilemmas.

Acknowledgments This research was supported by the U.S.
Department of Homeland Security (DHS) through the National Center for Risk and Economic Analysis of Terrorism Events. However, any opinions, findings, conclusions, and recommendations in this article are those of the authors and do not necessarily reflect the views of DHS. We would like to thank Society for Risk Analysis (SRA) conference attendees for their feedback on this work at a session of the 2012 SRA Annual Meeting in San Francisco. We would also like to thank the blind reviewers for their time and comments, which were extremely valuable in developing this paper.

References

Acquisti A, Grossklags J (2007) What can behavioral economics teach us about privacy? In: Acquisti A, Gritzalis S, Lambrinoudakis C, Vimercati S (eds) Digital privacy: theory, technologies and practices. Auerbach Publications, Florida, pp 363–377
Alshalan A (2006) Cyber-crime fear and victimization: an analysis of a national survey. Dissertation, Mississippi State University
Aytes K, Connolly T (2004) Computer security and risky computing practices: a rational choice perspective. J Organ End User Comput 16:22–40
Barnes LR, Gruntfest EC, Hayden MH, Schultz DM, Benight C (2007) False alarms and close calls: a conceptual model of warning accuracy. Weather Forecast 22:1140–1147
Bateman JM, Edwards B (2002) Gender and evacuation: a closer look at why women are more likely to evacuate for hurricanes. Nat Hazard Rev 3:107–117
Bourque LB, Regan R, Kelley MM, Wood MM, Kano M, Mileti DS (2012) An examination of the effect of perceived risk on preparedness behavior. Environ Behav 45:615–649
Breznitz S (2013) Cry wolf: the psychology of false alarms. Psychology Press, Florida
Buhrmester M, Kwang T, Gosling SD (2011) Amazon's Mechanical Turk: a new source of inexpensive, yet high-quality, data? Perspect Psychol Sci 6:3–5
Cameron L, Shah M (2012) Risk-taking behavior in the wake of natural disasters. IZA Discussion Paper No. 6756. http://ssrn.com/abstract=2157898
Dillon RL, Tinsley CH, Cronin M (2011) Why near-miss events can decrease an individual's protective response to hurricanes. Risk Anal 31:440–449
Donner WR, Rodriguez H, Diaz W (2012) Tornado warnings in three southern states: a qualitative analysis of public response patterns. J Homel Secur Emerg Manage 9:1547–7355
Dow K, Cutter SL (1998) Crying wolf: repeat responses to hurricane evacuation orders. Coast Manage 26:237–252
Downs DM, Ademaj I, Schuck AM (2008) Internet security: who is leaving the 'virtual door' open and why? First Monday 14. doi:10.5210/fm.v14i1.2251
Flynn J, Slovic P, Mertz CK (1994) Gender, race, and perception of environmental health risks. Risk Anal 14:1101–1108
Garg V, Camp J (2013) Heuristics and biases: implications for security design. IEEE Technol Soc Mag 32:73–79
Harris C, Jenkins M, Glaser D (2006) Gender differences in risk assessment: why do women take fewer risks than men? Judgm Decis Mak 1:48–63
Helander MG, Khalid HM (2000) Modeling the customer in electronic commerce. Appl Ergon 31:609–619
Herath T, Rao HR (2009) Encouraging information security behaviors in organizations: role of penalties, pressures and perceived effectiveness. Decis Support Syst 47:154–165
Ho MC, Shaw D, Lin S, Chiu YC (2008) How do disaster characteristics influence risk perception? Risk Anal 28:635–643
Hoofnagle C, King J, Li S, Turow J (2010) How different are young adults from older adults when it comes to information privacy attitudes and policies? April 14, 2010. http://ssrn.com/abstract=1589864
Kahneman D, Tversky A (1979) Prospect theory: an analysis of decision under risk. Econometrica 47:263–291
Kung YW, Chen SH (2012) Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience. Risk Anal 32:1535–1546
Kunreuther H, Pauly M (2004) Neglecting disaster: why don't people insure against large losses? J Risk Uncertain 28:5–21
Mason W, Suri S (2012) Conducting behavioral research on Amazon's Mechanical Turk. Behav Res Methods 44:1–23
Milne GR, Labrecque LI, Cromer C (2009) Toward an understanding of the online consumer's risky behavior and protection practices. J Consum Aff 43:449–473
Paolacci G, Chandler J, Ipeirotis P (2010) Running experiments on Amazon Mechanical Turk. Judgm Decis Mak 5:411–419
Shankar V, Urban GL, Sultan F (2002) Online trust: a stakeholder perspective, concepts, implications, and future directions. J Strateg Inf Syst 11:325–344
Siegrist M, Gutscher H (2008) Natural hazards and motivation for mitigation behavior: people cannot predict the affect evoked by a severe flood. Risk Anal 28:771–778
Simmons KM, Sutter D (2009) False alarms, tornado warnings, and tornado casualties. Weather Clim Soc 1:38–53
Slovic P, Peters E, Finucane ML, MacGregor DG (2005) Affect, risk, and decision making. Health Psychol 24:S35–S40
Tinsley CH, Dillon RL, Cronin MA (2012) How near-miss events amplify or attenuate risky decision making. Manage Sci 58:1596–1613
Tversky A, Kahneman D (1986) Rational choice and the framing of decisions. J Bus 59:S251–S278
Verendel V (2008) A prospect theory approach to security. Technical Report No. 08-20. Department of Computer Science and Engineering, Chalmers University of Technology/Goteborg University, Sweden. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.154.9098&rep=rep1&type=pdf
Leveraging behavioral science to mitigate cyber security risk

Shari Lawrence Pfleeger (Institute for Information Infrastructure Protection, Dartmouth College, 4519 Davenport St., NW, Washington, DC 20016, USA) and Deanna D. Caputo (MITRE Corporation, 7515 Colshire Drive, McLean, VA 22102-7539, USA)

Computers & Security 31 (2012) 597–611. Journal homepage: www.elsevier.com/locate/cose. Received 16 August 2011; received in revised form 21 November 2011; accepted 22 December 2011.

Keywords: cyber security,
cognitive load, bias, heuristics, risk communication, health models

Abstract

Most efforts to improve cyber security focus primarily on incorporating new technological approaches in products and processes. However, a key element of improvement involves acknowledging the importance of human behavior when designing, building and using cyber security technology. In this survey paper, we describe why incorporating an understanding of human behavior into cyber security products and processes can lead to more effective technology. We present two examples: the first demonstrates how leveraging behavioral science leads to clear improvements, and the other illustrates how behavioral
science offers the potential for significant increases in the effectiveness of cyber security. Based on feedback collected from practitioners in preliminary interviews, we narrow our focus to two important behavioral aspects: cognitive load and bias. Next, we identify proven and potential behavioral science findings that have cyber security relevance, not only related to cognitive load and bias but also to heuristics and behavioral science models. We conclude by suggesting several next steps for incorporating behavioral science findings in our technological design, development and use. © 2012 Published by Elsevier Ltd.

1. Introduction

"Only amateurs attack machines; professionals target people." (Schneier, 2000)

What is the best way to deal with cyber attacks? Cyber security promises protection and prevention, using both innovative technology and an understanding of the human user. Which aspects of human behavior offer the most
promise in making cyber security processes and products more effective? What role should education and training play? How can we encourage good security practices without unnecessarily interrupting or annoying users? How can we create a cyber environment that provides users with all of the functionality they need without compromising enterprise or national security? We investigate the answers to these questions by examining the behavioral science literature to identify behavioral science theories and research findings that have the potential to improve cyber security and reduce risk. In this paper, we report on our initial findings, describe several behavioral science areas that offer particularly useful applications to security, and describe how to use them in a general risk-reduction process. The remainder of this paper is organized in five sections. Section 2 describes some of the problems that a technology-alone solution cannot address. Section 3 explains how we used a set of scenarios to elicit suggestions about the
behaviors of most concern to technology designers and users. Sections 4 and 5 highlight several areas of behavioral science with demonstrated and potential relevance to security technology. Finally, Section 6 suggests possible next steps toward inclusion of behavioral science in security technology's design, construction and use.

2. Why technology alone is not enough
The media frequently express the private sector's concern about liability for cyber attacks and its eagerness to minimize risk. The public sector has similar concerns, because aspects of everyday life (such as operation and defense of critical infrastructure, protection of national security information, and operation of financial markets) involve both government regulation and private sector administration (see, for example, http://www.cbsnews.com/video/watch/?id=5578986n&tag=related;photovideo). The government's concern is warranted: the Consumer's Union found that government was the source of one-fifth of the publicly reported data breaches between 2005 and mid-2008 (Consumer's Union, 2008). The changing nature of both technology and the threat environment makes the risks to information and infrastructure difficult to anticipate and quantify. Problems of appropriate response to cyber incidents are exacerbated when security technology is perceived as an obstacle to the user. The user may be overwhelmed by difficulties in security implementation, or may mistrust, misinterpret or override the security. A recent study of users at Virginia
Tech illustrates the problem (Virginia Tech, 2011). Bellanger et al. examined user attitudes and the "resistance behavior" of individuals faced with a mandatory password change. The researchers found that, even when passwords were changed as required, the changes were intentionally delayed and the request was perceived as an unnecessary interruption. "People are conscious that a password breach can have severe consequences, but it does not affect their attitude toward the security policy implementation." Moreover, "the more technical competence respondents have, the less they favor the policy enhancement. … In a voluntary implementation, that competence may be a vector of pride and accomplishment. In a mandatory context, the individual may feel her competence challenged, triggering a negative attitude toward the process." In the past, solutions to these problems have ranged from strict, technology-based control of computer-based human behavior (often with inconsistent or sometimes rigid enforcement) to comprehensive education and training of
system developers and users. Neither extreme has been particularly successful, but recent studies suggest that a blending of the two can lead to effective results. For example, the U.K. Office for Standards in Education, Children's Services and Skills (Ofsted) evaluated the safety of online behavior at 35 representative schools across the U.K. "Where the provision for e-safety was outstanding, the schools had managed rather than locked down systems. In the best practice seen, pupils were helped, from a very early age, to assess the risk of accessing sites and therefore gradually to acquire skills which would help them adopt safe practices even when they were not supervised." (Ofsted, 2010) In other words, the most successful security behaviors were exhibited in schools where students were taught appropriate behaviors and then trusted to behave responsibly. The Ofsted report likens the approach to teaching children how to cross the road safely, rather than relying on adults to accompany the children across the road each time.
This approach is at the core of our research. Our overarching hypothesis is that, if humans using computer systems are given the tools and information they need, taught the meaning of responsible use, and then trusted to behave appropriately with respect to cyber security, desired outcomes may be obtained without security's being perceived as onerous or burdensome. By both understanding the role of human behavior and leveraging behavioral science findings, the designers, developers and maintainers of information infrastructure can address real and perceived obstacles to productivity and provide more effective security. These behavioral changes take time, so plans for initiating change should include sufficient time to propose the change, implement it, and have it become part of the culture or common practice. Other evidence (Predd et al., 2008; Pfleeger et al., 2010) is beginning to emerge that points to the importance of understanding human behaviors when developing and providing cyber security (see also the First Interdisciplinary Workshop on Security and Human Behavior, http://www.cl.cam.ac.uk/~rja14/shb08.html). There is particular interest in using trust to
mitigate risk, especially online. For example, the European Union funded a several-year, multi-disciplinary project on online trust (iTrust; workshop papers at http://www.informatik.uni-trier.de/~ley/db/conf/itrust/itrust2006.html), documenting the many ways that trust can be created and broken. Now, frameworks are being developed for analyzing the degree to which trust is built and maintained in computer applications (Riegelsberger et al., 2005). More broadly, a rich and relevant behavioral science literature addresses critical security problems, such as employee deviance, employee compliance, effective decision-making, and the degree to which emotions (Lerner and Tiedens, 2006) or stressful conditions (Klein and Salas, 2001) can lead to riskier choices by decision-makers. (The National Science Foundation is also interested in the connections between social science and cyber security; its Secure and Trustworthy Cyberspace program encourages computer scientists and social scientists to work together, http://www.nsf.gov/pubs/2012/nsf12503/nsf12503.htm.) At the same time, there is much evidence that technological advances can have unintended consequences that reduce trust or increase risk (Tenner, 1991). For these reasons, we conclude that it is important to include the human element when designing, building and using critical systems. To understand how to design and build systems that
encourage users to act responsibly when using them, we identified two types of behavioral science findings: those that have already been shown to demonstrate a welcome effect on cyber security implementation and use, and those with potential to have such an effect. In the first case, we documented the relevant findings, so that practitioners and researchers can
determine which approaches are most applicable to their environment. In the second case, we are designing a series of studies to test promising behavioral science results in a cyber security setting, with the goal of determining which results (with associated strategies for reducing or mitigating the behavioral problems they reflect) are the most effective. However, applying behavioral science findings to cyber security problems is an enormous undertaking. To maximize the likely effectiveness of outcomes, we used a set of interviews to elicit practitioners' opinions about behaviors of concern, so that we could focus on those perceived as most
significant. We describe the interviews and results in Section 3. These findings suggest hypotheses about the role of behavior in addressing cyber security issues.

3. Identifying behavioral aspects of security

Designers and developers of security technology can leverage what is known about people and their perceptions to provide more effective security. A former Israeli airport security chief said, "I say technology should support people. And it should be skilled people at the center of our security concept rather than the other way around" (Amos, 2010). To implement this kind of human-centered security, technologists must understand the behavioral sciences as they design, develop and use technology. However, translating behavioral results to a technological environment can be a difficult process. For example, system designers must address the human elements obscured by computer mediation. Consumers making a purchase online trust that the merchant represented by the website is not simply taking
their money, but is also fulfilling its obligation to provide goods in return. The consumer infers the human involvement of the online merchant behind the scenes. Thus, at some level, the buyer and seller are humans enacting a transaction enabled by a system designed, developed and maintained by humans. There may be neither actual human contact nor direct knowledge of the other human actors involved, but the transaction process reflects its human counterpart. Preventing or mitigating adverse cyber security incidents requires action at many stages: designing the technology being incorporated in the infrastructure; implementing, testing and maintaining the technology; and using the technology to provide essential products and services. Behavioral science has addressed notions of cyber security in these activities for many years. Indeed, Sasse and Flechais (2005) note that secure systems are socio-technical systems in which we should use an understanding of behavioral science to "prevent users from being the 'weakest link.'" For example,
some behavioral scientists have investigated how trust mechanisms affect cyber security. Others have reported findings related to the design and use of cyber systems, but the relevance and degree of effect have not yet been tested. Some of the linkage between behavioral science and security is specific to certain kinds of systems. For example, Castelfranchi and Falcone (1998, 2002) analyze trust in multi-agent systems from a behavioral perspective. They view trust as having several components, including beliefs that must be held to develop trust (the social context, as described by Riegelsberger et al. (2003)) and relationships to previous experience (the temporal context of the Riegelsberger-Sasse-McCarthy framework). They use psychological factors to model trust in multi-agent systems. In addition to social and temporal concerns, we add expectations of fulfillment, where someone trusting someone or something else expects something in return (Baier, 1986). This behavioral research sheds light on the nature of a user's expectation and on
perceived trustworthiness of technology-mediated interactions, and it has important implications for the design of protective systems and processes. Sasse and Flechais (2005) view security from three distinct perspectives: product, process and panorama:

- Product. This perspective includes the effect of the security controls, such as the policies and mechanisms, on stakeholders (e.g., designers, developers, users). The controls involve requirements affecting physical and mental workload, behavior, and cost (human and financial). Users trust the product to maintain security while getting the primary task done.
- Process. This aspect addresses how security decisions are made, especially in early stages of requirements-gathering and design. The process should allow the security mechanisms to be "an integral part of the design and development of the system, rather than being 'added on'" (Sasse and Flechais, 2005). Because "mechanisms that are not employed in practice, or that are used incorrectly, provide
little or no protection," designers must consider the implications of each mechanism on workload, behavior and workflow (Sasse and Flechais, 2005). From this perspective, the stakeholders must trust the process to enable them to make appropriate and effective decisions, particularly about their primary tasks.
- Panorama. This aspect describes the context in which the security operates. Because security is usually not the primary task, users are likely to "look for shortcuts and workarounds, especially when users do not understand why their behavior compromises security. … A positive security culture, based on a shared understanding of the importance of security … is the key to achieving desired behavior" (Sasse and Flechais, 2005). From this perspective, the user views security mechanisms as essential even when they seem intrusive, limiting, or counterproductive.

3.1. Scenario creation

Because the infrastructure types and threats are vast, we used interview results to narrow our investigation to those
behavioral science areas with demonstrated or likely potential to enhance an actor's confidence in using any information infrastructure. To guide our interviews, we worked with two dozen U.S. government and industry employees familiar with information infrastructure protection issues to define three threat scenarios relevant to protecting the information infrastructure. The methodology and resulting analyses were conducted by the paper's first author and involved five steps:
  • 183. traying a cyber security problem for which a solution would be welcomed by industry and government. � Identifying people with decision making authority about cyber security products and usage to interview about the scenarios. We identified people from industry and government who were willing to participate in interviews. � Conducting interviews. Our discussions focused on two questions: Are these scenarios realistic, and how could the cyber security in each situation be improved? � Analyzing the results and their implications. We analyzed the results of these interviews and their implications for our research. 3.1.1. Scenario 1: improving security awareness among builders of information infrastructure Security is rarely the primary task of those who use the information infrastructure. Typically, users seek information, analyze relationships, produce documents, and perform tasks that help them understand situations and take action. Simi- larly, system developers often focus on these primary tasks before incorporating security into an architecture or design.
  • 184. Moreover, system developers often implement security requirements by choosing security mechanisms that are easy to build and test or that meet some other technical system objective (e.g., reliability). Developers rarely take into account the usability of the mechanism or the additional cognitive load it places on the user. Scenario 1 describes ways to improve security awareness among system builders so that security is more likely to be useful and effective. Suppose software engineers are designing and building a system to support the creation and transmission of sensitive documents among members of an organization. Many aspects of document creation and transmission are well known, but security mechanisms for evaluating sensitivity, labeling documents appropriately and transmitting documents securely have presented difficulties for many years. In our scenario, software engineers are tasked to design a system that solicits information from document creators, modifiers and readers, so that a trust designation can be assigned to
each document. Security issues include understanding the types of trust-related information needed, determining the role of a changing threat environment, and defining the frequency at which the trust information should be refreshed and re-evaluated (particularly in light of cyber security incidents that may occur during the life of the document). In addition, the software engineers must implement some type of summary trust designation that will have meaning to document creators, modifiers and readers alike. This trust designation, different from the classification of document sensitivity, represents the degree to which both the content and provider (or modifier) can be trusted and for how long. For example, a document about a nation's emerging military capability may be highly classified (that is, highly sensitive), regardless of whether the information provider is highly trusted (because, for example, he has repeatedly provided highly useful information in the past) or not (because, for example, he frequently provides incorrect or misleading information).
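As one way to picture such a summary trust designation in code, here is a minimal sketch in Python; the fields, scales, and refresh rule are illustrative assumptions, not a design taken from the scenario.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class TrustDesignation:
    """Illustrative summary trust label for a document (hypothetical design)."""
    content_trust: float    # 0.0-1.0: confidence in the content itself
    provider_trust: float   # 0.0-1.0: track record of the creator or modifier
    assigned_at: datetime   # when the designation was last evaluated
    valid_for: timedelta    # how long it holds before re-evaluation is due

    def needs_refresh(self, now: datetime) -> bool:
        # The designation must be re-evaluated once it goes stale; a real
        # system would also force a refresh after a cyber security incident.
        return now >= self.assigned_at + self.valid_for
```

Note that this designation is deliberately separate from the document's sensitivity classification, matching the distinction drawn above.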
There are two important aspects of the software engineers' security awareness. First, they must be able to select security mechanisms for implementing the trust designation that allow them to balance security with performance and usability requirements. This balancing entails appreciating and accommodating the role of security in the larger context of the system's intended purpose and multiple uses. Second, the users must be able to trust that the appropriate security mechanism is chosen. Trust means that the mechanism itself must be appropriate to the task. For example, the Biba Integrity Model (Biba, 1977), a system of computer security policies expressed as access control rules, is designed to ensure data integrity. The model defines a hierarchy of integrity levels, and then prevents participants from corrupting data of an integrity level higher than the subject, or from being corrupted by data from a level lower than the subject. The Biba model was developed to extend the Bell and La Padula (1973) model, which addresses only data confidentiality.
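To make the cited access rules concrete, here is a minimal sketch of the two Biba checks, assuming plain integer integrity levels where larger means higher integrity; the function names are illustrative, not taken from Biba (1977).

```python
# Biba integrity rules: a subject may not read data of lower integrity
# ("no read down") and may not write data of higher integrity
# ("no write up").

def biba_can_read(subject_level: int, object_level: int) -> bool:
    # Reading lower-integrity data could corrupt the subject.
    return object_level >= subject_level

def biba_can_write(subject_level: int, object_level: int) -> bool:
    # Writing to higher-integrity data could corrupt the object.
    return object_level <= subject_level
```

Bell and La Padula's confidentiality rules are the dual: no read up, no write down.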
Thus, understanding and choice of policies and mechanisms are important aspects in which we trust software engineers to exercise discretion. In addition, software engineers must be able to trust the provenance, correctness and conformance to expectations of the security mechanisms. Here, "provenance" means not only the applicability of the mechanisms and algorithms but also the source of architectural or implementation modules. With the availability of open source modules and product line architectures (see, for example, Clements and Northrup, 2001), it is likely that some parts of some security mechanisms will have been built for a different purpose, often by a different team of engineers.
  • 188. emergency is unfolding, the people and systems involved in watching it unfold must determine what has already happened, what is currently happening, and what is likely to happen in the future; then, they make recommendations for reaction based on their situational awareness. The people or systems perceiving the situation have varying degrees of trust in the information they gather and in the providers of that information. When a cyber event is unfolding, information can come from primary sources (such as sensors in process control systems or measurements of network activity) and secondary sources (such as human or automated interpreters of trends). Consider analysts using a computer system that monitors the network of power systems around the United States. The system itself interacts with a network of systems, each of which collects and analyzes data about power generation and distribution stations and their access points. The analysts http://dx.doi.org/10.1016/j.cose.2011.12.010 http://dx.doi.org/10.1016/j.cose.2011.12.010
notice a series of network failures around the country: first, a power station in California fails, then one in Missouri, and so on during the first few hours of the event. (Indeed, at this stage it may not be clear that the event is actually a cyber event; a similar event with similar characteristics occurred in the United States on August 14, 2003. See http://www.cnn.com/2003/US/08/14/power.outage/index.html.) The analysts must determine not only what is really unfolding but also how to respond appropriately. Security and human behavior are involved in many ways. First, the analyst must know whether to trust the information being reported to her monitoring system. For example, is the analyst viewing a failure in the access point or in the monitoring system? Next, the analyst must be able to know when and whether she has enough information to make a decision about which reactions are appropriate. This decision must be made in the context of an evolving situation, where some evidence at first considered trustworthy is eventually determined not to be (and vice versa). Finally, the analyst must analyze the data being reported, form hypotheses about possible causes, and then determine which interpretation of the data to use. For instance, is the sequence of failures the result of incorrect data transmission, a cyber
attack, random system failures, or simply the various power companies' having purchased some of their software from the same vendor (whose system is now failing)? Choosing the wrong interpretation can have serious consequences.
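One way to make the analyst's weighing of these competing interpretations concrete is a simple Bayesian update over the candidate causes; the priors and likelihoods below are invented solely for illustration.

```python
# Hypothetical posterior over the four interpretations named above,
# given the observed pattern of scattered, sequential station failures.
priors = {
    "data transmission error": 0.30,
    "cyber attack":            0.10,
    "random failures":         0.40,
    "shared vendor defect":    0.20,
}
likelihoods = {  # P(observed pattern | cause), invented numbers
    "data transmission error": 0.05,
    "cyber attack":            0.60,
    "random failures":         0.02,
    "shared vendor defect":    0.50,
}

evidence = sum(priors[c] * likelihoods[c] for c in priors)
posteriors = {c: priors[c] * likelihoods[c] / evidence for c in priors}
for cause, p in sorted(posteriors.items(), key=lambda kv: -kv[1]):
    print(f"{cause}: {p:.2f}")
```

As the scenario warns, the resulting ranking is only as good as the assumed likelihoods, and fresh evidence should trigger a new update.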
3.1.3. Scenario 3: supporting decisions about trustworthiness of network transactions

On Christmas Day, 2009, a Nigerian student flying from Amsterdam to Detroit attempted to detonate a bomb to destroy the plane. Fortunately, the bomb did little damage, and passengers prevented the student from completing his intended task. However, in analyzing why the student was not detected by a variety of airport security screens, it was determined that important information was never presented to the appropriate decision-makers (Baker and Hulse, 2009). This situation forms the core of Scenario 3, where a system queries an interconnected set of databases to find information about a person or situation. In this scenario, an analyst uses an interface to a collection of data repositories, each of which contains information about crime and terrorism. When the analyst receives a warning about a particular person of interest, she must query the repositories to determine what is known about that person. There are many security issues related to this scenario. First, the analyst must determine the degree to which she can trust that all of the relevant information resides in at least one of the connected repositories. After the Christmas bombing attempt, it was revealed that the U.K. had denied a visa request by the student, but information about the denial was not available to the Transportation Security Administration when decisions were made about whether to subject the student to extra security screening. Spira (2010) points out that the problem is not the number of databases; it is the lack of ability to search the entire "federation" of databases. Next, even if the relevant items are found, the most important ones must be visible at the appropriate time. Libicki and Pfleeger (2004) have documented the difficulties in
  • 192. tics occurred on August 14, 2003, in the United States. See http:// www.cnn.com/2003/US/08/14/power.outage/index.html. “collecting the dots” before an analyst can take the next step to connect them. If a “dot” is not as visible as it should be, it can be overlooked or given insufficient attention during subsequent analysis. Moreover, Spira (2010) highlights the need for viewing the information in its appropriate context. Third, the analyst must also determine the degree to which each piece of relevant information can be trusted. That is, not only must she know the accuracy and timeliness of each data item, but she also must determine whether the data source itself can be trusted. There are several aspects to this latter degree of trust, such as knowing how frequently the data source provides the information (that is, whether it is old news), knowing whether the data source is trustworthy enough, and whether circumstances may change the source’s trustworthiness. For example, Predd et al. (2008) and Pfleeger et al. (2010) point out the varying types of people with legiti- mate access to systems taking unwelcome action. A trust-
Finally, the analyst must also determine the degree to which the analysis is correct. Any analysis involves assumptions about variables and their importance, as well as the relationships among dependent and independent variables. Many times, it is a faulty assumption that leads to failure, rather than faulty data.

3.2. Analysis of results

The three scenarios were intriguing to our interviewees, and all agreed that they were realistic, relevant and important. However, having the interviewees scrutinize the scenarios revealed fewer behavioral insights than we had hoped. In each case, the interviewee viewed each scenario from his or her
particular perspective, highlighting only a small portion of the scenario to confirm an opinion he or she held. For example, one interviewee used Scenario 3 to emphasize the need for information sharing; another said that privacy is a key concern, especially in situations like Scenario 2 where significant monitoring must be balanced with protecting privacy. Nevertheless, many of the interviewees had good suggestions for shaping the way forward. For instance, one said that there is much to be learned from command and control algorithms, where military actors have learned to deal with risk perception, uncertainty, incomplete information, and the need to make an important decision under extreme pressure. There is a rich literature addressing decision-making under pressure, from Ellsberg (1964) through Klein (Klein, 1998, 2009). In particular, Klein's models of adaptive decision-making may be applicable (Klein and Calderwood, 1991; Klein and Salas, 2001). While the scenario methodology was not a structured idea-generation approach, to the extent
possible, we endeavored to be unbiased in our interpretation of interviewee responses. We were not trying to gather support for preconceived ideas and were genuinely trying to explore new ideas where behavioral science could be leveraged to address security issues. There were several messages that emerged from the interviews:

- Security is intertwined with the way humans behave when trying to meet a goal or perform a task. The separation of primary task from secondary, as well as its impact on user behavior, was first clearly expressed in Smith et al. (1997) and elaborated in the security realm by Sasse et al. (2002). Our interviews reconfirmed that, in most instances, security is secondary to a user's primary task (e.g., finding a piece of information, processing a transaction, making a decision).