The human mind evolved to draw quick conclusions for survival. Behavioral economists like Daniel Kahneman and Dan Ariely are publishing research on when, why and how decision making can be consistently and predictably irrational. You could say these researchers are reverse engineering the wetware, finding bugs and race conditions and disclosing them.

People are key to an organization’s information security, even if you believe in the “people, processes and technology” tripod. People define and execute processes. People decide funding for, implement, operate and/or monitor the technology. Your adversaries are people. At least until we reach the AI singularity, that is.

Until then, the aim of this talk is to present some of the counter-intuitive findings of behavioral economics research and their implications for how information security is handled at the organizational and market levels. Our hope is that the audience will find they could benefit from changing established, seemingly sensible and logical practices to better match how the wetware actually works.
Presented at BSides SF on Feb. 28th, 2016.
Reverse Engineering the Wetware: Understanding Human Behavior to Improve Information Security
1. Reverse Engineering
the Wetware
Understanding Human Behavior to Improve
Information Security
#retww
Alexandre Sieira
@AlexandreSieira
CTO @NiddelCorp
Principal @MLSecProject
Matt Hathaway
@TheWay99
Products Leader @Rapid7
2. Let’s Start On A Very Serious Note
Because we are very serious people. Seriously.
3. People… are not rational.
More evidence is available at @awardsdarwin if you’re still not convinced.
4. The People-Process-Technology Triad
• People define and execute
processes.
• People decide funding, implement,
operate and/or monitor the
technology.
• Your adversaries are people.
• People are key.
5. Yes, people. Really.
• A lot of humans are needed to keep any organization secure
• Executive teams – funding (or lack thereof)
• Security teams – build the process, use the technology
• IT teams – implement controls, operate infrastructure
• Security vendors – ummm… perfect people, of course, right?
• The end users – your organization wouldn’t exist without them
• Blame game achieves nothing
• Executives don’t get it.
• It’s the shitty tech, not us.
• It’s the shitty team, not the tech.
• I took that dumb training, but I want my bonus.
6. Why do Humans Think Like They Do?
• Wetware evolved to meet survival
requirements of hunter-gatherer lifestyle and
threat model:
• Low latency, near-real-time decisions;
• High aversion to loss;
• False positives have much lower cost than false
negatives;
• Social living in small communities where it is easy to
keep track of reputations.
... so not really surprising to find heuristics don’t
work as well now.
7. Why do Humans Think Like They Do?
From “The Art of Thinking Security Clearly” RSAC USA 2015 presentation by @apbarros
System 1 operates automatically and quickly, with little or no
effort, no sense of voluntary control and no self-awareness.
“System 1 runs the show, that’s the one you want to move.” –
Daniel Kahneman
System 2 allocates attention to the effortful mental activities that
demand it, including complex computations.
Its operations are often associated with the subjective experience of
agency, choice, and concentration.
9. You Are Not Above Average At Everything
• Ever heard of this town? Yeah, me
neither.
• “…all the women are strong, all the
men are good-looking, and all the
children are above average.” –
Garrison Keillor
• I’m a better than average X
• I am better than the sysadmin
who got pwned at RSA.
10. Experts Who Are Actually Amateurs
• Successful lawyers of [major law firm]: I spend all day avoiding traps.
No phishing email is going to work on me.
• Some silly finance team wired $1 million to help the CFO close a last-
minute deal in China. How could they fall for that?
12. Experts Who Feel Like Amateurs
• Can’t get through all your email?
• In awe of a colleague’s
knowledge?
• Hungover and can’t focus?
• Feel underqualified and at risk of
being exposed at any moment?
• Recommended reading: @sroberts “Imposter Syndrome in DFIR” blog
13. “Expert” Reality
• If you don’t think you can be
outwitted, you probably already
are
• If you don’t ever feel like an
imposter, you’re not challenging
yourself to get better
14. Professional Certifications
• Can be objectively useful as signals and
filters in professionals / organizations
market (information asymmetry);
• Getting a certificate causes you to
overestimate your own expertise;
H/T to @fsmontenegro
• Positive bias to vendor and/or other certificate holders:
• “If I went through the effort of certifying on that vendor’s product and I consider myself a
good person, then that vendor must be good too”;
• Endowment effect;
• Might impact vendor selection and/or hiring processes.
15. More Tired == Less Rational
• Tired people are less likely to make rational
decisions:
• Less oversight from system 2;
• Less capacity to avoid and resist temptation.
• Think IT maintenance windows and DFIR:
• Mandatory down time for people involved?
• Follow the sun teams working on their own
mornings?
• References:
• https://hbr.org/2016/02/dont-make-important-
decisions-late-in-the-day
• http://sloanreview.mit.edu/article/why-sleep-is-a-
strategic-resource/
16. You Will Be Judged Unfairly
• Think the team was at fault for
Target/Neiman Marcus breaches?
• February 2014:
• “Hackers Set Off 60,000 Alerts While
Bagging Credit Card Data”
• December 2014:
• Lawsuit not dismissed because of
“disabling certain security features”
17. Your Heart Is Not Scientific
• Always had a “gut feeling” about
something you couldn’t prove?
• Know in your heart that every time
you got sick, you waited too long?
• Maybe you have a specific event
which always seems to reveal the
truth?
18. We Search For Explanations In What We Have
• Ever heard an educated person explain that their team only lost
because they didn’t stick to their ritual?
• What if simply checking the hash against VirusTotal was enough to
find malware the first time, so you kept doing it?
19. Test. Trick. Get Attention. Trigger System 2.
• Still not considering phishing
your employees?
• It might be the only way they’ll
ever think twice
• Incident response exercises
seem cheesy?
• Consider using random data or
fake incidents to go further
21. Why and When do Humans Cheat?
• Rational humans would cheat when cost-
benefit analysis merited it:
• Personal gain from cheating;
• Chance of being caught;
• Penalty if caught.
• Most actual humans cheat when:
• There’s a gain for self or others;
• It’s possible to justify or rationalize;
• “Fudge factor” model.
22. Dan Ariely - Dishonesty
• “Distance” makes cheating easier to justify:
• Moving the golf ball: hand << foot
<< club;
• Same monetary values, different
behaviors:
• Stolen Skype account and charges
against PayPal account
• “Would this person have taken
cash from my wallet?”
23. Cybercrime Feels Victimless
• Just as spending tokens instead of cash removes a lot of the pain
• Only ever writing software or sending emails keeps criminals from feeling the guilt
• The vast majority would never
rob a stranger at knifepoint
24. Conflicts of Interest and Self-Deception
• Dentists who purchase CAD/CAM machines prescribe more unnecessary treatments that use them;
• Art appreciation under fMRI experiment –
positive bias towards vendors.
• “Perfectly well-meaning people can get tripped up on the quirks of the human mind, make egregious mistakes, and still consider themselves to be good and moral.”
25. The “What-the-Hell” (WTH) Effect
• The more we perceive ourselves as immoral, the
more immoral behavior we allow ourselves;
• “Normalization of Deviance” effect over time;
• Pay attention to feedback from outsiders, make data-driven decisions when possible;
• “Resetting” events / rituals might help overcome this: confession, New Year’s resolutions, Truth and Reconciliation Commission in South Africa.
26. Payback and Altruism
• Coffee shop experiment shows revenge
can be unconscious justification for
dishonesty:
• 45% returned extra money if experimenter
was polite;
• Only 14% if impolite.
• Social utility also a factor:
• “Robin Hood” effect;
• Impact of not cheating on peers or
underlings due to group quotas and
bonuses.
27. The Value of Good Examples
“The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves”, Dan Ariely
28. Priming the Morality Cache
• If you remind people about moral codes or their responsibility to act correctly, cheating is reduced:
• Honor code;
• Religious commandments;
• Personal commitment.
• Simply moving the signature to the top of an insurance form instead of the bottom increased self-reported car mileage by 15%;
• Implications for process and UI design.
30. Designing Rules and Incentives
“When the rules are somewhat open to interpretation, when there are gray areas, and when people are left to score their own performance – even honorable games may be traps for dishonesty”
“The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves” – Dan Ariely
http://www.zdnet.com/article/facebook-engages-in-instagram-bug-spat-with-security-researcher/
31. But At Least We Can Trust
Vendors, Right?
You must be nodding right now. Why else would this section be here?
32. Don’t Trust Statistics… Without Specifics
• “…the malware that was used would have slipped
or probably got past 90 percent of internet
defenses that are out there today in private
industry and [would have] challenged even state
government.”
-- Joseph Demarest, assistant director of the FBI’s cyber
division
• “FBI: 90% Of US Companies Could Be Hacked Just
Like Sony”
-- Business Insider
33. Why Does FUD Marketing Still Exist?
• Ever hear “such a dangerous time we live in”?
• Did you FLY HERE?!
• What if you told your CFO a solution would
prevent stolen HVAC accounts from
authenticating?
34. InfoSec marketing rings a bell?
“The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves”, Chapter 7
35. The Experts Again. From Adjacent Fields.
• Ever heard of “green lumber” trading?
• An expert on green lumber would make a
great trader, right?
• NSA retiree == InfoSec expert?
• Intelligence official [asked about Snowden’s
skills]: “It’s 2013 and the NSA is stuck in
2003 technology.”
• Buy our detection because… HD!
36. Always Consider What’s Unsaid
• Would you buy something that
“eliminated 90% of herpes”?
• What if it said “10% of herpes survives”?
• So now “blocks 90% of malware”?
• “Pricing starts at $99”?
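The framing becomes obvious once the percentage is turned into absolute numbers. A minimal sketch, assuming an illustrative volume of 1,000 malware samples per day (not a real vendor figure):

```python
# Framing check: "blocks 90% of malware" sounds reassuring until you
# translate it into absolute misses. All numbers are illustrative
# assumptions, not vendor data.

samples_per_day = 1_000   # assumed malware samples hitting the gateway daily
block_rate = 0.90         # the marketing claim

missed_per_day = samples_per_day * (1 - block_rate)
missed_per_year = missed_per_day * 365

print(f"Missed per day:  {missed_per_day:.0f}")   # 100
print(f"Missed per year: {missed_per_year:.0f}")  # 36500
```

Same product, same statistic; “100 pieces of malware get through every day” just doesn’t make it onto the datasheet.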
37. Find Leadership Who Lets You Admit #FAIL
• Ever feel like it would be a waste to not
finish your meal at a restaurant?
• Think you’ve already committed to this
talk, so you should stay?
• How about “no more money for
detection until the SIEM ROI is proven”?
38. Should You Always Push For ‘Build’ Over ‘Buy’?
• Never give a product consideration
because of a shitty demo?
• IT and Security software UIs have
been terrible for a decade, so it
won’t ever change, right?
• Think your security team can build
better software every time?
39. Human Minds Find Patterns In Randomness
• Confident you can differentiate
between random and malicious?
• Remember iPod shuffle mode
complaints when a song was repeated?
• Manually reviewing logs all day would
probably be best for finding patterns,
though…
40. Look To Take The Conclusion From Humans
• Battling cognitive biases is hard
• We will make the wrong decision often and consistently (as covered here)
• Be aware of this and strive to call yourself out
• Humans are best at deciding what to do
• Identifying symptoms – machines
• Identifying root cause – machines
• Identifying solutions – humans
41. Recommended Reading
• The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves
and everything else by @danariely
• The Black Swan: The Impact of the Highly Improbable and everything else by Nassim
Nicholas Taleb
• Thinking, Fast and Slow by the legendary Daniel Kahneman
• Economic aspects of tech certifications by @fsmontenegro:
• https://fsmontenegro.wordpress.com/2016/01/03/professional-certifications-information-
asymmetry/
• https://fsmontenegro.wordpress.com/2016/01/13/professional-certifications-behavioural-
economics/
• RSAC USA 2015 – The Art of Thinking Security Clearly by @apbarros
https://www.rsaconference.com/events/us15/agenda/sessions/1531/the-art-of-thinking-security-clearly
“I did my job” is not an effective approach.
LOVE “you can’t patch stupid” but it’s inaccurate.
If you think you’re the best at everything, you’re either horribly delusional or just not observant.
Lake Wobegon effect (illusory superiority)
Will never forget the discussion with Law Firm director of security
Fake e-mail from CEO at Magnolia Health Corporation – just a week ago
Expert biases can certainly go the other way, too.
This all means you probably have the wrong view of your own expertise, so you have to directly consider that
Experiment:
Allow people to cheat, and they do;
Ask them to predict future performance knowing that cheating will not be possible;
People base their estimates on cheating performance (75%), not actual performance (50%);
Monetary reward for correct performance estimation made no difference.
Ego depletion.
Judges reviewing parole cases: more granted early morning and right after lunch.
Social engineering at the end of shifts?
Public with Insufficient evidence
Dan Ariely founded the “Center for Advanced Hindsight” at Duke University as a joke about this effect.
http://www.bloomberg.com/bw/articles/2014-02-21/neiman-marcus-hackers-set-off-60-000-alerts-while-bagging-credit-card-data
Going with your gut will only justify the ”quick to judge” crowd
Illusion of validity
Illusory correlation
Have you prevented attacks by rebooting the firewall appliance regularly?
To summarize: you need to question everything. Especially, your first reaction.
Why do we care?
How attackers think
Why IT people bypass security measures
Club: 23%
Kick: 14%
Hand: 10%
Make losses concrete on awareness training, make consequences clearer on UI / tickets.
What if you’re not even stealing the data?
Think your conscience is capable of justifying such behavior?
Your actions can impact how independent your decisions are.
Conflict of interest: a consultant doing risk analysis / recommendations while also selling controls.
When it was known that a gallery sponsored the lunch break, the reward center (ventromedial frontal cortex) of subjects’ brains reacted more favorably to the same painting when associated with that gallery’s logo.
When asked, subjects genuinely said the gallery had no impact on their decision.
Start of discussion on practical ways to reduce dishonesty.
The reminder works even if the reference is fake, such as a non-existent MIT/Yale honor code or atheists swearing on the Bible.
In bug bounties, researchers are basically deciding by themselves what to do and how to proceed.
But they are the ones who stand to gain from finding vulns.
A conflict of interest of sorts.
Creating rules and processes to allow that to be done
Shout out to BugCrowd and HackerOne
Don’t trust the headline. Read the details.
Availability Heuristic (and then, Cascade)
Visible and newsworthy threats lead to funding – not necessarily the most relevant
When a company pitches its people as a reason to buy its software, does it seem relevant?
http://www.dailykos.com/story/2013/08/27/1234282/-The-NSA-s-astounding-internal-security-nbsp-failures#
If it sounds really exciting, make sure to invoke System 2 before proceeding.
Framing Effect
Always try to acknowledge when your investments fail so you can find an alternative sooner
SUNK-COST Fallacy
Don’t constantly look for yet another reason that your team should build everything. Keep your expectations high, but look for something to surpass them.
Confirmation Bias
Because opting to use no technology will get you stuck in rabbit holes.
Clustering illusion
Manual review of large datasets yields patterns, even when none exist
So we all have to do what? Second guess ourselves.
With too few security pros, we need to use technology where effective.
Two meanings of “analytics”: allowing the analyst to do whatever analysis they want, or automating decision making and detection in a statistically sound way.
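The “statistically sound” flavor of analytics can be as simple as flagging deviations from a baseline instead of eyeballing logs. A minimal sketch with made-up login counts and an arbitrary threshold:

```python
import statistics

# Instead of an analyst eyeballing logs for patterns (and finding them in
# noise), flag values that deviate from the baseline by a fixed number of
# standard deviations. Data and threshold are illustrative assumptions.

def anomalies(values, threshold=3.0):
    """Return indices of values more than `threshold` stdevs from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

daily_logins = [102, 98, 105, 97, 101, 99, 100, 480]  # assumed counts; last one spikes
print(anomalies(daily_logins, threshold=2.0))  # [7]
```

The point is not this particular statistic but that the decision rule is explicit and repeatable, so it can be questioned with data rather than defended by gut feeling.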