Adversary simulation is a key component of a mature security program. Without it, organizations may not truly understand their weaknesses until they face a real-world adversary. This talk promotes the concept of the “Assumed Breach” model and discusses steps security program owners can take to validate that a security program is effective.
2. Talk Background
Introduction and overview of red teaming
What are our organization's challenges & opportunities?
What makes Red Teaming / a Red Cell effective?
What is adversary simulation?
TL;DR… Extra Resources
3. $whoami
• Chris Hernandez
• Red Teamer
• Former:
• Pentester @Veris Group ATD
• Lots of other stuff
• Exploit / Bug Research
• Blog: Nopsled.ninja
• @piffd0s
5. • Mindset and Tactics
• Takes many forms: tabletop exercises, alternative analysis, computer models, and vulnerability probes
• Not limited to InfoSec
• Critical Thinking
• Cognitive Psychologist
7. • Originated in 1960s military war-game exercises
• “Red” = the Soviet Union
• 1963 - The first public, documented example was a red team exercise structured around procuring a long-range bomber
• Most early examples are structured around determining the Soviet Union's capabilities
13. Unified Vision ‘01 & Millennium Challenge ‘02
• Millennium Challenge ’02
• Red Cell is highly restricted in its actions
• Red Cell pre-emptively attacks the US Navy fleet with all of its air and sea resources, sinking 21 Navy vessels
• White Cell “refloats” the sunken Navy vessels
• Unified Vision ’01
• White Cell informs Red Cell that Blue Team has destroyed all 21 of their hidden ballistic missile silos
• The Blue Team commander never actually knew the location of any of the 21 silos
15. Red Team Success Stories
• New York Marathon, NYPD, and New York Road Runners
• Covers scenarios like:
• How do you identify tainted water sources?
• How to respond if drones show up in specific locations
• The race can be diverted at any point
• Israel Defense Forces – “Ipcha Mistabra”
• “The opposite is most likely”
• Small group in the intelligence branch
• Briefs officials and leaders on opposite explanations for scenarios
16. How does any of that apply to my business?
• Red Team failure:
• Agendas
• Restricted actions
• Poor communication
• Narrow scope
• Unrealistic scenarios
• Not having a red team
• Red Team success:
• Good questions
• Make no assumptions
• Open access
• Fluid communication
• Realistic scenarios
• No hidden agendas
18. Red Cell Effectiveness
• Ex. the 57th Adversary Tactics Group
• Only highly skilled pilots are allowed to become “aggressors”
• Aggressors are allowed to use only known adversary tactics and techniques, depending on who they are emulating
• The same should apply to all red teams
• Adversary emulation is key to realistic simulations
19. Red Cell Effectiveness
• Effective adversary emulation can mean being a “worse” threat actor
• Tests defenders' post-compromise security posture, a.k.a. the “assumed breach” model
• Starting post-compromise / from a foothold can also save valuable time and money
20. What are the benefits of an effective Red Cell?
• Train and measure IR teams' detection and response
• Microsoft measures this as MTTD and MTTR: Mean Time to Detect and Mean Time to Recovery (see the sketch below)
• Validates investment in very expensive security products, services, and subscriptions
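As a minimal sketch (my illustration, not from the talk), here is one hypothetical way to compute MTTD and MTTR from per-incident timestamps recorded during exercises; the Incident fields and sample dates are assumptions:

# Minimal sketch (assumption, not from the talk): computing MTTD / MTTR
# from per-incident timestamps gathered during red team exercises.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Incident:
    compromised: datetime  # when the (simulated) attacker got a foothold
    detected: datetime     # when the IR team detected the activity
    recovered: datetime    # when the environment was remediated

def mttd(incidents):
    # Mean Time To Detect: average of (detected - compromised)
    deltas = [i.detected - i.compromised for i in incidents]
    return sum(deltas, timedelta()) / len(deltas)

def mttr(incidents):
    # Mean Time To Recovery: average of (recovered - detected).
    # Measuring recovery from detection rather than from compromise
    # is a policy choice; this sketch measures from detection.
    deltas = [i.recovered - i.detected for i in incidents]
    return sum(deltas, timedelta()) / len(deltas)

incidents = [
    Incident(datetime(2016, 1, 4, 9, 0), datetime(2016, 1, 6, 14, 0), datetime(2016, 1, 8, 10, 0)),
    Incident(datetime(2016, 2, 1, 8, 30), datetime(2016, 2, 1, 17, 0), datetime(2016, 2, 3, 9, 0)),
]
print("MTTD:", mttd(incidents))
print("MTTR:", mttr(incidents))

Tracking these two numbers across repeated ops is what turns red team exercises into a measurable trend rather than a one-off result.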
21. Putting it all together – Adversary Simulation
• Emulate realistic threat actors' TTPs
• Assumed breach model
• Model attacker activity to your environment / risk (see the sketch after this list)
• Information exchange between red and blue teams*
• Protect red team culture
• Repeat within a reasonable amount of time
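A hypothetical sketch of what “model attacker activity to your environment” can look like in practice: storyboard each emulated TTP next to the detection that should fire, then report the gaps at the debrief. All phases, TTPs, and control names below are illustrative assumptions, not from the talk:

# Hypothetical sketch: storyboarding an emulated adversary's TTPs against
# expected detections, so the debrief can show where defenses are lacking.
from dataclasses import dataclass

@dataclass
class Step:
    phase: str               # e.g. "initial access", "lateral movement"
    ttp: str                 # technique the red team will emulate
    expected_detection: str  # control that *should* catch it
    detected: bool = False   # filled in by the blue team at the debrief

# Storyboard for one op, built from threat intel on the actor being emulated.
op_plan = [
    Step("initial access", "phishing doc with macro", "mail gateway / macro policy"),
    Step("execution", "PowerShell download cradle", "script block logging alert"),
    Step("lateral movement", "pass-the-hash to file server", "anomalous logon detection"),
]

# After the op: mark what was actually caught, then report the gaps.
op_plan[1].detected = True
for step in op_plan:
    if not step.detected:
        print(f"GAP - {step.phase}: {step.ttp} (expected: {step.expected_detection})")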
22. ADDITIONAL RESOURCES
Books:
Red Team – Micah Zenko
The Applied Critical Thinking Handbook – UFMCS
Online:
Microsoft Enterprise Cloud Red Teaming whitepaper
Red Team Tradecraft / Adversary Simulation (2015) – Raphael Mudge
The Pyramid of Pain – David Bianco
Veris Group – Adaptive Threat Division – Will Schroeder and Justin Warner
The Adversary Manifesto – CrowdStrike
Editor's Notes
It was a dark and stormy night.
“Captain, captain, wake up!”
“Ohh … What is it?”
“Sorry to awaken you sir, but we have a serious problem.”
“Well what is it?”
“There’s a ship in our sea-lane about 20 miles away, and they refuse to move.”
“Tell them to move.”
“Sir, we have. They won’t move.”
“I’ll tell them!” The signal goes out: “Move starboard 20 degrees. At once!”
The signal returns: “Move starboard yourself 20 degrees. At once.”
“I can’t believe this. I mean, I’m a captain. Let them know who I am. I am important.” The signal goes out: “This is Captain Horatio Hornblower the 26th commanding you to move starboard 20 degrees at once.”
The signal returns: “This is seaman Carl Jones the third commanding you to move starboard 20 degrees at once.”
“What arrogance! Who is this joker? I mean, we’re a battleship! We could just blow them — let them know who we are!” The signal goes out: “This is the Mighty Missouri, Flagship of the Seventh Fleet!”
The signal returns: “This is the Lighthouse.”
That’s a story found in the Naval Institute's Proceedings, where they literally interpreted a lighthouse to be a ship, but I like the story because it helps me introduce this subject: there are specific, very common ways of thinking that make it very difficult to be effective at securing an organization. I want to share some of those thought patterns with you today so you will be aware of them and can hopefully operate your organization more effectively with that knowledge.
So, I’d like to share with you some things I’ve learned in my career in information security. These are my perspectives and opinions on techniques for improving the security of your organization. The ideas are not new or revolutionary; I’m just trying to share what, in my experience, works well with regard to red teaming.
So at a high level, we talk about…
Just briefly let me tell you my story ….
I’ve worn various security hats in my career, some defensive and some offensive. From helpdesk to red teaming, I’ve done about everything in between, and I like to think that gives me some perspective on the challenges of security in an organization.
Approach, mindset, and tactics. If you are a leader in an environment, you probably don’t know everything that is going on.
If you are wise enough to come to this conclusion, you need a red team to bring an alternate perspective.
The alternative perspective would apply to your problems and to the problems of your adversary.
The earliest evidence of the origins of red teaming came out of military war-gaming exercises.
1976 – Hardliners in the Ford administration didn’t agree with the CIA’s conclusions and believed that the U.S. had a capability gap.
A “Team B” of experts was given access to all information about known Soviet military capabilities and came to an alternative conclusion compared to the CIA report.
Example of a scotoma.
A red team’s responsibility is to see other teams’ blind spots and predict failures.
To do this, they need to be aware of their own blind spots.
A scotoma: a partial loss of vision or a blind spot in an otherwise normal visual field.
Let’s try a game…
Find all the red you can in the room…
Now…
Where is the brown?
Military examples
Translate this to real world / business scenarios
Multiple contingency plans for multiple scenarios.
As a result of the red team simulation, they are able to better protect the marathon.
They are directed to come to the opposite conclusion of whatever the current plan or conventional wisdom is. They don’t just brief generals; they go to parliament. They brief the prime minister’s office and the prime minister’s Cabinet. One of the individuals I know who did the briefings described the job as exhaustive: you have to essentially be argumentative by design. You have to challenge and doubt everything that happens.
The key takeaway here is that it is only the highly skilled individual who can become an aggressor.
You have to be good enough to restrict yourself to a specific capability or skillset, and that capability and skillset changes based on who you are emulating.
Image credit: David Bianco
Nobody wants to drop $100k on a FireEye appliance and find out it’s configured wrong.
This is a great argument for red teams ingesting threat intelligence reports: they can work the intel into their tradecraft for red team operations.
If you want to spend a year on an op working to get in with a 0-day, you can, but the simple fact is that if an adversary wants in badly enough, they will get in.
Again, if you know an adversary’s MO, storyboard it and determine where it could get caught and where defenses are lacking.
Debrief after op completion.
Teams need to be external in terms of culture, but internal and aware in terms of critical thought.
It is demoralizing if the blue team gets crushed week in and week out.