
Adversary simulation


  1. ADVERSARY SIMULATION: “RED CELL” APPROACHES TO IMPROVING SECURITY
  2. Talk Background • Introduction and overview of Red Teaming • Organizational challenges and opportunities • Red Teaming / Red Cell effectiveness • Meeting the defenders where they are: adversary simulation • Emulating tactics, techniques, and procedures (TTPs) • Being the adversary • Resources
  3. $whoami • Chris Hernandez • Red Teamer • Former: pentester, vulnerability/patch management, sysadmin, bug bounty hunter • IRC handle: piffd0s • Blog: nopsled.ninja • @piffd0s
  4. Introduction to Red Teaming • What is “Red Teaming”? • Origins of the “Red Team” • Examples of Red Teaming failures • Examples of Red Team successes
  5. What is Red Teaming? • An approach, a mindset, and a set of tactics • Takes many forms: tabletop exercises, alternative analysis, computer models, and vulnerability probes • Critical thinking • A therapist…
  6. What are its origins? • Originated in 1960s military war-game exercises, where the Red Team was meant to emulate the Soviet Union • 1963: the first historical example was a red team exercise structured around procuring a long-range bomber • Most early examples centered on assessing the Soviet Union's capabilities
  7. Red Team Failures: Operation Eagle Claw • Failed mission to rescue 52 diplomats held captive at the US Embassy in Tehran • The operation was “need to know”, not red teamed • The operation was initiated without enough planning and foresight into potential challenges and obstacles
  8. Unified Vision ’01 & Millennium Challenge ’02 • Millennium Challenge ’02: the Red Cell was highly restricted in its actions; when it pre-emptively attacked the US Navy fleet with all of its air and sea resources, sinking 21 vessels, the White Cell “refloated” the sunken ships • Unified Vision ’01: the White Cell informed the Red Cell that the Blue Team had destroyed all 21 of its hidden ballistic missile silos, yet the Blue Team commander never actually knew the location of any of the silos
  9. Red Team Success Stories • New York Marathon: NYPD and New York Road Runners cover scenarios like identifying tainted water sources and responding if drones show up in specific locations; the race can be diverted at any point • Israeli Defense Forces: “Ipcha Mistabra” (“the opposite is most likely”), a small group in the intelligence branch that briefs officials and leaders on opposite explanations for scenarios
  10. Organizational Challenges • Overcoming groupthink • Maintaining divergent thought • Remaining skeptical • Assimilation into culture • Communicating risk effectively • Metacognition • Leadership buy-in • “Gaming” the op
  11. Red Cell Effectiveness • Example: the USAF 57th Adversary Tactics Group • Only highly skilled pilots are allowed to become “aggressors” • Aggressors may use only the known tactics and techniques of the adversary they are emulating • The same should apply to all red teams: adversary emulation is key to realistic simulations
  12. Red Cell Effectiveness • Effective adversary emulation can mean being a “worse” threat actor • Tests defenders' post-compromise security posture, a.k.a. the “assumed breach” model • Starting post-compromise / from a foothold can also save valuable time and money
  13. Adversary Skill and Detection Model [Chart: defender maturity (ignorance, detection, proactive, pre-emptive) plotted against attack difficulty for script kiddies, criminals, and APTs]
  14. What are the benefits of an effective Red Cell? • Trains and measures the IR team's detection and response • Microsoft measures this as MTTD and MTTR: mean time to detect and mean time to recovery • Validates investment in very expensive security products, services, and subscriptions
  15. An example Red Cell exercise • Build a relevant threat model based on your industry's threats, or on competitors' breaches / news events • Storyboard the attack • Determine where IR should detect and respond • Use the Red Team to validate the storyboard • Postmortem analysis: what went well / what went wrong • Debrief tactics
  16. Putting it all together: adversary simulation • Emulate realistic threat actors' TTPs • Assume-breach model • Model attacker activity to your storyboard • Information exchange between red and blue teams* • Protect Red Team culture • Repeat within a reasonable amount of time
  17. Example Adversary Simulation TTPs: “Deep Panda” • “After seeing how these indicators were being applied, though, I came to realize something very interesting: almost no one is using them effectively.” (David Bianco, The Pyramid of Pain)
  18. ADDITIONAL RESOURCES • Books: Red Team (Micah Zenko); The Applied Critical Thinking Handbook (UFMCS) • Online: Microsoft Enterprise Cloud Red Teaming whitepaper; Red Team Tradecraft / Adversary Simulation (Raphael Mudge, 2015); The Pyramid of Pain (David Bianco); Adaptive Threat Division (Veris Group: Will Schroeder and Justin Warner); The Adversary Manifesto (CrowdStrike)
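The MTTD/MTTR metrics from slide 14 are simple averages over incident timelines. A minimal sketch, assuming a hypothetical incident record of (compromise, detection, recovery) timestamps — this illustrates the arithmetic only, not Microsoft's actual methodology:

```python
from datetime import datetime

# Illustrative (made-up) incidents: (compromise time, detection time, recovery time)
incidents = [
    (datetime(2016, 1, 4, 9, 0),  datetime(2016, 1, 4, 13, 30), datetime(2016, 1, 5, 10, 0)),
    (datetime(2016, 1, 11, 8, 0), datetime(2016, 1, 12, 8, 0),  datetime(2016, 1, 12, 20, 0)),
]

def mean_hours(deltas):
    """Average a list of timedeltas, expressed in hours."""
    return sum(d.total_seconds() for d in deltas) / len(deltas) / 3600

# MTTD: mean time from compromise to detection.
mttd = mean_hours([detect - start for start, detect, _ in incidents])
# MTTR: mean time from detection to recovery.
mttr = mean_hours([recover - detect for _, detect, recover in incidents])

print(f"MTTD: {mttd:.2f} h, MTTR: {mttr:.2f} h")  # MTTD: 14.25 h, MTTR: 16.25 h
```

Tracking these two numbers across repeated red cell exercises is one concrete way to show whether the IR team is actually improving.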

Editor's Notes

  • Hi everybody,

    I'm Chris Hernandez, and what a pleasure it is for me to come and visit with you for a few minutes today and share some ideas that might be beneficial to you and your organization. I'm delighted to be here; I've been looking forward to it for some time. And it's nice to be back. I was here not too long ago, and it says something when you are invited back… it doesn't say everything, but it says something. Maybe it says, “let's give him one more chance, and maybe he can get it right this time…”

    I feel two major responsibilities today, and here’s what they are…

    Number one is that you get your money's worth… It looks like nobody paid for admission, but hey, at least the price of a beer or dinner.

    And my second major responsibility is that you get your time's worth. I say that because time is more valuable than money. If someone asks you to spend some money, sure, no problem… you can get more of that. But if someone asks you to spend some time, you've really got to think that over… you can never get more time. So I appreciate and understand the value of you investing your time today, and I hope this talk is worth it. This talk is going to be costly for me as well; it's going to take some of my time… so in order for it to be worthwhile for me, I really want some of my ideas to make an impact. I'm not here just to tell a few interesting stories and walk away, but hopefully to give you some value for your time.

    So I'd like to share with you some things I've learned in my career in information security. These are my perspectives and opinions on techniques for improving the security of your organization. The ideas are not new or revolutionary; I'm just trying to share what, in my experience, works well in regard to red teaming.

  • So at a high level, we talk about…

  • Just briefly let me tell you my story ….
    I've worn various security hats in my career, some defensive and some offensive. From helpdesk to red teaming, I've done about everything in between, and I like to think that gives me some perspective on the challenges of security in an organization.


  • Both approach, mindset, and tactics. If you are a leader in an environment, you probably don't know everything that is going on.
    If you are wise enough to come to this conclusion, you need a red team to bring an alternate perspective.
    The alternative perspective would apply to your problems, and to the problems of your adversary.

  • The earliest evidence of the origins of red teaming came out of military war-gaming exercises.

    1976: hardliners in the Ford administration didn't agree with the CIA's conclusion; they believed that the U.S. had a capability gap.

    A “Team B” of experts with access to all information about known Soviet military capabilities came to an alternative conclusion compared to the CIA report.
  • Three helicopters malfunctioned, and a C-130 and an RH-53D helicopter collided.

    Example of what happens when there is no red teaming done in planning phases of an operation

    Military example, but think of business / public sector examples
  • Translate this to real world / business scenarios
  • Multiple contingency plans for multiple scenarios.
    As a result of the red team simulation, they are able to better protect the marathon.


    They are directed to come to the opposite conclusion of whatever the current plan or conventional wisdom is. They don't just brief generals; they go to parliament. They brief the prime minister's office and the prime minister's Cabinet. One of the individuals I know who did the briefings describes the job as exhaustive. You have to essentially be argumentative by design. You have to challenge and doubt everything that happens.
  • Image credit: David Bianco

    The key takeaway here is that it is the highly skilled individual who can become an aggressor.
    You have to be good enough to restrict yourself to a specific capability or skillset, and that capability and skillset changes based on who you are emulating.

  • Nobody wants to drop $100k on a FireEye appliance and find out it's configured wrong.
  • This is a great argument for red teams ingesting threat intelligence reports: they can work it into their tradecraft for red team operations.
    If you want to spend a year on an op working to get in with an 0-day, you can; but the simple fact is, if an adversary wants in badly enough, they will get in.
    Again, if you know an adversary's MO, storyboard it, and determine where it could get caught and where defenses are lacking.
    Debrief after op completion.
    Teams need to be external in terms of culture, but internal and aware in terms of critical thought.
    It's demoralizing if the blue team gets crushed week in and week out.
  • An appropriate way to ingest threat intelligence data
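The threat-intelligence notes above connect back to the Pyramid of Pain quote on slide 17: detections, and the red team emulation that tests them, are far more costly for the adversary to evade when built on TTPs than on hashes or IP addresses. A minimal sketch of that ranking, assuming the level names from David Bianco's pyramid and a made-up `report` list of indicator types:

```python
# Pyramid of Pain levels, from least to most painful for the adversary
# to change when a defender detects and blocks on them.
PAIN_LEVELS = [
    "hash values",
    "IP addresses",
    "domain names",
    "network/host artifacts",
    "tools",
    "TTPs",
]

def pain_score(indicator_type: str) -> int:
    """Higher score = more costly for the adversary to change."""
    return PAIN_LEVELS.index(indicator_type)

# Given indicator types pulled from a threat intel report, prioritize the
# ones nearest the top of the pyramid: emulating (or detecting on) TTPs
# forces a change in behavior, not just a recompiled binary or new IP.
report = ["hash values", "TTPs", "domain names"]
best = max(report, key=pain_score)
print(best)  # -> TTPs
```

This is the sense in which a red team "appropriately ingests" threat intelligence: reproduce the behaviors at the top of the pyramid, not just the atomic indicators at the bottom.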
