Cybersecurity - Rainbow Teaming - what are the colour teams in cybersecurity, how does purple differ from red teaming, and what are the white team and the other colours?
2. Summary
• There are many colour teams in cybersecurity;
• They are a result of the evolution of the cybersecurity domain;
• They add value to organisations in different ways.
3. Blue Team
Objective:
• Defence against attacks.
Key components:
• Cyber Threat Intelligence – who, what, how and why;
• Prevention:
  • People, Process and Technology (next-gen, smart and AI would be the best ;P);
• Detection and Response:
  • Monitoring and understanding signals from our assets/network to detect malicious activities (detection engineering) – creating use cases for the SIEM;
  • Reacting to alerts / raised incidents (SOC, leveraging the SIEM);
  • DFIR;
  • Threat Hunting.
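The detection-engineering component above (turning signals into SIEM use cases) can be sketched in a few lines. This is a minimal illustration, not a real SIEM API: the event field names, the sample events and the parent-process heuristic are all hypothetical.

```python
# Minimal sketch of a detection-engineering "use case": flag processes spawned
# by the WMI provider host. Field names and sample events are hypothetical and
# will differ per log source / SIEM.
SUSPICIOUS_PARENTS = {"wmiprvse.exe"}  # WmiPrvSE.exe spawning children is a classic WMI signal

def match_wmi_spawn(event: dict) -> bool:
    """Return True when a process was spawned by the WMI provider host."""
    parent = event.get("parent_image", "").lower().rsplit("\\", 1)[-1]
    return parent in SUSPICIOUS_PARENTS

events = [
    {"image": "C:\\Windows\\System32\\cmd.exe",
     "parent_image": "C:\\Windows\\System32\\wbem\\WmiPrvSE.exe"},
    {"image": "C:\\Windows\\explorer.exe",
     "parent_image": "C:\\Windows\\System32\\userinit.exe"},
]
alerts = [e for e in events if match_wmi_spawn(e)]
```

In a real deployment the same matching logic would live in the SIEM's rule language rather than in Python; the point is only that a use case is a precise, testable predicate over log fields.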
4. Red Teaming
Objective
• Improve the organisation's security posture through adversary simulation -> ideally find new attack paths / TTPs!
Characteristics:
• Goal-oriented, scenario-based, and threat-intelligence-led;
• Based on adversary Tactics, Techniques & Procedures (TTPs);
• Action on the production environment;
• Most often black/grey box;
• All attack vectors;
• May be more formalised, e.g. in the financial sector.
Red team operator – a more offensive mindset, focused on goals (determined, patient, wide perspective, unconventional approach, mature + a high ethics level).
5. Red Teaming
• Engagements may vary depending on the approach, but common red team operations:
  • Are usually full kill chain;
  • Are black box;
  • Run in covert mode (only a small number of people from the organisation are involved).
Phases: Prepare -> Plan -> Execute -> Report -> Close & clean up
6. Red Teaming
Follow-up after an engagement (asynchronous approach):
• Basic review and understanding of results;
• IoC analysis;
• Results – issues / root causes -> actions;
• Remediation actions, owners, change management, oversight.
Phases: Prepare -> Plan -> Execute -> Report -> Close & clean up
7. Red Teaming
Pros
• At the moment it looks like the most realistic testing methodology;
• Indicates priorities for investing resources, adds value to formal maturity reviews;
• Good cross-functional perspective (kill chain / defence capabilities).
Cons
• "We vs they" (different goals):
  • Achieve flags, reveal weaknesses;
  • One wins, the other loses -> no cooperation;
• Scalability & test time, feedback speed.
8. Red Teaming
• Also:
  • Waterfall approach – typically a high cost of reiteration;
  • Often formalised (due to regulatory requirements);
  • Talent management:
    • Small supply;
    • Expensive;
    • Insider threat;
  • Long feedback time -> long remediation time;
  • Organisation's ability to consume results.
9. From Red to Purple
• While red teaming has many advantages, its disadvantages and the evolving cybersecurity approach triggered a shift towards a more agile approach – purple teaming:
  • Collaboration and information sharing;
  • Scalability;
  • Fast feedback cycle;
  • Accelerated control enhancements;
  • Agile.
• BE FASTER, ADAPT TO EMERGING THREATS && ENHANCE CONTROLS!
10. Purple Teaming
The purple team cycle:
1. Describe behaviours
2. Identify and obtain data
3. Define detection
4. Determine remediation
5. Develop scenarios and plan
6. Threat actor emulation
7. Detection and response
8. Feedback, information sharing
9. Control enhancement
10. Reporting
11. Summary
12. Clean up
The cycle is driven by INTEL and TTPs, supported by a toolset and infrastructure, and repeated over a number of sprints.
11. Purple Teaming cons
• Again, while purple teaming has many advantages, there are some cons too:
  • Typically focused on detection and response enhancement (more mature environments);
  • Detection enhancement does not mean successful remediation;
  • Time from design to production;
  • Availability, usability and security of logs and the CMDB – many dependencies on the maturity of other capabilities;
  • Requires automation technology to really benefit from it.
12. How to start
• Set priorities:
  • What is on intel's radar;
  • What are the related TTPs;
  • How they relate to our controls;
  • How we can break them down into small, repeatable steps – atomics;
• Check your position against the threat by simulating it and analysing the outcome;
• Refine controls if needed – re-test.
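The break-it-into-atomics-and-re-test loop above can be tracked with a very small data structure. This is only a sketch: the class, fields and the sample backlog entry are illustrative, and real atomics would come from a library such as Atomic Red Team.

```python
# Hypothetical sketch of tracking atomics and their re-test history.
from dataclasses import dataclass, field

@dataclass
class Atomic:
    technique: str            # MITRE ATT&CK ID, e.g. T1047
    description: str
    command: str              # the single repeatable step to execute
    results: list = field(default_factory=list)  # one detected/missed entry per run

    def record(self, detected: bool) -> None:
        self.results.append(detected)

    @property
    def needs_refinement(self) -> bool:
        # keep refining controls and re-testing until the latest run is detected
        return not (self.results and self.results[-1])

backlog = [
    Atomic("T1047", "WMI user recon", "wmic useraccount get /ALL /format:csv"),
]
backlog[0].record(False)   # first run: missed -> refine controls
backlog[0].record(True)    # re-test after enhancement: detected
```

Keeping the full run history (rather than only the latest result) is what later feeds the "identified gaps vs closed gaps" metric.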
16. Example simple scenario
Test the organisation's capability to detect WMI-related activities.
WMI is a native Windows utility that administrators use regularly to automate tasks and remotely manage systems in their environments. Adversaries generally use WMI for the same reasons that administrators use it: to execute processes on remote systems. An adversary will leverage WMI to interact with local and remote assets to support the delivery of its objectives:
• Reconnaissance
• Process execution
• Lateral movement
(Purple team cycle step 5: develop scenarios and plan.)
https://github.com/redcanaryco/atomic-red-team
17. Testing…
Recon
#1 Users
wmic useraccount get /ALL /format:csv
#2 Processes
wmic process get caption,executablepath,commandline /format:csv
#3 Software (installed patches)
wmic qfe get description,installedOn /format:csv
#4 Remote services
wmic /node:"#{node}" service where (caption like "%#{service_search_string}%")
Execute
#5 Local execution
wmic process call create #{process_to_execute}
#6 Remote execution
wmic /user:#{user_name} /password:#{password} /node:"#{node}" process call create #{process_to_execute}
Clean-up (terminate the created process)
wmic /user:#{user_name} /password:#{password} /node:"#{node}" process where name='#{process_to_execute}' delete >nul 2>&1
(Purple team cycle step 6: threat actor emulation.)
https://github.com/redcanaryco/atomic-red-team/blob/master/atomics/T1047/T1047.yaml
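When analysing the output of the recon commands above, note that `wmic ... /format:csv` emits a leading blank line and prefixes every row with a Node column. A small parser makes that output usable; the sample text below is illustrative, not captured from a real host.

```python
# Sketch: parsing `wmic ... /format:csv` output during result analysis.
# The sample mimics wmic's quirks: a leading blank line and a Node column.
import csv
import io

sample = """\

Node,Name,SID
WIN10-LAB,Administrator,S-1-5-21-1-500
WIN10-LAB,labuser,S-1-5-21-1-1001
"""

def parse_wmic_csv(text: str) -> list:
    """Drop blank lines, then read the remaining rows as CSV dicts."""
    rows = [line for line in text.splitlines() if line.strip()]
    return list(csv.DictReader(io.StringIO("\n".join(rows))))

accounts = parse_wmic_csv(sample)
```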
18. Example scope of lessons learnt
Detection & visibility gaps:
• What was detected and what worked / did not work;
• What actions were not detected and why.
Response gaps:
• Signal exists but is not managed correctly;
• Alert raised but not managed as it should be.
Follow-up questions:
• Test executed -> so what? What does it mean?
• Next steps?
• Do we need new controls or a new use case? Or to change / update an existing one?
• Gaps and risk issues identified – how do we address them, and whom do we engage?
• Enhanced -> test it again.
Information flow + automation: https://github.com/mitre/caldera
Reporting, information flow: https://vectr.io/getting-started/
(Purple team cycle steps 7 and 8: detection and response; feedback, information sharing.)
Record each test with its description and action parameters, plus one of the following outcomes:
• Execution successful, objectives achieved;
• Execution successful, objectives not achieved (e.g. blocked);
• Execution result unclear, no feedback;
• Execution failed.
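The four outcome categories above map naturally onto an enumeration, so results can be aggregated across a sprint. This mirrors the kind of tracking tools like VECTR provide; the run data below is hypothetical.

```python
# Sketch: classifying and summarising test outcomes in the four categories above.
from collections import Counter
from enum import Enum

class Outcome(Enum):
    ACHIEVED = "execution successful, objectives achieved"
    BLOCKED = "execution successful, objectives not achieved (e.g. blocked)"
    UNCLEAR = "execution result unclear, no feedback"
    FAILED = "execution failed"

# hypothetical sprint results: (test description, outcome)
runs = [
    ("T1047 user recon", Outcome.ACHIEVED),
    ("T1047 remote execution", Outcome.BLOCKED),
    ("T1047 local execution", Outcome.UNCLEAR),
]
summary = Counter(outcome for _, outcome in runs)
```

An `ACHIEVED` run is the interesting one for lessons learnt: the emulation got through, so a detection or prevention gap needs an owner and a re-test.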
19. How to measure your success over time?
• Mean Time to Detect (MTTD)
• Mean Time to Respond (MTTR)
• Dwell time
• Average time to onboard a detection
• Number of identified gaps vs closed gaps
http://kpilibrary.com/kpis/mean-time-to-detect-mttd-2
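The time-based metrics above reduce to simple averages once each incident carries three timestamps. The sketch below assumes a (compromise, detection, containment) triple per incident; the data is illustrative.

```python
# Sketch: computing MTTD, MTTR and dwell time from per-incident timestamps.
from datetime import datetime
from statistics import mean

# hypothetical incidents: (compromised, detected, contained)
incidents = [
    (datetime(2023, 5, 1, 9),  datetime(2023, 5, 1, 13), datetime(2023, 5, 2, 1)),
    (datetime(2023, 6, 3, 8),  datetime(2023, 6, 3, 10), datetime(2023, 6, 3, 16)),
]

def hours(delta) -> float:
    return delta.total_seconds() / 3600

mttd = mean(hours(d - c) for c, d, _ in incidents)    # compromise -> detection
mttr = mean(hours(r - d) for _, d, r in incidents)    # detection -> containment
dwell = mean(hours(r - c) for c, _, r in incidents)   # total time in the environment
```

Tracked sprint over sprint, falling MTTD/MTTR and a shrinking open-gap count are the evidence that the purple team loop is actually enhancing controls.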
24. Summary
• Red Teaming – less agile, independent, long feedback cycle, cross-functional review, great for supporting formal maturity reviews;
• Purple Teaming – agile, collaborative, fast; you need to be mature enough; expensive in terms of consuming results and internal team costs;
• Yellow Team – builders (architects, engineers, developers);
• Orange Team – developers think like attackers;
• Green Team – builders embed security concepts;
• White Team – oversight and management in the most complex environments or regulatory engagements;
• Gold Team – Crisis Management.