One of the greatest threats to organizations today is also one of their most prevalent services: web interfaces. The landscape has shifted from simple static websites to fully functional web-based applications that provide access to internal information gold mines. If you’re not testing those of your client organization, expect that someone else is! In our experience, most organizations have little to no idea how many internal web resources exist within their environments that could lead to network compromise. When you take an offensive-security approach to assessing your client’s web interfaces, you will find that there is a lot involved – and usually not much time to get from initial scan to report. In this presentation, we’ll introduce RAWR (Rapid Assessment of Web Resources). We’ll cover its inception and the hurdles faced along the way, and give some practical advice on how to get the most out of ‘the little dinosaur’. There’s a lot packed into this tool that will help you get a better grasp of the threat landscape that is your client’s web resources. It has been tested on everything from extremely large network environments down to five-node networks, and has been fine-tuned to deliver fast, accurate, and applicable results in formats that you can use. RAWR will make the mapping phase of your next web assessment efficient and have you producing positive results faster!
2. INTRODUCTION
Adam Byers [@al14s]
Started with BASIC – Antic mag… the ‘Blue Pages’
• Blue Team
• Automation
• Wireless
• Malware forensics
Tom Moore [@c0ncealed]
AOL proggies/punters in the ’90s
• Red Team Menace
• Loves creating reports
• Cuddles his AK
4. WHY WORRY ABOUT WEB?
If you don’t know your organization’s web attack
surface, expect that someone else already does.
One of the greatest threats to organizations today is also one of their
most prevalent services: web interfaces. The landscape has shifted
from simple static websites to fully functional web-based applications
that provide access to internal information gold mines.
In our experience, most organizations have little to no idea how many
internal web resources exist within their environments that could lead
to network compromise. When you take an offensive-security approach to
assessing your client’s web interfaces, you will find that there is a
lot involved – and usually not much time to get from initial scan to
report.
5. WHAT WOULD YOU DO?
You are given the following objective:
Assess your organization’s internal and
external web-based attack surface.
Your end goal is to produce a report that
can be provided to both technical
individuals and executives.
6. WHICH TOOLS TO LEVERAGE?
Different tools for each step in the process:
Recon
Mapping
Discovery
Exploitation
Reporting
These tools, in most cases, do not produce output
that plays nicely with one another.
This leaves YOU with the responsibility of
interfacing between them…
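The glue work between tools usually means coercing each scanner's output into one common shape before anything can be compared or reported. Purely as an illustration of that problem (not RAWR's actual schema – the field names and the `normalize` helper here are invented for this example), a minimal sketch:

```python
# Hypothetical glue code: each scanner emits hosts/ports in its own
# shape, so every record is coerced into one common dict before
# reporting. Field names are illustrative, not RAWR's schema.
COMMON_FIELDS = ("host", "port", "service", "source")

def normalize(record, source):
    """Map a tool-specific record onto the common schema."""
    if source == "nmap":
        return {"host": record["addr"], "port": int(record["portid"]),
                "service": record.get("name", ""), "source": "nmap"}
    if source == "nessus":
        return {"host": record["host-ip"], "port": int(record["port"]),
                "service": record.get("svc_name", ""), "source": "nessus"}
    raise ValueError("unknown source: %s" % source)

rows = [
    normalize({"addr": "10.0.0.5", "portid": "443", "name": "https"}, "nmap"),
    normalize({"host-ip": "10.0.0.9", "port": "8080", "svc_name": "http-proxy"}, "nessus"),
]
```

Every additional tool means another branch like these – which is exactly the repetitive interfacing work the talk argues should be automated once, in one place.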
7. HOW WOULD YOU PRESENT IT?
How much work would be involved in obtaining
output that could be considered acceptable for
both of your intended audiences?
Executive – visuals and numbers.
Technical – specific information for remediation.
8. WHAT IS YOUR TURN-AROUND?
How long would it take you to go from initial
mapping, to producing the deliverable?
Mapping
Formatting data
Identify targets of interest
Additional information collection
Formatting data (again)
Validation of findings
Composing the report
12. INPUT
• NMap XML (live or from file) *
• Nexpose Simple XML
• Nexpose XML (v1, v2)
• Nessus XML (.nessus) *
• OpenVAS XML
• Qualys XML (Scan Report) *
• Qualys CSV (Port/Services Scan)
• Metasploit CSV
• ??? CSV
* Parses SSL cert info for these
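Routing that many input formats to the right parser typically keys off the XML root element. As a rough sketch of the idea (the `detect_format` helper is invented here; the Nmap and Nessus root tags are standard, but treat the Nexpose tag as an assumption to verify against real scan files):

```python
import xml.etree.ElementTree as ET

# Maps an XML root tag to a parser name. "nmaprun" and
# "NessusClientData_v2" are the documented root elements for
# Nmap -oX and .nessus v2 output; the Nexpose entry is a guess.
ROOT_TAGS = {
    "nmaprun": "nmap",
    "NessusClientData_v2": "nessus",
    "NexposeSimpleXML": "nexpose_simple",  # assumption
}

def detect_format(xml_text):
    """Return a parser key for the given XML document, or None."""
    root = ET.fromstring(xml_text)
    return ROOT_TAGS.get(root.tag)
```

Anything the table doesn't recognize would fall through to a generic handler – which is where the "??? CSV" catch-all comes in.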
22. PLANS FOR THE FUTURE
• HTML appearance
• SSL parser testing
• Talk to:
  • Malware Researchers
  • Pentesters
  • Developers
  • SysAdmins
23. CONCLUSION / DISCUSSION
Comments, praise, questions, cash donations:
Adam [ al14s@pdrcorps.com ]
Enraged hate mail, insults, threats:
Tom [ c0ncealedx64@gmail.com ]
Thank you for sitting in - we hope you found our talk worthwhile.
If not, it’s all Tom’s fault.
Editor's Notes
I know this isn’t a comic con, but we’re going to do a little role playing. You’re in a medium-to-large business. How many web interfaces? Approximately 4,000 web interfaces.
For the purpose of this talk, we are not going to go in-depth on which tools to utilize, but rather focus on the gaps that exist between them.
Which of the tools listed previously provide any information suitable for viewing by an executive? Not only that, but another consideration is how many times you would have to scan the network or re-query services to obtain information for both levels. What kind of visuals could we leverage? Screenshots / site maps. What kind of numbers? Total services within the environment; total number of systems not fitting the Minimum Security Baseline. What kind of technical information could we leverage? Query all the things! NMap results, CSV, attack surface matrix, much much more…
Each step in the process is time consuming, especially when dealing with a lot of information. Mapping of a large environment can take days: network segmentation, obfuscated ports, latency. Once you scan, now you need to format all of your mapping information… make it play nice! What items in your mapping process bubble to the top? Vulnerable targets? Admin panels / SharePoint sites / internal services (HR / FIN / LGL) / configuration pages / directory browsing. Do we now need to collect more information on these targets? Vulnerable services? Screenshots? Site maps? SSL certificates? Robots.txt? Cookies? Now we get to format again… now for pulling together all of that data that shows we truly did find dirt. And lastly, my favorite: composing that report that helps us justify our existence. ALL OF THIS TAKES TIME, growing exponentially with the number of active web services that we find.
Adam’s turn…
Dynamic CSV – working on a way to make this dynamic by checking the content (seems like a good challenge). Limited testing pool for some of these formats, so please shoot me any parsing issues you have. I’m not interested in anyone else’s information, but would like to fix any problems that arise within RAWR.
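One plausible way to make CSV input "dynamic by checking the content", as described above, is to sniff the header row and match likely column names against known synonyms. This is only a guess at the approach – the synonym lists and the `map_columns` helper are assumptions, not RAWR's implementation:

```python
import csv
import io

# Hypothetical column matcher: guess which CSV columns hold the
# host and port by checking the header row against common synonyms.
SYNONYMS = {
    "host": ("host", "ip", "address", "ip_address", "asset"),
    "port": ("port", "port_number", "dport"),
}

def map_columns(csv_text):
    """Return {field: column_index} for the fields we can recognize."""
    header = next(csv.reader(io.StringIO(csv_text)))
    mapping = {}
    for i, col in enumerate(header):
        name = col.strip().lower()
        for field, names in SYNONYMS.items():
            if name in names and field not in mapping:
                mapping[field] = i
    return mapping
```

The limited-testing-pool caveat applies doubly to a heuristic like this: every vendor's export invents new header names, so the synonym table only grows from user-reported samples.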
Defpass file will grow as we add to the compilation
SOMEWHERE DOWN THE ROAD, WE’LL HAVE A PROBLEM. We can run into issues PARSING, REQUESTING, etc. Using Python’s traceback module to give meaningful, discrete feedback on errors.
Every site is different, even more so if you’re working with external sites. I’ve done quite a bit to prevent an error from killing the thread. Error.log will hold all of your trace information, which makes it easy to troubleshoot, or to sanitize and send to the author so he can fix his mistake… Now on to the demo…
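The pattern described in these notes – catch everything inside the worker so one bad site cannot kill the thread, and write the full traceback to error.log for later troubleshooting – looks roughly like this sketch (the `check_site` callable and log path are placeholders, not RAWR's actual internals):

```python
import threading
import traceback

# Serialize writes to the shared log file across worker threads.
LOG_LOCK = threading.Lock()

def worker(url, check_site, log_path="error.log"):
    """Run one site check; on failure, record the traceback and move on."""
    try:
        return check_site(url)
    except Exception:
        # traceback.format_exc() captures the full stack for the log
        # (or a sanitized bug report) without re-raising, so the
        # exception never propagates up and kills the thread.
        with LOG_LOCK, open(log_path, "a") as log:
            log.write("error on %s\n%s\n" % (url, traceback.format_exc()))
        return None
```

Catching bare `Exception` is deliberate here: against arbitrary external sites, the set of possible parsing and request failures is effectively open-ended.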
We’ll see how to use RAWR in an actual web assessment…
PyCharm. Tom’s picking up Python (WEPT class). Coders? Use the knowledge of the more experienced guys – let them point out the need, then make it happen. Everyone I know doing web assessments has been required to write at least one script in the exploitation phase of the analysis (validation of a weakness, non-cookie-cutter).
Beautify the HTML. Work out the SSL parsing. Talk to the malware researchers, the pentesters, and the developers.