The Web Application Hacker's Toolchain


A cursory look at new tools and techniques for 2012

1. About Me:
• Twitter: @jhaddix
• I blog like I know stuff
• Former VA/netpen turned webpen
• Currently work for HP Application Security Center (webpen, netpen, mobile, etc.)
• Random projects:
  • Open Penetration Testers Bookmark Collection
  • Nmap http-enum fingerprints
  • Ghetto Nessus parsers
  • Burp hacking presentation
• No, I can't get you a TouchPad
• I love talking about hacking, and I like to drink beer and gin and tonics
  • I don't know if that's a girly drink in Brussels =(
  • You're welcome to educate me…
2. Words for the Wise:
"Until a man is twenty-five, he still thinks, every so often, that under the right circumstances he could be the baddest motherfucker in the world. If (he) moved to a martial-arts monastery in China and studied real hard for ten years. If (his) family was wiped out by Colombian drug dealers and (he) swore (himself) to revenge. If (he) got a fatal disease, had one year to live, and devoted it to wiping out street crime. If (he) just dropped out and devoted (his) life to being bad.
Hiro used to feel this way, too, but then he ran into Raven. In a way, this was liberating. He no longer has to worry about being the baddest motherfucker in the world. The position is taken."
― Neal Stephenson, Snow Crash
3. Workshop:
Done a few conference talks, never done a workshop:
• I'm going to move fast; you will get the slides from the con.
• Videos of the demos will be available shortly after the conference.
• If there's something you want to know, just pull me aside sometime or catch me around the con. I'll do my best to answer all questions.
• You're pretty much getting a whole class converted to a workshop =).
• Excuses!
• OK… let's do it.
4. Web Hacking Tool Classes:
• OSINT (passive or semi-passive)
• Discovery (usually directory brute-forcing or platform identification)
• Brute force (password bruting tools)
• Proxies (usually include spiders)
• Fuzzers/scanners (error or vuln identification tools)
• Exploitation (vuln exploitation tools)
• Data aggregation
5. What am I, a script kiddie?
• Yes and no. You're a pentester, which means you have approximately 40 hours to do what a blackhat has months to do.
• We need to identify technologies faster, find vulns faster, and speed up the attack process.
• We need to identify the best process and tools to use, even for our manual web pentesting.
6. GOAL: Gather data that will be useful in a web pentest without (or while minimally) interacting with the target.
• Google hacking: SearchDiggity
• Metadata: FOCA
• Email gathering: theHarvester, Metasploit
7. SearchDiggity:
The SearchDiggity tools are basically automation of Google/Bing hacking queries. Think of about a thousand vulnerability checks executed against your target, except they are not actually touching your target, only the search engine cache.
8. GOAL: Free vulnerability checks, aka Google hacking.
SearchDiggity:
• Requires an AJAX Search Query API key
• 100 queries per day unless you register a CC
• Buy a pre-paid Visa for $10
• Find vulns fast
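As a rough sketch of what the tool automates, here is hand-rolled query generation for one target. The four dorks below are an illustrative subset I picked, not the real Diggity signature database.

```python
# Sketch of the query generation SearchDiggity automates: take one target
# domain and expand it into a batch of search-engine dorks.
# The dork patterns here are illustrative examples only.
def build_dorks(domain):
    patterns = [
        'site:{d} filetype:sql "insert into"',
        'site:{d} inurl:admin',
        'site:{d} intitle:"index of" "parent directory"',
        'site:{d} ext:log',
    ]
    return [p.format(d=domain) for p in patterns]

for query in build_dorks("example.com"):
    print(query)
```

Each generated string is then submitted to the search engine (or its cache) rather than to the target itself, which is what keeps this phase passive.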
9. GOAL: Extract domain usernames, internal pathing, software versions, etc.
FOCA:
FOCA is a Windows tool that spiders a domain for documents using Google/Bing/Exalead, downloads them, and then extracts relevant metadata and server information.
• Always go see these guys at DC
10. GOAL: Extract domain usernames, internal pathing, etc.
11. GOAL: Gather email addresses for forms-based logins, etc.
One of the first parts of recon in a pentest is gathering valid login names and emails. We can use these to profile our target, brute-force authentication systems, send client-side attacks (through phishing), look through social networks for juicy info on platforms and technologies, etc.
Where do we get this info? Well, without doing a full-blown open source intelligence (OSINT) style assessment, we can use two simple scripts:
• Metasploit's search_email_collector.rb
• theHarvester
12. Metasploit, under modules/auxiliary/gather, has search_email_collector.rb, which uses search techniques for Google, Bing, and Yahoo.

  /framework3/msfcli auxiliary/gather/search_email_collector DOMAIN=your_target_domain OUTFILE=output_file E

  Running MSF search_email_collector...
  [*] Please wait while we load the module tree...
  [*] Harvesting emails .....
  [*] Searching Google for email addresses from
  [*] Extracting emails from Google search results...
  [*] Searching Bing email addresses from
  [*] Extracting emails from Bing search results...
  [*] Searching Yahoo for email addresses from
  [*] Extracting emails from Yahoo search results...
  [*] Located 7 email addresses for
  [*]
  [*]
  [*]
  [*]
  [*]
  [*]
  [*]
13. theHarvester (just updated to v2.1) has now fixed some of its previous bugs. It supports searching Google, Bing, PGP key servers, Shodan, DNS brute-forcing, and LinkedIn.

  zombie@haktop:/tools/email/theHarvester# ./ -d -b google -l 500

  Accounts found:
  ===================
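Under the hood, both scripts boil down to pulling addresses for one domain out of search-result pages. A minimal sketch of that step, using a hypothetical sample page:

```python
import re

# Sketch of the core of email-harvesting tools: once the search-engine
# result pages are fetched, extract addresses belonging to one domain.
# The sample page and addresses below are hypothetical.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def harvest_emails(html, domain):
    """Return the unique, lowercased addresses in `html` for `domain`."""
    found = {m.group(0).lower() for m in EMAIL_RE.finditer(html)}
    return sorted(e for e in found if e.endswith("@" + domain))

page = "Contact alice@example.com or Bob@example.com; spam@other.org ignored."
print(harvest_emails(page, "example.com"))  # ['alice@example.com', 'bob@example.com']
```

The real tools add the fetching, pagination, and multiple search backends on top of exactly this filtering step.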
14. theHarvester:
15. There's also a ton of OSINT sites to help identify server information without ever touching your target yourself:
• Netcraft (uptime survey, server info)
• Domain Tools (whois lookup and domain info)
• (traceroute, nslookup, automatic whois lookup, ping, finger)
• (GeoIP, whois, host, dig, blacklists, ping, traceroute & nmap)
• (WHOIS and reverse IP service/virtual hosting info)
• Bing IP search
• SSL Labs – Projects / Public SSL Server Database – SSL Server Test
• SHODAN – computer search engine (indexed port scans and banner grabs)
• Chris Gates presented on OSINT at BruCON 2009
• Other good OSINT resources:
  • intelligence-gathering
16. Now we need to map the site. Some issues we need to deal with when mapping the site are poor AJAX support in spiders (we go over this later) and finding non-linked resources. To find non-linked resources we brute-force common file/path names, framework paths, etc.
Discovery tools:
• DirBuster & Wfuzz
• SVNDigger lists
• FuzzDB and RAFT lists
• (optionally) Nmap http-enum & CMS Explorer
17. DirBuster is a cross-platform directory brute-forcer written in Java (GUI app).
Tips:
• Disable recursion and redirects for faster, leaner bruting (our spiders will follow redirects later)
• We can change threading on the fly
• DirBuster's built-in lists are from a project that basically spidered the whole internet.
18. Wfuzz is a command-line equivalent, with a bit more functionality for general web fuzzing (filter by response code, word count, character count, etc.):
• Also has some lists!
19. 301s sometimes redirect to very interesting subdomains or promo pages.
20. Alright, so for non-linked resources and discovery we can use DirBuster's lists or Wfuzz's, but they are very generic (that's not necessarily a bad thing). Like I mentioned before, DirBuster's are based off of spidering the net and aggregating the most common directory data and common words (partially).
But isn't that finding linked resources? Yes.
There are some more options for us as far as lists go:
• SVNDigger – a set of directory lists based on the pathing of open source projects on Google Code and SourceForge.
• RAFT lists – within the Discovery/PredictableRes path.
21. and Google Code: 400k words (5,000 projects), directories sorted by project:
• Type
• Extension contained
• Context
• DB type
Most coverage/success comes from running the "all" lists; pick as you need.
22. RAFT is a recent proxy project released at BHUSA 2011, with a set of wordlists for content discovery based upon spidering 1.7 million "robots.txt" disallows and contextual framework paths. There's some overlap with SVNDigger. The lists themselves are downloadable from the RAFT site, but they are also contained in the FuzzDB Discovery/PredictableRes directory, which we'll be seeing in the tactical fuzzing section later.
Broken down into directories, words, and files, which takes us to smarter recursive content discovery…
23.
1. Use raft-large-directories in DirBuster or Burp
2. Take the successful output and add it to a Burp Intruder setup (cluster bomb) as payload set 1
3. Add raft-large-files.txt as the 2nd payload set
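Steps 2 and 3 amount to a cluster bomb: every discovered directory paired with every filename candidate. A minimal sketch with tiny stand-in wordlists (the real raft lists are thousands of entries):

```python
from itertools import product

# Sketch of the Intruder "cluster bomb" pairing: every directory found in
# step 1 is combined with every filename candidate from raft-large-files.
# These wordlists are tiny stand-ins for the real raft lists.
dirs = ["admin", "backup", "include"]
files = ["config.php", "login.aspx", "web.config"]

candidates = ["/%s/%s" % (d, f) for d, f in product(dirs, files)]
print(len(candidates))  # 9: len(dirs) * len(files) requests
```

This is also why the request count explodes quickly: cluster bomb is multiplicative, so trimming the directory list after step 1 matters.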
24. Storytime
25. Hopefully at this point we have some logins or emails from the OSINT section to try brute-forcing authentication with. I prefer Burp Suite's Intruder module for brute-forcing authentication.
1. Attempt a login
2. Go to the Proxy History tab
3. Find the POST request
4. Send to Intruder
5. Use the cluster bomb payload
6. Clear all payload positions
7. Mark the username and password fields as payload positions
8. Go to the "Payloads" tab
9. Set "payload set" 1 to your username list
10. Set "payload set" 2 to your password list
11. Click on the Intruder menu
12. Start Attack
13. Look for different lengths, or grep possible successful-auth messages under Options
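The last step ("look for different lengths") can be sketched as a tiny triage script: failed attempts tend to share one response length, so anything else is worth a manual look. The tuples below are made-up sample data, not real Intruder output.

```python
from collections import Counter

# Sketch of response-length triage after a cluster bomb run: find the
# most common (presumably "login failed") length, then flag the outliers.
# (username, password, response_length) tuples are hypothetical samples.
attempts = [
    ("admin", "123456", 1342),
    ("admin", "letmein", 1342),
    ("admin", "s3cret!", 5891),  # different length -> likely a real login
    ("guest", "guest", 1342),
]

def odd_ones_out(results):
    counts = Counter(length for _, _, length in results)
    common_length = counts.most_common(1)[0][0]
    return [(u, p) for u, p, length in results if length != common_length]

print(odd_ones_out(attempts))  # [('admin', 's3cret!')]
```

Grepping for a known success string (step 13's other half) is more reliable when you know what a good login looks like; length triage is the fallback when you don't.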
26. With some valid usernames we want to up our chances of bruting a valid password. Ron Bowes (@iagox86) has done some fantastic password research and has archived many of the lists that have been leaked on the web.
• Huge password repository
• Actual user data from hacked sites:
  • RockYou (RockYou 75 is a winner)
  • phpBB
  • MySpace
  • Hotmail
  • Hak5
  • Facebook
  • More…
27. Just a few…
28. GOAL: Spider the site, identify fuzz points, chain and feed the scanner.
For proxies and spidering I use Burp Suite. There are some good Paros forks (ZAP), but Burp, even the free version, has much more power and extensibility.
29. Fiddler is a unique and powerful option as well, due to some great plugins such as Watcher (for passively identifying user-controllable input) and x5s (for identifying possible XSS insertion points). These can help us later when we want to start tactically fuzzing.
30. Proxies sit between your browser and the site, but they can also enhance your testing when you chain them with your other tools. This is great if your scanner has a proxy mode: this way we can walk through the functionality of the site, hit it with two different spider engines, and finally attack it with our scanner. Additionally, chaining proxies and scanners can help us deal with auth/session issues in hard-to-scan environments (NTLM/Kerberos). If you're sticking with open source tools or non-proxy-mode scanners, you can export your spider results as links and import them into your scanner.

Browser -> Burp -> Scanner (in proxy mode) -> Site

1. Walk the app, executing all AJAX and rich functionality (Snap Links is handy)
2. Browse to anything from the discovery stage to populate the proxy and scanner
3. Spider with the proxy of choice
4. (optional, this might pollute your site tree) Fuzz with the fuzzer/proxy of choice
5. Run the scan

This all gets fed to the scanner sitemap/tree. Now the scanner has the best chance of finding all fuzz points and vulns.
31. You said scanners! Which ones?!
Shay Chen has some excellent research on the accuracy of open source and commercial scanners.
• Only covers XSS and SQLi atm
32. Tactical fuzzing? Wtf?
Now that you have the blanket stuff out of the way, it's time to interpret the proxy and scanner data for tactical fuzzing points.
• Does this functionality display something back to the user? Fuzz for XSS.
• Does it interact with a database? Fuzz for SQLi or other injections.
• Does it call on the server file system? Fuzz for LFI/path traversal.
• Does it call on a URL or external/internal site/domain? Fuzz for RFI.
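The decision list above is essentially a lookup table. A minimal sketch, with category names that are illustrative rather than literal fuzzdb paths:

```python
# Sketch of the tactical-fuzzing decision list as a lookup: map what a
# parameter touches to the payload class worth throwing at it first.
# Keys and values here are illustrative labels, not literal fuzzdb paths.
SINK_TO_PAYLOADS = {
    "reflected_in_page": "xss",
    "database_query": "sqli",
    "filesystem_path": "lfi-path-traversal",
    "fetches_url": "rfi",
}

def payload_class(sink):
    # Anything we can't classify gets generic fuzzing rather than nothing.
    return SINK_TO_PAYLOADS.get(sink, "generic-fuzz")

print(payload_class("database_query"))  # sqli
print(payload_class("unknown_widget"))  # generic-fuzz
```

The point is prioritization: instead of spraying every payload at every parameter, classify the sink first and spend the request budget where it can pay off.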
33. Now we can fully utilize the project we mentioned a few times earlier, the Fuzz Database:
"The fuzzdb aggregates known attack patterns, predictable resource names, server response messages, and other resources like web shells into the most comprehensive open source database of malicious and malformed input test cases."
34.
1. Use fuzzdb strings on all the aforementioned forms and parameters
2. Re-fuzz all parameters that gave errors in the spidering/scanning results
3. After concretely identifying the platform, re-fuzz/content-discover with that platform's specific lists
35. The fuzzdb also has an excellent error/vuln grep file for import into Burp:
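The same grep idea is easy to reproduce outside Burp. A minimal sketch with a tiny, illustrative subset of error signatures (the real fuzzdb list is much longer):

```python
import re

# Sketch of the error-grep idea: scan response bodies for known error
# strings that betray SQL injection, file inclusion, etc.
# These four signatures are a small illustrative subset.
ERROR_SIGNATURES = [
    r"You have an error in your SQL syntax",
    r"ORA-\d{5}",
    r"Warning: include\(",
    r"Unclosed quotation mark",
]
ERROR_RE = re.compile("|".join(ERROR_SIGNATURES))

def flag_errors(body):
    """Return the first matched error signature, or None if the body is clean."""
    m = ERROR_RE.search(body)
    return m.group(0) if m else None

print(flag_errors("boom: ORA-01756 quoted string not properly terminated"))
```

Loaded into Burp's grep-match options (or run over saved responses), this turns a pile of fuzzing output into a short list of responses worth reading.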
36. When it comes to exploitation tools, mostly we need some automagic tools to exploit different forms of SQL injection or file include vulnerabilities. For manual testing we also need a set of web shells.
Our standards are:
• SQLmap, Havij, and SQLninja for SQL injection
• fimap and Metasploit for file include vulnerabilities
• a common set of web shells from the fuzzdb
37. SQLmap is a comprehensive SQL injection tool with the ability to do many forms of injection.
SQLmap tips:
• -l can import Burp logs to test your hosts (when saving in Burp, use only your in-scope targets):
  ./sqlmap -l /root/sqli.txt
• Often we want to force POST parameters; setting --data will force a POST request:
  --data="userid=test&pass=test"
• We can specify parameters with -p:
  ./sqlmap -u TARGET -p userid,pass
• --level=LEVEL (level of tests to perform, 1-5, default 1) has to do with insertion points.
• --risk=RISK (risk of tests to perform, 0-3, default 1) has to do with test cases.
  ./sqlmap -l /root/sqli.txt --level=5 --risk=3
• You can max out speed at --threads=10
• --forms will parse and test all forms on the target
• --os-pwn for a possible Meterpreter shell
38. Other tools mentioned help us in edge cases:
• Havij for very up-to-date WAF evasion (mod_security). Use at your own risk.
• SQLninja when SQLmap will not exploit
• fimap for file include exploitation
• Metasploit for remote file includes: exploit/unix/webapp/php_include
39. We also need some standalone shells in several different languages for upload vulns. Luckily the FuzzDB has these as well.
40. What about taking XSS beyond alert('xss')?
BeEF is the best tool for JavaScript attacks. It's more extensible now that it integrates with Metasploit. We can now:
• Hook the browser with invisible iframes
• Inject/change content on the fly
• Footprint the internal network
• Sniff keystrokes
• Deliver browser-based exploits or Metasploit Meterpreter Java payloads for full control of the target
41. Video
42. What about web services, SOAP, XML?
With a WSDL and SoapUI proxied through Burp, and tactically fuzzing with the fuzzdb test cases, we can do more than any script or tool I've seen released.
43. I don't have a fancy portal to put my data in =(
The Dradis framework has been revamped to accept a ton of tool outputs, allowing us to import data and keep working faster.
Imports:
• Nmap
• Burp
• Nessus
• Metasploit
• Netsparker
• OpenVAS
• w3af
Mind-mapping software works well too.
44. With all this out of the way semi-quickly, we can now take more time to tactically fuzz and test for logic flaws and more obscure manual checks!
45. Special thanks go out to: Andre Gironda, Chris Gates, Armando Romeo, Joe McCray, James Fitts, Bernardo Damele, Daniel Miessler, Ferruh Mavituna, Shay Chen, Ron Bowes, Adam Muntner, and all tool authors.