This document discusses how Google can be used to find confidential information and vulnerable systems on the internet. It provides examples of Google search queries that can locate specific versions of web servers, as well as queries to find unsecured installations of common web applications with known vulnerabilities like WebJeff Filemanager and Advanced Guestbook. The document encourages readers to think carefully about the type of sensitive data that users carelessly make public and how it could aid attackers.
The document discusses how to use Google searches and operators to find sensitive information that could be useful for hackers. Some key points discussed include using intitle and inurl operators to find login portals and server configuration files containing passwords. Examples are given of searches to find passwords, credit card numbers, software serial numbers, and even live video feeds from unsecured cameras. The document warns that exploiting any found vulnerabilities would be unethical.
The Google Hacking Database: A Key Resource to Exposing Vulnerabilities (TechWell)
We all know the power of Google—or do we? Two types of people use Google: normal users like you and me, and the not-so-normal users—the hackers. What types of information can hackers collect from Google? How severe is the damage they can cause? Is there a way to circumvent this hacking? As a security tester, Kiran Karnad uses the GHDB (Google Hacking Database) to ensure their product will not be the next target for hackers. Kiran describes how to effectively use Google the way hackers do, using advanced operators, locating exploits and finding targets, network mapping, finding user names and passwords, and other secret stuff. Kiran provides a recipe of five simple security searches that work. Learn how to automate the Google Hacking Database using Python so security tests can be incorporated as a part of the SDLC for the next product you develop.
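The GHDB-automation idea can be sketched in a few lines of Python. Everything here is illustrative: `DORKS` is a tiny sample of GHDB-style queries, and `run_search` is a hypothetical callable standing in for whatever search backend is used (within its terms of service):

```python
# Sketch: automating GHDB-style checks against your own domain.
# DORKS is a tiny illustrative sample; the real GHDB has thousands of entries.
# run_search is a hypothetical stand-in for a real search backend.

DORKS = [
    'intitle:"index of" "parent directory"',
    'filetype:sql "insert into"',
    'inurl:admin intitle:login',
]

def build_query(domain, dork):
    """Scope a dork to a single domain with the site: operator."""
    return f"site:{domain} {dork}"

def audit(domain, run_search):
    """Return the dorks that produced at least one hit for `domain`."""
    findings = []
    for dork in DORKS:
        if run_search(build_query(domain, dork)):
            findings.append(dork)
    return findings

# Example with a stubbed backend that "finds" one exposed directory listing:
hits = audit("example.com", lambda q: "index of" in q)
print(hits)
```

Plugging in a real backend turns this into a recurring check that can run inside a CI pipeline, which is the kind of SDLC integration the talk describes.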
Google dorks are search operators used to refine Google searches. They can be used to access secure webpages, download files, or access security cameras. Common dorks include "site:", "inurl:", "intitle:", and "filetype:" or "ext:". SQL injection is a code injection technique that exploits security vulnerabilities in database applications. It works by inserting SQL commands into user input fields to alter the meaning of SQL queries and gain unauthorized access to databases. Defenses include input validation, prepared statements, limiting privileges, and intrusion detection systems.
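As a concrete illustration of the prepared-statement defense mentioned above, here is a minimal Python sketch using the stdlib `sqlite3` module (the table and data are invented for the example):

```python
import sqlite3

# In-memory database with an illustrative users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name):
    # VULNERABLE: user input is spliced directly into the SQL string.
    return conn.execute(
        f"SELECT name FROM users WHERE name = '{name}'").fetchall()

def login_safe(name):
    # Prepared statement: the ? placeholder keeps input as data, not SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(login_unsafe(payload))  # injection returns every row
print(login_safe(payload))    # parameterized query returns nothing
```

The classic `' OR '1'='1` payload rewrites the unsafe query's WHERE clause into a tautology, while the parameterized version treats the whole payload as a literal (non-matching) name.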
The document provides an overview of techniques for using Google to perform reconnaissance and searches. It discusses using Google to find information about people by searching for files containing personal details. It also describes using advanced Google search operators and techniques like crawling domains to find additional pages. The document warns that exposing sensitive information or vulnerabilities online could enable malicious activities.
Google Dorks: Analysis, Creation, and new Defenses (Flavio Toffalini)
1) The document analyzes Google dorks, which are search queries used by attackers to find vulnerable systems. It presents a taxonomy of existing dorks and motivates the need for new defenses.
2) The authors introduce a new type of "word-based" dork that uses common words left by content management systems to target websites built with those systems. An algorithm is developed to automatically generate effective word-based dorks.
3) Defenses against word-based dorks are proposed, such as inserting invisible Unicode characters into common words to prevent search engines from indexing them. The document concludes by motivating the need for continued research into new dork types and defenses.
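A minimal sketch of that defense, assuming the goal is simply to break exact-phrase matching: insert a zero-width character (here U+200B, ZERO WIDTH SPACE) into the telltale word so it renders identically but no longer matches the dork string byte-for-byte:

```python
ZWSP = "\u200b"  # zero-width space: invisible when rendered in a browser

def cloak(word):
    """Insert a zero-width space after the first character of a word."""
    return word[0] + ZWSP + word[1:]

# Illustrative CMS fingerprint string that a word-based dork might target.
footer = "Powered by ExampleCMS"
cloaked = cloak("Powered") + footer[len("Powered"):]

print(cloaked == footer)                    # False: the strings differ...
print(cloaked.replace(ZWSP, "") == footer)  # True: ...only by invisible chars
```

Whether a given search engine normalizes away zero-width characters is an empirical question, which is part of why the paper calls for continued research on defenses.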
This document provides an overview of using Google searches to gather information for hacking purposes. It discusses techniques like Google bombing, using advanced operators and wildcards to refine searches, searching titles, URLs, and filetypes, and using Google as a scanner to find vulnerable CGI scripts. It also covers automation of Google searches and tools like Gooscan for finding security vulnerabilities. The document warns that exploiting vulnerabilities goes beyond passive searching.
The document discusses how Google search queries can be used to gather confidential information from websites. It provides examples of advanced search syntaxes like "intitle:", "inurl:", and "filetype:" that can help find vulnerable sites. Specific queries are given that could reveal password files, server directories, or other sensitive files accessible online but not found through basic searches. The document aims to help administrators secure their servers and sites from being invaded through Google's search capabilities.
This document discusses how Google can be used to find confidential information and vulnerabilities on the internet. It provides examples of Google search queries that can locate sensitive data like personal details, system configurations, and error messages containing passwords. The author advises administrators to regularly patch systems and remove unnecessary details from public pages to prevent exposing vulnerabilities.
A Google dork query, sometimes just referred to as a dork, is a search string that uses advanced search operators to find information that is not readily available on a website. Google dorking, also known as Google hacking, can return information that is difficult to locate through simple search queries.
The document discusses using Google hacking techniques to locate vulnerabilities on websites. It describes what Google hacking is, which is using Google to find sensitive information that may have been exposed due to poor web application security. It provides examples of what attackers can do with vulnerable websites, such as file inclusion, SQL injection, and arbitrary file uploads. It also discusses the Google Hacking Database (GHDB), which is a collection of Google dorks or search queries that have revealed vulnerabilities. Finally, it covers some basics of Google hacking like using the Google cache to crawl website information and using Google as a proxy server.
The document contains a list of search strings that can be used to find potential vulnerabilities on websites and web applications. Some of the search strings look for pages indicating login portals for administrative access, content management systems, and other common internet-facing applications. Other search strings try to identify specific applications or technologies like vBulletin, ColdFusion, and iSecure. The overall document appears to be sharing ways to search for unprotected administrative or backend interfaces online.
This document provides an introduction to MongoDB and Python. It discusses how to install and run MongoDB, set up a Python environment connected to MongoDB, perform basic read and write operations on MongoDB collections from Python. It also covers common patterns for modeling data in MongoDB like embedding documents and indexing, and integrating MongoDB with popular Python web frameworks.
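The embedding pattern mentioned above can be sketched with plain Python dicts; the `post` document and the dotted-path helper are illustrative, and the commented-out lines show the corresponding pymongo calls a real deployment would use:

```python
# Embedding: store comments inside the post document rather than in a
# separate collection, so one read fetches the post and its comments.
post = {
    "_id": 1,
    "title": "Hello MongoDB",
    "comments": [                      # embedded documents
        {"author": "ann", "text": "nice"},
        {"author": "bob", "text": "+1"},
    ],
}

# The equivalent pymongo calls would be, e.g.:
#   db.posts.insert_one(post)
#   db.posts.find_one({"comments.author": "ann"})

def match_embedded(doc, path, value):
    """Mimic a dotted-path query like {"comments.author": "ann"}."""
    field, sub = path.split(".")
    return any(item.get(sub) == value for item in doc[field])

print(match_embedded(post, "comments.author", "ann"))  # True
```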
In this talk, I will walk through multiple tools/resources available to help you handle large datasets from log files to Google Analytics. These new techniques will empower you to find more valuable insights and help you avoid the annoyance of crashing Excel spreadsheets.
The document discusses several topics related to URLs and domains including:
- The components of a URL like protocol, hostname, path, query, and fragment
- Common HTTP status codes like 200, 404, 301, and 302
- The Robots Exclusion Protocol and common mistakes in robots.txt files
- Issues around parameter tracking, duplicate content across domains/URLs, and gateway pages
- The importance of sitemaps and tools from search engines like Live Search, Google, and Yahoo to diagnose crawl issues and analyze links.
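The URL anatomy and robots.txt points above can be illustrated with Python's stdlib `urllib`; the URL and the robots rules are made up for the example:

```python
from urllib.parse import urlsplit
from urllib.robotparser import RobotFileParser

# URL components: scheme (protocol), hostname, path, query, fragment.
parts = urlsplit("https://shop.example.com/products?id=42&ref=mail#reviews")
print(parts.scheme)    # https
print(parts.hostname)  # shop.example.com
print(parts.path)      # /products
print(parts.query)     # id=42&ref=mail
print(parts.fragment)  # reviews

# Robots Exclusion Protocol: a common mistake is assuming Disallow blocks
# indexing; it only asks compliant crawlers not to fetch the path.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])
print(rp.can_fetch("*", "https://shop.example.com/products"))   # True
print(rp.can_fetch("*", "https://shop.example.com/private/x"))  # False
```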
Web Technology and google code sh (Suyash Gupta)
This document discusses Google search and the code behind it. It provides an overview of how Google crawls websites and indexes keywords to pages. It explains how Google gets all the links on a page, crawls websites in depth, adds pages to an index, and records user clicks to enhance search relevance. Code examples are provided for crawling functions, getting links, adding pages to an index, and tracking click counts. Tips are also included for enhancing Google searches by using filters like filetype.
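A toy version of the crawl-and-index loop described above, run over an in-memory "web" instead of live pages (the page contents are invented; link and text extraction use the stdlib HTMLParser):

```python
from html.parser import HTMLParser

# Tiny in-memory "web": path -> HTML. Pages are invented for the example;
# a real crawler would fetch URLs over HTTP.
PAGES = {
    "/": '<a href="/a">alpha</a> <a href="/b">beta</a> welcome',
    "/a": '<a href="/b">beta</a> apples',
    "/b": 'bananas',
}

class PageParser(HTMLParser):
    """Collect link targets and visible words from a page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]
    def handle_data(self, data):
        self.words += data.split()

def crawl(start):
    """Breadth-first crawl from `start`; build a word -> set-of-pages index."""
    index, queue, seen = {}, [start], {start}
    while queue:
        page = queue.pop(0)
        parser = PageParser()
        parser.feed(PAGES[page])
        for word in parser.words:
            index.setdefault(word, set()).add(page)
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

print(sorted(crawl("/")["beta"]))  # pages whose text contains "beta"
```

The same three steps the document describes (get the links on a page, crawl in breadth-first order, add pages to an index keyed by word) appear here in miniature.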
As shown at BSides Charm in Baltimore on April 23, here is my presentation on how a hacker looks at a web site; it can also be used as a checklist for a web application pentest. Feedback appreciated at plaverty9
This document summarizes a presentation about a web server hack at Brown University. The hack occurred due to overly permissive file permissions on the server that allowed a hacker to add malicious PHP files. These files redirected search engine traffic to pages selling pharmaceuticals, overloading the server with traffic and causing a denial of service. The presentation describes the timeline of events, how the file permissions issue allowed the hack to occur, examples of malicious code used, and steps taken to resolve the problem.
The document discusses diagnosing issues with hreflang tags in sitemaps and on web pages for multilingual and localized websites. It outlines some common problems with hreflang tags such as conflicts in tags, broken or redirecting links, incorrect language and country codes, and issues around the European Union. The document then provides guidance on how to diagnose hreflang tag issues through tools like Google Search Console, Screaming Frog, and Excel to identify missing, non-canonical, or incorrectly coded tags. It emphasizes using data to identify and correct hreflang tag problems.
Building Beautiful REST APIs in ASP.NET Core (Stormpath)
ASP.NET Core 1.0 is the latest iteration of ASP.NET. What’s changed? Everything! Nate Barbettini, .NET Developer Evangelist at Stormpath, does a deep dive on how to build RESTful APIs the right way on top of ASP.NET Web API.
Victoria Olsina outlined three basic SEO techniques she used to outrank IBM on the enterprise blockchain vertical:
1. Building topical authority by creating 28 pages about blockchain use cases
2. Optimizing content length by writing articles with at least 2000 words
3. Targeting featured snippets by answering related questions in a short paragraph and listing format within articles
This approach led to an exponential increase in organic traffic without a huge budget by focusing on the fundamentals of SEO.
.htaccess for SEOs - A presentation by Roxana Stingu
The .htaccess file is famous for helping us set redirects, but it can also improve a website’s loading times and help with certain crawling and indexing issues. Learn where the file can be found, how it compares to httpd.conf, how it can be used to set redirects and deal with duplicate content, what performance issues it can encounter, and how it can help you create custom 404 pages, leverage browser caching and gzip, disable image hotlinking, and add canonical tags and robots directives in the HTTP headers, plus what tools and resources can help you learn even more.
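A few representative `.htaccess` directives of the kind covered in the talk; the paths and content types are illustrative, and the `<IfModule>` guards reflect that availability depends on which Apache modules are enabled:

```apache
# 301 redirect an old URL to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/

# Custom 404 page
ErrorDocument 404 /errors/not-found.html

# Leverage browser caching for images (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
</IfModule>

# Compress text responses (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css
</IfModule>
```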
Redefining technical SEO & how we should be thinking about it as an industry ... (WeLoveSEO)
It’s time to throw the traditional definition of technical SEO out the window. Why? Because technical SEO is so much bigger than just crawling, indexing, and rendering. Technical SEO is applicable to all areas of SEO, including content development and other creative functions. Join this session to learn how to integrate technical SEO into all areas of your SEO program.
600+ SEARCHABLE Sourcing Tools compiled by Susanna Frazier (@ohsusannamarie)
This document provides a list of over 600 sourcing tools categorized by their functions. It describes each tool's name, current version, category and a brief description. The tools cover a wide range of functions including search, social media, email, documents, scheduling and more. They allow users to easily access information, automate tasks and integrate various online services.
The document discusses securing WordPress sites from three perspectives: a user, system administrator, and developer. For users, it recommends choosing trusted plugins/themes, keeping everything updated, backups, strong passwords, and security plugins. For administrators, it recommends server configuration hardening like HTTPS, limiting permissions. For developers, it stresses sanitization, validation, escaping and secure coding practices. Responsible vulnerability disclosure is also covered.
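WordPress itself is PHP, but the escaping principle the document stresses is language-agnostic; here is a minimal Python sketch using the stdlib `html.escape`:

```python
from html import escape

# Untrusted input, e.g. a comment field submitted by a visitor.
comment = '<script>alert("xss")</script>'

# Escaping on output neutralizes markup so it displays as literal text
# instead of executing in the visitor's browser.
safe = escape(comment)
print(safe)  # &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```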
This document discusses how Google can be used to find confidential information and vulnerable systems on the internet. It provides examples of Google search queries that can locate specific software versions, default website pages, and personal details. The document warns that many systems have outdated or unpatched software exposing sensitive data publicly online through search engines like Google. It encourages administrators to regularly update software and remove version details from webpages to prevent discovery by attackers searching for vulnerabilities.
This document discusses how Google can be used to find confidential information and vulnerabilities on the internet. It provides examples of Google search queries that can locate sensitive data like personal details, system configurations, and error messages containing passwords. The author advises administrators to regularly patch systems for known vulnerabilities and remove unnecessary details from webpages to prevent exposing sensitive information through search engines.
This document summarizes a presentation on search engine optimization (SEO) for Flash content. It discusses how search engines index Flash, including breakthroughs that allow indexing of text, links, and interactions. It emphasizes the importance of dynamic page ranking and getting links over initial page rank. Testing over long periods is recommended to understand how content is indexed. Tips provided include using descriptive text, metadata, and linking to optimize Flash content for search engines.
Google hacking involves using search engine commands and complex queries to locate sensitive data and vulnerable devices. Hackers can find vulnerable websites and devices listed in Google's index, as well as error pages, login pages, and default pages that leak information. While this is against Google's terms of service, little can stop hackers from using these techniques to select targets for attack.
Lab-4 Reconnaissance and Information Gathering A hacker.docx (LaticiaGrissomzz)
Lab-4: Reconnaissance and Information Gathering
A hacker uses many tools and methods to gather information about the target. There are two broad categories of information gathering methods: passive and active. These methods are detailed in the table below. In this lab, you will perform passive information gathering (gray-shaded column). In Lab 5, you will be performing active information gathering. Please review the table before starting this lab.
Information Gathering: Passive (Reconnaissance and Information Gathering) – This Week vs. Active (Scanning and Enumeration) – Next Week

Does the hacker contact the target directly?
- Passive: No direct contact with the target.
- Active: Direct contact with the target.

Are the activities logged?
- Passive: No audit records on the target.
- Active: An audit record might be created.

What kinds of tools are used?
- Passive: Web archives, Whois service, DNS servers, search engines.
- Active: Port scanners, network scanners, vulnerability scanners (Nessus, Nmap).

What information can a hacker collect?
- Passive: IP addresses, network range, telephone numbers, e-mail addresses, active machines, operating system version, network topology.
- Active: Live hosts on a network, network topology, OS version, open ports on hosts, services running on hosts, running applications and their versions, patching level, vulnerabilities.
In passive information gathering, the hacker does not directly contact the target; therefore, no audit logs have been created. Both non-technical (such as employee names, birth dates, e-mail addresses) and technical information (IP addresses, domain names) can be gathered. This information can be used in many ways in the subsequent steps of the attack. For example, the phone numbers or e-mail addresses you discovered can be used in social engineering attacks. DNS records or subdomain names can be used to leverage specific attacks against hosts or URLs.
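One of the passive lookups named above (a forward DNS query) can be sketched with Python's stdlib `socket` module; `localhost` is used here so the example runs without touching any real target:

```python
import socket

def resolve(hostname):
    """Forward DNS lookup: hostname -> IPv4 address, or None on failure."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# "localhost" keeps the example self-contained; in real reconnaissance a
# target's public hostname (e.g. a discovered subdomain) would go here.
print(resolve("localhost"))  # typically 127.0.0.1
```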
More notes on Reconnaissance and Information Gathering :
1) In this phase, an attacker may collect a lot of information without being noticed.
2) In some cases, an attacker may even discover vulnerabilities.
3) The information collected in this phase can be quite valuable when evaluated together with the information collected in the scanning and enumeration phase. For example, you might find the phone number and name of an employee in this phase, and you may find the computer's IP address in the active scanning phase. You can use these two pieces of information together to leverage a social engineering attack. An attacker increases the chance of gaining trust when s/he addresses the victim by name and mentions specific details about the victim's computer.
4) Companies should also perform reconnaissance and information gathering against themselves so that they can discover, before hackers do, what kind of information the company and its employees disclose.
In this lab, you will practice six passive methods of reconnaissance and information gathering. You must use the Kali VM in Sections 3, 5, and 6 of the lab.
Introduction to google hacking database (imthebeginner)
Google hacking means tips and tricks for using Google to search effectively and get the expected results. Hackers also use Google hacking skills to find private information that is present on the web, such as live web cameras, web application vulnerabilities, software versions, and much more.
Lab-4 Reconnaissance and Information Gathering (A hacker.docx) – LaticiaGrissomzz
Lab-4: Reconnaissance and Information Gathering
A hacker uses many tools and methods to gather information about the target. There are two broad categories of information gathering methods: passive and active. These methods are detailed in the table below. In this lab, you will perform passive information gathering (gray-shaded column). In Lab 5, you will be performing active information gathering. Please review the table before starting this lab.
Information Gathering
Passive (Reconnaissance and Information Gathering) – This Week
Active (Scanning and Enumeration) – Next Week
Does the hacker contact the target directly?
No direct contact with the target
Direct contact with the target
Are the activities logged?
No audit records on the target
Audit record might be created
What kinds of tools are used?
Web archives, Whois service, DNS servers, Search Engines
Port scanners, network scanners, vulnerability scanners (Nessus, Nmap)
What information can a hacker collect?
IP addresses, network range, telephone numbers, E-mail addresses, active machines, operating system version, network topology
Live hosts on a network, network topology, OS version, open ports on hosts, services running on hosts, running applications and their versions, patching level, vulnerabilities.
In passive information gathering, the hacker does not directly contact the target; therefore, no audit logs are created. Both non-technical information (such as employee names, birth dates, e-mail addresses) and technical information (IP addresses, domain names) can be gathered. This information can be used in many ways in the subsequent steps of the attack. For example, the phone numbers or e-mail addresses you discovered can be used in social engineering attacks. DNS records or subdomain names can be used to leverage specific attacks against hosts or URLs.
More notes on Reconnaissance and Information Gathering :
1) In this phase, an attacker may collect a lot of information without being noticed.
2) In some cases, an attacker may even discover vulnerabilities.
3) The information collected in this phase can be quite valuable when evaluated together with the information collected in the scanning and enumeration phase. For example, you might find the phone number and name of an employee in this phase, and you may find the computer IP address in the active scanning phase. You can use these two pieces of information together to leverage a social engineering attack. An attacker increases the chance of gaining trust when s/he addresses the victim by name and mentions something specific about the victim's computer.
4) Companies should also perform reconnaissance and information gathering against themselves so that they can discover -before hackers- what kind of information the company and company employees disclose.
In this lab, you will practice 6 passive methods of Reconnaissance and Information Gathering. You have to use the Kali VM in Sections 3, 5, and 6 of the lab.
Introduction to google hacking database – imthebeginner
Google hacking is the use of tips and tricks to search Google effectively and get the expected results. Hackers also use Google hacking skills to find private information present on the web, such as live web cameras, web application vulnerabilities, software versions and much more.
SearchLove San Diego 2019 - Britney Muller - Machine Learning: Know Enough To... – Distilled
What is Machine Learning and how can we apply it to digital marketing? After this session, you'll understand machine learning basics, what ML can be used for, examples of ML solving SEO tasks, and executable programs you can start using immediately.
This document provides a tutorial on using the Simple Object Access Protocol (SOAP) for communication between components. It introduces an example component, an HTML calendar widget, that can receive event listings and display a calendar of events. The tutorial defines an interface for the calendar widget using CORBA IDL and then demonstrates making a request to add an event listing using a SOAP HTTP request with an XML payload wrapped in a SOAP envelope and body. The SOAP request follows the defined interface by specifying the date and event description in the payload to add a new listing to the calendar widget.
The document provides 10 ideas for using Web 2.0 tools to keep company intranets fresh, including using Google Reader to subscribe to industry news and feeds, Flickr to share company photos, Google Video to stream videos, Google Apps for a turnkey intranet, Google Gadgets to add widgets, Live.com for custom maps, WordPress for blogging and content management, social networking, and Zocalo for internal prediction markets. It describes setting up each tool and its benefits for internal company use.
The document discusses mashups and various technologies used to create them such as Flex, E4X, HTTPService, crossdomain.xml, and AMF. It provides examples of using APIs from Amazon, Flickr, Yahoo, and Google to retrieve and combine data from multiple sources into new applications. It also discusses platforms like Yahoo Pipes that allow creating mashups visually without programming.
The document discusses advanced search techniques using Google, including searching by file type, domain, keywords in page title, and reverse telephone lookups. It also covers privacy issues when online like cookies, web bugs, browser identifiers, and spyware. Specific tools and websites are mentioned that can help identify information being leaked or find browser identifiers.
This document summarizes a presentation about integrating eZ Publish, an open-source content management system. Chapter 1 introduces eZ Publish and provides statistics on its usage. Chapter 2 discusses custom data modeling in eZ Publish. Chapter 3 covers using eZ Publish to manage multiple websites. Chapter 4 explains approaches to integrating external data through APIs. The final chapter takes questions from the audience.
This document introduces a new method of creating phishing web pages using data URIs, which allow web content to be hosted directly within a URI without a traditional web server. It describes how to create a basic phishing page by encoding all content like HTML, images, and scripts directly into a data URI. As a proof of concept, it includes an encoded phishing version of the Wikipedia login page as an example. The summary concludes that this technique could make phishing pages more difficult to trace and shut down since they have no defined hosting location on the internet.
This document discusses various tools and techniques for security testing and debugging web applications developed in C#. It provides information on static analysis tools like FxCop and CAT.NET that can analyze source code. It also covers automated testing tools like NUnit, HTMLUnit, and Selenium that can test web applications without using a real browser. The document demonstrates how to integrate these tools into testing workflows and addresses related topics like logging, exception handling, and custom error pages.
The document discusses different types of accidental source code disclosure including:
1) Source code available in HTML source code when dynamic pages are published as static pages.
2) Source code stored in readable backup files or configuration files that are accessible.
3) Websites that stream static files through a download script, allowing source code to be downloaded by changing the file name parameter.
c0c0n, previously known as Cyber Safe, is an annual event conducted as part of the International Information Security Day. The Information Security Research Association, along with the Matriux Security Community, is organizing a 2-day International Security and Hacking Conference titled c0c0n 2010 as part of Information Security Day 2010. The event is supported by the Kochi City Police. Various technical, non-technical, legal and community events are organized as part of the program. c0c0n 2010 is scheduled for 05–06 Aug 2010. The number of digital security incidents and cyber crimes is increasing daily at a proportionate rate. The industry is demanding more and more security professionals and controls to curb this never-ending threat to information systems. c0c0n is aimed at providing a platform to discuss, showcase, educate, understand and spread awareness on the latest trends in information, cyber and hi-tech crimes. It also aims to provide a hand-shaking platform for corporate and government organizations, including the various investigation agencies, academia, research organizations and other industry leaders and players, for better co-ordination in making the cyber world a better and safer place to be.
Maximiliano Soler gives a presentation on using Google to gather information without sophisticated mechanisms. He demonstrates how to use Google search operators ("dorks") to find vulnerable products, error messages, sensitive files and passwords, foot holds for access, and more. He recommends securing servers and applications, disabling directory browsing, not publishing sensitive info without authentication, and analyzing website search traffic for security.
News web app which gives updated news. The news is fetched from different news websites. This project is made using the Flask framework and Python.
The Source code is available at: https://github.com/prateek-code-22/News-App
If you find it useful, add a STAR to it.
Thanks :)
This presentation was given to some brilliant students in Bucharest who were working on various projects. It describes the basic features of Yahoo! BOSS as well as how to use a search API to build complete web sites and applications.
Similar to Dangerous google searching for secrets (20)
Dangerous Google – Searching for Secrets
Michał Piotrowski
This article has been published in issue 4/2005 of the hakin9 magazine.
All rights reserved. This file may be distributed for free pending no changes are made to its contents or form.
hakin9 magazine, Wydawnictwo Software, ul. Lewartowskiego 6, 00-190 Warszawa, en@hakin9.org
Information which should be protected is very often publicly available, revealed by careless or ignorant users. The result is that lots of confidential data is freely available on the Internet – just Google for it.
Basics

Google serves some 80 percent of all search queries on the Internet, making it by far the most popular search engine. Its popularity is due not only to excellent search effectiveness, but also extensive querying capabilities. However, we should also remember that the Internet is a highly dynamic medium, so the results presented by Google are not always up-to-date – some search results might be stale, while other relevant resources might not yet have been visited by Googlebot (the automatic script that browses and indexes Web resources for Google).

Table 1 presents a summary of the most important and most useful query operators along with their descriptions, while Figure 1 shows document locations referred to by the operators when applied to Web searches. Of course, this is just a handful of examples – skilful Google querying can lead to much more interesting results.
Hunting for Prey

Google makes it possible to reach not just publicly available Internet resources, but also some that should never have been revealed.
What You Will Learn...

• how to use Google to find sources of personal information and other confidential data,
• how to find information about vulnerable systems and Web services,
• how to locate publicly available network devices using Google.

What You Should Know...

• how to use a Web browser,
• basic rules of operation of the HTTP protocol.

About the Author

Michał Piotrowski holds an MA in IT and has many years' experience in network and system administration. For over three years he has been a security inspector and is currently working as a computer network security expert at one of the largest Polish financial institutions. His free time is occupied by programming, cryptography and contributing to the open source community.
Google hacking

Table 1. Google query operators

site – restricts results to sites within the specified domain. Sample query: site:google.com fox will find all sites containing the word fox, located within the *.google.com domain.
intitle – restricts results to documents whose title contains the specified phrase. Sample query: intitle:fox fire will find all sites with the word fox in the title and fire in the text.
allintitle – restricts results to documents whose title contains all the specified phrases. Sample query: allintitle:fox fire will find all sites with the words fox and fire in the title, so it's equivalent to intitle:fox intitle:fire.
inurl – restricts results to sites whose URL contains the specified phrase. Sample query: inurl:fox fire will find all sites containing the word fire in the text and fox in the URL.
allinurl – restricts results to sites whose URL contains all the specified phrases. Sample query: allinurl:fox fire will find all sites with the words fox and fire in the URL, so it's equivalent to inurl:fox inurl:fire.
filetype, ext – restricts results to documents of the specified type. Sample query: filetype:pdf fire will return PDFs containing the word fire, while filetype:xls fox will return Excel spreadsheets with the word fox.
numrange – restricts results to documents containing a number from the specified range. Sample query: numrange:1-100 fire will return sites containing a number from 1 to 100 and the word fire. The same result can be achieved with 1..100 fire.
link – restricts results to sites containing links to the specified location. Sample query: link:www.google.com will return documents containing one or more links to www.google.com.
inanchor – restricts results to sites containing links with the specified phrase in their descriptions. Sample query: inanchor:fire will return documents with links whose description contains the word fire (that's the actual link text, not the URL indicated by the link).
allintext – restricts results to documents containing the specified phrase in the text, but not in the title, link descriptions or URLs. Sample query: allintext:"fire fox" will return documents which contain the phrase fire fox in their text only.
+ – specifies that a phrase should occur frequently in results. Sample query: +fire will order results by the number of occurrences of the word fire.
- – specifies that a phrase must not occur in results. Sample query: -fire will return documents that don't contain the word fire.
"" – delimiters for entire search phrases (not single words). Sample query: "fire fox" will return documents containing the phrase fire fox.
. – wildcard for a single character. Sample query: fire.fox will return documents containing the phrases fire fox, fireAfox, fire1fox, fire-fox etc.
* – wildcard for a single word. Sample query: fire * fox will return documents containing the phrases fire the fox, fire in fox, fire or fox etc.
| – logical OR. Sample query: "fire fox" | firefox will return documents containing the phrase fire fox or the word firefox.
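The operators in Table 1 compose freely, which is what makes them easy to script. As a minimal sketch (not from the original article), a small Python helper – the function name and structure are ours – that builds such query strings, quoting multi-word phrases the way the "" operator requires:

```python
def build_query(terms=(), **operators):
    """Compose a Google query string from plain terms and operator=value pairs.

    Phrases containing spaces are wrapped in quotes, mirroring the ""
    operator from Table 1.
    """
    parts = []
    for op, value in operators.items():
        value = f'"{value}"' if " " in value else value
        parts.append(f"{op}:{value}")
    for term in terms:
        parts.append(f'"{term}"' if " " in term else term)
    return " ".join(parts)

# Recreate two queries from the text:
print(build_query(site="google.com", terms=["fox"]))
print(build_query(intitle="index.of", terms=["Microsoft-IIS/5.0 Server at"]))
```

The same helper could be fed rows from a dork list to generate many queries in a loop.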
Figure 1. The use of search query operators illustrated using the hakin9 website
Figure 2. Locating IIS 5.0 servers using the intitle operator
The right query can yield some quite remarkable results. Let's start with something simple.

Suppose that a vulnerability is discovered in a popular application – let's say it's the Microsoft IIS server version 5.0 – and a hypothetical attacker decides to find a few computers running this software in order to attack them. He could of course use a scanner of some description, but he prefers Google, so he just enters the query "Microsoft-IIS/5.0 Server at" intitle:index.of and obtains links to the servers he needs (or, more specifically, links to autogenerated directory listings for those servers). This works because in its standard configuration, IIS (just like many other server applications) adds banners containing its name and version to some dynamically generated pages (Figure 2 shows this query in action).

It's a typical example of information which seems quite harmless, so is frequently ignored and remains in the standard configuration. Unfortunately, it is also information which in certain circumstances can be most valuable to a potential attacker. Table 2 shows more sample Google queries for typical Web servers.
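Since these banners are exactly what such queries latch onto, administrators can check their own hosts for leaked version strings. A minimal sketch using only Python's standard library (our illustration, not part of the article); the URL in the fetch helper is a placeholder and should only ever point at a server you own:

```python
import re
import urllib.request

def leaks_version(server_header):
    """Return True if a Server header reveals product *and* version.

    Banners like 'Microsoft-IIS/5.0' or 'Apache/1.3.28' are precisely
    what queries such as "Microsoft-IIS/5.0 Server at" search for.
    """
    if not server_header:
        return False  # no banner at all -> nothing for a dork to match
    return re.search(r"[A-Za-z-]+/\d+(\.\d+)*", server_header) is not None

def fetch_server_banner(url):
    """Fetch the Server header from one of our own hosts (hypothetical URL)."""
    with urllib.request.urlopen(url) as resp:  # run only against servers you own
        return resp.headers.get("Server")

if __name__ == "__main__":
    for banner in ("Microsoft-IIS/5.0", "Apache/1.3.28 (Unix)", "Apache", None):
        print(banner, "->", "leaks version" if leaks_version(banner) else "ok")
```

A banner like plain "Apache" still names the product but no longer pins an exploitable version, which is why the article recommends stripping version details.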
Another way of locating specific versions of Web servers is to search for the standard pages displayed after successful server installation. Strange though it may seem, there are plenty of Web servers out there, the default configuration of which hasn't been touched since installation. They are frequently forgotten, ill-secured machines which are easy prey for attackers. They can be located using the queries shown in Table 3.

This method is both very simple and extremely useful, as it provides access to a huge number of various websites and operating systems which run applications with known vulnerabilities that lazy or ignorant administrators have not patched. We will see how this works for two fairly popular programs: WebJeff Filemanager and Advanced Guestbook.

The first is a web-based file manager for uploading, browsing, managing and modifying files on a server. Unfortunately, WebJeff Filemanager version 1.6 contains a bug which makes it possible to download any file on the server, as long as it's accessible to the user running the HTTP daemon. In other words, specifying a page such as /index.php3?action=telecharger&fichier=/etc/passwd in a vulnerable system will let any intruder download the /etc/passwd file (see Figure 3). The aggressor will of course locate vulnerable installations by querying Google for "WebJeff-Filemanager 1.6" Login.

Our other target – Advanced Guestbook – is a PHP application with SQL database support, used for adding guestbooks to websites. In April 2004, information was published about a vulnerability in the application's 2.2 version, making it possible to access the administration panel using an SQL injection attack (see SQL Injection Attacks with PHP/MySQL in hakin9 3/2005). It's enough to navigate to the panel login screen (see Figure 4) and log in leaving the username blank and entering ') OR ('a' = 'a as the password, or the other way around – leaving the password blank and entering ? or 1=1 -- as the username. The potential aggressor can locate vulnerable websites by querying Google for intitle:Guestbook "Advanced Guestbook 2.2 Powered" or "Advanced Guestbook 2.2" Username inurl:admin.

To prevent such security leaks, administrators should track current information on all the applications used by their systems and immediately patch any vulnerabilities. Another thing to bear in mind is that it's well worth removing application banners, names and versions from any pages or files that might contain them.

Table 2. Google queries for locating various Web servers

"Apache/1.3.28 Server at" intitle:index.of – Apache 1.3.28
"Apache/2.0 Server at" intitle:index.of – Apache 2.0
"Apache/* Server at" intitle:index.of – any version of Apache
"Microsoft-IIS/4.0 Server at" intitle:index.of – Microsoft Internet Information Services 4.0
"Microsoft-IIS/5.0 Server at" intitle:index.of – Microsoft Internet Information Services 5.0
"Microsoft-IIS/6.0 Server at" intitle:index.of – Microsoft Internet Information Services 6.0
"Microsoft-IIS/* Server at" intitle:index.of – any version of Microsoft Internet Information Services
"Oracle HTTP Server/* Server at" intitle:index.of – any version of Oracle HTTP Server
"IBM _ HTTP _ Server/* * Server at" intitle:index.of – any version of IBM HTTP Server
"Netscape/* Server at" intitle:index.of – any version of Netscape Server
"Red Hat Secure/*" intitle:index.of – any version of the Red Hat Secure server
"HP Apache-based Web Server/*" intitle:index.of – any version of the HP server

Table 3. Queries for discovering standard post-installation Web server pages

intitle:"Test Page for Apache Installation" "You are free" – Apache 1.2.6
intitle:"Test Page for Apache Installation" "It worked!" "this Web site!" – Apache 1.3.0 – 1.3.9
intitle:"Test Page for Apache Installation" "Seeing this instead" – Apache 1.3.11 – 1.3.33, 2.0
intitle:"Test Page for the SSL/TLS-aware Apache Installation" "Hey, it worked!" – Apache SSL/TLS
intitle:"Test Page for the Apache Web Server on Red Hat Linux" – Apache on Red Hat
intitle:"Test Page for the Apache Http Server on Fedora Core" – Apache on Fedora
intitle:"Welcome to Your New Home Page!" Debian – Apache on Debian
intitle:"Welcome to IIS 4.0!" – IIS 4.0
intitle:"Welcome to Windows 2000 Internet Services" – IIS 5.0
intitle:"Welcome to Windows XP Server Internet Services" – IIS 6.0

Figure 3. A vulnerable version of WebJeff Filemanager
Figure 4. Advanced Guestbook login page

Information about Networks and Systems

Practically all attacks on IT systems require preparatory target reconnaissance, usually involving scanning computers in an attempt to recognise running services, operating systems and specific service software. Network scanners such as Nmap or amap are typically used for this purpose, but another possibility also exists. Many system administrators install Web-based applications which generate system load statistics, show disk space usage or even display system logs.

All this can be valuable information to an intruder. Simply querying Google for statistics generated and signed by the phpSystem application using the query "Generated by phpSystem" will result in a whole list of pages similar to the one shown in Figure 5. The intruder can also query for pages generated by the Sysinfo script using intitle:"Sysinfo * " intext:"Generated by Sysinfo * written by The Gamblers." – these pages contain much more system information (Figure 6).

This method offers numerous possibilities – Table 4 shows sample queries for finding statistics and other information generated by several popular applications. Obtaining such information may encourage the intruder to attack a given system and will help him find the right tools and exploits for the job. So if you decide to use Web applications to monitor computer resources, make sure access to them is password-protected.
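Returning to the Advanced Guestbook example: the ') OR ('a' = 'a trick works because the application pastes user input straight into its SQL text. A minimal reconstruction of the idea (ours, not the Guestbook source) using Python's built-in sqlite3 module, alongside the parameterised query that defeats it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('admin', 'secret')")

def login_naive(name, password):
    # Vulnerable: user input is concatenated into the SQL string itself
    sql = (f"SELECT count(*) FROM users "
           f"WHERE (name = '{name}') AND (password = '{password}')")
    return conn.execute(sql).fetchone()[0] > 0

def login_safe(name, password):
    # Parameterised: input is bound as data and can never alter the query
    sql = "SELECT count(*) FROM users WHERE (name = ?) AND (password = ?)"
    return conn.execute(sql, (name, password)).fetchone()[0] > 0

# Blank username plus the injected password turns the WHERE clause into
# (name='') AND (password='') OR ('a' = 'a') -- always true:
print(login_naive("", "') OR ('a' = 'a"))  # True
print(login_safe("", "') OR ('a' = 'a"))   # False
```

The safe version is what the article's later advice (patching and secure coding) boils down to in practice: keep query structure and user data separate.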
Looking for Errors

HTTP error messages can be extremely valuable to an attacker, as they can provide a wealth of information about the system, database structure and configuration. For example, finding errors generated by an Informix database merely requires querying for "A syntax error has occurred" filetype:ihtml. The result will provide the intruder with error messages containing information on database configuration, a system's file structure and sometimes even passwords (see Figure 7). The results can be narrowed down to only those containing passwords by altering the query slightly: "A syntax error has occurred" filetype:ihtml intext:LOGIN.

Figure 5. Statistics generated by phpSystem
Equally useful information can be obtained from MySQL database errors simply by querying Google for "Access denied for user" "Using password" – Figure 8 shows a typical website located in this manner. Table 5 contains more sample queries using the same method.

The only way of preventing our systems from publicly revealing error information is removing all bugs as soon as we can and (if possible) configuring applications to log any errors to files instead of displaying them for the users to see.

Remember that even if you react quickly (and thus make the error pages indicated by Google out-of-date), a potential intruder will still be able to examine the version of the page cached by Google by simply clicking the link to the page copy. Fortunately, the sheer volume of Web resources means that pages can only be cached for a relatively short time.

Figure 6. Statistics generated by Sysinfo

Table 4. Querying for application-generated system reports

"Generated by phpSystem" – operating system type and version, hardware configuration, logged users, open connections, free memory and disk space, mount points
"This summary was generated by wwwstat" – web server statistics, system file structure
"These statistics were produced by getstats" – web server statistics, system file structure
"This report was generated by WebLog" – web server statistics, system file structure
intext:"Tobias Oetiker" "traffic analysis" – system performance statistics as MRTG charts, network configuration
intitle:"Apache::Status" (inurl:server-status | inurl:status.html | inurl:apache.html) – server version, operating system type, child process list, current connections
intitle:"ASP Stats Generator *.*" "ASP Stats Generator" "2003-2004 weppos" – web server activity, lots of visitor information
intitle:"Multimon UPS status page" – UPS device performance statistics
intitle:"statistics of" "advanced web statistics" – web server statistics, visitor information
intitle:"System Statistics" +"System and Network Information Center" – system performance statistics as MRTG charts, hardware configuration, running services
intitle:"Usage Statistics for" "Generated by Webalizer" – web server statistics, visitor information, system file structure
intitle:"Web Server Statistics for ****" – web server statistics, visitor information
inurl:"/axs/ax-admin.pl" -script – web server statistics, visitor information
inurl:"/cricket/grapher.cgi" – MRTG charts of network interface performance
inurl:server-info "Apache Server Information" – web server version and configuration, operating system type, system file structure
"Output produced by SysWatch *" – operating system type and version, logged users, free memory and disk space, mount points, running processes, system logs
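The fix suggested above – logging errors to a file rather than echoing them to visitors – can be sketched with Python's standard logging module (our illustration, tied to no particular web framework; the file name and handler function are hypothetical):

```python
import logging

# Route application errors to a server-side log file; visitors then see
# only a generic message, never database details or file paths.
logging.basicConfig(filename="app-errors.log", level=logging.ERROR,
                    format="%(asctime)s %(levelname)s %(message)s")

def handle_request(param):
    try:
        return 100 / int(param)               # stand-in for real request handling
    except Exception:
        logging.exception("request failed")   # full traceback goes to the file
        return "An internal error occurred."  # generic page for the user

print(handle_request("0"))
```

The search queries in Table 5 find exactly the sites that skip this step and render raw tracebacks into indexable HTML.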
Prowling for Passwords

Web pages contain a great many passwords to all manner of resources – e-mail accounts, FTP servers or even shell accounts. This is mostly due to the ignorance of users who unwittingly store their passwords in publicly accessible locations, but also due to the carelessness of software manufacturers who either provide insufficient measures of protecting user data or supply no information about the necessity of modifying their products' standard configuration.

Take the example of WS_FTP, a well-known and widely-used FTP client which (like many utilities) offers the option of storing account passwords. WS_FTP stores its configuration and user account information in the WS_FTP.ini file. Unfortunately, not everyone realises that gaining access to an FTP client's configuration is synonymous with gaining access to a user's FTP resources. Passwords stored in the WS_FTP.ini file are encrypted, but this provides little protection – once an intruder obtains the configuration file, he can either decipher the password using suitable tools or simply install WS_FTP and run it with the stolen configuration. And how can the intruder obtain thousands of WS_FTP configuration files? Using Google, of course. Simply querying for "Index of/" "Parent Directory" "WS_FTP.ini" or filetype:ini WS_FTP PWD will return lots of links to the data he requires, placed at his evil disposal by the users themselves in their blissful ignorance (see Figure 9).

Figure 7. Querying for Informix database errors
Figure 8. MySQL database error

Table 5. Error message queries

"A syntax error has occurred" filetype:ihtml – Informix database errors, potentially containing function names, filenames, file structure information, pieces of SQL code and passwords
"Access denied for user" "Using password" – authorisation errors, potentially containing user names, function names, file structure information and pieces of SQL code
"The script whose uid is " "is not allowed to access" – access-related PHP errors, potentially containing filenames, function names and file structure information
"ORA-00921: unexpected end of SQL command" – Oracle database errors, potentially containing filenames, function names and file structure information
"error found handling the request" cocoon filetype:xml – Cocoon errors, potentially containing Cocoon version information, filenames, function names and file structure information
"Invision Power Board Database Error" – Invision Power Board bulletin board errors, potentially containing function names, filenames, file structure information and pieces of SQL code
"Warning: mysql_query()" "invalid query" – MySQL database errors, potentially containing user names, function names, filenames and file structure information
"Error Message : Error loading required libraries." – CGI script errors, potentially containing information about operating system and program versions, user names, filenames and file structure information
"#mysql dump" filetype:sql – MySQL database errors, potentially containing information about database structure and contents
Another example is a Web application called DUclassified, used for managing website advertising materials. In its standard configuration, the application stores all the user names, passwords and other data in the duclassified.mdb file, located in the read-accessible _private subdirectory. It is therefore enough to find a site that uses DUclassified, take the base URL http://<host>/duClassified/ and change it to http://<host>/duClassified/_private/duclassified.mdb to obtain the password file and thus obtain unlimited access to the application (as seen in Figure 10). Websites which use the vulnerable application can be located by querying Google for "Powered by DUclassified" -site:duware.com (the additional operator will filter out results from the manufacturer's website). Interestingly enough, the makers of DUclassified – a company called DUware – have also created several other applications with similar vulnerabilities.
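An administrator who runs such an application can turn this around and probe their own site for the known leaky paths before anyone else does. A small sketch (ours, not from the article); the path list is illustrative and the probe must only ever be aimed at hosts you own:

```python
import urllib.parse
import urllib.request

# Default-configuration files the article mentions (DUclassified, Web Wiz Journal)
KNOWN_LEAK_PATHS = [
    "duClassified/_private/duclassified.mdb",
    "journal/journal.mdb",
]

def leak_urls(base):
    """Build the list of URLs to probe for a given site."""
    if not base.endswith("/"):
        base += "/"
    return [urllib.parse.urljoin(base, path) for path in KNOWN_LEAK_PATHS]

def is_exposed(url):
    """Return True if the file is publicly retrievable (check your own site only)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    for url in leak_urls("http://example.com"):  # hypothetical host
        print(url)
```

Any hit means the data file should be moved outside the document root or blocked at the server configuration level.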
In theory, everyone knows that passwords should not reside on post-its stuck to the monitor or under the keyboard. In practice, however, surprisingly many people store passwords in text files and put them in their home directories, which (funnily enough) are accessible through the Internet. What's more, many such individuals work as network administrators or similar, so the files can get pretty big. It's hard to define a single method of locating such data, but googling for such keywords as account, users, admin, administrators, passwd, password and so on can be pretty effective, especially coupled with such filetypes as .xls, .txt, .doc, .mdb and .pdf. It's also worth noting directories whose names contain the words admin, backup and so forth – a query like inurl:admin intitle:index.of will do the trick.

Figure 9. WS_FTP configuration file
Figure 10. DUclassified in its standard configuration
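The same keyword-and-filetype logic can be applied defensively: sweep your own document root for files matching these risky name patterns before Google's crawler finds them. A minimal sketch (not from the article); the pattern list and directory are illustrative:

```python
import fnmatch
import os

# Name patterns the article's password queries revolve around (illustrative)
RISKY_PATTERNS = ["WS_FTP.ini", "*.mdb", "pwd.db", "*passwd*", "*password*", ".htaccess"]

def find_risky_files(webroot):
    """Walk a local webroot and report files an index-of query could expose."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(webroot):
        for name in filenames:
            if any(fnmatch.fnmatch(name.lower(), p.lower()) for p in RISKY_PATTERNS):
                hits.append(os.path.join(dirpath, name))
    return sorted(hits)

if __name__ == "__main__":
    for path in find_risky_files("/var/www"):  # hypothetical document root
        print("review:", path)
```

Every hit should either be removed, moved outside the webroot, or placed behind authentication.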
Table 6. Google queries for locating passwords

"http://*:*@www" site – passwords for site, stored as the string "http://username:password@www..."
filetype:bak inurl:"htaccess|passwd|shadow|htusers" – file backups, potentially containing user names and passwords
filetype:mdb inurl:"account|users|admin|administrators|passwd|password" – mdb files, potentially containing password information
intitle:"Index of" pwd.db – pwd.db files, potentially containing user names and encrypted passwords
inurl:admin inurl:backup intitle:index.of – directories whose names contain the words admin and backup
"Index of/" "Parent Directory" "WS_FTP.ini" or filetype:ini WS_FTP PWD – WS_FTP configuration files, potentially containing FTP server access passwords
ext:pwd inurl:(service|authors|administrators|users) "# -FrontPage-" – files containing Microsoft FrontPage passwords
filetype:sql ("passwd values ****" | "password values ****" | "pass values ****") – files containing SQL code and passwords inserted into a database
intitle:index.of trillian.ini – configuration files for the Trillian IM
eggdrop filetype:user user – configuration files for the Eggdrop ircbot
filetype:conf slapd.conf – configuration files for OpenLDAP
inurl:"wvdial.conf" intext:"password" – configuration files for WV Dial
ext:ini eudora.ini – configuration files for the Eudora mail client
filetype:mdb inurl:users.mdb – Microsoft Access files, potentially containing user account information
intext:"powered by Web Wiz Journal" – websites using Web Wiz Journal, which in its standard configuration allows access to the passwords file – just enter http://<host>/journal/journal.mdb instead of the default http://<host>/journal/
"Powered by DUclassified" -site:duware.com (likewise for DUcalendar, DUdirectory, DUclassmate, DUdownload, DUpaypal and DUforum) and intitle:dupics inurl:(add.asp | default.asp | view.asp | voting.asp) -site:duware.com – websites using the DUclassified, DUcalendar, DUdirectory, DUclassmate, DUdownload, DUpaypal, DUforum or DUpics applications, which by default make it possible to obtain the passwords file – for DUclassified, just enter http://<host>/duClassified/_private/duclassified.mdb instead of http://<host>/duClassified/
intext:"BiTBOARD v2.0" "BiTSHiFTERS Bulletin Board" – websites using the Bitboard2 bulletin board application, which on default settings allows the passwords file to be obtained – enter http://<host>/forum/admin/data_passwd.dat instead of the default http://<host>/forum/forum.php
Table 6 presents some sample
queries for password-related data.
To make our passwords less
accessible to intruders, we must
carefully consider where and why
we enter them, how they are stored
and what happens to them. If we're in
charge of a website, we should analyse the configuration of the applications we use, locate poorly protected
or particularly sensitive data and
take appropriate steps to secure it.
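A site administrator can apply the same patterns defensively, before Google indexes anything. The following sketch walks a web root and flags filenames matching patterns drawn from Table 6; the pattern list is a small illustrative excerpt and the function name is my own:

```python
import os
import re

# Filename patterns taken from Table 6 that commonly leak credentials
# (an illustrative excerpt, not an exhaustive list).
RISKY = re.compile(
    r"(\.bak$|\.mdb$|^pwd\.db$|^WS_FTP\.ini$|^trillian\.ini$|"
    r"^slapd\.conf$|^wvdial\.conf$|^eudora\.ini$|passwd|htaccess)",
    re.IGNORECASE,
)

def audit(web_root):
    """Return paths under web_root whose filenames match a risky pattern."""
    hits = []
    for dirpath, _dirs, files in os.walk(web_root):
        for name in files:
            if RISKY.search(name):
                hits.append(os.path.join(dirpath, name))
    return hits
```

Running such a script against the document root before publishing is one concrete way to "locate poorly protected or particularly sensitive data" as the text recommends.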
Personal Information and Confidential Documents
Both in European countries and the
U.S., legal regulations are in place
to protect our privacy. Unfortunately,
it is frequently the case that all sorts
of confidential documents containing our personal information are
placed in publicly accessible locations or transmitted over the Web
without proper protection. To get our
complete information, an intruder
need only gain access to an e-mail
repository containing the CV we
sent out while looking for work. Address, phone number, date of birth, education, skills, work experience – it's all there.
Thousands of such documents can be found on the Internet – just query Google for intitle:"curriculum vitae" "phone * * *" "address *" "e-mail".

Figure 11. Electronic address book obtained through Google
Figure 12. Confidential document found through Google
Figure 13. An HP printer's configuration page found by Google

Finding contact information in the form of names, phone numbers and e-mail addresses is equally easy
(Figure 11). This is because most
Internet users create electronic address books of some description.
While these may be of little interest
to your typical intruder, they can
be dangerous tools in the hands of
a skilled sociotechnician, especially
if the contacts are restricted to one
company. A simple query such as
filetype:xls inurl:"email.xls" can
be surprisingly effective, finding Excel spreadsheets called email.xls.
All the above also applies to
instant messaging applications and
their contact lists – if an intruder
obtains such a list, he may be able to
pose as our IM friends. Interestingly
enough, a fair amount of personal
data can also be obtained from official documents, such as police
reports, legal documents or even
medical history cards.
The Web also contains documents that have been marked as
confidential and therefore contain
sensitive information. These may
include project plans, technical documentation, surveys, reports, presentations and a whole host of other
company-internal materials. They
are easily located as they frequently
contain the word confidential, the
phrase Not for distribution or similar clauses (see Figure 12). Table 7
presents several sample queries
that reveal documents potentially
containing personal information and
confidential data.
As with passwords, all we can do to avoid revealing private information is to be cautious and retain maximum control over published data. Companies and organisations should (and many are obliged to) specify and enforce rules, procedures and standard practices for handling documents within the organisation, complete with clearly defined responsibilities and penalties for infringements.

Table 7. Searching for personal data and confidential documents

filetype:xls inurl:"email.xls"
    email.xls files, potentially containing contact information

"phone * * *" "address *" "e-mail" intitle:"curriculum vitae"
    CVs

"not for distribution" confidential
    documents containing the confidential clause

buddylist.blt
    AIM contacts lists

intitle:index.of mystuff.xml
    Trillian IM contacts lists

filetype:ctt "msn"
    MSN contacts lists

filetype:QDF QDF
    database files for the Quicken financial application

intitle:index.of finances.xls
    finances.xls files, potentially containing information on bank accounts, financial summaries and credit card numbers

intitle:"Index Of" -inurl:maillog maillog size
    maillog files, potentially containing e-mail

"Network Vulnerability Assessment Report"
"Host Vulnerability Summary Report"
filetype:pdf "Assessment Report"
"This file was generated by Nessus"
    reports for network security scans, penetration tests etc.

Table 8. Queries for locating network devices

"Copyright (c) Tektronix, Inc." "printer status"
    PhaserLink printers

inurl:"printer/main.html" intext:"settings"
    Brother HL printers

intitle:"Dell Laser Printer" ews
    Dell printers with EWS technology

intext:centreware inurl:status
    Xerox Phaser 4500/6250/8200/8400 printers

inurl:hp/device/this.LCDispatcher
    HP printers

intitle:liveapplet inurl:LvAppl
    Canon Webview webcams

intitle:"EvoCam" inurl:"webcam.html"
    Evocam webcams

inurl:"ViewerFrame?Mode="
    Panasonic Network Camera webcams

(intext:"MOBOTIX M1" | intext:"MOBOTIX M10") intext:"Open Menu" Shift-Reload
    Mobotix webcams

SNC-RZ30 HOME
    Sony SNC-RZ30 webcams

intitle:"my webcamXP server!" inurl:":8080"
    webcams accessible via WebcamXP Server

allintitle:Brains, Corp. camera
    webcams accessible via mmEye

intitle:"active webcam page"
    USB webcams

inurl:indexFrame.shtml Axis
    Axis webcams
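Any of the queries in Tables 6–8 can be turned into a ready-to-open Google search URL, which is the first step toward automating these checks. A minimal sketch using only the standard library (the helper name is my own):

```python
from urllib.parse import quote_plus

def dork_url(query):
    """Encode a Google query string into a search URL."""
    return "http://www.google.com/search?q=" + quote_plus(query)

print(dork_url('filetype:xls inurl:"email.xls"'))
# http://www.google.com/search?q=filetype%3Axls+inurl%3A%22email.xls%22
```

A batch of such URLs can then be reviewed by hand or fetched by a script that checks whether any hit points at one's own domain.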
Network Devices
Many administrators downplay
the importance of securing such
devices as network printers or
webcams. However, an insecure
printer can provide an intruder with
a foothold that can later be used as
a basis for attacking other systems
in the same network or even other
networks. Webcams are, of course,
much less dangerous, so hacking
them can only be seen as entertainment, although it's not hard to imagine situations where data from a
webcam could be useful (industrial
espionage, robberies etc.). Table 8
contains sample queries revealing
printers and webcams, while Figure 13 shows a printer configuration page found on the Web.
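The device queries in Table 8 work because each device embeds a recognisable signature string in its web pages. The same idea can be used locally: given a page already fetched from a host on one's own network, match it against known signatures. The signature table below is a small illustrative excerpt from Table 8; the function name is my own:

```python
# Map text signatures (drawn from Table 8) to the device they identify.
SIGNATURES = {
    "printer status": "PhaserLink printer",
    "centreware": "Xerox Phaser printer",
    "liveapplet": "Canon Webview webcam",
    "my webcamXP server!": "WebcamXP server",
}

def identify(page_text):
    """Return device names whose signature appears in the page text."""
    lowered = page_text.lower()
    return [dev for sig, dev in SIGNATURES.items() if sig.lower() in lowered]

print(identify("<title>My webcamXP server!</title>"))  # ['WebcamXP server']
```

Sweeping a local address range with such a matcher reveals exposed printers and webcams before Google's crawler does.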
On the Net
•  http://johnny.ihackstuff.com – the largest repository of data on Google hacking,
•  http://insecure.org/nmap/ – Nmap network scanner,
•  http://thc.org/thc-amap/ – amap network scanner.