Google hacking, also known as Google dorking, is a hacking technique that uses Google Search and other Google applications to find security holes in the configuration and computer code that websites use.
Google hacking involves using advanced operators in the Google search engine to locate specific strings of text within search results. Popular examples include finding specific versions of vulnerable web applications. A search query such as
intitle:admbook intitle:Fversion filetype:php would locate all web pages that have that particular text contained within them. It is normal for default installations of applications to include their running version in every page they serve, for example, "Powered by XOOPS 2.2.3 Final".
Devices connected to the Internet can be found. A search string such as
inurl:"ViewerFrame?Mode=" will find public web cameras.
Another useful search uses
intitle:index.of followed by a search keyword. This can return a directory listing of files on a server. For example,
intitle:index.of mp3 will give all the MP3 files available on various types of servers.
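The operator-based queries above are just strings, so they can be assembled programmatically. The following is a minimal sketch; the build_dork helper is a hypothetical name introduced here for illustration, not part of any existing tool.

```python
# Hypothetical helper that assembles a Google "dork" query string from
# advanced-search operator terms such as intitle:, inurl:, and filetype:.
def build_dork(*operators: str, keyword: str = "") -> str:
    """Join operator terms and an optional plain search keyword with spaces."""
    query = " ".join(operators)
    if keyword:
        query = f"{query} {keyword}"
    return query

# Reproduces the example queries discussed above.
print(build_dork("intitle:admbook", "intitle:Fversion", "filetype:php"))
print(build_dork("intitle:index.of", keyword="mp3"))
```

Submitting such queries to Google itself is subject to Google's terms of service and rate limiting; the sketch only shows how the query strings are composed.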
Symbols and operators, such as quotation marks for exact phrases or a minus sign to exclude terms, can make search results more precise.
See also: Johnny Long § Google hacking
The concept of "Google hacking" dates back to 2002, when Johnny Long began to collect Google search queries that uncovered vulnerable systems or sensitive information disclosures, labeling them googleDorks.
The list of Google Dorks grew into a large dictionary of queries, which were eventually organized into the original Google Hacking Database (GHDB) in 2004.
Concepts explored in Google hacking have been extended to other search engines, such as Bing and Shodan. Automated attack tools use custom search dictionaries to find vulnerable systems and sensitive information disclosures in public systems that have been indexed by search engines.
The robots.txt file, well known from search engine optimization, offers partial protection against Google dorking. A site can use it to disallow crawling of the entire site or of specific endpoints, which prevents Google's crawlers from indexing sensitive endpoints such as admin panels. However, attackers can still read robots.txt itself to discover those endpoints.
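A minimal robots.txt sketch illustrating this trade-off; the /admin/ and /backup/ paths are hypothetical examples. The listed paths are hidden from compliant crawlers, yet anyone who fetches /robots.txt can read them directly.

```
# Applies to all crawlers, including Googlebot
User-agent: *
# Keep these endpoints out of search indexes --
# note that the file itself reveals they exist
Disallow: /admin/
Disallow: /backup/
```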