SEO

Here you will find good examples of what Google collects in order to assess whether a website has the quality to appear higher in search results.
Googlebots use multiple sources and techniques to gather information about our websites, for example:
1. Crawling a page
2. Considering what people are searching for
3. The free Google toolbar, the Android browser, and Chrome
Through crawling of a web page and content analysis, Googlebot collects and analyses the following information:
Layout of the page: What is the quality of the web design of a particular page compared with that of pages rated positive or negative by Google evaluators*?
Content/Advertising: What is the ratio of content to advertising?
Relevance: To what extent is the content of the webpage related to the search term, and to what the searcher really meant?
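To make the content-to-advertising idea concrete, here is a minimal sketch that computes such a ratio from two counts. The counting rules (words of main content per ad block) are an assumption for illustration, not Google's actual method.

```python
# Hypothetical sketch: estimate a content-to-advertising ratio for a page.
# Counting main-content words per ad block is an illustrative choice,
# not Google's real metric.

def content_ad_ratio(content_words: int, ad_blocks: int) -> float:
    """Return words of main content per advertising block (inf if no ads)."""
    if ad_blocks == 0:
        return float("inf")
    return content_words / ad_blocks

# A 1,200-word article carrying 4 ad blocks:
print(content_ad_ratio(1200, 4))  # 300.0 words of content per ad
```

Whatever the exact formula, the intuition is the same: the more a page is advertising rather than content, the lower this kind of ratio falls.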
Through analysis of the behavior of visitors who have arrived at a particular site from search results, Google considers:
Repeat visits: How frequently do searchers return to your results, or click on another page after they visit your site?
Hits: What percentage of people who click on your page link then click on other search results?
Social signals: Do users mark your page as spam, or do they rather give it a +1?
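The "Hits" percentage above is simple arithmetic, sketched below. The function name and inputs are assumptions made for this illustration; it is not a published Google formula.

```python
# Hypothetical sketch: the percentage of visitors who click your search
# result, then go back and click a different result ("pogo-sticking").

def pogo_stick_rate(clicks_on_your_result: int,
                    returned_and_clicked_elsewhere: int) -> float:
    """Percentage of your visitors who went back and chose another result."""
    if clicks_on_your_result == 0:
        return 0.0
    return 100.0 * returned_and_clicked_elsewhere / clicks_on_your_result

print(pogo_stick_rate(200, 50))  # 25.0 -> a quarter clicked another result
```

A high rate of this kind would suggest the page did not answer what the searcher meant.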
Through the data gathered from the Google toolbar, Android, and Chrome, search engines consider:
Time spent online: How long does the average visitor spend on your webpage?
Number of visited pages: Do visitors look at other pages after they arrive on your site?
Comparison: How does the data on visitor behavior relate to the data provided by the evaluators*?
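The first two toolbar-style metrics are plain averages over visits, as in this sketch. The visit records and their values are invented sample data, used only to show the arithmetic.

```python
# Hypothetical sketch of the toolbar-style metrics above: average time on
# page and average pages viewed per visit. The visit data is invented.

visits = [
    {"seconds_on_page": 45,  "pages_viewed": 1},
    {"seconds_on_page": 180, "pages_viewed": 4},
    {"seconds_on_page": 75,  "pages_viewed": 2},
]

avg_time = sum(v["seconds_on_page"] for v in visits) / len(visits)
avg_pages = sum(v["pages_viewed"] for v in visits) / len(visits)

print(f"average time on page: {avg_time:.0f}s")     # 100s
print(f"average pages per visit: {avg_pages:.1f}")  # 2.3
```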
*Evaluators: Google users who evaluate the Web according to pre-set criteria. Google engineers feed these assessments into the search-engine algorithm. This approach was introduced with Google's Panda algorithm, through which Google emphasized the quality of web content.