Google Sitemap and robots.txt Setup Techniques


An overview of Google Sitemaps and the robots.txt file from an SEO perspective.

Published in: Technology


  1. Google Site Map & Robots.txt File. Nasir Uddin Shamim, Co-Founder, DevsTeam
  2. What the Hell Is a Sitemap?
     From the visitor's perspective: it gives user-friendly navigation to every corner of a site.
     From the spider-bot perspective: it makes crawling work better, so indexing is faster and more targeted.
  3. Why You Should Have A Sitemap
     - It notifies the search engine about each update to your site.
     - It helps the search engine find any of your content on the Web easily.
     - Your visitors can go anywhere on your site that they want.
     - You get a clear idea of where to improve and what you are missing.
  4. Types of Sitemaps: HTML Sitemap and XML Sitemap

     Code example of an HTML sitemap page:

        <html lang="en">
          <head><title>This is a site map</title></head>
          <body>
            <h1>Header of HTML site map</h1>
            <p>Site map paragraph with links</p>
          </body>
        </html>

     Code example of an XHTML sitemap page:

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
          "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
          <head><title>This is a site map</title></head>
          <body>
            <h1>Header of XHTML site map</h1>
            <p>Site map paragraph with links</p>
          </body>
        </html>
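The XML sitemap type named on this slide is a separate file from the HTML page above: it follows the sitemaps.org protocol. A minimal sketch, where the example.com URL and the date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Full URL of the page to be indexed -->
    <loc>http://www.example.com/</loc>
    <!-- Optional hints for the crawler -->
    <lastmod>2012-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints that crawlers may ignore.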
  5. Types of Sitemap
     Text Sitemap. Example of a text sitemap file: ome-directory/
     Special types:
       - Image Sitemap
       - Video Sitemap
       - News Sitemap
       - GEO Sitemap
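A text sitemap is simply a UTF-8 text file listing one absolute URL per line, nothing else. A minimal sketch, with placeholder example.com URLs:

```text
http://www.example.com/
http://www.example.com/about/
http://www.example.com/some-directory/page.html
```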
  6. Difference Between HTML & XML Sitemaps

     HTML Sitemaps:
       - HTML sitemaps are user sitemaps.
       - They enhance the look and feel of the website.
       - They communicate the overall theme of the website.
       - They organize each section contained in the website.
       - They make user navigation easier.
       - They make it easier for users to link within the website.
       - Usually used for large websites.

     XML Sitemaps:
       - XML sitemaps are used for SEO.
       - They make it easier for search engine "spiders" to "crawl" through a website.
       - They help search engines to index the content of a site.
       - They communicate any website changes to search engines.
       - External links are not relied on to index a site.
       - All of the pages of a website have an opportunity to be indexed.
       - They help search engine "spiders" index a site faster.
  7. What Is a robots.txt File?
     The robots.txt file tells the search engine spider, or crawler, which web pages or directories of your site should be indexed and which should be ignored.
  8. Parameters To Be Used
     User-agent: names the robot (crawler or spider) the rules apply to; * means all robots.
     Disallow: tells robots not to crawl the named directory or page (a directory path must begin with a /; without the leading /, the entry does not block the page).
     Allow: permits crawling of the named directory.
     #: starts a comment.
  9. A robots.txt File Example

        Sitemap:                      # sitemap URL goes here

        User-agent: *
        Disallow: /cgi-bin/
        Disallow: /go/
        Disallow: /wp-admin/
        Disallow: /wp-includes/

        User-agent: Googlebot-Image
        Allow: /wp-content/uploads/

        User-agent: Mediapartners-Google
        Allow: /
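You can check how a crawler would interpret rules like these with Python's standard-library robots.txt parser. A minimal sketch; the "MyBot" user agent and the example.com URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A cut-down version of the rules from the slide above.
rules = """\
User-agent: *
Disallow: /wp-admin/

User-agent: Googlebot-Image
Allow: /wp-content/uploads/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The generic (*) group blocks /wp-admin/ for an arbitrary bot:
print(parser.can_fetch("MyBot", "http://example.com/wp-admin/"))  # False
# Paths with no matching Disallow rule are allowed by default:
print(parser.can_fetch("MyBot", "http://example.com/blog/post"))  # True
```

In production you would point `parser.set_url(...)` at the live robots.txt and call `parser.read()` instead of parsing an inline string.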
  10. The End! Thank you all! Shamim, DevsTeam