Website Audit
Bala Abirami, SEM Dot Com Infoway
Agenda
- What is a Website Audit?
- Site Analysis
- SEO Audit
- Design and Usability Suggestions
- Navigation
- Questions
Website Audit
Why is a website audit done?
- To check whether the website is search engine friendly
- To check whether the website is user friendly
- To improve the site's online presence
- To evaluate the site from a conversion point of view
Site Analysis
- Website status in search engines
- Duplicate content check (Copyscape.com)
- IP address for the www and non-www versions
- Backlinks status
- DMOZ and Yahoo Directory listings
SEO Audit: Canonical Issues
WWW and non-WWW versions that can serve the same page:
- www.example.com
- example.com/
- www.example.com/index.html
- example.com/home.asp
Solution
- 301 redirect the non-www version of the website to the www version (a configuration sketch follows below).
- Remove hyperlinks pointing to http://www.example.com/index.html throughout the website and link to http://www.example.com/ instead.
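A minimal configuration sketch of that 301 redirect, assuming an Apache server with mod_rewrite enabled (the example.com domain is the placeholder from the slide above):

    # .htaccess in the web root (Apache with mod_rewrite assumed)
    RewriteEngine On
    # Send any request for the non-www host to the www version with a permanent (301) redirect
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
    # Collapse /index.html onto the root so only one home-page URL is exposed
    RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]

On servers other than Apache the same redirect is configured differently, but the goal is identical: a single canonical home-page URL.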
Content Optimization
Content is king: it is the key to search engine rankings. Visitors will spend time on a page only when its content is attractive and informative, and content quality is an important factor in search engine ranking algorithms.
Content Optimization
Why content optimization matters:
- Content should be unique and well written.
- Informative content naturally attracts more links.
- Search engines give more weight to fresh content.
Content Optimization
Include your targeted search terms and check:
- Keyword density
- Keyword proximity
- Keyword frequency
- Keyword prominence
- Relationship of body text content to keywords
- Heading tags
- Inline text links
Internal Duplication
Internal duplication is created when two different URLs serve the same content within the same website, for example:
- http://www.midas.com.au/
- http://www.midas.com.au/index.php?new_page_id=1
The canonical link element tells search engines which URL is the preferred version, e.g. <link rel="canonical" href="http://example.com/page.html"/> (see http://www.wikia.com/Wikia for a live example).
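A minimal sketch of where the canonical link element sits, assuming the plain root URL is the preferred version of the duplicated page:

    <!-- Served at http://www.midas.com.au/index.php?new_page_id=1 (the duplicate URL) -->
    <head>
      <title>Home</title>
      <!-- Tells search engines that the root URL is the preferred (canonical) version -->
      <link rel="canonical" href="http://www.midas.com.au/" />
    </head>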
Heading Tags
The <h1> to <h6> tags define headings: <h1> is the largest and <h6> the smallest. After the title tag, heading tags are the next most important place for your keywords. <h1> carries the greatest weight of all the heading tags and serves as the primary heading of the page, while <h2> acts as a sub-heading.
Heading Tags
If you need another heading, use the <h2> tag, and so on, but it is better not to use more than two heading levels on a web page. If you use heading tags irresponsibly, you run the risk of having your website penalized for spam.
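A short illustrative sketch of that structure, with one <h1> and a single <h2> sub-heading (the heading text is made up for illustration):

    <body>
      <!-- One primary heading per page, carrying the main keyword phrase -->
      <h1>Website Audit Services</h1>
      <p>Introductory copy about the service...</p>
      <!-- A single sub-heading level keeps the outline simple -->
      <h2>SEO Audit Checklist</h2>
      <p>Details of the checklist...</p>
    </body>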
Inline Text Links
Inline text links are links placed directly within the text of your content. They serve two purposes: first, they give the reader a quick and easy way to find the information you are referring to; second, they add weight to the linked phrase for the page on which the link appears and also pass weight to the target page.
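A one-line sketch of such a link, with keyword-bearing anchor text (the URL and wording are illustrative):

    <p>Before any redesign it is worth running a full
      <a href="http://www.example.com/services/website-audit/">website audit</a>
      to catch crawl and duplication issues early.</p>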
Don'ts
- Don't use content that is very similar to, or a duplicate of, existing content.
- Don't bury your keyword-rich content at the bottom of the page.
- Don't overdo things: don't go overboard with "h1" tags or bolded text.
SEO Suggestions
- Avoid dynamic URLs; use mod_rewrite to overcome this problem (see the sketch after this list).
- Add nofollow to outgoing links where appropriate.
- Broken links: check for broken and dead links using a link-checking tool.
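A minimal mod_rewrite sketch for the dynamic-URL point, assuming Apache and an illustrative product page whose real URL is a PHP script with a query string; the friendly URL is mapped internally onto the dynamic one:

    # .htaccess sketch (Apache with mod_rewrite assumed; the script and parameter names are illustrative)
    RewriteEngine On
    # Serve a clean URL like /product/adidas-shoes/ from the underlying dynamic script
    RewriteRule ^product/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]

For the nofollow suggestion, the outgoing link simply carries the attribute, e.g. <a href="http://www.example.com/" rel="nofollow">partner site</a>.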
URL Structure: Static URLs
- Search engine friendly
- Keywords can be used in the URL
- Easier for the end user to read and understand what the page is about
- Indexed more quickly by search engines
URL Structure: Dynamic URLs
- Not search engine friendly
- Keywords cannot be used in the URL
- Harder for the end user to read
- URLs with special characters are not indexed easily by search engines
Best URL Practices
- Describe your content.
- Keep it short.
- Use static URLs.
- Descriptive words are better than numbers (instead of 114/cat223/ you can use /brand/adidas/).
- Never use multiple subdomains (e.g., siteexplorer.search.yahoo.com).
Best URL Practices
- Use fewer folders: a deep path such as http://www.newyorkmetro.com/fashion/fashionshows/2007/spring/main/newyork/womenrunway/marcjacobs/ is best avoided.
- Stick with a single URL format throughout the site.
Robots.txt
Robots.txt is a text file you put on your site to tell search robots which pages you would like them not to visit.
When you need robots.txt:
- You have two versions of a page, e.g. a printer-friendly version.
- You want some pages kept out of search engine indexes, for example secured pages.
Robots.txt
Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
Where to place it: the file must be in the root (main) directory; otherwise user agents will not find it and will simply index the whole site.
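A minimal robots.txt sketch along those lines (the Disallow paths are illustrative, not taken from the slides):

    # robots.txt - must live at the site root, e.g. http://www.example.com/robots.txt
    User-agent: *
    # Keep auto-generated search result pages out of the crawl
    Disallow: /search/
    # Block printer-friendly duplicates and secured pages
    Disallow: /print/
    Disallow: /secure/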
Sitemap
A sitemap is a list of the pages on your website. Sitemaps are particularly helpful if:
- Your site has dynamic content.
- Your site has pages that aren't easily discovered by bots during the crawl process - for example, pages featuring rich AJAX or Flash.
Sitemap
- Your site is new and has few links to it. (Bots crawl the web by following links from one page to another, so if your site isn't well linked, it may be hard for them to discover it.)
- Your site has a large archive of content pages that are not well linked to each other, or are not linked at all.
Sitemap
There are two kinds of sitemap: an HTML (user) sitemap and an XML sitemap.
HTML sitemap:
- Allows easier indexing of your site by search engines.
- Helps with usability and site navigation for users.
Sitemap
XML sitemap: Google and other search engines adhere to Sitemap Protocol 0.9 as defined by sitemaps.org, and the site needs to include a sitemap.xml file.
Types of XML sitemap:
Sitemap
- Video sitemaps
- Mobile sitemaps
- News sitemaps
- Code Search sitemaps
- Geo sitemaps
Size limits: 50,000 URLs or 10 MB per sitemap. If a sitemap exceeds these limits, a sitemap index file can be created (referencing up to 1,000 sitemaps).
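A minimal sitemap.xml sketch following Sitemap Protocol 0.9 (the URLs, dates, and priorities are illustrative):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- One <url> entry per page; <loc> is required, the other tags are optional -->
        <loc>http://www.example.com/</loc>
        <lastmod>2011-01-15</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/services/</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>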
Source Code Optimization
- Keep your code simple and short.
- Place CSS and JavaScript in external files and reference them as needed; moving script code into an external file reduces it to just one line in the page.
- Avoid relying on fancy features such as JavaScript, cookies, session IDs, AJAX, frames, DHTML, or Flash for important content.
Source Code Optimization
- Consistently use the simplest URLs: link to "/" instead of "/index.php", or "/news/" instead of "/news/index.php".
- Run your code through a validator and keep it clean.
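A small sketch of the external-file pattern described above (the file names are illustrative):

    <head>
      <title>Example page</title>
      <!-- One line each instead of long inline style and script blocks -->
      <link rel="stylesheet" type="text/css" href="/css/site.css" />
      <script type="text/javascript" src="/js/site.js"></script>
    </head>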
Design and Usability
- Make a site with clear navigation and text links.
- Avoid more than two sub-directory levels.
- Use text instead of images to display important names, content, or links; the Google crawler doesn't recognize text contained in images.
- Make sure that your title attributes and alt attributes are descriptive and accurate; avoid empty image alt attributes (see the sketch after this list).
- Keep the links on a given page to a reasonable number (fewer than 100).
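A one-tag sketch of the alt-attribute point, contrasting an empty alt with a descriptive one (the file name and text are illustrative):

    <!-- Avoid: an empty alt carries no information for crawlers or screen readers -->
    <img src="/images/logo.png" alt="" />
    <!-- Better: descriptive, accurate alt and title text -->
    <img src="/images/logo.png" alt="SEM Dot Com Infoway logo" title="SEM Dot Com Infoway" />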
Design and Usability
Common crawlability problems to check for:
- Non-spiderable Flash menus
- Image and Flash content
- Non-spiderable JavaScript menus
Design and Usability
- Use Cascading Style Sheets (CSS) to implement a clean design throughout your website; this reduces the time needed to apply a consistent text and layout style.
- Reduce image size: too many images, or very large images, will slow down the loading time of your website. Slice large images into smaller pieces.
- Check the browser compatibility of the website.
Navigation
Important links can come from:
- Header menus
- Drop-down menus
- Footer links
- Links within content
- Side navigation
Questions
