On site optimization


Every new SEO needs a good guide to start with. OnSite SEO is incredibly important for the development of a website.

Published in: Technology, Design

  1. The Importance Of OnSite SEO
  2. Brought to you by Geoff Smith
  3. The SEO pyramid (diagram): quality content and keyword research at the base; on-site SEO above it (meta data, titles, URLs, H-tags, unique texts, internal structure, user-friendly sitemaps, URL structure, server response codes); social signals and link building at the top.
  4. Site architecture (diagram): the home page at the top, subpages beneath it, and sub-sub pages beneath those. PageRank passes from page to page only hierarchically, so cross-link related pages to enhance them.
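The hierarchy in the diagram can be sketched as a breadth-first traversal of the link graph: each page's "click depth" from the home page falls out directly. The page names below are hypothetical, chosen only to mirror the three-level structure on the slide.

```python
from collections import deque

def click_depths(links, home):
    """Breadth-first search from the home page: returns how many
    clicks each reachable page is from the root of the hierarchy."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:          # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical three-level site matching the slide's diagram.
site = {
    "/": ["/services", "/blog", "/about"],
    "/services": ["/services/seo", "/services/ppc"],
    "/blog": ["/blog/post-1"],
}

depths = click_depths(site, "/")
```

Pages more than two or three clicks deep are exactly the ones that benefit from the cross-linking the slide recommends.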
  5. OnSite Checklist
  6. OnSite Checklist
     ✓ Domain age - Older websites usually rank better (though not always). Check the expiry date, too.
     ✓ URL structure - Dynamic or static?
     ✓ Meta data - Titles, descriptions, keywords. All of these should be relevant to the page.
     ✓ Cross-linking - Be sure to link to other internal pages.
     ✓ 301 and 404 - A 301 is a permanent redirect, so make sure it works correctly. A custom 404 page is a big plus.
     ✓ Broken links - For obvious reasons, broken links are bad. Fix them.
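The 301/404 items above can be folded into a tiny response-code triage helper for crawl results. This is a sketch; the wording of the recommendations is my own, not from the slides.

```python
def seo_status_check(status):
    """Rough SEO triage for an HTTP status code found during a crawl
    (an illustrative sketch, not an exhaustive classification)."""
    if status == 200:
        return "ok"
    if status == 301:
        return "permanent redirect - fine if intentional"
    if status == 302:
        return "temporary redirect - consider a 301 for pages that moved for good"
    if status == 404:
        return "broken link - fix it, and serve a custom 404 page"
    if 500 <= status < 600:
        return "server error - investigate immediately"
    return "review manually"
```

Run it over every internal link your crawler finds and sort pages by the returned label.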
  7. OnSite Checklist
     ✓ Sitemap - Your website should have both an .XML and an .HTML sitemap. Make sure the links inside match the links on your website.
     ✓ Robots.txt - You should have this file. Check that the sitemap is referenced correctly and that the right things are disallowed; you don't want your index.php in there.
     ✓ Content - It must be unique, meaning there is no duplicate content anywhere on the web.
     ✓ Alt tags and alt titles - Google loves alt tags and titles on images. People love them, too.
     ✓ Page speed - Slow pages are never any good. If your site is slow, use online tools to find the problem.
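The alt-tag item can be checked mechanically with Python's standard html.parser. This sketch collects `<img>` tags that lack a non-empty alt attribute; the sample markup is invented.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):             # absent or empty alt
                self.missing.append(attrs.get("src", "<no src>"))

checker = AltChecker()
checker.feed('<img src="a.png" alt="Chart of rankings"><img src="b.png">')
```

`checker.missing` then lists every image that needs an alt text written for it.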
  8. OnSite Checklist
     ✓ Pictures - It's not wise to have a large block of text without any pictures. Make sure your pictures are unique, and not too large.
     ✓ Mobile responsive - With the evolution of technology, every website should be mobile responsive, or at least have a mobile version.
     ✓ Text-to-code ratio - A higher ratio is better for your site.
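One common way to compute the text-to-code ratio is visible text length divided by total HTML length. Definitions vary between tools, so treat this as one plausible reading rather than a standard.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulates the text content of an HTML document, tags stripped."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_to_code_ratio(html):
    """Visible-text length over total-markup length, in [0, 1]."""
    if not html:
        return 0.0
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / len(html)
```

A page that is mostly navigation markup and scripts scores low; a content-rich article scores higher.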
  9. URLs - structured vs. dynamic
     ● Check whether there are different versions of your domain - with "www" and without. If both resolve, you have a problem that needs to be solved: Google treats them as duplicate content. A simple 301 redirect can help.
     ● Don't use dynamic URLs. Structured URLs are user friendly and SEO friendly.
     ● Make sure there are no broken URLs. There are online tools and browser add-ons to help you out.
     ● Keep URLs short. You are making a website for people, not for bots.
     ● Use subfolders instead of subdomains, e.g. subdomain.website.com should be website.com/subfolder.
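Turning a page title into the short, static URL segment the slide recommends can be as simple as a slugify pass. This is an illustrative sketch, not a full URL normalizer.

```python
import re

def slugify(title):
    """Turn a page title into a short, static, human-readable
    URL segment: lowercase, with non-alphanumeric runs collapsed
    to single hyphens."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")
    return slug
```

So instead of a dynamic `?page_id=42&session=...` query string, a CMS can serve `/the-importance-of-onsite-seo`.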
  10. Meta Data
     ● Title tag - This is the title that appears in the SERP (search engine results page). It's good to have your main keyword in it. The title should be attractive not only to spiders (crawlers, bots) but to users, too.
     ● Description tag - Even if it doesn't help much with ranking, a good description tag is essential: it's how you attract more user attention in the SERP.
     ● Keyword tag - It's not going to help you with ranking. Still, it's a common practice to add it to your pages.
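A quick length check catches titles and descriptions likely to be truncated in the SERP. The ~60/~160 character limits used here are common rules of thumb, not official Google numbers.

```python
def check_meta(title, description):
    """Flag meta data likely to be missing or truncated in the SERP.
    The 60/160 thresholds are widely used guidelines, not hard limits."""
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > 60:
        issues.append("title may be truncated in the SERP")
    if not description:
        issues.append("missing description")
    elif len(description) > 160:
        issues.append("description may be truncated in the SERP")
    return issues
```

An empty return value means the page passes this (very basic) check.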
  11. Sitemaps
     Sitemaps inform search engines about the pages available for crawling. An XML sitemap is a file that lists the URLs of a site, along with some bonus data - last update, how often a page changes, and its relative importance - which helps search engines crawl your site faster. An HTML sitemap is used to help users navigate the website; XML sitemaps are used only by search engines.
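A minimal XML sitemap following the sitemaps.org protocol can be generated with the standard library. The URLs and dates below are placeholders.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol:
    one <url> entry per page, with an optional <lastmod> date."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        if lastmod:
            ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("https://www.example.com/", "2015-06-01"),
    ("https://www.example.com/blog", None),
])
```

The protocol also allows optional `<changefreq>` and `<priority>` elements for the "how often it changes" and "importance" hints mentioned above.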
  12. Robots.txt
     "Robots.txt is a text file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site." - http://www.webconfs.com
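Python's standard urllib.robotparser can verify that such a file behaves as intended before you deploy it. The robots.txt content below is hypothetical and mirrors the checklist's index.php advice.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block index.php (to avoid the / vs /index.php
# duplicate), and point crawlers at the sitemap.
rules = """\
User-agent: *
Disallow: /index.php
Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
```

`parser.can_fetch(agent, url)` then answers, for any crawler name, whether a given URL is allowed under these rules.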
  13. Page Speed
     Page speed is one of the many criteria Google uses to rank your website. The faster it is, the better your chances of ranking higher in the SERP. Why? Faster sites make users happier. What can make your site slower?
     ● Unoptimized images
     ● Social sharing buttons
     ● Various ads
     ● Unnecessary JavaScript
     ● Bad CSS
     ● Not using Gzip
     ● Old server software
     Check your page speed: http://tools.pingdom.com/fpt/
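The Gzip item is easy to demonstrate: HTML markup is highly repetitive, so it compresses dramatically, which is exactly why serving it uncompressed wastes transfer time. The sample markup is invented.

```python
import gzip

# Invented, deliberately repetitive HTML - typical markup compresses well.
html = b"<html><body>" + b"<p>Lorem ipsum dolor sit amet.</p>" * 200 + b"</body></html>"

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)   # bytes on the wire vs. bytes served raw
```

In practice the web server (e.g. via a gzip/deflate module) does this per response; the point is only that the saving is large, not marginal.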
  14. Geoff Smith
     Blog: http://dgeneralist.blogspot.co.uk/
     Email: geoffthesmith@gmail.com
     Twitter: https://twitter.com/GeoffTheSmith
     Facebook: https://www.facebook.com/geoffrey.smith.73700
  15. Tools
     ● http://who.is/ - domain age
     ● http://www.worldservicedirectory.com/Seo/meta-tag-analyzer - meta data
     ● https://chrome.google.com/webstore/detail/redirect-path/aomidfkchockcldhbkggjokdkkebmdll?hl=en - redirects and 404 pages
     ● http://copyscape.com/ - duplicate content
     ● http://tools.pingdom.com/fpt/ - page speed
     ● http://ami.responsivedesign.is/ - mobile responsiveness
  16. Resources
     ➢ http://moz.com/
     ➢ http://www.searchenginejournal.com/
     ➢ http://www.webconfs.com
     ➢ http://www.moveoutmates.co.uk/
     ➢ http://searchengineland.com/
     ➢ http://www.bbc.co.uk/