Tools for SEO Onsite Audits

Presentation from the Jerusalem Web Professionals Seminar on Advanced SEO Tactics, delivered in Jerusalem on July 7, 2010.

  1. Mark Ginsberg, JWP, July 10, 2012
  2. In Browser
     ◦ Browser extensions: Chrome Sniffer, Mozbar, etc.
     ◦ SaaS tools: SEOmoz campaign tools, and lots more
     Standalone Applications
     ◦ Screaming Frog SEO Spider
     ◦ Xenu Link Sleuth
     ◦ IIS Crawler
  3. Chrome Sniffer
     ◦ Quickly find out what CMS the site is using: WordPress, Drupal, Magento, PrestaShop, etc.
     ◦ Are they using Google Analytics or another analytics tool?
     ◦ Which social plugins do they use?
     ◦ Click on the icon for more info about the page
     Canonical
     ◦ Shows if the page has a canonical tag different from the page you are currently visiting
  4. SEO Site Tools
     ◦ Lots of info about your site and your page
     ◦ What type of server are you running?
     ◦ External link data: PageRank, data from open APIs (SEOmoz, Alexa, SEMrush), indexed pages, lots more
  5. Mozbar
     ◦ Nice clean interface
     ◦ On-page data
     ◦ Backlink data
     ◦ HTTP status codes, canonical tags, and much more
  6. User Agent Switcher
     ◦ Allows you to change your browser's user agent
     ◦ View a webpage as Google does
     ◦ Use this to check whether a website is cloaking
  7. Quirk SearchStatus
     ◦ Lots of SEO information, like the Chrome plugins previously mentioned
     ◦ Shows other domains on the same IP (a shortcut to a Bing search)
     ◦ Highlights nofollow links; other tools do this too. I leave this on by default in Firefox and use Chrome as my primary browser
  8. Web Developer Toolbar
     ◦ Disable elements to mimic search engine bots
     ◦ Quickly disable: cookies, CSS, images, JavaScript
     ◦ Show link information, and check that links are not broken
  9. Is the site trustworthy?
     ◦ Would you buy from them?
     ◦ What is the site about, and does the homepage convey this?
     ◦ Would you give this site your credit card?
  10. Is the site cloaking, i.e. showing different content to you and to Googlebot?
      How to check: view the cache in Google
      ◦ In Chrome, just add cache: before the URL and hit Enter
      ◦ Change the user agent in Firefox to Googlebot and then view the page
      ◦ Compare to the live version you see in Chrome and other browsers
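
The user-agent comparison from this slide can also be scripted. Below is a minimal sketch, assuming the third-party `requests` library is installed and using a placeholder URL: it fetches the same page once with a browser user agent and once as Googlebot, then diffs the two responses.

```python
# Minimal cloaking check: fetch the same page as a browser and as
# Googlebot, then diff the two responses. Assumes `pip install requests`.
import difflib
import requests

BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36")
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch(url: str, user_agent: str) -> list[str]:
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    return resp.text.splitlines()

def cloaking_report(url: str) -> None:
    diff = list(difflib.unified_diff(fetch(url, BROWSER_UA),
                                     fetch(url, GOOGLEBOT_UA),
                                     fromfile="browser", tofile="googlebot",
                                     lineterm=""))
    if diff:
        print(f"{url}: responses differ ({len(diff)} diff lines), inspect manually")
    else:
        print(f"{url}: identical responses for both user agents")

cloaking_report("https://example.com/")  # placeholder URL
```

Dynamic pages (timestamps, session tokens, rotating ads) will differ between any two fetches, so a non-empty diff is a prompt for manual review, not proof of cloaking.
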
  11. [Side-by-side screenshots: what a normal user sees vs. what Googlebot sees]
  12. In our example, the cached version of the page doesn't show the cloaking/hack
      ◦ Google Instant Preview does, though
      ◦ Always check with multiple resources: cache, Instant Preview, user agent switching
      ◦ If cloaking is detected, work with the developer to clean it up
  13. Does the homepage redirect to another page?
      ◦ What type of redirect: 301 or other?
      www vs. non-www: is there only one version of the page?
      ◦ HTTP/1.1 301 Moved Permanently
      ◦ HTTP/1.1 200 OK
      ◦ Note: WordPress sites usually handle this well; other CMSs cause lots of problems
      Is there a canonical tag pointing somewhere else?
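
These header checks are easy to automate. A minimal sketch, again assuming `requests` and using placeholder hostnames: it requests the www and non-www variants without following redirects and prints each status code and Location header.

```python
# Check www vs. non-www canonicalization: exactly one version should
# answer 200 OK and the other should 301 to it. Assumes `pip install requests`.
import requests

def check_redirect(url: str) -> None:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    print(f"{url} -> {resp.status_code} {location}")

# Placeholder hostnames; substitute the site under audit.
for variant in ("http://example.com/", "http://www.example.com/"):
    check_redirect(variant)
```

A 302 where a 301 belongs, or a chain of hops instead of a single redirect, is worth flagging in the audit notes.
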
  14. Title tag
      ◦ Proper length?
      ◦ Keyword stuffed?
      ◦ Descriptive?
      ◦ Incorporates the brand name?
      Meta description
      ◦ Proper length?
      ◦ Keyword stuffed? Descriptive? Unique?
      Keywords tag
      ◦ Is it in use? Is it unique?
      Canonical tag
      ◦ Is there a canonical tag pointing somewhere else?
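
All of these head elements can be pulled out programmatically when spot-checking pages. A sketch assuming `requests` and `beautifulsoup4` are installed; the URL is a placeholder and the length thresholds are common rules of thumb, not hard limits.

```python
# Extract title, meta description, and canonical tag from a page and
# flag obviously out-of-range lengths. Assumes:
#   pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def audit_head(url: str) -> None:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = desc_tag.get("content", "") if desc_tag else ""
    canonical = soup.find("link", rel="canonical")

    print(f"Title ({len(title)} chars): {title!r}")
    if not 10 <= len(title) <= 70:  # rough rule of thumb, not a hard limit
        print("  -> title length looks off")
    print(f"Description ({len(desc)} chars): {desc!r}")
    if canonical is not None:
        print(f"Canonical: {canonical.get('href')}")
    else:
        print("No canonical tag found")

audit_head("https://example.com/")  # placeholder URL
```
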
  15. Is the navigation crawlable?
      ◦ Turn off JavaScript, images, and CSS in Firefox with the Web Developer Toolbar
      ◦ Can you still navigate to the main sections of the site?
      URL site structure
      ◦ Does the main navigation point to clean URLs or search-engine-unfriendly URLs?
      ◦ Do the URLs contain capital letters or spaces? Does the server force lowercase letters in the URLs?
      ◦ Live pages
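
Both questions on this slide can be approximated in code: parsing the raw HTML (with no JavaScript executed) is roughly what a crawler sees, and the extracted links can be screened for capital letters and spaces. A sketch under the same `requests`/`beautifulsoup4` assumptions:

```python
# List the links reachable in the raw HTML (roughly what a crawler sees,
# since no JavaScript runs) and flag URLs containing capitals or spaces.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def audit_links(url: str) -> None:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        href = urljoin(url, a["href"])
        flags = []
        if any(c.isupper() for c in href):
            flags.append("capital letters")
        if " " in href or "%20" in href:
            flags.append("spaces")
        if flags:
            print(f"{href}  <- {', '.join(flags)}")

audit_links("https://example.com/")  # placeholder URL
```
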
  16. site:
      ◦ Searches only the domain you specify; tells you how many pages are indexed
      ◦ Shortcut to this query from tools like SearchStatus & SEO Site Tools
      ◦ If not many pages are indexed, there could be a problem with navigation
      cache:
      ◦ Checks if the page is stored in Google's cache
      More advanced search queries:
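
If you run these operator queries often, they can be launched from a script. A trivial sketch using only the Python standard library; the domain is a placeholder.

```python
# Open Google's site: and cache: operator queries from a script,
# using only the standard library. The domain is a placeholder.
import webbrowser
from urllib.parse import quote_plus

domain = "example.com"
for query in (f"site:{domain}", f"cache:{domain}"):
    webbrowser.open(f"https://www.google.com/search?q={quote_plus(query)}")
```
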
  17. Robots.txt
      ◦ Check if the site has a robots.txt, and see what it's blocking
      ◦ Use Google Webmaster Tools to check crawler access and see if specific URLs are blocked by robots.txt
      ◦ Are they accidentally blocking access for all search engines? This blocks everyone:
          User-agent: *
          Disallow: /
      Sitemap
      ◦ Does the site have one? Is it linked in the robots.txt file?
      ◦ Does it include errors or URLs that are no longer live?
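
Python's standard library includes a robots.txt parser, so this check needs no dependencies. A minimal sketch with `urllib.robotparser`; the domain and paths are placeholders.

```python
# Check whether specific URLs are blocked by a site's robots.txt,
# using only the Python standard library.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder domain
rp.read()

for path in ("https://example.com/",
             "https://example.com/some-category/some-page/"):
    # "*" asks about the rules that apply to all crawlers
    blocked = not rp.can_fetch("*", path)
    print(f"{path}: {'BLOCKED' if blocked else 'allowed'}")

# If the homepage itself is blocked for "*", the site is probably
# shutting out every search engine (User-agent: * / Disallow: /).
```
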
  18. Content
      ◦ Is there enough content?
      ◦ Is it organized and laid out nicely?
      ◦ Interlinking
      ◦ Crawlable by the search engines?
      ◦ Check for duplicate content by searching for the text in quotes
      Meta data
      ◦ Title tags, meta descriptions, keywords
      ◦ Meta robots: are these pages blocked from the search engines?
  19. Why check SEO info one page at a time when you can look at the whole site, or sections of it, at once?
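
That is the pitch for the desktop crawlers covered next. To make the mechanics concrete, here is a deliberately tiny breadth-first crawler under the same `requests`/`beautifulsoup4` assumptions: it walks same-domain links from a placeholder start URL and records each page's status code and title. Real audit crawlers add politeness delays, robots.txt compliance, and much richer reporting.

```python
# Toy site-wide crawl: breadth-first over same-domain links, recording
# HTTP status and <title> per page. An illustration of what tools like
# Screaming Frog automate, not a replacement for them.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(start_url: str, max_pages: int = 50) -> None:
    domain = urlparse(start_url).netloc
    queue, seen = deque([start_url]), {start_url}
    fetched = 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        fetched += 1
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            print(f"{url}: fetch failed ({exc})")
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else "(no title)"
        print(f"{resp.status_code}  {url}  {title!r}")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)

crawl("https://example.com/")  # placeholder start URL
```
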
  20. Check crawl errors, etc. Supplement this data with Webmaster Tools data.
  21. Does a lot of good stuff as well. Not specifically an SEO tool, but it does produce relevant reports. It's free (the full version of Screaming Frog is £100 a year; the free version doesn't have all the features and only crawls 500 URLs).
  22. It gives a lot of data and information about the site. It's free and easy to install (if you have Windows 7); follow the instructions here -
  23. Independent SEO Consultant
      @markginsberg
      Google+
      Skype: msginsberg