
Seozone - 5 tips


5 actionable SEO tips by Mark Thomas from DeepCrawl

Published in: Data & Analytics


  1. How valuable was Search in 2015? @SearchMATH
  2. @SearchMATH Turkey
  3. Have you ever intentionally browsed products at a store, but decided to buy them online? Yes 68% / No 32%. Have you ever intentionally browsed products online, but decided to buy them in-store? Yes 70% / No 30%. @SearchMATH
  4. @SearchMATH “Web-rooming” is just as important as “showrooming”
  5. @SearchMATH 56% say the first thing they do when researching a purchase is to go to a search engine
  6. @SearchMATH
  7. @SearchMATH
  8. Today I’m going to look at “ON-THE-PAGE FACTORS”, AKA “TECHNICAL SEO” @SearchMATH
  10. Why does TECHNICAL SEO matter? @SearchMATH
  11. OK… nice article @SearchMATH
  12. But how healthy is their ON-THE-PAGE SEO? @SearchMATH
  13. @SearchMATH Let’s test the User Experience (UX)
  14. But do 404s hurt my site? @SearchMATH
  15. YES
  16. @SearchMATH
  17. @SearchMATH
  18. Chart: Website ‘Health’ impact on Commercial Performance (website health, rated 1–10, plotted against commercial performance, low to high)
  19. What do the world’s top SEO experts say? @SearchMATH
  20. "Even a basic understanding of what to look for in technical SEO can get you far…
  21. …So many people today focus too heavily on off-page SEO, but if a site is technically flawed, it won't matter how many links you have or how good your content is.”
  22. Isolate & Dominate Your Base Metric: 1) Organic revenue 2) Visits compared to last month 3) Visits compared year-over-year
  23. "The biggest problems we have are tech problems. Often webmasters [are] trying to be too clever and give confusing signals… John Mueller, Webmaster Trends Analyst, Google
  24. …Send clear, consistent and obvious signals.” John Mueller, Webmaster Trends Analyst, Google
  25. GOOGLE RECOMMENDS CRAWLING SEPARATELY. In a Google Webmaster Hangout on 16th October, John Mueller recommended running a separate crawl to help identify and resolve technical issues that could be causing problems and delays when Google tries to crawl your site: “We get kind of lost crawling all of these unnecessary URLs and we might not be able to crawl your new updated content.”
  26. I’m going to show you a case study to remind us of some of the power at our disposal @SearchMATH
  27. Let’s begin… @SearchMATH
  28. Where do I start with a technical review? @SearchMATH
  29. Jon suggests we “CRAWL” @SearchMATH TIP #1
  30. Independent web crawler software imitates Googlebot to produce rich reports detailing opportunities to achieve perfect website architecture. @SearchMATH
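The discovery step such a crawler performs can be sketched with the standard library alone: fetch a page's HTML, then extract and normalise the links a bot would queue next. This is a minimal illustration, not DeepCrawl's implementation; the sample HTML and URLs are made up.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links and drop #fragments,
                    # as a crawler would before queueing URLs.
                    absolute, _fragment = urldefrag(urljoin(self.base_url, value))
                    self.links.add(absolute)

def discover_links(base_url, html):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return sorted(parser.links)

# Illustrative page: one relative link, one absolute link, one fragment-only link.
sample = '<a href="/about">About</a> <a href="https://example.com/shop">Shop</a> <a href="#top">Top</a>'
print(discover_links("https://example.com/", sample))
```

Running this breadth-first from a start URL, and recording every response code along the way, is what turns a crawl into the page inventory the next slides describe.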
  31. “CRAWL” = DISCOVERY @SearchMATH
  32. An embarrassing truth for too many web managers… @SearchMATH
  33. They don’t even know how many pages they have on their website @SearchMATH
  34. Let alone how many issues are lurking below the surface @SearchMATH
  35. Establish a clear picture of your website and then consider how it fits into the URL universe @SearchMATH
  36. Recommend 5 actions from your audit: 3 quick wins and 2 long-term wins @SearchMATH TIP #2
  37. What’s the story with these rel=canonical links? @SearchMATH
  38. “Including a rel=canonical link in your webpage is a strong hint to search engines about your preferred version to index among duplicate pages on the web.”
  39. “rel=canonical can be a bit tricky because it’s not very obvious when there’s a misconfiguration.”
  40. rel=canonical to a Disallowed URL @SearchMATH
  41. Audit your canonical tags quarterly @SearchMATH TIP #3
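The misconfiguration the slide names, a rel=canonical pointing at a URL that robots.txt disallows, can be caught mechanically with Python's standard-library robot parser. The robots.txt rules and canonical URL below are hypothetical examples, not taken from the case study.

```python
from urllib import robotparser

# Illustrative robots.txt: the canonical target sits inside a disallowed path.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

canonical_url = "https://example.com/private/page"  # hypothetical canonical target

if not rp.can_fetch("Googlebot", canonical_url):
    print("WARNING: rel=canonical points at a URL crawlers are disallowed from fetching")
```

A quarterly audit could run this check over every canonical URL found in a crawl, flagging targets the crawler itself would never be able to verify.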
  42. Your robots.txt file holds the power over crawlers. Monitor for free with @SearchMATH TIP #4
  43. So, how can I fix this issue?
  44. Adding this directive in the robots.txt is quicker, cleaner and easier to manage than getting a meta noindex added to specific pages. @SearchMATH
  45. But ultimately the meta canonical needs amending to…
  46. DeepCrawl already supports the robots.txt noindex directive: check which pages are being noindexed in your report via Indexation > Non-Indexable Pages > Noindex Pages.
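The directive the slides describe would sit alongside normal Disallow rules; the paths below are illustrative. Note that `Noindex:` was never part of the robots.txt standard, and Google announced in 2019 that it would stop honouring it, so today the slide's alternative (a meta noindex on the pages themselves) is the supported route.

```
# robots.txt (illustrative paths)
User-agent: *
Disallow: /search/
Noindex: /filters/
```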
  47. Constantly optimise your architecture to improve crawl efficiency @SearchMATH TIP #5
  48. So, remember: 1) Crawl your website 2) Recommend 5 actions per audit 3) Audit your canonicals 4) Try for FREE 5) Optimise your crawl efficiency @SearchMATH
  49. Thank you! Please say “hi” @SearchMATH