Discovering SEO Opportunities through Log Analysis #DTDConf

  1. #loganalysis at #DTDConf by @aleyda from @orainti Discovering SEO Opportunities through Log Analysis
  2. “SEO crawling simulations are great, but there’s nothing like seeing your site’s ‘real’ search crawling behaviour”
  3. This can be done with Web server log files, which record all requests, including search bot accesses and errors
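To make the record format concrete, here is a minimal Python sketch that parses one log line and flags search-bot requests by user agent. The regex, field names, and sample line are illustrative assumptions based on the common Apache/Nginx “combined” log format; adjust them to whatever format your server actually writes.

```python
import re

# Assumption: logs use the Apache/Nginx "combined" format; adapt the
# pattern to your server's real log format before using it.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of fields for one access-log line, or None if it doesn't parse."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

def is_search_bot(entry):
    """Crude user-agent check; user agents can be spoofed, so verify IPs separately."""
    return any(bot in entry["agent"] for bot in ("Googlebot", "bingbot"))

# Hypothetical sample line showing a Googlebot request that returned 200 OK.
sample = ('66.249.66.1 - - [10/Oct/2018:13:55:36 +0000] '
          '"GET /products/shoes HTTP/1.1" 200 5120 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')
entry = parse_line(sample)
```

Filtering a whole file is then just `[e for e in map(parse_line, open("access.log")) if e and is_search_bot(e)]`.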
  4. They’re critical for monitoring Web activity, e.g. tracking structural changes or validating GSC errors you can’t replicate: you can’t find these URLs in your own crawls
  5. I’m (sadly) not Morpheus; however, today I’m going to show you what’s beyond search crawling simulations. Aleyda Solis: International SEO Consultant & Founder; Conference Speaker at +100 Events in +20 Countries; Author of “SEO. Las Claves Esenciales.”; Blogger at SEL & SEJ.
  6. Getting initial access will be the hardest part: you will need to coordinate with your Server Admin or DevOps. Tip: become their BFF to make your SEO life easier
  7. You can access them directly from your Web server using the command line in the Terminal
  8. You can also export the data & go through it with a code editor like Sublime or Brackets; however, this can be impractical for analysis
  9. For “manual” analysis you can also import the data into Excel, which will facilitate organisation and filtering
  10. To make analysis practical for larger sites, there are log monitoring solutions with dashboards & filtering, such as Loggly or Splunk
  11. There are now also SEO-focused log analysers, such as the Screaming Frog Log File Analyser or OnCrawl, that directly offer reports and segments to facilitate SEO analysis
  12. Some of them, like SEOlyzer, can be directly integrated with sites on either dedicated or shared hosting
  13. You can also use SEO crawlers, like DeepCrawl, that integrate log file data alongside other data sources to show existing gaps
  14. Let’s see some of the most critical SEO insights we can get through log analysis by asking certain questions…
  15. Are search bots effectively accessing & frequently crawling the important pages you want indexed & ranked?
  16. Do the same by checking which site areas and categories are crawled the most
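A quick way to get that breakdown, assuming you have already extracted the bot-requested paths from your logs (the sample paths below are hypothetical), is to group hits by their first path segment:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical bot-requested paths extracted from access logs.
bot_hits = [
    "/products/shoes", "/products/bags", "/products/shoes",
    "/blog/seo-tips", "/category/sale", "/products/hats",
]

def top_sections(paths):
    """Group crawl hits by first path segment to see the most-crawled site areas."""
    sections = Counter()
    for path in paths:
        segments = [s for s in urlparse(path).path.split("/") if s]
        sections[segments[0] if segments else "/"] += 1
    return sections.most_common()

ranking = top_sections(bot_hits)
```

On this sample, `/products/` dominates the bot activity; comparing that ranking against the sections you actually want crawled is the insight.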
  17. SEO log analysers will also let you specify filtering criteria to obtain the crawl behaviour for any site section
  18. What’s the share of your crawl activity going to error or redirected URLs?
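That share can be computed directly from the status codes of your bot entries; a small sketch with made-up sample codes:

```python
from collections import Counter

# Hypothetical status codes pulled from search-bot log entries.
statuses = [200, 200, 301, 404, 200, 302, 500, 200]

def crawl_share_by_class(codes):
    """Share of bot crawl activity per HTTP status class (2xx, 3xx, 4xx, 5xx)."""
    classes = Counter(f"{code // 100}xx" for code in codes)
    total = len(codes)
    return {cls: count / total for cls, count in classes.items()}

share = crawl_share_by_class(statuses)
```

On this sample, a quarter of the crawl activity hits redirects (3xx) and another quarter hits errors (4xx plus 5xx): budget that could be going to 200 OK pages instead.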
  19. Are your meaningful pages returning a 200 OK HTTP status, or are they showing errors?
  20. Are you returning any unintended redirects? To which bots? Is this behaviour consistent over time?
  21. Are you returning different HTTP statuses for the same URLs?
  22. Are there any gaps between your own crawls and the log file data? Are there any orphan URLs?
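The gap check is a set difference between the URLs your own crawler found and the URLs search bots requested in the logs (both sets below are hypothetical):

```python
# Hypothetical URL sets: one from your own crawl, one extracted from log files.
crawled_urls = {"/", "/products/", "/products/shoes", "/about"}
logged_urls = {"/", "/products/shoes", "/old-landing-page", "/about"}

# Orphan URLs: requested by bots but unreachable through internal links,
# so they never show up in your own crawl.
orphans = logged_urls - crawled_urls

# The opposite gap: pages in your site structure that search bots
# are not (yet) requesting at all.
not_crawled_by_bots = crawled_urls - logged_urls
```

Here `/old-landing-page` is an orphan still receiving bot hits, while `/products/` exists in the structure but gets no crawl activity; both gaps deserve investigation.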
  23. Is your crawl budget being wasted on URLs that aren’t meant to be accessed, non-relevant resources, or non-indexed URLs?
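One way to surface this waste, assuming you have the bot-requested URLs at hand (the paths and bucket rules below are purely illustrative), is to sort hits into pages, static resources, and parameterised URLs:

```python
from collections import Counter
from posixpath import splitext
from urllib.parse import urlparse

# Hypothetical bot-requested URLs; the bucketing rules are illustrative.
bot_paths = [
    "/products/shoes", "/assets/app.js", "/assets/style.css",
    "/search?q=shoes&page=2", "/products/bags", "/img/logo.png",
]

def waste_breakdown(paths):
    """Split bot hits into pages vs. static resources vs. parameterised URLs."""
    buckets = Counter()
    for raw in paths:
        parsed = urlparse(raw)
        ext = splitext(parsed.path)[1]
        if parsed.query:
            buckets["parameterised"] += 1
        elif ext in (".js", ".css", ".png", ".jpg", ".gif", ".svg"):
            buckets["static resource"] += 1
        else:
            buckets["page"] += 1
    return buckets

breakdown = waste_breakdown(bot_paths)
```

On this sample, two thirds of the bot hits go to static resources and parameterised URLs rather than indexable pages, the kind of share worth checking against your robots.txt and noindex rules.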
  24. Which are the largest and slowest crawled pages? Can they be optimised for speed?
  25. Do you have non-search-engine bots consuming your resources that you need to delay or block?
  26. Be mindful of bots “spoofing” search crawlers, which you can validate too
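Google documents this validation: reverse-DNS the requesting IP, check that the resulting host belongs to googlebot.com or google.com, then forward-resolve that host and confirm it maps back to the same IP. A Python sketch (the resolver arguments are injectable only so the logic can be exercised without network access):

```python
import socket

def is_genuine_googlebot(ip,
                         reverse_dns=lambda ip: socket.gethostbyaddr(ip)[0],
                         forward_dns=socket.gethostbyname):
    """Validate a claimed Googlebot IP via reverse + forward DNS lookup."""
    try:
        host = reverse_dns(ip)          # e.g. crawl-66-249-66-1.googlebot.com
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False                    # spoofers don't control these domains
    try:
        return forward_dns(host) == ip  # forward lookup must match the original IP
    except OSError:
        return False
```

The forward-confirm step matters: a spoofer can fake a user agent, but not make Google's DNS round-trip back to their own IP.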
  27. What’s your desktop vs. smartphone Googlebot crawl share? Verify whether you’re in the mobile-first index
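A rough way to measure that share from the user agents in your logs: the smartphone Googlebot identifies itself with an Android/“Mobile Safari” user-agent string, while the desktop one contains no “Mobile” token. The sample agents below are hypothetical log entries:

```python
from collections import Counter

# Hypothetical user-agent strings taken from Googlebot log entries.
agents = [
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
]

def googlebot_device_share(user_agents):
    """Classify Googlebot hits as smartphone vs. desktop by user-agent string."""
    counts = Counter(
        "smartphone" if "Mobile" in ua else "desktop"
        for ua in user_agents
        if "Googlebot" in ua
    )
    total = sum(counts.values())
    return {device: count / total for device, count in counts.items()}

share = googlebot_device_share(agents)
```

A clearly dominant smartphone share over time is one signal that the site has moved to mobile-first indexing.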
  28. And this is only the start
  29. Thanks!