
OnCrawl Afterwork x Bloom - January 2019

Case study - Become the Sherlock Holmes of digital by using OnCrawl
Find out how to tackle major SEO issues on e-commerce websites and set up an efficient SEO roadmap to improve your performance. Pascal Côté, SEO Director @Bloom, and Alice Roussel, Customer Success Manager @OnCrawl, demonstrate how a crawler such as OnCrawl can help detect constraints and translate them into operational, tangible actions.

  1. How to become Sherlock Holmes with OnCrawl
  2. Game plan 1. Introduction 2. Technical SEO, is it important? 3. Common cases and their solutions 4. To infinity, and beyond! #oncrawlafterwork
  3. Pascal Côté Director of SEO @ Bloom +7 years of dev/technical SEO Tech, marketing and video games enthusiast @axxurge Alice Roussel Customer Success Manager @ OnCrawl +5 years of experience in SEO Typical day: Google trademarks & log analysis @aaliceroussel
  4. Technical SEO, is it important? It’s not only important, it’s elementary my dear Watson! ● It’s one of the three fundamental pillars of SEO ● It’s something you have absolute control over ● It’s directly related to how your site is understood by search engines ● It is often a compromise between dev work and UX ● It sometimes needs a more tech-savvy crew #oncrawlafterwork
  5. What do we mean by “Technical SEO”? The simple definition Any action whose purpose is to make your website easier for search engines to understand and that requires an intervention in the site’s code. ● Managing URLs ○ Redirections ○ HTACCESS file ○ URL structure ● Adding or editing files ○ Robots.txt ○ Sitemap.xml ● Image optimization ○ “alt” attributes ○ Size reduction ● Compression and minification ○ External calls (queries, API, etc.) ○ CSS, JS and other files (a sketch of a few of these checks follows the slide list) #oncrawlafterwork
  6. How can OnCrawl help SEOs? Watson, we’ve got a problem...
  7. OnCrawl’s pretty good m’kay ● Cross-analysis with Google Analytics, Search Console, and third-party data (revenue, paid investment...). ● Custom segmentations: view your data from a specific angle. ● 500 native metrics to monitor the health status of a site.
  8. THE platform for optimizing your SEO OnCrawl SEO Crawler Analyze your site the way Google does OnCrawl Log Analyzer Track bot and visitor behavior OnCrawl Data Understand the influence of ranking factors on indexability
  9. +175 CLIENTS +50 BLOOMERS +12 YEARS OF EXPERIENCE From different industries (e-commerce, B2B, CPG, local, etc.) Paid media specialists, strategists, account managers, SEO specialists, sales and marketing, etc. Founded in 2007 by two ex-Luxury Retreats MAKEITBLOOM.COM Paid media campaign management Search engine optimisation Digital marketing strategy Conversion rate optimisation
  10. The culprits ● The Insecure eCommerce ● The Java what now? ● The Snowbird OnCrawl’s reach is extensive. Here are a few cases in which we used OnCrawl to help clients resolve specific issues.
  11. Case #1: insecure eCommerce
  12. Case #1: insecure eCommerce The world is full of obvious things which nobody by any chance ever observes. The situation ● The client has a site with more than 2 million URLs ● Many dozens of transactions per day, with an average order of $200 ● Organic traffic was stable, but there were clear signs of index bloat #oncrawlafterwork
  13. Case #1: insecure eCommerce OnCrawl’s intervention ● Detection of unsecured URLs ○ More than 180,000 URLs containing PII were indexed ● 95% of analyzed URLs had canonical issues ○ A common case when looking at other big eCommerce sites ● Only 11.4% of pages had an ideal load time: under 500 ms ○ 1 in 2 users will abandon their visit if it takes more than 3 seconds to load (a sketch of these checks follows the slide list) #oncrawlafterwork
  14. Case #1: insecure eCommerce Eliminate all other factors, and the one which remains must be the truth. The results ● 87% fewer URL errors ● Deindexation of unsecured URLs ● 250,000+ additional clicks and 2,000,000+ additional impressions in 5 months #oncrawlafterwork
  15. Case #2: Java what now?
  16. Case #2: Java what now? The situation ● The client operates a PWA (Progressive Web App), hard to parse for standard crawlers (including search engines) ● Indexation was impossible, due to an absent or empty sitemap ● The main content was unreadable, hidden behind non-executed JavaScript #oncrawlafterwork
  17. OnCrawl’s intervention ● OnCrawl was the first tool that allowed us to crawl JavaScript ○ It wasn’t possible with desktop tools like Screaming Frog (in older versions) ● Without JavaScript rendering, only the <head> section was readable ○ Do NOT use JavaScript rendering in your <head> section 🙏 ● We’ve “forced” the JavaScript crawl in order to generate a sitemap ○ The resulting file is then pushed to Google Search Console (a sketch of the sitemap-generation step follows the slide list) I cannot live without brain-work. What else is there to live for? Case #2: Java what now? #oncrawlafterwork
  18. The results ● Almost instant indexation, with resulting impressions and clicks ● Frequent crawls from Google on all submitted pages ● Submitted a “Prerender” version of the site to facilitate crawling Case #2: Java what now? #oncrawlafterwork
  19. Case #3: the Snowbird
  20. Case #3: the Snowbird There is nothing new under the sun. It has all been done before. The situation ● The client is planning its migration from CMS A to CMS B, many months in advance ● The URL structure will dramatically change, to something more coherent and relevant ● It is of utmost importance to understand the current site structure and prepare a migration plan #oncrawlafterwork
  21. OnCrawl’s intervention ● A pre-migration analysis reveals a complex URL structure on CMS A ○ The data visualization methods are much easier to understand in OnCrawl ● It is possible to quickly identify pages getting organic visits ○ Connecting Google Analytics and OnCrawl gives you the opportunity to see that data in the tool ● We create a mapping of all known URLs, prepare canonicals and we migrate! ○ Ultra-specific data segmentation allows OnCrawl to be efficient even in the most complicated cases (a sketch of the redirect-map check follows the slide list) Case #3: the Snowbird #oncrawlafterwork
  22. I never make exceptions. An exception disproves the rule. The results ● Redirection plan implemented, with very few unwanted 404 pages created ● Reattribution of internal popularity to relevant commercial pages ● Use of segmentation to detect any remaining old URLs from CMS A Case #3: the Snowbird #oncrawlafterwork
  23. Bonus
  24. Bonus #1: orphan pages Comparing the URLs found by the crawl with the URLs listed in the sitemap(s) allows you to identify orphan pages: pages that potentially consume your crawl budget unnecessarily. (a sketch of this comparison follows the slide list) #oncrawlafterwork
  25. Bonus #2: lots of redirects Detecting redirection chains and loops helps reveal exploration patterns in which Googlebot can get lost and abandon URL discovery. (a sketch follows the slide list) #oncrawlafterwork
  26. Bonus #3: crawl pattern Log data can be used to surface unexpected crawl patterns, for example in relation to an HTTPS migration. (a sketch follows the slide list) #oncrawlafterwork
  27. To infinity, and beyond! Using OnCrawl helped us kickstart our clients’ SEO performance. We’re not only able to find technical issues faster; it’s also much easier for us to explain our findings to our clients. Nothing clears up a case so much as stating it to another person. #oncrawlafterwork
  28. www.oncrawl.com Thank you! Start your free trial!
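
The technical SEO actions listed on slide 5 can be spot-checked with a few lines of code. Below is a minimal sketch, assuming the third-party requests package and a hypothetical URL, that looks at three of them: redirections, robots.txt rules and image "alt" attributes. It illustrates the kind of check involved, not OnCrawl's implementation.

```python
# Minimal sketch: spot-check a URL for redirects, robots.txt rules and
# missing image "alt" attributes. Assumes `requests`; the URL is hypothetical.
import requests
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.robotparser import RobotFileParser


class AltAuditor(HTMLParser):
    """Collects <img> tags that lack a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.missing_alt.append(attrs.get("src", "<no src>"))


def spot_check(url, user_agent="Googlebot"):
    # Redirections: fetch without following redirects to see the raw answer.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(f"{url} -> HTTP {resp.status_code} {resp.headers.get('Location', '')}")

    # Robots.txt: is this URL even crawlable for the given user agent?
    robots = RobotFileParser(urljoin(url, "/robots.txt"))
    robots.read()
    print(f"Allowed for {user_agent}: {robots.can_fetch(user_agent, url)}")

    # Image optimization: flag <img> tags without an alt attribute.
    if resp.status_code == 200:
        auditor = AltAuditor()
        auditor.feed(resp.text)
        print(f"Images missing alt: {auditor.missing_alt}")


if __name__ == "__main__":
    spot_check("https://www.example.com/some-page")
```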
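
For Case #1 (slide 13), the checks described, self-referencing canonicals and a sub-500 ms load time, can be approximated page by page. A minimal sketch follows; it assumes requests, uses a deliberately naive regex for the canonical tag, and the URL and budget are illustrative only.

```python
# Minimal sketch: check a page's canonical tag and response time.
# Assumes `requests`; the URL list and the 500 ms budget are illustrative.
import re
import time
import requests

# Naive extraction: real pages may list href before rel, or split attributes.
CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']', re.I)


def audit(url, load_budget_ms=500):
    start = time.monotonic()
    resp = requests.get(url, timeout=10)
    elapsed_ms = (time.monotonic() - start) * 1000

    match = CANONICAL_RE.search(resp.text)
    canonical = match.group(1) if match else None

    return {
        "url": url,
        "status": resp.status_code,
        "load_ms": round(elapsed_ms),
        "within_budget": elapsed_ms <= load_budget_ms,
        "canonical": canonical,
        "self_canonical": canonical == url,
    }


if __name__ == "__main__":
    for page in ["https://www.example.com/product?sessionid=123"]:
        print(audit(page))
```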
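
Slide 17 mentions generating a sitemap from the URLs discovered by the forced JavaScript crawl and submitting it in Google Search Console. A minimal sketch of that file-generation step; the file name and URL list are hypothetical and would come from the rendering crawl's export.

```python
# Minimal sketch: write a sitemap.xml from a list of discovered URLs.
# The URL list and output path are hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def write_sitemap(urls, path="sitemap.xml"):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)


if __name__ == "__main__":
    discovered = [
        "https://www.example-pwa.com/",
        "https://www.example-pwa.com/category/shoes",
    ]
    write_sitemap(discovered)
```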
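
For Case #3 (slide 21), once the old URL to new URL mapping exists, each entry can be verified after migration. A minimal sketch, assuming requests and a hypothetical CSV export with "old_url" and "new_url" columns, confirms that every old URL 301-redirects to its mapped target.

```python
# Minimal sketch: verify a redirect map after a CMS migration.
# Assumes `requests`; the CSV file name and column names are hypothetical.
import csv
import requests


def check_redirect_map(mapping_csv="redirect_map.csv"):
    problems = []
    with open(mapping_csv, newline="") as fh:
        for row in csv.DictReader(fh):  # expects "old_url" and "new_url" columns
            resp = requests.get(row["old_url"], allow_redirects=False, timeout=10)
            target = resp.headers.get("Location")
            if resp.status_code != 301 or target != row["new_url"]:
                problems.append((row["old_url"], resp.status_code, target))
    return problems


if __name__ == "__main__":
    for old_url, status, target in check_redirect_map():
        print(f"{old_url}: got {status} -> {target}")
```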
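
Bonus #1 (slide 24) compares the URLs found by the crawl with the URLs listed in the sitemap(s). The same comparison can be sketched as a simple set difference; the file names below are hypothetical, and OnCrawl performs this cross-check natively.

```python
# Minimal sketch: URLs listed in the sitemap but never reached by the crawl
# are orphan-page candidates. File names are hypothetical.
import xml.etree.ElementTree as ET


def sitemap_urls(path="sitemap.xml"):
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    tree = ET.parse(path)
    return {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}


def crawled_urls(path="crawled_urls.txt"):
    # One URL per line, e.g. an export of the crawl's link graph.
    with open(path) as fh:
        return {line.strip() for line in fh if line.strip()}


if __name__ == "__main__":
    orphans = sitemap_urls() - crawled_urls()
    print(f"{len(orphans)} orphan page candidates")
    for url in sorted(orphans):
        print(url)
```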
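
Bonus #2 (slide 25), redirect chains and loops, boils down to following Location headers hop by hop. A minimal sketch, assuming requests and an illustrative start URL and hop limit:

```python
# Minimal sketch: follow redirects hop by hop, reporting chains and loops.
# Assumes `requests`; the start URL and hop limit are illustrative.
import requests
from urllib.parse import urljoin


def trace_redirects(url, max_hops=10):
    chain, seen = [url], {url}
    while len(chain) <= max_hops:
        resp = requests.get(chain[-1], allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            return chain, False  # chain ended on a non-redirect response
        nxt = urljoin(chain[-1], resp.headers.get("Location", ""))
        if nxt in seen:
            return chain + [nxt], True  # loop detected
        chain.append(nxt)
        seen.add(nxt)
    return chain, False  # gave up after max_hops


if __name__ == "__main__":
    chain, is_loop = trace_redirects("http://www.example.com/old-page")
    label = "LOOP" if is_loop else f"{len(chain) - 1} hop(s)"
    print(label, " -> ".join(chain))
```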
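
Bonus #3 (slide 26) relies on log data. One way to surface a crawl pattern around a migration is to count Googlebot hits per day and per HTTP status: a spike of 301s right after an HTTPS migration, for example, would stand out. A minimal sketch follows; the log path and combined-log format are assumptions, and OnCrawl's Log Analyzer does this at scale.

```python
# Minimal sketch: count Googlebot hits per day and per status code from a
# combined-format access log. The log path and format are assumptions.
import re
from collections import Counter

# Matches the [day/Mon/year:...] timestamp, the quoted request line, and the status.
LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]*\] "[^"]*" (\d{3})')


def googlebot_hits(log_path="access.log"):
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            if "Googlebot" not in line:
                continue  # crude user-agent filter; verify via reverse DNS in practice
            match = LINE_RE.search(line)
            if match:
                day, status = match.groups()
                counts[(day, status)] += 1
    return counts


if __name__ == "__main__":
    for (day, status), hits in sorted(googlebot_hits().items()):
        print(day, status, hits)
```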
