Conductor is a company founded in 2006 that provides SEO support and data collection to over 5,000 brands. Brian McDowell from Conductor gave a presentation on SEO forensics tools and techniques. The presentation included case studies on recovering from link penalties and determining if a site was filtered or penalized. Conductor's platform collects 6 terabytes of SEO data weekly to help clients understand SEO performance.
2. THE CONDUCTOR TEAM
WHO WE ARE
•Founded in 2006
•Offices in New York and San Francisco
SEO LEADERSHIP
•150+ pieces of SEO thought leadership in 2012
•C3, the largest user conference in enterprise SEO
•Dedicated to raising the profile of search marketers
CONDUCTOR SEARCHLIGHT
•Supporting SEO success for over 5,000 leading brands
•$4.5B in SEO revenue under management
•6TB of SEO data collected weekly
@Conductor
3. AGENDA SLIDE
WHY WAS I BANNED?
CASE STUDY – FILTERED BY REDESIGN
CASE STUDY – IMPACT/RECOVERY OF A LINK PENALTY
GRASS ROOTS FORENSICS TOOLS
CASE STUDY – FILTERED BY IT LIMITATIONS
22. Clean Up & Reconsideration
http://galacommunity.org/archives/782
23. THANK YOU
We are hiring!
http://www.conductor.com/about/careers
BRIAN MCDOWELL
Director of Search Intelligence
Conductor, Inc
Editor's Notes
Penalties are not the same as filters. Too many times a company comes to me saying they have been penalized by Google, or hit by the latest algorithm change, without the data to back up that thesis. All sudden drops and gains should be investigated fully to identify the root cause.
Panda and Penguin were easy to find… Hummingbird, on the other hand… that one is elusive.
Great video by MC describing the difference between what he calls “Manual Penalties” and “Algorithmic Penalties”
First things first: determine whether the traffic loss is specific to a single search engine or occurs across the board.
Track the number of unique landing pages bringing traffic to your site pre- and post-traffic loss. This helps identify the specific content blocks that may be underperforming.
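The landing-page comparison above can be sketched in a few lines. This is a minimal illustration, not Conductor's tooling; the rows, dates, and cutoff below are all hypothetical stand-ins for a real analytics export.

```python
# Sketch: compare unique organic landing pages before and after a traffic drop.
# All rows are fabricated examples; real data would come from your analytics export.
rows = [
    # (date, landing_page, organic_sessions)
    ("2013-06-01", "/tvs", 120),
    ("2013-06-01", "/laptops", 90),
    ("2013-06-01", "/phones", 60),
    ("2013-07-01", "/tvs", 110),
    ("2013-07-01", "/laptops", 85),
]

CUTOFF = "2013-06-15"  # assumed date of the observed traffic loss

def landing_pages(rows, before):
    """Landing pages that received any organic traffic in the given period."""
    return {page for date, page, sessions in rows
            if (date < CUTOFF) == before and sessions > 0}

# Pages that earned traffic before the drop but none after it.
lost_pages = landing_pages(rows, before=True) - landing_pages(rows, before=False)
print(sorted(lost_pages))
```

The set difference surfaces exactly the content blocks that stopped earning organic traffic, which is usually a faster starting point than scanning aggregate session counts.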
Monitoring Google's accessibility errors, along with its crawl rate, will help identify any weaknesses in your infrastructure.
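One grass-roots way to watch crawl rate and accessibility is to read your own access logs. A minimal sketch, assuming a combined-log format; the sample lines and IPs below are fabricated, and in practice you would stream the real log file and verify Googlebot by reverse DNS, not user-agent string alone.

```python
import re
from collections import Counter

# Sketch: derive Googlebot crawl rate and status-code mix from raw access logs.
# These sample lines are fabricated; in practice, read them from your server log.
log_lines = [
    '66.249.66.1 - - [01/Jul/2013:10:00:00 +0000] "GET /tvs HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Jul/2013:10:00:05 +0000] "GET /old-page HTTP/1.1" 404 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [02/Jul/2013:09:30:00 +0000] "GET /tvs HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]

crawls_per_day = Counter()  # crawl rate: Googlebot hits per day
status_mix = Counter()      # accessibility: HTTP status codes served to Googlebot
for line in log_lines:
    if "Googlebot" not in line:
        continue
    day = re.search(r'\[(\d{2}/\w{3}/\d{4})', line)
    status = re.search(r'HTTP/1\.\d" (\d{3}) ', line)
    if day and status:
        crawls_per_day[day.group(1)] += 1
        status_mix[status.group(1)] += 1
```

A falling `crawls_per_day` or a rising share of 4xx/5xx in `status_mix` is exactly the infrastructure weakness the note above is asking you to catch.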
You need this data in advance – capturing performance information after the fact will help you manage recovery, but it makes forensics difficult. Get it now.
Often overlooked is the person with the keys behind every release your company has pushed out. Go back as far as a month before the drop and review any changes made to your production environment.
The good thing about filters is that they have thresholds: there is a distinct on/off switch.
Major electronics retailer – over 500,000 pages indexed in Google – Lost over 60% of their natural search traffic
Keyword rankings are not enough. You also need to follow your highest-ranking URLs and determine whether there are shifts in relevant authority. In this example we see Google having difficulty identifying the preferred URL in late June. This was the result of a poor redirect strategy put in place during a redesign.
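Tracking the preferred URL per keyword can be sketched as a simple change detector over rank-tracking history. The rows below are hypothetical, echoing the redesign case study where the ranking URL drifted to a parameterized variant.

```python
# Sketch: flag dates where the URL Google ranks for a keyword changed.
# The history rows are hypothetical rank-tracking data.
history = [
    # (date, keyword, ranking_url)
    ("2013-06-10", "flat screen tv", "/televisions"),
    ("2013-06-17", "flat screen tv", "/televisions"),
    ("2013-06-24", "flat screen tv", "/televisions?sid=123"),  # redirect mishap
    ("2013-07-01", "flat screen tv", "/televisions?sid=123"),
]

def url_shifts(history, keyword):
    """Return (date, old_url, new_url) for every preferred-URL change."""
    prev, shifts = None, []
    for date, kw, url in history:
        if kw != keyword:
            continue
        if prev is not None and url != prev:
            shifts.append((date, prev, url))
        prev = url
    return shifts

shifts = url_shifts(history, "flat screen tv")
```

A shift like the one on 2013-06-24 is the early-warning signal: the keyword may still rank while authority quietly leaks to the wrong URL.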
Major online retailer – over 4 million pages indexed in Google – introduced site search results to be indexed
Monitoring your 404s is mandatory for multiple reasons: it enables an efficient crawl and also acts as a form of link building. The big issue here is that the 404s almost equal the number of indexed pages. This makes it difficult for Google to crawl through the site and calculate relevancy. Search engines put limitations in place to help their hardware run as efficiently as possible.
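The 404-versus-index comparison is just a ratio check. The figures and the alert threshold below are assumptions for illustration (not published Google limits), standing in for numbers you would pull from Webmaster Tools crawl errors and index status.

```python
# Sketch: compare crawl-error volume to index size to gauge crawl-budget risk.
# Both figures and the threshold are assumptions, not Google-published values.
indexed_pages = 4_000_000   # pages reported as indexed (assumed)
not_found_404 = 3_600_000   # 404 crawl errors reported (assumed)

error_ratio = not_found_404 / indexed_pages
at_risk = error_ratio > 0.25  # illustrative alert threshold
print(f"404s equal {error_ratio:.0%} of the index; crawl budget at risk: {at_risk}")
```

When the error ratio approaches parity, as in the case study above, the crawler burns its budget on dead URLs instead of the pages you want evaluated.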
High visibility. A major e-commerce and brick-and-mortar retailer was called out by the national media.
Earlier in the year Google was receiving 5,000 reconsideration requests a week. You can now only submit these requests through the Google Webmaster Tools manual actions viewer. Make sure you have done everything possible to repair the issue before submitting.