Did your website get banned or penalized following a Google update such as Panda or Penguin, or did a manual penalty lock your site out of the search results? Learn what went wrong and how to recover in this session.
Penalties are not the same as filters. Too many times I see a company come to me and say they have been penalized by Google, or were hit by the latest algorithm change, without the proper data to back up that claim. All sudden drops and gains should be investigated fully to identify the root cause. Pandas and Penguins were easy to find… Hummingbirds, on the other hand, are elusive.
Great video by MC describing the difference between what he calls “Manual Penalties” and “Algorithmic Penalties”
First things first: determine whether the traffic loss is specific to one search engine or whether the decline is across the board.
Track the number of unique landing pages bringing traffic to your site before and after the traffic loss. This helps identify specific content blocks that may be underperforming.
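A quick way to do this, as a minimal sketch assuming an analytics export with hypothetical `date` and `landing_page` columns, is to split the unique landing pages into pre-drop and post-drop sets:

```python
from datetime import date

def split_landing_pages(rows, drop_date):
    """Split unique landing pages into pre-drop and post-drop sets.

    `rows` is any iterable of dicts with 'date' (YYYY-MM-DD) and
    'landing_page' keys -- e.g. csv.DictReader over an analytics export.
    """
    pre, post = set(), set()
    for row in rows:
        bucket = pre if date.fromisoformat(row["date"]) < drop_date else post
        bucket.add(row["landing_page"])
    return pre, post

# Pages in `pre - post` stopped earning traffic after the drop and are
# the first content blocks to inspect.
```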
Monitoring Google's reported accessibility errors, along with its crawl rate, will help identify any weaknesses in your infrastructure.
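As a rough sketch, assuming Apache-style access logs, you can also chart Googlebot's daily hit count from your own server side (the user-agent check here is naive, as noted in the comments):

```python
import re
from collections import Counter

# Matches Apache-style timestamps such as [15/Jun/2013:10:00:00 +0000].
DAY = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(lines):
    """Count apparent Googlebot requests per day in access-log lines.

    Matching on the user-agent string alone is naive -- it can be
    spoofed -- so verify hosts with reverse DNS before trusting this.
    """
    counts = Counter()
    for line in lines:
        if "Googlebot" in line:
            match = DAY.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts
```

A sudden drop in daily Googlebot hits is one of the earliest infrastructure warning signs you can capture yourself.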
You need this data in advance. Capturing performance information after the fact will help you manage the recovery, but it makes forensics difficult. Start collecting it now.
Often overlooked is the person holding the keys to every release your company has pushed out. Go back up to a month before the drop and review any changes made to your production environment.
The good thing about filters is that they have thresholds: there is a distinct on/off switch.
Major electronics retailer – over 500,000 pages indexed in Google – lost over 60% of its natural search traffic. Keyword ranking alone is not enough: you also need to follow your highest-ranking URLs and watch for shifts in relevant authority. In this example we see Google having a difficult time identifying the preferred URL in late June. This was the result of a poor redirect strategy put in place during a redesign.
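One way to audit a redirect strategy after a redesign is a hypothetical helper like the one below, which takes redirect hops you have already captured (for example with `curl -I`) and flags the patterns that commonly confuse crawlers:

```python
def audit_redirects(hops):
    """Flag redirect problems given a chain of (status, url) hops.

    Returns a list of human-readable issue strings (empty means clean).
    The hops must be captured beforehand, e.g. with curl -I or from
    your redirect map.
    """
    issues = []
    if len(hops) > 2:  # source, one redirect, destination is the ideal
        issues.append("chained redirect (%d hops)" % (len(hops) - 1))
    if any(status == 302 for status, _ in hops[:-1]):
        issues.append("temporary 302 where a permanent 301 is expected")
    urls = [url for _, url in hops]
    if len(set(urls)) < len(urls):  # a URL repeats within the chain
        issues.append("redirect loop")
    return issues
```

A clean migration is a single permanent hop: `audit_redirects([(301, "/old"), (200, "/new")])` returns no issues.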
Major online retailer – over 4 million pages indexed in Google – introduced site search results into the index. Monitoring your 404s is mandatory for multiple purposes: it enables an efficient crawl and also acts as a form of link building. The big issue here is that the 404s almost equal the number of indexed pages. This makes it difficult for Google to crawl through the site and calculate relevancy. Search engines put limitations in place to keep their hardware running as efficiently as possible.
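For the 404 side, a minimal sketch, assuming common-log-format access logs where the status code is the ninth whitespace-separated field:

```python
from collections import Counter

def status_breakdown(lines):
    """Tally HTTP status codes from access-log lines, assuming the
    common log format, where the status is the 9th whitespace field."""
    counts = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) > 8 and fields[8].isdigit():
            counts[fields[8]] += 1
    return counts

# If the "404" tally approaches your indexed-page count, the crawler is
# spending its limited budget on dead ends instead of real content.
```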
High visibility: a major e-commerce retailer, both online and brick and mortar, was called out by the national media.
Earlier in the year Google was receiving 5,000 reconsideration requests a week. You can now submit these requests only through the Google Webmaster Tools Manual Actions viewer. Make sure you have done everything possible to repair the issue before submitting.
Pubcon 2013 - Post mortem banned site forensics
Director of Search Intelligence
THE CONDUCTOR TEAM
WHO WE ARE
•Founded in 2006
•Offices in New York and San Francisco
•150+ pieces of SEO thought leadership in 2012
, the largest user conference in enterprise SEO
•Dedicated to raising the profile of search marketers
•Supporting SEO success for over 5,000 leading brands
•$4.5B in SEO revenue under management
•6TB of SEO data collected weekly
WHY WAS I BANNED?
CASE STUDY – FILTERED BY REDESIGN
CASE STUDY – IMPACT/RECOVERY OF A LINK
GRASS ROOTS FORENSICS TOOLS
CASE STUDY – FILTERED BY IT LIMITATIONS
Penalty vs Filter
Penalty Case Study