Ultimate Guide to Google's Panda Update

This e-book contains everything you need to know about Google's Panda update. It is broken down into four main sections:

- What is the Panda update?
- Why certain websites were affected by the Panda update
- How to check if your website was affected
- How to reverse or prevent the Panda penalty

Created by http://www.learn2rank.com/

Document Transcript

Ultimate Guide to Google's Panda Update
December 19th, 2011 | Published by: jeffox4d

Introduction to Google's Panda Update

First off, let me start by saying that because of Google, I HATE pandas. I don't even eat at Panda Express anymore, for obvious reasons. SEOs vs. Google's Panda is a lot like snowmen vs. a volcano. But despite the lack of success stories, there is hope.

Since the first Panda update back in February 2011, there have been many articles written about Panda and what webmasters should and shouldn't do. Some have been very insightful and others pretty much worthless. After reading nearly every article published about the Panda update, I would like to share with you what the general consensus on Panda is, as well as some of my own personal findings. My goal here is to equip all the webmasters and SEOs of the world with the right weapons to slay the panda.

So let's get right to it. Grab some coffee and take some notes because, ladies and gentlemen, it's panda hunting season…

What is the Panda Update?

The name "Panda" comes from its creator, Navneet Panda, a Google software engineer. Simply put, the Panda algorithm attempts to put every website into one of two categories: good or bad. Sounds easy, right? Not quite; there is a lot that goes into this simple task. Essentially, here is how Google does it:

1. Google starts with a sample of many different websites with various levels of quality.
2. Google's quality raters look at the websites individually and place them into one of five categories: Vital, Useful, Relevant, Slightly Relevant, and Off-Topic/Useless.
3. Using machine learning, Google looks at what metrics the bad websites have in common.
4. Google applies these metrics to all websites on the web. Those that share the same metrics as the bad websites get penalized. This usually happens once every 4-6 weeks.

Even though the Panda update increased the quality of Google's search results, many quality websites were negatively affected. Anytime a computer is given the task of thinking like a human, many mistakes are going to be made.

Why Was Panda Created?

Panda was put in place to reduce spam in Google's search results by specifically targeting "scraper" websites and other low-quality websites. A scraper website, sometimes known as an "autoblog," is a blog that steals content from other websites and publishes it automatically. Usually a scraper website will have ads or affiliate links in an attempt to make some quick cash.
These websites filled the web with thousands upon thousands of pages of duplicate content. Not only did this negatively impact the search results, but occasionally the copied content would outrank the original source. As a result, Google was under a lot of pressure to clean up its search results, and the Panda update was born.

Why Was My Site Penalized by Panda?

Without working for Google, no one knows for sure what will get a website pandalized (panda + penalized). But using correlation studies along with some tips from Google, we have a good idea of what the Panda is after. Below are possible factors that put a website at risk of the Panda penalty:

• Duplicate content between websites on the web
• Duplicate content within a website
• Poor visitor interaction

Duplicate Content Between Your Website and Others on the Web

In my personal experience, having duplicate content between your website and other websites is the quickest way to get slapped with the Panda penalty. This has been a problem for nearly every Panda-affected website I have come across.

As mentioned above in the "Why Was Panda Created?" section, Panda was put in place to target websites that were stealing content. So it is no surprise that having duplicate content will get your site penalized. Here are some examples of how this duplicate content can come about:

• You take content from other websites and post it on your own site to get some extra ad revenue. Panda comes along and sees hundreds or even thousands of pages of duplicate content. You get a much-deserved Panda penalty.
• You are the ideal webmaster and spend hours writing all unique content. Other sites take your content and post it on their websites. Google doesn't know who the original source is and just sees that you have the same content as a bunch of other websites. You get penalized (yes, this does happen) and punch a hole in the wall.
• You have an eCommerce website with hundreds or thousands of products. To save time, you (along with all the other eCommerce sites) use the manufacturer's provided product description. Google sees that you have hundreds or even thousands of pages of duplicate content. Your website gets penalized and your sales plummet.

Duplicate Content Within Your Website

Duplicate content within a website can also get your site penalized if you are not careful. This type of duplicate content usually comes in two different forms:

1. Multiple pages with identical, or nearly identical, content: All the content across your website should be unique. Copying and pasting content while only changing a few keywords can get you into trouble fast. To avoid a Panda penalty, write unique content on all your pages.
2. One page with multiple URLs: In Google's eyes, every different URL is a different page. This means that, to Google, www.opencart.com and www.opencart.com/index.php are two different pages with duplicate content. Make sure each page on your site is reachable at only one URL.

Eliminating duplicate content within your website is good in general, not just for Panda. Search engines have trouble deciding which page to index and rank when duplicate content is present, and duplicate content caused by multiple URLs can divide link juice between the different versions of the page and lower rankings.

Poor Visitor Interaction

People usually don't stay on a spammy website very long. In fact, I would bet that the majority of visitors to spammy websites leave almost immediately. Why am I telling you this?
Think about it: if all spammy websites share a high bounce rate, wouldn't Google want to factor this into their Panda algorithm? Yup! And that's why many SEOs, including myself, believe that usage metrics such as bounce rate can affect whether or not your website gets pandalized. But having a high bounce rate alone isn't what Google cares about. Let me explain with two examples where the user bounces from the website…

• Good User Path: Joe wants to know when the next Olympics are. He goes to Google and searches "when is the next olympics". He ends up at the Olympic Games page and quickly sees that the next Olympic Games are July 27 – August 12, 2012. Joe is satisfied with his result and leaves the page.
• Bad User Path: Mike wants to buy Call of Duty: Modern Warfare 3. He goes to Google and searches "buy call of duty modern warfare 3". He ends up on a page at www.buycallofduty.info and realizes that this ad-filled page will not help him buy the game he desires. Mike goes back to the search results and clicks on a page from www.amazon.com. This time Mike is satisfied with his search result and goes on to buy the game.

The first example above shows that there are times when you can expect to have users bounce from your website. These are usually times when a user is looking for specific information such as a phone number for a business, the capital of a country, a food recipe, etc.

On the other hand, the second example shows a user path that could indicate to Google that your website is crap. Sure, a few of these types of visits won't hurt you, but if this occurs on a regular basis you may be in some trouble.

How to Check if You Were Penalized

If you are already 100% positive you were mauled by the panda, you can skip this section and move on; otherwise, let me explain how you can quickly figure out if you were affected. This step-by-step tutorial will tell you exactly how to diagnose your traffic and figure out if you were bitten by the beast.

1. Before we get started, make sure you are using the new version of Google Analytics. If you are not sure which version you are using, refer to the screenshots. [Screenshots: old vs. new Google Analytics interface]
2. We need to set up an advanced segment to filter out all the irrelevant traffic such as direct traffic, referral traffic, Bing traffic, etc. If you are new to Analytics, don't worry; this is easier than it sounds. Start off by opening the "Advanced Segments" drop-down menu at the top and selecting "+ New Custom Segment".
3. Once inside the "Advanced Segments" drop-down, we need to name the segment and start adding the necessary filters, aka "dimensions". When all filters are properly added, your screen should look something like this: [Screenshot: completed Advanced Segment filters] The first filter ensures we are only looking at traffic from Google, but since this could still contain Google AdWords traffic we need to add another filter. The second filter makes sure all traffic is coming from organic search, aka not pay-per-click. Then we need to filter out any branded keywords; these are usually keywords containing your domain name. Finally, we are ready to save our Advanced Segment and start looking at the traffic.
4. Now we want to see all of the traffic history from just before the first Panda update through yesterday. With the new version of Google Analytics, set the date range from 2/1/11 to yesterday's date.
5. Look for any noticeable drops in traffic and compare them to the dates of known Panda updates. A typical drop in traffic from Panda is at least 25% and happens overnight. [Screenshot: traffic graph with a sharp overnight drop on a Panda update date]

Here is a list of all Panda updates thus far:

• February 23, 2011
• April 11, 2011
• May 9, 2011
• June 16, 2011
• July 23, 2011
• August 12, 2011
• September 28, 2011
• October 3, 2011
• October 13, 2011
• November 18, 2011
• ~Early January, 2012 (expected)

The example above shows a particular website that was hit by Panda on June 16, 2011. The sudden drop in traffic is typical of a Panda penalty. If you have a significant traffic drop (>20%) on the same date as a Panda update, you were likely penalized. But don't worry! In part 4 of this Panda series I will explain exactly what steps you can take to reduce and hopefully eliminate the Panda penalty.

Prevent and Reverse the Panda Penalty

Many times people think they got hit by the Panda update when they actually received a penalty for spammy links. Positive that Google's Panda mauled your traffic? Great! Let's jump in and see what we can do to tame the beast.

Basically, there are a few factors that result in a website getting slapped with the Panda penalty:

• Duplicate content between your website and other websites on the web
• Duplicate content within your website
• Poor visitor interaction

Fixing each of these issues is easier said than done. But don't worry, I'm going to hold your hand every step of the way and together we can slay the Panda.

Remove Duplicate Content & Write Unique Content

First things first: make sure all the content on your website is unique. Copying and pasting content is the fastest way to get pandalized. I hate using general tips like "write unique content" because they are so vague, but for many websites this is the only option.
1. Go to CopyScape.com and check for duplicate content. CopyScape is an amazing tool that searches the web for duplicate content. With the free version you can only check one URL at a time. If you have a larger site and a larger wallet, then I would recommend checking a batch of URLs with Copyscape Premium. Keep in mind that checking for duplicate content in batches will cost you $0.05 per URL.
2. Noindex or rel=canonical all pages with duplicate content. Every page that CopyScape identifies as a duplicate of other pages on the web needs to go. There are many ways to do this, but the two easiest are noindexing the page or using the rel=canonical tag. The noindex code tells search engines not to index the page. The rel=canonical tag tells Google that the page is a duplicate of another page. A general rule of thumb is to canonical pages with backlinks and noindex the others. Why? Because the canonical tag passes link juice to whatever page you point it to. This may be a bit confusing, so a short sketch of both tags is included a little further below.

Both the noindex and rel=canonical code are placed in the <head> of your source code. Once these tags are added to your duplicate pages, it will take anywhere from one day to one month before Google re-crawls the pages and removes them from the index. The time it takes for Google to re-crawl these pages depends on the size of your website and the number of backlinks. Once this is done, it's time to sit back and wait for Google to re-run the Panda algorithm. This occurs about once every 4-8 weeks; the next Panda update should occur sometime in early January 2012.

Implement the Rel=Author Tag on Your Articles

I would highly recommend implementing the rel=author tag if your articles are being taken by scraper sites. The rel=author tag is a fairly new piece of HTML code that Google is now reading. It tells Google who the original source of an article is and sometimes adds a portrait of the author to the search engine results page.

[Screenshot: example of the rel=author author portrait in the search results]

Matt Cutts has hinted that using rel=author is, or could be, a ranking factor if you are the source of an article. And as of today, Google announced that they will be providing author stats in Google Webmaster Tools. These are two big indicators that rel=author is already being used as a signal by Google and could become even more of a ranking factor in the future.

In order to utilize the rel=author code you must have a Google+ account. Google uses your Google+ account to confirm that the article is associated with the author and to pull the portrait thumbnail into the SERPs. Google has provided some easy-to-follow steps on implementing rel=author, but essentially you link your articles to your Google+ profile and your Google+ profile back to your site.
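Circling back to step 2 above, here is a minimal sketch of the two options for a duplicate page. The page titles and the example.com URL are placeholders added for illustration, not taken from the original guide.

  <!-- Option 1: the duplicate page has no backlinks, so simply noindex it -->
  <head>
    <title>Duplicate Article (placeholder)</title>
    <meta name="robots" content="noindex, follow">
  </head>

  <!-- Option 2: the duplicate page has backlinks, so point a canonical tag -->
  <!-- at the original version to pass its link juice along -->
  <head>
    <title>Duplicate Article (placeholder)</title>
    <link rel="canonical" href="http://www.example.com/original-article/">
  </head>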
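And here is roughly what the rel=author markup on an article might look like. The author name and Google+ profile ID below are made up; you would also add your site under "Contributor to" on that Google+ profile so the link is verified in both directions.

  <!-- Hypothetical article byline linking to the author's Google+ profile -->
  <p>
    Written by
    <a href="https://plus.google.com/112233445566778899000" rel="author">Jeff Example</a>
  </p>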
Not only will rel=author tell Google that you are the original author of the content, but the profile picture in your search result will increase the number of clicks to your website. You can't go wrong!

Clean up Internal Duplicate Content

Now that we have addressed duplicate content between different websites, it's time to focus on internal duplicate content issues. The reason Panda cracked down on internal duplicate content is that many spammy websites were automatically creating hundreds or thousands of duplicate pages while only changing a few keywords. This would allow them to target many keywords with very little effort (emphasis on "would").

So what does this mean for you? It's quite simple: any duplicate content within your website has got to go. There are two main reasons why a website would have internal duplicate content:

1. You are automatically creating hundreds or thousands of duplicate pages to target more keywords
2. You have pages with multiple URLs

If you belong to the first group and are automatically creating hundreds or thousands of pages with essentially duplicate content, it's time to stop. Unless you want to get hit by Panda, you need to either rewrite them all as unique content or noindex/canonical the duplicate pages.

Having multiple URLs for a single page is a common issue on most websites. This can be a problem because, in Google's eyes, every separate URL is a separate page. That means that to Google and other search engines, www.opencart.com and www.opencart.com/index.php are different pages.

Here is a more concrete example of a duplicate URL issue: the mikesbikeshop.com/red-bikes page adds a ?sort=A-Z parameter to the URL whenever someone sorts the page alphabetically. This creates multiple URLs for the same page, and therefore duplicate content. Mike should put a canonical tag pointing to the clean /red-bikes URL on that page (the same rel=canonical tag sketched in the previous section), and, as an SEO best practice, on every other page of his website as well.

If you are using WordPress, then I would recommend installing the All in One SEO Pack because it will do all of this for you; otherwise, you will need to do it page by page. If you have a large website, hire a programmer to automate it for you.

Lower Your Bounce Rate

Google knows when someone searches a keyword, clicks on your site, and immediately leaves your website to click on another search result. It is likely that these types of interactions signal to Google that your website is low quality and may be a good fit for the Panda penalty. There are a few ways you can lower your bounce rate (a rough markup sketch follows this list):

• Have good, relevant content that the searcher is looking for.
• Add a video. Videos can significantly lower the bounce rate of a page and increase the average time on site.
• Don't target irrelevant keywords just because they have a high number of monthly searches.
• Add a clear call to action to get your visitors to go to another page.
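For the video and call-to-action tips, a minimal markup sketch might look something like this. The video file path and the link target are made up for illustration.

  <!-- Hypothetical: an embedded video to keep visitors on the page longer -->
  <video src="/videos/choosing-the-right-bike.mp4" controls width="640" height="360">
    Sorry, your browser does not support embedded video.
  </video>

  <!-- Hypothetical: a clear call to action pointing visitors to another page -->
  <p>
    <a href="/red-bikes/">Ready to ride? Browse our full range of red bikes</a>
  </p>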
These tips will help lower your bounce rate and reduce your risk of being penalized by Panda. But at the end of the day, if users are consistently bouncing from your pages it may be a sign of a deeper issue. Put yourself in the searcher's shoes and figure out what they would want to see for any given search query. If you can do that, then you are one step closer to being a successful internet marketer.

Conclusion

For better or worse, Google's Panda update has forever changed SEO. The update determines the quality level of any given website by looking at duplicate content and usage metrics. Sometimes Google gets it right, other times not so much. Either way, this update is just another way of Google pushing webmasters to provide higher-quality content.