2. Caffeine Algorithm
• The Caffeine update is a major overhaul of Google's search indexing system, first introduced in 2010.
• It was designed to improve the speed and accuracy of Google's search results, and it has had a major impact on the
way that websites are ranked in search results.
• One of the main differences between the Caffeine update and previous updates made by Google is the way that it
processes and indexes web pages.
• Previously, Google rebuilt its index in large batches: a web crawler would visit a website, follow links on the page,
and the collected pages would be processed and indexed together in periodic refreshes.
• The Caffeine update, on the other hand, analyzes the web in small portions and updates the index continuously,
allowing it to index web pages almost as soon as they are published.
• This means that new content can be indexed and appear in search results much faster than before.
• In Google's own words: "To keep up with the evolution of the web and to meet rising user expectations, we've built Caffeine."
• Searchers want to find the latest relevant content and publishers expect to be found the instant they publish.
3. • The old index had several layers, some of which were refreshed at a faster rate than others; the main layer would update
every couple of weeks.
• To refresh a layer of the old index, Google would analyze the entire web, which meant there was a significant delay
between when a page was found and when it became available in search results.
• Caffeine lets us index web pages on an enormous scale. In fact, every second Caffeine processes hundreds of thousands
of pages in parallel.
4. • Another key difference between the Caffeine update and previous updates is the way that it evaluates the relevance
and quality of web pages.
• Previous updates relied heavily on the use of keywords and meta tags to determine the relevance of a web page to a
particular search query.
• The Caffeine update, however, takes a more holistic approach, considering factors such as the relevance and quality of
the content, the authority of the website, and the user experience.
• One of the goals of the Caffeine update was to improve the relevance of search results for long-tail queries, which are
more specific and less common than short-tail queries.
• To do this, the update placed a greater emphasis on the context and meaning of words, rather than just their
individual occurrences.
• It also introduced new algorithms to better understand the relationships between words and concepts, which helped
to improve the accuracy of search results for complex queries.
5. Panda Algorithm
• Google Panda first launched in February 2011 as part of Google’s quest to eliminate black hat SEO tactics and webspam.
• The stated purpose of the Google Panda algorithm update was to reward high-quality websites and diminish the
presence of low-quality websites in Google’s organic search engine results.
Triggers for Panda
Thin content
Duplicate Content
Low-quality content
Lack of authority/trustworthiness
Content farming - Large numbers of low-quality pages, often aggregated from other websites. For example, a content
farm might be a website that employs large numbers of writers at a low wage to create short articles covering a vast
variety of search engine queries, producing a body of content that lacks authority and value to readers because its core
purpose is simply to gain search engine rankings for every conceivable term.
Low-quality user-generated content (UGC) - An example of this type of low-value User Generated Content would be a
blog that publishes guest blog posts that are short, full of spelling and grammatical errors and lacking in authoritative
information.
High ad-to-content ratio - Pages made up mostly of paid advertising rather than original content.
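Some of these triggers, such as duplicate or near-duplicate content, can be approximated programmatically. The sketch below is my own illustration, not Google's actual method: it measures overlap between two pages using Jaccard similarity over word shingles, a common near-duplicate detection technique.

```python
def shingles(text, k=3):
    """Split text into overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two pages with heavily overlapping copy score close to 1.0.
page_a = "our widgets are the best widgets money can buy today"
page_b = "our widgets are the best widgets money can buy online"
print(jaccard_similarity(page_a, page_b))
```

Pages whose similarity approaches 1.0 are strong near-duplicate candidates; a real system would also normalize boilerplate and punctuation before comparing.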
6. How can I know if I’ve been hit by Panda?
• One signal of a potential Panda penalization is a sudden drop in your website’s organic traffic or search engine rankings
correlating with a known date of an algorithm update.
• However, it’s vital to bear in mind that many things can result in lost rankings and traffic.
• These include the rise of a competitor in your market (look at who is outranking you to see if someone new has moved
ahead of you), manual penalties (check Google Search Console for reported issues), expected seasonal dips in consumer
interest (like a ski lodge in July), or even a completely different Google update than the one you suspect (for example,
Penguin instead of Panda).
• When a known, named update occurs, it’s important to study industry documentation of practices being cited as involved
in the update.
• If your loss of rankings or traffic corresponds with a known date, go through these industry lists of bad practices to discover
if they are taking place on your website.
• Then, if you believe you’ve found a correlation between bad practices and an update, act to remedy the situation.
7. How to recover from Panda
• Abandoning content farming practices
• Overhauling website content for quality, usefulness, relevance, trustworthiness and authority
• Revising the ad/content or affiliate/content ratio so that pages are not dominated by ads or affiliate links
• Ensuring that the content of a given page is a relevant match to a user’s query
• Removing or overhauling duplicate content
• Careful vetting and editing of user-generated content and ensuring that it is original, error-free and useful to readers,
where applicable
• Using the robots meta tag's noindex and nofollow directives to block the indexing of duplicate or near-duplicate
internal website content or other problematic elements
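The noindex and nofollow directives mentioned above are placed in a robots meta tag in the page's head. For example:

```html
<!-- Tells search engine crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

This keeps the duplicate page out of the index without removing it from the site itself.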
8. Penguin Algorithm
• Following on the heels of Panda, the Penguin update was announced by Google as a new effort to reward high-quality
websites and diminish the search engine results page (SERP) presence of websites that engaged in manipulative link
schemes and keyword stuffing.
• Penguin was first launched in April 2012.
Triggers for Penguin
Link schemes - The development, acquisition or purchase of backlinks from low-quality or unrelated websites, creating an
artificial picture of popularity and relevance in an attempt to manipulate Google into bestowing high rankings.
Keyword stuffing - Populating a webpage with large numbers of keywords or repetitions of keywords in an attempt to
manipulate rank via the appearance of relevance to specific search phrases. For example, an unnatural repetition of
keywords on a given page might look like: "We sell custom humidors. Our custom humidors are handmade. If you're
thinking of buying a custom humidor, contact our custom humidor specialists."
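One crude signal of keyword stuffing is an unusually high keyword density. The toy sketch below is an illustrative heuristic of my own, not Google's actual algorithm, and handles only single-word keywords:

```python
def keyword_density(text, keyword):
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = "cheap humidors cheap humidors buy cheap humidors now cheap humidors"
print(keyword_density(stuffed, "humidors"))  # 4 of 10 words
```

Natural prose rarely repeats a single term in a large fraction of its words; densities this high would stand out against any reasonable baseline.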
9. How can I discover if I’ve been hit by Penguin?
• First, it’s important to differentiate between Penguin and a manual penalty for unnatural linking.
• In brief, Penguin is a Google index filter applicable to all websites, whereas a manual penalty is specific to a single
website that Google has determined to be spamming.
• These manual penalties may be the result of a given website being reported by Google users for spam, and it’s
also speculated that Google may manually monitor some industries (like payday loan companies) more than
others (like cupcake bakeries).
• If your website’s analytics show a drop in rankings or traffic on a date associated with a Penguin update, then you
may have been affected by this filter.
• Be sure you’ve ruled out expected traffic fluctuations from phenomena like seasonality (for instance, a Christmas
tree farm in April), and carefully evaluate whether your keyword optimization or linking practices would be
deemed spammy by Google, making your site vulnerable to an update like Penguin.
10. How to recover from Penguin
• The removal of any unnatural links over which you have control, including links you’ve built yourself or have
caused to be placed on 3rd party websites
• The disavowal of spammy links that you can’t control
• The revision of your website’s content to remedy over-optimization, ensuring that keywords have been
implemented naturally instead of robotically, repetitively or nonsensically on pages where there is no
relationship between the topic and the keywords being used.
• In sum, Penguin was created to remedy a severe weakness in Google’s system that enabled its algorithm
to be ‘tricked’ by large numbers of low-quality links and the keyword over-optimization of pages.
• To avoid having your website devalued by Google for spam practices, all content you publish should reflect
natural language, and your link-earning-and-building practices must be deemed “safe.”
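The disavowal mentioned above is done by uploading a plain-text file to Google's disavow links tool: each line lists a single URL or an entire domain to disavow, and lines beginning with # are comments. A hypothetical example (the domains are placeholders):

```
# Sites we asked to remove links, with no response
domain:spammy-directory.example.com
https://link-farm.example.org/page-linking-to-us.html
```

Disavowing a whole domain with the `domain:` prefix is usually safer than listing individual URLs, since spam sites often link from many pages.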
11. EMD Algorithm
• The Exact-Match Domain (EMD) Update is a change in Google’s ranking algorithm launched on 27th September, 2012.
• A domain is considered an “exact match” if the search phrase is mostly part of, or identical to, the domain name.
• The exact-match domains update is a filter aimed at ensuring that low-quality websites do not attain a high
PageRank and rise in Google’s SERPs (search engine results pages) just because they have the relevant
search term in their domain names.
12. • Keyword domains are not really being “punished” or removed from the search results.
• The algorithm update “EMD-Update” actually targets keyword domains with thin or bad content and
increasingly sends these domains to the back of the rankings.
• In general, Google has lowered the influence of domain names as a ranking factor even further with this
update.
• It is still possible to attain good rankings with a keyword domain, provided that the quality of the
content is up to par. Using a keyword domain as a ranking advantage, as was possible in the past,
has lost much of its effect thanks to this algorithm update.
• The intent behind this update was not to target exact match domain names exclusively, but to target sites
with the following combination of spammy tactics: exact match domains that were also poor quality sites
with thin content.
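The "exact match" notion can be pictured with a toy check. This is my own sketch, not Google's implementation, and it tests only the strictly identical case: whether the query's words, joined together, form the domain's registrable label (ignoring hyphens and the TLD).

```python
def is_exact_match_domain(query, domain):
    """True if the query's words, concatenated, equal the domain's first
    label with hyphens removed (e.g. 'buy cheap shoes' vs 'buy-cheap-shoes')."""
    label = domain.lower().split(".")[0].replace("-", "")
    return "".join(query.lower().split()) == label

print(is_exact_match_domain("buy cheap shoes", "buy-cheap-shoes.example"))  # True
print(is_exact_match_domain("buy cheap shoes", "shoestore.example"))        # False
```

Under the EMD update, a match like the first one confers little advantage on its own; the content quality of the site decides whether the domain ranks.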
13. Page Layout Algorithm
• Google’s ‘above the fold’ page layout algorithm is a Google Search ranking factor that Google introduced in 2012,
saying it made the change because it wanted to “provide more relevant search results.”
• In other words, Google wants to do a better job of showing you what you’re looking for on the first few pages of
Google Search results.
• The page layout algorithm update targeted websites with too many static advertisements above the fold.
• These ads would force users to scroll down the page to see content.
• Google said this algorithm would affect less than 1 percent of websites, but the sites that were affected
were forced to design a better user experience.
• Wherever the user experience was weak, websites with excessive ads had to rethink their web design.
• After all, ad revenue suffers far more from zero traffic than from running fewer ads.
• As a website owner, I can see where some people might have been upset.
• When it’s your page, with your ads, you’d think you should be able to organize it however you want and not be
penalized.
• The truth is that people won’t want to visit a site if they feel bombarded with ads.
• They’ll get frustrated and look for whatever they were searching for – whether it’s birthday party ideas or
expert golf advice – somewhere else.
• If you click on a website and the part of the website you see first either doesn't have a lot of visible content
above-the-fold or dedicates a large fraction of the site's initial screen real estate to ads, that's not a very
good user experience. Such sites may not rank as highly going forward.
• This algorithmic change does not affect sites that place ads above-the-fold to a normal degree, but affects
sites that go much further to load the top of the page with ads to an excessive degree or that make it hard
to find the actual original content on the page.
• Overall, our advice for publishers continues to be to focus on delivering the best possible user experience
on your websites and not to focus on specific algorithm tweaks.
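The "excessive ads above the fold" idea can be pictured as a simple area ratio. The sketch below is hypothetical: the fold height and element measurements are made-up inputs for illustration, not anything Google exposes or documents.

```python
FOLD_HEIGHT = 600  # pixels visible without scrolling (assumed viewport)

def above_fold_ad_ratio(elements):
    """`elements` is a list of (kind, top, height) tuples in pixels; returns
    the fraction of above-the-fold pixel height occupied by ads."""
    ad = total = 0
    for kind, top, height in elements:
        # Count only the portion of each element visible above the fold.
        visible = max(0, min(top + height, FOLD_HEIGHT) - top)
        total += visible
        if kind == "ad":
            ad += visible
    return ad / total if total else 0.0

# A page whose first screen is dominated by two large ad slots.
page = [("ad", 0, 250), ("ad", 250, 250), ("content", 500, 400)]
print(above_fold_ad_ratio(page))
```

A page like this, where most of the initial viewport is advertising and the original content starts near the fold, is exactly the layout the update targeted.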
16. Mobile-Friendly Update
• In the April 21, 2015 post announcing the rollout, Google gave a quick three-bullet list of what this update would impact:
• Affects only search rankings on mobile devices.
• Affects search results in all languages globally.
• Applies to individual pages, not entire websites.
• This was not just an algorithm update; it was a cultural shift, and Google was about to move the market.
• Google is obsessed with improving the user experience as much as possible and aligning it with user behavior and
trends in the market.
• Why does Google care so much about a user’s experience with their search engine? Mainly because the bulk of
their revenue still comes from paid ads.
• They want to provide the best experience possible so people keep clicking on them and funding their free lunches
and robotic dreams.
• To check if your site is mobile-friendly, you can examine individual pages with the Mobile-Friendly Test or check
the status of your entire site through the Mobile Usability report in Webmaster Tools.
• If your site's pages aren't mobile-friendly, there may be a significant decrease in mobile traffic from Google
Search.
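A common first step toward mobile-friendliness is a responsive viewport declaration in each page's head. For example:

```html
<!-- Scales the page to the device width instead of a fixed desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without this tag, mobile browsers render the page at a desktop width and shrink it, producing the tiny text and horizontal scrolling that the mobile-friendly criteria penalize.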
17. Core Algorithm
• Several times a year, Google makes significant, broad changes to its search algorithms and systems. In its own words:
• We refer to these as core updates, and we give notice when they happen on our list of Google Search ranking updates.
• Core updates are designed to ensure that overall, we're delivering on our mission to present helpful and reliable results
for searchers.
• In fact, there's nothing in a core update that targets specific pages or sites.
• Instead, the changes are about improving how our systems assess content overall.
• These changes may cause some pages that were previously under-rewarded to do better in search results.
• As explained, pages that experience a change after a core update don't have anything wrong to fix.
• That said, we understand that those who may not be performing as well after a core update change may still feel they
need to do something.
• Broad core updates tend to happen every few months.
• Content that was impacted in Search or Discover by one might not recover—assuming improvements have been made—
until the next broad core update is released.
19. Recent Algorithms
March 2023 Core Update
• Core algorithm updates can impact search rankings, making it crucial for SEOs and website owners to stay informed
and adapt strategies.
• Monitor site metrics, focus on quality content, and optimize technical aspects to maintain a strong presence in search
results.
• Prioritize the user experience and provide valuable content to prepare for future algorithm updates.
February 2023 Product Reviews Update
• The reviews system is designed to evaluate articles, blog posts, pages or similar first-party standalone content
written with the purpose of providing a recommendation, giving an opinion, or providing analysis.
• It does not evaluate third-party reviews, such as those posted by users in the reviews section of a product or services
page.
• Reviews can be about a single thing, or head-to-head comparisons, or ranked lists of recommendations.
• Reviews can be about any topic.
• There can be reviews of products such as laptops or winter jackets, pieces of media such as movies or video games,
or services and businesses such as restaurants or fashion brands.
20. December 2022 Link Spam Update
• This update strengthened the power of SpamBrain to neutralize the impact of unnatural links on search results.
• SpamBrain is Google's AI-based spam-prevention system.
• Besides using it to detect spam directly, Google can now use it to detect both sites buying links and sites used for the
purpose of passing outgoing links.
• Rankings may change as spammy links are neutralized and any credit passed by these unnatural links is lost. The
launch affects all languages.
• As we have always emphasized, links obtained primarily for artificial manipulation of Search rankings are link spam.
• Our algorithms and manual actions aim to nullify these unnatural links at scale, and we will continue to improve our
coverage.