How to deal with Google updates and leverage big data to get an advantage out of a holistic search approach - Workshop by Marcus Tober, Founder & CEO of Searchmetrics at the NOAH 2013 Conference in London, Old Billingsgate on the 14th of November 2013.
Searchmetrics - NOAH13 London
1. WORKSHOP
"How to deal with Google updates and leverage big
data to get an advantage out of a holistic search
approach"
Marcus Tober
11/14/2013
12/6/2013 ® Searchmetrics Inc. 2013 │Page 1
3. Made with Love in
Berlin
120 passionate
people
Innovator in SEO
Software since 2007
Follow us
twitter.com/searchmetrics
facebook.com/Searchmetrics
4. Searchmetrics – leading vendor of search
and social analytics software
with international focus
If you want to get the deck:
marcus@searchmetrics.com
5.
6.
7. IT IS IN GOOGLE'S INTEREST TO ONLY
SHOW RELEVANT SEARCH RESULTS
10. "Brands are the
solution, not the
problem. Brands are
how you sort out the
cesspool."
"The internet is fast
becoming a cesspool
where false information
thrives" – Eric Schmidt,
2010
38. Small things make a difference
39. Google's "Panda Questions" (out of 23)
What counts as a high-quality site?
1. Would you trust the information presented in this article?
…
3. Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
40. Panda - Problem of Redundancy and SPAM
Up to 82,100 results for a
Wikipedia article?
41. Panda - Problem of Redundancy and SPAM
57. Penguin Loser UK
holiday-rentals.co.uk
New vs. Lost Links
A very good strategy: building new links while removing low-quality links.
Google separates the wheat from the chaff: its algorithm decides which content is "wheat" (relevant content) and which is "chaff" (spam). Google keeps getting better at this by updating its algorithm.
When the algorithm was less complex than it is today, people claimed to have found the "golden nugget": the one big ranking factor with the largest effect on good ranking positions. This type of SEO was like a simple pocket knife: only a couple of tools to choose from.
As a result of all the updates, there is now a wide range of factors that influence search results, all of which potentially correlate with good ranking positions.
Because Google evaluates content with an algorithm, we can study how it does so. We want to know which factors Google's algorithm takes into account when evaluating content. This is how Searchmetrics can explain how tweaks to the algorithm impact the performance of websites in search engines.
As a result, search results have become more relevant and SEO has become far more complex.
A Google algorithm update is no more than a tweak to the "ranking factors" in the algorithm. We will look at two major types of updates: Panda and Penguin.
… and consequently make suggestions based on how the best-performing competitors have built their content.
Prevent these problems by monitoring these factors. Example: 50% similarity to another well-ranking URL, and 15 (!) pieces of content targeting only 4 keywords…
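Monitoring this kind of content overlap can be automated. The following is a minimal, hypothetical sketch (not Searchmetrics' actual tooling): it compares pages on a site pairwise with a crude Jaccard word-overlap measure and flags pairs above a similarity threshold. The URLs, texts, and the 0.5 threshold are illustrative assumptions.

```python
# Hypothetical sketch: flag pages on your own site whose content overlaps
# heavily, risking Panda-style redundancy problems. Thresholds are illustrative.
from itertools import combinations

def jaccard_similarity(text_a, text_b):
    """Jaccard similarity over lowercase word sets (a crude duplicate signal)."""
    words_a, words_b = set(text_a.lower().split()), set(text_b.lower().split())
    if not words_a or not words_b:
        return 0.0
    return len(words_a & words_b) / len(words_a | words_b)

def flag_redundant_pages(pages, threshold=0.5):
    """Return (url_a, url_b, similarity) for pairs above the threshold."""
    return [
        (url_a, url_b, round(jaccard_similarity(pages[url_a], pages[url_b]), 2))
        for url_a, url_b in combinations(pages, 2)
        if jaccard_similarity(pages[url_a], pages[url_b]) >= threshold
    ]

# Toy example: two near-duplicate pages targeting the same keywords.
pages = {
    "/red-shoes": "buy cheap red shoes online best red shoes shop",
    "/red-shoes-cheap": "buy cheap red shoes online best cheap shoes deal",
    "/blue-hats": "stylish blue hats for winter warm and cozy",
}
print(flag_redundant_pages(pages))  # → [('/red-shoes', '/red-shoes-cheap', 0.75)]
```

In practice one would use shingled n-grams or cosine similarity over TF-IDF vectors rather than raw word sets, but the monitoring idea is the same.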
Again, by studying the algorithm we can find correlating factors that indicate good content. Penguin is an update about backlinks. Our studies show that the impact of correlating link factors has changed, for instance between 2012 and 2013.
For example, this factor used to correlate well with higher ranking positions, but its weight has now been heavily reduced.
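The kind of correlation study described above can be illustrated with a small sketch. This is not Searchmetrics' actual methodology; it is a minimal Spearman rank-correlation implementation over invented data, where the backlink counts and SERP positions are illustrative assumptions. A strong negative correlation means the factor's value rises as the position number falls, i.e. it correlates with better rankings.

```python
# Illustrative sketch: measure how a candidate ranking factor (here, backlink
# count) correlates with SERP position using Spearman rank correlation.
def ranks(values):
    """Assign 1-based ranks; ties receive the average of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    rank = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            rank[order[k]] = avg
        i = j + 1
    return rank

def spearman(xs, ys):
    """Spearman correlation = Pearson correlation computed on the ranks."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: backlink counts for the URLs at SERP positions 1..5.
positions = [1, 2, 3, 4, 5]
backlinks = [900, 750, 400, 120, 80]  # more links at better (lower) positions
print(round(spearman(positions, backlinks), 2))  # perfectly monotonic: -1.0
```

Rerunning such a study on 2012 versus 2013 data would show a factor's correlation shrinking when, as the slide notes, Google reduces its weight.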