SEO vs Angular
Website Migration
https://www.gokam.co.uk/ June 2018
“
Why are we
talking about
this?
2
3
Enjoy this graph of a website migrated to
full JS #SEO
https://twitter.com/operbet/status/1016234468997255169
SEO Migration
What's at stake?
How does Google
analyze websites?
Crawl & Index
Illustration :
HP
HP
HP
HP
Steps
10
Listing /
prioritisation
New URLs
Indexation
Words/links extraction
performed by the Web Rendering
Service (included in 'Caffeine')
Crawl/fetch
URLs by Googlebot
03
01 02
Google limits crawl rates based on several criteria.
Made official by Google in Jan. 2017
Crawl budget
11
Crawl Budget criteria
12
- Difficulty: Google crawls more or less assiduously
depending on the server's response times and its
error rate.
- Need for updating: a static site that is rarely updated will
not be crawled often.
- Need for exploration: based on the site's popularity
HP
HP
Duplicate
content
Empty
HP
Deindexed
How to make an “SEO”
migration
Crawl & Index
Google wants to make sure that
1. same owner
17
Make sure of the owner
- No whois change
- No domain name change
- No topic change
18
Make sure of the owner
- No whois change
- No domain name change
- No topic change
(if possible)
19
Google wants to make sure that
1. same owner
2. same content
20
“
For Google, there is no
greater irrelevance than
sending users to empty
pages or error pages
21
Old New
Old New
Ideal: The URLs remain the same
Old
New
"Page by page" 301 redirects
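A page-by-page 301 mapping can be sketched in nginx like this (hostnames and paths are illustrative, not from the deck):

```nginx
# Sketch of page-by-page 301 redirects for a migration.
# Every old URL gets its own explicit target; avoid blanket
# redirects to the homepage.
map $request_uri $new_uri {
    default          "";
    /old-about.html  /about;
    /old-team.html   /company/team;
}

server {
    listen 80;
    server_name www.example.com;

    if ($new_uri) {
        return 301 https://www.example.com$new_uri;
    }
}
```

The `map` keeps the old-to-new table in one place, which also makes it easy to audit that no old URL is left unredirected.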
Google wants to make sure that
1. same owner
2. same content
3. popularity is not usurped
25
Old
New
New
A migration often means:
29
- In the case of a domain name change, it is essential
to change as little else as possible at the same time and
to declare the migration explicitly in the Search Console.
A significant increase in crawl rate for a few days.
The new website can go down, as the web cache is not
fully warmed up yet.
A migration often means:
30
A migration often means:
If the share of 301s is too high, the site's SEO traffic will drop
until Google "understands" the new site.
If old URLs remain on the Web, and even worse in the
internal navigation, the 301s can stay in place for a long time.
31
What about Angular?
Crawl & Index
“
Google says:
we are generally able to
render and understand your
web pages like modern
browsers
33
Source : https://webmasters.googleblog.com/2015/10/deprecating-our-ajax-crawling-scheme.html
“
Just because Google is the
one behind Angular, it
doesn't mean that the
framework is the best for
SEO
34
“
35
Source : https://webmasters.googleblog.com/2015/10/deprecating-our-ajax-crawling-scheme.html
Google: “we recommend
following the principles of
progressive enhancement.”
What are the options?
No architecture modifications
Optimize
client side rendering
37
option n°1
Critical part
38
03
01 02
⚠
Listing /
prioritisation
New URLs
Indexation
Words/links extraction
performed by the Web Rendering
Service (included in 'Caffeine')
Crawl/fetch
URLs by Googlebot
Requirement: Googlebot must be able to
access all JS/CSS files
Beware of the robots.txt file
Good practices
39
option n°1
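One way to satisfy that requirement, a minimal robots.txt sketch (the asset paths are assumptions; adjust them to the real build output):

```
# Hypothetical robots.txt for an Angular app.
# Blocking JS/CSS here would leave the Web Rendering Service
# unable to render the pages.
User-agent: *
Allow: /*.js$
Allow: /*.css$
Allow: /assets/
Disallow: /api/
```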
“5 seconds rule
“There’s no specific timeout value that we
define, mostly because the time needed
for fetching the resources is not
deterministic or comparable to a browser
(due to caching, server load, etc). 5
seconds is a good thing to aim for, I
suspect many sites will find it challenging
to get there though ”
40
option n°1
*Source : John Mueller, Webmaster Trends Analyst at Google, 19/05/2017
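As an illustration of that 5-second budget on the application side, a small timeout wrapper (the helper name and default value are assumptions, not a Google API):

```typescript
// Sketch: reject a resource promise that misses the ~5 s budget
// quoted above, so slow fetches become visible during testing.
function withTimeout<T>(promise: Promise<T>, ms = 5000): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`timed out after ${ms} ms`)),
      ms,
    );
    promise.then(
      (value) => { clearTimeout(timer); resolve(value); },
      (error) => { clearTimeout(timer); reject(error); },
    );
  });
}
```

Wrapping each critical fetch this way shows, in testing, which resources would likely miss the rendering window.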
Good practices: ensure links discoverability
- Use pushState so that user and bot navigation stay consistent
- Expose to Google all the links needed to discover every
page
41
option n°1
*Source : https://developers.google.com/search/docs/guides/rendering
- The rendering service uses Chrome 41
- A version that is 3 years old
- ES6 JavaScript only partially supported: no "let", for example
- Cookies, local storage and session storage are cleared across
page loads
- Use Google's rendering tools to test:
- Google mobile / Speed test
- Chrome dev tool / audit
- Fetch in the search console
Good practices : test everything
42
option n°1
*Source : https://developers.google.com/search/docs/guides/rendering
Good practices
43
Don't forget metadata:
- For Google: title, meta description, canonical tag
- For Facebook: og:title, og:description
You should not rely on the XML sitemap alone to get pages indexed
option n°1
*Source : https://developers.google.com/search/docs/guides/rendering
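As a sketch of the metadata each route should end up emitting (the `PageMeta` shape and helper are illustrative, not an Angular API; in a real Angular app the `Title` and `Meta` services would set these):

```typescript
// Minimal sketch: the tags a rendered page should expose for
// Google and Facebook, with values supplied per route.
interface PageMeta {
  title: string;
  description: string;
  canonical: string;
}

function renderMetaTags(meta: PageMeta): string[] {
  return [
    `<title>${meta.title}</title>`,
    `<meta name="description" content="${meta.description}">`,
    `<link rel="canonical" href="${meta.canonical}">`,
    `<meta property="og:title" content="${meta.title}">`,
    `<meta property="og:description" content="${meta.description}">`,
  ];
}
```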
The indexing/interpretation step will require far more
resources from Google.
If the list of pages waiting for the WRS grows too long,
pages will sometimes be treated as empty until the service
gets to them.
Indexation will be slower no matter what, and will not go
as deep. "Good SEO is efficiency"
The limits of the approach
44
option n°1
*Source : https://developers.google.com/search/docs/guides/rendering
A flawed, Google-only solution
The limits of the approach
45
option n°1
*Source : https://developers.google.com/search/docs/guides/rendering
From #! to the _escaped_fragment_ format
Google AJAX Crawling Specification
46
option n°2
47
*Source : https://developers.google.com/webmasters/ajax-crawling/docs/specification
Officially
deprecated since
October 2015
Not an option
48
Equivalence between app URL and backend URL
The user browses
/#!key1=value1&key2=value2
Googlebot crawls
/?_escaped_fragment_=key1=value1%26key2=value2
Not an option
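The mapping above can be sketched as a function (a simplification of the deprecated spec: only the characters the old specification listed are escaped):

```typescript
// Deprecated AJAX-crawling scheme: turn an app URL containing #!
// into the URL Googlebot used to fetch instead.
function toEscapedFragment(appUrl: string): string {
  const [base, fragment = ''] = appUrl.split('#!');
  const escaped = fragment
    .replace(/%/g, '%25')   // escape % first to avoid double-escaping
    .replace(/&/g, '%26')
    .replace(/#/g, '%23')
    .replace(/\+/g, '%2B');
  const separator = base.includes('?') ? '&' : '?';
  return `${base}${separator}_escaped_fragment_=${escaped}`;
}

// toEscapedFragment('/#!key1=value1&key2=value2')
//   → '/?_escaped_fragment_=key1=value1%26key2=value2'
```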
Popular Hack
Prerender.io
49
option n°3
50
In a few words
It is a service that interprets your
JavaScript code in a virtual browser, then
serves the result as static HTML to
indexing bots.
option n°3
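The core decision such a service's middleware makes can be sketched like this (the bot list is illustrative, not Prerender.io's actual list):

```typescript
// Sketch: serve the prerendered static HTML only when the user
// agent looks like an indexing bot; regular users get the SPA.
const BOT_PATTERNS: RegExp[] = [
  /googlebot/i,
  /bingbot/i,
  /yandex/i,
  /facebookexternalhit/i,
  /twitterbot/i,
];

function shouldPrerender(userAgent: string): boolean {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent));
}
```

In practice the middleware proxies matching requests to the rendering service and passes everything else straight to the app.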
“
"Make sure this content is
strictly identical."
Otherwise it's cloaking 💀
52
option n°3
https://support.google.com/webmasters/answer/66355?hl=en&ref_topic=6001971
https://support.google.com/webmasters/answer/66353
Prerender.io was still using the
_escaped_fragment_ syntax until recently.
Today the setup should be the same for all bots.
53
_escaped_fragment_
option n°3
54
- Open source
- Popular
- Multi-framework (Angular/React/Vue.js)
- A fairly old, well-documented solution
- Possibility to self-host the service or pay for an
online service ($35 per month for 50,000 with a
7-day cache)
- Makes the crawl very fast / resource-efficient
Pros
option n°3
- Considered a hack
- Complexifies the architecture (more to maintain / monitor)
- Many are looking to replace this solution with
something more "elegant"
55
Cons
option n°3
Sponsored Hack
Rendertron
56
option n°4
57
In a few words
Like prerender, Renderstron is a service
that interprets your Javascript code in a
virtual browser, but it distributes the result
on the fly to Bots
option n°4
Rendertron
Rendertron
Open source
"Dockerized"
Supported by the Chrome and Firebase teams
Multi-bot
60
Pros
option n°4
61
Cons
Very recent (about 1 year old)
option n°4
Server-side rendering
Angular Universal
62
option n°5
63
- The idea is to use SSR to render the
first page of the application. When the user
navigates, we switch back to client-side
rendering
- Same behavior for users/bots
- The "best of two worlds"
In a few words
option n°5
SSR
SSR
CSR
SSR
CSR
The best solution in the best of all worlds
Open source
Most flexible / powerful solution
Easy to test: it is "enough" to disable JS
Speed gain
Google, and especially the web quality team, seems to
think this will be the main solution in the future
68
Pros
option n°5
Most complex solution
Not yet battle-tested with Angular; interested people have turned
to other solutions (hacks, React/Next, Vue/Nuxt)
Adds a lot of constraints on both sides.
Examples:
- Only CommonJS for Node.js (babel-node/ts-node)
- "Universal is not able to use global objects" (document, window,
localStorage, etc.), though they can potentially be injected
- "The DOM must only be handled by Angular"
69
Cons
option n°5
Thanks to SSR, it is possible to build a fast,
ergonomic and SEO-friendly application
But this requires substantial resources
Airbnb did so using React, but had to become an
active contributor to the project to get there.
70
Airbnb
option n°5
Keep the current tech...
71
option n°6
72
Cut things in half
Serve SEO content with
legacy tech
http://www.example.com/
Serve the app using Angular
http://app.example.com/
option n°6
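The split above could look like this in nginx (paths are illustrative; the hostnames match the example.com domains on the slide):

```nginx
# Crawlable, server-rendered legacy site for SEO traffic.
server {
    listen 80;
    server_name www.example.com;
    root /var/www/legacy;
}

# Angular single-page app for logged-in / interactive use.
server {
    listen 80;
    server_name app.example.com;
    root /var/www/angular-dist;

    location / {
        try_files $uri /index.html;  # client-side routing fallback
    }
}
```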
The End
Thanks
73
https://www.gokam.co.uk/ June 2018
74
Open Source Google
Slides themes by
slidesgala.com
https://www.gokam.co.uk/ June 2018
