2. What is JavaScript SEO?
What does it take for a modern JavaScript-powered website to be properly indexed by search crawlers?
3. 7 Random Statements
1. SEO was never this dynamic
2. Google is pushing more changes than ever, yet they aren’t as popular as Penguin and Panda
3. I still see too many SEOs who live in a fantasy world made out of MOZ DA scores, keyword density and SEO “hacks”
4. That tiny pocket in jeans was designed to store pocket watches
5. Looking at some of the biggest websites in the world, it seems that most haven’t heard about Technical SEO yet
6. McDonald’s once made bubblegum-flavored broccoli
7. JavaScript is here to stay and JavaScript SEO is not a geeky option anymore (YAY!)
15. “…Rendering pages at the scale of the web
requires a lot of time and computational
resources. And make no mistake, this is a
serious challenge for search crawlers,
Googlebot included.”
16. “Rendering the JavaScript powered
web pages takes processor power
and memory. While Googlebot is very
powerful, it doesn’t have infinite
resources.”
17. 1 user = a 60-watt lightbulb lit for 3 hours*
*Data from 2011
86. ...if you care about
SEO, you still need to
have server-rendered
content.
Jeff Whelpley
Angular U conference, June 22-25, 2015, Hyatt Regency, San Francisco Airport
“Angular 2 Server Rendering”
ele.ph/angularU
87. ...we are generally able
to render and
understand your web
pages like modern
browsers.
115. JavaScript vs. Crawler budget (crawl demand)
Oh no!
(…) Also, crawling &
indexing is currently
a bit slower than
static HTML (...)
ele.ph/crawldemand
128. Rich media cautions
Graceful degradation: enable a clean down-level experience so crawlers can see your content
129. Rich media cautions
Graceful degradation: a down-level experience enhances discoverability. Avoid housing content inside Flash or JavaScript – these block crawlers from finding the content
202. What is partial indexing?
It means that if your site uses a heavy amount of client-side JavaScript, you can be tripped up at times when the content is being indexed, due to this two-phase indexing process. It’s possible that some details may be missed.
203. And this effectively means that if
your site is using a heavy amount of
client-side JavaScript for rendering,
you could be tripped up at times
when your content is being indexed
due to the nature of this two phase
indexing process.
…it’s possible some details might be
missed.
Tom Greenaway
204. 2 waves
WAVE 1:
• HTML content
• Canonicals
• Meta data
• HTTP Codes
WAVE 2:
• JS-dependent content only
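The split above is directly checkable: everything Wave 1 relies on (canonicals, title, meta data) must already be present in the raw server response, not injected later by JavaScript. A minimal sketch of such a check — the helper name and regexes are my own for illustration, deliberately crude because we only inspect the raw HTML string, not a rendered DOM:

```javascript
// Check whether the raw (pre-JavaScript) HTML already contains the
// tags Google's first indexing wave relies on.
function firstWaveSignals(rawHtml) {
  return {
    canonical: /<link[^>]+rel=["']canonical["']/i.test(rawHtml),
    title: /<title>[^<]+<\/title>/i.test(rawHtml),
    metaDescription: /<meta[^>]+name=["']description["']/i.test(rawHtml),
  };
}

// A client-side-rendered app shell: Wave 1 sees none of the signals.
const appShell =
  '<html><head></head><body><div id="root"></div></body></html>';
console.log(firstWaveSignals(appShell));

// A server-rendered page: the signals are visible immediately.
const ssrPage =
  '<html><head><title>News</title>' +
  '<link rel="canonical" href="https://example.com/news">' +
  '<meta name="description" content="Latest news."></head>' +
  '<body><article>Story</article></body></html>';
console.log(firstWaveSignals(ssrPage));
```

If any signal is false on the raw response, that element only exists after rendering and will have to wait for Wave 2.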
213. Prerendering/Dynamic Rendering issues
1. Computing power – you need A LOT of servers
2. Prone to issues (often load-related)
3. Downtime = ranking loss
4. More complex and difficult from an SEO perspective (crawls, 2 sets of code, etc.)
5. Requires a lot of SEO knowledge
6. Requires a great dev team to make it run smoothly
214. When to use dynamic rendering (according to Google)
• Your site is large and rapidly changing, for example a news website
• Your website relies on features that are not supported in Chrome 41: libraries that cannot be transpiled back to ES5, APIs that don’t support Chrome 41
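In practice, dynamic rendering means serving a prerendered snapshot to known crawlers and the normal client-side bundle to everyone else, switched on the User-Agent header. A minimal sketch of the routing decision — the bot list is an illustrative subset and the function names are my own, not a reference implementation:

```javascript
// Crawlers that should receive the prerendered snapshot instead of
// the JavaScript bundle. Illustrative subset — extend as needed.
const RENDERING_BOTS = [
  'googlebot',
  'bingbot',
  'yandex',
  'baiduspider',
  'twitterbot',
  'facebookexternalhit',
];

// Decide, per request, whether to route to the prerenderer.
function shouldPrerender(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return RENDERING_BOTS.some((bot) => ua.includes(bot));
}

// In an Express-style server this check would sit in front of the
// static handler (servePrerenderedSnapshot is a hypothetical helper):
// app.use((req, res, next) => {
//   if (shouldPrerender(req.headers['user-agent'])) {
//     return servePrerenderedSnapshot(req, res);
//   }
//   next(); // regular users get the client-side app
// });
```

Note that this is exactly where the “2 sets of code” issue from the previous slide comes from: the snapshot and the live app must stay in sync.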
232. To Do:
1. Diff Check is your new best friend
2. Experiment and make sure to monitor your server logs
3. Compare your setup with Chrome 41
4. Make sure that your content is indexed in Google
5. Under the right URL
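The “Diff Check” in step 1 boils down to comparing the raw server response against the rendered DOM and flagging anything that only exists after JavaScript runs. A rough sketch, assuming both snapshots are already captured as strings (how you capture them — curl for the raw response, a headless browser for the rendered one — is up to you; function names are my own):

```javascript
// Extract visible-ish text chunks from an HTML string.
// Deliberately crude: good enough to spot content that is missing
// entirely from the raw server response.
function textChunks(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<[^>]+>/g, ' ')
    .split(/\s+/)
    .filter((w) => w.length > 3);
}

// Words present only after rendering = JavaScript-dependent content.
function jsOnlyContent(rawHtml, renderedHtml) {
  const raw = new Set(textChunks(rawHtml));
  return [...new Set(textChunks(renderedHtml))].filter((w) => !raw.has(w));
}

const raw = '<html><body><div id="app"></div></body></html>';
const rendered =
  '<html><body><div id="app"><h1>Pricing</h1><p>Plans from $9</p></div></body></html>';
console.log(jsOnlyContent(raw, rendered));
```

Anything this reports is content that crawlers only see in the second indexing wave — exactly what you then verify against your server logs and Google’s index.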