Create Search Friendly Content
1. Make sure your URLs are crawlable
- use robots.txt to control which URLs Googlebot can and can't crawl
2. Use the canonical tag
- Tag the source document as the canonical version so Google knows which URL to index
3. Keep URLs clean and unique
- Don't put session information in the URL
4. Provide Google with a sitemap
- so Googlebot has a list of links to crawl
5. Use the History API
- no more hashbang (#!) URLs
6. Use anchor tags with href attributes
- Googlebot won't recognize links implemented any other way (e.g. click handlers on divs)
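As a quick illustration of the History API point above, here is a minimal sketch of migrating legacy hashbang URLs to clean paths. The helper name and URL shapes are hypothetical, not from the talk:

```javascript
// Hypothetical helper: map a legacy hashbang URL to a clean path
// that the History API can push instead.
function cleanUrlFromHashbang(url) {
  const u = new URL(url);
  if (u.hash.startsWith('#!')) {
    // "#!/products/42" -> "/products/42"
    return u.origin + u.hash.slice(2);
  }
  return url; // already a clean URL, leave it alone
}

// In the browser, a client-side router would then use the History API
// rather than mutating location.hash:
//   history.pushState({}, '', cleanUrlFromHashbang(location.href));

console.log(cleanUrlFromHashbang('https://example.com/#!/products/42'));
// -> https://example.com/products/42
```

Pairing clean paths like this with real `<a href="...">` links gives Googlebot URLs it can actually discover and crawl.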
What we know about Googlebot rendering
1. Googlebot uses the Chrome 41 browser for rendering
2. Two-phase indexing
- First indexing- before the rendering process is complete
- Second indexing - after final render
- The second indexing doesn't check for the canonical tag
- The indexability, metadata, canonical tags and HTTP codes of your web pages could be affected.
There are four types of rendering
1. Client-side rendering
- Rendering happens in the user's browser (or in the search engine)
2. Server-side rendering
- Rendering happens on your server
3. Hybrid rendering (the long-term recommendation)
4. Dynamic rendering (the policy change)
- This method sends client-side-rendered content to users, while search engines get server-side-rendered content
- It works by having your site dynamically detect whether a request comes from a search engine crawler
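The crawler-detection step behind dynamic rendering can be sketched as a user-agent check. The pattern list below is an illustrative assumption, not an official or exhaustive list; production setups typically pair a check like this with a prerendering service such as Rendertron or Puppeteer:

```javascript
// Illustrative (non-exhaustive) set of crawler user-agent substrings.
const CRAWLER_PATTERN = /googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit/i;

function isCrawlerRequest(userAgent) {
  return CRAWLER_PATTERN.test(userAgent || '');
}

// A server would branch on the result, e.g.:
//   if (isCrawlerRequest(req.headers['user-agent'])) {
//     // serve server-side-rendered HTML
//   } else {
//     // serve the normal client-side-rendered app
//   }

console.log(isCrawlerRequest(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
)); // -> true
```

Google treats this kind of crawler-specific serving as dynamic rendering rather than cloaking, as long as the crawler receives equivalent content to what users see.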
You need dynamic rendering if:
1. You have a large site whose content changes constantly
2. You rely on a lot of modern JavaScript features
3. Your site has social media or chat applications that need access to your content
You don't need dynamic rendering if:
1. Googlebot can already index your pages correctly
Has Googlebot indexed my website correctly?
1. Use Fetch as Google in Google Search Console.
2. Run a mobile friendly test.
3. Check the developer console.
4. Run the rich result test.
Some good news for the future...
1. Rendering will be moved closer to crawling and indexing.
2. Googlebot will be using a more modern browser.