2. Table of Contents
1. SSL Certificate
2. Check Meta Tags
3. Mobile-Friendly Website
4. Page Speed
5. Remove Broken Links
6. Set Your Preferred Version
7. Index Your Website
8. Check Robots.txt and Sitemap.XML
3. SSL Certificate
SSL stands for Secure Sockets Layer; an SSL certificate enables HTTPS by encrypting traffic between the browser and the server.
There are three recognized SSL certificate validation types:
• Extended Validation (EV)
• Organization Validation (OV)
• Domain Validation (DV)
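Once a certificate is installed, the server should serve the site over HTTPS and tell browsers to keep using it. A minimal sketch for an nginx server follows; the server software, certificate paths, and domain here are assumptions for illustration, not taken from the original:

server {
    listen 443 ssl;
    server_name bluebash.co;
    # assumed certificate paths; replace with the real ones
    ssl_certificate     /etc/ssl/certs/bluebash.co.crt;
    ssl_certificate_key /etc/ssl/private/bluebash.co.key;
    # ask browsers to stick to HTTPS for one year
    add_header Strict-Transport-Security "max-age=31536000" always;
}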
4. Meta Tags
Meta Tags – H1 to H5, Title, Description, Keywords, Image Tag, OG Tags, Favicon Icon, Canonical Tag.
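As an illustration, a typical on-page head section covering these tags might look like the following; the titles, descriptions, and image paths are placeholders, not taken from the original:

<head>
  <title>Bluebash | Web & Product Development</title>
  <meta name="description" content="Example meta description, ideally under 160 characters.">
  <link rel="canonical" href="https://bluebash.co/">
  <link rel="icon" href="/favicon.ico">
  <!-- Open Graph tags for social sharing -->
  <meta property="og:title" content="Bluebash">
  <meta property="og:description" content="Example OG description.">
  <meta property="og:image" content="https://bluebash.co/og-image.png">
</head>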
Hreflang Tags – Hreflang is an HTML attribute used to specify the language and geographical
targeting of a webpage. If you have multiple versions of the same page in different
languages, you can use the hreflang tag to tell search engines like Google about these
variations. This helps them to serve the correct version to their users.
For example:
<link rel="alternate"hreflang="en-gb"href="https://bluebash.co/uk/"/>
<link rel="alternate"hreflang="x-default"href="https://bluebash.co/"/>
5. Mobile-Friendly Website (Responsive)
Mobile-Friendly Test by Google – https://search.google.com/test/mobile-friendly
Mobile is changing the world. Today, everyone carries a smartphone, constantly communicating and looking for information. In many countries, the number of smartphones has surpassed the number of personal computers, so having a mobile-friendly website has become a critical part of any online presence.
Mobile-Friendly Website Checklist
1. Test Your Website at Multiple Resolutions
2. Make Sure Your Menus and Navigation Display Well on Smaller Screens
3. Check if Your Images Look Good at Smaller Resolutions
4. Test Your Content for Readability
5. Test Your Website’s Loading Times
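A responsive page starts with the viewport meta tag plus CSS media queries. The sketch below shows the general pattern; the breakpoint and class name are illustrative assumptions, not taken from the original:

<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* stack the menu vertically and keep images inside the screen on narrow devices */
  @media (max-width: 600px) {
    .site-nav { display: flex; flex-direction: column; }
    img { max-width: 100%; height: auto; }
  }
</style>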
6. Page Speed Optimization
Two websites for checking speed optimization:
• PageSpeed Insights
• GTmetrix | Website Performance Testing and Monitoring
Guidelines to speed up your website
• Use a CDN (Content Delivery Network)
• Move the website to a better host (a VPS is a good option)
• Optimize image sizes and use efficient formats such as SVG and WebP (see the snippet after this list)
• Reduce the number of plugins
• Minimize the number of JavaScript and CSS files
• Minify code to remove unnecessary whitespace
• Reduce redirects
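For the image-format point above, one common pattern is the HTML picture element, which serves WebP to browsers that support it and falls back to a conventional format; the file names here are placeholders:

<picture>
  <!-- smaller, modern format for browsers that support it -->
  <source srcset="/images/hero.webp" type="image/webp">
  <!-- fallback for older browsers -->
  <img src="/images/hero.png" alt="Hero image" width="800" height="400">
</picture>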
7. Remove Broken Links
Broken links are another signal of poor user experience. No one wants to click a link and find that it
doesn’t take them to the page they’re expecting.
For example:
Before: https://www.bluebash.co/ruby-on-rails
After: https://www.bluebash.co/services/ruby-on-rails-development-company
When the old page has been removed on the developer's end, requests to the old URL should be redirected, either to the replacement page or to a custom 404.html page.
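A minimal sketch of how this could be wired up, assuming an nginx server (these directives go inside the site's server block and are illustrative, not taken from the original):

# permanent (301) redirect from the removed URL to its replacement
location = /ruby-on-rails {
    return 301 /services/ruby-on-rails-development-company;
}

# serve a custom page for anything that is genuinely gone
error_page 404 /404.html;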
8. Set Your Preferred Version
You also want to make sure that all the other versions of your site point to the correct, preferred version. If people access one version, they should automatically be redirected to the correct one.
These are all the versions:
• http://bluebash.co
• https://bluebash.co
• http://www.bluebash.co
• https://www.bluebash.co
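Assuming https://www.bluebash.co is the preferred version, a minimal nginx sketch that funnels the other three versions to it could look like this; the server software and the choice of canonical host are assumptions:

# redirect all HTTP traffic, with or without www
server {
    listen 80;
    server_name bluebash.co www.bluebash.co;
    return 301 https://www.bluebash.co$request_uri;
}

# redirect the HTTPS non-www version
server {
    listen 443 ssl;
    server_name bluebash.co;
    # ssl_certificate directives omitted for brevity
    return 301 https://www.bluebash.co$request_uri;
}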
9. Search Engine Indexing
Indexing is the process by which search engines organize information before a search to enable super-fast responses to queries.
Webmaster Checklist
- Google Webmaster, now Search Console (70% worldwide)
- Bing Webmaster (30% Canada/USA)
- Yandex Webmaster (90% Russia)
Process
- Create a Webmaster Tools account and verify your site
- Add an XML sitemap for each site in your Webmaster Tools account
- Add a robots.txt file
- Verify your domain in Webmaster Tools
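Site verification is typically done with a DNS record or a meta tag on the home page. The meta tag route looks like the following; the content tokens are placeholders issued by each tool:

<!-- placed inside <head>; each webmaster tool issues its own token -->
<meta name="google-site-verification" content="YOUR-TOKEN-HERE">
<meta name="msvalidate.01" content="YOUR-TOKEN-HERE">
<meta name="yandex-verification" content="YOUR-TOKEN-HERE">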
10. Robots.txt
Robots.txt files are instructions for search engine robots on how to crawl your website.
Every website has a "crawl budget," or a limited number of pages that can be included in a crawl, so it's imperative to make sure that only your most important pages are being crawled.
On the flip side, you'll want to ensure your robots.txt file isn't blocking anything that you want to be indexed.
Here are some examples of pages you should disallow in your robots.txt file:
• Temporary files
• Admin pages
• Cart & checkout pages
• Search-related pages
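A robots.txt covering those cases might look like this; the paths are illustrative assumptions, so adjust them to the site's actual structure:

User-agent: *
# keep crawlers out of low-value or private areas
Disallow: /tmp/
Disallow: /admin/
Disallow: /cart/
Disallow: /checkout/
Disallow: /search
# point crawlers at the sitemap
Sitemap: https://bluebash.co/sitemap.xml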
11. Sitemap.XML
An XML sitemap is a file that lists a website's important pages, making sure Google can find and crawl them all. It also helps search engines understand your website structure. You want Google to crawl every essential page of your website. But sometimes, pages end up without any internal links pointing to them, making them hard to find. A sitemap can help speed up content discovery.
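A minimal sitemap.xml listing a couple of pages follows; the URLs and date are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://bluebash.co/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://bluebash.co/services/ruby-on-rails-development-company</loc>
  </url>
</urlset>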
Sitemap Generator
https://www.xml-sitemaps.com/