SEO consists of two main parts:
• ON-page optimization
• OFF-page optimization
ON-page optimization is the process of making website pages
search-engine friendly.
ON-page optimization is mainly done in the HEAD and
BODY sections of the page.
The head part contains the title, meta description, etc.
The body part contains the header tags such as h1, h2, h3, etc.
TITLE
The title tag should be placed within the head section of the HTML document.
The title should be unique for each page on the site.
The contents of the title tag usually appear as the first line of a
search result. Make the title descriptive and include your keywords.
The character limit is about 60, but keep in mind that the effective limit
also depends on the pixel width of each letter, so try to keep the title
between 50 and 55 characters.
If the limit is exceeded, the title in the search result will not be shown
completely; it will be truncated with an ellipsis (…).
Words in the title are bolded if they appear in the user's search query.
This helps users recognize whether the page is likely to be relevant to
their search.
Create a unique title tag for each page; never use the same title on every
page.
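As a sketch, a unique, descriptive title kept within the recommended length might look like this (the page name and brand are hypothetical placeholders):

```html
<head>
  <!-- Unique, descriptive, keyword-bearing title kept under ~55 characters -->
  <title>Handmade Leather Wallets | Example Store</title>
</head>
```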
META DESCRIPTION
The meta description of a page gives a summary of the page to
Google and other search engines.
Google may use this description as the snippet for the page, but
it is not guaranteed that it will be shown as the snippet.
If this meta tag is not given, Google will select appropriate
content from the page to show as the snippet.
The meta description has a character limit of about 170 characters. Due to
the pixel width of each letter, the complete space cannot always be used,
so 160 is a good character limit to use.
In the case of blogs, we again need to reduce the character limit,
because blogs automatically add the date to the content;
for blogs, reduce the limit to about 150 characters.
If the character limit is exceeded, the snippet truncates the remaining
characters with an ellipsis (…).
This is the format of a meta description:
<meta name="description" content="write the description here" />
META KEYWORDS
The meta keywords tag lets us add the keywords that we aim to
rank for, but this tag is ignored by most search engines.
Worse, the presence of this tag lets competitors see
the keywords we are targeting, so it is better not to use it.
CANONICAL TAG
This tag helps when the same content is used in two or more places.
The presence of the same content on the same or different websites
confuses the search engine about which one to show; the canonical tag
tells it which URL is the preferred version.
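A minimal sketch of the canonical tag, placed in the head of a duplicate page and pointing at the preferred URL (the URL is a placeholder):

```html
<!-- Placed in the <head> of each duplicate page -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```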
ROBOTS.TXT
A robots.txt file tells the bots which pages to crawl and which not to.
The robots.txt file should be uploaded to the root directory of the website.
Note that robots.txt only asks compliant bots to stay out of the restricted
pages; it does not hide those pages from users who know the URL.
Still, Google may send bots to crawl such restricted areas to check whether
the activities happening there are safe or not.
We can also write "allow" rules in robots.txt for pages that should still
be crawled:
Eg: User-agent: *
Disallow: /admin
Allow: /about-us
Another method we can adopt is adding noindex to the robots meta
tag:
<meta name="robots" content="noindex,nofollow" />
Beware of rel="nofollow" for links
The rel="nofollow" attribute tells Google that certain links on your site
should not be followed.
This is practiced because, without it, comment spam links would pass the
page's reputation on to the linked site.
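For example, a link dropped in a user comment can be marked so that it passes no reputation (the URL is a placeholder):

```html
<!-- A user-submitted link we do not vouch for -->
<a href="https://example.com/some-page" rel="nofollow">visitor's link</a>
```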
IMAGE OPTIMIZATION
Give a descriptive name to each image that we upload to our website,
and give the same description in the alt attribute of the image.
Only a suitable name, chosen by considering the image itself, can make
the image appear in users' image search results.
No user will search for our targeted keywords when searching
for an image.
<img src="image name" alt="suited word" />
INTERNAL LINKS
These are the links that point from one page on a domain to a different
page on the same domain.
Internal links build the site architecture.
The URL structure of a website is very important, and internal links
help to build a structured URL hierarchy.
A structured URL hierarchy helps the bots to crawl pages without any
interruption.
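As a sketch, an internal link is just an anchor whose target is a relative path on the same domain (the path and text are placeholders):

```html
<!-- Internal link: same domain, path mirrors the site hierarchy -->
<a href="/products/wallets/">Browse our wallets</a>
```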
SITE MAPS
A site map can be of two types: XML and HTML. XML site
maps are not for users but for the search engine, to help it understand the
pages in the website. HTML site maps are more useful for users to
find content on the website, and they also help the Google bots to
analyze the website.
The site map makes crawling easier for the bots. The sitemap protocol
allows up to 50,000 URLs in a single site map file, though free generator
tools often cap this much lower (e.g. at 500 URLs).
www.xml-sitemaps.com is a website that helps to generate a
site map, which can be downloaded as an XML or HTML site map.
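A minimal XML site map following the sitemaps.org protocol might look like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```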
BREADCRUMBS
Breadcrumbs help users know where they currently are within the
website.
Websites like Amazon still use breadcrumbs.
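A simple breadcrumb trail can be written as plain links (the category names are placeholders):

```html
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/books/">Books</a> &gt;
  <span>Science Fiction</span>
</nav>
```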
BROKEN LINKS
Broken links can occur on any website. They can be fixed
using 301 and 302 redirections.
The 301 redirection redirects the broken link permanently, so that
the authority gathered by the old page is transferred to the new
page.
The 302 redirection redirects the users only temporarily, which
means the gathered authority remains with the old page itself.
To set up a redirection, add the rule to the site's .htaccess file:
Redirect 301 /old-url http://www.example.com/new-url
Redirect 302 /old-url http://www.example.com/new-url
DISAVOW TOOL
This tool is used to tell Google to ignore low-quality links pointing to
our website.
We can review the backlinks to our website using Google Webmaster
Tools.
The low-quality links can be disavowed by uploading a document that
contains the list of those URLs.
Upload this document in the disavow section of Google Webmaster
Tools:
"google.com/webmasters/tools/disavow-links"
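The disavow file itself is a plain-text list with one entry per line; a `domain:` prefix disavows a whole domain, and lines starting with `#` are comments (the URLs below are placeholders):

```text
# Links we asked the webmasters to remove, without success
http://spam.example.com/low-quality-page.html
domain:link-farm.example.net
```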
SOCIAL SIGNALS
Social signals have both a direct and an indirect impact on organic search
rankings.
Direct impact from:
• Number of people that like your brand on Facebook
• Number of Facebook shares
• Number of Twitter followers
• Number of tweets mentioning your brand name or including a link to
your website
• Number of people that "have you in their circles" (Google+)
Indirect impact from:
• Increased inbound links due to improved online visibility
• Increased positive reviews (in Google Local, Yelp, etc.) due to
happier customers
• Decreased bounce rate, higher time on site, and more repeat visitors
to your website
SITE SPEED
Site speed is a signal used by Google to rank pages according to
its algorithm.
On a slower website, the search engines can crawl only fewer pages.
Methods to speed up a website:
• Enable compression such as Gzip
• Minify CSS, JavaScript, and HTML
• Reduce redirects
• Leverage browser caching
• Improve server response time
• Optimize images
• Use a content delivery network (CDN)
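As a sketch, the compression and browser-caching items above can be enabled on an Apache server via .htaccess, assuming the mod_deflate and mod_expires modules are installed:

```apache
# Gzip-compress text-based responses (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>
```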
TIPS
• Grow natural links; they should come from quality websites.
• The URL structure has to be short and descriptive, and it should also
help to categorize the website.
• Using keywords purposefully within the content will strengthen SEO.
• The website should be an HTTPS one, which reduces the chances of
getting hacked.
• The website should be responsive in order to make browsing on
other devices compatible.
AUTHORSHIP
This is the process of letting Google know to whom the ownership of the
content belongs. It can be done by adding the site URL in Google+ and
linking back from our website with rel="author".
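As a sketch, the authorship link points from the content page to the author's Google+ profile (the profile URL is a placeholder):

```html
<!-- Links this page's content to its author's Google+ profile -->
<a href="https://plus.google.com/+AuthorName" rel="author">About the author</a>
```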