Tips To Create Search Engine Friendly Webpages

Things you need to remember regarding Search engines:

Search engines survive on advertising. They make money through ads, and in the world of advertising one rule applies: the more users who view or read your ads, the better known your brand becomes and the more money you make. Showing ads to more users is therefore more profitable: the more users, the more money. A search engine gains additional users by providing them with quality results. If your site is the most relevant match for a user's query, the search engine will boost your rankings – because that is how it earns.

Search engines are automated programs. They rely on a program called a spider, which downloads your web pages, reads them, and then decides what to do with them based on the information they contain. This process is known as crawling. In simple words, search engine spiders are automated programs used for crawling web pages.

If you have ever used a computer, you may have noticed that it sometimes slows down or hangs because you have overloaded it with too many tasks at once. The same is true of search engine spiders: if your web pages are disorganized, unsystematic, or laid out in a confusing fashion, spiders find them difficult to crawl, and your pages will not be crawled thoroughly. Pages that are not crawled adequately and regularly lose out in the ranking process. Do not forget that there are billions of web pages on the World Wide Web; search engines cannot crawl them all, because their time is limited. It is therefore your job to make sure your pages are simple and quick to crawl. This minimizes the risk of your website being ignored by search engines.

Now we shall see how to build highly spider-friendly web pages. Doing so will not instantly boost your rankings, but it will keep you from making the mistakes that indirectly hurt them by making it difficult for search engines to find and crawl your pages.

HTML must be well-formatted

All your HTML must be well formed and correct. Valid code will not by itself raise your rankings, but broken code is hard for search engines to crawl. Keep your source code clean and be careful to close all relevant tags. Do remember: a search engine spider is pre-programmed computer software built on a fixed set of rules, which restricts what it can handle. We must therefore be alert enough to design our web pages according to those rules if we want to avoid the consequences.
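As a minimal illustration, a well-formed page in which every opened tag is properly closed might look like this (the title and text are placeholders):

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
  <title>Shakespeare Plays</title>
</head>
<body>
  <h1>Shakespeare Plays</h1>
  <p>Every tag that is opened is closed, so a spider can parse the page unambiguously.</p>
</body>
</html>
```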


Primary rules for a search engine friendly page are:

The easier we make it for search engine spiders to read our code, the better our chances of success. This means we must avoid two mistakes in our HTML:

1. Do not use HTML that is obsolete, archaic, or proprietary, i.e. tied to one single browser.
2. Do not use HTML so new that most spiders cannot yet recognize it.

Testing your web page against the World Wide Web Consortium (W3C) HTML validator will tell you whether it is valid or not:

http://validator.w3.org/

If your page validates without any hassles, spiders will have no problem crawling it. This does not mean your web pages will get instant rankings; it only means they are free of errors and so will be easily crawled by spiders.

A valid DOCTYPE lets web browsers render your page consistently and lets search engine spiders analyze and identify it quickly. Using the W3C validator is also easier when you have a valid DOCTYPE. If your pages then render effortlessly across all types of browsers, you may see minor gains in your search engine rankings.

A few search engine spiders have had problems in the past dealing with XHTML DOCTYPEs, so our suggestion would be to use the HTML 4.01 Transitional DOCTYPE, for example:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">

(Note: the above code should be on a single line.)

To learn more, see this article on fixing your site with a valid DOCTYPE:
http://www.alistapart.com/articles/doctype/

Keep URLs simple

A search engine spider finds the pages of your website by following links. Spiders first download a page, scan it for links, and place those links in a queue. They then pick the first link from the queue and repeat the same steps until all the links are processed. This is, of course, a simplified explanation, but it gives you an outline of how the process works.
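The queue-based process above can be sketched in a few lines of Python. The link graph here is a made-up stand-in for real pages; an actual spider would download each URL and extract the links from its HTML:

```python
from collections import deque

# Hypothetical link graph: each page maps to the links found on it.
SITE = {
    "/": ["/books", "/about"],
    "/books": ["/books/shakespeare", "/"],
    "/books/shakespeare": ["/books/shakespeare/play"],
    "/books/shakespeare/play": [],
    "/about": [],
}

def crawl(start):
    """Breadth-first crawl: download a page, queue its links, repeat."""
    queue = deque([start])
    seen = {start}
    order = []
    while queue:
        page = queue.popleft()
        order.append(page)            # "download" the page
        for link in SITE.get(page, []):
            if link not in seen:      # skip pages already queued
                seen.add(link)
                queue.append(link)
    return order

print(crawl("/"))
# -> ['/', '/books', '/about', '/books/shakespeare', '/books/shakespeare/play']
```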



A majority of websites, especially ecommerce websites, use dynamically generated URLs. That is, the URLs are produced automatically by pulling variables out of a database in order to assemble the customer's product details. A dynamically built URL generally contains a number of odd-looking characters such as ?, &, = and +.

For example, a shopkeeper who sells books might have a dynamically generated URL:
http://yoursite.com/index.php?item=books&author=shakespeare&genre=play

A static-looking URL is much easier to read:
http://yoursite.com/books/shakespeare/play

Or

http://yoursite.com/b/s/p


Do you know what indexing is?
As search engine spiders gather data while crawling the web, they store it in a database called an index. Gathering and storing a web page is known as indexing.
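A toy version of an index can be sketched in Python. Real indexes are vastly more elaborate, but at heart an index maps words to the pages that contain them (the pages and text here are invented):

```python
# Hypothetical pages and their text content.
pages = {
    "/books/shakespeare/play": "shakespeare plays for sale",
    "/books": "books by many authors for sale",
}

def build_index(pages):
    """Map each word to the set of pages containing it."""
    index = {}
    for url, text in pages.items():
        for word in text.split():
            index.setdefault(word, set()).add(url)
    return index

index = build_index(pages)
print(sorted(index["sale"]))   # both pages mention "sale"
print(index["shakespeare"])    # only the play page
```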

Many search engines are working to improve their ability to crawl long dynamic URLs. Even so, short static URLs are crawled more reliably than long dynamic ones and get more pages indexed.
Even if a website depends entirely on drawing content dynamically from a database, it is still possible to serve URLs that appear static by using a tool like mod_rewrite.
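As a sketch of how such a tool is typically used: an Apache mod_rewrite rule could map the static-looking bookshop URL from the earlier example back onto the real dynamic script. The paths and parameter names follow that example and are illustrative only:

```apache
# In .htaccess or the server config, with mod_rewrite enabled.
RewriteEngine On
# /books/shakespeare/play -> /index.php?item=books&author=shakespeare&genre=play
RewriteRule ^books/([^/]+)/([^/]+)$ /index.php?item=books&author=$1&genre=$2 [L]
```

Visitors and spiders see only the clean static path; the database-driven script still does the work behind the scenes.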

The Google Sitemaps feature offers an alternative to mod_rewrite: it lets you tell Google which pages you want crawled, so Google does not have to discover the full-length dynamic URLs on its own, provided you list the essential ones.

Of course, Google Sitemaps does nothing for other search engines' ability to crawl your pages, which is why a tool like mod_rewrite proves more versatile.
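A Sitemaps file is simply an XML list of the URLs you want crawled; a minimal example (the URL and date are placeholders) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://yoursite.com/books/shakespeare/play</loc>
    <lastmod>2010-09-01</lastmod>
  </url>
</urlset>
```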

Important: Do not change your URLs if your site is already ranked and indexed on search engines. Changing your website's URLs carelessly can have the serious consequence of losing your search engine rankings: if you do not report the changes to the search engines and they fail to find your pages, they will drop your website from the rankings it holds.

If you do change your website's URLs, make certain to redirect the old URLs to their new location, so that search engines and visitors are automatically sent to the new URL, saving you from losing your rankings and your customers.
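On Apache, one common way to pass an old URL on to its new location is a permanent (301) redirect; a sketch with made-up paths:

```apache
# A 301 tells browsers and spiders the page has moved permanently.
Redirect 301 /old-books/shakespeare-play.html http://yoursite.com/books/shakespeare/play
```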

Stay watchful over session IDs

Session IDs let a website track a visitor from page to page; they are embedded in URLs and act as unique identifiers. For example, when you visit an ecommerce website, a session ID keeps a record of all the things you add to your shopping cart. Because tracking every customer activity this way creates a huge number of links, crawling them becomes a tedious task for spiders. The danger is that spiders repeatedly index the same page over and over, trapped in a loop of dynamically generated URLs.

To make the concept clearer, here is an example of how session IDs within a single site multiply the number of links, i.e. pages. A session ID link looks like:

http://www.yoursite.com/name.cgi?id=dkom5678kle09i

Such a link is handed to the spider when it first downloads a web page. Later, when the spider returns to the site to download more pages, it finds another URL that looks like:

http://www.yoursite.com/name.cgi?id=ok8787ijidj87k

This is the same page, only with a different tracking variable. The problem is that it looks like a brand-new URL to spiders, so they would download it again and again. For this reason, spiders tend to ignore links that appear to contain session or tracking variables.

Google makes it clear by asserting that they do not crawl all such pages:

“Don’t use “&id=” as a parameter in your URLs, as we don't include these pages in our index.”
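On the site side, one defensive measure is to make sure the same page always maps to the same URL. As a sketch, a site (or a crawler) can canonicalize URLs by stripping known session parameters; the parameter names in the list are assumptions:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameter names commonly used for sessions/tracking (illustrative list).
SESSION_PARAMS = {"id", "sid", "sessionid", "phpsessid"}

def canonicalize(url):
    """Drop session/tracking parameters so the same page maps to one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

a = canonicalize("http://www.yoursite.com/name.cgi?id=dkom5678kle09i")
b = canonicalize("http://www.yoursite.com/name.cgi?id=ok8787ijidj87k")
print(a == b)  # prints True: both session URLs collapse to the same page
```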

Make use of a flat directory format

A flat directory structure is one with few sub-directory levels. The deeper your sub-directory levels go, the lower the chances of your web pages being spidered. Let's understand it better with an example:
http://yoursite.com/content/articles/2010/09/page.html

Web pages buried this deep in the directory structure also stand at a disadvantage in search engine rankings, with the exception of some exceptionally popular sites with plenty of incoming links.

Directories are commonly used to organize the layout of a site. When pages end up three or four levels deep because a flat directory structure would be tedious to arrange logically, a well-placed site map becomes essential.


Note: If your directory structure is already established and your web pages rank decently on search engines, there is no need to change anything at all. If you move your web pages without informing search engines, the pages will be dropped from their listings. But if you do flatten your directory structure in hopes of ranking higher, you must redirect the old pages to their new location, making it easy for search engines and customers alike to find your new pages.


Semaphore is primarily a web design company that offers search engine friendly website design services and SEO services. We are experts in Website Designing, Portal Development, Web Re-designing, and Corporate Branding.
