2. The search engine optimisation (SEO) process consists of
designing, writing, and coding web pages to increase the
likelihood that they will appear at the top of search
engine results for targeted keyword phrases. Many so-called
SEO experts claim to have reverse engineered
search engine algorithms and use strategically created
"doorway pages" and cloaking technology to maintain
long-term search positions. Despite all of these claims,
the basics of a successful search engine campaign have
not changed in all the years we have provided these
services.
3. To get the best overall, long-term search engine
positions, three components must be present on a web
page:
7. All of the major search engines (AltaVista, FAST Search,
Google, Lycos, MSN Search and other Inktomi-based
engines) use these components as a part of their search
engine algorithms. Sites that (a) have all of the
components on their web pages and (b) maintain optimal
levels of those components perform well in the search
engines overall.
9. Since the search engines build their lists of words and
phrases from the pages found at URLs, it naturally follows
that, to do well on the search engines, you must place
these words in strategic HTML tags on your web pages.
10. The most important part of the text component of a
search engine algorithm is keyword selection. In order
for your target audience to find your site on the search
engines, your pages must contain keyword phrases that
match the phrases your target audience is typing into
search queries.
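As a rough illustration of this matching exercise, the sketch below counts the two-word phrases in a hypothetical sample of audience queries. The query list is invented for the example; in practice this data would come from log files or keyword-research tools:

```python
from collections import Counter

# Hypothetical sample of search queries from your target audience.
queries = [
    "cheap travel insurance",
    "travel insurance quotes",
    "cheap travel insurance",
    "annual travel insurance quotes",
]

# Count how often each two-word phrase appears across the queries.
phrase_counts = Counter()
for q in queries:
    words = q.split()
    for i in range(len(words) - 1):
        phrase_counts[" ".join(words[i:i + 2])] += 1

# The most frequent phrases are the strongest candidates for your pages.
print(phrase_counts.most_common(2))  # "travel insurance" leads with 4 hits
```

The top phrases from a count like this are the ones worth writing into the page, since they are the exact strings your audience types.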
11. Once you have determined the best keyword phrases to
use on your web pages, you will need to place them
within your HTML tags. Not all search engines place
emphasis on the same HTML tags. For example, Inktomi
reads Meta tags; Google ignores Meta tags. Thus, in
order to do well across all of the search engines, it is best
to place keywords in as many HTML tags as possible,
without keyword stuffing. That way, no matter what the
search engine algorithm is, you know that your keywords
are contained in your documents.
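One way to verify that a phrase actually landed in those tags is a small audit script. The sketch below (the page markup and keyword phrase are invented for the example) uses Python's standard html.parser to list the tags that contain a given phrase:

```python
from html.parser import HTMLParser

# Void elements never get a closing tag, so they must not go on the stack.
VOID_TAGS = {"meta", "link", "br", "img", "input", "hr"}

class KeywordTagAudit(HTMLParser):
    """Records which HTML tags contain a given keyword phrase."""

    def __init__(self, phrase):
        super().__init__()
        self.phrase = phrase.lower()
        self.stack = []
        self.tags_with_phrase = set()

    def handle_starttag(self, tag, attrs):
        # Meta keyword/description text lives in an attribute, not tag data.
        if tag == "meta":
            content = dict(attrs).get("content", "")
            if self.phrase in content.lower():
                self.tags_with_phrase.add("meta")
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        if self.stack and self.phrase in data.lower():
            self.tags_with_phrase.add(self.stack[-1])

page = """<html><head>
<title>Discount travel insurance quotes</title>
<meta name="description" content="Compare discount travel insurance plans.">
</head><body>
<h1>Discount travel insurance</h1>
<p>Our plans cover trips worldwide.</p>
</body></html>"""

audit = KeywordTagAudit("discount travel insurance")
audit.feed(page)
print(sorted(audit.tags_with_phrase))  # tags carrying the phrase
```

Here the phrase shows up in the title, Meta description, and h1 tags; a tag missing from the result is one you have not yet optimised.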
13. The strategy of placing keyword-rich text in your web
pages is useless if the search engine spiders have no way
of finding that text. The way your pages are linked to
each other, and the way your web site is linked to other
web sites, does impact your search engine positions.
14. Even though search engine spiders are powerful data-gathering
programs, HTML coding or scripting can
prevent a spider from effectively crawling your pages.
Examples of site navigation schemes that can be
problematic are:
15. 1. Poor HTML coding in any navigation scheme:
Browsers (Netscape and Internet Explorer) can display web
pages with sloppy HTML coding; search engine spiders are
not as forgiving as browsers are.
16. 2. JavaScript: None of the major search engines can
follow links embedded inside of JavaScript, including but
not limited to mouseovers, arrays, and drop-down
menus. Note: Even though reputable designer resources
claim search-engine friendly scripts exist, many of them
are untested and unproven.
17. 3. Dynamic or database-driven web pages: Pages that
are generated via scripts or databases, and/or have a ?, &,
$, =, +, or % in the URL, can present spider "traps."
18. 4. Flash: Currently, none of the search engines can
follow the links embedded in Flash documents.
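The dynamic-URL caution in point 3 is easy to automate. The helper below (the function name and example URLs are ours, invented for illustration) simply flags URLs containing any of the characters listed above:

```python
# Characters the article lists as potential spider "traps" in URLs.
SPIDER_TRAP_CHARS = set("?&$=+%")

def looks_spider_unfriendly(url: str) -> bool:
    """Return True if the URL contains a character spiders may choke on."""
    return any(ch in SPIDER_TRAP_CHARS for ch in url)

print(looks_spider_unfriendly("http://example.com/catalog?item=42"))     # True
print(looks_spider_unfriendly("http://example.com/catalog/item42.html"))  # False
```

Running a check like this over a site map quickly surfaces the dynamic pages that may need static, spider-friendly equivalents.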
19. Therefore, to ensure that the spiders have the means to
record the data on your web pages, we recommend
having two forms of navigation on a web page: one that
pleases your end users, and one that the search engine
spiders can follow.
25. Attaining an optimal popularity component is not simply
a matter of obtaining as many links as possible to a web site. The
quality of the sites linking to your site holds more
"weight" than the quantity of sites linking to your site.
Since Yahoo is the most frequently visited site on the
web, a link from Yahoo to your web site carries far more
"weight" than a link from a smaller, less visited site.
Other outstanding sites that can help generate excellent
popularity are LookSmart, the Open Directory, and
About.com. Non-competitive, industry-specific sites
(such as javascript.com) are also excellent link
development resources.
26. Obtaining links from other sites is not enough to
maintain optimal popularity. The major search engines
and directories are measuring how often end users are
clicking on the links to your site and how long they are
staying on your site (i.e., reading your web pages). They
are also measuring how often end users return to your
site. All of these measurements constitute a site's click-through
popularity.
27. The search engines and directories measure both link
popularity (quality and quantity of links) and click-through
popularity to determine the overall popularity
component of a web site.
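To make that blend concrete, here is a deliberately hypothetical scoring sketch. The engines do not publish their formulas, so every weight and input name below is invented purely for illustration:

```python
def overall_popularity(link_quality: float, link_quantity: int,
                       clickthrough_rate: float, return_rate: float) -> float:
    """Hypothetical blend of link popularity and click-through popularity.

    All weights are invented for illustration; the real search engines
    do not disclose how they combine these signals.
    """
    # Quality-weighted link count, capped so sheer quantity cannot dominate.
    link_popularity = link_quality * min(link_quantity, 100) / 100
    # Click-through popularity: how often users click and how often they return.
    clickthrough_popularity = 0.7 * clickthrough_rate + 0.3 * return_rate
    # Equal blend of the two components.
    return 0.5 * link_popularity + 0.5 * clickthrough_popularity

score = overall_popularity(link_quality=0.9, link_quantity=40,
                           clickthrough_rate=0.25, return_rate=0.10)
print(round(score, 4))  # 0.2825
```

The point of the sketch is the structure, not the numbers: a few high-quality links plus genuine user engagement outscore a large pile of low-quality links.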