Conjungo SEO Audit
Report contents

Summary
Main issues
  URL structure
  Content deficit
  Other issues
Detailed breakdown of SEO issues
  Site structure
  Site category navigation is un-spiderable and lacking accessibility options
  “Related category” navigation elements indexed by search engines
  “Vendor Solutions” page elements indexed by search engines
  Duplicated page titles
  conjungo.com domain does not exist
  Default doc type duplicate content issues
  pageID parameter in URL causing duplicate content
  Lack of meta data on important pages
  Images used in place of text
  Missing alt text from a number of images
  Page content
  Old pages have been removed
  “404” error handling
  Not enough external backlinks to www.conjungo.com
  Localised TLD not used
  News section does not utilise web feeds
  Code layout
Other non-SEO issues
Glossary of key terms
Summary

Although the conjungo.com site is hosted on a mature domain, the current
implementation could be considered a relatively new site in search engine terms.
The site has not been fully indexed or gained sufficient authority (PageRank 1) for many
of the problems which are normally associated with the current site structure to emerge;
indeed, some of the points raised in this document may be partly responsible for the site’s
small search footprint and reported lack of authority.
The site makes extensive use of JavaScript-driven links and pop-up
windows, which prevent search engines from crawling or ‘hide’ site content from the
engines and which, conversely, are also responsible for ‘orphaned’ page elements being
indexed in the search results outside of the site’s HTML framework.
In addition, there is very little content on the site to distinguish the pages as unique
to search engines. We discuss this and make recommendations for improved content,
page and site structure, as well as supporting content, in this report.
Increasingly, the emphasis on off-page factors in search needs to be addressed,
and online search PR, in the Web 2.0 age, has to be an ongoing task. There are not nearly
as many backlinks, or as much general 'noise' online surrounding the site, as one would
hope for a venture of this magnitude.
This audit highlights areas of the site where there are problems (or potential
problems) which may adversely affect the site’s performance in search engine results,
giving detailed descriptions of specific issues and suggesting solutions to these problems.
In some cases, we give more than one approach, along with an explanation of the merits
and drawbacks of each, so that you can decide which will be easier to implement or more
in line with brand requirements. Most of the on-site issues associated with this particular
site are of a technical nature and Netrank can provide you with technical support in order
to help you to resolve them.
Main issues

URL structure

There are a number of issues with the current URL structure of the site that are
either problematic or non-optimal with regards to search engine indexing, several of which
could lead to duplicate content. While many of these issues are not severe, some are more
serious and should be addressed. There are opportunities here to make substantial
improvements.
The main horizontal navigation in the ‘category search’ part of the site relies on
JavaScript, which search engine spiders cannot follow. Because of the way
the search aspect of the site is implemented, the category navigation is the main ‘spider’
crawl path for exposing the content within the site. In this audit, we recommend and
discuss a solution to this, which will immediately improve the site’s search engine
visibility.
We imagine that, for prospective advertisers, part of Conjungo’s unique selling
point is that it creates greater exposure for them and for their products and solutions. As
such, getting unique, relevant pages indexed for advertisers is a prime concern and one
that is not currently addressed.
In the Conjungo search results, the link to the advertiser’s details relies on
JavaScript, so search engines cannot reach these pages through it. Where
some of the page modules have been indexed, they have been indexed in isolation,
presenting further navigation and indexing problems. We discuss these issues and suggest
solutions in this audit.
Content deficit

There is a lack of text content on some of the pages on the site, particularly the
category search page and category landing pages. In some cases this is caused by text
being ‘hidden’ from search engines within images, but in others it is simply due to a lack of
textual content.
In this report, we highlight areas where this is the case and suggest ways in which
the existing content can be improved and more strongly emphasised, without necessarily
requiring a complete site redesign.
Other issues

There are a number of other issues included within this report, such as technical
domain and hosting issues, and an exploration of the optimisation of the structure of
pages from both a visitor’s and a search engine’s perspective. Consideration is given to
functionality and branding as well as to the more technical areas of SEO.
Detailed breakdown of SEO issues
Site structure

The site is not arranged in a hierarchical fashion; instead it relies on dynamic URL
parameters presented on a limited number of different page names to create different
pages.
While this is only applied in a limited way, and the URLs can be considered as being
search engine friendly, there are also opportunities to increase the human visitor
friendliness of these URLs. As URLs are generally read from left to right, there are changes
that can be made to make it more obvious to a human visitor where they are in the site
from looking at the URL. These changes also aid comprehension when looking at a URL in
a search result page, and help external sites providing links to know instantly what sort of
page they will be linking to.
Here are some examples of how the site is currently structured, dependent on the
visitor’s choice of navigation:
These URLs are:
The home page of the site
The search results page
The home page of the site
The category search page
The ‘Internet’ category
The ‘Networking’ category
The ‘Security’ category
Structure the site hierarchically. For example, the above category URLs could be
structured along the lines of:
This should be implemented consistently across the site. The URL structure in the
current site, as demonstrated by the selection of links given in the examples, is not
consistent as you navigate down through the various sections available from the global
navigation.
An added advantage of structuring URLs in this way is the inclusion of relevant
keywords, which assist in search engine ranking as well as helping to improve usability for
human visitors. They also help with converting searchers into visitors, as matches in the
URL for the keywords that were searched for may increase the chances of the searcher
visiting the page.
While it is likely that many of the site’s visitors will use the search box on the home
page to access information on the site, many people will use one of the major search
engines directly. Therefore, to optimise the category pages from conjungo.com for listing
in the major search engines, consider re-writing the dynamic URLs into the form
suggested above. Netrank can provide you with technical help with this.
It is very important that 301 redirects are implemented on all old URLs to redirect
human visitors and search engine ‘spiders’ to the location of the new page.
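Under Apache, such redirects can be implemented with mod_rewrite. The following is an illustrative sketch only: the hierarchical target URL is a hypothetical example, not the site's actual mapping.

```apache
# .htaccess sketch: 301-redirect an old dynamic category URL to a
# hypothetical hierarchical equivalent (one rule pair per category).
RewriteEngine On
RewriteCond %{QUERY_STRING} ^catid=85$
RewriteRule ^categorysub\.php$ /categories/internet/? [R=301,L]
```

The trailing `?` in the substitution strips the old query string from the new URL, so spiders and visitors land on the clean hierarchical address.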
Example code for one of the links:
<a href="#" onclick="redirectto_cat('categorysub.php?catid=85','85')"
Not only are search engines often unable to understand and follow such URLs,
causing internal site crawling and indexing issues, but human visitors to the site who have
JavaScript disabled cannot use them at all. Instead, place the real URL in the href
attribute of the link:

<a href="categorysub.php?catid=85" tabindex="9">

and move the JavaScript call to the onclick attribute of the link which will open the pop-up
window, so that the plain text href is still valid. Also add a target="_blank" attribute so
that, should users with JavaScript disabled click on the link, it will still open in a new
window, albeit one which does not have the window size specified (this is still an
improvement on the current situation, where such a user would have a non-functional site).
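Putting these pieces together, the link might look like the following sketch (redirectto_cat is the site's existing function; the anchor text is illustrative):

```html
<a href="categorysub.php?catid=85" tabindex="9" target="_blank"
   onclick="redirectto_cat('categorysub.php?catid=85','85'); return false;">
  Related categories
</a>
```

With JavaScript enabled, the onclick call runs and `return false;` suppresses the default navigation; with it disabled, the plain href and target="_blank" still open the page in a new window, and spiders can follow the href.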
Consider adding a <meta name="robots" content="noindex,follow" /> tag to the
page to prevent it from being indexed outside of the normal HTML structure of the site, as
addressed elsewhere in this document.
These search guidelines are in addition to any legally required DDA compliance
considerations.
Site category navigation is un-spiderable and lacking accessibility options
The site category navigation on the category search landing page uses images and
JavaScript, limiting both accessibility and search engine friendliness. The image below is a
screenshot of the conjungo.com homepage with images disabled.

If a visitor does not have JavaScript enabled, they will see the global site navigation
as in the image below. The images are loaded but the menu does not function.
As many screen readers only read text, blind or partially sighted visitors would be
unable to use the main navigation. Search engine robots also only read text, so the main
navigation is equally impenetrable to a search engine. The lack of accessibility options
prevents navigation by both these users and search engines.
We recommend that the image-based navigation be replaced with HTML/CSS, providing
search engine friendly links in plain HTML that would be crawlable by spiders and usable
by screen readers.
“Related category” navigation elements indexed by search engines
Individual page element files have been extensively indexed by Google, leading to
search results which take the visitor directly to a page element file which is clearly
intended for display within a pop-up, and which lacks navigation to reach the main site.
The ‘Related Categories’ link in the search and category results contains a
JavaScript call which opens a pop-up window listing categories related to the search
result. While, from a usability aspect, this is not ideal, as many people block pop-up
windows by default, the site meets accessibility requirements by including the URL within
the anchor tag. However, as search engine robots do not execute JavaScript, they index
these element files in isolation, outside the site’s navigation.
Exclude the ‘include’ files using robots.txt. Whilst many of the page elements of the
site are content which should be indexed, robots need to be excluded from listing the
individual files, so that the page elements are contained within the site's HTML framework,
with navigation.
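For example, if the element files lived under a single directory (the /includes/ path below is an assumption; substitute the directory that actually holds them), the exclusion could read:

```
User-agent: *
Disallow: /includes/
```

Note that robots.txt prevents the files from being crawled at all, which is why the meta tag alternative described next can be preferable when the links within these files should still be followed.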
Alternatively, add the following meta tag to the head element of all the ‘related
category’ pages:
<meta name="robots" content="noindex,follow" />
This approach may initially involve more work, but the advantage is that links in
these pages will still be followed, allowing link-juice to be passed on to the targets of
those links. Netrank can provide technical assistance with this.
“Vendor Solutions” page elements indexed by search engines
Individual page element files have been extensively indexed by Google, leading to
search results which lead the visitor directly to a page element file, which is clearly
pop-up, and lacking navigation to reach the main site.
Exclude the ‘Vendor Solution’ files using robots.txt. Whilst many of the page
elements of the site are content which should be indexed, robots need to be excluded
from listing the individual files, so that the page elements are contained within the site's
HTML framework, with navigation.
Alternatively, add the following meta tag to the head element of all the ‘Vendor
Solution’ pages:
<meta name="robots" content="noindex,follow" />
If the proposed new Vendor Page structure, demonstrated on
http://www.timandra.co.uk/conjungo/01v1/, is implemented, the Vendor Solution links
should target the appropriate section of the client’s page.
Duplicated page titles
Page titles are extensively duplicated throughout the site, missing opportunities for
optimisation and the potential for the increased Click-Through-Rate that compelling titles
in search results can generate.
A site search for “conjungo” in Google shows the same or similar titles on almost all
pages. Additionally, the home page title, shown below, is truncated in the search results
from:
Conjungo - Find IT products, suppliers, vendors and resellers,
including software, internet, wireless, telecoms and support services.
to:

Conjungo - Find IT products, suppliers, vendors and resellers...
Although there is no evidence that exceeding the arbitrary 60 character cut-off limit harms
search rankings, it affects the appearance of the snippet, which may negatively affect the
overall brand image.
Title tags are heavily weighted content and should be used both for search
optimisation and for enhancing the user experience.
All landing pages and category pages should have unique, compelling titles. A
simple solution to this for the category pages would be to use the category breadcrumb as
the title. For example:
“Internet > Web Optimisation > Solution Providers – Conjungo” (59 chars)
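As a sketch of how such titles could be generated automatically (the function and its truncation rule are our assumptions, not an existing site feature):

```python
def breadcrumb_title(crumbs, brand="Conjungo", max_len=60):
    """Build a page title from a category breadcrumb, dropping the
    deepest crumbs if necessary to stay within max_len characters."""
    title = " > ".join(crumbs) + " – " + brand
    while len(title) > max_len and len(crumbs) > 1:
        crumbs = crumbs[:-1]  # trim from the deep end first
        title = " > ".join(crumbs) + " – " + brand
    return title

print(breadcrumb_title(["Internet", "Web Optimisation", "Solution Providers"]))
# → Internet > Web Optimisation > Solution Providers – Conjungo
```

In practice the truncation rule should be chosen so titles stay unique across sibling categories; this sketch simply illustrates the breadcrumb approach.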
Include relevant keywords in the page titles and remove non-useful terms.
An example of a focused, keyword-rich title for the home page could be:
“Conjungo – IT Solutions, Products, Vendors and Resellers” (56 chars)
Keep title tags within about 60 characters to ensure that they do not get truncated
(shortened) by search engines.
An SEO professional should be employed to write titles that are both keyword-rich
and enticing, in order to help the pages to gain higher search engine rankings and to
encourage searchers to visit these pages when they are returned in search results.
conjungo.com domain does not exist
Currently, whilst the sub-domain http://www.conjungo.com does exist, the bare
domain http://conjungo.com does not resolve. As some sites may link to this domain
without the www., the benefit of these links is lost.
Also, potential visitors may type in the domain name without the www. These
visitors will see an error and will not be able to view the site (except those using web
browsers that auto-complete the “www.”).
Set up the DNS entries for the domain conjungo.com without the “www.”.
As creating a duplicate sub-domain would cause duplicate content issues,
implement 301 permanent redirects from the “non-www.” domain to the “www.” version.
For example, a visitor typing in http://conjungo.com/ should be redirected to
http://www.conjungo.com/, and likewise any deeper URL on the bare domain should be
redirected to its “www.” equivalent.
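Under Apache, this could be sketched in the site's .htaccess along the following lines:

```apache
RewriteEngine On
# 301-redirect any request for the bare domain to the www. version
RewriteCond %{HTTP_HOST} ^conjungo\.com$ [NC]
RewriteRule ^(.*)$ http://www.conjungo.com/$1 [R=301,L]
```

The captured path `$1` preserves whatever page was requested, so deep links on the bare domain redirect to their exact “www.” counterparts.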
Default doc-type duplicate content issues
At the ‘root’ of the site, the page ‘/index.php’ is present in addition to the page ‘/’.
While it is appreciated that this is just the way that servers will render default document
types, only one of these should ever show.
For example, both http://www.conjungo.com/ and http://www.conjungo.com/index.php
are present and have identical content.
In addition, if links are made to both of the pages, the benefit is split between them
rather than being concentrated on one page. Visitors clicking on the site logo or the
‘Home’ link from category navigation are directed to the /index.php page and therefore,
after using the site, may link to http://www.conjungo.com/index.php instead of to the
root URL.
Perform a 301 (permanent) redirect from one of the URLs to the other. Switch all
controlled links to point to the chosen version only.
While the choice of which URL to redirect will have little impact on search engine
rankings, we suggest redirecting ‘/index.php’ to ‘/’ throughout the sites. This is purely for
presentation and usability purposes – the latter is a ‘cleaner’ URL which is simpler to read
and is less likely to be truncated by search engines when listing the URL in the results.
This may lead to better click-through and conversion rates.
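As an illustrative Apache sketch (the THE_REQUEST condition ensures only explicit requests for /index.php are redirected, avoiding a loop when the server internally serves /index.php for ‘/’):

```apache
RewriteEngine On
# 301-redirect explicit /index.php requests to the cleaner '/' URL
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php
RewriteRule ^index\.php$ / [R=301,L]
```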
To help you to visualise this, compare two otherwise identical hypothetical URLs,
one using the ‘/’ variant and the other the ‘/index.php’ variant: both link to the same
page, but the former is cleaner.
In addition to simply making URLs more readable, there is another long term
benefit to redirecting root index pages to the ‘/’ variant: the future proofing of server
platform changes with minimal additional work.
If a page has a path ending in ‘/index.php’ and the decision was made to switch to,
for example, an ASP or JSP-based web hosting solution, the URL may have to be
redirected to index.asp or index.jsp. If, however, the URLs of these pages simply end with
‘/’, this additional work would not be needed if the hosting platform changed.
‘pageID’ parameter in URL causing duplicate content
Where there are more results than can be shown on one page, and pagination links
are presented and indexed, duplicate content issues can arise. In an example Google
results set, around 126 results were indexed by Google. Searching with the duplicate
filter turned on shows only 1 result, as Google does not consider that these results add
any relevancy.
A potential solution to this issue, which would prevent search engines from indexing
content beyond the ‘canonical’ page, while still allowing the robots to crawl and index the
full result set and therefore index the target pages, is to use the following tag on all pages
containing the dynamic URL parameter “pageID=”:
<meta name="robots" content="noindex,follow">
As suggested elsewhere in this document, the implementation of unique page titles
and meta data would assist in ensuring that search engines judge the canonical category
pages to have unique, relevant content which is worth ranking.
Lack of meta data on important pages
A large number of pages on the site contain no meta description tag (a hidden
information tag which describes the page). While this tag is not widely used by search
engines for ranking purposes due to prior widespread ‘spamming’ of the tag, it is still
sometimes used for other purposes.
The meta description is used for the ‘snippets’ provided by several search engines.
These are the small sections of text beside most search results. If a meta description
contains keywords relevant to the search, text from it may be selected as the snippet for
that page in lieu of text from the web page and, if it is well written, it can have a very
positive effect on click-through and conversion rates.
A meta description can also help to indicate to the search engine that pages which
are otherwise similar are actually different. In certain cases, including this tag and using it
properly will decrease the chances of pages being considered to be duplicate content.
Another example of meta data contributing towards duplicate content issues can be
where different pages have identical or very similar meta tags, or where these tags are
present but contain no data. This is particularly rife where meta title (or indeed title) or
meta description tags are not sufficiently varied, but the meta keywords can also be a
contributory factor. If used, meta keywords should ideally contain 3 to 8 keywords
relevant to the specific page although, because the keyword tag was so abused
in years gone by, it hasn't had any significant effect on rankings in any major search
engine for some time.
Add unique content to the meta description tags of all pages that should be
returned for a search result on Google. The tag should describe the page, be well written
(to attract visitors if it gets displayed as the search engine ‘snippet’) and contain a number
of relevant keywords which should also appear on the page. It should also not be too
long: definitely less than 255 characters and ideally about half of that amount.
It may not make economic sense to retrospectively add contents to this tag for all
older pages, but this tag should be present on all key existing pages such as the landing
pages and category pages. All new pages should also include this tag and, ideally, this
should be enforced somehow, e.g. by the content management system (CMS) used.
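A CMS-side check along these lines could enforce the recommendations above (the limits follow the guidance in this section; the function is a sketch, not existing site code):

```python
def check_meta_description(description, seen_descriptions):
    """Return a list of problems with a page's meta description:
    missing, too long, or a duplicate of another page's description."""
    problems = []
    if not description or not description.strip():
        problems.append("missing meta description")
    elif len(description) >= 255:
        problems.append("too long: keep under 255 characters, ideally around 120")
    if description and description in seen_descriptions:
        problems.append("duplicate of another page's description")
    return problems

print(check_meta_description("", set()))
# → ['missing meta description']
```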
Ensure that no two pages on the site have the same title tag, meta description tag
or meta keywords tag. An easy way to do this is to include elements of what the page
focuses on (e.g. a product, category or client’s name) in the contents of these tags.
Please note that it is very important that different pages do not have identical or
very similar meta tags, as this can lead to instances of duplicate content.
Images used in place of text
A large number of pages on the site use images to represent textual content. As
mentioned above, this is the case for the category navigation; it is also the case for large
areas of the pages, which use images instead of text.
With images off, some pages are almost devoid of text content. In the example
below, the ‘Company Search’ page, there is no ‘search’ button visible for the visitor to
press.
Most images in the site do not appear to contain alt text. Alt text is beneficial for
both accessibility and, to some extent, search engines although it is generally not
considered to be as ‘powerful’ as real text.
Replace images used for text with real text. Ideally, use heading tags (h1, h2 and
so on) where appropriate.
If brand guidelines dictate that this is not acceptable and that an image is required,
display the images using the background-image CSS property. That way, when CSS is
disabled, the textual headings are shown and the images are hidden, rather than both
being unavailable.

You need to make sure that exactly the same text (word for word) is used, so that
the hidden text does not look like an attempt to cloak content for search engines.
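A minimal sketch of this technique (the class name, dimensions and image path are illustrative assumptions):

```html
<h2 class="category-heading">Web Optimisation</h2>

<style>
  /* The heading text stays in the document for search engines and
     screen readers; the image is painted over it via CSS, and the
     text is moved out of view rather than removed. */
  .category-heading {
    background: url("/images/web-optimisation.png") no-repeat;
    width: 220px;
    height: 40px;
    text-indent: -9999px;
    overflow: hidden;
  }
</style>
```

With CSS disabled, the plain h2 text is shown in place of the image; the text must match the image word for word, as noted above.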
Netrank can provide you with technical support to help you to achieve this.
Missing alt text from a number of images
A number of pages on the site include images but do not include alt text for those
images. This will prevent search engines that can read alt text from indexing the text of
those images.
As well as hiding content away from spiders and search engines, a non-SEO issue is
that sufficient alt text is also an accessibility requirement, as some disabled users (for
example, blind users using screen readers) are not able to understand the page properly
without it. Empty ALT attributes will be passed over, but where no ALT attribute is
included, many screen readers will simply read out the word ‘image’ repeatedly: on the
conjungo.com homepage that would be 15 times.
Ensure that all visible images on the site contain sensible alt text. This is still
necessary for images used within anchor tags, even if the anchor tags contain title
attributes, as title attributes are not read by many screen readers. For all images which
contain text,
generally, the exact text should be used. For images without text (for example, a
photograph of a celebrity) a description of that image should be included.
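For example (the file names and subjects are illustrative):

```html
<!-- Image containing text: the alt text repeats the exact wording -->
<img src="/images/search-button.gif" alt="Search" />

<!-- Photograph: the alt text describes the image -->
<img src="/images/award-ceremony.jpg" alt="Photograph of an award ceremony" />

<!-- Purely decorative spacer: an empty alt so screen readers skip it -->
<img src="/images/spacer.gif" alt="" />
```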
Ensure that, where alt text is used for images which contain text, the full text is
used, and also that words which are not present in the images are not included in the alt
text, as this could be considered “keyword stuffing” by the search engines.
Page content

Most pages on the site contain very little textual content to describe what the page
is about. Keyword-rich content that describes a page is a very important factor in a search
engine’s algorithms for determining which pages are relevant for a given search query. In
general, without keyword-rich, relevant content, a search engine is likely to rank a page
less well than if it had such content.
The Netrank Density Engine tool is useful here to demonstrate the keyword content
of the home page of the site. An examination of the page with this tool shows that many
‘money terms’, such as [technology solutions providers], are not very prominent on the
front page of the site.
Other pages, such as the category and company search pages, also contain very
little relevant textual content.
Include keyword-rich, relevant content on more pages on the site, in particular on
key pages such as the category home and landing pages as well as the company search
pages. The text should ideally describe what the page is about, using a number of
different keywords that users looking for the page may search for. Keyword research
should be undertaken to identify the important keywords. Ideally, an experienced SEO
copywriter should be involved.
It is important that the quantity of text does not become excessive; after all, the
page still has to be user-friendly. It is also important that it does not appear ‘spammy’ by
including a large quantity of repeated keywords (this is obviously detrimental to the user
experience), or by hiding the text to make the page design look cleaner, as this can be
highly detrimental to the site’s presence in search engines.
Here are some general examples of extra content that could be included:
• News or ‘features’ sections on the front page (this also helps with the
“freshness” of a page as it changes more often).
• A more detailed description of the category or products on the category
landing pages and listings.
• Opinions, success stories, customer or partner testimonials, and findings of
third parties that perform comparisons (for example, favourable reports from
relevant periodicals). Current partners/sponsors are only used in images.
Should SEO research indicate that the number of searchers for a particular keyword
or term (for example, [Web Design Agency]) was sufficient, it may warrant creating a
page specifically targeting potential Web Design Agencies, perhaps a guide, how-to or
feature page, as one way of increasing exposure for this search. This can increase the
chances of the page being considered relevant to a specific term (for example, the page
copy and back links will be more specific to just that type of search). It is also likely to
provide a better conversion rate, as it is targeted at a specific group of people and
matches the exact keywords that they typed into the search engines.
Another method to expand the site and/or attract backlinks would be to provide
different content. For example, humorous advertisements could be made available to
users. Another type of ‘different content’, albeit one that would require much more work,
would be an online game (typically Macromedia Flash-based).
These types of alternative content can be used to build brand awareness or to push
selling points whilst the web surfer is visiting the page. Ideally, anything like this would be
combined with a promotional campaign, including link building, to help to get it going.
Should such a campaign ‘go viral’, it would result in a large number of valuable, long-
lasting backlinks which would, in turn, help the rest of the site.
Old pages have been removed
There are a number of old pages indexed by search engines which now return “404”
errors.
This could be damaging to the brand but, more importantly, it causes a loss of
authority to the site as a whole. Web pages build up authority through on-page factors,
age, and both internal site links and external links. If the page is removed and no
redirection is put in place, it has an impact on both the user experience and on the search
engine spiders. Both are left on an ‘orphaned’ page, without navigation.
Whenever a page is removed from the website, put a 301 (permanent) redirect in
place, so that visitors are transferred seamlessly to an appropriate replacement page.
Search engine spiders will then recognise that the new page is to be indexed instead of
the old one, helping to ensure that only fresh pages appear in the search results.
To remove the current dead pages, add 301 redirects to either the same page
elsewhere on the site (if it has moved), an appropriate related page, a page that explains
why the contents of the original page are no longer accessible and presents alternative
options (while ensuring that numerous duplicate pages are not created as a result) or, as
a last resort, to the root index page.
The page to which the old page is redirected should have navigation to the rest of
the site.
“404” error handling
The site handles “404” or non-existent page requests in a manner that is potentially
detrimental to search. There are currently two different ways in which pages that are not
found are handled on the site:
1. If the page does not exist at all, for example
http://www.conjungo.com/fakepage.htm, then a 404 status code is correctly served.
2. If the page exists, but the page= or id= parameter is invalid, for example
http://www.conjungo.com/categorysub_sub.php?catid=fake_category, then instead of
serving up a 404 status code, it gives a “200 OK” status code, indicating that it has
returned a valid page, and loads an empty template page.
A search engine spider visiting the site may become confused and decide that there
are many pages on the website, as some of them do not serve an expected “404” not
found header. The search engine algorithms may also notice that these ‘pages’ are all
identical, so a duplicate content issue may arise.
Serve a 404 status code for all cases where a page is not found. Do not use
temporary redirects or return a status code of 200. Create a custom ‘404’ error page
containing branding and navigation links back to the core site.
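For the first case, the branded error page can be configured at server level; the following Apache sketch assumes a hypothetical /404.php page. The second case has to be fixed in the page scripts themselves, by sending a 404 header instead of the empty template when the catid lookup fails.

```apache
# Serve a custom, branded body for missing URLs while keeping the 404
# status code (the /404.php path is an assumption)
ErrorDocument 404 /404.php
```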
Not enough external backlinks to www.conjungo.com
One of the most important factors (if not the most important) in achieving high
search engine rankings is the anchor text of backlinks from other sites to the site. The
words used in the anchor text of these external backlinks are powerful in terms of ranking
for the terms they contain.
Examining the backlinks to www.conjungo.com, as provided by the search engines,
it is apparent that the site has a low number of backlinks and would probably benefit from
a backlink campaign.
Google is not currently reporting any backlinks, although it does often delay
“publishing” backlinks in order to allow for “link-churn”, and reports only the most
authoritative links.
Yahoo reports several thousand backlinks but, other than internal site links, these
are mostly from the www.ecademy.com domain. While this domain provides a valuable
link from its home page, the additional site-wide links do not add any authority from a
search engine rankings perspective. In fact, there is some evidence to suggest this may be
detrimental when compared to just one authoritative link.
Also, the major search engines have always discouraged paid link sales and, where
a site is identified as engaging in this practice, the links are devalued. There is evidence to
suggest that Google is now penalising the rankings of sites selling links in order to control
this, now widespread, practice.
The optimal backlink development model focuses on a spread of different relevant
sources to obtain a variety of different backlinks to the site. Having diversity in a site’s
backlinks reduces the risk of having the site’s rankings affected by one of its partners
being penalised and losing authority.
Good backlink development is a very time consuming task, involving careful
selection and evaluation of potential link sources. The development of particularly valuable
backlinks is something that generally needs to be undertaken by an SEO professional.
Some backlink development methods include:
• Undertaking a ‘link to us’ campaign, where visitors are encouraged to link to
your site. The advantage of this is that you can specify the anchor text in the
link code you supply.
• Submitting the site to a few high-quality directories (while avoiding low-
quality directories, which are often little more than link farms). Some
essential directories include the Open Directory Project (also known as
DMOZ) and the Yahoo! Directory – neither of which currently lists
www.conjungo.com.
• Finding and approaching authoritative sites and requesting links from them.
• Syndicating press releases.
• Links and articles can also be purchased. Search engines generally try to
discount these types of links, but they can be useful for providing traffic as
well.
Netrank can provide you with a number of link development services.
Localised TLD not used
A geographic TLD is an extremely valuable asset with regards to ranking for search
results within that country/region, and ideally all sites targeted at a particular market
should use a relevant local TLD (e.g. .co.uk for the United Kingdom, .fr for France, etc).
It may also be beneficial in some regionalised search engines to host the site in the
relevant country, but if a geographic TLD is used this is usually not as important.
The choice of which domain to use depends on the intended audience for the site,
both now and in the future.
If the site is intended primarily or exclusively for a UK audience and this is likely to
be the case going forwards, we recommend switching the domain from the .com TLD to
the .co.uk TLD.
If the site is aimed at, or will in the future be aimed at, an international English-
speaking audience, we recommend leaving the domain on the .com TLD.
It is possible that some people may enter the Conjungo name directly into a
browser with the .co.uk TLD, so we recommend that you register the domain
conjungo.co.uk (which is currently available). This action also supports brand protection.
It may also be worth buying other available domains whose names are common
misspellings of Conjungo.
Put in place 301 redirects from every request to the .co.uk site to the appropriate
page on the .com site to ensure that the benefit of any backlinks to the new URL is
maintained and that duplicate content risks are avoided.
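Once conjungo.co.uk is registered, the redirects could be sketched as follows for Apache with mod_rewrite (an assumption about the hosting environment):

```apache
# On the conjungo.co.uk host: permanently redirect every request
# to the same path on www.conjungo.com, preserving backlink value.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?conjungo\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.conjungo.com/$1 [R=301,L]
```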
If you are likely to want international versions of the site in the future (for example,
separate sites for the UK and for the US, or a site in French and a site in English), we
recommend using domains appropriate to these countries and/or languages. Netrank can
provide you with recommendations on domain selection for any future country/language-
specific sites on a case-by-case basis.
News section does not utilise web feeds
Although the site features a news/press releases section, it does not use web feeds
(such as RSS or Atom). Web feeds are a very effective method of enabling frequently
updated information to be discovered and indexed by search engines rapidly, and they
also allow this information to be found in a variety of blog and feed search engines, such
as Technorati or Google Blog Search.
Provide RSS and/or Atom feeds (using recent versions of these formats) for the
most recent press releases and articles. There is at least one potential feed that should be
provided:
• Press Releases
This feed should be linked from the corresponding news pages on the site.
Include “<link />” elements within all pages that link to the feed with appropriate
titles, so that visitors using a web browser that can detect the presence of feeds, such as
Firefox or Internet Explorer 7, can easily determine which feeds are available.
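Such a discovery element might look like the following (the feed title and URL are hypothetical placeholders):

```html
<!-- Placed in each page's <head>; the href is illustrative only. -->
<link rel="alternate" type="application/rss+xml"
      title="Conjungo Press Releases"
      href="http://www.conjungo.com/news/rss.xml" />
```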
The feeds should be dynamically generated, and the system should be set up to
automatically ‘ping’ one or more ‘pinging’ services appropriate to the content whenever a
new article is created.
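To illustrate, a dynamically generated RSS 2.0 press-release feed could be sketched as below; the Release structure, titles, and URLs are hypothetical placeholders rather than the site's actual data model:

```python
# Minimal sketch of generating an RSS 2.0 feed for press releases.
from dataclasses import dataclass
from datetime import datetime, timezone
from email.utils import format_datetime
from xml.sax.saxutils import escape

@dataclass
class Release:
    title: str
    url: str
    summary: str
    published: datetime

def build_rss(releases, site_title="Conjungo Press Releases",
              site_url="http://www.conjungo.com/"):
    """Return an RSS 2.0 document as a string."""
    items = []
    for r in releases:
        items.append(
            "<item>"
            f"<title>{escape(r.title)}</title>"
            f"<link>{escape(r.url)}</link>"
            f"<description>{escape(r.summary)}</description>"
            f"<pubDate>{format_datetime(r.published)}</pubDate>"  # RFC 822 date
            f"<guid>{escape(r.url)}</guid>"
            "</item>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<rss version="2.0"><channel>'
        f"<title>{escape(site_title)}</title>"
        f"<link>{escape(site_url)}</link>"
        "<description>Latest press releases</description>"
        + "".join(items) +
        "</channel></rss>"
    )

feed = build_rss([
    Release("Example release", "http://www.conjungo.com/news/1.htm",
            "Summary text", datetime(2008, 1, 15, tzinfo=timezone.utc)),
])
```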
It is also important to ensure that the feeds validate. This matters more for feeds
than is generally the case for web sites (although it can be important there as well).
Feed validation can be performed at this site: http://feedvalidator.org/
Lastly, link the individual news/press release pages to other relevant internal pages
on the Conjungo site.
Main navigation menu placed before page content
The HTML code for the main site navigation menu on all areas of the site is placed
before the main content of the page. Search engines, notably Google, tend to give more
weight to content near the beginning of a web page than to content near the end of it.
Also, because a menu often contains the same textual content on every page, many
pages on the site start with the same text. In certain cases this can increase the chance of
encountering duplicate content issues, particularly on pages with very little unique textual
content.
Taking a typical subcategory page as an example, there were more than 1600 lines
of code, and the unique content of the page (the search results) did not begin until well
into the source.
Include the main site navigation menu near the bottom of the HTML code for pages
on the site, after the main content for the page, and use CSS to position it in the desired
location in the page.
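A minimal sketch of this approach (class names and measurements are illustrative, not taken from the site):

```html
<!-- Unique content comes first in the source order... -->
<div class="content">
  <!-- main page content -->
</div>

<!-- ...navigation comes last, then CSS positions it at the top. -->
<ul class="main-nav">
  <li><a href="/">Home</a></li>
  <li><a href="/company/">Company</a></li>
</ul>

<style>
  body      { padding-top: 3em; }   /* reserve space for the menu */
  .main-nav { position: absolute; top: 0; left: 0; margin: 0; }
</style>
```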
In addition, move inline CSS and JavaScript into external files linked from the
page. As well as focusing the code on the relevant content, this has the additional
benefit of speeding up page load times, as most users’ browsers cache external linked files
and would not need to load them again while navigating the site.
Note: It should be stressed that this is a very minor search consideration
and the ROI of implementing this across the site would be minimal. This is
suggested as a consideration for future site builds.
Other non-SEO issues
The following issues with the site came up during this analysis and are not
specifically SEO issues, but may be of interest so have been included here:
1. The directories http://www.conjungo.com/company/ and
http://www.conjungo.com/promotions/ and their subdirectories are
configured to allow directory browsing, which has in some cases been
indexed. This may negatively affect the brand image and expose files
that were not intended to be public.
2. The site does not use a ‘favicon’. A favicon is a great branding mechanism:
a small icon (16x16 pixels) that web browsers display in a number of
locations, for example in bookmarks/favourites, on tabs for the web page,
and beside the URL in the address bar.
3. Access keys are not used within the site. Access keys are keyboard
shortcuts (typically a user will press [Alt]+[letter] on a PC, or
[Ctrl]+[letter] on a Mac, although Firefox 2.0 has implemented a
modified [Alt]+[Shift]+[letter] combination).
4. The 404 page served up for non-existent pages (for example
http://www.conjungo.com/fakepage.htm) does not provide any method of
navigation to the rest of the site.
5. Pages on the site do not pass W3C validation. Although, as long as the code
is not too malformed, this generally does not matter for SEO purposes, valid
code helps to ensure that the page renders correctly on different browsers,
and also helps to prevent the creation of malformed code leading to search
engine indexing issues. The site is quite close to validation, so it may not be
as large a task to accomplish this as it would have been if it had not been
designed with validation in mind.
6. Some images/banner adverts are currently blocked by browsers using
Adblock software. This is almost certainly because they are located within a
directory named ‘http://www.conjungo.com/promotions/bannerAdverts/’.
7. A template news page is indexed in the search results:
Recommendations:
1. Set the permissions for all public directories to prevent directory browsing.
Add a default file with a 301 redirect to remove any currently indexed pages.
2. Construct a distinct 16x16 pixel .ico icon file. This file should then be placed
in the following location: http://www.conjungo.com/favicon.ico
Netrank can help you to generate favicons.
3. Implementing basic access keys (for example H = Load home page, A =
Load accessibility page, S = Skip to the main content of a page) is simple to
do and improves the accessibility of the site.
4. Consider using a custom 404 page to display a more helpful and better
presented 404 error page which provides a number of useful links to
different parts of the site.
5. Ensure that all pages on the site pass W3C validation.
6. Rename the directory which currently contains Adblocked images from
‘/promotions/bannerAdverts/’ to something less likely to trip filters, such as
/images/design/ or /images/creatives/.
7. Implement a 301 permanent redirect to the current news page:
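For recommendation 1, assuming the site runs on Apache (unverified), directory listings can be disabled with a one-line directive in the relevant configuration or .htaccess file:

```apache
# Disable automatic directory listings for this directory
# and its subdirectories.
Options -Indexes
```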
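For recommendation 2, the favicon can also be referenced explicitly from each page's head (browsers additionally look for /favicon.ico at the site root automatically):

```html
<!-- Explicit favicon reference; the path mirrors the recommended
     location http://www.conjungo.com/favicon.ico -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />
```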
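For recommendation 3, a minimal sketch of access key markup; the URLs and the "content" id are illustrative assumptions, not the site's actual structure:

```html
<!-- H = home, A = accessibility statement, S = skip to main content. -->
<a href="/" accesskey="h">Home</a>
<a href="/accessibility.htm" accesskey="a">Accessibility</a>
<a href="#content" accesskey="s">Skip to content</a>

<div id="content">
  <!-- main page content -->
</div>
```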
Glossary of key terms
Alt text
A hidden information attribute attached to an embedded image which briefly
describes what the image is about. Somewhat useful for search engine rankings, and a
legal accessibility requirement in some jurisdictions.
Anchor text
The text of a link. The anchor text of backlinks is very important for search engines.
Backlink
A link from another page to the current page.
Duplicate content
Multiple pages featuring content so similar that search engine algorithms consider
the pages to be essentially the same. Many things can cause duplicate content issues in
search engines.
Key phrase
An important phrase consisting of more than one keyword.
Keyword stuffing
The practice of putting far too many keywords into a page in an attempt to improve
rankings. This is bad both for usability and for search presence, as a site using this
technique may be penalised by search engines.
Meta description
An optional hidden information tag containing a brief description of what the page
is about. No longer widely used by search engines for ranking purposes, but still used for
other purposes (for example, in search result snippets).
Meta keywords
An optional hidden information tag containing a list of keywords relevant to the
page. Has little search value, as it is no longer widely used by search engines for ranking
purposes.