SEO Training

Transcript

  • 1. SEO Aims & Objectives
  • 2. What is SEO? SEO is the process of improving the volume and quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results.
  • 3. Basic Steps in the SEO Process
    1. Keyword Searching
    2. Domain Naming and URL Naming Convention
    3. On-Page Optimisation / Code Optimisation
    4. Off-Page Optimisation
    5. Competition Analysis
    6. Most Deadly SEO Sins (Black-Hat Techniques)
    7. Advanced SEO for Beginners
  • 4. Keyword Searching (Needs & Tools)
  • 5. Keyword research and, ultimately, keyword selection are extremely important parts of the overall SEO process. Selecting keywords relevant to your field of activity is essential. Make sure you conduct proper in-depth keyword research before you begin placing keywords on your web site.
  • 6. Free Tools for Keyword Searching. Good Keywords - free Windows software for finding the perfect set of keywords for your web pages. Overture Inventory - currently a Yahoo! Search Marketing product - generates a list of related searches that include your term and also shows how many times the term was searched for during the last month. Related Pages - generates a list of possible keyword combinations based on lists of keywords that you provide. Google Suggest - as you type, Google will offer suggestions. Digital Point Keyword Research Tool - shows you the results of your query from both Wordtracker and Overture for determining which phrases are searched for most often.
  • 7. Paid Tools for Keyword Searching. Keyword Intelligence - drive high-quality customers to your website using the top search terms used successfully by millions of people across all major search engines, including Google, Yahoo! and MSN. Wordtracker - free trial also available - discover the best keywords to target on your website. Keyword Discovery - free trial also available - find the best keywords for your website.
  • 8. Domain Name and URL Naming Convention. Having a proper domain and URL name is quite often neglected. Many search engines actually put some weight on the way you name your domain or URL files, so you will definitely want to include some juicy words in your naming convention. For example, if your site is about website critics, a URL like http://www.sitecritic.com will definitely be better than a domain like http://www.bluecatfish.com. The same principle goes for hyperlinks. If you have two words as keywords, you can use an underscore "_" or dash "-" to separate them.
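The separator advice above can be sketched as a small Python helper (a hypothetical example; the function name and the exact cleaning rules are illustrative, not from the slides):

```python
import re

def slugify(phrase):
    """Turn a keyword phrase into a hyphen-separated URL slug.

    Hyphens (and underscores) act as word separators, so
    "Site Critic Reviews" becomes "site-critic-reviews"
    rather than the unreadable "sitecriticreviews".
    """
    slug = phrase.lower().strip()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # non-alphanumerics -> hyphen
    return slug.strip("-")

print(slugify("Site Critic Reviews"))  # site-critic-reviews
```

A slug like this can then be used for both file names and directory names, keeping the keywords visible in the URL.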
  • 9. On-Page Optimisation / Code Optimisation: HTML Code Optimization; The TITLE Tag Optimization; The Meta Tags Optimization; The BODY of HTML - Code Optimization; Code Optimization Checklist; Tips and Advice
  • 10. HTML Code Optimization. The optimization of your HTML code for search engines is vital; it is the base of your SEO campaign. Your code must be optimized in a number of ways to improve the relevance of a chosen keyword: the more relevant your page appears for that keyword, the higher your rank will be. Remember: keywords are the words people will use in search engines. Including a keyword in your site content (and optimizing your site) will cause your site to be returned as a search result. You can choose to optimize your page for a keyword or a keyphrase (a number of related words, e.g. 'free red hats'). Using a keyphrase is more advantageous (as discussed later), but for simplicity I will refer to both keywords and keyphrases as just keywords. TIP: Try to optimize each page for just one keyword. This will stop your keywords competing against each other for weightings, and you will rank higher for the chosen keyword.
  • 11. The TITLE Tag Optimisation. The title tag should not contain any of the words Google disregards. These are words like 'and', 'not', 'a', 'the', 'about' etc. which are too common for Google to take any notice of; they are known as 'stop' words. Using them will dilute the importance your keyword is given in your title. Include your keyword in the title of your page. Including other words in your title that are not your chosen keywords will be detrimental to your ranking, because it makes your keyword seem less relevant to the title of the page. This relevance is known as 'weight'; the more weight your keyword has in a given criterion, the better. Don't include the name of your website in the title of your page, for example 'Share The Wealth - affiliate marketing': it will dilute the prominence of your keyword (in this example, 'affiliate marketing'). It is tempting to include your site's name as it may look better, but it is not that important, as people don't pay much attention to the title. "The TITLE tag should be kept between 60 and 90 characters in length."
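The stop-word and length advice can be sketched in Python; note the stop-word list here is a small illustrative subset I chose for the example, not Google's actual list:

```python
# Illustrative stop-word list; real engines use a larger, undisclosed set.
STOP_WORDS = {"a", "an", "and", "the", "not", "about", "of", "or"}

def optimize_title(title, max_len=90):
    """Drop common stop words from a title and enforce the
    60-90 character guideline from the slide (a rough sketch)."""
    words = [w for w in title.split() if w.lower() not in STOP_WORDS]
    cleaned = " ".join(words)
    return cleaned[:max_len]

print(optimize_title("A Guide About Affiliate Marketing and SEO"))
# Guide Affiliate Marketing SEO
```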
  • 12. The Meta Tags Optimisation. <meta name="Description" content="Free articles and guides on affiliate marketing and SEO"> This meta tag contains a brief description of the HTML page (generally, 200 to 250 characters may be indexed, though only a smaller portion of this amount may be displayed). <meta name="Keywords" content="Affiliate Marketing, SEO"> <meta name="Keywords" content="keyword1, keyword2, keyword3">
  • 13. The BODY of HTML - Code Optimization. Once you have written the content of your page, you can begin SEO on it. Complete the page ready for publishing, then apply the following rules to ensure it is optimized for the top search engines. 1. Your keyword should appear in bold at least once on your page. This shows the search engines that your keyword is important to the subject of your page, and so must be relevant to the keyword search performed by the search engine user. 2. Your keyword should have a weight of 2% on your page (for Google only). This is the ideal percentage: if it is too high, a search engine may penalize your page for spamming. Spamming is a term used to describe the actions of webmasters who trick search engine page ranking systems into thinking their pages are relevant in order to get a high ranking. 3. Use heading tags (<h1>heading</h1> etc.) and put your keyword into the heading. Again, the usual weighting rules apply: have your keyword as close to the beginning of the heading as possible, with as few other words in the heading as possible. Position this heading as close to the top of your page as you can for increased relevance. Cont...
  • 14. 4. Put your keyword in up to three of the alt attributes for images, and include it in one of the first three alt image attributes in your code. Alt attributes hold the alternative text given to images, which can be seen if the image fails to load. These are great for hosting your keyword, as users cannot usually see them. Don't spam, though; stick to three alt attributes. They are used as follows: <img src="imagename.gif" alt="alt-text-here" width="image-width" height="image-height"> 5. Keep your page content between 100 and 1400 words. This is for a number of reasons, including the size of Google's page cache (the amount of data from a page that Google stores). If you have too much content, you could try splitting the page into two separate pages, perhaps with a 'page 2' link at the bottom of the content. 6. Your keyword should appear at the beginning of your content and at the end (the first and last 50 words).
  • 15. Code Optimization Checklist
    • No stop words in your title tag
    • Keyword included in title
    • Website name not included in title
    • Keyword in meta keywords list
    • Keyword placed as close to the beginning of the meta keywords list as possible
    • Keyword appears in bold at least once in the content
    • Keyword has a 2% weight (for Google only)
    • Keyword is in the first heading tag and is at the top of the page content
    • Keyword is in the first 50 words and last 50 words of the page
    • Page content is between 100 and 1400 words
    • Keyword is in one of the first three alt image attributes and is in three of them in total
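The 2% weight rule from the checklist can be checked mechanically. A minimal Python sketch (the exact counting rules here are my assumption; real engines also weigh placement and markup):

```python
def keyword_density(text, keyword):
    """Percentage of words in `text` that are the keyword.
    The slides suggest roughly 2% as a target for Google."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page = "seo tips " + "filler word " * 24  # 50 words, one "seo"
print(round(keyword_density(page, "seo"), 1))  # 2.0
```

Running this over each page of a site gives a quick way to spot pages that are far from the target before resorting to manual counting.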
  • 16. Tips and Advice
    • Try to optimize each page for just one keyword. This will stop each keyword competing against the others for weightings and prominence, and you will rank higher for the chosen keyword.
    • Constantly check your competition. You may not feel it is possible to get onto the first page on Google for a certain keyword/phrase; choose a less contested keyword.
  • 17. Off-Page Optimisation: Good Link-Building Techniques; Bad Link-Building Practices
  • 18. Good Link Building Technique Directory Links : Directories are viewed as being a very positive source of links by a number of search engines. Obviously, some directory listings carry much more weight than others and some directories are hardly worth the effort. Be careful to drip feed your site with directory links at first because it is possible that too many too quickly will see your site penalized until your link profile becomes more natural. Reciprocal Links: You may have read that reciprocal linking is dead. While it is true that Google and possibly other search engines now place much less weight on a profile that is crammed with nothing but reciprocal links they still have a place. Keep the number of reciprocal links you use down to a minimum and certainly don't base your entire link building efforts on this one tactic alone. Unique One-Way Inbound Links: These should pretty much be the staple diet of your link portfolio. An inbound link that is one way does not necessitate the inclusion of a link back to that page on your site. This can help to give your own pages the benefit instead of handing it out to your link partners. The more relevant and the more important that search engines deem the linking site to be the more weight they give that particular link. Cont….
  • 19. Site Wide Links: Again, these should be used sparingly. Gaining a site wide link means that a link to your site or your pages is placed on a number of pages in a site. Search engines are known to give less weight to links that are procured on this basis but it does help to give your portfolio a more rounded appearance. Press Release Links: Writing and submitting a digital press release can provide good links. Many press releases are used by other sites and industries related to your site and they may also be included on some major news websites. Article Links: Writing and submitting articles to article directories can provide a large number of links. Not only can you submit one article to numerous directories but each directory has the potential of generating a number of interested websites. These websites also publish your article (which includes an author bio section with your link). This can be a good way to get authoritative sites to link to you. Community Links: Join forums and include your link in your signature. Post useful comments on other people's blogs and include your link as your username. You should, under no circumstances, spam blogs or forums and only include links on the sites that allow it.
  • 20. Bad Link Building Techniques. FFA Sites: An FFA, or Free-For-All page, is one that allows anybody to post any link they like on the page. Typically they are useless to your cause: search engines ignore them, they will not generate any natural traffic, and they may attract spammers to your doors. Link Farms: A link farm is a page that contains an excessively large number of links. Some say a page with 100 outbound links is a link farm, but in all honesty a page with more than fifty or so links is unlikely to yield much benefit, for SEO or otherwise. Off Topic: Off-topic links are something of a bone of contention. They may offer very slight weight with some search engines, because it is quite possible that natural links from certain websites would point to any number of pages on any topic. They appear in the bad-link section because they offer very little positive benefit, and your efforts would be best spent gaining on-topic links. Cont...
  • 21. Unindexable: Purely from an SEO standpoint, links that cannot be indexed by search engines are completely useless. A search engine spider must be able to follow the link to find your page and provide you with any benefit for that link. Avoid any page that offers to display your link in a frame, or that uses the noindex or nofollow robots meta tags. Conclusion: Your link profile should appear as natural as possible, so vary the good links as much as possible and avoid the bad links. Collect links from as many sources and using as many tactics as possible, and use keyword variants in your anchor text. By following these guidelines you should be able to improve the appearance of your link profile and, therefore, improve your search engine rankings.
  • 22. Competition Analysis: Popular Pages, Entry Pages, Exit Pages, Came From, Keyword Analysis, Visitor Paths, Visit Length, Returning Visits
  • 23. Most Deadly SEO Sins. Optimizing for the wrong keywords: The biggest sin anyone can make in an SEO campaign is to choose the wrong keywords to optimize the site with. If your site is ranking high for a keyword that is not being searched for, then even a very high ranking cannot bring you any traffic. On the other hand, if you are ranking high with the wrong keywords, you would get traffic but it will not convert into transactions. Spamming: Search engines are getting smarter by the day. Over time they have evolved from ignoring spam to a point where they now penalize websites for using spam techniques. The following are the prominent search engine spam techniques you should avoid. Hidden text: Hidden text means using the same color for text on your page as for the background. In order to get a higher keyword density, webmasters sometimes add a lot of keywords as hidden text: they are not visible to the human viewer but can be read by the search engine crawlers in the source code of the page. Most search engines can now detect which pages use such techniques and ignore or ban such sites. Cont...
  • 24. Doorway pages: A doorway page is a web page designed for search engines, intended to rank well for specific keyword phrases and redirect users to a different page on visit. These pages usually rely on frequent repetition of the keyword phrase, and try to "trick" search engines into ranking them well. Most search engines can now detect techniques such as "Meta Refresh" and penalize such sites. If you have used doorway pages on your website and it has not yet been penalized, you stand a good chance of coming out clean by deleting these pages immediately. Doorway domains / multiple domains with the same content: This technique uses URL redirection to display another web address for the same web page, or has several domains showing the same content. In a typical example, the user types in a web address such as www.xyz.com but the URL is redirected to www.abc.com; alternatively, both websites show identical content in the hope that one or the other may rank high in the search engine result pages (SERPs). Most of the time the same party registers these domains, and search engines can easily detect these techniques. Duplicate content: Many site owners try to increase their content base by creating multiple pages of the same content, either on the same site or by copying the same site over several domains they may own. Search engines avoid cluttering their index with duplicate content and penalize sites which do excessive content duplication in order to 'trick' their algorithms. Cont...
  • 25. Cloaking: Cloaking is a technique of serving keyword-stuffed spam pages to search engine spiders by detecting their IP addresses, while serving totally different pages to human visitors. This is different from geo-targeting, where you may show different content to different visitors based on their region or language. The search engines can differentiate between the two and might penalize your site if you are attempting to 'trick' them. If you want to avoid any penalties, the rule of thumb is to show search engines the same content you show to visitors. Keyword spam: Keyword spam is the technique of stuffing lots of keywords all over the page - in the Title tag, meta tags, anchor texts, alt attributes etc. - in an attempt to increase keyword density or accommodate a large number of keywords on the same page. This not only makes the page text sound unnatural to your readers, but you also lose the 'theme' of the page. Search engines rate pages by 'theme' and credit pages whose content is classified into nicely laid-out themes. For best results, you should always focus on optimizing a page for 2-3 themed keywords rather than trying to optimize for lots of keywords. Over-optimized pages with keyword spam may invite search engine penalties. Excessive HTML markup: It is common knowledge that search engines give more credit to text marked as a headline (<h1> / <h2>) or with other attributes such as bold, underlined, colored or italicized text. In an attempt to improve the importance of their text, many webmasters apply excessive HTML markup to their page content and hide the ugly display behind craftily made CSS. Search engines have a fair idea of what balanced markup looks like and penalize sites which use excessive HTML markup in an attempt to 'trick' their algorithms. Cont...
  • 26. Creating search engine roadblocks: If you are developing a website, it helps to know how to avoid creating search engine roadblocks. Your SEO efforts may not get results if your site structure makes it difficult for search engines to index your site. The following are some of the problem areas which create hurdles to indexing. Having a Flash-based site: Sites built in Flash may have a greater aesthetic impact but perform poorly in search engines. Most search engines cannot read text or links embedded in Flash. It is best to limit the use of Flash to content which absolutely requires it, or to make a text-based alternative for your website so that search engine crawlers can index your site content easily. Having text in images: Search engines cannot read text embedded in images. If too much of your content and navigation uses graphics, image maps or image buttons, it would be good either to convert them to simple text or to have an alternate navigation bar at the bottom of your website. It would also help to have appropriate text in the alt attribute for each image. Cont...
  • 27. Having frame-based sites: If you have built your website using frames, you might have a major problem getting content sitting inside frames indexed. Because of the way frames and the content page URLs are structured, most search engines find it difficult to reach the inner page content of your website. It is best to restructure your website and remove frames so that you can get your site into the search engine index. Having free content behind a login: Some sites and forums require visitors to log in before they can reach the real content of the website. If, through lack of awareness or inadvertently, you have placed all your content behind a login, then the search engines cannot index your site, as their crawlers cannot fill in a login and password or 'register' on your website. You might want to restructure your website so that you can show the publicly available content without requiring a login. Poor site inter-linking: Poor site interlinking not only poses hurdles to search engines indexing your site but also makes navigation difficult for your visitors. It is advisable to have a good navigation bar and a site map on your site so that each page is no more than two to three clicks away.
  • 28. Deep directory structure: A deep directory structure is generally difficult for search engines to crawl. While it is not a rule, it would be good if you can keep your directories no more than one or two levels deep, neatly classified into 'themes'. A deep directory structure also makes your inner-page URLs look too long, which discourages other sites from linking to your inner pages. Having little or non-original content: Search engines thrive on text content. They are mainly looking for text content on your website and reward sites which have lots of easily accessible, non-duplicate, original text content. If your site has little content, or non-original content picked up from other sites or your affiliate sites, it is unlikely that you will be rewarded, no matter how much effort you put into optimizing your website. A good way to generate original content is to write nice, keyword-rich, descriptive articles, classified into themes, about your product or service, its usage, benefits, tips and so on. Using session IDs: A session ID is a long string of jumbled characters appearing in the URL of your website which changes on each visit. They are usually used to track a visitor's online shopping cart contents. When a search engine crawler visits your website, your server assigns it a session ID, and the crawler indexes your content and associates it with that session ID. Most search engines have hundreds of bots crawling and re-crawling the web; on each repeat visit of a crawler to your website, it indexes a fresh copy of your content and associates it with a different URL (session ID), cluttering the search engine database with lots of duplicate content. Since search engines are not very good at tackling this problem, they often drop the site from future indexing. If you wish to track a user's session, a better solution is to use cookies instead of session IDs. Cookies are information files stored on the user's computer which perform the same task as a session ID.
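Stripping session IDs out of the URLs a crawler sees can be sketched with the Python standard library. The parameter names in SESSION_PARAMS below are hypothetical; adjust them to whatever your own application uses:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical session-parameter names; adjust to your own application.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid"}

def canonical_url(url):
    """Strip session-ID query parameters so crawlers see one
    stable URL per page instead of many duplicate variants."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_url("http://www.example.com/page.php?id=5&sid=f29a34"))
# http://www.example.com/page.php?id=5
```

In practice the same normalization would be done server-side (serving cookie-based sessions to clients that accept cookies), but the principle is the one shown: one page, one URL.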
  • 29. No effort in getting incoming links: Incoming links to your website are very important and are, to a large extent, responsible for high rankings on search engines. Link popularity determines how important your site is. Link building is an important part of your SEO campaign; no amount of site optimization can get you high rankings if you do not carry out a supporting link-building campaign. Relying on one-time SEO: Search engine optimization is a constant process. You need to update your website's SEO when you add new content, update old content or change the theme of your web pages. Adding new content often requires you to update site interlinking, improve navigation and add links to your site map. Search patterns and keyword phrases also change over time as the search community matures or industry trends change, which calls for fresh keyword research and a fresh SEO perspective on your site. Changes may also become necessary if your current SEO strategy is not paying off. In any case, optimizing your site once and expecting it to give you sustained results without further effort is a mistake. Showing impatience: SEO of a website rarely shows instant results. If your site is already indexed in search engines, it may take about 4-8 weeks for new content to be indexed; new sites may take 4-6 months to get indexed for the first time. It takes time for SEO results to show up. It is easy to think your past efforts were not good enough and be tempted to change your optimization techniques, but it is not advisable to change the optimization of your site pages before you are able to see the results of your previous SEO efforts.
  • 30. Advanced SEO for Beginners: Images or Text?; Links and Link Relevancy; Duplicate Content; Multiple Sites; Unfriendly URLs; Redirects; IP Addresses (C class)
  • 31. Images or Text? Situation: Navigation is based on images, even if these are images of text. Problem: Search engines find links very important and only read text; they like to match the topic of the target page to the text in the link that pointed to the page. Solution: Use text-only navigation. The text and its containing elements can be styled with CSS to make them look like images, be given a background image, or at least be made to look a little fancier. The alternative is to ensure that there is some text-based navigation elsewhere on the page. Benefits: Search engines will be able to match the topic of the target page with its reputation (link text); higher relevancy is better. Ensuring there is text on the page will also help the visually impaired, who use text-only browsers in conjunction with braille-output devices or text-to-speech software.
  • 32. Links, and Link Relevancy. Situation: You want lots of links to your website. Problem: Although links may be great for visitors and search engine spiders, the true value of a link may not be fully clear to you. Visitors are unlikely to click on irrelevant links, and search engines hate them. Reciprocal links (you link to me, and I'll link to you) can be of use when relevant, but carry less weight in the search engines than one-way inbound links. Solution: Avoid link farms (pages of links, for the sake of links), treat reciprocal link requests with caution and stay relevant. Benefits: Spending time and thought on your link profile will ensure natural growth of links. Search engines hate anything that looks artificial, or anything that could be interpreted as search engine manipulation.
  • 33. Duplicate Content. Situation: A webmaster copies content from another website, or sets up multiple sites which are identical or very similar. Problem: Search engines apply duplicate-content filters to the search results. Although increasing market share is a good idea, multiple similar sites are not the way to go about it; efforts put into the secondary sites will be wasted. Solution: Ensure all of the pages on each of your sites are unique. Querying a search engine for an exact match of long strings of text will show you if the content is found elsewhere. If a webmaster does have duplicate pages and wishes to avoid any penalties or filters, the webmaster must choose one page and have the other identical pages redirect to it (redirects are covered later). Benefits: Your efforts are not wasted, and time and money can be saved.
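The "query for an exact match of long strings" check can be approximated locally by fingerprinting overlapping word windows and comparing two pages. A rough Python sketch (the 10-word window is an arbitrary choice of mine, not from the slides):

```python
import hashlib

def content_fingerprints(text, window=10):
    """Hash every run of `window` consecutive words (a simple
    'shingling' sketch); overlapping fingerprints between two
    pages indicate duplicated passages."""
    words = text.lower().split()
    return {
        hashlib.md5(" ".join(words[i:i + window]).encode()).hexdigest()
        for i in range(max(1, len(words) - window + 1))
    }

a = "the quick brown fox jumps over the lazy dog again and again"
b = "intro text the quick brown fox jumps over the lazy dog again and again"
shared = content_fingerprints(a) & content_fingerprints(b)
print(len(shared) > 0)  # True
```

An empty intersection means no long run of words is shared; a large intersection relative to the number of fingerprints means the pages are substantially duplicated.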
  • 34. Multiple Sites. Situation: Following on from duplicate content, there are multiple sites: a single webmaster has several closely themed sites, all inter-linked and hosted on the same IP address. Problem: This is seen as search engine manipulation; the linking structure and multiple sites just aren't natural, and penalties can apply. Solution: Such site structures can work if set out properly. This means that each site must reside on its own IP address where, at least, the C class is different (IP addresses are covered later). Benefits: Increased market share, well-linked and spiderable sites, and no penalties applied.
  • 35. Unfriendly URLs. Situation: The URL (web address) of some of your site's pages looks like http://www.my-site.com/product-info.php?product-id=12345&category=6789&sid=f29a3483270cc10b3783706916216e3a. Problem: Search engine spiders are getting better at following and indexing these types of URLs, but not all of them manage it. The spiders are wary of getting caught in an endless loop of identical pages with thousands of different URLs. Such URLs are also off-putting to your site visitors, who could easily remember http://www.my-site.com/products/autos/index.html. Solution: Use search-engine-friendly URLs. If you are running the Apache web server on a Linux server, use mod_rewrite to translate the URLs into something nicer. Benefits: a higher chance of being indexed by more of the search engines, keywords in your URL, and a higher chance of being clicked on in the SERPs.
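What a rewrite rule does can be illustrated in Python: a friendly path is matched against a pattern and translated back into the parameters the underlying script expects, much as an Apache mod_rewrite rule would. The URL scheme below is hypothetical:

```python
import re

# Hypothetical URL scheme: /products/<category>/<product-id>.html
FRIENDLY = re.compile(r"^/products/(?P<category>[\w-]+)/(?P<product_id>\d+)\.html$")

def route(path):
    """Map a search-engine-friendly path onto the query
    parameters the underlying script expects."""
    m = FRIENDLY.match(path)
    if not m:
        return None
    return {"product-id": m.group("product_id"),
            "category": m.group("category")}

print(route("/products/autos/12345.html"))
# {'product-id': '12345', 'category': 'autos'}
```

The visitor and the spider only ever see the short, keyword-bearing path; the query-string version never appears in links.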
  • 36. Redirects. There are two main redirects, based on the HTTP specification: 301 - Moved Permanently, and 302 - Moved Temporarily. You should use 301 redirects wherever possible. This can be done in your server-side scripting; for example, in PHP: header("HTTP/1.1 301 Moved Permanently"); header("Location: http://www.my-other-site.com/"); Redirects can also be set up in your .htaccess file if you run the Apache web server.
  • 37. IP Addresses (C class). IP addresses are the numerical addresses of computers connected to the internet. For ease of reading they are broken down into four parts, each called an octet, as each contains 8 bits of data when written in binary form. An example may be 209.85.76.114; here the C class is 209.85.76, the first three octets. Any sites hosted on IP addresses beginning 209.85.76. share the same C-class, even though each full address is unique. Because the allocation of IP addresses within a hosting environment is usually configured so that each server has its own C class, a search engine can infer that sites sharing a C-class IP address are likely located on the same server.
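The "first three octets" rule can be expressed as a one-line check in Python (a simplification: classful addressing is more nuanced than this, but it matches the slide's usage):

```python
def same_c_class(ip_a, ip_b):
    """Two IPv4 addresses share a C class when their first
    three octets match; search engines may treat links between
    such hosts as coming from the same server."""
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

print(same_c_class("209.85.76.114", "209.85.76.99"))  # True
print(same_c_class("209.85.76.114", "209.85.77.99"))  # False
```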