Optimization techniques in 2013
 



Keyword optimization:

• Importance level: 1/5 (it doesn't affect ranking, but it helps in making pages unique).
• Do's: Limit it to 3 to 10 keywords. It can make the pages unique and help the search engine pick up the right page for a keyword.
• Don'ts: Don't stuff it with keywords.
• Quality check: Check the webmaster consoles and look for duplicate meta keywords. Also make sure that the targeted pages have a proper title.
• Progress check: See how many meta descriptions are fixed. Some of the pages may go through many changes, so we can keep track of that too.

So how can you choose the right keyword for your site? Many tools are available today for choosing keywords, but I think the best tool is the Google Keyword Tool, https://adwords.google.com

Related sites:
• http://tools.seobook.com/keyword-tools/seobook/
• http://www.wordtracker.com/
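To make the "3 to 10 keywords" advice concrete, here is a minimal sketch of a meta keywords tag. The keyword values and the surrounding <head> are illustrative examples, not values from the original document.

Code Sample:
<head>
<meta name="keywords" content="seo, optimization techniques, keyword research, on-page seo">
</head>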
Title Optimization: A title tag is the main text that describes an online document. It is the single most important on-page SEO element (behind overall content). Title tags, technically called title elements, define the title of a document and are required for all HTML/XHTML documents.

Code Sample:
<head>
<title>Example Title</title>
</head>

Optimal Format:
Primary Keyword - Secondary Keyword | Brand Name
or
Brand Name | Primary Keyword and Secondary Keyword

Best Practices: Keep the title to fewer than 70 characters, as this is the limit Google displays in search results.

The title appears in three key places:
• Browser: Title tags show up both at the top of the browser chrome and in the applicable tabs.
• Search Result Pages: Title tags also show up in search engine results.
• External Websites: Many times, external websites (especially social media sites) will use the title of a web page as its link anchor text.
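As a hedged illustration of the "Primary Keyword - Secondary Keyword | Brand Name" format described above, the keyword phrases and brand name below are made-up examples rather than values from the document:

Code Sample:
<head>
<title>Keyword Optimization - SEO Techniques | Example Brand</title>
</head>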
Meta Description Optimization: Meta descriptions, which are HTML attributes that provide concise explanations of the contents of web pages, are commonly used by search engines on search result pages to display preview snippets for a given page.

Code Sample:
<head>
<meta name="description" content="This is an example of a meta description. This will often show up in search results.">
</head>

Optimal Length for Search Engines: Roughly 155 characters.

Good meta descriptions can earn better click-through rates on search engine result pages, which in turn helps ranking. If a site is clicked more often in the results, search engines give it more value. For example, when people see Wikipedia in the result pages they click on it even when it is ranking low, because people like Wikipedia; thus search engines promote Wikipedia above others. You can include an email id or phone numbers in the meta description and show your strengths there. Look at the keywords that are bringing traffic to a page, then search with those keywords to see whether the description that appears in the search engine is proper.

Avoid duplicate meta description tags: as with title tags, it is important that the meta description on each page be unique.
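To illustrate the advice above about length and contact details, here is a hedged sketch of a description close to the recommended length; the brand name, phone number and email address are invented placeholders:

Code Sample:
<head>
<meta name="description" content="Example Brand offers SEO audits, keyword research and on-page optimization. Call 000-000-0000 or email info@example.com for a free consultation.">
</head>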
Related Tools:
• http://www.seomoz.org/seo-toolbar (the mozBar makes it easier to see relevant SEO metrics as you surf the web)
• http://www.w3schools.com/tags/tag_meta.asp (W3Schools documentation on meta tags, including meta descriptions)
• http://www.bing.com/community/site_blogs/b/webmaster/archive/2009/07/18/head-s-up-on-lt-head-gt-tag-optimization-sem-101.aspx
• http://www.seochat.com/seo-tools/meta-tag-generator?tool=3/?tool=3 (if you're new to web development and search engine optimization, you may find this tool useful to ensure that your meta tags are correctly formed)
• Meta Tag Analyzer (http://www.seochat.com/seo-tools/meta-analyzer/); this tool will analyze a website's meta tags. Analyzing a competitor's keyword and description meta values is a good way to find ideas for key terms and more effective copy for your site.
• http://tools.seobook.com/meta-medic/

Meta content-language tag: If you are directing your website's contents toward a specific language-speaking audience, you can specify the language of your content using the <meta> tag's content-language attribute. For example, for a target audience of American English speakers, you would add the following tag to the <head> section of all your pages:

<meta http-equiv="content-language" content="en-us" />

Implementation of ISO language codes: According to the W3C recommendation, you should declare the primary language for each web page with the lang attribute inside the <html> tag, like this:

<html xmlns="http://www.w3.org/1999/xhtml">

After implementing the lang attribute, it should look like:

<html lang="en" xmlns="http://www.w3.org/1999/xhtml">

For more information please visit: http://www.w3.org/International/articles/language-tags/

No Index or Use robots.txt

Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. It works like this: a robot wants to visit a website URL, say http://www.example.com/welcome.html. Before it does so, it first checks for http://www.example.com/robots.txt, and finds:

User-agent: *
Disallow: /

The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.

So how can you create robots.txt? You can use http://tools.seobook.com/robots-txt/generator/ to create the robots.txt file.

The txt file looks like:

User-Agent: *
Disallow:
Disallow: /accountantsinlondon.com/partners

After creating this file you need to upload it to the root folder, called public_html.
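The section heading above also mentions "no index", but the tag itself is not shown in the original text. As a hedged complement, this is the standard robots meta tag that asks compliant crawlers not to index a single page; the surrounding <head> is only illustrative:

Code Sample:
<head>
<meta name="robots" content="noindex">
</head>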
Keyword density: Keyword density is a very important factor. However, it's more important to ensure that your target keywords appear in key places, such as:

• Page title
• H1
• H2
• H3
• Main body text
• Internal and external links pointing to your page
• Image alt tags
• etc.

Also, don't forget that content is for users as well as search engines, so forcing keywords into the content to increase density could devalue it. We write content for humans, not for Google, so you have to maintain a proper density or it will look spammy. It's therefore very important to check the keyword density to avoid an over-optimization penalty.

So what is the ideal percentage for keyword density?
Answer: one-word density 1% to 6%; two-word density 5% to 20%. Don't go above 20% density for any keyword; it should stay between 3% and 10%.

So how can you calculate keyword density?

Keyword Density = (Nkr / Tkn) x 100

Where:
Density = your keyword density
Nkr = how many times you repeated a specific keyword
Tkn = total words in the analyzed text

Keyword density = (Nkr / Tkn) x 100
= (15 / 500) x 100
= 0.03 x 100
= 3
Keyword density = 3%

Key-phrase Density Check

Density = (Nkr x (Nwp / Tkn)) x 100

Where:
Density = your key-phrase density
Nkr = how many times you repeated a specific key-phrase
Nwp = number of words in your key-phrase
Tkn = total words in the analyzed text

So, again, if we take the "Waffles in Delaware" example: there are three words in the key-phrase, and it has been used three times amid a total word count of 500 words.

Density = (Nkr x (Nwp / Tkn)) x 100
= (3 x (3 / 500)) x 100
= (3 x 0.006) x 100
= 0.018 x 100
Density = 1.8%

Useful Tools:
• http://www.keyworddensity.com/search_engine_optimization/keyword_density.cgi
• http://www.live-keyword-analysis.com/

XML Sitemap:

If you have a website, you should be using an XML Sitemap to help improve your Internet visibility. What is an XML Sitemap? It is a simple, effective way for you to give the search engines a list of all the URLs you want them to crawl and index. If the search engines don't find your site or specific pages on it, your prospects won't either!

So how will you create an XML sitemap for your site? It's very easy: use http://www.xml-sitemaps.com/ to get an XML sitemap for your site.

How to submit the XML sitemap? After creating the sitemap, upload the file to your main (root) folder: public_html for a Linux server, or www for a Windows server. Once you've created a sitemap in an accepted format, you can submit it to Google using Google Webmaster Tools: https://www.google.com/webmasters/tools

XML sitemap for Blogger: http://ctrlq.org/blogger/

Useful Tools:
• http://www.xml-sitemaps.com/
• http://ctrlq.org/blogger/
• https://www.google.com/webmasters/tools
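The original text does not show what a generated sitemap file contains, so here is a minimal sketch of a sitemap.xml with a single URL, following the sitemaps.org protocol; the domain, date, and optional tags are hypothetical examples:

Code Sample:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>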
Broken links

Basically, broken links are of two types: internal broken links and external broken links. Internal broken links are quite likely a quality issue; always fix those as soon as possible. External broken links may not be quite so important, but it's a good idea to run a check every few months and fix what you can.

These are easy to fix: use a crawler like Xenu or Screaming Frog to find internal and outgoing link issues, and fix them. For incoming links (backlinks), register your website with Google Webmaster Tools and check Diagnostics -> Crawl Errors. There you will see who has broken links pointing to your website.

Useful Tools:
• http://home.snafu.de/tilman/xenulink.html
• http://www.brokenlinkcheck.com/
• http://validator.w3.org/checklink
• If you have WordPress: http://wordpress.org/extend/plugins/broken-link-checker/

Canonical issues

It is important to represent the content with one URL rather than many URLs, because multiple URLs divide the link strength. Use .htaccess, 301 redirects or other redirection methods to consolidate the URLs into one. You can also use the canonical tag to make life easier, as shown in the sketch below.
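As a hedged illustration of the canonical tag mentioned above, the preferred URL below is a made-up example; the tag goes in the <head> of each duplicate version of the page and points to the one URL you want search engines to treat as canonical:

Code Sample:
<head>
<link rel="canonical" href="http://www.example.com/preferred-page.html" />
</head>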
I think I am really an amateur in this field. I created this document after long research; I hope it will help people like us who want to build their career in the SEO field.

Sudip Nandy