  1. SEO
  2. What is SEO? Search engine optimization is a set of techniques that improves the visibility of a site through coding and content. Its main areas are: analytics and web intelligence; keyword and content work; on-page SEO and site architecture (technical); and link development, or off-page SEO (mostly non-technical).
  3. Technical SEO covers: site architecture; providing readable content to search engines; avoiding duplicate-content issues; delivering content faster to users and search engines; and improving user experience.
  4. Site Architecture. A site needs to be designed for its intended target audience. For example, if the site is for kids, then the colors, theme and font sizes should be chosen to attract kids, so that kids also feel comfortable spending time on the site. Mostly business analysts, project managers and designers take care of this part.
  5. How to provide content to search engines: unique, SEO-friendly URLs; meta data that provides details about the page; XML sitemaps for regular pages and for videos; and updating the sitemap whenever there is a change in the site. A sitemap sketch follows.
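     As a rough sketch of such an XML sitemap (the URL, date and values here are placeholders, not taken from a real site):

         <?xml version="1.0" encoding="UTF-8"?>
         <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
           <!-- one <url> entry per indexable page -->
           <url>
             <loc>http://www.egrovesys.com/</loc>
             <lastmod>2012-01-15</lastmod>       <!-- update when the page changes -->
             <changefreq>weekly</changefreq>
             <priority>1.0</priority>
           </url>
         </urlset>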
  6. The URL is important for good ranking. Best practices to follow for URL structure: keep it relevant, compelling and accurate; use hyphens to separate words, not underscores; include the primary keyword (adding secondary keywords is additional value); and limit the URL slug to 3-5 words.
  7. For instance, the following URL has 2 slugs: http://www.egrovesys.com/application-development/prestashop-development.html
  8. Whereas the following URL (see screenshot) has 7 slugs, generated by default from the page's H1. Programmers should not rely on such default functionality and need a solution for this. If your SRS doesn't cover this, clarify with Xavier. Try to limit URLs to 3 levels deep unless clients are specific about this.
  9. Title and Meta description are very important and need to follow SEO guidelines.
  10. The title should not exceed 65 characters and the Meta description should not exceed 160 characters; otherwise, the content provided in the Meta will be truncated when search engines use it. A sketch follows.
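     A minimal sketch of a head section that respects these limits (the wording is illustrative, not taken from a real page):

         <head>
           <!-- Title: keep under 65 characters -->
           <title>PrestaShop Development Services | eGrove Systems</title>
           <!-- Meta description: keep under 160 characters -->
           <meta name="description" content="Hire PrestaShop developers from eGrove for development, migration, customization, themes and modules." />
         </head>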
  11. The Open Graph protocol (OG) helps control how a link appears on Facebook when a page is shared or liked there. So OG tags have value on Facebook, but none in search engines. Since Facebook is growing, more and more clients are interested in having OG implemented on their sites; I recommend you raise the question if the SRS does not mention it. A sketch of the basic tags follows.
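     A sketch of the basic Open Graph tags in the head (the property names follow the OG specification; the content values are illustrative):

         <head>
           <meta property="og:title" content="PrestaShop Development Services" />
           <meta property="og:type" content="website" />
           <meta property="og:url" content="http://www.egrovesys.com/" />
           <meta property="og:image" content="http://www.egrovesys.com/logo.png" />
           <meta property="og:description" content="PrestaShop development, migration and customization services." />
         </head>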
  12. Avoiding duplicate content, titles and descriptions. Duplicate content is one of the serious issues that programmers need to avoid. Some example sources of duplicates: http://www.example.com vs. http://example.com; www.example.com vs. www.example.com/index.html; www.example.com vs. www.example.com?session-id=1234; www.example.com/1 vs. www.example.com/1/.
  13. 301 redirect. A 301 redirect helps when a page is no longer required and can be permanently redirected; the advantage of this practice is that there is no loss of link value. For example, egrovesys.com and egrovesys.com/index.php render the same page (see the screenshots in the next slide).
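     Assuming the site runs on Apache, one common way to implement such a redirect is a sketch like this in .htaccess (this is one approach among several, not the only one):

         # permanently redirect the duplicate URL to the preferred one
         Redirect 301 /index.php http://www.egrovesys.com/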
  14. The solution: one of the two URLs can be 301-redirected to the other.
  15. Canonical element. The canonical element helps avoid duplicate issues arising from URL generation with session ids, query parameters and tracking codes. Example: pages with session ids generate duplicate-title issues, as with http://www.thisoldhouse.com/toh/article/0,,1147475,00.html and http://www.thisoldhouse.com/toh/article/0,,1147475,00.html?xid=hinewsletter081908-47-skills. To resolve this, implement a canonical link back to the preferred URL by adding the canonical element within the head section: <link rel="canonical" href="http://www.thisoldhouse.com/toh/article/0,,1147475,00.html"/>. The business analyst and programmer can add this feature as an additional scope in the project development.
  16. Pagination handling. Pages in a series, or galleries, normally generate duplicate title and description issues, which can be avoided by using rel="next" and rel="prev". Let us see this in detail in the following screenshots.
  17. Screenshot: http://www.realsimple.com/food-recipes/tools-products/14-surprising-uses-for-your-microwave-10000001035388/index.html
  18. Screenshot: http://www.thisoldhouse.com/toh/article/0,,451111,00.html
  19. Screenshot: Egrovesys.com portfolio page
  20. Pagination screenshots explained. The first page contains only rel="next" markup and no rel="prev". Pages two through the second-to-last page should be doubly linked with both rel="next" and rel="prev" markup. The last page contains only rel="prev" markup, not rel="next". The rel="next" and rel="prev" values can be either relative or absolute URLs (as allowed by the <link> tag); and if you include a <base> link in your document, relative paths resolve according to the base URL. rel="next" and rel="prev" only need to be declared within the <head> section, not within the document <body>. A sketch for a three-page series follows.
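     Putting those rules together, a sketch of the <head> markup for a three-page series (the URLs are illustrative):

         <!-- Page 1: only rel="next" -->
         <link rel="next" href="http://www.example.com/article?page=2" />

         <!-- Page 2: both rel="prev" and rel="next" -->
         <link rel="prev" href="http://www.example.com/article?page=1" />
         <link rel="next" href="http://www.example.com/article?page=3" />

         <!-- Page 3 (last): only rel="prev" -->
         <link rel="prev" href="http://www.example.com/article?page=2" />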
  21. Noindex meta and robots.txt. The "noindex" meta tag is useful when we don't want a page indexed. robots.txt can be used to block any particular section of a site from crawling; however, if a page is already indexed, robots.txt will have no effect on it. So wherever possible, use "noindex". A sketch of both follows.
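     A sketch of both mechanisms (the blocked path is illustrative):

         <!-- in the <head> of a page that should not be indexed -->
         <meta name="robots" content="noindex" />

         # robots.txt: block a section of the site from crawling
         User-agent: *
         Disallow: /admin/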
  22. Page load time. Page load time is one of the factors that influence whether users stay and transact on a site. Some of the areas where programmers can apply their intelligence: Avoiding excessive CSS in the head: placing CSS inside the head should be avoided so that spiders reach the text quickly. Example: http://www.health.com/health/static/buzz/contests_and_giveaways.htm. An external CSS file is recommended to handle such issues, e.g. <link rel="stylesheet" type="text/css" href="externalcss.css" />. See the screenshot in the next slide.
  23. Avoiding excessive JS in the head. Pages that contain excessive JavaScript need attention from the development team to find the possibility of moving it either to the bottom of the page or to an external file. Google and other search engine spiders are more advanced nowadays and can detect page text even when there is excessive JavaScript, but the time required to reach the text is the important factor. JavaScript is an unnecessary area for spiders, and excessive JavaScript in the head consumes spider time for no reason. So delivering the required text to spiders quickly, by eliminating lengthy JavaScript ahead of the body text, will improve ranking. Example: http://www.health.com/health/anxiety. See the screenshot in the next slide.
  24. Avoiding excessive JS in the body. It is recommended to reduce the JS in the body to help spiders crawl the page quickly. Page load time also improves when JavaScript is handled properly: in order to load a page, the browser must parse the contents of all <script> tags, which adds time to the page load. By minimizing the amount of JavaScript needed to render the page, and deferring parsing of unneeded JavaScript until it needs to be executed, we can reduce the initial load time of the page. Example: http://www.health.com/health/appendicitis. See the screenshot in the next slide, and the sketch below.
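     One way to defer parsing, as a minimal sketch (the file name is illustrative): load scripts with the defer attribute, or place them just before </body>, so HTML parsing is not blocked:

         <!-- fetched without blocking; executes only after the document is parsed -->
         <script src="site.js" defer></script>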
  25. Avoiding excessive whitespace. Minifying code is recommended, which refers to eliminating unnecessary spaces, newline characters, comments, etc. Example: http://www.health.com/health/library/mdp/0,,d04537t1,00.html. A before/after sketch follows.
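     A before/after sketch of minification on a small HTML fragment:

         <!-- before: comments, newlines and indentation -->
         <div class="box">
             <p>Hello</p>
         </div>

         <!-- after minifying -->
         <div class="box"><p>Hello</p></div>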
  26. Following heading rules. H1 should come first in the source code and should be the first header tag parsed by any search engine crawler; do not precede the H1 with any other header tag. You should have only one (1) H1 tag per page. Thereafter, you can have as many H2-H6 tags as necessary to lay out the page and its content, but use a logical sequence and do not "style" your text via header tags in your CMS. The intended hierarchy:
      H1
        H2
          H3
          H3
            H4
        H2
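     The same hierarchy expressed in markup, as a minimal sketch (the heading texts are placeholders):

         <h1>Page topic</h1>
         <h2>First section</h2>
         <h3>Subsection</h3>
         <h3>Subsection</h3>
         <h4>Detail</h4>
         <h2>Second section</h2>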
  27. Custom 404 error page. HTTP requests are expensive, so making an HTTP request and getting a 404 "not found" error degrades the user experience. Some sites have helpful and creative 404 error pages to cover for the bad experience, but such pages still waste server resources (database, etc.). Particularly bad is when a link to an external JavaScript file is wrong and the result is a 404. It is good practice to keep 404 errors to a minimum through other means, such as blocking unnecessary URL generation; as a final resort, 301 redirects can be used, but such redirects should go to the main page or another related page. Google maintains that 404 errors won't impact a site's search performance and can be ignored if we're certain that the URLs should not exist on our site. It is important to make sure that these and other invalid URLs return a proper 404 HTTP response code, and that they are not blocked by the site's robots.txt file. A configuration sketch follows.
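     Assuming an Apache server, a minimal sketch for serving a custom 404 page while still returning the correct 404 status code (the file name is illustrative):

         # serve a custom error page; the local path keeps the 404 response code
         ErrorDocument 404 /custom-404.html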
  28. Bad design: taking advantage of their brand value.
  29. Redirected to the home page: OK.
  30. Custom 404 page: good.
  31. Better: designed with a search option.
  32. Ajax implementation. Ajax implementation on a site needs to follow Google's guidelines for displaying AJAX URLs in the search results. For example, www.egrovesys.com/portfolio#1 should become www.egrovesys.com/portfolio#!1.
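     Under Google's AJAX crawling scheme referenced here, the #! signals that the crawler may request an HTML snapshot of the page state through the _escaped_fragment_ query parameter, roughly:

         pretty URL shown to users:   www.egrovesys.com/portfolio#!1
         URL the crawler requests:    www.egrovesys.com/portfolio?_escaped_fragment_=1

     The server is then expected to return an HTML snapshot of the Ajax state for that request.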
  33. Q and A
  34. Thank you. Part II will cover: combining images; browser caching; lossless compression of images; inline JavaScript; rich snippets for ratings and reviews; moving a site to a new host; and Ajax implementation.
