Technical SEO Tips for Web Developers

Singsys provides international, national, and local SEO, SMO, and SEM services. Our highly skilled and dedicated team of SEO professionals helps you gain profit and make your website popular all over the world on an affordable budget.

Transcript

  • 1. Technical SEO Tips for Web Developers. Richa Bhatia, Singsys Pte. Ltd.
  • 2. URL structure best practices: The optimal format of any URL is http://www.website.com/category-keyword/subcategory-keyword/primary-keyword.html. URL structure should be optimized during the development phase of any website; this saves time for web developers as well as webmasters. Use hyphens to separate words: Google treats tennis-ball (hyphen) as two separate words, whereas tennis_ball (underscore) is treated as a single word by search engines.
  • 3. URL structure best practices: URLs should be static and descriptive, within 3-5 words in length. Include the proper category in your URL structure: if you're thinking of using /114/cat223/, go with /brand/reebok instead. For pages created dynamically through a CMS, provide the option of including keywords in the URL. (In PHP you can build proper URLs instead of exposing IDs in the URL. Consider using the rewrite module of the Apache server to rewrite those dynamic URLs into static-looking ones; there are also PHP scripts that will turn the address into a readable page.) See http://net.tutsplus.com/tutorials/other/using-htaccess-files-for-pretty-urls/
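The rewrite approach above can be sketched in an .htaccess file. This is a minimal example, assuming Apache with mod_rewrite enabled; the /brand/ path and the product.php script with its brand parameter are hypothetical names chosen to match the slide's /brand/reebok example.

```apache
# Turn a keyword-friendly URL like /brand/reebok into the
# underlying dynamic request product.php?brand=reebok.
RewriteEngine On
RewriteRule ^brand/([a-z0-9-]+)/?$ product.php?brand=$1 [L,QSA]
```

With this rule in place, visitors and search engines see only the static-looking /brand/reebok URL, while the server still runs the dynamic script behind the scenes.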
  • 4. URL structure best practices: Avoid generic page names like page1.php; use a proper keyword instead. Don't make URLs case sensitive (for example, SINGSYS.COM should redirect to singsys.com). The optimal structure for a website looks like a pyramid: it keeps the number of links between the homepage and any given page to a minimum.
  • 5. Pyramid structure (diagram: home page at the apex, category and detail pages below)
  • 6. Sitemap: A sitemap is a file that sits on your server and informs search engines about the structure and hierarchy of your website. It is an XML file (e.g., http://www.website.com/sitemap.xml) that contains every page URL of your site.
  • 7. Sitemap: Always keep your website's sitemap up to date; every time you make a change to your website, generate and publish a new one. Sitemaps are an important way of communicating with the search engines. Types of sitemaps: XML, HTML, video, news, mobile, image. Don't forget to submit your sitemap to the major search engines.
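A minimal XML sitemap, following the sitemaps.org protocol, might look like this. The URLs and dates are placeholders reusing the slides' example domain; lastmod, changefreq, and priority are optional elements of the protocol.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.website.com/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.website.com/category-keyword/primary-keyword.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```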
  • 8. robots.txt: A robots.txt file tells search engine bots which pages they are not allowed to crawl. This file goes in the root of your site. Include in robots.txt those pages you do not wish to be indexed in Google search results, e.g., the privacy policy page or the admin page. The filename of robots.txt is case sensitive: use "robots.txt", not "Robots.TXT". A meta robots tag with the parameters "noindex, follow" is the best way to restrict indexation.
  • 9. robots.txt: There are a few ways to block search engines from accessing a domain. Block with robots.txt: this tells the engines not to crawl the given URL, but they may keep the page in the index and display it in results. Block with meta noindex: this tells engines they can visit, but are not allowed to display the URL in results; this is the recommended method. Block by nofollowing links: this is almost always a poor tactic, because the search engines can still discover pages in other ways: through browser toolbars, links from other pages, analytics, and more.
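The two recommended blocking methods above can be sketched as follows. The /admin/ and /privacy-policy/ paths are example paths matching the pages the previous slide suggests excluding, not paths from any real site.

```
# robots.txt, placed at the site root
User-agent: *
Disallow: /admin/
Disallow: /privacy-policy/
Sitemap: http://www.website.com/sitemap.xml
```

And the meta robots tag, placed in the head of an individual page you want kept out of results while still letting crawlers follow its links:

```html
<meta name="robots" content="noindex, follow">
```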
  • 10. Clean, valid HTML coding: Clean and valid HTML means the search engines will have no problem crawling and interpreting your web pages. Validate your code using the W3C validator, and ensure there are no coding errors in your web pages.
  • 11. Keep web page size low: A small web page loads much faster than a very large one. Put CSS code into a .css file and JS code into a .js file and include them at the top; this keeps your page source clean and small, and search engines will load the pages that much faster. Users and customers leave a page if it takes longer than 10 seconds to load, so you can lose half of your visitors by overlooking your page load time. Huge web pages also decrease Google's crawl rate. Maintain your CSS and JavaScript files externally: placing your CSS/JavaScript in external files helps the crawler find your content faster and also decreases page load time.
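Referencing external files rather than inlining styles and scripts looks like this; the file paths are placeholders.

```html
<head>
  <!-- External stylesheet and script keep the page source small -->
  <link rel="stylesheet" href="/css/style.css">
  <script src="/js/main.js"></script>
</head>
```

The browser can then cache style.css and main.js across pages, so repeat visits only download the HTML itself.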
  • 12. Images: Search engines can't read images. Include an alt attribute to define what the image is about; the alt text is what is shown for an image in case it can't be displayed. Make sure your images load quickly, and write an image title for each image. Include your main keywords in the img alt text and image title.
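An img tag with both attributes filled in might look like this; the file name and text reuse the tennis-ball example from the URL slides and are placeholders.

```html
<img src="/images/tennis-ball.jpg"
     alt="Yellow tennis ball on a grass court"
     title="Tennis ball">
```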
  • 13. Alt tag and title tag (example slide)
  • 14. Header Tags
  • 15. Header tags: Headings are defined in a web page with the H1 to H6 tags. Place keywords in your heading tags for better SEO results: search engines pay special attention to the words in your headings because they expect headings to indicate the page's main topics. Use only one H1 tag per page.
  • 16. Header tags Format : <h1>Main Heading</h1> <h2>Secondary Heading 1</h2> <h3>Sub-section of the secondary heading 1</h3> <h2>Secondary Heading 2</h2> <h3>Sub-section of the secondary heading 2</h3>
  • 17. Anchor text: Anchor text is the clickable text that users see as a result of a link; it is placed within the anchor tag <a href="..."></a>. Avoid writing generic anchor text like "page", "article", or "click here". Include your main keywords in the anchor text.
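The difference can be seen side by side; the URL and anchor phrase are placeholders in keeping with the earlier examples.

```html
<!-- Descriptive, keyword-bearing anchor text -->
<a href="http://www.website.com/tennis-ball.html">tennis ball buying guide</a>

<!-- Generic anchor text to avoid -->
<a href="http://www.website.com/tennis-ball.html">click here</a>
```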
  • 18. Canonical issue: http://www.yoursite.com and http://yoursite.com are two different pages to Google, which results in a duplicate content issue. A 301 is a permanent redirect: Google will remove the old URLs from its index after a 301 redirect. A 302 is a temporary redirect. 301 redirects pass between 90 and 99 percent of their value, whereas 302 redirects pass almost no value at all. Avoid duplicate pages by redirecting the non-www version over to the www version; this can be done using .htaccess.
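The non-www to www redirect described above can be sketched in .htaccess; this assumes Apache with mod_rewrite, and yoursite.com stands in for the real domain.

```apache
# Permanently (301) redirect http://yoursite.com/... to
# http://www.yoursite.com/... to avoid duplicate-content issues.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

The R=301 flag is what makes this a permanent redirect, so the link value passes to the www version as the slide describes.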
  • 19. Canonical issue: Common canonical variants of the same page: http://www.site.com, http://site.com, http://www.site.com/index.php, http://site.com/index.php
  • 20. Create an informative error page (404 page): Design and include a proper 404 error page on your website. The 404 page should contain the header and footer menus as well, so that users can navigate directly to any page they wish.
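On Apache, a custom 404 page can be wired up with a single .htaccess directive; the /404.html path is a placeholder for wherever the designed error page lives.

```apache
# Serve the custom error page for any request that returns 404
ErrorDocument 404 /404.html
```

Note that the error page should be served with an actual 404 status, not a redirect, so search engines do not index the missing URLs.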
  • 21. Google page speed: Page speed is one of the most important ranking factors in SEO. Fast, optimized web pages lead to higher visitor engagement, retention, and conversions. SEO tool to check page speed: https://developers.google.com/speed/pagespeed/. Ways to improve site speed: use image sprites. An image sprite is a collection of images combined into a single image. A web page with many images can take a long time to load and generates multiple server requests; using image sprites reduces the number of server requests and saves bandwidth. Logos and all other structural images should be packed into a CSS sprite.
  • 22. Combine images into a CSS sprite (single image)
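The sprite technique from the previous slide can be sketched in CSS. Here icons.png is a hypothetical single image holding two 16x16 icons side by side; the class names are placeholders.

```css
/* One background image shared by all icons: one HTTP request */
.icon {
  width: 16px;
  height: 16px;
  background-image: url('/images/icons.png');
}
/* Shift the background to reveal the icon at each offset */
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -16px 0; }
```

Each element shows a different region of the same downloaded image, so adding more icons adds no extra server requests.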
  • 23. Google page speed: Minify JavaScript: minification compresses your JavaScript code; when you minify, the comments and spaces are deleted to improve performance. Minify CSS: CSS can slow your website down. The solution is to keep an original copy for the developers to work in, then have them minify (remove all the unnecessary bits) the code for the live website; you can expect a 20-30% saving on average. SEO tool for this: http://www.csscompressor.com/
  • 24. Other points: Add social sharing buttons. Keep the same URL structure throughout the site. Check compatibility between browsers. Avoid using a lot of Flash on a web page: search engines do not crawl Flash, and HTML5 can be used in its place. Avoid duplicate content on the website: you can solve duplicate content problems by using "noindex, follow" in your robots meta tag, using a 301 redirect, or with a robots.txt file.
  • 25. Other points: Check for broken links and images. If your website has broken links pointing to pages that don't exist, those pages will probably never be found; search engines also penalize sites with many broken links, so don't forget to use the W3C link validator tool to find them. Optimize your title and meta tags the correct way.
  • 26. Contact us via info@singsys.com, phone: 65613900
  • 27. Thank You!