Benchmarking Your Web Site


    1. 1. Benchmarking Your Web Site Brian Kelly UKOLN University of Bath Bath, BA2 7AY Email [email_address] URL http://www.ukoln.ac.uk/ UKOLN is supported by:
    2. 2. Contents <ul><li>Introduction </li></ul><ul><li>Why Benchmark? </li></ul><ul><li>What Is Benchmarking? </li></ul><ul><li>Tools For Testing Web Sites </li></ul><ul><li>From Testing To Benchmarking </li></ul><ul><li>Case Studies: WebWatch Surveys Of UK University Sector </li></ul><ul><li>Commercial Approaches </li></ul><ul><li>Discussion </li></ul>
    3. 3. What Do You Think Benchmarking Is? <ul><li>Could you turn to your neighbour and ask: </li></ul><ul><ul><li>What do you think benchmarking is? </li></ul></ul><ul><ul><li>How do you think benchmarking can help you? </li></ul></ul>
    4. 4. Benchmarking: A Definition <ul><li>Benchmarking is about identifying and measuring best practice processes that work elsewhere and then emulating them. </li></ul><ul><li>The aim is to reduce duplication by learning from others who have already found the solution. </li></ul><ul><li>It is about: </li></ul><ul><ul><li>Understanding your weaknesses </li></ul></ul><ul><ul><li>Comparison with your peers </li></ul></ul><ul><li>Note that best practices are constantly evolving. </li></ul>
    5. 5. Aims Of This Talk <ul><li>By the end of the session you should: </li></ul><ul><ul><li>Be able to benchmark your Web site in relation to other sites in your community </li></ul></ul><ul><ul><li>Have seen examples of use of auditing and evaluating tools </li></ul></ul><ul><ul><li>Have considered other types of benchmarking activity available </li></ul></ul><ul><ul><li>Be in a position to decide whether to adopt this methodology in your organisation </li></ul></ul>
    6. 6. Approaches To Benchmarking <ul><li>There are a number of approaches to benchmarking of Web sites: </li></ul><ul><li>Manual Benchmarking </li></ul><ul><ul><li>Use of manual techniques such as questionnaires, usability studies, etc. </li></ul></ul><ul><li>Automated Benchmarking </li></ul><ul><ul><li>Automated benchmarking makes use of software tools to support benchmarking </li></ul></ul><ul><ul><li>Approaches can include: </li></ul></ul><ul><ul><ul><li>Dedicated benchmarking software products, typically running on desktop PC </li></ul></ul></ul><ul><ul><ul><li>Use of benchmarking services available on the Web </li></ul></ul></ul>
    7. 7. Benchmarking Approaches <ul><li>Manual: </li></ul><ul><ul><li>End users involved in process </li></ul></ul><ul><ul><li>Can receive feedback which cannot be obtained using software </li></ul></ul><ul><ul><li>Can be time-consuming and expensive </li></ul></ul><ul><li>Automated: </li></ul><ul><ul><li>Can be less expensive </li></ul></ul><ul><ul><li>Can scale to many thousands of resources </li></ul></ul><ul><ul><li>Restricted to aspects which software can process </li></ul></ul>This talk deals mainly with automated benchmarking, using Web-based tools
    8. 8. Web Testing Services <ul><li>Many Web-based services are available for reporting on various aspects of Web sites, including: </li></ul><ul><ul><li>HTML compliance </li></ul></ul><ul><ul><li>Browser compatibility </li></ul></ul><ul><ul><li>Broken Links </li></ul></ul><ul><ul><li>Load Time </li></ul></ul><ul><ul><li>Accessibility </li></ul></ul><ul><ul><li>Link popularity </li></ul></ul><ul><ul><li>… </li></ul></ul>
    9. 9. Example - NetMechanic <ul><li>NetMechanic : </li></ul><ul><ul><li>A Web-based service for checking Web sites </li></ul></ul><ul><ul><li>Various functions available for free </li></ul></ul><ul><ul><li>Additional functions such as more comprehensive testing are licensed </li></ul></ul>http://www.netmechanic.com/
    10. 10. Example – Doctor HTML <ul><li>Doctor HTML: </li></ul><ul><ul><li>A Web page analysis tool </li></ul></ul><ul><ul><li>Free for testing individual pages </li></ul></ul><ul><ul><li>A licensed version can be installed locally for checking entire Web sites </li></ul></ul>http://www2.imagiware.com/RxHTML/
    11. 11. Example - Bobby <ul><li>Bobby : </li></ul><ul><ul><li>A Web-based accessibility checker </li></ul></ul><ul><ul><li>Can test individual pages </li></ul></ul><ul><ul><li>A licensed downloadable version can check entire Web sites </li></ul></ul>http://bobby.watchfire.com/
    12. 12. Example – Link Popularity <ul><li>Link popularity: </li></ul><ul><ul><li>An indication of the popularity of a Web site </li></ul></ul><ul><ul><li>Can be obtained by analysing search engines such as AltaVista and Google </li></ul></ul>http://www.linkpopularity.com/
    13. 13. Example – Server Analysis <ul><li>Netcraft’s server analysis provides details of: </li></ul><ul><ul><li>Web server software </li></ul></ul><ul><ul><li>Operating system environment </li></ul></ul><ul><ul><li>Server availability (limited) </li></ul></ul><ul><ul><li>Nos. of servers in domain </li></ul></ul>http://www.netcraft.com/
    14. 14. Example – Server Analysis <ul><li>The University of Dundee provides an HTTP analysis tool: </li></ul><ul><ul><li>Analyses HTTP headers </li></ul></ul>http://www.somis.dundee.ac.uk/general/wizards/fetchhead.html
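This kind of header inspection can be sketched in a few lines of Python. `fetch_headers` issues a real HEAD request (so it needs network access when run); `summarise` is my guess at the fields such a survey would typically record, not the Dundee tool's actual behaviour.

```python
import urllib.request

def fetch_headers(url):
    """Issue a HEAD request and return the response headers as a dict."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return dict(resp.headers)

def summarise(headers):
    """Pick out fields a server survey might record (an assumed selection)."""
    return {field: headers.get(field, "(not disclosed)")
            for field in ("Server", "Content-Type", "Last-Modified")}
```

The `Server` header is what reveals the Web server software and, often, the operating system environment that services such as Netcraft report on.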
    15. 15. Usability <ul><li>Usability normally requires manual testing </li></ul><ul><li>However, automated support tools such as WebSAT are also available </li></ul>http://zing.ncsl.nist.gov/WebTools/WebSAT/operation.html
    16. 16. From Testing To Benchmarking <ul><li>These tools are typically used for testing individual pages on one’s own Web site </li></ul><ul><li>However, applying the tools across other Web sites allows: </li></ul><ul><ul><li>Comparisons to be made with competitors and collaborators </li></ul></ul><ul><ul><li>Examples of best practices to be found and lessons to be learnt from them </li></ul></ul><ul><ul><li>Examples of problems to be found and mistakes to be avoided </li></ul></ul><ul><ul><li>Trends to be monitored by repeat surveys </li></ul></ul><ul><ul><li>Limitations of tools to be found by testing across a wide range of Web site environments </li></ul></ul>
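The shift from testing one site to benchmarking many is essentially a loop: apply the same check to every site in the community, then order the results for comparison. A minimal sketch, in which both `check` and the stand-in metric are placeholders for whatever automated test you choose:

```python
def benchmark(sites, check):
    """Apply the same automated check to every site in the community."""
    return {site: check(site) for site in sites}

def rank(results):
    """Order sites by the measured value so peers can be compared directly."""
    return sorted(results.items(), key=lambda item: item[1])

# Hypothetical usage -- in a real survey check() would fetch the page
# and measure something (size, load time, accessibility errors, ...):
sizes = benchmark(["http://a.example/", "http://b.example/"],
                  lambda site: len(site))  # stand-in metric only
```

Running the same loop again months later gives the repeat surveys from which trends can be monitored.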
    17. 17. UK HE Case Studies <ul><li>Every 3 months a WebWatch survey is published in the Ariadne e-journal: </li></ul><ul><ul><li>Surveys include: </li></ul></ul><ul><ul><ul><li>Accessibility of entry points </li></ul></ul></ul><ul><ul><ul><li>Nos. of Web servers </li></ul></ul></ul><ul><ul><ul><li>Nos. of links to organisation </li></ul></ul></ul><ul><ul><ul><li>Size of entry points </li></ul></ul></ul><ul><ul><ul><li>Web server software </li></ul></ul></ul><ul><ul><ul><li>Relationships </li></ul></ul></ul><ul><ul><li>Together with manual surveys of: </li></ul></ul><ul><ul><ul><li>Search engine software </li></ul></ul></ul><ul><ul><ul><li>404 error pages </li></ul></ul></ul><ul><ul><li>See: <http://www.ukoln.ac.uk/web-focus/webwatch/articles/#latest> </li></ul></ul>Case Studies
    18. 18. Accessibility <ul><li>In September 2002 Bobby was used to analyse the entry points of UK University Web sites. </li></ul>http://www.ariadne.ac.uk/issue33/web-watch/ <ul><li>Findings </li></ul><ul><ul><li>Only 4 pages appeared to comply with Bobby AA guidelines </li></ul></ul><ul><ul><li>Further analysis revealed that only 3 complied </li></ul></ul>Case Studies
    19. 19. Size Of Home Page <ul><li>The size of UK University entry points was analysed in 1998 and the survey repeated in June 2001 </li></ul>http://www.ariadne.ac.uk/issue28/web-watch/ The reasons for the large entry points were reviewed. Case Studies
    20. 20. Numbers Of Web Servers <ul><li>A survey of the numbers of Web servers was carried out in 2000 and repeated in 2002 </li></ul>http://www.ariadne.ac.uk/issue31/web-watch/ Most institutions have a small number of Web servers but a few have over 100 Case Studies
    21. 21. What’s Related? <ul><li>Netscape’s What’s Related tool was used to record: </li></ul><ul><ul><li>Popularity </li></ul></ul><ul><ul><li>Nos. of pages indexed </li></ul></ul><ul><ul><li>Nos. of links to site </li></ul></ul>http://www.ariadne.ac.uk/issue27/web-watch/
    22. 22. Links To University Web Sites <ul><li>A survey of the number of links to UK University Web sites was published in 2000 </li></ul><ul><li>The survey used the AltaVista and Infoseek search engines </li></ul>http://www.ariadne.ac.uk/issue23/web-watch/
    23. 23. Search Engines <ul><li>A manual survey of search engines used on UK University Web sites was carried out in 1997 and has been repeated every 6 months in order to monitor trends </li></ul>http://www.ariadne.ac.uk/issue30/web-watch/
    24. 24. 404 Error Pages <ul><li>A (manual) survey of 404 error pages on UK University Web sites was carried out in 1999 and repeated in 2002. </li></ul>WebWatch: 404s - What's Missing? June 1999 <http://www.ariadne.ac.uk/issue20/404/> Revisiting 404 Error Pages In UK University Web Sites, June 2002, <http://www.ariadne.ac.uk/issue32/web-watch/> The original survey and article helped to raise awareness of well-designed 404 pages as an important navigation feature Case Studies
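Part of such a survey can be automated: request a deliberately nonexistent URL from each site and test whether the body looks like the server's bare default page rather than a designed one. The heuristic below is an assumption of mine, keyed to the default Apache signature of the period ("Apache/1.3.6 Server at … Port 80"); judging whether a custom 404 page is actually well designed still needs a human reviewer.

```python
def looks_like_default_404(body):
    """True if a 404 response body resembles a bare server-generated page.

    Heuristic only: default Apache error pages of this era carried a
    '404 Not Found' title and an 'Apache/x.y.z Server at host Port 80'
    signature line. A designed page usually has neither.
    """
    text = body.lower()
    return "404 not found" in text and "server at" in text
```

A survey script would fetch something like `http://www.example.ac.uk/no-such-page/` from each institution and tally how many return a designed page versus a server default.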
    25. 25. 404 Error Pages <ul><li>Significant changes have been made since the findings of the first survey were published </li></ul>Apache/1.3.6 Server at www.shef.ac.uk Port 80
    26. 26. Limitations Of This Approach <ul><li>What limitations do you think this approach may have? </li></ul>Limitations
    27. 27. Limitations <ul><li>Reliance on third party tools </li></ul><ul><li>Inconsistencies across tools </li></ul><ul><li>Unusual aspects of Web sites </li></ul><ul><li>Inadequacies of automated tools </li></ul><ul><li>Can’t handle Intranets </li></ul><ul><li>Performance implications </li></ul><ul><li>Legal and ethical issues </li></ul><ul><li>Personalisation, cookies, etc. </li></ul>Limitations
    28. 28. Reliance On Third Party Tools <ul><li>This approach relies on use of third party software: </li></ul><ul><ul><li>Company may go out of business </li></ul></ul><ul><ul><li>Company may introduce charging or change conditions of use </li></ul></ul><ul><ul><li>Company may change the format of its output </li></ul></ul><ul><ul><li>Company may change algorithms (possibly without notification) </li></ul></ul><ul><li>Example: </li></ul><ul><ul><li>The Bobby accessibility checker withdrew a summary of the file size of resources. </li></ul></ul><ul><ul><li>CAST sold Bobby: the new company introduced limitations to the use of Bobby </li></ul></ul>Limitations
    29. 29. Intranets, etc. <ul><li>Use of public third party Web sites for testing: </li></ul><ul><ul><li>May not work with Intranets or Web sites which require a username and password to access </li></ul></ul><ul><li>Possible Solution </li></ul><ul><ul><li>Some testing services allow you to give a username and password </li></ul></ul><ul><ul><li>If you do this, you are trusting the service not to steal the username and password! </li></ul></ul>Limitations
    30. 30. Inconsistencies <ul><li>Different tools may have inconsistent approaches </li></ul>Limitations <ul><li>Example </li></ul><ul><ul><li>NetMechanic and Bobby (previous version) reported on the file size of analysed pages. </li></ul></ul><ul><ul><li>Bobby only analysed the HTML page and inline images. </li></ul></ul><ul><ul><li>NetMechanic also included external JavaScript and stylesheet files. </li></ul></ul><ul><ul><li>NetMechanic respected the Standard for Robot Exclusion (SRE) and would not analyse images if the SRE banned access. </li></ul></ul><ul><ul><li>Bobby ignored the SRE. </li></ul></ul>
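The Standard for Robot Exclusion behaviour that the two tools disagreed on can be checked directly: Python's standard `urllib.robotparser` implements the SRE and can be fed a policy as text, so a survey author can verify what a compliant tool ought to skip.

```python
from urllib import robotparser

def allowed(robots_txt, agent, url):
    """Would a robot honouring the SRE be permitted to fetch this URL?"""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)
```

Under a policy disallowing `/images/`, a tool behaving like NetMechanic would skip inline images in that directory, while one ignoring the SRE (as Bobby did) would fetch them anyway, giving the inconsistent file-size figures described above.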
    31. 31. Inadequacies <ul><li>Automated tools: </li></ul><ul><ul><li>Are not suitable for testing all aspects of a Web site </li></ul></ul><ul><ul><li>Need to be complemented by manual testing </li></ul></ul><ul><ul><li>Reliance on automated results without warning notices can cause confusion </li></ul></ul>Limitations Accessibility Testing Automated tools such as Bobby can report on missing ALT attributes However automated tools cannot report whether a meaningful ALT value is given <img src=“important-graph”> <img src=“important-graph” alt=“”>
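The mechanical half of this check is straightforward, which is exactly the point: a few lines of Python can find missing or empty `alt` attributes, but no code can judge whether a non-empty `alt` text is meaningful for the image it describes. A sketch using the standard `html.parser`:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Flag <img> tags whose alt attribute is missing or empty.

    This is all an automated tool can do; whether alt="graph" is a
    *meaningful* description of the image remains a manual judgement.
    """
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt is None:
                self.problems.append("missing alt")
            elif not alt.strip():
                self.problems.append("empty alt")

    def handle_startendtag(self, tag, attrs):
        # Treat self-closing <img ... /> the same as <img ...>
        self.handle_starttag(tag, attrs)

def audit(html):
    """Return a list of alt-attribute problems found in the HTML."""
    a = AltAudit()
    a.feed(html)
    return a.problems
```

Both problem cases on the slide are caught; an `alt` of "important graph" on an image that is actually a logo would pass silently.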
    32. 32. Performance Implications <ul><li>Automated tools: </li></ul><ul><ul><li>Could degrade the performance of Web sites if poorly designed </li></ul></ul>Limitations Case Study An HTML validation tool was used to check a Web site. Shortly afterwards it was found to be repeatedly sending HTTP requests to the site, whose administrators felt this amounted to a denial-of-service attack.
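The obvious defence when writing such a tool is to throttle it so requests are spread out rather than fired in a burst. A minimal sketch (the one-second delay is an arbitrary choice, and `fetch` stands for whatever retrieval function the tool uses):

```python
import time

def polite_fetch(urls, fetch, delay=1.0):
    """Fetch a list of URLs, pausing between requests so a survey
    never looks like a denial-of-service attack to the target site."""
    results = {}
    for i, url in enumerate(urls):
        if i:
            time.sleep(delay)  # spread the load on the remote server
        results[url] = fetch(url)
    return results
```

A well-behaved tool would also identify itself via its User-Agent header so site administrators can see who is responsible for the traffic.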
    33. 33. Legal And Ethical Issues <ul><li>If your survey findings: </li></ul><ul><ul><li>Give a negative impression of a Web site </li></ul></ul><ul><ul><li>Are flawed, and give a mistaken negative impression of a Web site </li></ul></ul><ul><li>could you be sued? </li></ul>Limitations Case Study WebWatch surveys seek to highlight examples of best practices. Care is taken in the language used when problems are reported.
    34. 34. Personalisation <ul><li>How should testing tools behave if Web sites provide personalised interfaces: </li></ul><ul><ul><li>Make use of username details to personalise content </li></ul></ul><ul><ul><li>Personalise the interface based on the user’s browser </li></ul></ul><ul><ul><li>Personalise the interface based on other environment factors (e.g. time, referer page, language setting, etc.) </li></ul></ul>Limitations
    35. 35. Unusual Aspects <ul><li>How should testing tools deal with other unusual aspects of a Web site such as: </li></ul><ul><ul><li>Web page redirects  Splash screens </li></ul></ul><ul><ul><li>Pop-up windows  Frames </li></ul></ul><ul><ul><li>etc. </li></ul></ul>Limitations Example When you give a URL, the page redirects to another URL. The new page displays a splash screen for 5 seconds and then moves to a new page which contains frames. In addition a pop-up window is displayed. How many pages are there?
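Some of these cases can at least be detected automatically. The sketch below spots the classic meta-refresh "splash screen" and reports the delay and destination; it is only a heuristic, and assumes the common attribute order with `http-equiv` appearing before `content`.

```python
import re

def splash_target(html):
    """Detect a meta-refresh splash screen.

    Returns (delay_in_seconds, next_url) if the page auto-redirects,
    or None if it does not. Heuristic: assumes http-equiv precedes
    the content attribute, as in typical hand-written markup.
    """
    m = re.search(
        r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*'
        r'content=["\']?(\d+)\s*;\s*url=([^"\'>\s]+)',
        html, re.IGNORECASE)
    if m:
        return int(m.group(1)), m.group(2)
    return None
```

Even with such detection, a tool still has to decide which of the chain of pages counts as "the" page being benchmarked.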
    36. 36. Benchmarking And QA <ul><li>The benchmarking approach may be used to: </li></ul><ul><ul><li>Ensure that Web sites comply with standards and best practices </li></ul></ul><ul><ul><li>May be of interest to funding bodies </li></ul></ul><ul><ul><li>UKOLN is involved in work in this area to ensure that projects comply with standards and best practices so that they will be interoperable </li></ul></ul>See <http://www.ukoln.ac.uk/qa-focus/surveys/>
    37. 37. Who Else Is Doing This? <ul><li>Who else may be carrying out benchmarking surveys? </li></ul><ul><li>We will use a Google search for: </li></ul><ul><ul><li>“accessibility surveys”, “Web site benchmark”, “HTML compliance surveys”, etc. </li></ul></ul><ul><li>In order to explore other approaches including: </li></ul><ul><ul><li>Commercial approaches </li></ul></ul><ul><ul><li>Non-commercial approaches </li></ul></ul>
    38. 38. US Accessibility Surveys <ul><li>Axel Schmetzke has carried out surveys of the accessibility of selected US University Web sites </li></ul>http://library.uwsp.edu/aschmetz/Accessible/websurveys.htm
    39. 39. Try It For Yourself <ul><li>The methodology which has been described can be applied by you across your own community </li></ul><ul><li>Benefits: </li></ul><ul><ul><li>You will get an idea of how you compare with your peers </li></ul></ul><ul><ul><li>For national bodies, funders, etc. you can gain a profile of your community </li></ul></ul><ul><ul><li>There may be opportunities for describing your community at conferences, etc. </li></ul></ul>
    40. 40. Implementing A Benchmark Survey <ul><li>To implement your own benchmark across a community you can simply examine WebWatch articles and adapt the HTML for your own use. </li></ul><ul><li>Further details at <http://www.ariadne.ac.uk/issue29/web-watch/> </li></ul>http://bobby.cast.org/bobby/bobbyServlet?URL=http%3A%2F%2Fwww2.brent.gov.uk%2F&output=Submit&gl=wcag1-aaa <ul><li>Technique Used </li></ul><ul><li>Use the Web service on a site </li></ul><ul><li>Copy the URL into a template </li></ul><ul><li>Determine the URL structure </li></ul><ul><li>Use as a basis for use with other URLs </li></ul>
    41. 41. Implementation (2) <ul><li>Simple technique: </li></ul><ul><ul><li>Copy URLs into a template </li></ul></ul><ul><li>Better technique: </li></ul><ul><ul><li>Create HTML file using a server-side script </li></ul></ul><ul><li>Better technique: </li></ul><ul><ul><li>Use a backend database so resources can be more easily managed </li></ul></ul><a href=“tool?url”>Try it</a> … <!-- query_string=http://…/tool.cgi?URL=$website --> Do for all websites <a href=“query_string$website”>Try it</a> …
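The "server-side script" approach amounts to a template loop. A Python sketch of the idea follows; the checker address is a placeholder for whatever query-string structure you determined from the real testing service in the previous step, and note that site addresses must be URL-encoded (hence the `%3A%2F%2F` sequences seen in the Bobby example).

```python
from urllib.parse import quote

# Placeholder checker address -- substitute the query-string structure
# worked out from the real testing service you are using.
CHECKER = "http://example.com/tool.cgi?URL={target}"

def benchmark_page(sites):
    """Build an HTML list of 'Try it' links, one per community Web site."""
    rows = [
        f'<li>{site} <a href="{CHECKER.format(target=quote(site, safe=""))}">'
        f"Try it</a></li>"
        for site in sites
    ]
    return "<ul>\n" + "\n".join(rows) + "\n</ul>"
```

Moving the `sites` list into a backend database, as the slide suggests, changes only where the list comes from; the generation loop stays the same.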
    42. 42. Next Generation Tools <ul><li>We can expect to see further development in testing tools: </li></ul><ul><li>Why? </li></ul><ul><ul><li>Compliance with, say, e-Government guidelines </li></ul></ul><ul><ul><li>Ensure Web sites work e.g. e-commerce </li></ul></ul><ul><li>How: </li></ul><ul><ul><li>Tools which provide richer functionality (e.g. dealing with personalised Web sites) </li></ul></ul><ul><ul><li>Development of “Web services” for testing </li></ul></ul><ul><ul><li>Agreement on standards e.g. what is a Web “page” </li></ul></ul><ul><ul><li>Development of XML standards for interchange of results e.g. EARL </li></ul></ul>
    43. 43. Resources For You To Use <ul><li>A series of exercises on Web site benchmarking is available, which contains details of a number of benchmarking tools </li></ul>http://www.ukoln.ac.uk/web-focus/events/conferences/ucisa-tlig-2002/benchmarking/
    44. 44. Questions <ul><li>Any questions? </li></ul>
