Measuring government websites

Measuring online performance can be tricky, and there's not a lot of guidance out there for non-commerce sites. This framework has been developed specifically for government organizations to help them measure, track and improve their online services.


Measuring government websites: Presentation Transcript

  • Creating a Web Performance Measurement Framework for Government sites. Providing evidence for making decisions to ensure citizen-focused websites. Developed by Laura Wesley, with help and research by tons of others both known and unknown to me. Find me online at http://usability4government.wordpress.com or on Twitter @resultsjunkie
  • Table of Contents
      • Objectives of providing this framework
        • Goals of the framework
        • Common challenges of measuring online performance
      • Implementing the framework
        • Measuring efficiency
        • Measuring effectiveness
        • Measuring satisfaction
      • Web Performance Measurement Framework
        • Sample Web Performance Measurement Strategy
        • Evaluation criteria & rationale for each indicator
        • Reporting for decision-making
  • Objectives & common challenges of measuring online performance
  • Goal of providing this framework
    • Reduce duplication, error rates & cost of gathering and using client feedback to improve online communication and service to the public by:
      • Providing a Web Performance Measurement Framework that can be customized and re-used by any department to increase the ability to make decisions using facts and figures.
      • Proposing a set of performance indicators that can be used consistently across the Government of Canada to compare progress and success.
  • Goals of the framework…
    • Develop an approach that is consistent with:
        • Government of Canada’s Management Accountability Framework, specifically the Citizen-focused Service element.
        • Government Service Reference Models (GSRM)
        • Private sector approach to client-centred design and testing
        • Results-based Management (RBM) concept & tools
        • Research from the Institute for Citizen-Centred Service (ICCS) and other academic forums.
    • Provide a framework to assess web channel performance
        • Effectiveness of website
        • Efficiency of web tools & processes that support this channel
        • Satisfaction of website visitors
    • Develop an evidence-based approach to guide long-term site improvements.
  • Common challenges
    • This framework is meant to assist with these common challenges in collecting and using data to inform decisions:
      • Too much/not enough data.
      • Not sure what information to gather or analyze.
      • Focus on data instead of trends that lead to insights.
      • Difficulty turning data into insights and then into concrete actions.
      • Mistaking the website for a goal in itself rather than a means by which a goal is achieved or communicated.
  • Limitations of this framework
    • This framework only covers measuring performance of your own site to inform ongoing improvements.
    • It doesn't provide the additional steps required to:
      • Evaluate the offline change in the world (outcomes) that results from providing services or information online.
      • Measure performance of off-site content, that is, social media or other forms of online content that are not contained on your website.
      • Diagnose specific site ailments.
  • Develop and implement your web performance measurement strategy
    • Efficiency
      • Demonstrate that the resources required to create, maintain & continuously improve the website are worth associated costs.
    • Effectiveness
      • Demonstrate that visitors are able to complete the necessary and desired tasks that are available on the website.
    • Satisfaction
      • Demonstrate that site visitors are happy with the online experience.
    What do we need to measure?
  • Measure the efficiency of your online communication and services
  • How do we measure efficiency?
    • Suggested indicator = cost per visit/interaction. Compare resources invested to qualified visits on the site (a worked sketch follows this slide).
    • How?*
      • Track all costs associated with developing the site, plus the number of visits to the site.
      • Take it one step further by breaking down cost by:
        • types of content, e.g. video, podcasts or newsletters
        • transactional vs. informational services
        • service channels (phone, web, in-person)
    *It should be noted that I have not yet seen this done well. If you have, please call me!
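    To make the cost-per-visit arithmetic concrete, here is a minimal sketch in Python. Every figure and channel name below is hypothetical, invented for illustration; real inputs would come from your financial system and your analytics package.

        # Hypothetical annual figures for illustration only. Costs cover staff
        # time, hosting, licensing and content production for each channel.
        channel_costs = {"web": 250_000.00, "phone": 400_000.00, "in-person": 900_000.00}

        # Qualified interactions per channel over the same period
        # (e.g. visits for web, answered calls for phone).
        channel_volume = {"web": 1_200_000, "phone": 80_000, "in-person": 30_000}

        for channel, cost in channel_costs.items():
            per_interaction = cost / channel_volume[channel]
            print(f"{channel:>10}: ${per_interaction:.2f} per interaction")

    With numbers like these, the web channel shows by far the lowest unit cost, which is the kind of evidence the "promote the most efficient channel" action on the next slide relies on.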
  • How do we use this info?
    • As government organizations, we have an obligation to serve citizens in the way they want to be served, but we also have to provide services as efficiently as possible. Chances are you can't eliminate the highest-cost channel or activities, but you can:
      • Reduce time spent on inefficient activities
      • Promote use of most efficient channel
      • Promote services where feedback is positive but costs still outweigh the number of people using them.
  • Measure the effectiveness of your online communication and services
  • How do we measure effectiveness?
    • Suggested indicators: accessible, credible, valuable, desirable, useful, usable, findable*
    • How?
      • Collect data to support user expectations mentioned above. Taken together, these expectations should help deconstruct elements of end-user satisfaction.
      • Develop a research plan to collect all the data required to validate user expectations. This becomes your web performance measurement strategy.
    *From Peter Morville’s Semantic Studios: http://semanticstudios.com/publications/semantics/000029.php
  • How do we use this info?
    • Actual numbers are less important than trends, but both are useful. Additional research may be required to investigate actual improvements further, but by collecting data to measure user expectations, you can:
      • Create a starting point to report on level of improvements over time (baseline) or compare to similar organizations (benchmark).
      • Look for consistent trends across feedback mechanisms (qualitative/quantitative) to prioritize necessary changes.
      • Test assumptions and theories about what is working.
      • Understand clients' needs and expectations better in order to develop new products and services.
  • Sample web performance measurement strategy
  • Components of User Experience, from Peter Morville’s Semantic Studios: http://semanticstudios.com/publications/semantics/000029.php [Diagram: Morville’s user-experience facets Findable, Accessible, Usable, Useful, Desirable, Credible and Valuable, set in the context of Clients, Content and Context, and mapped to the MAF citizen-focused service element.]
  • Sample Web Performance Measurement Strategy. For each goal: the suggested Key Performance Indicator (KPI), this year's target, the data source, who will collect it and when, and who to share the info with.
    • Accessible. KPI: comply with W3C accessibility guidelines, as measured by the percentage of CLF-compliant pages. Target: 95% CLF compliance. Data source: CLF validation reports. Collected by: Web services (annually). Share with: Treasury Board (MAF 13).
    • Valuable. KPI: site advances our mandate, as measured by conversion rates of on-site tasks. Target: 20% of visitors take the desired action (on-site). Data source: user survey / various third party. Collected by: Web services (post-release). Share with: senior managers.
    • Desirable. KPI: improved perception of quality of service, as measured by the percentage of visitors who agree that their visit was positive. Target: 85% satisfaction. Data source: user survey. Collected by: analyst (annually). Share with: site owner / content owner.
    • Usable. KPI: more easily complete tasks, as measured by ease, time and error rate of task completion. Target: 80% composite score of ease, time and error rate. Data source: usability testing. Collected by: Web Comms (pre-release). Share with: site owner / content owner.
    • Useful. KPI: increased satisfaction, as measured by the percentage of respondents who agree with the statement “This page was useful”. Target: 80% say YES. Data source: on-page feedback form. Collected by: analyst (monthly). Share with: content owner.
    • Credible. KPI: increased trust in the reliability of the content, as measured by the percentage of pages reviewed within the past 6 months. Target: 95% of pages reviewed within the past 6 months. Data source: Content Management System. Collected by: Web services (bi-annually). Share with: content owner.
    • Findable. KPI: more easily locate information through external search, measured by the percentage of traffic referred to the site from search engines. Target: 60% of visits referred by search engines. Data source: web analytics software. Collected by: analyst (monthly). Share with: content owner.
  • Accessibility Evaluation Criteria. Goal: comply with Common Look and Feel (CLF) Policy and W3C accessibility guidelines. Indicator: percentage of pages that are compliant with CLF priority checkpoints. Target: 95% of pages are 95% compliant. Data source: CLF validation reports. Reporting: this information will be presented to channel managers and senior management, and reported to Treasury Board annually. [Each of these evaluation-criteria slides repeats the same process diagram: Identify goal, Set target, Determine performance indicators, Collect data to measure against target, Report.]
  • Valuable Evaluation Criteria. Goal: strategic objectives are met or enhanced through online service delivery and information sharing. Indicator: conversion rate on pre-determined tasks or workflows. Target: increase conversion rates by 2% for at least 3 tasks or workflows. Data source: define important tasks and the desired paths or workflows through the site, then map and track them through web analytics (a computation sketch follows this slide). Reporting: baseline and final reports will be provided to channel managers with each release; final report to senior management.
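    As a rough illustration of how the conversion-rate indicator could be computed, the sketch below counts visits that reached a pre-defined completion page. The page paths and visit records are hypothetical; in practice the equivalent numbers come from the goal or funnel reports of your web analytics software.

        # Each visit is represented by the ordered list of pages it viewed.
        # All data here is hypothetical, for illustration only.
        visits = [
            ["/", "/apply", "/apply/confirm"],
            ["/", "/contact"],
            ["/apply", "/apply/confirm"],
            ["/"],
        ]

        COMPLETION_PAGE = "/apply/confirm"  # hypothetical "task completed" page

        converted = sum(1 for pages in visits if COMPLETION_PAGE in pages)
        print(f"Conversion rate: {converted / len(visits):.1%}")  # 50.0% here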
  • Desirable Evaluation Criteria. Goal: end-users are satisfied with the website. Indicator: percentage of users who are satisfied with the website. Target: 80% of respondents are satisfied. Data source: annual online survey. Reporting: this information will be reported to programs and presented to channel managers and senior managers annually.
  • Usability Evaluation Criteria. Goal: users can complete desired tasks, as measured by efficiency, effectiveness and satisfaction of task completion. Indicator: a composite of the speed, error rate and user satisfaction for each task (one possible formula is sketched below). Target: a minimum of 80% reached in 4 out of 5 test cases. Data source: task-based usability testing performed with the proposed site mock-ups before every major release. Reporting: this information will be presented to content managers and channel managers.
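    The slide defines the indicator as a composite but does not prescribe a formula, so the following is only one plausible way to combine task success, time and error rate into a single 0-100 score; the weights and the benchmark time are assumptions, not part of the framework.

        def composite_score(success_rate, avg_time_s, benchmark_time_s, error_rate,
                            weights=(0.4, 0.3, 0.3)):
            # success_rate and error_rate are fractions in [0, 1]; time is
            # scored relative to an assumed benchmark (1.0 = at or under it).
            time_score = min(benchmark_time_s / avg_time_s, 1.0)
            parts = (success_rate, time_score, 1.0 - error_rate)
            return 100 * sum(w * p for w, p in zip(weights, parts))

        # Hypothetical results for one test task: 85% success, 95 s average
        # against a 60 s benchmark, 10% error rate -> roughly 80 / 100.
        print(f"{composite_score(0.85, 95, 60, 0.10):.0f} / 100")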
  • Usefulness Evaluation Criteria. Goal: content provided is useful to site visitors. Indicator: percentage of respondents who agree with the statement “This page was useful”. Target: 80% of respondents agree. Data source: “Was this page useful?” on-page feedback form (an aggregation sketch follows this slide). Reporting: this information will be reported to content managers and presented to channel managers and senior managers quarterly.
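    A minimal sketch of aggregating the on-page feedback form per page, assuming each submission is recorded as a (page, answered-yes) pair; the pages and responses below are invented.

        from collections import defaultdict

        # Hypothetical feedback submissions: (page path, answered "yes").
        responses = [
            ("/services/passports", True),
            ("/services/passports", True),
            ("/services/passports", False),
            ("/about", True),
        ]

        by_page = defaultdict(lambda: [0, 0])  # page -> [yes count, total]
        for page, yes in responses:
            by_page[page][0] += int(yes)
            by_page[page][1] += 1

        for page, (yes, total) in sorted(by_page.items()):
            print(f"{page}: {yes / total:.0%} said yes ({total} responses)")

    Aggregating per page is what lets the monthly numbers go to the right content owner, as the strategy table earlier suggests.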
  • Credibility Evaluation Criteria. Goal: visitors trust the reliability of the content. Indicator: percentage of pages that have been reviewed in the previous six months. Target: 95% of pages have been reviewed and updated (if necessary) within the previous 6 months. Data source: reports run from the Content Management System on the ‘last reviewed’ information for each page (a sketch follows this slide). Reporting: this information will be reported to content managers, channel managers and senior managers twice a year.
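    A minimal sketch of the credibility check, assuming your CMS can export a 'last reviewed' date for each page; the export format and page list here are invented.

        from datetime import date, timedelta

        today = date.today()
        # Hypothetical CMS export: page path -> last-reviewed date.
        last_reviewed = {
            "/services/passports": today - timedelta(days=30),
            "/about": today - timedelta(days=400),
            "/news": today - timedelta(days=90),
        }

        cutoff = today - timedelta(days=182)  # roughly six months
        current = {p for p, d in last_reviewed.items() if d >= cutoff}

        print(f"{len(current) / len(last_reviewed):.0%} of pages reviewed "
              "within the past 6 months")
        for page in sorted(set(last_reviewed) - current):
            print(f"  overdue for review: {page}")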
  • Findability Evaluation Criteria. Goal: visitors can easily locate information on the site. Indicator: percentage of traffic from search engines, maintained throughout the project and increased after it. Target: at least 60% of site traffic referred from search engines. Data source: web analytics (a sketch of the underlying calculation follows this slide). Reporting: information will be presented to content managers quarterly.
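    Analytics packages report the search-referral share directly, but as a sketch of the underlying calculation, the snippet below classifies raw referrer strings by hostname. The search-engine hostnames are real; the visit data is invented, and an empty referrer stands for direct traffic.

        from urllib.parse import urlparse

        SEARCH_ENGINES = {"www.google.com", "www.bing.com", "search.yahoo.com"}

        # Hypothetical referrer field of each visit; "" means direct traffic.
        referrers = [
            "http://www.google.com/search?q=passport+renewal",
            "http://www.bing.com/search?q=benefits",
            "http://example.org/links",
            "",
        ]

        from_search = sum(1 for r in referrers
                          if r and urlparse(r).hostname in SEARCH_ENGINES)
        print(f"{from_search / len(referrers):.0%} of visits "
              "referred by search engines")  # 50% here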
  • Reporting for decision making
  • Reporting to Content Owners
    • To empower content owners to use data to inform decisions and priorities, they should be provided with:
      • Tombstone data about visitors (total visits, user demographics)
      • User experience indicators – updated as and when available
        • Monthly: Findability, Credibility, Usefulness.
        • Pre/post tests or surveys: Usability, Value.
        • Annually: Accessibility, Desirability.
    • Reports should be specific to only the content for which each content owner is responsible.
    • Presentations to governance committees should also highlight trends across data collected and suggested improvements for their specific area of the site.
  • Reporting to Channel Owners
    • To empower channel owners to use data to inform decisions and priorities, reports should include:
      • Same info as provided to content owners, but rolled up for the entire website.
      • Costs – ideally in the form of efficiency ratios described on slide 14
    • Presentations to governance committees should highlight trends across data collected. Web analysts should suggest improvements. This information should be used to inform prioritization of functionality and content needs for each release cycle.
  • Reporting to Senior Management
    • To empower senior managers (of content or channels) to use data to inform decisions and priorities, reports should include:
      • The report summaries mentioned on the previous 2 slides, depending on their role in the organization.
    • Presentations to governance committees should demonstrate how data is being used to inform decision-making at other levels.
    • Senior managers should be asked to help set priorities based on data provided and other externalities (e.g. resource availability, other unrelated activities that also demand resources).
  • Reporting to Treasury Board Secretariat*
    • Report to Treasury Board Secretariat through your Departmental Performance Report (DPR) to demonstrate good management practices, results and improvements based on performance data.
    • Relevant areas of the Management Accountability Framework (MAF) include:
      • AoM 12: Effectiveness of Information Management
        • Public can find info/services to complete tasks that meet their needs
      • AoM 13: Effectiveness of Information Technology
        • Common Look & Feel Standards on Accessibility, Usability
      • AoM 20: Citizen-focused Service
        • Using client feedback to continuously improve services
    *This step is only relevant for Canadian federal departments that report through this balanced-scorecard approach. See details: http://www.tbs-sct.gc.ca/maf-crg/documents/booklet-livret/booklet-livret-eng.asp
  • Implementation
    • This framework was developed for implementation as part of a website redesign project for a federal government department. If you'd like to use it, the following steps are suggested:
      • Develop an implementation plan for the data that needs to be collected, including when and how it will be used to improve the site.
      • Submit data collection needs to the technical lead as part of the requirements-gathering phase to ensure the ability to collect data is added over time.
      • If starting from scratch, add an indicator or two to the reporting cycle as capacity builds; for example, include a new measure with each release or annual reporting cycle. It may not be practical or feasible to implement and collect all data at once.
      • Customize the approach for your own implementation.
  • Customization
    • Specific indicators, targets and data sources may vary from the sample performance measurement framework provided, but the goals described herein are probably the same.
    • Why? The goals used here describe the expectations most users have of how the web works.
    • Be practical. Don't collect data you can't turn into an improvement.
    • Add one indicator at a time and build the ability to collect data into each technical release.
  • Use this framework if…
    • You're not yet collecting any or all data required.
    • Your governance or business processes do not enable continuous improvements.
    • You're spending money maintaining or updating your site but not sure of the value.
    • You don’t know how to define success or use the data you do have to inform decisions.
    • You're making decisions based on what the boss wants rather than what site visitors and clients want.
  • Questions & comments can be forwarded to Laura Wesley via Twitter: @resultsjunkie or online via http://usability4government.wordpress.com