090511 Appleby Magna Overview Presentation

Slides used as an introduction to the 'E-Learning Resources: Evaluation' course at Appleby Magna on 11 May 2009, run by Martin Bazley on behalf of Renaissance East Midlands.

User testing and evaluation: why, how and when to do it
Appleby Magna Centre, 11 May 2009
Martin Bazley, Martin Bazley & Associates
www.martinbazley.com

Intro: Martin Bazley
- Consultancy/websites/training/user testing, ICT4Learning.com (10+ yrs)
- Chair of E-Learning Group for Museums
- Previously:
  - E-Learning Officer, MLA South East (3 yrs)
  - Internet projects, Science Museum, London (7 yrs)
  - Taught science in secondary schools (8 yrs)

Why evaluate websites?
- Why do evaluation and user testing? Isn't it really expensive and time-consuming?
- Save money – avoid substantial, hurried redevelopment later in the project
- Audience feedback improves the resource in various ways – new activity ideas, etc.
- Demonstrate the involvement of key stakeholders throughout the project

Making websites effective
- Three key success factors:
  - Understanding the audience
  - Learning experience and learning outcomes – right for the audience and clearly stated
  - Evaluation – especially in the classroom or at home (observe users in their 'natural habitat' wherever possible)

Who for, what for…
- Who for? (audience)
  - Be clear from the start, e.g. 'for teachers of Yr 5/6 in the local area with whiteboards'
- What 'real-world' outcomes? (learning outcomes)
  - What will they learn or do as a result? e.g. plan a visit to the museum, learn that the Romans wore funny clothes, discover that they enjoy using a digital camera…
- How will they use it? (learning experiences)
  - What do they actually do with the site? e.g. work online or print it out? In pairs or alone? With or without teacher help?
- Where, when and why will they use it?
  - Context is important

Website evaluation and testing
- Need to think ahead a bit:
  - What are you trying to find out?
  - How do you intend to test it?
  - Why? What will you do as a result?
- The 'why' should drive this process

Test early
- Testing one user early in the project…
- …is better than testing 50 near the end

When to evaluate or test, and why
- Before funding approval – project planning
- Post-funding – project development
- Post-project – summative evaluation

Testing is an iterative process
- Testing isn't something you do once
- Make something => test it => refine it => test it again

Before funding – project planning
- *Evaluation of other websites (research)
  - Who for? What for? How will they use it? etc.
  - Awareness-raising: issues, opportunities
  - Contributes to market research
  - Possible elements, graphic feel, etc.
- *Concept testing (focus group)
  - Check the idea makes sense with the audience
  - Reshape the project based on user feedback

Post-funding – project development
- *Concept testing (focus group)
  - Refine project outcomes based on feedback from intended users
- Refine website structure (focus group)
  - Does it work for users?
- *Evaluate initial look and feel (one-to-one tasks)
  - Graphics, navigation, etc.

Post-funding – project development 2
- *Full evaluation of a draft working version
  - Usability AND content: do the activities work, how engaging is it, what else could be offered, etc.
- Observation of actual use of the website by intended users, using it for the intended purpose, in the intended context – classroom, workplace, library, home, etc.

- Video clip: Moving Here – key ideas, not lesson plans

Post-funding – project development 3
- Acceptance testing of the 'finished' website
  - Last-minute check, minor corrections only
  - Often offered by web developers
- Summative evaluation
  - Report for funders, etc.
  - Learn lessons at the project level for next time

Two usability testing techniques
- "Get it" testing – do they understand the purpose, how it works, etc.?
- Key task testing – ask the user to do something, then watch how well they do
- Ideally, do a bit of each, in that order

User testing – who should do it?
- The worst person to conduct (or interpret) user testing of your own site is… you!
- Beware of hearing what you want to hear
- It is useful to have an external viewpoint
- The first five minutes in a genuine setting tell you 80% of what's wrong with the site

User testing – more info
- User testing can be done cheaply – tips on how to do it are available (MLA SE guide): www.ICT4Learning.com/onlineguide

Strengths and weaknesses of different data gathering techniques

Data gathering techniques
- User testing – early in development and again near the end
- Online questionnaires – emailed to people or linked from the website
- Focus groups – best near the beginning of a project, or at the redevelopment stage
- Visitor surveys – link online and real-world visits
- Web stats – useful for long-term trends, events, etc.

Need to distinguish between:
- Diagnostics – making a project or service better
- Reporting – to funders, or for advocacy

Online questionnaires
- (+) Once set up, they gather numerical and qualitative data with no further effort – given time, they can build up large datasets
- (+) The datasets can easily be exported and manipulated, sampled at various times, and queried in a structured way to yield useful results (see the sketch below)
- (–) Respondents are self-selected, which will skew results – best to compare with similar data from other sources, such as visitor surveys
- (–) The number and nature of responses may depend on how the questionnaire is displayed and promoted on the website

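To illustrate the 'export and query' point above, here is a minimal sketch in Python, assuming a hypothetical questionnaire export file responses.csv with columns submitted_at, visitor_type, rating and comments (the filename and column names are invented for illustration, and the pandas library is assumed to be available):

```python
# Minimal sketch: querying an exported online-questionnaire dataset.
# Assumes a hypothetical CSV export "responses.csv" with columns:
#   submitted_at, visitor_type, rating (1-5), comments
import pandas as pd

# Load the exported dataset, parsing the submission timestamps
responses = pd.read_csv("responses.csv", parse_dates=["submitted_at"])

# "Sampled at various times": restrict to one school term, for example
spring_term = responses[
    (responses["submitted_at"] >= "2009-01-05")
    & (responses["submitted_at"] < "2009-04-03")
]

# Structured query: number of responses and average rating per visitor type
print(spring_term.groupby("visitor_type")["rating"].agg(["count", "mean"]))

# Qualitative data: pull out comments from low-rating respondents for review
low_ratings = spring_term[spring_term["rating"] <= 2]
print(low_ratings[["visitor_type", "comments"]].head(20))
```

The same export could equally be opened in a spreadsheet; the point is simply that, once responses accumulate, this kind of breakdown costs very little extra effort.
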
Focus groups
- (+) Can explore specific issues in more depth, yielding rich feedback
- (+) Possible to control the composition of participants to ensure they are representative
- (–) Comparatively time-consuming (expensive) to organise and analyse
- (–) Yield qualitative data only – small numbers mean numerical comparisons are unreliable

Visitor surveys
- (+) Possible to control the composition of participants to ensure they are representative
- (–) Comparatively time-consuming (expensive) to organise and analyse
- (–) Responses can be affected by various factors, including the interviewer, the weather on the day, the day of the week, etc., reducing the validity of numerical comparisons between museums

Web stats
- (+) Easy to gather data – you can decide what to do with it later
- (+) The data generated are person-independent: it is the interpretation, rather than the data themselves, that is subjective, so others can review the same data and verify or amend the initial conclusions

Web stats
- (–) Different systems generate different data for the same web activity – for example, the number of unique visits measured by Google Analytics is generally lower than the figure derived from server log files (see the sketch below)
- (–) Metrics are complicated and require specialist knowledge to appreciate them fully

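To make the 'different systems, different numbers' point concrete, the sketch below is a rough illustration, assuming a hypothetical server log access.log in the standard Apache/Nginx 'combined' format. It counts unique visitors the way log-file tools typically do – by IP address plus user agent, after filtering obvious robots – whereas JavaScript-based trackers such as Google Analytics only record browsers that run their tag and accept cookies, which is one reason their unique-visit figures tend to come out lower.

```python
# Minimal sketch: estimating unique visitors from a web server access log.
# Assumes a hypothetical "access.log" in the Apache/Nginx "combined" format, e.g.
# 203.0.113.7 - - [11/May/2009:10:15:32 +0100] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/4.0 ..."
import re

LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Crude robot filter – real log-analysis packages use much longer lists
BOT_HINTS = ("bot", "spider", "crawler", "slurp")

unique_visitors = set()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LOG_LINE.match(line)
        if not match:
            continue  # skip lines that are not in the expected format
        agent = match.group("agent").lower()
        if any(hint in agent for hint in BOT_HINTS):
            continue  # drop obvious robots, which JS-based trackers generally never record
        # Log-file tools typically approximate a "visitor" as IP + user agent;
        # cookie-based trackers count browsers instead, so the totals differ.
        unique_visitors.add((match.group("ip"), agent))

print(f"Approximate unique visitors (log-file method): {len(unique_visitors)}")
```

Neither figure is 'right': for diagnostics the trend over time matters more than the absolute number, and for reporting it is worth stating which method produced the figure.
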
Web stats
- (–) As the amount of off-website activity increases (e.g. Web 2.0-style interactions), the validity of website stats decreases – especially for reporting purposes, but also for diagnostics
- (–) Agreeing a common format for presenting data and analysis requires collaborative working to be meaningful

Who for, what for… (recap)
- Who for? (audience)
  - Be clear from the start, e.g. 'for teachers of Yr 5/6 in the local area with whiteboards'
- What 'real-world' outcomes? (learning outcomes)
  - What will they learn or do as a result? e.g. plan a visit to the museum, learn that the Romans wore funny clothes, discover that they enjoy using a digital camera…
- How will they use it? (learning experiences)
  - What do they actually do with the site? e.g. work online or print it out? In pairs or alone? With or without teacher help?

Who for, what for…
- How can you ensure you get these right?
  - Build the questions into the planning process
  - Evaluate and test regularly
  - Get informal feedback whenever possible – and act on it
- Who is it for?
- What are the real-world outcomes?
- How will they use it?
- Also: when, where and why?

More information
Martin Bazley
0780 3580 737
www.martinbazley.com