Analysis in Practical Usability Evaluation: A Survey Study (CHI 2012)

Full paper presentation held at CHI 2012, Austin, Texas.

  1. Analysis in Practical Usability Evaluation: A Survey Study. Asbjørn Følstad, SINTEF; Effie Lai-Chong Law, University of Leicester; Kasper Hornbæk, University of Copenhagen. CHI 2012.
  2. What is analysis?
  3. Analysis is superficially treated in textbooks: only 7.5% of the pages in Dumas & Redish's A Practical Guide to Usability Testing cover tabulating and analyzing data plus recommending changes. In contrast, preparations for a usability test are covered on 46% of the pages of the same book.
  4. There are few, if any, studies on how practitioners do analysis. However, the research community has developed analysis frameworks and processes, such as the User Action Framework (Andre, Hartson, Williges, 2001), Instant Data Analysis (Kjeldskov, Skov, Stage, 2004), and SUPEX – structured UP extraction (Cockton, Lavery, 1999); templates, such as problem description formats (Lavery, Cockton, Atkinson, 1997), guidelines formats (Capra, 2006), and the Usability Problem Inspector (Andre, Hartson, Williges, 2003); and analysis tools, such as a Morae plugin for problem description and grouping (Howarth, Smith-Jackson, Hartson, 2009).
  5. There are few, if any, studies on how practitioners do analysis... and the research and standards communities have discussed the relation between evaluation and design, for example the cycle of context analysis, user requirements, design, and evaluation in ISO 9241-210 – Human-centred design for interactive systems.
  6. A researcher perspective on analysis: the main output of an analysis in (formative) usability evaluation is a usability problem list to inform later design work, and research-based processes, methods, and tools are needed to support rigorous analysis.
  7. But is the research user-centred? Do we really know the practitioners? Do we know how they do usability evaluation in general – and analysis of evaluation data in particular?
  8. Survey to explore analysis practices: 155 participants, mainly recruited through SIGCHI and UPA chapters. Participants were experienced, with a median of 5 years of usability work. Questions concerned the participants' latest usability evaluation, conducted within the last 6 months (62% within the last 2 months). Both usability testing and usability inspection were targeted, with questionnaires adapted accordingly: 112 usability testing, 43 usability inspection.
  9. Research questions: 1. How is analysis supported? 2. How are usability problems identified? 3. How do usability practitioners collaborate in analysis? 4. How is redesign integrated into the evaluation process?
  10. Findings for research question 1, how is analysis supported, cover: working hours for the entire evaluation, analysis resources, tools used for analysis, and structured formats for problem description.
  11. Findings: working hours for the entire evaluation. Usability testing: 48 (median); usability inspection: 24 (median).
  12. Findings: analysis resources (usability testing / usability inspection). Heuristics or guidelines: 60% / 76%. Design patterns: 41% / 54%. Test participant opinion on usability problems: 64% (testing). Test participant opinion on redesign suggestions: 48% (testing).
  13. Findings: tools used during analysis (free text). Screen recording and analysis software, e.g. Morae (11, all usability testing); drawing and prototyping tools, e.g. Balsamiq, Axure (8, usability testing and inspection); plain screen recording, e.g. Camtasia, SnagIt (5, all usability testing); web analytics, e.g. Google Analytics, Seevolution (5, all usability inspection). Where are the tools from the last 20 years of research?
  14. Findings: structured formats for problem description. 55%: problems described according to our own format. 41%: no format, just plain prose. 4%: formats from standards or literature.
  15. Findings for research question 4, how is redesign integrated into the evaluation process, cover: evaluation deliverables containing redesign suggestions, the time of making redesign suggestions, sources of redesign suggestions, and means of redesign presentation.
  16. Findings: the deliverable was characterized as (usability testing / usability inspection) a set of redesign suggestions motivated from usability problems: 51% / 53%; a set of usability problems with some redesign suggestions: 46% / 43%; a set of usability problems with no redesign suggestions: 4% / 5%.
  17. Findings: when are redesign suggestions made? 49%: first all usability problems were identified, then redesign suggestions were made. 46%: some (or all) redesign suggestions were made immediately upon identifying a usability problem.
  18. Findings: sources of redesign suggestions (usability testing / usability inspection). Response to usability problems: 94% / 74%. Non-optimal solution, even though no usability problem was observed: 38% / 47%.
  19. Findings: means of redesign presentation (usability testing / usability inspection). Textual descriptions: 68% / 71%. Annotated screenshots: 50% / 55%. UI digital mock-ups: 32% / 47%. Sketching: 29% / 21%.
  20. A researcher perspective on analysis, revisited: the main output of an analysis in (formative) usability evaluation is a usability problem list to inform later design work, and research-based processes, methods, and tools are needed to support rigorous analysis.
  21. A researcher perspective on analysis, revisited: the main output of an analysis in (formative) usability evaluation is a usability problem list and a set of redesign suggestions, the latter often visually presented. Researchers need to learn from practitioners how evaluation and design are related, not vice versa. Research-based processes, methods, and tools are needed to support rigorous analysis.
  22. A researcher perspective on analysis, revisited: the main output of an analysis in (formative) usability evaluation is a usability problem list and a set of redesign suggestions, the latter often visually presented. Research-based processes, methods, and tools need to be developed in response to practitioners' needs, as seen from the practitioner perspective. Should we rather support the home-growing of analysis support and align with commercial tools?
  23. Conclusion: a fast-paced presentation of a selection of the survey findings. If what you heard interests you, there is more to be found in the paper :-) Thank you for your attention!
