Paper prototype evaluation

David Lamas @ European Innovation Academy Summer Session 2013

  1. Schedule (Monday and Tuesday)
     – 10:00 to 11:00 • Monday: Introduction to paper prototyping. Critical assessment of representative examples. Stop-motion prototyping. • Tuesday: Introduction to paper prototype evaluation. Evaluating paper prototypes using techniques such as expert evaluation, cognitive walk-through, co-discovery, Wizard of Oz and others. Critical assessment of representative examples.
     – 11:00 to 12:00 • Monday: Initial prototyping activities. • Tuesday: Drafting paper prototype evaluation protocols.
     – 13:00 to 16:00 • Monday: Prototyping madness (sharing your initial ideas with the group, 30 seconds per team). Further prototyping activities. Prototyping madness (sharing your ideas with the group, 30 seconds per team). • Tuesday: Prototyping madness (sharing your draft evaluation protocols with the group, 30 seconds per team). Paper prototype refinement through evaluation. Prototyping showcase (sharing the results of your prototyping efforts with the group for feedback, 2 minutes per team).
  2. Paper Prototype Evaluation
  3. Evaluation • Well, before addressing the evaluation of paper prototypes, perhaps we should first reason about what we should evaluate, right? – Let’s take a look at concepts such as usefulness, usability and user experience
  4. Usefulness • Usefulness is a crucial quality of any product or service – It concerns the degree to which a product enables a user to achieve his or her goals, and is an assessment of the user’s willingness to use the product or service at all • Without usefulness, other measures make little or no sense, because the product will just sit on the shelf • We call this the fundamental quality of a product or service
  5. Usability • According to the ISO standard – Usability is the extent to which a product or service can be used by specified users to achieve specified goals in a specified context of use – The standard defines three main usability dimensions: • Effectiveness, • Efficiency, and • Satisfaction
  6. Usability • In practice, though, when looking into usability we normally account for: – Effectiveness – Efficiency – Satisfaction – Learnability – Memorability • We call these the pragmatic qualities of a product or service
  7. Usability • Common reasons for the delivery of less usable products are – A system-focused process – Poorly integrated teams – Design and development mismatches – And, of course, the fact that designing usable products is difficult • That’s why we are pushing paper prototyping at this point in the process
  8. User Experience • According to the ISO standard, user experience is a person’s perceptions and responses that result from the use or anticipated use of a product or service
  9. User Experience • User experience subsumes usability and includes – emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviors and accomplishments that occur before, during and after use • We call these the effects of the hedonic, or pleasure-related, qualities of a product or service
  10. Evaluation • As stated at the beginning, before addressing the evaluation of paper prototypes, we should reason about what we should evaluate – Now that we have briefly covered usefulness, usability and user experience, I believe it’s fair to say that our evaluation should aim to go as far as usability
  11. Evaluation • In general, product and service evaluation can be formative (informing the design as it evolves) or summative (assessing a finished product) • Paper prototype evaluation is mostly formative
  12. Evaluation • Paper prototype evaluation focuses on… – The most significant issues preventing users from accomplishing their goals – What works and what users find frustrating – The most common errors or mistakes users make – Assessing the improvements made from one design iteration to the next – Identifying issues that are expected to remain even when the product is launched
  13. Evaluation • Today we look into – Inspection methods • These are methods where an expert evaluator inspects a product or service – Testing methods • These are methods where products and services are evaluated by testing them with real users
  14. Inspection methods • There are several, but we will address: – Cognitive walk-through – Heuristic evaluation
  15. Cognitive walk-through • The purpose is to verify whether the paper prototype actually allows the fulfillment of the selected user stories – This is a within-team activity that ensures that your prototype complies with the identified user stories
  16. Cognitive walk-through • Designers and developers of the product or service then walk through the steps as a group, asking themselves a set of questions at each step – Data is gathered during the walk-through, and afterwards a report of potential issues is compiled – Finally, the evaluated proposition is redesigned to address the issues identified
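A minimal sketch, in Python, of how a team might record such a walk-through. The four questions asked at each step follow the commonly cited cognitive walk-through formulation; the data structure, function name and prompts are illustrative assumptions, not part of the slides.

```python
from dataclasses import dataclass, field

# The four questions commonly asked at each step of a cognitive walk-through;
# wording is a standard paraphrase, not taken from the slides.
WALKTHROUGH_QUESTIONS = [
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the effect they are trying to achieve?",
    "If the correct action is performed, will the user see that progress is being made?",
]

@dataclass
class StepRecord:
    step: str                                     # e.g. "Tap the 'attach' sketch on screen 2"
    answers: dict = field(default_factory=dict)   # question -> answer and note
    issues: list = field(default_factory=list)    # potential problems to put in the report

def walk_step(step_description: str) -> StepRecord:
    """Review one step of a user story, asking the four questions and noting issues."""
    record = StepRecord(step=step_description)
    for question in WALKTHROUGH_QUESTIONS:
        answer = input(f"{step_description}\n  {question} (yes/no, plus a note): ")
        record.answers[question] = answer
        if answer.strip().lower().startswith("no"):
            record.issues.append(f"{question} -> {answer}")
    return record
```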
  17. Heuristic evaluation • This is a type of evaluation ideally carried out by an expert – It specifically involves evaluators examining the design and judging its compliance with recognized principles, such as Jakob Nielsen’s 10 usability heuristics • These evaluation methods are now widely taught and practiced in the new media sector, where products and services are often designed in a short space of time, on a budget that may restrict the money available for other types of interface testing
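Nielsen's ten heuristics themselves are a published list; how the findings are recorded below (the dictionary fields and the 0 to 4 severity scale, which is Nielsen's usual rating) is just one hypothetical way to organise an expert review of a paper prototype.

```python
# Jakob Nielsen's 10 usability heuristics, by their commonly cited titles.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

# One illustrative finding against the first heuristic; the screen and problem
# named here are made up for the example.
findings = [
    {
        "heuristic": NIELSEN_HEURISTICS[0],
        "location": "paper screen 3 (sending an attachment)",
        "problem": "Nothing shows that the 'send' action has been registered",
        "severity": 3,   # 0 = not a problem ... 4 = usability catastrophe
    },
]

def report(findings):
    """Print findings with the most severe issues first, ready for redesign."""
    for f in sorted(findings, key=lambda f: f["severity"], reverse=True):
        print(f"[{f['severity']}] {f['heuristic']}: {f['problem']} ({f['location']})")

report(findings)
```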
  18. Testing methods • There are several, but we will address: – Co-discovery – Wizard of Oz
  19. Co-discovery • Two users attempt to perform tasks together while being observed – They are to help each other in the same manner as they would if they were working together to accomplish a common goal using the product – They are encouraged to explain what they are thinking about while working on the tasks
  20. Co-discovery • The designers and developers should refrain from explaining their design decisions and instead focus on getting the most out of the pair of users tackling the prototype – Note-taking is fundamental, and you should run this kind of test until no significant additional information is fed back into the design process
  21. Wizard of Oz • This is a testing approach built upon a paper device instead of a working technological artifact – This kind of testing involves systematic observation under controlled conditions to determine how well people can use a product or service – Rather than showing users a rough draft and asking, “Do you understand this?”, this kind of testing involves watching people trying to use something for its intended purpose
  22. Wizard of Oz • Setting up such a test involves asking the test subjects to recreate a set of user stories after being introduced to the underlying scenario – For example, to test the attachment function of an e-mail program, a scenario would describe a situation where a person needs to send an e-mail attachment, and ask him or her to undertake this task – The aim is to observe how people behave in a realistic setting, so that developers can see problem areas and what people like
  23. Testing methods • This kind of evaluation should be repeated until no significant added value comes from bringing in an additional subject – The usual number is 5, but others claim otherwise
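The figure of 5 comes from Nielsen and Landauer's problem-discovery model, in which the share of problems found by n users is about 1 - (1 - L)^n, with L ≈ 0.31 being their reported average per-user discovery rate. The snippet below simply evaluates that formula; L varies by product and test, which is why "others claim otherwise".

```python
def share_of_problems_found(n_users: int, l_per_user: float = 0.31) -> float:
    """Expected share of usability problems uncovered by n test users
    under the Nielsen/Landauer model with per-user discovery rate L."""
    return 1 - (1 - l_per_user) ** n_users

for n in (1, 3, 5, 10):
    print(f"{n:2d} users -> ~{share_of_problems_found(n):.0%} of problems")
# With L = 0.31, five users already uncover roughly 85% of the problems,
# which favours several small test rounds over one large one.
```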
  24. Pre- and post-tests • When applying testing methods, pre-test and post-test questionnaires are also used to gather feedback on the product being tested – A common questionnaire is the 25-year-old System Usability Scale
  25. System Usability Scale
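For reference, a minimal sketch of how SUS responses are scored: ten statements answered on a 1 to 5 scale, odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the total is multiplied by 2.5 to yield a 0 to 100 score. The example responses below are invented.

```python
def sus_score(responses):
    """Score a System Usability Scale questionnaire from ten 1-5 Likert responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs exactly ten responses, each between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5          # scales the 0-40 raw sum to 0-100

# Hypothetical responses from a single participant.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))   # -> 85.0
```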
  26. Evaluation protocol • But these methods are useless without an adequate evaluation protocol • While designing your evaluation protocol, you should take into account… – A mixed selection of inspection and testing methods; – The user stories supported by your prototype; and – The affordances of your paper prototype.
  27. Evaluation protocol • Examples are provided in the companion blog, but please note that… – In all cases, evaluation protocols should be piloted to ensure that, once they are being applied, you are actually focusing on assessing the prototype and not on solving evaluation protocol issues
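As an illustration only (none of the names or values below come from the slides or the companion blog), a protocol covering those points might be captured in a single structure the team can review and pilot before running sessions:

```python
# Hypothetical evaluation-protocol skeleton: which methods are mixed, which
# user stories are in scope, what the paper prototype affords, and a reminder
# to pilot the protocol before real sessions.
protocol = {
    "prototype": "paper mock-up v2",
    "user_stories": [
        "As a sender, I can attach a photo to an e-mail",
        "As a recipient, I can preview the attached photo",
    ],
    "methods": [
        {"type": "cognitive walk-through", "who": "design team", "when": "before user tests"},
        {"type": "heuristic evaluation", "who": "one external expert", "when": "before user tests"},
        {"type": "co-discovery", "who": "pairs of target users", "sessions": 3},
        {"type": "wizard of oz", "who": "individual target users", "sessions": 5},
    ],
    "affordances": "static paper screens; transitions simulated by swapping sheets",
    "measures": ["task completion", "notable errors", "post-test SUS score"],
    "piloted": False,   # flip to True only after a dry run with a colleague
}
```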
  28. Now it’s up to you to make it happen
  29. paperprototypingateia.wordpress.com
