Ifi7155.DT-Lesson1

  • Slide note (Affect Grid): participants mark their current emotional state on a two-dimensional 9x9 grid where arousal forms the y-axis and pleasantness the x-axis

    1. 1. Evaluating the User Experience Sónia Sousa & Mati Mõttus
    2. 2. How to evaluate UX?
    3. 3. Evaluation • Before thinking about EVALUATION, – we should reason about what we should evaluate, right? And where are evaluation practices valuable? 2016 3 @ Sónia Sousa
    4. 4. ISO 9241-210: iterative design, early focus on users and tasks, empirical measurement. [Image: usability diagram] http://www.system-concepts.com/assets/images/usability/usability%20diagram%20for%20blog.jpg Reference: John D. Gould and Clayton Lewis. 1985. Designing for usability: key principles and what designers think. Communications of the ACM 28(3), 300-311. 2016 4 @ Sónia Sousa
    5. 5. Three main design paradigms: User centered design, Engineering and Applied Arts, each relating Purpose, Evaluation, Artefact and Beneficiaries. 2016 5 @ Sónia Sousa
    6. 6. Interaction design: relating Purpose, Evaluation, Artefact and Beneficiaries. 2016 6 @ Sónia Sousa
    7. 7. Lesson 1 • Let's take a look at evaluation-related concepts such as… • Usefulness, Usability and User experience • Evaluation Methods and Tools – Objects for Evaluation – Methods & categories & metrics – Data & Scales 2016 7@ Sónia Sousa
    8. 8. Evaluation Concepts: usefulness, usability and user experience
    9. 9. a good user experience depends on both pragmatic and hedonic qualities
    10. 10. usefulness and usability are pragmatic
    11. 11. pleasure is hedonic
    12. 12. a user experience occurs when a person interacts with a product…
    13. 13. Usability and user experience • Product – Behaviour – Do-goals • Metrics – Efficiency – Effectiveness – User satisfaction Usability testing is… a systematic experimental evaluation of the interactions Is it easy to use? Is it easy to learn? Is it satisfying to use? 2016 13 @ Sónia Sousa
    14. 14. Systematic measure • Nowadays, when a person interacts with a product… they look for more than just – efficient, effective and cheap products. • They look for products that stimulate pleasure; and – address their needs for social and individual differentiation. 2016 14@ Sónia Sousa
    15. 15. Definitions of user experience across authors: Hassenzahl and Tractinsky (2006) – user, context, characteristics of the system; Kuniavsky (2003) – information, interaction and identity; Jetter and Gerken (2010) – design results and marketing; Nielsen, Norman and Tognazzini (2011) – all aspects of the end-user's interaction with the company, its services and its products; Roto, Law, Vermeeren & Hoonhout (2011) – an active and passive encounter with a system, time factor. Other aspects named on the slide: content, identity, interface, usability. 2016 15 @ Sónia Sousa
    16. 16. Usability • According to the ISO standard – Usability is the extent to which a product or service can be used by specified users to achieve specified goals in a specified context of use – The standard defines three main usability dimensions: • Effectiveness, • Efficiency, and • Satisfaction 2016 @ Sónia Sousa 16
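For concreteness, here is a minimal Python sketch of how the three ISO 9241-11 dimensions are often quantified in practice. The particular formulas (task completion rate, tasks per minute, normalised mean rating) are common conventions rather than anything mandated by the standard, and the function and variable names are illustrative.

```python
# Minimal sketch of common operationalisations of the ISO usability
# dimensions. The formulas below (completion rate, time-based efficiency,
# mean questionnaire rating) are conventional choices, not mandated by ISO.

def effectiveness(completed_tasks, attempted_tasks):
    """Share of tasks completed successfully (0.0 - 1.0)."""
    return completed_tasks / attempted_tasks

def efficiency(completed_tasks, total_time_spent_s):
    """Successfully completed tasks per minute of work."""
    return completed_tasks / (total_time_spent_s / 60)

def satisfaction(ratings, scale_max=5):
    """Mean post-task rating, normalised to 0-100."""
    return 100 * (sum(ratings) / len(ratings)) / scale_max

# Example: one participant completed 4 of 5 tasks in 12 minutes
# and gave post-task ratings on a 1-5 scale.
print(effectiveness(4, 5))         # 0.8
print(efficiency(4, 12 * 60))      # ~0.33 tasks per minute
print(satisfaction([4, 5, 3, 4]))  # 80.0
```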
    17. 17. Usability • Anyway, when looking into usability, we normally account for: – Effectiveness – Efficiency – Satisfaction – Learnability – Memorability • We name these the pragmatic qualities of a product or service 2016 @ Sónia Sousa 17
    18. 18. User Experience • According to its ISO standard, user experience is a person's perceptions and responses that result from the use or anticipated use of a product or service 2016 18@ Sónia Sousa
    19. 19. User Experience • User experience subsumes usability and includes – emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviors and accomplishments that occur before, during and after use • We name these the effects of the hedonic, or pleasure-related, qualities of a product or service 2016 19@ Sónia Sousa
    20. 20. Usefulness… • The degree to which a product enables a user to achieve his or her goals; and – assesses the user's willingness to use the product • Even if a system is easy to use, easy to learn, and satisfying, it will not be used if it is not useful • But this dimension is most often overlooked during experiments and studies in the lab • as they address system-oriented design tasks – Jeffrey Rubin 2016 20 @ Sónia Sousa
    21. 21. Efficiency… • How quickly can users perform tasks? – How fast a task can be accomplished (accurately and completely) – Quantitative measure • Often expressed as a % of total users • Represents: • Product behaviour • Users' expectations • Does it do exactly what the user wants? 2016 21 @ Sónia Sousa
    22. 22. Learnability… • How easy is it for users to accomplish basic tasks – for the first time – after not using the system for a while • Represents – user's competence • ability to operate the system – after some predetermined amount and period of training • ability of infrequent users to relearn the system – after periods of inactivity 2016 22 @ Sónia Sousa
    23. 23. Satisfaction… • Does the product meet or satisfy users' needs? – Quantitative and qualitative measures • captured through both written and oral questioning • Represents – user's perceptions – user's feelings – user's opinions • Typically, users are asked – to rate and rank products that they try, – and to reveal causes and reasons for problems that occur 2016 23 @ Sónia Sousa
    24. 24. Accessibility • Describes the ability to access and benefit from something • The degree to which a product, device, service, or environment is accessible to as many people as possible, without modification – The degree to which it can accommodate things that people can't easily change… • Focus is often – on people with disabilities and their right of access to entities 2016 24 @ Sónia Sousa
    25. 25. How to…evaluate?
    26. 26. Evaluation • Before addressing the evaluation protocol, – we should consider: 1. What is the purpose of evaluation? 2. What metrics will I need? – A metric is a way of measuring something 3. What methods can I use? – A method is a particular procedure for accomplishing or approaching something 4. What type of data will we need? 2016 26@ Sónia Sousa
    27. 27. Evaluation • UX evaluation depends on what we focus on… – The most significant issues preventing users from accomplishing their goals – What works and what users find frustrating – The most common errors or mistakes users are making – Assessing the improvements made from one design iteration to the next – Identifying issues that are expected to remain even when the product is launched 2016 27@ Sónia Sousa
    28. 28. [Diagram] What is the usage intention? What is the usage attitude? Attitudes + intentions; positive outcomes; motivation aspects; value vs importance of use; granularity of measurement methods: low-level experience measures vs overall experience measures. 2016 28 @ Sónia Sousa
    29. 29. [Diagram] Before interaction, during interaction and after use: user beliefs and usage intentions, product qualities and user attitudes, value perceptions and emotions, plus first impressions and mood state. Why people like and use certain products; what makes people use a certain product. 2016 29 @ Sónia Sousa
    30. 30. What to evaluate? Product: layout, utility, functionality. First impressions. Hedonic qualities: emotions, mood state. Value perceptions: pleasure, affective arousal, symbolic aspects of the product. User perceptions. 2016 30 @ Sónia Sousa
    31. 31. Then…choose a specific metric or method?
    32. 32. Metrics and methods • Issues-based metrics – What is an issue? • Anything that prevents task completion • Anything that takes someone off-course • Anything that creates some level of confusion • Anything that produces an error • Not seeing something that should be noticed • Assuming something is correct when it is not • Performing the wrong action • Misinterpreting some piece of content • Not understanding the navigation – How are they identified? • Using an inspection method • Possibly combined with some performance data analysis 2016 @ Sónia Sousa 32
    33. 33. [Diagram: facets of UX] Utility, usability, usefulness, pleasure, intentions, social value, efficiency, perceptions of interactions, the product, affection, satisfaction, mood state, relevance of the product, user intention to use, trust, learnability, accessibility, aesthetics, social links. 2016 @ Sónia Sousa 33
    34. 34. Metrics and methods • Issues-based metrics – Once identified, issues are usually classified according to their severity • Small impact on user experience, few users experiencing the issue – Low severity • Small impact on user experience, many users experiencing the issue – Medium severity • Large impact on user experience, few users experiencing the issue – Medium severity • Large impact on user experience, many users experiencing the issue – High severity 2016 @ Sónia Sousa 34
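The impact-by-frequency rule on this slide can be written as a small lookup. In the sketch below, the threshold used to decide that "many users" are affected (more than a third of participants) is an arbitrary illustration, not something the slide specifies.

```python
# Sketch of the impact x frequency severity rule from the slide.
# The one-third threshold for "many users" is an illustrative assumption.

def issue_severity(impact, affected_users, total_users):
    """Return 'low', 'medium', or 'high' severity.

    impact: 'small' or 'large' impact on the user experience
    affected_users / total_users: how many participants hit the issue
    """
    many = (affected_users / total_users) > 1 / 3
    if impact == "small":
        return "medium" if many else "low"
    return "high" if many else "medium"

print(issue_severity("small", 1, 8))   # low
print(issue_severity("small", 5, 8))   # medium
print(issue_severity("large", 2, 8))   # medium
print(issue_severity("large", 6, 8))   # high
```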
    35. 35. Metrics and methods • And others exist, such as – Self-reported metrics, used for assessing satisfaction among other participant-perceived measures – Behavioral and physiological metrics • of which eye-tracking is one of the most used as far as Web usability testing is concerned – Combined and comparative metrics • based on combinations of the previously mentioned metrics – And others such as… • Server logs • Card-sorting data – Open card sorting – Closed card sorting • Accessibility indicators 2016 @ Sónia Sousa 35
    36. 36. Metrics and methods [Diagram] Usability is task-oriented and concerned with product attributes and preventing errors; UX is about perceived usage, experience and reflection, concerned with building positive experiences that result from the interaction with the product. 2016 36 @ Sónia Sousa
    37. 37. How to…choose a specific method?
    38. 38. Evaluation methods • Inspection methods – These are methods where an expert evaluator inspects a product or service • Testing methods – These are methods where products and services are evaluated by testing them on real users 2016 38@ Sónia Sousa
    39. 39. Inspection methods • There are also several but we will address: – Cognitive walk-through – Heuristic evaluation 2016 39@ Sónia Sousa
    40. 40. Cognitive walk-through • The purpose is to verify if the paper prototype actually allows the fulfillment of the selected user stories – This is a within team activity that ensures that your prototype complies with the identified user stories 2016 40@ Sónia Sousa
    41. 41. Cognitive walk-through • Designers and developers of the product or service then walk through the steps as a group, asking themselves a set of questions at each step – Data is gathered during the walk-through, and afterwards a report of potential issues is compiled – Finally the evaluated proposition is redesigned to address the issues identified 2016 41@ Sónia Sousa
    42. 42. Heuristic evaluation • This is a type of evaluation ideally carried out by an expert. – It specifically involves evaluators examining the design and judging its compliance with recognized principles, such as Jakob Nielsen's 10 usability heuristics • These evaluation methods are now widely taught and practiced in the new media sector, where products and services are often designed in a short space of time and on a budget that may not stretch to other types of interface testing 2016 42@ Sónia Sousa
    43. 43. Testing methods • There are several but we will address: – Co-discovery – Wizard of Oz 2016 43@ Sónia Sousa
    44. 44. Co-discovery • Two users attempt to perform tasks together while being observed – They are to help each other in the same manner as they would if they were working together to accomplish a common goal using the product – They are encouraged to explain what they are thinking about while working on the tasks 2016 44@ Sónia Sousa
    45. 45. Co-discovery • The designers and developers should refrain from explaining the design decisions and rather focus on getting the most out of the pair of users working with the prototype – Note taking is fundamental, and you should run this kind of test until no significant additional information is fed back into the design process 2016 45@ Sónia Sousa
    46. 46. Wizard of OZ • This is a testing approach built upon a paper device instead of a working technological artifact – This kind of testing involves systematic observation under controlled conditions to determine how well people can use a product or service – Rather than showing users a rough draft and asking, "Do you understand this?", this kind of testing involves watching people trying to use something for its intended purpose 2016 46@ Sónia Sousa
    47. 47. Wizard of OZ • Setting up such a test involves asking the test subjects to recreate a set of user stories after being introduced to the underlying scenario – For example, to test the attachment function of an e-mail program, a scenario would describe a situation where a person needs to send an e-mail attachment, and ask him or her to undertake this task – The aim is to observe how people function in a realistic manner, so that developers can see problem areas, and what people like. 2016 47@ Sónia Sousa
    48. 48. Testing methods • This kind of evaluation should be repeated until no significant added value comes from bringing in an additional subject – The usual number is 5, but others claim otherwise 2016 48@ Sónia Sousa
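The "usual number is 5" rule of thumb traces back to the problem-discovery model reported by Nielsen and Landauer, in which the share of problems found by n users is 1 - (1 - lambda)^n and lambda, the chance that a single user exposes a given problem, averages roughly 0.31. The sketch below simply evaluates that formula; lambda varies a lot between studies, which is exactly why others claim otherwise.

```python
# Problem-discovery model behind the "five users" rule of thumb:
# the share of problems found by n users is 1 - (1 - lam)^n, where
# lam is the probability that a single user exposes a given problem
# (roughly 0.31 on average per Nielsen & Landauer; it varies per study).

def problems_found(n_users, lam=0.31):
    return 1 - (1 - lam) ** n_users

for n in (1, 3, 5, 10, 15):
    print(n, round(problems_found(n), 2))
# 1 0.31 | 3 0.67 | 5 0.84 | 10 0.98 | 15 1.0
```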
    49. 49. Pre and post-tests • When applying testing methods, pre-test and post-test questionnaires are also used to gather feedback on the product being tested – A common questionnaire is the 25-year-old System Usability Scale 2016 49@ Sónia Sousa
    50. 50. How to…design an evaluation protocol?
    51. 51. Evaluation protocol • But these methods are useless without an adequate evaluation protocol • While designing your evaluation protocol, you should take into account… – A mixed selection of inspection and testing methods; – The user stories supported by your concepts/prototype; and – The affordances of your concepts/prototype. 2016 51@ Sónia Sousa
    52. 52. Evaluation protocol • We will work on perfecting our evaluation protocol throughout the course… – In all cases, evaluation protocols should be piloted to ensure that, once they are applied, you are actually focusing on assessing the prototype or concepts and not on solving evaluation protocol issues 2016 52@ Sónia Sousa
    53. 53. Now it’s up to you to make it happen
    54. 54. ???How??? • Answer – What do you want to evaluate? – What are your evaluation needs? • Decide, for instance – Product name • Digital e-Textbook service – Product description • Web service; mobile, desktop and tablet application – Design stage • Concept design – Product representation • Functional prototypes – Purpose of evaluation • Find the best concept idea for implementing the service – Participants • Students, teachers and publishers – Time restriction • 3 weeks – Equipment and tools • Audio recorder, screen recording, assistant for note taking – Skills of researcher • HCI students 2016 4@ Sónia Sousa
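The decisions listed on this slide can be captured as a small structured plan before any data is collected. The field names below simply mirror the slide's example and are not a prescribed format.

```python
# One possible way to write down the evaluation decisions from the slide
# as a structured plan. Field names are illustrative, not prescribed.
evaluation_plan = {
    "product_name": "Digital e-Textbook service",
    "product_description": "Web service; mobile, desktop and tablet application",
    "design_stage": "Concept design",
    "product_representation": "Functional prototypes",
    "purpose": "Find the best concept idea for implementing the service",
    "participants": ["students", "teachers", "publishers"],
    "time_restriction_weeks": 3,
    "equipment_and_tools": ["audio recorder", "screen recording",
                            "assistant for note taking"],
    "researcher_skills": "HCI students",
}

for key, value in evaluation_plan.items():
    print(f"{key}: {value}")
```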
    55. 55. …and that is why it should be carefully addressed
    56. 56. evaluating user experiences is tricky but possible
    57. 57. and we will now address possible approaches
    58. 58. Most known Methods… • Usability Methods • System Usability Scale [SUS] • Methods that were transposed from Usability – Cognitive Walkthrough – Think aloud – Diaries 2016 58@ Sónia Sousa
    59. 59. Most known Methods… • Others are adaptations of – Extended usability testing – TRUE (Tracking Realtime User Experience) – UX expert evaluation – Property checklists • Others are new – AttrakDiff – Emocards, Emotion Cards, Emofaces – Reaction Cards 2016 59@ Sónia Sousa
    60. 60. SUS is a quick and dirty scale popular for end-of-test subjective assessments
    61. 61. System Usability Scale 2016 61@ Sónia Sousa
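SUS scoring is standardised: ten statements answered on a 1-to-5 agreement scale, where odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the summed contributions are multiplied by 2.5 to yield a 0-100 score. A minimal implementation:

```python
# Standard SUS scoring: 10 items answered on a 1-5 scale.
# Odd-numbered items are positively worded (contribute response - 1),
# even-numbered items are negatively worded (contribute 5 - response);
# the summed contributions are scaled to a 0-100 range.

def sus_score(responses):
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example response set for one participant:
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 3]))  # 77.5
```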
    62. 62. Affect Grid
    63. 63. [Image: Affect Grid] 2016 63@ Sónia Sousa
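As described in the slide note near the top of this page, the Affect Grid asks participants to mark a single cell of a 9x9 grid, with pleasantness on the x-axis and arousal on the y-axis. The sketch below turns a marked cell into the two scores; reporting values centred on the neutral midpoint in addition to the raw 1-to-9 values is an illustrative choice, not a requirement of the instrument.

```python
# Sketch of Affect Grid scoring. The participant marks one cell of a 9x9
# grid; the column gives pleasantness and the row gives arousal. Returning
# both the raw 1-9 values and values centred on the neutral midpoint is an
# illustrative choice, not a fixed requirement of the instrument.

def affect_grid(column, row):
    """column: 1 (unpleasant) .. 9 (pleasant); row: 1 (sleepy) .. 9 (aroused)."""
    if not (1 <= column <= 9 and 1 <= row <= 9):
        raise ValueError("Affect Grid cells range from 1 to 9 on both axes")
    return {
        "pleasantness": column,
        "arousal": row,
        "pleasantness_centred": column - 5,   # -4 .. +4
        "arousal_centred": row - 5,           # -4 .. +4
    }

print(affect_grid(7, 3))
# {'pleasantness': 7, 'arousal': 3, 'pleasantness_centred': 2, 'arousal_centred': -2}
```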
    64. 64. AttrakDiff
    65. 65. AttrakDiff assesses the user's feelings about a system with a questionnaire addressing both hedonic and pragmatic dimensions of UX
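AttrakDiff items are 7-point semantic-differential word pairs grouped into pragmatic quality (PQ), hedonic quality – identity (HQI), hedonic quality – stimulation (HQS) and attractiveness (ATT), and each scale is usually summarised by the mean of its items. The sketch below assumes items are already coded from -3 to +3 and already assigned to their scales; it is illustrative, not an official scoring script.

```python
# Illustrative AttrakDiff scoring: items are 7-point semantic differentials
# (assumed here to be coded -3 .. +3) grouped into four scales; each scale
# is summarised by the mean of its items. Item-to-scale assignment must
# follow the actual questionnaire version used.

from statistics import mean

def attrakdiff_scales(responses_by_scale):
    """responses_by_scale maps scale name -> list of item scores (-3 .. +3)."""
    return {scale: mean(items) for scale, items in responses_by_scale.items()}

participant = {
    "PQ":  [2, 1, 2, 3, 1, 2, 2],    # pragmatic quality
    "HQI": [1, 0, 1, 2, 1, 1, 0],    # hedonic quality - identity
    "HQS": [0, -1, 1, 0, 1, 0, 0],   # hedonic quality - stimulation
    "ATT": [2, 2, 1, 2, 3, 2, 2],    # attractiveness
}
print(attrakdiff_scales(participant))
# approximately {'PQ': 1.86, 'HQI': 0.86, 'HQS': 0.14, 'ATT': 2.0}
```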
    66. 66. More…
    67. 67. How users perceive product features • Aesthetics and interaction features • Layout, content, functionality – When: before and during interaction • Personal expectations and needs – Record of mouse activity and mood state – AttrakDiff and EMG values – Kleiss and Lavie questionnaires 2016 68@ Sónia Sousa
    68. 68. What is the intention to use • Judgments of the product based on – Feelings towards the product 2016 69@ Sónia Sousa
    69. 69. First impressions • How users perceive product features – Aesthetic and symbolic aspects of products • Measurements – Visual and verbal tests 2016 70@ Sónia Sousa
    70. 70. Values & perceptions • How relevant do users find the product? – Pleasure, challenge, needs, expressions • Mood boards • Perceived quality • Efficiency • Effectiveness • Sensors 2016 71@ Sónia Sousa
    71. 71. Emotions & mood state – Positive outcomes • Emotions, joy, excitement, value – Affects – BMIS – MIS – RAS – Affect Grid – SAM – PrEmo 2016 72@ Sónia Sousa
