Desirability and Preference Testing - UPA International 2011

Speaker Notes

  • Impacts a product's or application's perceived: utility, usability, and credibility.
  • If users have a positive impression of the design aesthetics, they are more likely to overlook or forgive poor usability or limited functionality. With a negative impression, users are more likely to find fault with an interaction, even if a product’s overall usability is good and the product offers real value.
  • High desirability feeds into the motivational factors that help trigger target behavior.
  • The simplicity of the question doesn’t work well with larger numbers of design options, especially if some are highly similar
  • People can have a difficult time articulating what it is about a design they like or dislike
  • The whys are important for stakeholder acceptance (branding guidelines). Business sponsors and stakeholders often want substantial customer feedback to assure them a given direction is correct.
  • Triading: a qualitative interview technique that reveals constructs and elicits attributes that are important to users, in their own vocabulary. The researcher asks the participant to identify how two of three examples are different from the third.
    In typical user research interviews, a researcher asks participants about their thoughts on a defined list of topics. The disadvantage of this approach is that the researcher may be inquiring about topics that are of little value or significance to the experience of the participants. Generally, participants will dutifully answer questions about any topics we ask them about, without thinking more broadly, going beyond the limits our questions impose, or interrupting us to tell us about dimensions that may be more relevant to them. Participants assume researchers are interested in studying the particular topics they've included in their interview scripts and don't raise other issues that might be more pertinent to their overall experience with a product or potential design.
    Triading is a method that allows a researcher to uncover dimensions of a design space that are pertinent to its target audience. In triading, researchers present three different concepts or ideas to participants and ask them to identify how two of them are different from the third. Participants describe, in their own terms, the dimensions or attributes that differentiate the concepts. Participants follow this process iteratively, identifying additional attributes they feel distinguish two of the concepts from the third, until they can't think of any other distinguishing factors. By repeating this process across multiple participants, researchers can see trends that define audience segments or personas.
    The benefit of this process is that it uncovers dimensions of a particular domain that are important to the target audience rather than the researcher or designer. In addition, the dimensions participants identify are generally emotional aspects that are important for experience designers to consider. For example, participants may describe differences between groups as “warm” versus “cold” or “business-like” versus “fun.” Designers can then use the most relevant or common dimensions as inspiration for further design and exploration.
  • Benefits: straightforward and easy to administer on a large scale. Negatives: if you want to do more than pick a clear winner, and instead understand the emotional connections and reactions to each design, this method will not lend itself to that.
  • Obvious examples are consumer electronics or other retail products. Also appropriate for applications in healthcare, insurance, financial services, travel, etc.
  • Sensors track participants' physiological responses to particular designs. Changes in these measurements suggest a particular emotional response. Paired with attitudinal and self-reported survey measurements, they give a multifaceted view of emotional reactions to a design.
  • Respondents are asked: “To what extent do the feelings expressed by the characters correspond with your own feelings toward the stimulus?” Building on the responses of many people allows you to abstract valuable data pertaining to the emotional performance of your website, product, or service.
  • “luxurious, approachable, friendly, capable, multi-cultural/inclusive, established”
  • “My initial reaction to this web site is that it seems kind of plain. There is not much going on in the page, and the colors seem kind of drab.”
    “This is a nice looking website. It is well designed, well laid out, and is appealing to look at. It makes me want to continue to navigate the site to learn more.”
  • “Men don’t really go with children… where there’s a baby, there must be a mother.”
    “My initial reaction to the website is that it seems very clean and modern. I like the layout; it looks like it’s easy to find information.”
  • “I felt love. I saw a mother holding a child… that’s pretty touchy. The site looks good, and it makes the hospital trustworthy.”
    “My initial reaction was that the hospital is represented by a caring, warm and friendly website.”
  • Evolution of how the document is separated from the header and tools, with related topics on the right
  • In order of ascending practicality

Transcript

  • 1. Preference and Desirability Testing: Measuring Emotional Response to Guide Design
    Michael Hawley
    Chief Design Officer, Mad*Pow
    Paul Doncaster
    Senior User Experience Designer, Thomson Reuters
  • 2. Agenda
    Why we should care
    Why it’s not always as simple as asking: “Which option do you prefer?”
    Methods to consider
    Case Study: Greenwich Hospital
    Case Study: WestlawNext
    Summary/Comparison
  • 3. Why we should care
  • 4. Impressions Count
  • 5. An important role of visual design is to lead users through the hierarchy of a design as we intend
    For interactive applications, a sense of organization can affect perceived usability and, ultimately, users’ overall satisfaction with the product
    Functional Desirability
  • 6. Visceral Emotions
  • 7. Fogg’s Behavior Model
    Core motivators include:
    • Pleasure/pain
    • 8. Hope/fear
    • 9. Acceptance/rejection
    http://www.behaviormodel.org/
  • 10. Positioning Desirability Studies
    http://www.xdstrategy.com/2008/10/28/desirability_studies/
  • 11. “Which do you prefer?”
  • 12. Quantity, granularity breed apathy
  • 13. Poor articulation
    “It reminds me of…”
    “It’s nice and clean.”
    “There’s just something about it . . .”
    “I ordinarily don’t like red, but for some reason it works here . . .”
    “It’s better than the other ones.”
  • 14. What Stakeholders Should Care About
    “We should go with design C over A and B, because I feel it evokes the right kind of emotional response in our audience that is closer to our most important brand attributes.”
  • 15. Methods to Consider
  • 16. Present three different concepts or ideas to participants, and ask them to identify how two of them are different from the third and why.
    Triading
  • 17. Broad, experience-based questionnaires that also include questions relating to visual appeal and aesthetics
    • SUS (System Usability Scale)
    • 18. QUIS (Questionnaire for User Interface Satisfaction)
    • 19. WAMMI (Website Analysis and Measurement Inventory)
    Qualitative Questionnaires
  • 20. Show participants a user interface for a very brief moment, then take it away.
    Participants recall their first impression, then moderator probes for meaning.
    • Helpful for layout decisions, prominence of content, labels
    • 21. www.fivesecondtest.com
    Attention designers: You have 50 milliseconds to make a good first impression
    Quick Exposure Memory Tests
  • 22.
    • Electroencephalography (EEG): Brain activity
    • 23. Electromyography (EMG):
    Muscles and Excitement
    • Electrodermal Activity (EDA): Sweat, Excitement
    • 24. Blood Volume Pressure (BVP): Arousal
    • 25. Pupil Dilation: Arousal and Mental Workload
    • 26. Respiration:
    Negative Valence or Arousal
    Physiological and Neurological
  • 27. PrEmo Emotional Measurement
    Dr. Pieter Desmet, Delft University of Technology
    http://www.premo-online.com
  • 28. Product Reaction Cards
    http://www.microsoft.com/usability/uepostings/desirabilitytoolkit.doc
  • 29. Case Study: Greenwich Hospital
  • 30. Determine intended brand attributes (and their opposites)
    Product Reaction Cards: Before You Begin
    Leverage existing marketing/brand materials
    Alternatively, stakeholder brainstorm to identify key brand attributes/descriptors using full list of product reaction cards as a start
    Tip: “If the brand was a person, how would it speak to your customers?”
  • 31. Methodology
    Include 60/40 split of positive and negative words
    Target 60 words, optimized to test brand
    Simple question: “Which of the following words do you feel best describe the site/design/product (please select 5):”
    One comp per participant, or multiple comps per participant (no more than 3)
    Participants
    Qualitative: Paired with usability testing
    Quantitative: Target minimum of 30 per option if possible
    Product Reaction Cards: Conducting
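
The deck-construction step above is easy to script. The following is a minimal Python sketch, assuming small illustrative word lists (not the full 118-word Microsoft Desirability Toolkit set), of sampling a deck with the recommended 60/40 positive/negative split:

    import random

    # Illustrative candidate words; a real study would start from the full
    # Microsoft Desirability Toolkit list, trimmed to fit the brand.
    POSITIVE = ["Appealing", "Clean", "Friendly", "Professional",
                "Trustworthy", "Modern", "Inviting", "Usable"]
    NEGATIVE = ["Busy", "Confusing", "Dated", "Dull", "Impersonal",
                "Intimidating"]

    def build_deck(positive, negative, size=60, positive_share=0.6, seed=None):
        """Sample a card deck with the requested positive/negative split."""
        rng = random.Random(seed)
        n_pos = round(size * positive_share)
        n_neg = size - n_pos
        if n_pos > len(positive) or n_neg > len(negative):
            raise ValueError("Not enough candidate words for the requested deck")
        deck = rng.sample(positive, n_pos) + rng.sample(negative, n_neg)
        rng.shuffle(deck)  # randomize order so valence isn't visually grouped
        return deck

    # Demonstration with a 10-card deck; the slide targets 60 words.
    print(build_deck(POSITIVE, NEGATIVE, size=10, seed=1))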
  • 32. Process: Analyzing
    Calculate percentage of positive and negative attributes per design
    Visualize overall sentiment of feedback using “word clouds” (see wordle.net)
    68% Positive
    32% Negative
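
To make the analysis step concrete, here is a hypothetical sketch (the selections are invented, not the study's data) that computes the positive/negative percentages per design and the word frequencies a word-cloud tool such as wordle.net would consume:

    from collections import Counter

    # Words tagged as positive in the deck; everything else counts as negative.
    POSITIVE_WORDS = {"Appealing", "Clean", "Friendly", "Professional",
                      "Trustworthy"}

    # selections[design] = all words picked across participants (5 per person)
    selections = {
        "Concept 1": ["Clean", "Friendly", "Dated", "Clean", "Trustworthy"],
        "Concept 2": ["Busy", "Confusing", "Clean", "Dated", "Friendly"],
    }

    for design, words in selections.items():
        counts = Counter(words)
        n_pos = sum(c for w, c in counts.items() if w in POSITIVE_WORDS)
        pct_pos = 100.0 * n_pos / len(words)
        print(f"{design}: {pct_pos:.0f}% positive, {100 - pct_pos:.0f}% negative")
        # counts.most_common() is the frequency list to feed a word cloud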
  • 33.
    • Align the website with the character of the Hospital
    • 34. Update the site after nearly 10 years
    • 35. Counter impressions that Greenwich offers only maternity and elder care
    • 36. Communicate that they are long-standing members of the community
    Case Study: Greenwich Hospital Website Redesign
  • 37. Case Study: Greenwich Hospital Website Redesign
    • 3 visually designed comps
    • 38. 50 people reacted to each comp (quantitative) via survey
    • 39. Additional feedback obtained via participant interviews (qualitative)
    Survey Questions
    Hello, I am requesting feedback on a website I am working on. Your answers let me know if the site is conveying the right feel.
    1. What are your initial reactions to the web site?
    2. Which of the following words do you feel best describe the site (select 5):
  • 40. Three Different Visual Designs
  • 41. Results: Concept 1
    88% Positive
    12% Negative
  • 42. Results: Concept 2
    87% Positive
    13% Negative
  • 43. Results: Concept 3
    95% Positive
    5% Negative
  • 44.
    • Mix of qualitative and quantitative is key
    • 45. Qualitative helps provide color to the results
    • 46. Quantitative resonates with stakeholders and executives
    • 47. Position results as one form of input to the decision-making process, not a declaration of a “winner”
    • 48. Simple, cost-efficient way to assess audience’s emotional response to a design
    Lessons Learned
  • 49. Case Study: WestlawNext
    UX Research Team:
    Drew Drentlaw
    Shannon O’Brien
    Bill Quie
    November Samnee
  • 50.
  • 51. Goals for Phase 1
    • Use large sample sizes to establish a design “baseline,” from which to advance the design direction in subsequent iterations
    • 52. Isolate preference trends for specific page design aspects
    • 53. Determine tolerance for manipulation of the site “brand”
    • 54. Maintain tight security
  • 55. Sessions were held in 4 cities over 5 days
    Seattle
    Denver
    Memphis
    Minneapolis-St. Paul
    4 sessions were held per day, with a maximum of 25 participants per session
    1.5 hours allotted per study, most participants finished in less than 1 hour
    319 participants successfully completed their sessions
    Phase 1: Logistics & Execution
  • 56. Participants completed the study at individual workstations at their own pace
    All workstations included a 20” monitor, at 1024x768 resolution
    Phase 1: Logistics & Execution
    Memphis, TN, May 2009
  • 57. Brief review of Westlaw critical screens
    Positive/negative word selection to describe Westlaw
    Positive/negative product descriptors
  • 58. Each set of element variations was viewed in full screen
    Participant selects “top choice” by dragging a thumbnail image to a drop area
    Homepage: Design Elements
  • 59.
  • 60. Homepage: Design Elements (1)
    All options viewed in full screen
    Participant selects “top choice” by dragging a thumbnail image to a drop area
  • 61.
  • 62.
  • 63. Visual Weight (6 options)
    Use of Imagery (8 options)
    Components (4 options)
    Search Area (4 options)
    Palette (10 options)
    Homepage: Design Elements
  • 64. 19 HP designs viewed in full screen (randomized)
    All 19 options are presented again; participant assigns a rating using a 10-point slider.
    Top 5 and Bottom 2 choices are positioned in order of rating values on one long, scrollable page. Next to each design displayed, the participant rates key aspects on a 5-point scale.
    Homepage: Design Gallery
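
As a sketch of how the gallery ratings could be aggregated (the ratings below are invented for illustration), rank designs by mean 10-point slider score and surface the Top 5 and Bottom 2, mirroring the flow described above:

    from statistics import mean

    # ratings[design_id] = 10-point slider values collected across participants
    ratings = {
        "HP16": [9, 8, 10, 7], "HP15": [8, 9, 7, 8], "HP8": [7, 8, 8, 6],
        "HP5":  [6, 7, 5, 8],  "HP10": [5, 6, 7, 5], "HP3": [3, 4, 2, 5],
        "HP12": [2, 3, 4, 3],
    }

    ranked = sorted(ratings, key=lambda d: mean(ratings[d]), reverse=True)
    print("Top 5:", ranked[:5])      # shown first on the review page
    print("Bottom 2:", ranked[-2:])  # shown last on the review page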
  • 65.
  • 66.
  • 67.
  • 68.
  • 69.
  • 70. Repeat the process for Results List design:
    New Results List
    • Design Elements
    • 71. Column Collapsing (4 options)
    • 72. Column Separation (2 options)
    • 73. Theme/Color (8 options)
    • 74. Design Gallery
    • 75. 14 Results List designs (randomized)
    • 76. Key Aspects Rated
    • 77. Color scheme
    • 78. Global Header
    • 79. Summary and Excerpt (list contents)
    • 80. Filters design (left column)
    • 81. Overall look and feel
  • Repeat the process for Document Display design:
    New Document Display
    • Design Elements
    • 82. Tabs vs. Links (4 options)
    • 83. Background Separation (4 options)
    • 84. Margin Width (3 options)
    • 85. Font Size (12 options)
    • 86. Locate (2 options)
    • 87. Design Gallery
    • 88. 9 Document Display designs (randomized)
    • 89. Key Aspects Rated
    • 90. Color scheme
    • 91. Layout of content
    • 92. Text formatting
    • 93. Overall look and feel
  • “Based on the designs I’ve liked most today . . .”
    Positive/negative design descriptors
  • 94. Results were analyzed across 8 different sample filters
    The top picks were surprisingly consistent across all of the ‘Top 5’ lists analyzed
    High-level Results
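
A minimal sketch of that slicing, with invented records: tally each filter group's top picks, then read off a per-group Top 5:

    from collections import Counter, defaultdict

    # Each record pairs a participant's filter value (here, job title)
    # with their top-choice design; data invented for illustration.
    records = [("Associate", "HP16"), ("Associate", "HP1"), ("Partner", "HP5"),
               ("Librarian", "HP16"), ("Partner", "HP16"), ("Associate", "HP16")]

    by_filter = defaultdict(Counter)
    for job, pick in records:
        by_filter["Overall"][pick] += 1  # the unfiltered view
        by_filter[job][pick] += 1

    for group, counts in by_filter.items():
        top5 = [design for design, _ in counts.most_common(5)]
        print(f"{group}: {top5}")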
  • 101. Top Homepage Designs by Job Title
    Top 5 picks out of 19 possible, per job-title filter: Overall (319), Associate (189), Librarian (37), Partner (81), Solo Practitioner (5).
    [Slide table of per-group Top 5 rankings; designs HP1, HP5, HP6, HP7, HP8, HP10, HP13, HP14, HP15, HP16, and HP19 appear across the lists, with HP16 and HP15 recurring near the top in most groups.]
  • 102. Home Page (19)
    HP16 & HP15 designs consistently placed in the Top 5 across all filters
    Results List (14)
    RL4 consistently placed in the Top 3 across all sample filters, and was the #1 choice for 80% of all participants
    Document Display (9)
    DD3 placed in the Top 5 across all sample filters and was the #1 choice for 77% of all participants
    Phase 1: High-level Results
  • 103. Note: participants were asked to describe the current Westlaw before being shown the new designs.
    Phase 1: Word Selection Results
  • 104. 5 design themes were derived from post-session discussions
    • “New design(s) are better than current Westlaw”
    • 105. “Clean and Fresh”
    • 106. “Contrast is Important”
    • 107. “Prefer Westlaw Blue”
    • 108. “No Big Fonts Please”
    The study narrowed the list of potential designs, and we better understood which design elements Westlaw users liked and disliked.
    Phase 1: High-level Results
  • 109. Phase 2: September 2009
    Kansas City, MO, Sept 2009
  • 110. Goals
    • Refine preferences for selected design directions
    • 111. Understand users’ personal reasons for liking their preferred choices
    • 112. Get closure on other design options for online and printed content
    • 113. Sustain tight security
    Tool
    • Same as in Round 1, with some minor revisions to accommodate specialized input
    Phase 2: September 2009
  • 114. Method
    View, Rate, and Pick Top Choice for
    Homepage (3 options)
    Result List (2 options)
    Document Display (2 options)
    “Why?”
    Simple preference selection for two unresolved UI design issues
    Citing References: Grid display or List display?
    Out of Plan Indication design (6 options)
    Type formatting preferences for 3 different content types
    Font Face
    Font Size
    Margin Width
    Phase 2: September 2009
  • 115. Logistics
    3 cities (Philadelphia, Kansas City, Los Angeles)
    1 Day
    226 participants
    Analysis
    Filters (8 categories) were used to score the designs for each visual preference
    Results
    Clear choices for top designs in all categories
    “Why” feedback shed new light on designs under consideration and helped focus “homestretch” design activities
    Phase 2: September 2009
  • 116. Home Page (3)
    HP3 ranked #1 in 94% of filter groups (54% of total participants)
    Results List (2)
    RL5 ranked #1 in 97% of filter groups (58% of total participants)
    Document Display (2)
    DD7 ranked #1 in 94% of filter groups (61% of total participants)
    Phase 2: High-level Results
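
Note that the two percentages above use different denominators: the share of filter groups in which a design ranked #1 versus the share of all participants who picked it first. A small sketch of the filter-group metric, with invented group results:

    from collections import Counter

    # Each filter group's #1-ranked design (invented for illustration)
    group_top_pick = {
        "Associate": "HP3", "Partner": "HP3", "Librarian": "HP3",
        "Solo Practitioner": "HP1",
    }

    shares = Counter(group_top_pick.values())
    for design, n in shares.items():
        pct = 100 * n / len(group_top_pick)
        print(f"{design} ranked #1 in {pct:.0f}% of filter groups")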
  • 117. The main concerns regarding Homepage Design HP3
    Search Box
    Too small
    How do I do a Terms-and-Connectors search?
    Browse Section
    How do I specify multiple or specific search content?
    Poor organization
    Poor label
    Need access to “often-used” content
    Need better access to help
    Participant Comments: Homepage
  • 118. Goals
    Get feedback on branding options from decision makers and those who influence purchase of the product
    Get closure on final outstanding design issues
    Tool
    Same as in Rounds 1 & 2, with some minor revisions to accommodate specialized input
    Phase 3: December 2010
  • 119. Method
    Wordmark/Branding
    View wordmark color combinations and design elements against different backgrounds, pick top choice and provide comments
    Make a final “Top Choice” from all selections
    Simple preference selection for outstanding UI design issues
    Header Space: Tile or No Tile?
    Notes Design
    Location: Inline or Column?
    State: Open or Closed?
    Headnote Icon design (4 variations)
    Phase 3: December 2010
  • 120. What color combination do you prefer? Please rank the 4 combinations below according to your preferences. To rank, click and drag an item from the left to a box on the right.
    Your Most Liked
    1
    2
    3
    4
    Your Least Liked
  • 121. Logistics
    3 cities (Seattle, Denver, Boston)
    1 Day
    214 participants
    Analysis
    Simple preference, no advanced filters
    Results
    Decision-makers confirmed that critical brand elements should be retained
    Phase 3: December 2010
  • 122. Decision Makers’ Picks (1 of 2)
  • 123.
  • 124. Measuring Emotional Response to Guide Design
    Why it succeeded
    • Quantitative & qualitative data to identify preference trends
    • 125. “Slicing” across identifiable filters
    • 126. Emphasis on “gut-level” reactions
    • 127. Intolerance for manipulation of product brand
    • 128. Rapid turnaround of data to all stakeholders
    • 129. Executive
    • 130. Design
    • 131. Development
  • Document Display
    May 2009
    Sept 2009
    Feb 2010
  • 132. At what cost(s)?
    • We held off asking “why” until the second round
    • 133. If we had asked why in the first round, we might have
    • 134. avoided some of the internal design battles
    • 135. gotten more granular ammunition for communicating the design vision to stakeholders
    • 136. “Need for speed” attained at the cost of detailed analysis
    Retrospective
  • 137. Recommendations for anyone thinking of undertaking something like this
    • Procure a “Matt” to create and administer your tool
    • 138. Get a good technical vendor for on-site
    • 139. Report results in as close to real time as possible on a wiki or other web page
    Retrospective
  • 140. Summary/Comparison
  • 141. Both groups valued support in design decision making
    Align methodology with needs of the project
    Research-inspired, not research-decided
    Summary/Comparison
  • 142. Additional Reading and Tools
  • 143. Benedek, Joey and Trish Miner. “Measuring Desirability: New Methods for Evaluating Desirability in a Usability Lab Setting.” Proceedings of UPA 2002 Conference, Orlando, FL, July 8–12, 2002. http://www.microsoft.com/usability/uepostings/desirabilitytoolkit.doc
    Lindgaard, Gitte, Gary Fernandes, Cathy Dudek, and J. Brown. "Attention Web Designers: You Have 50 Milliseconds to Make a Good First Impression!" Behaviour and Information Technology, 2006. http://www.imagescape.com/library/whitepapers/first-impression.pdf
    Rohrer, Christian. “Desirability Studies: Measuring Aesthetic Response to Visual Designs.” xdStrategy.com, October 28, 2008. Retrieved February 10, 2010. http://www.xdstrategy.com/2008/10/28/desirability_studies
    Additional Reading
  • 144. User Focus. "Measuring satisfaction: Beyond the Usability Questionnaire." Retrieved February 10, 2010. http://www.userfocus.co.uk/articles/satisfaction.html
    UserEffect. "Guide to Low-Cost Usability Tools." Retrieved May 12, 2010. http://www.usereffect.com/topic/guide-to-low-cost-usability-tools
    Tullis, Thomas and Jacqueline Stetson. “A Comparison of Questionnaires for Assessing Website Usability.” Usability Professionals’ Association Conference, 2004. home.comcast.net/~tomtullis/publications/UPA2004TullisStetson.pdf
    Westerman, S. J., E. Sutherland, L. Robinson, H. Powell, and G. Tuck. “A Multi-method Approach to the Assessment of Web Page Designs.” Proceedings of the 2nd International Conference on Affective Computing and Intelligent Interaction, 2007. http://portal.acm.org/citation.cfm?id=1422200
    Additional Reading
  • 145. Five Second Test: http://fivesecondtest.com/
    Feedback Army: http://www.feedbackarmy.com
    Wordle: http://www.wordle.net
    PrEmo: http://www.premo-online.com
    Additional Tools