Desirability Testing: Analyzing Emotional Response to a Design


In the design process we follow, once we have defined the conceptual direction and content strategy for a given design and refined our approach through user research and iterative usability testing, we start applying visual design. Generally, we take a key screen whose structure and functionality we have finalized—for example, a layout for a home page or a dashboard page—and explore three alternatives for visual style. These three alternative visual designs, or comps, include the same content, but reflect different choices for color palette and imagery. The idea is to present business owners and stakeholders with different visual design options from which they can choose.

Sometimes there is a clear favorite among stakeholders or an option that makes the most sense from a brand perspective. However, there can often be disagreements among the members of a project team on which direction to choose. If we’ve done our job right, there are rationales for our various design decisions in the different comps, but even so, there may be disagreement about which rationale is most appropriate for the situation.
As practitioners of user-centered design, it is natural for us to turn to user research to help inform and guide the process of choosing a visual design. But traditional usability testing and related methods don’t seem particularly well suited for assessing visual design for two reasons:

1. When we reach out to users for feedback on visual design options, stakeholders are generally looking for large sample sizes—larger than are typical for a qualitative usability study.
2. The response we are looking for from users is more emotional—that is, less about users’ ability to accomplish tasks and more about their affective response to a given design.

With this in mind, we were very interested in articles we saw on Desirability Testing. In one article, the author posits desirability testing as a mix of quantitative and qualitative methods that allows you to assess users’ attitudes toward aesthetics and visual appeal. Inspired by this overview, we researched desirability studies further and tried a modified version of the techniques on one of our projects. This presentation reviews the variants of desirability testing that we considered and the lessons we learned from a desirability study of visual design options for one of our projects. Interestingly, we found that while desirability testing did help us better understand participants’ self-reported emotional responses to a visual design, it also helped us identify other key areas of the experience that could be improved.

Desirability Testing: Analyzing Emotional Response to a Design

  1. Rapid Desirability Testing: Analyzing Emotional Response to a Design (On a Budget). Boston UPA. Prepared by Michael Hawley, VP Experience Design, and Megan Grocki, Senior Experience Designer. June 9, 2009
  2. Agenda
     • Introduction
     • The Situation
     • Desirability Testing Overview
     • Methods Considered
     • Our Selected Process
     • Case Study
     • Lessons Learned
  3. About Mad*Pow
  4. The Situation
  5. The Situation: Visual Designs Applied to Wireframe
  6. The Situation: Visual Designs Applied to Wireframe
  7. Desirability Testing Overview
  8. What Is Desirability Testing?
     A collection of research methods intended to assess a target audience's emotional response to a design or stimulus.
     What It Is: a measure of how closely a stimulus achieves the desired emotional response.
     What It Is Not: a measure of how much people like something, or a way of figuring out which option is "best."
  9. Positioning Desirability Studies
  10. Why Is It Important?
      First impressions of a design impact a product's or application's perceived utility, usability, and credibility. (Diagram: Functionality, Usability, Aesthetics)
  11. Methods Considered
  12. Triading
      Definition: Present three different concepts or ideas to participants and ask them to identify how two of them differ from the third, and why.
  13. Quantitative Questionnaires
      Definition: Broad, experience-based questionnaires that also include questions relating to visual appeal and aesthetics:
      • SUS (System Usability Scale)
      • QUIS (Questionnaire for User Interface Satisfaction)
      • WAMMI (Website Analysis and Measurement Inventory)
  14. Quick Exposure Memory Tests
      Show participants a user interface for a very brief moment, then take it away. Participants recall their first impression, then the moderator probes for meaning.
      • Helpful for layout decisions, prominence of content, even labels
      "Attention designers: You have 50 milliseconds to make a good first impression"
  15. Physiological and Neurological Measurements
      Definition: Sensors track participants' physiological responses to particular designs; changes in these measurements suggest a particular emotional response. Paired with attitudinal and self-reporting surveys, the measurements give a multifaceted view of emotional reactions to a design.
      • Electroencephalography (EEG): brain activity
      • Electromyography (EMG): muscle activity and excitement
      • Electrodermal Activity (EDA): sweat, excitement
      • Blood Volume Pressure (BVP): arousal
      • Pupil Dilation: arousal and mental workload
      • Respiration: negative valence or arousal
  16. PrEmo Emotional Measurement
      http://www.premo-online.com
      Dr. Pieter Desmet, Technical University of Delft
  17. Product Reaction Cards (Our Selected Approach)
  18. Product Reaction Cards Method
  19. Before You Begin
      Determine intended brand attributes (and their opposites):
      1. Leverage existing marketing/brand materials
      2. Alternatively, hold a stakeholder brainstorm to identify key brand attributes/descriptors, using the full list of product reaction cards as a starting point
      3. Tip: If the brand were a person, how would it speak to your customers?
  20. Process: Conducting
      Methodology
      1. Include a 60/40 split of positive and negative words
      2. Target 60 words, optimized to test the brand
      3. Ask a simple question: "Which of the following words do you feel best describe the site/design/product? (Please select 5.)"
      4. Show one comp per participant, or multiple comps per participant (no more than 3)
      Participants
      1. Qualitative: paired with usability testing
      2. Quantitative: target a minimum of 30 per option if possible
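Assembling the test deck with the 60/40 positive/negative split described above can be sketched in a few lines of Python. The word lists here are illustrative stand-ins, not the actual 118-card Microsoft Product Reaction Card deck, and `build_card_set` is a hypothetical helper name:

```python
import random

# Illustrative subsets only -- the full Microsoft Product Reaction Card
# deck contains 118 words; these few are stand-ins for this sketch.
POSITIVE = ["Approachable", "Calm", "Clean", "Friendly", "Inviting",
            "Modern", "Professional", "Trustworthy", "Warm", "Usable"]
NEGATIVE = ["Busy", "Cluttered", "Cold", "Dated", "Dull",
            "Impersonal", "Intimidating"]

def build_card_set(n_total=60, positive_ratio=0.6, seed=None):
    """Sample a test deck with roughly a 60/40 positive/negative split."""
    rng = random.Random(seed)
    n_pos = round(n_total * positive_ratio)
    n_neg = n_total - n_pos
    # Sample without replacement, capped at the size of each pool.
    cards = (rng.sample(POSITIVE, min(n_pos, len(POSITIVE))) +
             rng.sample(NEGATIVE, min(n_neg, len(NEGATIVE))))
    rng.shuffle(cards)  # random order so position doesn't bias choices
    return cards

deck = build_card_set(n_total=10, seed=42)
print(deck)
```

In practice you would replace the stand-in pools with the words chosen during the stakeholder brand-attribute exercise, keeping the ratio intact.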
  21. Process: Analyzing
      1. Calculate the percentage of positive and negative attributes per design (e.g., 68% positive, 32% negative)
      2. Visualize the overall sentiment of the feedback using word clouds (a word list spreadsheet is available online)
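The analysis step, tallying positive versus negative selections for one comp, is simple enough to sketch directly. The `VALENCE` lookup and the sample picks are hypothetical illustrations, not data from the study:

```python
from collections import Counter

# Hypothetical valence lookup mapping each card to its polarity.
VALENCE = {
    "Clean": "positive", "Friendly": "positive", "Trustworthy": "positive",
    "Modern": "positive", "Warm": "positive",
    "Cluttered": "negative", "Dull": "negative", "Dated": "negative",
}

def sentiment_breakdown(selections):
    """Given all words picked across participants for one comp, return
    (percent positive, percent negative, per-word frequencies)."""
    counts = Counter(selections)
    total = sum(counts.values())
    positive = sum(n for w, n in counts.items() if VALENCE.get(w) == "positive")
    pct_pos = round(100 * positive / total)
    return pct_pos, 100 - pct_pos, counts

# Each participant picked 5 words; flatten all picks for one design:
picks = ["Clean", "Friendly", "Clean", "Modern", "Dull",
         "Trustworthy", "Warm", "Cluttered", "Clean", "Friendly"]
pos, neg, freq = sentiment_breakdown(picks)
print(f"{pos}% positive / {neg}% negative")  # → 80% positive / 20% negative
```

The `freq` counter is exactly the kind of word-frequency input a word-cloud tool such as Wordle expects for the visualization step.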
  22. Case Study: Greenwich Hospital Website Redesign
  23. Case Study: Greenwich Hospital Website Redesign
      Background and Goals
      • Align the website with the character of Greenwich Hospital: luxurious, approachable, friendly, capable, multi-cultural/inclusive, established
      • Update the site after nearly 10 years
      • Convey that Greenwich Hospital is about more than just maternity and elder care, without damaging those associations
      • Communicate that they are long-standing members of the community
  24. Case Study: Greenwich Hospital Website Redesign
      Methodology
      • 3 visually designed comps
      • 50 people reacted to each comp via survey (quantitative)
      • Additional feedback obtained via participant interviews (qualitative)
      Survey Questions
      "Hello, I am requesting feedback on a website I am working on. Your answers let me know if the site is conveying the right feel."
      1. What are your initial reactions to the web site?
      2. Which of the following words do you feel best describe the site? (Please select 5.)
  25. Three Different Visual Designs
  26. Results: Concept 1 (88% positive, 12% negative)
      "My initial reaction to this web site is that it seems kind of plain. There is not much going on in the page, and the colors seem kind of drab."
      "This is a nice looking website. It is well designed, well laid out, and is appealing to look at. It makes me want to continue to navigate the site to learn more."
  27. Results: Concept 2 (87% positive, 13% negative)
      "Men don't really go with children… where's a baby, there must be a mother."
      "My initial reaction to the website is that it seems very clean and modern. I like the layout; it looks like it's easy to find information."
  28. Results: Concept 3 (95% positive, 5% negative)
      "I felt love. I saw a mother holding a child… that's pretty touchy. The site looks good, and it makes the hospital trustworthy."
      "My initial reaction was that the Hospital is represented by a caring, warm and friendly website."
  29. Lessons Learned
  30. Lessons Learned
      Methodology
      • A mix of qualitative and quantitative is key: qualitative helps provide color to the results, while quantitative resonates with stakeholders and executives
      • Position results as one form of input to the decision-making process, not as declaring a winner
      • A simple, cost-efficient way to assess an audience's emotional response to a design
  31. Key Takeaways
      The Challenge:
      • Measuring emotional responses to a design is important, but complex. Experiences of a visual design are multifaceted, and a number of design aspects can impact users' response to a product.
      • There are a number of alternative methods available to measure emotional response
      Our Experience:
      • Leveraging Product Reaction Cards provides a low-cost, low-effort means to help us align aesthetics and general feel with desired brand attributes
  32. Thank You
      Documentation: case study results and full presentation slides
      Have a question? Michael Hawley @hawleymichael • Megan Grocki @megangrocki • 603-436-7177
  33. Additional Reading
      • Benedek, Joey, and Trish Miner. "Measuring Desirability: New Methods for Evaluating Desirability in a Usability Lab Setting." Proceedings of the UPA 2002 Conference, Orlando, FL, July 8–12, 2002.
      • Lindgaard, Gitte, Gary Fernandes, Cathy Dudek, and J. Brown. "Attention Web Designers: You Have 50 Milliseconds to Make a Good First Impression!" Behaviour and Information Technology, 2006.
      • Rohrer, Christian. "Desirability Studies: Measuring Aesthetic Response to Visual Designs." October 28, 2008. Retrieved February 10, 2010.
  34. Additional Reading (continued)
      • User Focus. "Measuring Satisfaction: Beyond the Usability Questionnaire." Retrieved February 10, 2010.
      • "Guide to Low-Cost Usability Tools." Retrieved May 12, 2010.
      • Tullis, Thomas, and Jacqueline Stetson. "A Comparison of Questionnaires for Assessing Website Usability." Usability Professionals Association Conference, 2004.
      • Westerman, S. J., E. Sutherland, L. Robinson, H. Powell, and G. Tuck. "A Multi-method Approach to the Assessment of Web Page Designs." Proceedings of the 2nd International Conference on Affective Computing and Intelligent Interaction, 2007.
  35. Additional Tools
      • Five Second Test
      • Feedback Army: http://www.feedbackarmy.com
      • Wordle: http://www.wordle.net
      • PrEmo: http://www.premo-online.com