Delivering Results: How Do You Report User Research Findings?

The long, textual written report is dead, isn’t it? So how do you deliver your findings to your clients? Is it PowerPoint? An email? A spreadsheet? Post-it notes? And what do you include? Positive findings? Screenshots with callouts? Just issues? Or recommendations as well? Are they prioritized? If you ask our panelists, some of us have developed templates that we use and modify for each research activity, and others change the deliverable based on the activity and client.


Delivering Results: How Do You Report User Research Findings?

  1. Delivering Results: How Do You Report User Research Findings? (May 7, 2012) Jen McGinn, Eva Kaniasty, Dharmesh Mistry, Kyle Soucy, Carolyn Snyder, Steve Krug, Bob Thomas
  2. Delivering Results: How Do You Report User Research Findings? The long, textual written report is dead, isn't it? So how do you deliver your findings to your clients? Is it PowerPoint? An e-mail? A spreadsheet? Post-it notes? And what do you include? Positive findings? Screenshots with callouts? Just issues? Or recommendations as well? Are they prioritized?
  3. Panelists. If you ask our panelists, some of us have developed templates that we use and modify for each research activity, and others change the deliverable based on the activity and client. Each panelist will spend 3-5 minutes showing you their typical deliverables, and then we'll open the floor for audience Q&A. Jen McGinn, Principal Usability Engineer, Oracle; Eva Kaniasty, Founding Principal, RedPill UX; Dharmesh Mistry, Usability Specialist, Acquia; Kyle Soucy, Founding Principal, Usable Interface; Carolyn Snyder, Founding Principal, Snyder Consulting; Steve Krug, Founding Principal, Advanced Common Sense
  4. Jen McGinn, Principal Usability Engineer, Oracle
  5. Overview. I've worked at hardware and software companies, conducting research on phones, Macs, and PCs. I present my research results in one of two ways, neither of which is a long, written report in Word. RITE-Krug study: bullet points at the bottom of a wiki page. Traditional study: slides in a 60-minute meeting (generally remote, via web conference). I'm going to spend 10 seconds showing you a wiki page, and 2 minutes walking you through the structure of one of my PowerPoint presentations. Then I'll summarize the take-aways.
  6. What I call RITE-Krug Testing for Agile. 3 or 4 participants. Prototype will likely change between participants. Stakeholders attend every session and a debrief meeting in a single day. After the debrief meeting, a list of items that the designers will change is posted on the wiki page.
  7. (Screenshot of a wiki page)
  8. Executive Summary. In [when?], the [what product?] was tested by [number and type of participants] in [method type] to evaluate the ease of use of several features including [features or use cases]. High-level findings included [usually a total of 3 to 4 bullets]: [1-2 biggest positive findings]; [1-2 biggest positive findings]; [2 or 3 biggest usability issues]; [2 or 3 biggest usability issues]. This presentation covers all of the findings and subsequent recommendations.
  9. Agenda. Goals; Tasks; Participants; Findings; Recommendations; Next Steps.
  10. Goals. Evaluate the usability of the following features of the U-Haul.com website: Are users confused about how to price a rental? A storage unit? How do users react to the insurance options? Do they understand the coverage? How do users feel about the presentation of items for purchase or for rent? How effective is the shopping cart content? Are users confused by when they need to pay for items? Do users value the star ratings? U-Haul brand? How do users feel about the targeted FAQ and search result pages? Does our online documentation help prevent calls to the service center? Can they determine how to reach out to the U-Haul vendor nearest them?
  11. Tasks. 1. Get the price of a 1-way move across country. 2. Find a specific piece of information in the FAQ. 3. Determine the size and cost of a storage unit needed to hold specific items. 4. Find the phone number of a U-Haul location. 5. Book the truck (and insurance), adding rental items and purchased items. 6. Determine insurance coverage. 7. Find the U-Haul location nearest you.
  12. Participants (ID, gender, age, occupation, web-savvy):
      U1: Male, 24, Missionary, Average
      U2: Male, 52, Small business manager, Average
      U3: Female, 62, Retired (formerly television news producer, then licensed paralegal), Average
      U4: Female, 36, Housewife, Average
      U5: Male, 31, Sales and marketing, Average
  13. Findings
  14. Choosing a Truck (annotated screenshot). Callouts: 2 participants had this issue and did x to work around it; another issue; one participant suggested this fix.
  15. Goals and Questions Revisited. [All the same as before] Are users confused about how to price a rental? A storage unit? How do users react to the insurance options? Do they understand the coverage? How do users feel about the presentation of items for purchase or for rent? How effective is the shopping cart content? Are users confused by when they need to pay for items? Do users value the star ratings? U-Haul brand? How do users feel about the targeted FAQ and search result pages? Does our online documentation help prevent calls to the service center? Can they determine how to reach out to the U-Haul vendor nearest them?
  16. Positive Findings [these always come first]. All participants easily found the links to the FAQs and had no trouble finding the answer to the license question under FAQs. All participants made use of the maps when comparing options. All participants did scroll down to compare prices, locations and reviews. 4 participants valued the presence of the [higher] star ratings. 2 participants valued the U-Haul location more than the off-brand vendors. 2 participants were pleased that the truck rental page "retained her information" -- the addresses and dates. 2 participants appreciated the visuals of the items inside the storage units and the graphic of the person shown in the small unit icon. 2 participants easily added the dolly, blankets and boxes during the truck rental task flow.
  17. Recommendations (priority, description, recommendation, location):
      High: Participants don't understand what the purchased insurance actually covers. Re-format coverage and exclusions into bulleted lists; don't use legal jargon. (Damage coverage)
      High: Participants have a very hard time estimating the storage unit size that would meet their needs. Provide more user assistance. (Self Storage location details page)
      Medium: Up-sell process for items to rent or purchase is confusing. Put the purchased items into another page in the flow, and make it clearer that users can opt out. (Additional rental items, Shopping cart)
      Medium: Participants are concerned that the site is incorrectly calculating the mileage and therefore overcharging. Add a link to display the map, so they can check it in place. (Select your preferred pickup location)
      Low: Participants were not sure what location the giant thumbtack/pin was (address or zip code) or how far away the locations were. Display the distance "from" the specified location, like the Self-storage results page. (Select your preferred pickup location, Location)
  18. Next Steps. Work with [which stakeholders or teams] to prioritize changes. Work with [stakeholders or teams] to design alternatives. Validate that the new designs address the issues with users.
  19. Summary. Tell them what you're going to tell them: executive summary; agenda; goals/questions. Tell them: tasks & participants (sometimes methodology); animated slides for progressive disclosure; screen shots annotated with findings. Tell them what you told them: review goals of the research and the questions they were intended to answer; positive findings (go slowly here); prioritized opportunities for improvement.
  20. Eva Kaniasty, Founding Principal, RedPill UX
  21. Report Formats. PPT: visually engaging but real-estate constrained (and will force you to be brief). Formatting can be time-consuming. MS Word/Narrative: more room for context; quick, but can appear dry and boring. 3rd option: no report.
  22. Deciding Factors. Time/Budget; (Mode of) Presentation of Results; Company Culture / Industry; Stakeholder Involvement; Deliverable Shelf Life.
  23. Reporting Findings (1)
  24. Reporting Findings (2)
  25. Dharmesh Mistry, Usability Specialist, Acquia. Content Management System products built on Drupal. Open Source Software vs. Open Source/Proprietary. Community vs. Start-up.
  26. Deciding Factors. Stakeholders: thousands of stakeholders (new and existing). Development cycle: ? Turn-around time: weeks-months. Credibility: mixed reputation. Tracking issues: low-medium. Presenting: Twitter, conferences, blog post, Drupal.org. Provide recommendations: no, never!
  27. Drupal.org, Blog Post, Conferences/Videos
  28. Main Report, Detailed Information, Tracking, Supporting information. Links: http://drupal.org/node/1289476, http://drupal.org/node/1399056, http://drupal.org/node/1399258, http://www.drupalusability.org/
  29. Deciding Factors. Stakeholders: 3-5. Development cycle: Agile (3-week sprints). Turn-around time: hours/days/weeks. Credibility: good. Tracking issues: high-very high. Presenting: conference calls. Provide recommendations: sometimes.
  30. Email Reports. Google Doc Reports.
  31. Spreadsheet Reports
  32. Spreadsheet Reports
  33. Kyle Soucy, Founding Principal, Usable Interface. @kylesoucy, www.usableinterface.com
  34. Formal Usability Testing/Research Report. Findings categorized by pages or screens.
  35. Who, What, When, Where, Why statement. 3-4 positive findings. 3-4 negative findings.
  36. Findings: Severity Ratings
  37. Findings: Major Usability Problem
  38. Findings: Recommendations
  39. Highlight Video
  40. Observer Debrief Notes
  41. Carolyn Snyder, Founding Principal, Snyder Consulting. There is no one "best" format. Do what works for the client, culture, circumstances. Steal good ideas, drop losers.
  42. Formal Text Report: "I'm not dead yet!" Finding; severity rating; explanation of issue; supporting observations from notes; recommendations.
  43. PowerPoint, Screen Shots with Callouts. (Imagine a screen shot here.) Example callouts: Interest in these; Interest in these links; Ambiguous; Important sentence buried in paragraph; Most people read this text; everyone drilled into [noun]; People wanted concrete, prioritized advice; Amount isn't explicit, the user must do the math; People understood [action] and knew it was important; Can't explore the stacked bar graphs; Not clear why it showed 43; People understood the purpose of the 2 variations of graph.
  44. PowerPoint with "report" in Notes Field
  45. Sometimes the best report is… …no report. Can you do something more useful instead?
  46. Steve Krug, Usability Consultant, Advanced Common Sense
  47. Expert Reviews – What I do. No report, no slides. Live remote walkthrough. Gave up writing the Big Honking Report years ago: I hate writing; I'm inherently lazy; only real purpose seemed to be to justify cost; mostly, I could get away with it (I have a book). I tell clients up front: I'll report my observations in a GoToMeeting session. Encourage them to have all interested parties attend, question, argue. Option: written report, for double the price.
  48. Expert Reviews – What I do. 90-120 minute session. Strive for best audio (VOIP). I walk through the site/app, doing a narrative of observed issues (cf. Carol Barnum's session on storytelling). Limited to only the most serious problems (n < 10). My recommendations for fixing them. Encourage them to get objections out of their system while I'm there to answer (major weakness of a written report: no dialogue). Record the session for their use later.
  49. Expert Reviews – What I do. I don't accentuate the positive: feels artificial, patronizing to me; we're all grownups on this bus. I tend to be very encouraging anyway: "Getting it all right is very hard." "Everybody has these kinds of issues." "You can fix them."
  50. Usability Tests – What I recommend. I don't do them anymore; I teach other people to do them.
  51. Usability Tests – What I recommend. Forget the report: GET THEM TO COME TO THE TESTS! Most crucial success factor. Seeing is believing: watching makes converts. Many other good effects flow from watching as a group. Do whatever it takes to get them to come: keep it brief (3 participants); keep it convenient (on-site); regular schedule ("A morning a month"); THE BEST SNACKS MONEY CAN BUY!
  52. Usability Tests – What I recommend. "But I can't get them to come…" Please stop your incessant whining; try harder. OK, yes, you can create a report: two-page (max) bullet list email, 30 minutes to write. What we tested (site, prototype, etc.) with a link to it. Tasks they did. Top three problems observed. Solutions to these problems, which will be implemented before next month's tests. (Optional) Link to recordings.
  53. Extra Credit. Read Recommendations on Recommendations (Rolf Molich, Kasper Hornbæk, Steve Krug, Josephine Scott and Jeff Johnson): http://www.dialogdesign.dk/tekster/Recommendations_on_Recommendations.pdf. Get Jen McGinn to share her report from CUE 9: best in show, out of 19 seasoned UX pros; try to figure out her secret sauce and imitate it.
  54. Narrative on top of screenshots. "N participants ________." Participant quotes. Excellent, terse writing. Key observations only.
  55. Questions. 1. Do you change your delivery of usability results depending on your role as an internal/external consultant or as a company employee? 2. How important are positive vs. negative findings? 3. How have your reports changed over the years? Is there anything you do differently than when you first started writing them? 4. How do you categorize the findings in your reports? For example, do you categorize them by the page/screen, by the step in a certain process (e.g. checkout process), or by the task? 5. Lean UX is a trending topic. Have you had experience with Lean UX or Agile methods, and had to change the way you conduct research and deliver results? 6. What guidelines do you follow when writing recommendations or proposed solutions to problems? 7. Do you decide ahead of time how long a report should be and make an effort to keep it that length? If so, what dictates the length? 8. If you think a report is too long and needs to be trimmed down, how do you decide what to cut out? 9. What part of a report is the hardest for you to write?
