UXPA 2013 Ignite! Session: This Is Why My UX Research Method Rocks! (July 10, 2013)


The UXPA Annual Conference Ignite! sessions were 5-minute rapid talks featuring 5-7 speakers, each focusing their talk on a central session theme.


Slide notes
  • In-lab testing creates a structured environment for participants: provide them with specific tasks to complete, measure task performance, and collect other forms of data while in the lab.
  • Free-form testing customizes the session depending on the participant’s behaviors.
  • In focus groups, participants provide feedback collaboratively.
  • Unmoderated tests provide participants with a list of tasks to perform, and the session is recorded for the researcher to observe or collect data from later.
  • Surveys can try to uncover attitudes or behavior. They can be given as a stand-alone tool or as a supplement to other research methods.
  • There are three components of in-lab testing that make it a valuable way to assess usability, cognitive theory, or any hypothesis for that matter. With in-lab testing you have strict experimental control over your environment. From that experimental control you can then test your hypotheses in a controlled, confound-free way. Additionally, in-lab tests allow you to utilize technologies that are not necessarily portable. I’ll go through each of these three facets of in-lab testing in the hopes of convincing you how vital it is for any testing situation.
  • If your study is concerned with task completion time, first impressions, or satisfaction, extraneous distractions like construction noise, home appliances, and children can affect how a person perceives their interaction with a product. This is the real highlight of in-lab testing: you can create a confound-free environment. That helps down the road when you compare experimental groups and look for significant results in your data analysis.
  • Physiological assessment techniques are very exciting because they provide researchers with a way to objectively assess human cognition. Unlike subjective opinions or memory tests, physiological metrics are not subject to response bias or the malleability of memory.
  • Ethnography is about understanding people’s experience in their daily world, with all of their communication tools, systems, and collaborators that they typically have available to them. It’s a very versatile approach.
  • Ethnography is rooted in the anthropological studies of non-Western cultures. Simple observational research wasn’t enough for those researchers, and it shouldn’t be enough for us. To gain a better understanding, they needed to get directly involved in the lives of the people they were studying.
  • We need to jump into the crowd and get directly involved in the conversation. We can gain insights that can’t be discovered from out-of-context situations.
  • Because real life can be unpredictable. And it’s difficult to understand user behavior when we are comfortable in our offices. We need to take a chance, get out there, and see what problems are thrown users’ way.
  • People don’t use technology in a vacuum - they use it with other people. Better understanding our user groups can inform how to create more effective designs. This research method can get at the subtleties of group interaction that other methods just can’t.
  • We can gain better insights from this contextual information in the field. We can learn about time and social pressures that would be difficult to uncover if the study was conducted in the lab or another contrived environment.
  • We can take on different roles as the researcher depending on the information we need to acquire. Usually we want to take on more of a strict observational role, but if necessary we can take on more of a participatory role if the research requires it.
  • And you don’t need to wait for your participants to come to you. You can go to them. To get representative data, you may need to go to some pretty odd places, because your users may just not be interested in taking the time to participate in a focus group or usability test after work.
  • And if someone says that it’s too subjective, it’s just not true. We can triangulate our data using evidence from multiple perspectives to increase our confidence in our findings. Unlike interviews and focus groups, ethnography doesn’t need to be limited to a single session.
  • If we are designing an app for travel, we can study a group of tourists as they navigate DC. We can find out the type of information that is important to them and the areas of opportunity to make their travel experience better.
  • Because we are interacting with participants at a more personal level, we can use this information to build detailed personas that truly capture user behavior.
  • If we have the goal of building a hospital information system, this is the best method to understand system requirements and user needs. In this scenario, we would need to understand the context surrounding the system and all of the users of the system.
  • At its core, ethnography is an inductive reasoning tool. Using it, we can generate ideas from the bottom up and realize opportunities we wouldn’t have thought of without getting directly involved in the conversation with participants.
  • And to research a user group, we don’t necessarily need to be in the field. Social media and discussion boards allow us to use ethnographic research tools to gain better understanding of groups without even having to leave the office.
  • All of this makes ethnography a versatile tool – it’s the Swiss Army knife of research methods. We can gather a lot of rich, disparate information about users, context, and interactions. Other methods are like a hammer: great for one thing.
  • Like usability testing, it can tell us what works and what doesn’t. We can support or reject our hypotheses, we can learn about trouble areas and opportunities for future development. (We can compare the interface to benchmarks and evaluate its effectiveness). Usability tests and surveys are mostly deductive reasoning tools.
  • (Image: http://www.flickr.com/photos/dmuth/5910828458/) When we run a usability test, we enter with a predefined set of questions. As researchers, we have expectations. It’s like playing Where’s Waldo: you’ll always find what you are looking for.
  • (Image: https://commons.wikimedia.org/wiki/File:Istanbul_spice_bazaar_02.jpg) With ethnography, we don’t need hypotheses or structured rules to play by. It’s like shopping at a market, where you can discover something new every time.

    1. 2013
    2. Jon Strohl, Fors Marsh Group: This is why my UX research method rocks!
    3. What it is: Ignite is a geek event in over 100 cities worldwide. At the events, Ignite presenters share their personal and professional passions, using 20 slides that auto-advance every 15 seconds for a total of just five minutes.
    4. • The optimal UX research method depends on many factors: • The question being asked • The desire for attitudinal or behavioral data • The point of development of the design • The development process • The resources available • The desire for qualitative or quantitative data • There is no single UX research method that can answer all of your questions. • It is best to take a toolkit approach.
    5. • For the purposes of this session, though, we are throwing level-headedness to the wind! • Presenters have been asked to select a UX research method that they are passionate about and campaign for your vote for the best UX research method.
    6. • Presenters have been given free rein as to how to structure their argument. • Presenters can play nice.
    7. • Or play dirty.
    8. • Why use this format? • Allows the audience to hear the best points of the different methods • Provides insight into an argument you could make for using a particular method • Promotes a group discussion
    9. David Hawkins • User Experience Researcher at Fors Marsh Group • @dHawk87 • Presentation: Structured in-lab usability testing • Typically the first method thought of when usability testing is discussed
    10. Cory Lebson • Principal User Experience Consultant at Lebsontech • @corylebson • Presentation: Free-form in-person usability testing • A spin-off of structured testing
    11. Heather Gay • Director of Research at Media Barn Inc. • @heatherlgay • Presentation: Focus groups • Participants discuss a topic in a group session
    12. Jonathan Strohl • UX Researcher at Fors Marsh Group • @jonstrohl • Presentation: Ethnography • Observing and interviewing participants in their natural environment
    13. Jennifer Romano Bergstrom • User Experience Lead at Fors Marsh Group • @romanocog • Presentation: Moderated and unmoderated remote usability testing • Performed at a different location than the participant
    14. Bryan Wiggins • Senior Research Associate at Fors Marsh Group • @bwigginsfmg • Presentation: Surveys • Participants respond to written questions
    15. The Full Lineup: Structured In-Lab Testing, David Hawkins, Fors Marsh Group • Free-Form Testing, Cory Lebson, Lebsontech • Focus Groups, Heather Gay, Media Barn • Ethnography, Jon Strohl, Fors Marsh Group • Remote Testing, Jennifer Romano Bergstrom, Fors Marsh Group • Surveys, Bryan Wiggins, Fors Marsh Group
    16. This is Why Structured In-Lab Testing Rocks! David Hawkins, Fors Marsh Group @dHawk87
    17. Laboratories are Simulators
    18. Why test in a lab?
    19. High-risk Environments
    20. Consistency
    21. Experimental Control
    22. Valid Conclusions
    23. Direct Comparisons
    24. Distraction-Free Environments
    25. Structured Tasks ๏ All participants complete the same tasks in the same order.
    26. Naturalistic Task Flow ๏ Design tasks in a logical progression.
    27. Randomize and Counterbalance
    28. Observe your product in use
    29. Physiological Assessment
    30. Memory is fallible
    31. Eye Tracking
    32. Galvanic Skin Response
    33. Electroencephalograph (EEG)
    34. Conclusions: Structured in-lab testing gives you: ๏ experimental control ๏ objective measurements ๏ valid conclusions
    35. And THAT is Why Structured In-Lab Testing Rocks!
    36. This is Why My UX Research Method Rocks! David Hawkins - @dHawk87 • Cory Lebson - @corylebson • Heather Gay - @heatherlgay • Jon Strohl - @jonstrohl • Jennifer Romano Bergstrom - @romanocog • Bryan Wiggins - @bwigginsfmg
    37. This is Why Free-Form Testing Rocks! @corylebson
    38. We are hired to add value!
    39. We are not robots.
    40. Trapped by our script
    41. Let’s think outside the box
    42. Go with the flow.
    43. Connect with participants.
    44. Feel what they are feeling.
    45. And adapt.
    46. Sync with who they are.
    47. Will you corrupt your data?
    48. This isn’t quantitative!
    49. Focus on your end goal.
    50. You’ll gain more insights.
    51. Have more of an adventure
    52. Make stakeholders happy.
    53. Because you went beyond.
    54. But set expectations. Please explain!
    55. And let your report jump out!
    56. To recap: be flexible and you’ll be a superhero. And THAT is Why Free-Form Testing Rocks!
    57. [Presenter lineup slide, repeated from slide 36]
    58. This Is Why Focus Groups Rock! Heather Gay, Director of Usability Research, Mediabarn Inc. @heatherlgay
    59. Do It as a Group
    60. Have something to test?
    61. What is a Focus Group?
    62. User differences
    63. In-person groups
    64. Online groups
    65. You don’t know what you don’t know
    66. Why do a group? ๏ You don’t have anything for users to use ๏ You’re not really sure what to build ๏ You want to test your initial assumptions ๏ You need fuel to innovate ๏ You need to acquire data to create a UX plan ๏ You need to identify important research questions
    67. [Image slide]
    68. Exploratory Discussion: Barriers, Perceptions, Drivers, Attitudes, Feelings ๏ Brand, Product, Concept
    69. Talking about features
    70. What do you really want?
    71. How do you compare?
    72. What drives behavior?
    73. Find the Gems
    74. General to specific
    75. High-Level, Rich Feedback
    76. Best UX research method ๏ Organic conversation in a group setting ๏ Online or in person ๏ Provides rich information on: - The reasons why users make decisions - The types of features that are important to users - How users perceive you vs. your competition
    77. And THAT is why Focus Groups rock! Heather Gay www.mediabarnresearch.com Twitter: @heatherlgay
    78. [Presenter lineup slide, repeated from slide 36]
    79. This is Why Ethnography Rocks! Jon Strohl @jonstrohl
    80-97. [Image-only slides; the narration appears in the slide notes above] @jonstrohl
    98. And THAT is Why Ethnography Rocks! @jonstrohl
    99. [Presenter lineup slide, repeated from slide 36]
    100. This Is Why Remote Testing Rocks! Jennifer Romano Bergstrom, User Experience Research Leader, Fors Marsh Group @romanocog
    101. Remote Testing = Running sessions from afar
    102. Moderated Remote Testing vs. Unmoderated Remote Testing
    103. Working from home
    104. Observer (unseen), participant at home, moderator at the office
    105. Participants in their natural environment
    106. Busy participant in natural environment
    107. Hard-to-reach populations, hard-to-reach participants
    108. Hard-to-reach participants in their natural environment
    109. Unmoderated Sessions = Reach lots of people fast
    110. Unmoderated Sessions = Lots of data fast; interview a lot of people quickly
    111. Sample from the country…or the world: collect data from participants in your hometown
    112. Collect data from participants all over the country
    113. Collect data from participants all over the world
    114. No travel costs
    115. No traffic
    116. No annoying fellow travelers
    117. No getting sick
    118. No jet lag
    119. More time for family, friends, and fun
    120. And THAT is why Remote Testing rocks! Jennifer Romano Bergstrom www.forsmarshgroup.com @forsmarshgroup @romanocog
    121. [Presenter lineup slide, repeated from slide 36]
    122. Web Surveys as Part of the UX Process: This is Why Surveys Rock! Bryan Wiggins, Fors Marsh Group @bwigginsfmg
    123. Types of Web Surveys ๏ Site Intercept ๏ Surveys of Registered Users ๏ Customer Satisfaction Surveys
    124. Site Intercept Surveys ๏ Pros: Actual visitors ๏ Cons: Nonresponse bias
    125. Surveys of Registered Users ๏ Pros: Engaged users ๏ Cons: Must create an account
    126. Customer Satisfaction Surveys ๏ Pros: Actual customers; can evaluate product or service ๏ Cons: Why didn’t customers purchase? (Photo by: McFlickr)
    127. Advantages of Web Surveys ๏ 1. Actual users/customers (Photo by: dustball)
    128. Advantages of Web Surveys ๏ 2. Real-time feedback
    129. Advantages of Web Surveys ๏ 3. Large amount of feedback (Photo courtesy of: Boston Public Library)
    130. Disadvantages of Web Surveys ๏ 1. Nonresponse bias (Photos: Boston Public Library, Bahai.us)
    131. Disadvantages of Web Surveys ๏ 2. How do you fix the problem? (Photos: Boston Public Library, Bahai.us, osseous)
    132. Surveys as Part of the Evaluation Process: User Surveys
    133. Evaluation Method: Focus Groups ๏ Pros: Gather feedback at various stages of development ๏ Cons: Not actual users or specific issues
    134. Evaluation Method: In-Person Usability Testing ๏ Pros: Feedback on specific site issues ๏ Cons: Testing environment
    135. Evaluation Method: Remote Usability Testing ๏ Pros: Access to diverse/busy/hard-to-recruit groups ๏ Cons: Less control and flexibility
    136. Evaluation Method: Free-Form Testing ๏ Pros: Greater flexibility ๏ Cons: Not structured or quantitative
    137. Evaluation Method: Ethnography ๏ Pros: Naturalistic ๏ Cons: Time-consuming
    138. Quantitative vs. Qualitative Methods
    139. Developmental and Evaluative Methods
    140. Unified Approach to UX (Photos: Boston Public Library, Bahai.us, Sweetie187)
    141. Web Surveys Are One Piece of the Puzzle. Bryan Wiggins, bwiggins@forsmarshgroup.com (Photos: Boston Public Library, Bahai.us, yann.co.nz)
    142. [Presenter lineup slide, repeated from slide 36]