Improve the quality of your customer research through use of effective research objectives, planning, and synthesis. Delivered as a CX training workshop in 2020.
2. Who is this for?
● This pack is aimed at people who are experienced with customer research
● It is for “People Who Do Research” which includes CX/UX Researchers but
also any other role that conducts research (e.g. PMs, Designers, BAs, etc.)
● This content may be less useful to beginners because it (1) has a lot of
assumed knowledge, and (2) moves at a relatively fast pace
● This was originally prepared for 2x workshops conducted over 2 days
5. Write a research brief
1. Read my article on research objectives
2. Before looking at the next slides write an outline for some upcoming research (1-page if
using a document or 3-5 slides if using a presentation)
○ If you don’t have any upcoming research then you could (a) Write a plan for some past research
that you feel could have been done better, or (b) Write a plan for some hypothetical research you
want to do.
○ TIP: You’ll get more out of this training if you focus on generative rather than evaluative research
(e.g. discovery rather than testing)
3. Spend some time articulating the assumptions that you are making (these do not necessarily need to appear in your brief)
6. Why is a research brief important?
● You should plan your research in collaboration with your stakeholders - and a brief is a
great way to get them involved without focusing on the minutiae
● Writing things down helps to clarify your thoughts
● It’s easier to notice higher-level issues with a higher-level plan
7. Research Brief Structure (Part 1)
Now rewrite your research brief using this structure
1. Project Objectives
○ This will normally come from stakeholders (though you may need to get started to give them
something to work with)
○ You may want to include some top-level project objectives. But the focus should be on what you
want to get out of this research for the project (e.g. understand what to design for a particular
problem, understand customer problems, validate a solution, understand if a design is usable, etc.)
2. Research Objectives
○ What are you trying to learn (making sure that what you learn will help you achieve your project
objectives)
○ What difference will it make if you achieve these objectives?
8. Research Brief Structure (Part 2)
3. Research Design
○ How will you structure your research to achieve your objectives?
○ What methodology will you use? Why?
○ Why this design over an alternative approach?
4. Participant Recruitment
○ How will you find participants?
○ How will you try to get an unbiased sample?
5. Admin Details
○ Timelines
○ Budget
○ Other admin (e.g. approval process, physical location / video chat tool, other tools you need, etc)
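If you keep briefs in code or want a reusable template, the five-part structure above can be captured as a small data model. This is only an illustrative sketch - the class and field names are my own invention, not a prescribed schema:

```python
from dataclasses import dataclass, field

@dataclass
class ResearchBrief:
    """One-page research brief following the five-part structure above.

    All field names are illustrative; adapt them to your own template.
    """
    project_objectives: str         # what the project needs from this research
    research_objectives: list      # what you are trying to learn, and why it matters
    research_design: str            # methodology, and why it beats the alternatives
    recruitment: str                # how you will find an unbiased sample
    admin: dict = field(default_factory=dict)  # timelines, budget, tools, approvals

    def outline(self) -> str:
        """Render the brief as a shareable plain-text outline."""
        lines = [
            "1. Project Objectives: " + self.project_objectives,
            "2. Research Objectives:",
            *("   - " + o for o in self.research_objectives),
            "3. Research Design: " + self.research_design,
            "4. Participant Recruitment: " + self.recruitment,
            "5. Admin: " + ", ".join(f"{k}: {v}" for k, v in self.admin.items()),
        ]
        return "\n".join(lines)
```

Rendering the outline gives stakeholders the one-page view without the minutiae of a detailed research plan.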
9. Review Your Brief
● Does this structure help you?
○ What would you change for your specific circumstances?
● How do you think your stakeholders would respond?
○ If they approve this, do you think they would still need to review your detailed research plan? E.g.
would you still get them to review your research questions / observation criteria?
● Do you think that writing this brief would lead to better research?
11. Mistake #1: Asking customers if they would use [idea]
● Two main ways this is done:
○ User Interviews - this is particularly bad because the sample size for interviews is going to be
small. You have no way of knowing if this will generalise.
○ Surveys - You may have a large sample size but are still asking the wrong question!
● Discuss: Why is this a mistake?
12. Personal Anecdote
● The way I previously found research participants:
1. Write a detailed participant brief
2. Send to an agency
3. They select participants from their panel and organise sessions
● One of those agencies showed me a prototype for a new product where:
1. You write a survey that they automatically send to their panel
2. You see all the responses and select participants that meet your criteria
3. You organise sessions with participants in the app
● My feedback: “Why would I come to you if I’m going to do all the work?”
● Fortunately they did not listen to my feedback and made the app anyway. I now use them
for all my participant recruitment.
Unprompted recommendation: if you’re in Australia try Askable
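The screener workflow described above - send a survey to a panel, then select participants who meet your criteria - boils down to filtering responses against acceptable answers. A minimal sketch (question keys and answer values are invented for illustration):

```python
def matches_criteria(response: dict, criteria: dict) -> bool:
    """True if a survey response satisfies every screening criterion.

    Each criterion maps a question key to the set of acceptable answers.
    """
    return all(response.get(q) in accepted for q, accepted in criteria.items())

# Hypothetical screener responses from a participant panel
responses = [
    {"id": 1, "shops_online": "weekly", "role": "parent"},
    {"id": 2, "shops_online": "never", "role": "student"},
    {"id": 3, "shops_online": "monthly", "role": "parent"},
]
criteria = {"shops_online": {"weekly", "monthly"}, "role": {"parent"}}

# Shortlist the panelists who pass the screener (here: ids 1 and 3)
shortlist = [r for r in responses if matches_criteria(r, criteria)]
```

In practice a recruitment tool does this for you, but seeing the logic makes it easier to write a participant brief with unambiguous, checkable criteria.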
13. Mistake #2: Asking participants which design they prefer
● Two main ways this is done:
○ Qualitative user testing
○ Post-testing survey (quant user testing)
● Why is this a mistake? A user’s expressed preference does not necessarily mean the preferred design is the better experience
○ Are you trying to understand (1) which experience is better, or (2) which design users prefer?
○ Discuss: why aren’t these the same thing?
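The gap between expressed preference and observed performance shows up directly in test data. A toy illustration with invented numbers - participants vote for design A, yet complete the task more reliably on design B:

```python
# Hypothetical results from a comparative usability test of designs A and B.
# Each participant voted for a preferred design and attempted a task on both.
results = [
    # (preferred, success_on_A, success_on_B)
    ("A", True,  True),
    ("A", False, True),
    ("A", True,  True),
    ("B", True,  True),
    ("B", False, True),
]

votes_a = sum(r[0] == "A" for r in results)              # 3 of 5 prefer A
success_a = sum(r[1] for r in results) / len(results)    # 60% succeed on A
success_b = sum(r[2] for r in results) / len(results)    # 100% succeed on B

# A wins the preference vote while B wins on task success, so
# "which do users prefer?" and "which experience is better?" diverge.
```

Preference votes still carry signal (e.g. about brand fit), but they answer a different question than behavioural measures do.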
14. Mistake #3: Asking users which features they want
● This is not necessarily a mistake but it depends on a lot of factors
● Listening to user feedback is good - doing discovery by asking users what they want is bad
● The biggest issues with asking users for features:
○ Users don’t know how to design for other users (that’s a designer’s job) - the feature they are
requesting might only work for their specific use case
○ Adding features adds complexity and makes it harder to organise information architecture - so
choosing feature priorities is an important task
○ Certain kinds of users are more likely to be vocal and these users may not reflect your actual user
base (e.g. experts, power users, and generally outspoken users)
○ What you want to know is the actual underlying problem that the user needs solved (which they
also might not be able to articulate without strong research design)
15. Mistake #4: Asking leading questions
● Conducting research to learn specific things is great - but a bad way to do this is to ask
leading questions about the thing you want to learn
● For example: you want to know how people use a shopping list when doing their grocery
shopping
○ Bad: You end up asking “how do you use a shopping list?” (Discuss: why is this bad?)
○ Better: You ask users to describe how they go about grocery shopping - if they fail to mention a list
you ask “how do you decide what to buy?”
● Leading questions bias responses and prevent you from uncovering interesting insights
(e.g. other ways that people might solve the underlying problem)
16. Mistake #5: Thinking you are validating a hypothesis
● This is especially bad when it comes to qualitative research - but the problem can also
come up with poorly designed quant
● Small sample sizes make it difficult to accept/reject a hypothesis without making a lot of
assumptions
○ Sometimes this is justified. For example: you have a hypothesis that users will know how to use the
filters in an app. Suppose (1) the filters follow common design patterns found on similar apps, (2) you
follow best practices for interface design, and (3) you have a diverse (though small) group of
participants. In this case you have a strong “prior” belief about your filters. Then during a user test
you observe participants use filters unprompted and find what they need. It’s valid to conclude
that your hypothesis is accepted, but the real test will be usage/behavioural stats post-launch.
○ Discuss: why is this not always the case?
● The term “hypothesis” is loaded and has an academic feeling to it - it should be avoided
unless you have statistical justification for your conclusions
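To make the small-sample point concrete, here is a short Python sketch. The 5-participant filter scenario and the candidate “true” success rates are hypothetical numbers of my own, not from the workshop; the point is how likely a perfect observation is even when the underlying rate is modest:

```python
# Hypothetical sanity check: if all 5 participants in a user test use the
# filters successfully, how surprising is that under different true rates?
def prob_all_succeed(n: int, true_rate: float) -> float:
    """Probability that every one of n independent participants succeeds."""
    return true_rate ** n

for rate in (0.5, 0.7, 0.9):
    print(f"true rate {rate:.0%}: P(5/5 succeed) = {prob_all_succeed(5, rate):.3f}")
# Even at a true rate of 70%, a perfect 5/5 run happens roughly 17% of the
# time - observing it does not "prove" the filters work for everyone.
```

This is why the strong prior (points 1-3 above) is doing most of the work in that example, not the five observations themselves.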
17. Mistake #6: Only researching the easy things
● If you set out to “research everything” what you end up doing is researching the easy
things that are low-risk and already have strong priors
● How to know if you are making this mistake:
○ You do a lot of user testing but almost never find (high priority) issues to fix
○ You find yourself repeating research that someone else has done before
○ You make artefacts like personas and journey maps but never refer back to them
○ You find it hard to justify your design decisions based on your research
○ You never cancel a project because of research
● This means that:
○ You complete easy / simple projects based on a lot of research
○ But harder/complex things are completed without justification
18. Mistake #7: Not being aware of your assumptions
● All customer research involves assumptions
● Example assumptions:
○ The participant understands your question (particularly for unmoderated research)
○ The participant is willing to give an honest answer to your question
○ The participant is capable of articulating their internal reasoning
○ That your participant comprehends the experience that they just had
○ That past behaviour is indicative of future behaviour
○ That your sample is unbiased
○ That your questions are unbiased
○ That it’s possible to measure an experience
○ That a better experience is necessarily better for the user (think about a good gambling experience
or a social media app that engages users to scroll endlessly)
19. Mistake #8: Not sense checking your research
● Have you reviewed your questions with a colleague or friend? Do they understand your
questions? Does the research flow well?
● If you are running a survey - have you asked the questions to some colleagues or friends
and had them explain the questions back to you?
● These sense checking exercises help reduce some of the previously mentioned
assumptions
20. Mistake #9: Not running a pilot study
● After spending a long time planning their work, many researchers jump straight into their
study with back-to-back sessions
● Running a pilot study is a great way to identify potential issues and fix them before
engaging all of your participants
● For qualitative research this would be speaking to 1-2 participants
● For a survey this would be sending the survey out to 1-2% of the sample
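The rules of thumb above can be sketched in a few lines of Python. The helper name and the minimum floor of 2 participants are my own assumptions for illustration:

```python
# Hypothetical helper: size a pilot as a small fraction of the full sample,
# never dropping below a minimum of 2 (the qualitative rule of thumb above).
def pilot_size(full_sample: int, fraction: float = 0.02, minimum: int = 2) -> int:
    """Return a pilot size of roughly `fraction` of the full sample."""
    return max(minimum, round(full_sample * fraction))

print(pilot_size(1000))  # a 1000-person survey -> pilot with 20 people
print(pilot_size(50))    # a small study -> falls back to the 2-person floor
```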
21. Discussion: Avoiding these mistakes
● What do you think of these mistakes?
○ Have you made them before?
○ Do you disagree with any of them?
○ Are there any other mistakes that you’ve seen?
● Activity: Review your research brief
○ Some of these mistakes may already be apparent in your brief
○ Have you already made any of these common mistakes?
○ Your next step will be to turn this brief into a research plan. What steps do you think you could
take in your research plan to avoid making these mistakes?
23. Leading Words
● Your objective will have a leading verb which indicates what you think you will learn from
the research. Some example leading words:
○ Identify - e.g. identify problems, identify opportunities, identify which users, etc. Be aware of how
generalised your objective is. You are unlikely to identify all things but are more likely to identify
the kinds of things you are interested in
○ Understand - e.g. understand how a user might solve a problem, understand customers’ potential
mental models, etc. Again - are you trying to understand potential things or are you trying to
understand a general trend among all customers?
○ Determine - e.g. determine which approach is effective, determine whether or not users will be
interested. What kind of research do you need to determine something?
○ Rank - e.g. rank the importance of different pain points, rank potential features. How will you get
this ranking? How do you know this generalises to all customers?
24. Hedging
● From the previous slide you will notice that there is a lot of hedging when phrasing your
objectives
● You want to avoid objectives that suggest certainty. More certainty is possible but
requires more rigour in your research design
● Your research will help to de-risk your project by updating your prior beliefs about your
product or service. Note: de-risking does not guarantee future success
● Your objectives should accurately articulate what you are actually going to learn from
your research
25. Consider these examples and discuss what the research is actually telling you. In each circumstance compare
what you actually learn from the research against what you might want to learn in those circumstances.
1. Interview 6 participants about how they are planning an upcoming trip
2. Survey 300 potential customers asking them to rate 20 “pain points” on a 1-5 scale of importance (“not at
all important” to “extremely important”)
3. Spending 1 day observing people interact with a self-serve kiosk at a fast food restaurant
4. Creating a landing page for a new (potential) product where you ask users to register their interest - then
using online ads to send traffic to the page
5. Interviewing 10 participants for feedback on designs for a new digital product
6. User Testing with 5 participants with a prototype for a new checkout experience
7. Running an A/B test between two positions for filters for an ecommerce store (left-hand side or top)
8. Adding a button for a planned feature which takes users to a “coming soon” page and measuring the click
through rate
9. Use Experience Sampling on 30 participants to ask how they are planning/managing tasks at
random intervals throughout the day
Activity: What your research is actually telling you
26. ● Look at your product brief again and refine your research objectives
● Now focus specifically on your research design - does your research design actually help
you meet your research objectives?
● Discuss: what are the biggest challenges when converting research objectives into
research design?
Activity: Review Your Research Design
27. ● It can be hard to plan high quality research that actually tells you what you need to know
○ Your research should help you make informed decisions
● Poor quality research might lead to incorrect conclusions
● Discuss: Would it be better to do no research rather than bad research?
● Discuss: Suppose that instead of doing research you just built something (MVP) based on
assumptions and collected real data about usage.
○ What are the risks with this kind of approach?
○ When might you choose to do it anyway?
Do you even need to do research?
29. ● Has this changed your perspective on how to plan research?
● What do you think comes next after a research plan?
● What kinds of research methods could you use to meet your objectives? (rather than just
doing customer interviews all the time)
Possible questions to discuss / consider
32. ● How will you collect / document participant responses?
● How would you turn those responses into insights?
● Read my article on User Interview Analysis Methods
● Read my article on Task Analysis and consider whether it is right for your project
● Does thinking about the synthesis for your research change how you would ask
questions? (or make observations)
How will you synthesise your responses?
33. ● Either use your research brief from the previous section or create a new one
● Write a research plan which includes the questions you are going to ask participants (for
observational research: how will you track your observations?)
● When you have finished your research plan review your questions against your research
objectives.
○ If the research went according to plan would you actually be able to meet your objectives?
○ Did you miss anything?
● Adjust your plan based on your review
Write a research plan
35. ● What we are actually learning from our research turns out to be different to what we
really want to learn
● Whether or not our research generalises hinges on assumptions
● These assumptions may be valid but we often have no way of knowing
● The only real way to know is to put something into production and see the effects
○ Side note: sometimes this too can be hard to determine
● All our research does is de-risk the effort we put in
○ De-risking does not guarantee success
● The level of rigour we put into our research is related to the amount of risk we are willing
to take
○ Larger / more complex projects should require more rigour
○ Projects which cannot be launched with iterative MVPs require more rigour
Relevant Takeaways from Part 1
36. It is possible to bridge the gap between what we actually learn and what we want to learn
● Effective participant recruitment (a deep topic not covered in this workshop)
● Relying on this assumption: “Many different people will respond the same way to the
same situation”
● Writing questions that come closer to telling us what we want to know by:
○ Eliminating potential bias
○ Avoiding leading questions
○ Framing questions in the right way
○ Planning your research to make synthesis easier
Bridging the Gap
38. ● Plan your research questions (or observation methodology) in such a way that it simplifies
your synthesis
● Think about your outputs: Task Analysis, list of usability issues, research report, etc.
○ Will your research plan actually allow you to create these outputs?
○ Might you need to do more than one round of research? Have you planned for this?
● How will your synthesis approach help you meet your research objectives?
Well planned research leads to better synthesis
39. Activity: Review the common mistakes
● Review the common mistakes from the previous section (Part 1)
● Have any of these mistakes made their way into your research plan?
● How would you change your research plan to avoid these mistakes?
● Discuss: Why are these mistakes easy to make even when you know about them in
advance?
40. Activity: Hypothetical questions
Given these hypothetical scenarios how would you choose a methodology and what questions /
observations would help you achieve your goals?
(TIP: Avoid the “Common Research Mistakes” from before)
1. People have a lot of unanticipated problems when getting a new pet, you want to make an
experience that reduces the risk of these problems. You have existing research about what the
problems are. You need to determine which problems you should prioritise.
2. Your customers fill out a paper application form. You’ve designed / built a prototype for a simpler
online form they can fill out instead. You want to know if the simplified form might introduce
problems that don’t exist with the current paper form.
3. You show your customers a list of hotels at their destination. But the list is long and it’s hard to
choose which hotel to stay at. You want to know which filters you should add.
4. You work for a bank and want to give customers insights into their financial situation and
spending habits. You need to decide what kind of insights customers will find useful.
42. ● You have all of your research data (raw text answers/observations or individual quant
data), now what?
● Particularly with qualitative research it’s not immediately obvious how this raw data will
help you meet your research objectives
● This is particularly an issue if your research plan is done quickly to meet business
timelines and budgets
● You need a lot of practice planning, running, and synthesising research to avoid this
problem - but it never goes away
● Sometimes you need to run multiple rounds of research where round 1 will help you work
out how to synthesise future research
An unexpected reality
43. ● The first thing you want to do is group raw notes into patterns and themes - i.e. group
notes where participants talk about the same thing
● Traceability is important so you can use quotes from research to justify findings - you
always want to know who said what (and ideally when)
● There is no reason that a single note/quote can’t go into more than one theme. Themes
can overlap and intersect
● You are likely to have sub-themes. So start with a “first pass” that groups notes into high
level areas and then go over each area to see if you can refine it further
● Once you have your notes organised in as much detail as possible you can start to make
findings. But remember: what have you actually learned from this data?
● Review your findings against your research objectives. Do you need to do more research
to learn more?
Qual Synthesis: Start with patterns / themes
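As a minimal sketch of the grouping described above, here is one way to keep traceability while letting a note sit in more than one theme. The participants, quotes, and theme tags are invented examples, and real synthesis is a judgement call rather than a lookup:

```python
from collections import defaultdict

# Invented raw notes: (participant, quote, themes the researcher tagged).
notes = [
    ("P1", "I always forget items without a list", ["memory", "lists"]),
    ("P2", "I shop from the same weekly list", ["lists", "habits"]),
    ("P3", "I just wander the aisles and improvise", ["habits"]),
]

themes = defaultdict(list)
for participant, quote, tags in notes:
    for tag in tags:  # a single note can go into more than one theme
        themes[tag].append((participant, quote))  # who said what is preserved

for theme, quotes in sorted(themes.items()):
    print(f"{theme}: {len(quotes)} note(s)")
```

Because each theme keeps the (participant, quote) pairs, any finding can be traced back to exactly who said what.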
44. ● It’s important to have well formed questions (either questions asked in a survey or
questions you ask of your data)
● These questions should directly address a research objective (or work to address nested
sub-objectives)
● Quant synthesis is a complex topic and you should not start quant research unless you
already have some knowledge of data analysis (or have someone that can do it for you)
● The key challenge is finding the right aggregates of data, including: segmentation,
measures of central tendency (mean, median), measures of spread (standard deviation,
quantiles), and covariates.
Quant Synthesis: Find the right aggregates
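A stdlib-only sketch of the aggregates above, using invented 1-5 importance ratings for one pain point split into two hypothetical segments. The segment names and numbers are mine; the point is that segmentation can reveal disagreement that an overall mean would hide:

```python
import statistics

# Invented 1-5 importance ratings for a single pain point, by segment.
ratings = {
    "new users":   [5, 4, 4, 5, 3, 4],
    "power users": [2, 3, 2, 1, 3, 2],
}

for segment, scores in ratings.items():
    print(
        f"{segment}: mean={statistics.mean(scores):.2f}, "
        f"median={statistics.median(scores)}, "
        f"stdev={statistics.stdev(scores):.2f}"
    )
# Pooling both segments into one overall mean would land near the middle of
# the scale and hide the fact that the two groups disagree sharply.
```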
45. ● Discuss: Suppose you want to go through your research notes and just find the ones that
address a research objective. This would be a fast method but what are the drawbacks?
● Discuss: When might it be OK to do this (in spite of the drawbacks)?
Fast synthesis
47. ● Suppose you want to complete a task analysis: either (1) understanding tasks is a direct
research objective, or (2) you believe that understanding tasks would help meet a
research objective
● Then your analysis is focused on creating the task analysis rather than identifying findings
that address your research objectives
● The first step is the same: find patterns and themes
● From there you need to start organising these patterns into elements of a task analysis:
goals, outcomes, tasks, or task details (e.g. in a Task-Outcome Canvas)
● You will likely notice gaps in your analysis that you will need to fill with assumptions which
you can validate with further research
Task Analysis
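One way to organise patterns into canvas elements is a small data structure that also flags the assumption-filled gaps mentioned above. The field names and the trip-planning example are my own illustration, not a prescribed Task-Outcome Canvas schema:

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    details: list[str] = field(default_factory=list)
    assumed: bool = False  # flag gaps filled by assumption, to validate later

@dataclass
class Goal:
    goal: str
    outcome: str
    tasks: list[Task] = field(default_factory=list)

# Invented example: themes from trip-planning interviews organised as tasks.
trip = Goal(
    goal="Plan an overseas trip",
    outcome="A booked itinerary the traveller is confident in",
    tasks=[
        Task("Compare destinations", ["budget", "season"]),
        Task("Book accommodation", assumed=True),  # no notes covered this step
    ],
)

gaps = [t.name for t in trip.tasks if t.assumed]
print("To validate in further research:", gaps)
```

Tagging assumed tasks explicitly turns “gaps in your analysis” into a concrete list of questions for the next research round.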
48. ● Suppose you want to create a prioritised list of features for your product
● This is a complex topic as too many variables need to be factored in
○ Which features are likely to be used
○ Which users will use those features (and competing interest from different types of users)
○ Which features solve the customer problem the best (think about this: what happens if the feature
that best solves the customer problem is one that people aren’t likely to use?)
○ Which features will deliver revenue (many features may have indirect effects on revenue by
improving customer satisfaction)
○ And many more…
● The best bet is to come up with a way of scoring features based on how well they allow
customers to solve some underlying need
○ This scoring method will be subjective and laden with assumptions
● You need to synthesise in a way that lets you (objectively-ish) come up with scores
Prioritised Feature List
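A weighted-scoring sketch of the idea above. The weights, feature names, and scores are all invented and, as the slide says, subjective and laden with assumptions; the value of writing them down is that the assumptions become visible and debatable:

```python
# Invented, subjective weights for the scoring dimensions discussed above.
weights = {"solves_need": 0.5, "likely_usage": 0.3, "revenue_impact": 0.2}

# Invented 1-5 scores per feature on each dimension.
features = {
    "saved lists":  {"solves_need": 4, "likely_usage": 5, "revenue_impact": 3},
    "price alerts": {"solves_need": 5, "likely_usage": 2, "revenue_impact": 4},
}

def score(feature_scores: dict) -> float:
    """Weighted sum of a feature's dimension scores."""
    return sum(weights[k] * v for k, v in feature_scores.items())

ranked = sorted(features, key=lambda f: score(features[f]), reverse=True)
for name in ranked:
    print(f"{name}: {score(features[name]):.2f}")
```

Note how the ranking flips if you change the weights: that sensitivity is exactly the “objectively-ish” part that needs to be grounded in your synthesis.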
50. ● It’s important to evaluate your research at the end of the project
● The key consideration: have you made a design decision because of your research?
● Framed another way: have you done something that you otherwise would not have done
if you never did the research?
● In this way you can determine whether or not the research was worthwhile
Have you made design decisions?
We previously looked at this list to determine what we are actually learning from this research. For each item
write a hypothetical research objective, a suitable question, and a plan for synthesis (advanced):
1. Interview 6 participants about how they are planning an upcoming trip
2. Survey 300 potential customers asking them to rate 20 “pain points” on a 1-5 scale of importance (“not at
all important” to “extremely important”)
3. Spending 1 day observing people interact with a self-serve kiosk at a fast food restaurant
4. Creating a landing page for a new (potential) product where you ask users to register their interest - then
using online ads to send traffic to the page
5. Interviewing 10 participants to get feedback on designs (for a new digital product)
6. User Testing with 5 participants with a prototype of a new checkout experience
7. Running an A/B test between two positions for filters for an ecommerce store (left-hand side or top)
8. Adding a button for a planned feature which takes users to a “coming soon” page and measuring the click
through rate
9. Use Experience Sampling on 30 participants to ask how they are planning/managing tasks at
random intervals throughout the day
Activity: Review
53. ● What makes writing good research questions so difficult?
● How effective are other kinds of research artefacts? (e.g. personas or journey maps).
Consider this baseline: have you ever used these artefacts to make a design decision?
● Which synthesis approach is best?
Possible questions to discuss / consider
54. Thank You.
Get in Touch
● More articles / content
○ https://www.rickdzekman.com
○ http://slideshare.net/rickdzekman
● Consulting
○ https://evolvingexperience.com.au/
● Social media
○ https://twitter.com/rickdzekman
○ https://www.linkedin.com/in/rickdzekman/