Are you the sole UX researcher in your organization? Do you struggle to get timely research insights and feedback for your stakeholders? Online research tools offer practitioners the ability to gather feedback quickly and asynchronously, without the need for facilitation or moderation.
In this presentation, we provide an overview of some of the many online research tools that are available for gathering quick feedback on requirements, designs, and stakeholder sentiment. We offer general guidelines for recruiting, planning, implementing, and analyzing feedback, and then present how to use specific methods that have proven particularly useful for design and requirements research.
Attendees will hear about several problem scenarios and vote on the methods they think would work best to address the problems. After a group discussion of pros and cons, the presenters will share case study information about the methods they chose, and what worked well, and not so well.
Presented by: Brian Utesch, Annette Tassone, Jon Temple and Stephen Woodburn. Businesses strive to monetize the relationship between user sentiment and success outcomes, including user adoption, user retention, and revenue. Customer satisfaction is embraced as a top predictor of success. There are, of course, many ways that satisfaction can be measured. We will review several methods of measuring user satisfaction, including simple Likert-scale measures of overall satisfaction, the System Usability Scale (SUS), UMUX-Lite, and the popular Net Promoter Score (NPS). Not all of these measures are created equal, or even measure the same sentiment. We’ll further compare the advantages and disadvantages of each measure, best practices around the use of each, and original research we’ve conducted that informs our recommended best practices.
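As background for one of the measures named above: the SUS produces a 0-100 score from ten 1-5 ratings using a fixed scoring rule. A minimal sketch in Python, assuming responses are ordered item 1 through item 10:

```python
def sus_score(responses):
    """Standard SUS scoring: ten items rated 1-5.
    Odd-numbered items (positively worded) contribute (rating - 1);
    even-numbered items (negatively worded) contribute (5 - rating).
    The summed contributions are multiplied by 2.5 to give 0-100."""
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 items")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))  # i == 0 is item 1 (odd)
    return total * 2.5
```

A respondent who strongly agrees (5) with every positive item and strongly disagrees (1) with every negative one scores 100; all-neutral (3) responses score 50.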
UXPA 2021: Novel Prioritization Surveys: Opportunity Maps to Tame the NPS and... | UXPA International
Despite its flaws, the Net Promoter Score (NPS) is often chosen by management for measuring customer satisfaction. Learn ways to mitigate damage from a poorly implemented NPS survey, enriching it with data that really matters to users and your stakeholders—while staying in the good graces of those bewitched by the traditional NPS:
1. What’s the NPS and how is it calculated?
2. What are its strengths and weaknesses?
3. How can you make the NPS more trustworthy and interpretable?
4. What other overall performance measures could replace or complement the NPS?
• Traditional “Voice of the Customer” (VOC) research
• “Outcome Driven Innovation” (ODI)
• “Outcome Mapping,” a new model that addresses the weaknesses of other approaches: identify and measure key outcomes and opportunities, then predict how changes will impact future performance.
This presentation will be equal parts survey design, data visualization, user needs research, and prioritization process.
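For readers unfamiliar with question 1 above, the standard NPS calculation classifies 0-10 likelihood-to-recommend ratings into promoters (9-10), passives (7-8) and detractors (0-6), then subtracts the detractor percentage from the promoter percentage. A minimal sketch:

```python
def nps(ratings):
    """Net Promoter Score from 0-10 ratings: % promoters minus % detractors.
    Promoters rate 9-10, detractors 0-6; passives (7-8) only
    affect the denominator."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)
```

For example, `nps([10, 9, 8, 6])` yields 25.0: two promoters and one detractor out of four responses. The coarse three-bucket classification is one source of the interpretability problems this session addresses.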
Presented by Ari Weissman. How do you start from scratch? How do you build and grow a UX team within your organization where none existed?
Many organizations “do UX” in name only. There are people who might have the UX Designer title, but aren’t talking to users, leaving the product or engineering teams to drive the experience. It’s not that these organizations don’t want to be user-driven. It’s just that they don’t know how. That is what I walked into when I started as Director of UX for [my company].
This is the story of my ongoing successes and failures at building a UX practice. It’s not about one decision, but the many strategies you can employ to build, grow, and thrive.
UXPA Boston 2015 | Discussion Guides Presentation | Motivate Design
Discussion guides are universal research artifacts, often informed by a diverse range of stakeholders (from researchers to clients). With that many cooks in the kitchen, things are bound to get messy. This presentation introduces a reflection tool that allows researchers to define their rationale for what stays and what goes in a discussion guide, and helps shape the appropriate research methodology to get you where you need to go.
Motivate Design has used this tool effectively to align stakeholders on the most meaningful discussion points for research: what was in scope and what needed to be considered for future research. This tool will empower you to guide research initiatives in the right direction.
UXPA Boston 2015
UXPA International 2013: The Note-Taker's Perspective | UserWorks
Kristen Davis and Dick Horst's 2013 UXPA International presentation on The Note-Taker's Perspective During Usability Testing: Recognizing What's Important, What’s Not.
UX Field Research Toolkit - Updated for Big Design 2018 | Kelly Moran
Looking for practice with in-depth UXR fieldwork methods? You may have read about these techniques in the past, but methods must be practiced to be understood. projekt202 has been employing the experience research craft with great success since 2003. This workshop is your opportunity to try these tools of the trade in a structured environment without pressing deadlines or looming stakeholders. Our experienced research and design professionals will share industry tips and tricks that will help you put theory into practice.
The workshop will be hands-on and interactive; instructional elements will be reinforced with stories of impact to real projects. We will not only cover methods of gathering user data, but the importance of spending time internalizing and analyzing the data through activities such as affinity diagramming, persona building, and journey mapping. Participants will gain exposure to these important practices in a low-pressure atmosphere and with the guidance of experienced professionals.
EffectiveUI's Ari Weissman (Lead Experience Architect) and Lys Maitland (Senior Experience Planner) spoke at Denver Startup Week 2016. Discussion description:
Test early, test often.
It’s a mantra that’s been proven successful time and again when it comes to innovation and design. So why aren’t you doing it? In the start-up world, when everything is moving so quickly, it can be easy to overlook or postpone collecting feedback from real people because of cost, time, or lack of preparation. Don’t let those things stop you. Valid data can be captured cheaply, quickly, and with half-finished products and strategies.
This talk will cover:
- What user testing is and why it’s important
- How to plan for user testing
- Ways to make testing cheaper
- Ways to make testing quicker
- How to test with different fidelities of concept and design
- How to collect data more frequently
- Opportunities for getting the whole team engaged
- What to do with the insights and outcomes of research
Beyond Usability Testing: Assessing the Usefulness of Your Design | hawleymichael
Usability tests are meant to find usability problems. If your question is, “Where are the usability problems in this design?”, usability testing is right for you. With usability testing, you can study how well someone can get from point A to point B, and where the problems are along the way. Finding usability problems is the focus, and the method works great.
But, we are finding that many of the questions business sponsors and stakeholders have are not about finding usability problems. The questions they have are more about the overall usefulness of a design, its potential for success, and how well it meets expectations.
This presentation will define usefulness research, show how it is different from usability tests, and offer different approaches for asking the right questions of users. Whether you think this is slap-your-forehead obvious or a method that needs to be expanded and refined, we seek to have a lively conversation.
UX Designer's Toolkit - to design a better world | Rachel Liu
Presented at the Creative Meetup: http://www.meetup.com/Creative-Class/events/162137382/ on 9th April 2014.
A UX Designer's Toolkit to design a better world, with case studies of good and bad websites/apps as well as interactive exercises to understand the Lean UX process.
UXPA DC Redux 2013 Notetaker Perspective 10-25-2013.ppt | UserWorks
Presentation slides from Kristen Davis and Dick Horst of UserWorks on the "Notetaker's Perspective During Usability Testing: Recognizing What's Important, What's Not," from the UXPA-DC Conference Redux 2013.
The ROI Of User Experience: Consider, Calculate & Measure Success | UserZoom
We’ve all heard that providing a better user experience can help your organization improve performance, increase exposure, gain more credibility, reduce the resource burden and ultimately increase sales, but as nice as these goals are, how are they truly being measured?
Join us in a webinar with Dr. Susan Weinschenk as she dives into the trending topic of User Experience ROI (Return On Investment): Should you spend all this time and money on user experience research and design? Is it worth it? How do you even go about figuring out the ROI of UX?
Key concepts Susan will discuss in the webinar:
-When and why to consider the ROI of UX
-How to measure the ROI of UX
-The biggest mistakes to avoid in calculating the ROI of UX
Moderated vs Unmoderated Research: It’s time to say ELMO (Enough, let’s move ... | UserZoom
Does this sound familiar? Researchers sitting around a meeting table arguing about which methods to use, especially unmoderated remote testing vs. moderated, usually without any empirical data?
In this webinar we'll give you the power of data to say "ELMO!" (Enough, let’s move on!) and end the argument once and for all.
We collected this data by conducting 10 moderated and 10 unmoderated remote sessions across six tasks on Patagonia.com, in order to show how moderated and unmoderated remote studies compare in terms of the number and severity of usability issues surfaced.
Register for this upcoming webinar and discover the theoretical and actual strengths and weaknesses of various user research methods to stop the argument before it even begins.
UX STRAT Online 2020: Victoria Sosik, Verizon | UX STRAT
Demand for UX insights is higher than ever; as UX Researchers, we’ve become “victims of our own success.” While a cause for celebration, with it come challenges in managing bandwidth, prioritizing work, and being viewed as a bottleneck in the design process. For this reason, we began exploring a program to democratize Design Research at Verizon. In this talk, I’ll walk through our approach, our decisions around which types of research to democratize, and how we’re striking the balance between democratization and control. I’ll also reflect on our early experiences with the program and where we plan to go in the future.
A presentation given at the IBM UX conference in Israel in March 2015.
In this presentation I discuss what UX research is and why it's a lot simpler than people think it is.
Informed & Agile: Test Driven Design w/ Jon Innes | UserZoom
Do you find yourself sprinting without a clear direction? Pushing feature after feature out, only to wonder if your app or website is really getting better? Join Jon Innes of UX Innovation in a webinar on-demand, where he will discuss how to improve your sprints by incorporating UX/usability metrics that the whole team can use to measure progress on your agile journey as a product team.
Designing Better Applications, Websites and Intranets | Dennis Breen
Creating great websites and applications is hard work. There are so many aspects to juggle; so much complexity to control. You have to understand the needs of your users, get buy-in from stakeholders, organize lots of content and create an intuitive interface. This is no small order.
Fortunately, nForm has created a simple resource to pass on a little of what we’ve learned about planning for great design. Our User Experience Cards feature tried-and-true methods for designing better interactive products of all kinds--from online stores to corporate intranets to mobile apps.
Learn about why these methods are needed, how they can help you achieve success, and how you can use the User Experience Cards to plan your own projects.
UserTesting 2016 webinar: Research to inform product design in Agile environm... | Steve Fadden
Designing in agile environments demands that many decisions be made in short periods of time. Informing these decisions with formative research enhances our understanding of what we’re building, from the viability of concepts, to the effectiveness of designs, to the ultimate success of our solutions.
Coaching teams in creative problem solving | Flowa Oy
Agile has helped teams to collaborate and organize work better. That’s great. Better teamwork and a better understanding of the work definitely help a team do the right things. Agile has also led the way toward technical practices such as Continuous Integration and Delivery, Test Driven Development and SOLID architecture principles. Great; these things definitely help the team do things right.
Then again, most of the time in software projects goes into problem solving and similar creative acts. Agile has relatively little to offer in these areas. Currently, agile is not about creativity, nor is it about problem solving.
This coaching circle session will focus on the creative core of software development: solving novel, original and broad problems more creatively and effectively all the time. I will introduce some principles and tools I’ve found useful when helping people solve hard problems and find creative solutions.
You don’t need a big budget, weeks of time or special labs to get user insights quickly and inexpensively. We’ll discuss how you can meet your goals, improve your products and make informed decisions through user research. Usability testing (remote & in-person), interviews, surveys and analytics are a few methods we’ll review, particularly in the context of your own business challenges and user questions.
Bridging Current Reality & Future Vision with Reality Maps | Malini Rao
Using a versatile design research technique, this presentation calls on designers to give themselves permission to be flexible in their design practice: master your techniques, and get as creative with the design process as with the experiences you design!
Are you looking to gather insights from your potential customers? When it comes to your prospects, do you really know what they want? Many startup teams tell us they are missing the key information they need to get into their users' minds. Without this information, products often fall short of delighting users.
There are those who believe that user research and usability testing must be a complex and scientific process that takes lots of time, money, and resources. However, in the real world, most startups don't have the luxury of spending weeks or months on user research. That's where guerrilla research techniques come into play.
Julie Grundy gives an overview of user experience design: why it's important, guiding principles, a UX research overview, and tactics used by UX professionals. November 2015.
Renee Anderson, Techniques for prioritizing, road-mapping, and staffing your ... | museums and the web
A presentation from Museums and the Web 2009.
Maybe you’re supposed to overhaul your institution’s Web site. Or maybe you’ve been directed to visualize and implement new on-line initiatives. Other than knowing your stakeholders’ wish lists and extensive ideas for Web site content and features – from blogs to on-line collections – you don’t have a clear plan of action. You don’t even have a defense strategy for why or why not to invest in some of their requests. How, then, can your team drive decision-making? How can you get features implemented based on rational reasons, while balancing institutional goals and audience needs – all without going over budget?
This mini-workshop will focus on an often-overlooked core Web site activity: the Feature Prioritization Workshop. You will be introduced to prioritization techniques and tools, how and when to use them, methods for navigating the myriad needs and wants of stakeholders, and some approaches for achieving compromise.
You will learn to balance “requirements” with “desires” by using concrete proof points and a convincing defense. And you will also learn about building a phased roadmap that will accommodate the immediate needs of your organization at launch, yet will provide a plan for future iterations and builds.
Mini-Workshop: Redesign: Prioritizing
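One widely used technique a prioritization workshop like this might employ (not necessarily the one covered in this session) is a weighted scoring matrix: rate each candidate feature against a few criteria, weight the criteria by importance, and rank by total. The feature names, criteria, and weights below are hypothetical, purely for illustration:

```python
def prioritize(features, weights):
    """Rank candidate features by weighted score across criteria (higher = better)."""
    def score(feature):
        return sum(weights[c] * feature["scores"][c] for c in weights)
    return sorted(features, key=score, reverse=True)

# Hypothetical criteria; "ease" is inverted build cost (5 = cheap to build).
weights = {"audience_value": 0.5, "institutional_goal": 0.3, "ease": 0.2}
features = [
    {"name": "online collections",
     "scores": {"audience_value": 5, "institutional_goal": 4, "ease": 2}},
    {"name": "blog",
     "scores": {"audience_value": 2, "institutional_goal": 3, "ease": 4}},
]
ranked = prioritize(features, weights)  # online collections first (4.1 vs 2.7)
```

Scoring the criteria together in the workshop, rather than individually, is what forces the stakeholder conversation about trade-offs that the abstract describes.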
Early Signal Testing: Designing Atlassian’s New Look | Atlassian
You probably have noticed the new look of Atlassian's Cloud products. Our new Design Guidelines took many months to create, and our team had many tough decisions to make. Luckily, we incorporated customer research along the way to guide us.
One of our most valuable research tools is called “early signal testing”, and we think it can help you too. Early signal testing can help you gain confidence in a direction, rather than being paralyzed by a choice. It can help assess your design's usability, clarity, comprehension, and more. This talk explains how your team can gather measurable user feedback in as little as a week, for even the very biggest of problems.
The Trick of Designing User Interfaces that Slay | Mindfire LLC
Now, the problem every time was the expectation to offer perfect answers, along with supporting reasons. Although my intuition helped me get through a few, the part I struggled with most was offering convincing reasons to support my answers. I felt my supposed intuition had some underlying logic and concepts, but I needed to reify those. I needed constructs that I could use to easily judge any design, be it a web page, a mobile application UI or even a poster.
Newbie UX: Something I learned about UX (Business vs Design) | Soon-Aik Chiew
Sharing some tips for those who are new to UX and wish to learn more. These findings are based on my past learning mistakes, experience and observations.
http://blog.netizentesting.com/newbie-ux-something-learned-user-experience/
I'm currently drafting material on Startup (Digital) Marketing: Growth Hacking Thru UX. Stay tuned.
To read more articles, visit: blog.NetizenTesting.com
A 2-hour training on Mobile UX with Farah Nuraini, Interaction Designer at Traveloka, Indonesia
45 min theory: Research, Analysis, Design solutions and Testing
+ 1h15 min of hands-on exercises with the 5 facilitators from Traveloka.
Supporting Distributed Critique through Interpretation and Sense-Making in an... | colin gray
Critique is an important component of creative work in design education and practice, through which individuals can solicit advice and obtain feedback on their work. Face-to-face critique in offline settings such as design studios has been well-documented and theorized. However, little is known about unstructured distributed critique in online creative communities where people share and critique each other’s work, and how these practices might resemble or differ from studio critique. In this paper, we use mixed-methods to examine distributed critique practices in a UX-focused online creative community on Reddit. We found that distributed critique resembles studio critique categorically, but differs qualitatively. While studio critique often focuses on depth, distributed critique often revolved around collective sensemaking, through which creative workers engaged in iteratively interpreting, defining, and refining the artifact and their process. We discuss the relationship between distributed critique and socio-technical systems and identify implications for future research.
Similar to How to Effectively Implement Different Online Research Techniques for Rapid Unmoderated Feedback (Niyati Bedekar and Steve Fadden):
UXPA 2023: Start Strong - Lessons learned from associate programs to platform...UXPA International
Imagine creating experiences for your rookie designers’ first couple years that are rewarding, enriching, and full of learning — without taking all your time or energy to manage. We’ll share techniques any team leader can put into practice using real-life examples from associate programs, apprenticeships, and internships.
Topics include onboarding, varied work challenges, developing multiple capabilities, buddy systems, group sharing, guest speakers, time with executives, and mentorship. We’ll also share how to operationalize learning, soft skills like communication and collaboration, setting boundaries, time management, achieving deep work, and more skills we all wish we were explicitly taught early on.
We’ll focus on modern-day associate programs, but even if you can’t create a full-fledged program, you’ll leave this session with ideas to use with your fledgling professionals. The benefits go beyond efficiency; it’s a foundation for culture, camaraderie, autonomy, and mastery.
UXPA 2023: Disrupting Inaccessibility: Applying A11Y-Focused Discovery & Idea...UXPA International
Digital advances are being made at a rapid-fire pace, yet disability inclusivity continues to fall short of the digital revolution. As the number of people living with disabilities rises, the time to take digital accessibility to the next level is now. Let’s disrupt inaccessibility together! Come hear about a multi-part discovery research and ideation project informing foundational UX designs for our customers. You’ll get insights from our unique study, which are widely applicable across industries, and walk away with tips and inspiration to kick off your own accessibility-focused discovery and ideation. Only YOU can prevent inaccessibility – are you in?
User experience can be drastically elevated by combining data science insights with user-based insights from research. On its own, data analytics can surface themes and correlations that are difficult to explain and to turn into accurate recommendations. For example, themes identified via large global surveys and usage data can be better understood with UX insights from focused user research, such as user interviews and/or cognitive walkthroughs. This presentation will highlight the complementary nature of data science and UX and will focus on the benefits of bringing the two disciplines together. This will be buttressed with practical examples of enterprise projects and applications that combined data and skills from the two disciplines, guidance on how the two disciplines can better work together, and the skills UX professionals need to develop when working with data science teams.
UXPA 2023: UX Fracking: Using Mixed Methods to Extract Hidden InsightsUXPA International
Users do not always accurately describe what they mean or feel. There are many reasons for this, ranging from politeness to poor introspection, to lack of sufficient technical vocabulary. Fortunately, UX researchers have tools of the trade to deduce what was really meant. We call this UX Fracking, a mixed methods approach that is optimized for extracting hidden user insights. We will illustrate the dangers of inadequate, superficial research, and how it may lead to outcomes incapable of addressing the users' core issues. We will explore ways to avoid these pitfalls by leveraging mixed research methods to test hypotheses about the users' intent and needs. This starts with a thorough understanding of who the user is, their goals, and how they work today, moves to an approach that combines surveys, interviews, and comment analysis with behavioral observation, and ends with validating the newly discovered user insights with the users themselves.
UXPA 2023: Learn how to get over personas by swiping right on user rolesUXPA International
This session walks through the concept of user roles as an alternative to personas as a means to generate and disseminate user insights for product development teams. We will describe the tools and methods used to create a research database organized by user roles, along with examples and short exercises to help attendees think through user roles within their own context.
By the end of the session, attendees should be aware of tools and approaches for:
Organizing user research information in a database
Disseminating user role information to product and design teams
Managing a user roles database as part of a long term UX Research program
If you’re ready to ditch personas but don’t know how, this session is for you!
We will present a case study that details our approach for replacing user personas with user roles for a multinational SaaS company. We will take the audience on a journey that starts with an executive request for personas, travels through the tribulations of realizing personas suck, and concludes with convincing others to accept a new and innovative way to understand the people who use the product. Our key message is that personas lack real value for organizations that already understand the importance of empathizing with users. Building user-centered products requires easily accessible and well-organized user insights. We will discuss defining users through a process of stakeholder consultation and content review, and structuring data around Jobs to Be Done and product interactions. We will also discuss the dissemination of user roles in our organization using relational databases, interactive dashboards, and online wikis. Spoiler alert: our stakeholders loved user roles!
UXPA 2023: Experience Maps - A designer's framework for working in Agile team...UXPA International
Agile methodology refers to software design and development methodologies centered on iterative design and development, where requirements and concepts evolve through collaboration between self-organizing, cross-functional teams. Agile thus enables teams to deliver value faster, with greater quality and predictability, and greater aptitude to respond to change. With product features evolving every design sprint, designers and researchers find it difficult to follow the design process. This sometimes leads to designs delivered in haste or sub-par design artifacts, which result in UX debt. UX debt accumulates when design teams take actions or shortcuts to expedite delivery of a piece of functionality or a project that later needs to be refactored; it is the result of prioritizing speedy delivery of design to the development team over a perfect experience journey. Experience maps are a great tool for practicing UX in Agile and for managing UX debt.
UXPA 2023: UX Enterprise Story: How to apply a UX process to a company withou...UXPA International
How do you build a UX department from scratch in an environment where people think UX means making social media posters and posts? An agile implementation has just started, and people are moving from a waterfall, ad-hoc mindset to agility. In this session, I will talk about my journey to establish a UX department for a company that is part of a global brand, but whose local branch has only just begun its digital transformation. Challenges include: spreading awareness and educating people about UX, hiring the right team, defining the right team structure, establishing workflow and day-to-day operations, and applying localization (non-Western culture).
UXPA 2023: High-Fives over Zoom: Creating a Remote-First Creative TeamUXPA International
I started my current job in March of 2020. Many of us remember something clearly about the month that COVID started to shut things down. I remember being surprised to hear that my new on-site-only job would be starting in my living room over Zoom. How do you lead a design team when none of the team members live near each other and creativity is highly collaborative? Drawing on over a decade of working in HR software, I knew whatever I did needed to put people first. What employees love about a job is often deeper than the work; it's the culture, the relationships, and the people they work with. It's the feeling that their work has value and their contribution matters. In this talk I will walk through some of the rituals and best practices I have learned over the last two years building a remote-first creative team.
UXPA 2023: Behind the Bias: Dissecting human shortcuts for better research & ...UXPA International
As humans, we are biased by design. Our intricate and fascinating brains have developed shortcuts through centuries of human evolution. They reduce an unimaginable load of paralyzing decisions, keep us alive, and help us navigate this complex world. Now, these life-saving biases affect how we behave with modern technology. Understanding some of the theories and reasons why these biases exist is the key to unlocking their power. In this workshop we will cover some theories about how the brain works. We will review some of our mental shortcuts, take a look at some common biases, and learn how they affect our users, our research, and our designs. Lastly, we will review some advantages of biases, and ways to identify and reduce bias. This workshop is targeted at designers who do their own research, and researchers looking to learn more about removing bias from their studies.
UXPA 2023 Poster: Improving the Internal and External User Experience of a Federal Government Legacy Application Using User Experience and Agile PrinciplesUXPA International
Are you new to UX management, or thinking of getting into management? Then this talk is for you. After reading countless books, attending countless trainings, mentoring and being mentored, nothing quite prepared me for management like my first year. I'll share with you what I wish they'd told me. I'll also share my process for generating team research roadmaps, establishing team values, keeping employees motivated, and not burning out.
UXPA 2023: Redesigning An Automotive Feature from Gasoline to Electric Vehicl...UXPA International
Join us for an interaction design case study from the automotive industry. We created a Human-Machine Interface (HMI) for a vehicle feature that provides household levels of power in electrical outlets for our customers to use at work and play. This case study will reveal:
· Our debate over re-using version 1.0's HMI vs. designing a new user interface for the electric vehicle: when to break with consistency, and why?
· User research we conducted to guide our early design concept.
· Paper prototypes we created to support our usability testing of the concept with vehicle owners.
· How we resolved internal debate over the interaction design in moving from internal combustion vehicles to electric vehicles.
· Advice to help you evangelize user-centered design that is also brand-centered for a new product.
White wonder, Work developed by Eva TschoppMansi Shah
White Wonder by Eva Tschopp
A tale about our culture of fertilizer and pesticide use, developed while visiting small farms around Ahmedabad in Matar and Shilaj.
Book Formatting: Quality Control Checks for DesignersConfidence Ago
This presentation was made to help designers who work in publishing houses or format books for printing ensure quality.
Quality control is vital in every industry, which is why every department in a company needs to create a method for ensuring quality. This will not only improve the quality of products and reduce errors to the barest minimum, but take the work to a near-perfect finish.
It goes without saying that a good book will, to some extent, be judged by its cover, but the content of the book remains king. No matter how beautiful the cover, if the quality of the writing or presentation is off, readers will have a reason not to return to the book or recommend it.
So, this presentation points designers to some important things an editor may miss, which they can discover and bring to the editor's attention.
You could be a professional graphic designer and still make mistakes; there is always the possibility of human error. On the other hand, if you're not a designer, the chances of making some common graphic design mistakes are even higher, because you don't know what you don't know. That's where this blog comes in. To make your job easier and help you create better designs, we have put together a list of common graphic design mistakes you need to avoid.
Hello everyone! I am thrilled to present my latest portfolio on LinkedIn, marking the culmination of my architectural journey thus far. Over the span of five years, I've been fortunate to acquire a wealth of knowledge under the guidance of esteemed professors and industry mentors. From rigorous academic pursuits to practical engagements, each experience has contributed to my growth and refinement as an architecture student. This portfolio not only showcases my projects but also underscores my attention to detail and to innovative architecture as a profession.
How to Effectively Implement Different Online Research Techniques for Rapid Unmoderated Feedback - Niyati Bedekar and Steve Fadden
1. How to effectively implement
different online research techniques
for rapid unmoderated feedback
Niyati Bedekar
@nbedekar
Steve Fadden
@sfadden
Presented at UXPA 2015, San Diego. Slides: https://goo.gl/X8dolV
2. Agenda
Online techniques
Method toolkit
Common requests and solutions
Case studies and templates
Effective practices
Image source: http://pixabay.com/en/modesto-california-scenic-trail-205544/
5. Who are you?
Years experience in user research: <1, 1-2, 2-5, 5+
Image source: Karen Arnold (http://www.publicdomainpictures.net/view-image.php?image=45018)
6. Who are you?
Total number of employees: 1-20, 21-100, 101-500, 500+
Image source: Karen Arnold (http://www.publicdomainpictures.net/view-image.php?image=45018)
7. Who are you?
Most recent research request? Most common research request? Jot down.
Image source: http://pixabay.com/en/photos/note%20paper/
12. Toolkit is growing (Rohrer’s framework)
[Figure: Rohrer’s 2x2 landscape of research methods. One axis runs from Behavioral (“what people do”) to Attitudinal (“what people say”); the other from Qualitative (“why & how to fix”) to Quantitative (“how many & how much”).]
Rohrer, C. October 12, 2014. When to use which user experience research methods. Retrieved from http://www.nngroup.com/articles/which-ux-research-methods/
Image source: http://www.freestockphotos.biz/stockphoto/1772
13. Go-to methods
Method (participant effort): Types of answers provided
- Click: Behavioral. Where to start or go next?
- Preference: Attitudinal. Compare between options.
- Recall: Hybrid. What do you remember? What are your first impressions?
- Sentiment: Attitudinal. How does this make you feel?
- Embedded questions: Hybrid. What happens next, and why? How would you rate this?
- Terminology/naming: Attitudinal. What does something mean?
- Commenting: Hybrid. What comes to mind while reviewing a concept/flow? Open feedback.
Image source: http://www.geograph.org.uk/photo/1911269
14. Additional methods to consider
Method (participant effort): Types of answers provided
- Card sorting: Hybrid. What items belong together and what should they be called?
- Discussion groups / Focus groups: Attitudinal. What comes to mind while reviewing other feedback?
- Unmoderated usability testing: Hybrid. What do you expect? What do you do? Why?
Image source: http://www.geograph.org.uk/photo/1911269
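As a rough illustration (not from the deck or any vendor's API), the go-to and additional methods above could be kept as a small lookup table so a researcher can filter methods by the kind of answer needed; the `METHODS` dict and `methods_for` helper names are our own.

```python
# Method toolkit as a lookup: name -> (answer category, question it answers).
# Names and categories come from the slides; the structure is illustrative.
METHODS = {
    "click":              ("behavioral", "Where to start or go next?"),
    "preference":         ("attitudinal", "Compare between options"),
    "recall":             ("hybrid", "What do you remember? First impressions?"),
    "sentiment":          ("attitudinal", "How does this make you feel?"),
    "embedded questions": ("hybrid", "What happens next, and why? Ratings"),
    "terminology/naming": ("attitudinal", "What does something mean?"),
    "commenting":         ("hybrid", "What comes to mind? Open feedback"),
    "card sorting":       ("hybrid", "What belongs together, and what is it called?"),
    "unmoderated usability testing": ("hybrid", "What do you expect/do, and why?"),
}

def methods_for(category):
    """Return method names matching 'behavioral', 'attitudinal', or 'hybrid'."""
    return [name for name, (cat, _) in METHODS.items() if cat == category]
```

For example, `methods_for("attitudinal")` picks out preference, sentiment, and terminology/naming, which mirrors how the deck groups methods by the kind of feedback they elicit.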
16. “Finals week starts on June 1. Where would you first click to put a reminder on your calendar?”
Click methods (Behavior: Where do users click)
UsabilityTools
17. “Describe what you would expect to see after you clicked the area in the previous screen?”
Embedded question (Hybrid: What happens next)
Qualtrics
18. “Please click the variation you prefer. [after] Why did you choose it?”
Preference (Attitude: Which do you prefer)
Verify
19. “You will see a screen for 5 seconds. After reviewing the screen, you’ll be asked questions about it. [after] What do you remember?”
Recall (Hybrid: What do you remember)
Verify
20. “Review this screen and think about how it makes you feel.”
Sentiment (Attitude: How does this make you feel)
Verify
21. “Do you find this design to be attractive?”
Embedded question (Attitude: How do you rate this)
SurveyMonkey
22. “Label each marker with what you would call the icon.”
Terminology/naming (Attitude: What does this mean)
Verify
23. “This design shows what happens when you click the ‘+’ icon. Comment on areas you find confusing, problematic, helpful, usable.”
Commenting (Hybrid: What comes to mind)
Verify
27. Form groups of 3-5
Review common requests
Discuss how you typically research
Consider online solutions
Discuss pros/cons
Discussion: Research requests
Image source: http://en.wikipedia.org/wiki/Fischer's_lovebird
28. Reference (for Activity)
Method (participant effort): Types of answers provided
- Click: Behavioral. Where to start or go next?
- Preference: Attitudinal. Compare between options.
- Recall: Hybrid. What do you remember? What are your first impressions?
- Sentiment: Attitudinal. How does this make you feel?
- Embedded questions: Hybrid. What happens next, and why? How would you rate this?
- Terminology/naming: Attitudinal. What does something mean?
- Commenting: Hybrid. What comes to mind while reviewing a concept/flow? Open feedback.
Image source: http://www.geograph.org.uk/photo/1911269
29. Group discussion: Share thoughts
● Problem
● Typical solution
● Online research solution
● Pros/cons
Discussion: Research requests
Image source: http://en.wikipedia.org/wiki/Fischer's_lovebird
31. Case Study 1: Evaluate new data export concept
Background
- New functionality for an existing product
- Integrated with 3rd-party software
- To be implemented ASAP
Goals
- “Boil the ocean” to learn if the concept was understood, desired, and usable
Methods
- Embedded question: Critical incident
- Embedded question: Comprehension rating
- Commenting: On each storyboard panel, after presenting the full story
- Embedded question: Open feedback, questions, and expectations
32. “Consider the last time you had to export data. Describe why you needed to export data, and list the steps you remember from that process. (If you haven’t exported data before, or don’t remember the last time, just skip to the next question.)”
Embedded question (Critical Incident)
“I’m pretty old school, so I export my credit card transaction data about every quarter. My credit card site has a button to export to CSV, so I just click that and it downloads to my computer.”
“We have our marketing, sales, and inventory data in different systems. I have to export data from each system in order to combine it into a spreadsheet for my stakeholders. The export process is easy. Combining the data is more involved.”
33. “Consider the concept presented on the next 4 slides. After reading about the concept, you will be asked about what you found to be confusing, problematic, useful, and appealing about the concept.”
New concept scenario
[Four storyboard panels, numbered 1-4, with a 100% progress indicator]
35. Commenting (Identify strengths and weaknesses)
“You will now be shown each concept slide again. On each slide, indicate anything you found to be particularly confusing, problematic, useful, and appealing.”
[Four storyboard panels with participant comments overlaid:]
“Doing this would require a lot of clicks, even for a small number of columns.”
“You should embed best practices for naming here. Otherwise, the result could be messy.”
“Will we be able to save the mappings? That could save time in the future.”
36. “Any final comments, questions, or feedback you’d like to share?”
Embedded question (Open feedback)
“It’s great that you don’t have to jump around different parts of the system to do this. Very valuable to be able to complete this from one place.”
“Seems very clear to me. I think anyone who has used [XYZ] would be able to understand it too.”
“Hi, I wanted to follow up to reiterate that this is a REALLY COOL idea and it fills a much needed requirement for our use of the product. Please consider me for future studies like this, because we need this functionality!”
37. Template 1: Exploring a new concept
1. NDA, confidentiality, demographics
2. Embedded question: Critical incident to activate
3. [Present concept] Video, illustration, storyboard, description
4. Embedded question: Comprehension rating, after presenting concept
5. Commenting: Concept slides (storyboards work well)
6. Embedded question: Open feedback
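A template flow like this can be written down as ordered data and rendered into a study plan. The sketch below is hypothetical; the `TEMPLATE_1` structure and `outline` helper are our own illustration, not any survey tool's API.

```python
# Template 1 from the deck, expressed as an ordered list of study steps.
TEMPLATE_1 = [
    {"step": "NDA, confidentiality, demographics"},
    {"step": "Embedded question", "detail": "Critical incident to activate"},
    {"step": "Present concept", "detail": "Video, illustration, storyboard, description"},
    {"step": "Embedded question", "detail": "Comprehension rating, after presenting concept"},
    {"step": "Commenting", "detail": "Concept slides (storyboards work well)"},
    {"step": "Embedded question", "detail": "Open feedback"},
]

def outline(flow):
    """Render the flow as a numbered outline for a study plan document."""
    return "\n".join(
        f"{i}. {s['step']}" + (f": {s['detail']}" if "detail" in s else "")
        for i, s in enumerate(flow, 1)
    )

print(outline(TEMPLATE_1))
```

Keeping the flow as data makes it easy to reuse the same skeleton across studies, swapping only the concept materials and question wording.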
38. Case Study 2: Identify problems and preferences for calendar range selection tools
Background
- Tool developed without support
- Early-stage prototype, only worked within company firewall
- Team wanted feedback before further refinement
Goals
- Recruit internal participants only
- Identify heuristic violations
- Gauge preference compared to existing tools
Methods
- Click: How would you start the task?
- Commenting: (after using prototype) See screenshots of the tool in different states
- Preference: Compare tool to existing tool
- Embedded question: Explain preference and next steps
39. Template 2: Eliciting usability/heuristic feedback
1. NDA, confidentiality, demographics
2. Recall: What is remembered? [or] Sentiment: How does this make you feel?
3. Click: How would you start this task?
4. Embedded question: What would you expect to see after clicking?
5. Commenting: Open feedback, after engaging
6. Embedded question: Usability rating
40. Case Study 3: Redesign chart type & update visual treatment
Background
- Existing component used frequently by customers and loved by many!
- Not scalable
- Prone to misinterpretation
- Team wanted to test new designs
Goals
- Understand if users comprehend the new design
- Gauge preference among 3 different approaches (including existing)
- Mix of internal users and customers
Methods
- Embedded question: Understandability of information
- Preference: Among the various options
- Commenting: Open feedback, expectations
41. Template 3: Redesigned visual treatment
1. NDA, confidentiality
2. Embedded question: to gather understanding of information on chart (randomize)
3. Preference: Which design do you prefer? (randomize)
4. Embedded question: Why the selected design?
5. Commenting: Open feedback
6. Demographics
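Where the template says “(randomize),” the point is to counter order effects by varying the order in which designs appear for each participant. A minimal sketch, with a hypothetical `presentation_order` helper (design labels are placeholders, not the study's actual options):

```python
import random

def presentation_order(designs, participant_id):
    """Return a per-participant randomized presentation order.

    Seeding the generator with the participant ID keeps each
    participant's order reproducible when analyzing exported results.
    """
    rng = random.Random(participant_id)
    order = list(designs)  # copy so the caller's list is untouched
    rng.shuffle(order)
    return order

# Usage: three design variants shown in a different order per participant.
variants = ["existing design", "concept A", "concept B"]
print(presentation_order(variants, participant_id=7))
```

Many survey tools randomize blocks for you; this sketch is only for cases where the ordering has to be managed outside the tool.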
42. Case Study 4: Understand how people find content
Background
- Team assigned to build new system
- Wanted to create a system where content was easy to locate
Goals
- Identify how users locate content
- Discover differences based on content type
- Understand pain points to see if they can be reduced or eliminated
Methods
- Click: (for each method) Where do you click first to locate this kind of content?
- Sentiment: What feeling is associated?
- Commenting: Open feedback, expectations
- Embedded question: (after each method) What do you find most/least usable?
43. Template 4: Understanding behavior and expectations
1. NDA, confidentiality, demographics
2. Embedded question: Critical incident to activate
3. Click: What do you do first?
4. Sentiment: How do you feel when you do this?
5. Commenting: What works well and not well?
6. Embedded question: Open feedback
62. Examples of types of tests available (incomplete list)
Types of test (columns): Click/Success, Preference, Recall, Sentiment, Question, Terminology/Label, Commenting, Card sorting, Discussion, Unmoderated usability + video on website, Metrics & Results
Tools, with the number of test types each supports (the original slide marks the specific types with checkmarks):
- Verify: 8
- UserZoom: 6
- Usabilla: 4
- Survey tools (Getfeedback, Qualtrics, SurveyMonkey): 4
- Loop11: 3
- UserTesting.com: 2
- Optimal Workshop: 2
- Yahoo Groups, Facebook, LinkedIn: 1
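A matrix like this can also be queried programmatically: given the test types a study needs, which tools cover all of them? The sketch below uses placeholder tool names and capability sets (`tool_a`, `tool_b`, `tool_c`), not the slide's actual checkmarks.

```python
# Hypothetical tool-capability matrix: tool name -> set of supported test types.
# These entries are placeholders for illustration, not real vendor data.
TOOLS = {
    "tool_a": {"click", "preference", "recall", "sentiment", "question"},
    "tool_b": {"question", "commenting"},
    "tool_c": {"card sorting"},
}

def tools_supporting(needed):
    """Return tools whose capability set covers every needed test type."""
    needed = set(needed)
    return sorted(name for name, caps in TOOLS.items() if needed <= caps)

# Usage: find tools that can run both a click test and a recall test.
print(tools_supporting({"click", "recall"}))
```

Set containment (`needed <= caps`) keeps the query honest: a tool qualifies only if it covers every requested test type, which mirrors how you would read the checkmark matrix row by row.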
63. Additional Links
Christian Rohrer’s NN/g article on the appropriateness of a method for answering specific questions: http://www.nngroup.com/articles/which-ux-research-methods/
A review of usability and UX testing tools: http://www.smashingmagazine.com/2011/10/20/comprehensive-review-usability-user-experience-testing-tools/
How to select an unmoderated user testing tool to fit your needs: http://www.nngroup.com/articles/unmoderated-user-testing-tools/
Lists of tools for unmoderated testing:
1. http://remoteresear.ch/tools/
2. http://www.infragistics.com/community/blogs/ux/archive/2012/11/07/6-tools-for-remote-unmoderated-usability-testing.aspx
Kyle Soucy’s article in UX Matters (Unmoderated, Remote Usability Testing: Good or Evil?): http://www.uxmatters.com/mt/archives/2010/01/unmoderated-remote-usability-testing-good-or-evil.php