Beyond Usability Testing: Assessing the Usefulness of Your Design – hawleymichael
Usability tests are meant to find usability problems. If your question is, “Where are the usability problems in this design?”, usability testing is right for you. With usability testing, you can study how well someone can get from point A to point B and where the problems are along the way. Finding usability problems is the focus, and the method works great.
But, we are finding that many of the questions business sponsors and stakeholders have are not about finding usability problems. The questions they have are more about the overall usefulness of a design, its potential for success, and how well it meets expectations.
This presentation will define usefulness research, show how it is different from usability tests, and offer different approaches for asking the right questions of users. Whether you think this is slap-your-forehead obvious or a method that needs to be expanded and refined, we seek to have a lively conversation.
Building Products Your Customers Love with Empathy and Human Insights – Aggregage
Product teams are continuously under tight deadlines to quickly validate new ideas, features, and offerings to innovate successfully, ensure product-market fit, and avoid rework. Without the customer’s perspective, these teams often end up wasting time and resources building features that customers don’t use. This webinar will highlight the critical areas during the design and development process when reaching out to customers, as understanding their needs, testing hypotheses, and refining your approach are imperative.
Ericsson Review: Crafting UX - designing the user experience beyond the inter... – Ericsson
There is more to a good user experience than attractive products and services that solve problems and function according to a given set of requirements. Creating products and services that provide compelling experiences for users requires planning, resources, and processes for monitoring progress and measuring quality – crafting UX.
Modern users are savvy and demanding, and their expectations are high. They want products and services that provide some level of value. They want their products to be aesthetically pleasing and emotionally satisfying, as well as easy to learn, use, install, maintain, and upgrade.
Ericsson is shifting from being driven by technology to being driven by needs and experiences. This shift has manifested itself in the development of a design approach that gets close to the user. Crafting UX is a user experience (UX) framework with roles, responsibilities and guidelines to better understand, define and meet users’ needs.
Designing similar – yet not identical – assets that provide comparable functionality, in different ways for different products, is neither financially justifiable nor good in terms of usability. By reusing common assets and code for similar functionalities, design teams can focus on the important task of creating relevant content and functionality; in other words, content that is useful and usable.
By establishing a shared vision across all groups involved in the development of products and services, teamwork becomes more effective, and coordinated efforts lead to a better design and a better user experience.
Overview of what should be taken into account when building an estimate for user experience work efforts, along with an overview of design estimation methods.
Tips for involving users in your website design - commercial property markete... – estatesgazette.com, RBI
Jessica Hall, Research and UX Manager at Reed Business Insight, will be returning as our guest speaker to discuss top tips for user research including:
- surveys
- interviews
- persona development
- usability testing
User Experience Design + Agile: The Good, The Bad, and the Ugly – Joshua Randall
There's a rumor going around that user experience design (UXD) and Agile don't play well together. In this talk, I'll explain that they do -- most of the time! Learn about the historical reasons for why these two disciplines sometimes butt heads, as well as the good/bad/ugly of various approaches to integrating design and development.
Presented by: Brian Utesch, Annette Tassone, Jon Temple and Stephen Woodburn. Businesses strive to monetize the relationship between user sentiment and success outcomes including user adoption, user retention, and revenue. Customer satisfaction is embraced as a top predictor of success. There are, of course, many ways that satisfaction can be measured. We will review several methods of measuring user satisfaction, including simple Likert scale measures of overall satisfaction, the System Usability Scale (SUS), UMUX-Lite and the popular Net Promoter Score (NPS). Not all of these measures are created equally or even measure the same sentiment. We’ll further compare the advantages and disadvantages of each measure, best practices around the use of each, and original research we’ve conducted that informs our recommended best practices.
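The SUS mentioned above has a well-known scoring rule: each odd-numbered item contributes its 1-5 rating minus 1, each even-numbered item contributes 5 minus its rating, and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch of that computation (not code from the talk):

```python
def sus_score(responses):
    """Convert ten 1-5 SUS item ratings (in order) into a 0-100 score."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:           # items 1, 3, 5, ... (positively worded)
            total += r - 1
        else:                    # items 2, 4, 6, ... (negatively worded)
            total += 5 - r
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible -> 100.0
print(sus_score([3] * 10))                        # all-neutral -> 50.0
```

Note that a SUS score is not a percentage; scores are usually interpreted against published benchmarks (around 68 is average).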
Tackle the Problem with Design Thinking - GDSC UAD – gallangsadewa
Design thinking is most useful for tackling problems that are ill-defined or unknown. In user experience (UX) design, it’s crucial to develop and refine skills to understand and address rapid changes in users’ environments and behaviors. In this session, we will discuss design thinking in digital product development and UI/UX.
UXPA DC Redux 2013 Notetaker Perspective 10-25-2013.ppt – UserWorks
Presentation slides from Kristen Davis and Dick Horst of UserWorks on "The Notetaker's Perspective During Usability Testing: Recognizing What's Important, What's Not" from UXPA-DC Conference Redux 2013.
Imagine if designers conversed with you in a way that felt like object-oriented programming. Imagine if they handed off a design where, page after page, the objects you needed to code were edged in neon, so clearly defined they popped off the wireframe or comp. Imagine those objects were consistently presented; no one-off cases or guesswork required. Imagine you could take a design and almost create an ERD or rough out an API with it.
Well, good news. There’s no need to imagine it. It exists, and it’s called Object-Oriented UX (OOUX).
OOUX is a design methodology that helps us define usable, consistent products that naturally align with end users’ mental models. Similar to OOP, it asks us to define the objects in the real-world problem domain and design the information and relationships in each object before designing how the user might manipulate them. It's a powerful tool for any digital team, it's relatively easy to do, and it pays dividends fast.
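The OOP analogy above can be made concrete: each OOUX object becomes a type whose fields are its core content, and relationships between objects become references between types. A minimal sketch, using an invented recipe-site domain that is not from the talk:

```python
from dataclasses import dataclass, field

# Hypothetical OOUX object map for an invented recipe-site domain.
# Objects and their relationships are defined before any interactions,
# mirroring the OOUX practice described above.
@dataclass
class Author:
    name: str
    bio: str

@dataclass
class Recipe:
    title: str
    ingredients: list[str]
    author: Author                                    # relationship: Recipe -> Author
    related: list["Recipe"] = field(default_factory=list)

a = Author(name="Ada", bio="Home cook")
r = Recipe(title="Toast", ingredients=["bread"], author=a)
print(r.author.name)  # traversing a relationship, much like an ERD or API would
```

From a map like this, roughing out an API or an entity-relationship diagram is nearly mechanical, which is the payoff the description above points to.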
Whether you are a developer, a designer, a content modeler, or someone who has influence over digital teams, OOUX offers a new and exciting option to add to your toolkit that will allow you to deliver better digital projects, quicker and more efficiently, and at a higher level of quality than ever before.
Presentation originally given at THAT Conference 2019
Preference and Desirability Testing: Measuring Emotional Response to Guide De... – Paul Doncaster
(From UPA 2011-Atlanta) Usability practitioners have a variety of methods and techniques to inform interaction design and identify usability problems. However, these tools are not as effective at evaluating the visceral and emotional response generated by visual design and aesthetics. This presentation will discuss why studying visual design is important, review considerations for preference and desirability testing and present two alternative approaches to user studies of visual designs in the form of case studies.
Proposal Template To Increase Traffic To A Website PowerPoint Presentation Sl... – SlideTeam
If your company needs to submit Proposal Template To Increase Traffic To A Website PowerPoint Presentation Slides, look no further. Our researchers have analyzed thousands of proposals on this topic for effectiveness and conversion. Just download our template, add your company data, and submit it to your client for a positive response. https://bit.ly/30H9zcm
UXPA International 2013: The Note-Taker's Perspective – UserWorks
Kristen Davis's and Dick Horst's 2013 UXPA International presentation on The Note-Taker's Perspective During Usability Testing: Recognizing What's Important, What’s Not.
The process of reviewing design work can seem like an arcane endeavor that only senior designers and creative directors truly understand. Even then, it's frequently an opinion-laden process that can be easily steered off course by the loudest voices or non-design stakeholders. Design critique can and should be a more accessible process for everyone, from junior designers to C-level stakeholders.
In this webinar, Zac Halbert covers a systematic approach that maintains focus on the right elements at the right time, and educates non-design stakeholders so they can offer more meaningful feedback rather than obstruct the design process.
Zac Halbert runs the Product Design & UX track at Tradecraft, an immersive program that trains people to work in high growth startups. He also owns an independent product design consultancy called Scout Hawk Product Design Studio, where he helps entrepreneurs turn hazy ideas into concrete digital products, and Foliotwist, a portfolio and marketing SaaS company for visual artists. Zac's expertise lies in user experience design, product design, management, and rapid prototyping and idea validation that draw heavily from the Lean Startup philosophy.
Tradecraft is an Educational Partner with TryMyUI.
Visit TryMyUI's Educational Partnerships at http://trymyui.com/edu
How to effectively implement different online research methods - UXPA 2015 - ... – Steve Fadden
Are you the sole User Experience Researcher in your organization? Do you struggle to get timely research insights and feedback for your stakeholders? Online research tools offer practitioners the ability to gather feedback quickly and asynchronously, without the need for direct facilitation or moderation.
In this presentation, we provide an overview of some of the many online research tools that are available for gathering quick, asynchronous feedback on requirements, designs, and stakeholder sentiment. We offer general guidelines for recruiting, planning, implementing, and analyzing feedback, and then present how to use specific methods that have proven particularly useful for design and requirements research.
Presented at UX Scotland in Edinburgh on 6/8/2016. Many of us are thrust into an Agile Development world. How do we do our best UX in a process designed by developers? Where do we belong and how do we work within a Scrum team?
Beyond Just Usability: Desirability and Usefulness Testing – Susan Mercer
Much of our work in UX research focuses on usability – evaluating products and interfaces to ensure they are easy-to-use. However, in today’s digital world, they are no longer enough. Consumers also have come to expect entertaining and engaging experiences. Web and mobile applications need to be usable, useful and engaging.
So, how do we evaluate web interfaces to determine how useful and engaging they are? Desirability has been evaluated in recent years by the use of the Product Reaction Card technique, originated by folks at Microsoft. However, there are many other techniques used in market and industrial design research that we can borrow to complement this technique. Likewise, we can use standard usability testing techniques with lines of questioning with a slightly different focus to evaluate the relative usefulness of different solutions for a particular user group.
In this talk, I discuss several techniques that I have used in recent months to evaluate the usefulness and desirability of interfaces. The best techniques I have discovered to evaluate usefulness involve open-ended interview questions regarding current processes and pain points, followed by a usability evaluation of the interface and then a reflective interview discussing the benefits and drawbacks of that solution to their personal situation. To evaluate desirability, I will discuss the product reaction card technique and variations using more defined vocabularies for emotional responses and product personalities. In addition, I will show results from techniques borrowed from psychology and marketing research - sentence completion, collaging, and the use of dyad rating scales. These techniques offer a variety of both qualitative and quantitative data that can be used to compare different interface options.
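The product reaction card technique mentioned above is typically analyzed by tallying which words participants selected across sessions. A minimal sketch of such a tally, with tiny invented word lists standing in for the full Microsoft deck:

```python
from collections import Counter

# Hypothetical positive/negative word subsets; the real deck has ~118 words.
POSITIVE = {"useful", "engaging", "trustworthy"}
NEGATIVE = {"confusing", "dated", "frustrating"}

def tally(selections):
    """selections: one list of chosen card words per participant.

    Returns per-word counts plus totals of positive and negative picks.
    """
    counts = Counter(w for words in selections for w in words)
    pos = sum(c for w, c in counts.items() if w in POSITIVE)
    neg = sum(c for w, c in counts.items() if w in NEGATIVE)
    return counts, pos, neg

counts, pos, neg = tally([["useful", "confusing"], ["useful", "engaging"]])
print(counts.most_common(1))  # [('useful', 2)]
```

A tally like this gives the quantitative side; the qualitative side comes from asking each participant why they chose each card.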
Diving Deep: Uncovering Hidden Insights Through User Interviews – Susan Mercer
User interviews are a great technique for getting to know your target audience. However, sometimes people don’t feel comfortable answering questions from a researcher completely honestly. Other times they don’t know how to articulate exactly what they need, want, or feel.
We will examine research from psychology and market research to understand techniques for interviews to help you uncover insights beyond people’s superficial answers. We’ll explore conversation theory, projective techniques such as image associations, collaging, and others to encourage participants to share their stories. You'll learn to uncover hidden, actionable insights to fuel your designs.
Desirability Testing: Analyzing Emotional Response to a Design – Megan Grocki
In the design process we follow, once we have defined the conceptual direction and content strategy for a given design and refined our approach through user research and iterative usability testing, we start applying visual design. Generally, we take a key screen whose structure and functionality we have finalized—for example, a layout for a home page or a dashboard page—and explore three alternatives for visual style. These three alternative visual designs, or comps, include the same content, but reflect different choices for color palette and imagery. The idea is to present business owners and stakeholders with different visual design options from which they can choose. Sometimes there is a clear favorite among stakeholders or an option that makes the most sense from a brand perspective. However, there can often be disagreements among the members of a project team on which direction to choose. If we’ve done our job right, there are rationales for our various design decisions in the different comps, but even so, there may be disagreement about which rationale is most appropriate for the situation.
As practitioners of user-centered design, it is natural for us to turn to user research to help inform and guide the process of choosing a visual design. But traditional usability testing and related methods don’t seem particularly well suited for assessing visual design for two reasons:
1. When we reach out to users for feedback on visual design options, stakeholders are generally looking for large sample sizes—larger than are typical for a qualitative usability study.
2. The response we are looking for from users is more emotional—that is, less about users’ ability to accomplish tasks and more about their affective response to a given design.
With this in mind, we were very interested in articles we saw on desirability testing. In one article, the author posits desirability testing as a mix of quantitative and qualitative methods that allow you to assess users’ attitudes toward aesthetics and visual appeal. Inspired by his overview, we researched desirability studies further and tried a modified version of the techniques on one of our projects. This presentation reviews the variants of desirability testing that we considered and the lessons we learned from a desirability study on visual design options for one of our projects. Interestingly, we found that while desirability testing did help us better understand participants’ self-reported emotional response to a visual design, it also helped us identify other key areas of the experience that could be improved.
The fifth class of a 15-week course in Information Architecture taught at Parsons, the New School for Design. Topics include: putting the why before the what, and the what before the how; the relationship of goals, requirements, and features; and how to deal with needed research and data as a requirement.
The goal of this presentation is to give attendees a deeper understanding of usability testing so they can leverage it in their own work. The material will shed light on what is important to the research buyer and will help the research provider to better understand how to plan, moderate, and report on a usability study. It will also provide information on where they can go to learn more about this very practical qualitative method.
Kay will cover what a usability test is and when to use it, the key planning steps, the language around it, and the unique insights this method produces. She will also discuss the various approaches a market researcher can take when running a usability study at different points in a product’s development (e.g., concept, early prototype, released product).
Jeff Belden MD and Janey Barnes PhD co-presented at the HIMSS Virtual Conference in June 2010. The audio recording is available online to HIMSS members.
User experience & design: user centered analysis – Preeti Chopra
UCA is a multistage process that allows designers to analyze and foresee how users are going to use the product. UCA employs proven and objective data-gathering and analysis techniques to develop a clear understanding of who the users are and how they will approach a website or application.
This handout is connected to the Mentoring Program Evaluation & Goals webinar from Monday, May 16, 2011, as part of the free monthly webinar series from Friends for Youth's Mentoring Institute.
World Usability Day 2016 in Antwerp (Belgium), Thursday, November 10th - Jan Moons, UX expert and co-founder at UXprobe
"Hands on with Lean and Agile User Testing"
Jan Moons shows how to use the latest tools to easily integrate user testing into a lean process. Discover how user testing can be the answer for problems of conversion, usability, and UX quality. In the workshop you will explore all sides of user testing (be the user, be the moderator, be the client) and you will see how lean and agile user testing can be.
Jan is the co-founder of UXprobe, a company focused on helping companies build great digital products that deliver a fantastic user experience. Jan has almost 20 years of experience as a software engineer and is a certified usability designer.
Everyone wants their site to look nice, but how much do they know about their users' characteristics? This presentation covers key success factors for designing a website, how to reach your target audience, how to create a win-win situation, and how to test your site and analyze the results.
Lean UX in the Enterprise: A Government Case Study – uxpin
You'll learn:
- How to quickly identify user groups despite vague assumptions.
- How to define clear features amidst complex requirements and business objectives.
- How to establish efficient UX processes across disjointed teams.
This proposal of work contains details and samples of the user-centric design process I follow. I tried to find a good graph that represents the process, but in the end I decided to make my own! ;)
Similar to Beyond Usability Testing: Assessing the Usefulness of Your Design
UXPA Boston 2024 Maximize the Client Consultant Relationship.pdf – Dan Berlin
It is very common for enterprise companies to use the services of external consultants, perhaps especially so in the field of user experience (UX). This is sometimes in the service of augmenting the company’s UX team who may not have the resources to complete all their desired projects. Consultants may also help companies who are newer to UX, where they introduce the client team to best practices and typical workflows. In either case, it’s critical to project success for both the consultant and client team to work in harmony. This presentation will provide generalizable best practices for collaborating with consultants from both the consultant and client viewpoints. Though the presentation focuses on the consultant/client relationship, all conference attendees will benefit from the provided communication and collaboration tips. Dan Berlin and Yina Turchetti presented this talk at the UXPA Boston 2024 conference.
Your "Psychologist Voice": Leveraging Voice Mindfulness for UX ResearchDan Berlin
Moderating a one-on-one interview to elicit the most actionable data is an acquired skill. A primary aspect of this, which we don’t normally talk about, is the tone, timbre, and pace of our voice. Some say that a moderator should try to match the participant’s tone; that this makes the participant feel that you are similar to him or her. But I believe that it is better to use your “psychologist voice” when moderating sessions. That is, you should always keep a soft tone, modulate your voice, stay quiet, and always be ready to turn a question back to the participant. In this presentation, I’ll reveal the fun origins of how I discovered the psychologist voice and why it not only makes for sessions that yield useful data, but is also an important life-skill.
Biometrics in UX Research: The Next Big StepDan Berlin
My talk from the 2015 Big Design Conference in Dallas, TX. Discusses how the use of biometric capture devices may give us a new tool in our user experience research toolkit.
User Experience (UX) Research in HealthcareDan Berlin
Healthcare companies should embrace iterative user research so that they can design products that align with their customers' wants and needs. UX research studies are not clinical trials – they are a means of learning how best to design a product for customers.
Beyond Eye Tracking: Bringing Biometrics to Usability ResearchDan Berlin
User experience research has traditionally relied upon qualitative techniques that entail users telling us their feelings, wants, and needs. This creates an inherent cognitive bias – data is filtered through the participant’s cognition. That is, we may not necessarily be hearing the participants’ true feelings. They may be trying to please the moderator or may just be unable to articulate the cause of their emotions. But researchers and stakeholders alike are thirsty for quantitative data that complements the qualitative. Luckily, we live in exciting times – there are two particular technologies that are becoming more accessible that will help usability researchers break through cognitive bias and provide that ever tantalizing quantitative data: eye tracking and biometrics. Eye tracking equipment has only recently started to become affordable to most anyone who wants to use it. Researchers must now get up-to-speed on eye tracking methodology and analysis. When is it appropriate? How can we turn the data into actionable findings? What the heck do I do with all of this new data?! More importantly, we should find new research techniques that will break through cognitive bias.
This is where the second technology comes in: biometrics. Psychophysiology is the study of how emotions affect changes in the body. Changes in heart rate, breathing rate, heart rate variability, and galvanic skin response (GSR), among others, have been shown to be accurate indicators of a person's emotions. Just as with eye tracking, the equipment to measure these biometrics is only now becoming accessible to usability researchers. Until very recently, the equipment to gather this data was rather obtrusive and invasive. This not only affected participant comfort, but also did not lend itself to conducting “discount” usability research. But new technology allows the collection of biometrics in non-invasive ways. For instance, Affectiva’s Q Sensor is worn on the wrist and wirelessly gathers a participant’s GSR. The problem with integrating psychophysiological data into usability research is that individual researchers will need to come up with not only the algorithms to interpret the biometrics but also the technology to temporally marry the biometrics to the eye tracking data. These are no small tasks. There are companies that will collect and interpret the data for you for a hefty fee. But this technique should be in every usability researcher’s toolkit. As such, we should come together as a research community to figure this out. We need an open dialogue. We need to share techniques and stories.
Visual Principles of Experience Design: Blending Art and ScienceDan Berlin
Webinar description: What makes a user interface engaging and intuitive? Conversely, what makes some programs so difficult to use? The practice of experience design is a blending of art and science, informed by principles drawn from graphic arts, information theory and cognitive psychology. We are pattern seekers, and the more we understand how our visual system builds the patterns we see (or don't see), the more effectively we can control the user's experience.
We invite you to join Mad*Pow's Experience Design Director, Paul Kahn, and Experience Research Director, Dan Berlin, as they review visual cognition theories and show how the resulting principles are applied in experience design. Whether you are new to the field or an experienced practitioner, this presentation will introduce new topics and serve as a review of subjects that you may not have thought about in quite the same way. By raising awareness of how we think and how we see, we will show how theory informs our real-world visual design projects.
Between Filth and Fortune- Urban Cattle Foraging Realities by Devi S Nair, An...Mansi Shah
This study examines cattle rearing in urban and rural settings, focusing on milk production and consumption. By exploring a case in Ahmedabad, it highlights the challenges and processes in dairy farming across different environments, emphasising the need for sustainable practices and the essential role of milk in daily consumption.
7 Alternatives to Bullet Points in PowerPointAlvis Oh
So you tried all the ways to beautify the bullet points on your pitch deck, but it just got uglier. These points are supposed to be memorable and leave a lasting impression on your audience. With these tips, you'll no longer have to spend so much time thinking about how to present your points.
Hello everyone! I am thrilled to present my latest portfolio on LinkedIn, marking the culmination of my architectural journey thus far. Over the span of five years, I've been fortunate to acquire a wealth of knowledge under the guidance of esteemed professors and industry mentors. From rigorous academic pursuits to practical engagements, each experience has contributed to my growth and refinement as an architecture student. This portfolio not only showcases my projects but also underscores my attention to detail and to innovative architecture as a profession.
Dive into the innovative world of smart garages with our insightful presentation, "Exploring the Future of Smart Garages." This comprehensive guide covers the latest advancements in garage technology, including automated systems, smart security features, energy efficiency solutions, and seamless integration with smart home ecosystems. Learn how these technologies are transforming traditional garages into high-tech, efficient spaces that enhance convenience, safety, and sustainability.
Ideal for homeowners, tech enthusiasts, and industry professionals, this presentation provides valuable insights into the trends, benefits, and future developments in smart garage technology. Stay ahead of the curve with our expert analysis and practical tips on implementing smart garage solutions.
White wonder, Work developed by Eva TschoppMansi Shah
White Wonder by Eva Tschopp
A tale about our culture around the use of fertilizers and pesticides visiting small farms around Ahmedabad in Matar and Shilaj.
Beyond Usability Testing: Assessing the Usefulness of Your Design
1. Beyond Usability Testing: Assessing the Usefulness
of Your Design
UPA Boston Mini-Conference 2011
Prepared by:
Michael Hawley – Chief Design Officer
Daniel Berlin – Experience Research Director
May 25, 2011
4. Virzi, R.A., Refining the Test Phase of Usability Evaluation: How Many Subjects is Enough? Human Factors, 1992. 34(4): p. 457-468.
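The sample-size question Virzi studied is often summarized with the standard cumulative problem-discovery model, where each participant reveals a given problem with some average probability p. The sketch below illustrates that well-known model only; it is not code from the presentation, and p = 0.3 is an illustrative value in the range Virzi reported.

```python
def proportion_found(n: int, p: float = 0.3) -> float:
    """Expected proportion of usability problems uncovered by n
    participants, per the cumulative discovery model 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p) ** n

# With p = 0.3, five participants uncover roughly 83% of problems,
# which is why small-sample usability tests work well for problem-finding.
for n in (1, 3, 5, 10):
    print(n, round(proportion_found(n), 2))
```

Note the diminishing returns: most new problems surface in the first handful of sessions, which is the practical basis for running several small, iterative rounds rather than one large study.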
5. Trend
Business sponsors turn to us as UX professionals with
questions that are not about usability problems.
Rather, their questions are about overall user experience
strategy, value and usefulness.
8. Usefulness:
Inform a re-structure of the application to best align with
workflow.
Determine where to position productivity tips and help
buttons within the application for best utilization.
Find optimal level of personalization and customization
that users would take advantage of.
9.
10. Usability:
Assess effectiveness of navigation system in guiding users
to desired pages.
Evaluate descriptiveness and clarity of links.
Gauge ability of page layouts to orient users to relevant
content.
11. Usefulness:
Identify missing content that will help overcome
objections or answer critical questions.
Understand how branded labels and content themes
contribute to the overall experience or detract from it.
Determine level/types of promotions and interstitials that
are acceptable to users.
Understand how different audience personas prefer to
consume information for the particular domain.
12.
13. Usability:
Determine optimal level of difficulty to encourage
advancement to multiple levels of the game.
Assess discoverability of game features and controls.
14. Usefulness:
Find the optimal rate of point accumulation and alignment
with prize levels.
Understand best use of social media within or around the
game.
Determine the threshold for ads, interstitials and
registration for game play.
15.
16. Usability:
Assess whether users can figure out how to add a
comment, share content, or use a tagging mechanism to
find what they are looking for.
17. Usefulness:
Determine the most compelling and appealing topics or
categories for conversation.
Understand the level of involvement the sponsoring
company should have in the social experience, if any.
Balance branded or non-branded experience for optimal
trust of the site.
Determine the elements or attributes that should allow
comment and review.
18.
19. Usability:
Find any confusion points or interruptions that prevent
users from registering.
Find misleading or ambiguous terminology.
20. Usefulness:
Determine the most persuasive elements that will compel
the target audience to register.
Understand a design’s impact on a user’s perception of the
brand.
Position the offering and messaging against the company’s
competitors.
Determine missing content that can help target audience
make an informed decision about the product.
22. Natural Reaction
Turn to what we know:
Usability Testing
(one-on-one interviews, design
artifact, and tasks)
23. Are You Forgetting Contextual Inquiry and Foundational Research?
Discovery research and needs analysis are valid, but:
• Time and budget for separate research is not always an
option
• Many participants need design artifacts to elicit
appropriate reaction and commentary
24. Our Goal
Leverage the strengths of usability testing but adjust our
approach when objectives differ from finding usability
problems.
26–28. Three Components

Phase: Pre-Task Questions
- Usability: demographics, level of expertise, prior experience. Goal: validate groups and classify results.
- Usefulness: daily task flow, pain points, expectations, desires, scenarios. Goal: set the mindset for a usefulness evaluation of the design.

Phase: Tasks
- Usability: pre-defined tasks, minimal moderator intrusion. Goal: find usability problems; measure time on task and completion percentage.
- Usefulness: emphasis on participant-directed tasks. Goal: understand how the proposed design aligns with user needs.

Phase: Post-Task Questions
- Usability: level of satisfaction with the design, and points of confusion or ambiguity. Goal: measure usability.
- Usefulness: comparison with expectations and value; task retrospectives. Goal: discuss opportunities for improvement.
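The usability-side metrics named above (completion percentage and time on task) are simple to compute from session logs. The sketch below uses hypothetical session records; the field names and data are illustrative, not from the presentation. Time on task is reported here over successful attempts only, a common convention.

```python
# Hypothetical usability-test session records (illustrative data).
sessions = [
    {"participant": "P1", "completed": True,  "seconds": 74},
    {"participant": "P2", "completed": True,  "seconds": 102},
    {"participant": "P3", "completed": False, "seconds": 180},
    {"participant": "P4", "completed": True,  "seconds": 88},
]

# Completion percentage across all attempts.
completion_rate = sum(s["completed"] for s in sessions) / len(sessions)

# Mean time on task, computed over successful attempts only.
times = [s["seconds"] for s in sessions if s["completed"]]
mean_time = sum(times) / len(times)

print(f"completion: {completion_rate:.0%}, mean time on task: {mean_time:.0f}s")
```

The usefulness side of the table deliberately has no such metrics: its output is qualitative (alignment with needs, expectations, and value), which is exactly why the method has to change.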
30. Example: Usefulness
Goal: role of personalization
Pre-Task Questions: common tasks, pain points
Tasks: emphasis on participant direction
Post-Task Questions: comparison with expectations
31. Summary
Foundational research is still important.
Usability testing is still important.
However, recognize when you have different goals and
adapt the research method as necessary.
32. Additional Information
Complete Presentation Slides
• http://www.slideshare.net/hawleymichael
• http://www.slideshare.net/banderlin
Contact Information
Michael Hawley Dan Berlin
mhawley@madpow.net dberlin@madpow.net
@hawleymichael @banderlin
Editor's Notes
Goal of usability testing is to find usability problems.
Example domains: Transactional Applications; Content / Informational Sites; Gaming; Social Sites; Marketing / Persuasive.
So when a client wants discount research to guide their UX strategy and wants to learn about the value and usefulness of a prototype or existing interface… what do we do? We turn to what we know – usability testing.
What strengths can we take from usability testing? How should the methodology evolve? What questions should we pose to elicit this information from participants? What is an acceptable prototype fidelity level?
Moderator and participant interact one-on-one (except for unmoderated studies). Three primary components:
- Pre-task questions: the moderator interviews the participant to clarify demographics, assess level of expertise, and gauge prior experience.
- Tasks: the participant proceeds through a defined set of tasks with the design artifact while “thinking aloud”. The moderator minimizes intrusion, identifies usability problems, and tracks metrics such as time on task and completion percentage.
- Post-task questions: the moderator asks the participant about their experience with the tasks, level of satisfaction with the design, and points of confusion or ambiguity.

For usefulness research, pre-task questions no longer just determine demographics and level of experience. Rather, they set the participant’s mindset for evaluating the design from a usefulness perspective. Borrow from ethnography, contextual inquiry, interviewing, and laddering: daily task flow and current mechanisms; pain points; expectations and desires; ideal scenarios; other systems.

The emphasis changes from contrived tasks to participant-directed tasks, determined from pre-task and probing questions:
- “Tell me how you would use this application in your daily routine. Okay, now please go do some of those on this prototype.”
- “You mentioned that you would need to determine your size – how would you do that?”

Gather expectations at the outset of the task: “What would you expect to happen? What information would you expect the page to contain?” This sets the basis for participant commentary (not that you should design exactly what they say, but it can inform ideas). This may not be a natural way to interact with the design, but we’re not testing usability.

Follow-up questions emphasize expectations and value:
- “How did completing this task compare with your expectations? Was it better or worse, and why? What’s missing? What’s superfluous?”
- “How does it compare with other applications or systems?”
- “What other features or functions would you need when completing this task?”

The participant is directed to walk back through the task, responding to questions at key interaction points. Ask strategic probing questions on salient components that interrupt or inform the task: “What would make this more useful? How would you use this in your work/life?” “Does this table contain the columns that you need? Did the categories make sense?” Probe on areas that were not covered by the participant’s task: “How would you get the detailed information about this product?”
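Post-task satisfaction, mentioned above, is often quantified with a standardized instrument; the presentation does not prescribe one, so the sketch below simply illustrates the standard scoring rule for the widely used System Usability Scale (SUS) as one concrete option.

```python
def sus_score(responses):
    """Score one System Usability Scale questionnaire.

    responses: ten answers on a 1-5 scale, item 1 first.
    Odd-numbered items are positively worded (score r - 1);
    even-numbered items are negatively worded (score 5 - r).
    The summed item scores are scaled by 2.5 to a 0-100 range.
    """
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# All-neutral answers (3s across the board) land at the midpoint, 50.
print(sus_score([3] * 10))
```

A score like this complements, rather than replaces, the open-ended usefulness questions: it tracks satisfaction over design iterations, while the retrospective questions explain why the number moved.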
Usability is still important! But recognize when you have different goals and adjust accordingly. Be aware of the nuances and differences between usability and usefulness. Set clear expectations and goals with the project team. Name it a “usefulness study” rather than a “usability test”? Drawbacks: may need deeper prototypes; not getting at usability problems. Danger: it is easy to fall into asking the participant to help design it, but that is not good.